FX Job Recruitment

2021 ◽  
Vol 10 (02) ◽  
pp. 239-249
Author(s):  
Padmanaban Padmanaban ◽  
Sujitha Sujitha ◽  
Muppidathi @Priya

This project presents secure and privacy-preserving access control that allows any member of a group to utilize cloud resources anonymously, while the group manager can reveal the real identities of data owners when disputes occur. We provide a rigorous security analysis and perform extensive simulations to demonstrate the efficiency of the scheme in terms of storage and computation overhead. Cloud computing provides an economical and efficient solution for sharing group resources among cloud users. Unfortunately, sharing data in a multi-owner job-portal setting while preserving data and identity privacy from an untrusted cloud is still a challenging issue, due to frequent changes in membership. The main aim of this method is a secure multi-owner data sharing scheme, by which any user in the group can securely share data with others via the untrusted cloud. The scheme supports dynamic groups efficiently: newly granted users can directly decrypt data files uploaded before their participation without contacting the data owners, and user revocation is achieved through a novel revocation list without updating the secret keys of the remaining users. The size and computation overhead of encryption are constant and independent of the number of revoked users. The job portal itself is developed to create an interactive job-vacancy service for candidates. This web application is conceived in its current form as a dynamic site requiring constant updates from both job seekers and companies. Overall, the objective of the project is to enable job seekers to post their resumes and companies to publish their vacancies.
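The revocation mechanism described above can be sketched in a few lines. The construction below, a signed public list that the cloud consults while the remaining users keep their keys, is a hypothetical illustration, not the paper's actual group-signature scheme:

```python
import hmac
import hashlib

class GroupManager:
    """Maintains the public revocation list (hypothetical sketch)."""
    def __init__(self, secret: bytes):
        self._secret = secret
        self.revoked = set()

    def revoke(self, user_id: str):
        # Revocation only updates the list; no re-keying of other users.
        self.revoked.add(user_id)

    def signed_revocation_list(self):
        # The manager authenticates the list so the cloud can trust it.
        payload = ",".join(sorted(self.revoked)).encode()
        tag = hmac.new(self._secret, payload, hashlib.sha256).hexdigest()
        return sorted(self.revoked), tag

def cloud_allows(user_id, revocation_list):
    # The cloud merely checks list membership, so the cost per request
    # does not depend on re-distributing keys to remaining users.
    revoked, _tag = revocation_list
    return user_id not in revoked
```

Because access control reduces to a list lookup, adding a revoked user never touches the secret keys of anyone else, which is the property the abstract emphasizes.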

Author(s):  
K. V. Uma Maheswari ◽  
Dr. Dhanaraj Cheelu

Cloud computing is recognized as an alternative to traditional information technology due to its intrinsic resource-sharing and low-maintenance characteristics. It provides an economical and efficient solution for sharing group resources among cloud users. Unfortunately, sharing data in a group while preserving data and identity privacy is still a challenging issue due to frequent changes in membership. To overcome this problem, a secure data sharing scheme for dynamic groups is proposed, so that any user within a group can share data in a secure manner by leveraging both group-signature and dynamic broadcast encryption techniques. The scheme enables any cloud user to anonymously share data with others within the group and supports efficient member revocation. The storage overhead and encryption computation cost are dependent on the number of revoked users.


2020 ◽  
Vol 2 (2) ◽  
Author(s):  
Suzanna Schmeelk ◽  
Lixin Tao

Many organizations, to save costs, are moving to the Bring Your Own Mobile Device (BYOD) model and adopting applications built by third parties at an unprecedented rate. Our research examines software-assurance methodologies, specifically focusing on the security-analysis coverage of program analysis for mobile malware detection, mitigation, and prevention. This research focuses on secure software development of Android applications by developing knowledge graphs for threats reported by the Open Web Application Security Project (OWASP). OWASP maintains lists of the top ten security threats to web and mobile applications. We develop knowledge graphs based on the two most recent top-ten threat years and show how the knowledge-graph relationships can be discovered in mobile application source code. We analyze 200+ healthcare applications from GitHub to gain an understanding of the assurance of their developed software against one of the OWASP top ten mobile threats, the threat of “Insecure Data Storage.” We find that many of the applications store personally identifying information (PII) in potentially vulnerable places, leaving users exposed to higher risks of losing their sensitive data.
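As a hedged illustration of what scanning for the “Insecure Data Storage” threat can look like, the sketch below flags writes of PII-named keys to Android SharedPreferences in Java source. The regexes and key names are assumptions for exposition, not the authors' actual analysis:

```python
import re

# Hypothetical heuristic: PII-suggestive key names and the
# SharedPreferences putString(...) call pattern.
PII_KEYS = re.compile(r"(ssn|email|password|dob|phone)", re.I)
STORE_CALL = re.compile(r'putString\(\s*"([^"]+)"')

def flag_insecure_storage(java_source: str):
    """Return (line_number, key) pairs where a PII-named key is
    written to SharedPreferences, an insecure-storage smell."""
    findings = []
    for lineno, line in enumerate(java_source.splitlines(), 1):
        m = STORE_CALL.search(line)
        if m and PII_KEYS.search(m.group(1)):
            findings.append((lineno, m.group(1)))
    return findings
```

A real analysis would resolve data flow rather than match text, but even this shallow pass surfaces the kind of finding the study reports.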


2020 ◽  
Vol 8 ◽  
Author(s):  
Steven Bachman ◽  
Barnaby Walker ◽  
Sara Barrios ◽  
Alison Copeland ◽  
Justin Moat

The IUCN Red List of Threatened Species™ (hereafter the Red List) is an important global resource for conservation that supports conservation planning, safeguarding critical habitat and monitoring biodiversity change (Rodrigues et al. 2006). However, a major shortcoming of the Red List is that most of the world's described species have not yet been assessed and published on the Red List (Bachman et al. 2019, Eisenhauer et al. 2019). Conservation efforts can be better supported if the Red List is expanded to achieve greater coverage of mega-diverse groups of organisms such as plants, fungi and invertebrates. There is, therefore, an urgent need to speed up the Red List assessment and documentation workflow. One reason for this lack of species coverage is that a manual and relatively time-consuming procedure is usually employed to assess and document species. A recent update of Red List documentation standards (IUCN 2013) reduced the data requirements for publishing non-threatened or 'Least Concern' species on the Red List. The majority of the required fields for Least Concern plant species can be found in existing open-access data sources or can be easily calculated. There is an opportunity to consolidate these data and analyses into a simple application to fast-track the publication of Least Concern assessments for plants. There could be as many as 250,000 species of plants (60%) likely to be categorised as Least Concern (Bachman et al. 2019), for which automatically generated assessments could considerably reduce the outlay of time and valuable resources for Red Listing, allowing attention and resources to be dedicated to the assessment of those species most likely to be threatened. We present a web application, Rapid Least Concern, that addresses the challenge of accelerating the generation and documentation of Least Concern Red List assessments.
Rapid Least Concern utilises open-source datasets, such as the Global Biodiversity Information Facility (GBIF) and Plants of the World Online (POWO), through a simple web interface. Initially, the application is intended for use on plants, but it could be extended to other groups, depending on the availability of equivalent datasets for those groups. Rapid Least Concern users can assess a single species or upload a list of species that are assessed in a batch operation. The batch operation can either utilise georeferenced occurrence data from GBIF or occurrence data provided by the user. The output includes a series of CSV files and a point-map file that meet the minimum data requirements for a Least Concern Red List assessment (IUCN 2013). The CSV files are compliant with the IUCN Red List SIS Connect system, which transfers the data files to the IUCN database and, pending quality-control checks and review, publication on the Red List. We outline the knowledge gap this application aims to fill and describe how the application works. We demonstrate a use-case for Rapid Least Concern as part of an ongoing initiative to complete a global Red List assessment of all native species for the United Kingdom Overseas Territory of Bermuda.
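One building block such a pipeline needs is an extent-of-occurrence (EOO) estimate from occurrence points, since a large EOO supports a Least Concern outcome. The sketch below computes a planar convex-hull area as an illustration; real Red List tooling computes EOO on a projected or spherical surface, so treating lon/lat as planar here is an assumption for exposition only:

```python
# Andrew's monotone-chain convex hull over (x, y) occurrence points.
def convex_hull(points):
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def eoo_area(points):
    """Planar EOO estimate: shoelace area of the convex hull."""
    hull = convex_hull(points)
    area = 0.0
    for i in range(len(hull)):
        x1, y1 = hull[i]
        x2, y2 = hull[(i + 1) % len(hull)]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0
```

In a batch run, each species' GBIF or user-supplied points would feed a calculation like this, with the result compared against the Criterion B thresholds.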


Entropy ◽  
2021 ◽  
Vol 23 (12) ◽  
pp. 1657
Author(s):  
Ke Yuan ◽  
Yingjie Yan ◽  
Tong Xiao ◽  
Wenchao Zhang ◽  
Sufang Zhou ◽  
...  

With the rapid growth of credit-investigation data, data redundancy among credit-investigation agencies, privacy leakages affecting credit-investigation data subjects, and data-security risks have been reported. This study proposes a privacy-protection scheme for a credit-investigation system based on blockchain technology, which realizes the secure sharing of credit-investigation data among multiple entities such as credit-investigation users, credit-investigation agencies, and cloud service providers. The scheme uses blockchain technology to solve the problem of credit-investigation data islands; it uses zero-knowledge-proof technology, submitting a proof to the smart contract, to achieve anonymous identity authentication, ensuring that the identity privacy of credit-investigation users is not disclosed; and it uses searchable-symmetric-encryption technology to realize retrieval over the ciphertext of the credit-investigation data. A security analysis showed that this scheme guarantees the confidentiality, availability, tamper-resistance, and ciphertext searchability of credit-investigation data, as well as the fairness and anonymity of identity authentication in credit-investigation data queries. An efficiency analysis showed that, compared with similar identity-authentication schemes, the proof key of this scheme is smaller and the verification time is shorter; compared with similar ciphertext-retrieval schemes, the time for this scheme to generate indexes and trapdoors and to return search results is significantly shorter.


2019 ◽  
Vol 21 (Supplement_6) ◽  
pp. vi79-vi79
Author(s):  
Laila Poisson ◽  
M C M Kouwenhoven ◽  
James Snyder ◽  
Kristin Alfaro-Munoz ◽  
Manpreet Kaur ◽  
...  

Abstract As an uncommon cancer, clinical and translational studies of glioma rely on multi-center collaborations, confirmatory studies, and meta-analyses. Unfortunately, interpretation of results across studies is hampered by the absence of uniformly coded clinical data. Common Data Elements (CDE) represent a set of clinical features for which the language has been standardized for consistent data capture across studies, institutions and registries. We constructed CDE for the longitudinal study of adult malignant glioma. To identify the minimum set of CDE needed to describe the clinical course of glioma, we surveyed clinical standards, ongoing trials, published studies, and data repositories for frequently used data elements. We harmonized the identified clinical variables, filled in gaps, and structured them in a modular schema, defining CDE for patient demographics, medical history, diagnosis, surgery, chemotherapy, radiotherapy, other treatments, and outcomes. Multidisciplinary experts from the Glioma Longitudinal AnalySiS (GLASS) consortium, representing clinical, molecular, and data research perspectives, were consulted regarding CDE. The validity and capture feasibility of the CDE were assessed through harmonization across published studies, then validated with single institution retrospective chart abstraction. The refined CDE library is implemented in the Research Electronic Data Capture (REDCap) System, a secure web application for building and managing online surveys and databases. The work was motivated by the GLASS consortium, which supports the aggregation and analysis of complex genetic datasets used to define molecular trajectories for glioma. The goal is that modular REDCap implementation of CDE allows broad adoption in glioma research. To accommodate novel aspects, the CDE sets can be expanded through additional modules. In contrast, for efficient initiation of focused studies, subsets of CDE can be selected. 
Broad adoption of CDE will improve the ability to compare results and share data between studies, thereby maximizing the value of existing data sources and small patient populations.
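The modular idea, selecting a subset of CDE modules for a focused study or expanding with a new module, might be sketched as below. The module and field names are placeholders for illustration, not the GLASS consortium's actual CDE:

```python
# Placeholder CDE library grouped by module (names are illustrative).
CDE_MODULES = {
    "demographics": ["age_at_diagnosis", "sex"],
    "diagnosis": ["histology", "who_grade", "idh_status"],
    "surgery": ["extent_of_resection", "surgery_date"],
    "radiotherapy": ["total_dose_gy", "fractions"],
    "outcomes": ["progression_date", "vital_status"],
}

def select_cde(modules, extra_modules=None):
    """Pick a subset of standard modules for a focused study and
    optionally append novel modules for new study aspects."""
    chosen = {m: CDE_MODULES[m] for m in modules}
    chosen.update(extra_modules or {})
    return chosen
```

The same selection could drive which REDCap instruments a study instantiates, which is how a modular library supports both focused and expanded designs.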


2012 ◽  
Vol 2012 ◽  
pp. 1-13 ◽  
Author(s):  
Khaled Loukhaoukha ◽  
Jean-Yves Chouinard ◽  
Abdellah Berdai

In the past few years, several encryption algorithms based on chaotic systems have been proposed to protect digital images against cryptographic attacks. These algorithms typically use relatively small key spaces and thus offer limited security, especially if they are one-dimensional. In this paper, we propose a novel image encryption algorithm based on the Rubik's cube principle. The original image is first scrambled using the principle of the Rubik's cube; the XOR operator is then applied to the rows and columns of the scrambled image using two secret keys. The experimental results and security analysis show that the proposed image encryption scheme not only achieves good encryption and perfect hiding ability but can also resist exhaustive, statistical, and differential attacks.
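A simplified sketch of the two stages, key-driven circular shifts of rows and columns followed by XOR with two key streams, is given below for a grayscale image held as a list of rows. The shift rule and key handling are illustrative simplifications of the paper's algorithm:

```python
def _rotate(seq, k):
    # Circular right-shift by k (negative k undoes it).
    k %= len(seq)
    return seq[-k:] + seq[:-k]

def encrypt(img, key_rows, key_cols):
    h, w = len(img), len(img[0])
    # Scramble: shift row i by key_rows[i], then column j by key_cols[j],
    # mimicking Rubik's-cube face rotations.
    img = [_rotate(row, key_rows[i]) for i, row in enumerate(img)]
    cols = [_rotate([img[i][j] for i in range(h)], key_cols[j]) for j in range(w)]
    img = [[cols[j][i] for j in range(w)] for i in range(h)]
    # Diffuse: XOR each pixel with both key streams.
    return [[img[i][j] ^ key_rows[i] ^ key_cols[j] for j in range(w)]
            for i in range(h)]

def decrypt(img, key_rows, key_cols):
    h, w = len(img), len(img[0])
    # Invert the steps in reverse order: XOR, un-shift columns, un-shift rows.
    img = [[img[i][j] ^ key_rows[i] ^ key_cols[j] for j in range(w)]
           for i in range(h)]
    cols = [_rotate([img[i][j] for i in range(h)], -key_cols[j]) for j in range(w)]
    img = [[cols[j][i] for j in range(w)] for i in range(h)]
    return [_rotate(row, -key_rows[i]) for i, row in enumerate(img)]
```

Because every step is invertible with the same keys, decryption recovers the plaintext exactly; the paper's full scheme iterates and conditions the shifts on pixel sums for stronger diffusion.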


2021 ◽  
Author(s):  
Mohammad Madine ◽  
Khaled Salah ◽  
Raja Jayaraman ◽  
Yousof Al-Hammadi ◽  
Junaid Arshad ◽  
...  

Blockchain technology has the potential to revolutionize industries by offering decentralization, transparency, data provenance, auditability, reliability, and trustworthiness. However, cross-chain interoperability is one of the crucial challenges preventing widespread adoption of blockchain applications. Cross-chain interoperability is the ability of one blockchain network to interact and share data with another blockchain network. Contemporary cross-chain interoperability solutions are centralized and require re-engineering of the core blockchain stack to enable inter-communication and data sharing among heterogeneous blockchain networks. In this paper, we propose an application-based cross-chain interoperability solution that allows blockchain networks of any architecture type and industrial focus to inter-communicate, share data, and make requests. Our solution utilizes decentralized applications as a distributed translation layer capable of communicating with and understanding multiple blockchain networks, thereby delegating requests and parameters among them. The architecture uses incentivized verifier nodes that maintain the integrity of shared data, enabling it to be read by the entities of their network. We define and describe the roles and requirements of the major entities of inter-operating blockchain networks in the context of healthcare. We present a detailed explanation of the sequence of interactions needed to share an Electronic Medical Record (EMR) document from one blockchain network to another, along with the required algorithms. We implement the proposed solution with Ethereum-based smart contracts for two hospitals and present a cost and security analysis for the cross-chain interoperability solution. We make our smart-contract code and testing scripts publicly available.
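The verifier-node idea can be sketched off-chain as follows: verifiers attest to the digest of a shared document, and the receiving network accepts it only when a quorum of attestations match. The entity names and acceptance rule are assumptions for illustration, not the authors' smart-contract logic:

```python
import hashlib

def digest(doc: bytes) -> str:
    # Content digest that both networks can compute independently.
    return hashlib.sha256(doc).hexdigest()

def accept_cross_chain(doc: bytes, attestations, quorum: int = 2) -> bool:
    """attestations: list of (verifier_id, claimed_digest) pairs.
    Accept the shared document only if at least `quorum` distinct
    verifiers attested to the digest the receiver computes itself."""
    matches = {v for v, d in attestations if d == digest(doc)}
    return len(matches) >= quorum
```

In the paper's setting, the attestation and quorum check would live in the destination network's smart contract, with the verifiers incentivized for honest attestations.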


2020 ◽  
Vol 44 (1-2) ◽  
pp. 1-11
Author(s):  
Vicky Steeves ◽  
Rémi Rampin ◽  
Fernando Chirigati

The adoption of reproducibility remains low, despite incentives becoming increasingly common in different domains, conferences, and journals. The truth is, reproducibility is technically difficult to achieve due to the complexities of computational environments. To address these technical challenges, we created ReproZip, an open-source tool that automatically packs research along with all the necessary information to reproduce it, including data files, software, OS version, and environment variables. Everything is then bundled into an rpz file, which users can use to reproduce the work with ReproZip and a suitable unpacker (e.g.: using Vagrant or Docker). The rpz file is general and contains rich metadata: more unpackers can be added as needed, better guaranteeing long-term preservation. However, installing the unpackers can still be burdensome for secondary users of ReproZip bundles. In this paper, we will discuss how ReproZip and our new tool, ReproServer, can be used together to facilitate access to well-preserved, reproducible work. ReproServer is a web application that allows users to upload or provide a link to a ReproZip bundle, and then interact with/reproduce the contents from the comfort of their browser. Users are then provided a persistent link to the unpacked work on ReproServer which they can share with reviewers or colleagues.


Author(s):  
Prof. Asma Shaikh ◽  
Ms. Rasika Kotavadekar ◽  
Ms. Sanjita Sawant ◽  
Ms. Sayali Landge

The biggest invention of the 21st century is social media: the largest platform for sharing data, files, and documents, and even thoughts, ideas, and feelings, using different tools and techniques. People are hyper-connected with one another and continuously share information. For criminals, deploying malware in such a scenario is very easy, and propagating malware through JPEG images and QR codes is one of the most effective and advanced methods. Using steganography techniques, criminals embed malicious code in legitimate or innocent-looking images. This malicious content is just a few lines of code that exploit a vulnerability in an application, giving the attacker remote access to the system to carry out criminal acts. In this framework, our primary purpose is to detect the presence of any code or data hidden in an image. The major section of the framework is then based on the code found and its adverse effects. The framework presents corresponding solutions to the presence of malicious code in JPEG images and QR codes spreading through online social networking sites.
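One concrete check that the detection step of such a framework can perform is looking for bytes appended after the JPEG end-of-image marker, a common way to smuggle a payload inside an innocent-looking image. The heuristic below is a simple illustration, not the authors' full detection method (it can misfire on files with embedded thumbnails, which contain their own markers):

```python
JPEG_SOI = b"\xff\xd8"  # start-of-image marker
JPEG_EOI = b"\xff\xd9"  # end-of-image marker

def trailing_payload(data: bytes) -> bytes:
    """Return any bytes found after the first end-of-image marker;
    an empty result means nothing was appended."""
    if not data.startswith(JPEG_SOI):
        raise ValueError("not a JPEG stream")
    end = data.find(JPEG_EOI)
    if end == -1:
        raise ValueError("no end-of-image marker found")
    return data[end + 2:]
```

A non-empty result is a strong hint that the image carries extra content worth deeper inspection, which is where the framework's later analysis stages would take over.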

