The Cambridge Cognitive and Psychiatric Assessment Kit (CamCOPS): A Secure Open-Source Client–Server System for Mobile Research and Clinical Data Capture

2021 ◽  
Vol 12 ◽  
Author(s):  
Rudolf N. Cardinal ◽  
Martin Burchell

CamCOPS is a free, open-source client–server system for secure data capture in the domain of psychiatry, psychology, and the clinical neurosciences. The client is a cross-platform C++ application, suitable for mobile and offline (disconnected) use. It allows touchscreen data entry by subjects/patients, researchers/clinicians, or both together. It implements a large and extensible range of tasks, from simple questionnaires to complex animated tasks. The client uses encrypted data storage and sends data via an encrypted network connection to a CamCOPS server. Individual institutional users set up and run their own CamCOPS server, so no data is transferred outside the hosting institution's control. The server, written in Python, provides clinically oriented and research-oriented views of tasks, including the tracking of changes over time. It provides an audit trail, export facilities (such as to an institution's primary electronic health record system), and full structured data access subject to authorization. A single CamCOPS server can support multiple research/clinical groups, each having its own identity policy (e.g., fully identifiable for clinical use; de-identified/pseudonymised for research use). Intellectual property rules regarding third-party tasks vary and CamCOPS has several mechanisms to support compliance, including for tasks that may be permitted to some institutions but not others. CamCOPS supports task scheduling and home testing via a simplified user interface. We describe the software, report local information governance approvals within part of the UK National Health Service, and describe illustrative clinical and research uses.
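The abstract does not reproduce the CamCOPS upload protocol, so the following is only a minimal sketch of the general pattern it describes: a client sending a completed task record to an institution's own server over an encrypted (HTTPS) connection. The endpoint URL, field names, and token are hypothetical assumptions, not the actual CamCOPS API.

```python
# Illustrative sketch only: a minimal HTTPS upload of a completed task record
# to a hypothetical institutional server endpoint. The real CamCOPS client is
# a C++ application with its own upload protocol; the endpoint and field names
# below are assumptions, not the actual CamCOPS API.
import json
import requests  # third-party: pip install requests

task_record = {
    "task": "phq9",              # hypothetical task identifier
    "patient_id": "LOCAL-0001",  # identity policy is set per group on the server
    "responses": [2, 1, 3, 0, 1, 2, 2, 1, 0],
    "completed_at": "2021-05-01T10:30:00Z",
}

resp = requests.post(
    "https://camcops.example-hospital.nhs.uk/api/upload",  # hypothetical URL
    json=task_record,
    headers={"Authorization": "Bearer <device-token>"},    # placeholder credential
    timeout=30,
    verify=True,  # TLS certificate verification, as for any encrypted connection
)
resp.raise_for_status()
print(json.dumps(resp.json(), indent=2))
```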

2014 ◽  
Vol 53 (03) ◽  
pp. 202-207 ◽  
Author(s):  
M. Haag ◽  
L. R. Pilz ◽  
D. Schrimpf

Summary. Background: Clinical trials (CTs) are, in a broad sense, experiments to demonstrate and establish the clinical benefit of treatments. Electronic data capture systems (EDCS) are now used increasingly often, bringing better data management and higher data quality into clinical practice. Electronic randomization systems are also used to assign patients to treatments. Objectives: If a randomization system (RS) and an EDCS are both used, potentially identical data are collected in both, especially with stratified randomization. This separate data storage may lead to data inconsistency, and in general the data sets have to be aligned. The article discusses solutions for combining an RS and an EDCS; one approach is implemented and described in detail. Methods: Different possible ways of combining an EDCS and an RS are identified, and the pros and cons of each solution are worked out. For the combination of two independent applications, the necessary communication interfaces are defined, taking existing standards into account. An example realization is implemented using open-source applications and state-of-the-art software development procedures. Results: Three possibilities for separate use or combination of an EDCS and an RS are presented and assessed: i) completely independent use of both systems; ii) realization of one system with both functions; and iii) two separate systems that communicate via defined interfaces. In addition, a realization of our preferred approach, the combination of both systems via defined interfaces, is introduced using the open-source tools RANDI2 and OpenClinica. Conclusion: The advantage of flexible, independent development of the EDCS and RS is shown, as the two tools have very different feature sets. In our opinion, the combination of both systems via defined interfaces fulfills the requirements of randomization and electronic data capture and is feasible in practice. In addition, such a setting can reduce training costs and eliminate error-prone duplicate data entry.
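As a concrete illustration of option (iii), the sketch below shows one way an EDC system might request a treatment arm from a separate randomization service over a defined REST interface, so that the stratification data are entered only once. The endpoint, payload, and field names are assumptions for illustration and do not reflect the actual RANDI2/OpenClinica interface.

```python
# Minimal sketch of option (iii), two separate systems communicating via a
# defined interface: the EDC system requests a treatment arm from the
# randomization service, passing the stratification factors once so they are
# not entered twice. Endpoint, payload, and field names are hypothetical.
import requests

def randomize_subject(subject_id: str, strata: dict) -> str:
    """Ask the randomization service for an arm assignment."""
    resp = requests.post(
        "https://rs.example.org/api/trials/TRIAL-01/randomize",  # hypothetical
        json={"subject_id": subject_id, "strata": strata},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["arm"]

# The EDC system records the assignment alongside the eCRF data, so the
# stratification variables live in exactly one place.
arm = randomize_subject("S-1001", {"centre": "Mannheim", "sex": "F"})
print("Assigned arm:", arm)
```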


Author(s):  
Poovizhi. M ◽  
Raja. G

Using cloud storage, users can remotely store their data and enjoy on-demand, high-quality applications and services from a shared pool of configurable computing resources, without the burden of local data storage and maintenance. However, the fact that users no longer have physical possession of the outsourced data makes data integrity protection in cloud computing a formidable task, especially for users with constrained computing resources. From the users' perspective, including both individuals and IT systems, storing data remotely in the cloud in a flexible, on-demand manner brings tempting benefits: relief from the burden of storage management, universal data access independent of geographical location, and avoidance of capital expenditure on hardware, software, and personnel maintenance. To securely introduce an effective sanitizer and third-party auditor (TPA), two fundamental requirements have to be met: 1) the TPA should be able to audit the cloud data storage efficiently without demanding a local copy of the data and without introducing additional online burden for the cloud user; 2) the third-party auditing process should introduce no new vulnerabilities to user data privacy. In this project, we utilize and uniquely combine public auditing protocols with a double-encryption approach to achieve a privacy-preserving public cloud data auditing system that supports integrity checking without any leakage of data. To support efficient handling of multiple auditing tasks, we further explore online signature techniques to extend our main result to a multi-user setting, in which the TPA can perform multiple auditing tasks simultaneously. We implement a double-encryption algorithm to encrypt the data twice before it is stored on the cloud server in electronic health record applications.
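A minimal sketch of the double-encryption idea, assuming two independently held symmetric keys and using Fernet from the Python `cryptography` package purely for illustration; the paper's actual encryption scheme and auditing protocol are not reproduced here.

```python
# Minimal sketch: the record is encrypted twice with two independent symmetric
# keys before it is stored in the cloud, so neither the cloud server nor the
# auditor alone can read the plaintext. Fernet is used purely for illustration.
from cryptography.fernet import Fernet

owner_key = Fernet.generate_key()      # held by the data owner
sanitizer_key = Fernet.generate_key()  # hypothetically held by the sanitizer

record = b"patient-id: 42; haemoglobin: 13.1 g/dL"

inner = Fernet(owner_key).encrypt(record)     # first layer
outer = Fernet(sanitizer_key).encrypt(inner)  # second layer, stored in the cloud

# Decryption reverses the layers; the TPA only ever sees `outer` plus
# verification metadata, never the plaintext.
assert Fernet(owner_key).decrypt(Fernet(sanitizer_key).decrypt(outer)) == record
```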


2021 ◽  
Vol 2021 ◽  
pp. 1-10
Author(s):  
Jin Li ◽  
Songqi Wu ◽  
Yundan Yang ◽  
Fenghui Duan ◽  
Hui Lu ◽  
...  

In the process of sharing data, the costless replication of electric energy data leads to the problems of uncontrolled data and difficult third-party access verification. This paper proposes a controlled sharing mechanism for data based on a consortium blockchain. The range of data flow is controlled by the channel-based data isolation mechanism of the consortium blockchain: a data storage consortium chain is constructed to achieve trusted data storage, and attribute-based encryption is combined with it to provide data access control that meets the demands of fine-grained access control and secure sharing; a data flow transfer ledger is built to record the life cycle of the original data and to effectively record the data transfer process of each data controller. Taking electric energy data sharing as an example application scenario, the scheme is designed and simulated on Linux with Hyperledger Fabric. Experimental results verify that the mechanism can effectively control the scope of access to electric energy data and keep the data under the owner's control.
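To make the data-flow transfer ledger and access-control ideas concrete, the sketch below shows a hash-chained transfer record and a simple attribute check in plain Python. It illustrates the concepts only; it is not Hyperledger Fabric chaincode and not a real attribute-based encryption scheme.

```python
# Simplified sketch: each transfer of an energy-data asset is appended as a
# hash-chained record, and access is granted only when the requester's
# attributes satisfy the asset's policy.
import hashlib
import json
import time

ledger = []  # append-only list standing in for the consortium chain

def append_transfer(asset_id: str, from_org: str, to_org: str) -> dict:
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {
        "asset_id": asset_id,
        "from": from_org,
        "to": to_org,
        "timestamp": time.time(),
        "prev_hash": prev_hash,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    ledger.append(entry)
    return entry

def may_access(requester_attrs: set, policy: set) -> bool:
    """Grant access only if the requester holds every attribute in the policy."""
    return policy.issubset(requester_attrs)

append_transfer("meter-0042", "GridCo", "ResearchLab")
print(may_access({"role:analyst", "org:ResearchLab"}, {"role:analyst"}))  # True
```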


10.2196/18580 ◽  
2020 ◽  
Vol 22 (8) ◽  
pp. e18580 ◽  
Author(s):  
Caleb J Ruth ◽  
Samantha Lee Huey ◽  
Jesse T Krisher ◽  
Amy Fothergill ◽  
Bryan M Gannon ◽  
...  

Background: When we were unable to identify an electronic data capture (EDC) package that supported our requirements for clinical research in resource-limited regions, we set out to build our own reusable EDC framework. We needed to capture data when offline, synchronize data on demand, and enforce strict eligibility requirements and complex longitudinal protocols. Based on previous experience, the geographical areas in which we conduct our research often have unreliable, slow internet access that would make web-based EDC platforms impractical. We were unwilling to fall back on paper-based data capture as we wanted other benefits of EDC. Therefore, we decided to build our own reusable software platform. In this paper, we describe our customizable EDC framework and highlight how we have used it in our ongoing surveillance programs, clinic-based cross-sectional studies, and randomized controlled trials (RCTs) in various settings in India and Ecuador. Objective: This paper describes the creation of a mobile framework to support complex clinical research protocols in a variety of settings including clinical, surveillance, and RCTs. Methods: We developed ConnEDCt, a mobile EDC framework for iOS devices and personal computers, using Claris FileMaker software for electronic data capture and data storage. Results: ConnEDCt was tested in the field in our clinical, surveillance, and clinical trial research contexts in India and Ecuador and continuously refined for ease of use and optimization, including specific user roles; simultaneous synchronization across multiple locations; complex randomization schemes and informed consent processes; and collecting diverse types of data (laboratory, growth measurements, sociodemographic, health history, dietary recall and feeding practices, environmental exposures, and biological specimen collection). Conclusions: ConnEDCt is customizable, with regulatory-compliant security, data synchronization, and other useful features for data collection in a variety of settings and study designs. Furthermore, ConnEDCt is user friendly and lowers the risk of data entry errors because of real-time error checking and protocol enforcement.
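ConnEDCt itself is implemented in Claris FileMaker, so the sketch below only illustrates, in Python, the kind of real-time eligibility and range checking the abstract describes; the field names and limits are invented for the example.

```python
# Illustrative sketch of protocol enforcement at data entry: return a list of
# violations so the user can correct them immediately. Field names and limits
# are hypothetical, not ConnEDCt's actual rules.
from datetime import date

def check_enrolment(record: dict) -> list[str]:
    """Return a list of protocol violations; an empty list means eligible."""
    errors = []
    age = (date.today() - record["dob"]).days / 365.25
    if not (18 <= age <= 49):
        errors.append("age outside 18-49 eligibility window")
    if record["haemoglobin_g_dl"] is None:
        errors.append("baseline haemoglobin missing")
    elif not (4.0 <= record["haemoglobin_g_dl"] <= 20.0):
        errors.append("haemoglobin outside plausible range")
    if not record.get("consent_signed"):
        errors.append("informed consent not recorded")
    return errors

print(check_enrolment({"dob": date(1995, 3, 2),
                       "haemoglobin_g_dl": 11.8,
                       "consent_signed": True}))   # -> []
```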


2016 ◽  
Vol 24 (2) ◽  
pp. 398-402 ◽  
Author(s):  
Kavishwar B Wagholikar ◽  
Joshua C Mandel ◽  
Jeffery G Klann ◽  
Nich Wattanasin ◽  
Michael Mendis ◽  
...  

We have developed an interface to serve patient data from Informatics for Integrating Biology and the Bedside (i2b2) repositories in the Fast Healthcare Interoperability Resources (FHIR) format, referred to as a SMART-on-FHIR cell. The cell serves FHIR resources on a per-patient basis, and supports the “substitutable” modular third-party applications (SMART) OAuth2 specification for authorization of client applications. It is implemented as an i2b2 server plug-in, consisting of 6 modules: authentication, REST, i2b2-to-FHIR converter, resource enrichment, query engine, and cache. The source code is freely available as open source. We tested the cell by accessing resources from a test i2b2 installation, demonstrating that a SMART app can be launched from the cell that accesses patient data stored in i2b2. We successfully retrieved demographics, medications, labs, and diagnoses for test patients. The SMART-on-FHIR cell will enable i2b2 sites to provide simplified but secure data access in FHIR format, and will spur innovation and interoperability. Further, it transforms i2b2 into an apps platform.
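Assuming an OAuth2 access token has already been obtained through the SMART launch flow, a client request to such a cell might look like the sketch below. The base URL, patient identifier, and resource names are placeholders, not the actual i2b2 demonstration endpoint.

```python
# Minimal sketch of a client hitting a SMART-on-FHIR endpoint once an OAuth2
# access token is in hand. URL, token, and identifiers are placeholders.
import requests

FHIR_BASE = "https://i2b2.example.edu/fhir"   # hypothetical cell endpoint
TOKEN = "<oauth2-access-token>"               # from the SMART authorization flow

def get_resource(path: str) -> dict:
    resp = requests.get(
        f"{FHIR_BASE}/{path}",
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Accept": "application/fhir+json"},
        timeout=15,
    )
    resp.raise_for_status()
    return resp.json()

patient = get_resource("Patient/1000000001")                  # demographics
meds = get_resource("MedicationRequest?patient=1000000001")   # medications bundle
print(patient.get("gender"), len(meds.get("entry", [])))
```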


Author(s):  
David Meredith ◽  
Stephen Crouch ◽  
Gerson Galang ◽  
Ming Jiang ◽  
Hung Nguyen ◽  
...  

Data Transfer Service (DTS) is an open-source project that is developing a document-centric message model for describing a bulk data transfer activity, with an accompanying set of loosely coupled and platform-independent components for brokering the transfer of data between a wide range of (potentially incompatible) storage resources as scheduled, fault-tolerant batch jobs. The architecture scales from small embedded deployments on a single computer to large distributed deployments through an expandable ‘worker-node pool’ controlled through message-orientated middleware. Data access and transfer efficiency are maximized through the strategic placement of worker nodes at or between particular data sources/sinks. The design is inherently asynchronous, and, when third-party transfer is not available, it side-steps the bandwidth, concurrency and scalability limitations associated with buffering bytes directly through intermediary client applications. It aims to address geographical–topological deployment concerns by allowing service hosting to be either centralized (as part of a shared service) or confined to a single institution or domain. Established design patterns and open-source components are coupled with a proposal for a document-centric and open-standards-based messaging protocol. As part of the development of the message protocol, a bulk data copy activity document is proposed for the first time.
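The sketch below illustrates what a document-centric bulk data copy activity message might look like before it is handed to message-oriented middleware for routing to a worker-node pool; the field names and structure are invented for illustration and do not reproduce the DTS schema.

```python
# Illustrative sketch of a "bulk data copy activity" document describing a
# transfer between two potentially incompatible storage resources. All field
# names, protocols, and URIs are hypothetical examples.
import json
import uuid

copy_activity = {
    "activity_id": str(uuid.uuid4()),
    "submitted_by": "researcher@example.org",
    "source": {"protocol": "gridftp",
               "uri": "gsiftp://archive.example.org/data/run42/"},
    "sink":   {"protocol": "sftp",
               "uri": "sftp://store.example.ac.uk/incoming/run42/"},
    "options": {"retries": 3, "checksum": "md5", "overwrite": False},
}

# A broker would route this JSON document to a suitably placed worker node;
# the worker then streams data source-to-sink rather than through the client.
message = json.dumps(copy_activity, indent=2)
print(message)
```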


2014 ◽  
Vol 4 (2) ◽  
pp. 140-148 ◽  
Author(s):  
Jana Kvíderová

The increasing number of observations and samples has led to the development of systems for data storage and management. In this paper, the design of and experience with the Sample database (SampleDTB) used in the Centre for Polar Ecology, Faculty of Science, University of South Bohemia, České Budějovice, Czech Republic, are presented. The SampleDTB was designed for microbiological, phycological or hydrobiological data. The SampleDTB consists of data tables including defined lists of climatic zones, habitats, communities and taxa; specific queries for dataset determination and searches; forms for entering samples; and reports. The data tables contain detailed information on each site, its environment, and the types of habitats and communities present, including data on taxonomic diversity. The queries provide source data for reports or serve to search for a specific taxon, sample, etc. Forms are used primarily for data entry or modification. The reports provide summaries and charts for export, either for the whole data set or for specific datasets. Data management resulted in a system of sample numbering, site specification, and photograph storage. Possible future development will focus on online data access, calculation of biovolume and diversity indices, laboratory sample processing, and connection to a culture collection database.
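As a rough illustration of the relational structure described (sites, samples, taxa, and controlled lists), the sketch below builds a toy schema with Python's sqlite3 module; the table and column names are invented and do not reproduce the actual SampleDTB design.

```python
# Toy schema illustrating the kind of structure the SampleDTB describes:
# sites with controlled climatic-zone values, samples linked to sites, and
# taxa linked to samples. Names and values are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE site (
    site_id       INTEGER PRIMARY KEY,
    name          TEXT NOT NULL,
    latitude      REAL, longitude REAL,
    climatic_zone TEXT CHECK (climatic_zone IN ('polar','subpolar','temperate')),
    habitat       TEXT
);
CREATE TABLE sample (
    sample_id    INTEGER PRIMARY KEY,
    site_id      INTEGER REFERENCES site(site_id),
    collected_on TEXT,          -- ISO date
    community    TEXT,
    photo_path   TEXT           -- link into the photograph storage system
);
CREATE TABLE sample_taxon (
    sample_id    INTEGER REFERENCES sample(sample_id),
    taxon        TEXT NOT NULL
);
""")

con.execute("INSERT INTO site VALUES (1, 'Petuniabukta shore', 78.7, 16.5, 'polar', 'littoral')")
con.execute("INSERT INTO sample VALUES (1, 1, '2013-07-15', 'microbial mat', 'photos/2013/0001.jpg')")
con.execute("INSERT INTO sample_taxon VALUES (1, 'Nostoc commune')")

# A "query" in SampleDTB terms: all samples containing a given taxon.
for row in con.execute(
        "SELECT s.sample_id, si.name FROM sample s "
        "JOIN site si USING (site_id) "
        "JOIN sample_taxon t USING (sample_id) WHERE t.taxon = ?",
        ("Nostoc commune",)):
    print(row)
```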


Author(s):  
Deepa S. Deulkar

Cloud-based data storage is becoming very popular, as one can easily download any document at any time and from anywhere. However, document security is very important in cloud storage, because the cloud is a third-party server that can be accessed by its administrators. Much literature is available on improving document security in the cloud, most of it proposing various data encryption techniques. However, simply encrypting data (e.g., via AES) cannot fully address the practical needs of data management. In addition, effective access control over download requests also needs to be considered, so that Economic Denial of Sustainability (EDoS) attacks cannot be launched to prevent users from enjoying the service. In this project, we consider dual access control in the context of cloud-based storage, in the sense that we design a control mechanism over both data access and download requests without loss of security or efficiency. Along with dual access control, we also address document security by using a modified AES algorithm.
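A minimal sketch of the two ideas in combination: the stored document is encrypted with AES (standard AES-GCM here, since the "modified AES" is not specified in the abstract), and download requests are gated by an authorization check plus a simple per-user rate limit as one way of blunting EDoS-style abuse. All names and thresholds are illustrative.

```python
# Sketch only: AES-GCM encryption of the stored document plus a download
# gate combining an access check with a per-user rate limit.
import os
import time
from collections import defaultdict
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"confidential report", None)

_downloads = defaultdict(list)  # user -> timestamps of recent requests

def handle_download(user: str, authorized_users: set, limit_per_min: int = 5):
    now = time.time()
    _downloads[user] = [t for t in _downloads[user] if now - t < 60]
    if user not in authorized_users:
        raise PermissionError("access denied")
    if len(_downloads[user]) >= limit_per_min:
        raise PermissionError("rate limit exceeded")
    _downloads[user].append(now)
    # Only ciphertext leaves the server; the key is delivered separately
    # (assumption made for this sketch).
    return nonce + ciphertext

blob = handle_download("alice", {"alice", "bob"})
print(AESGCM(key).decrypt(blob[:12], blob[12:], None))  # b'confidential report'
```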

