GP Benchmark: Engineering a Crowd-Sourcing Platform for Real-Time Understanding of Personality and Cognitive Biases in Clinical Error

Author(s):  
Wesley Hutchinson ◽  
Sumi Helal ◽  
Christopher Bull
2014 ◽  
Vol 23 (01) ◽  
pp. 27-35 ◽  
Author(s):  
S. de Lusignan ◽  
S-T. Liaw ◽  
C. Kuziemsky ◽  
F. Mold ◽  
P. Krause ◽  
...  

Summary Background: Generally, the benefits and risks of vaccines can be determined from studies carried out as part of regulatory compliance, followed by surveillance of routine data; however, some rarer and longer-term events require new methods. Big data generated by increasingly affordable personalised computing and by pervasive computing devices is growing rapidly, and low-cost, high-volume cloud computing makes processing these data inexpensive. Objective: To describe how big data and related analytical methods might be applied to assess the benefits and risks of vaccines. Method: We reviewed the literature on the use of big data to improve health, applied to generic vaccine use cases that illustrate the benefits and risks of vaccination. We defined a use case as the interaction between a user and an information system to achieve a goal. We used flu vaccination and pre-school childhood immunisation as exemplars. Results: We reviewed three big data use cases relevant to assessing vaccine benefits and risks: (i) big data processing using crowd-sourcing, distributed big data processing, and predictive analytics; (ii) data integration from heterogeneous big data sources, e.g. the increasing range of devices in the “internet of things”; and (iii) real-time monitoring, both of epidemics directly and of vaccine effects via social media and other data sources. Conclusions: Big data raises new ethical dilemmas, though its analysis methods can bring complementary real-time capabilities for monitoring epidemics and assessing the vaccine benefit-risk balance.


2020 ◽  
Author(s):  
Rodrigo Carbajales ◽  
Massimiliano Iurcev ◽  
Paolo Diviacco

<p>Low-cost sensors and crowd-sourced data could revolutionise the way air pollution measurements are collected by providing high-density geolocated data. So far, data have mostly been collected by dedicated fixed-position monitoring stations. These rely on high-quality instrumentation, well-established practices and well-trained personnel, which means that, owing to its cost, this paradigm limits the resolution and geographic extent of sampling in an area.</p><p>The combination of low-cost sensors and volunteer-based or opportunistic data acquisition can instead turn the cost issue into an advantage. This approach, however, introduces other limitations, since low-cost sensors provide less reliable data and crowd-sourced acquisition is subject to data gaps in space and time.</p><p>To overcome these issues, redundant data from multiple platforms have to be made available. On one hand, this allows statistics to be applied to identify and remove anomalous values; on the other, when multiple platforms are used, the chances of better coverage and more reliable data increase.</p><p>To implement this approach, OGS developed a full suite of tools, named COCAL, that covers the full path from acquisition, transmission, storage and integration through to real-time visualization of the crowd-sourced data.</p><p>Low-cost sensors for suspended particulate matter of sizes 2.5 and 10 µm, together with atmospheric pressure, humidity and temperature sensors, have been combined with a GPS positioning and transmission unit (able to use GSM, WiFi or LoRaWAN) in a black box that can be attached to any vehicle travelling in an area.
This way large areas can be sampled at high geographic resolution.</p><p>Atmospheric data are collected in an InfluxDB database, which allows easy integration with TheThingsNetwork for LoRaWAN network management and directly with GSM and WiFi connections. Public users are provided with a real-time web interface based on OpenLayers for map visualization. Server-side processing and conversion scripts generate both filtered and aggregate data by computing averages on a spatial and temporal grid. Finally, automatic interpolation techniques such as Inverse Distance Weighting or Natural Neighbours can provide detailed online maps with contouring and boundary definition. All products are available in near real-time through OGC-compliant web services, suited for easy integration with other repositories and services.</p>
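The Inverse Distance Weighting mentioned in the abstract can be sketched in a few lines. This is a generic illustration, not COCAL's implementation; the sample coordinates and PM2.5 values are hypothetical.

```python
import math

def idw(points, query, power=2):
    """Inverse Distance Weighting: estimate a value at `query`
    from (x, y, value) samples, weighting each by 1/d**power."""
    num = den = 0.0
    for x, y, v in points:
        d = math.hypot(x - query[0], y - query[1])
        if d == 0.0:
            return v  # query coincides with a sample point
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den

# Hypothetical PM2.5 readings (x, y, value in ug/m3) from moving sensors
samples = [(0.0, 0.0, 12.0), (1.0, 0.0, 20.0), (0.0, 1.0, 16.0)]
estimate = idw(samples, (0.5, 0.5))  # weighted blend of nearby readings
```

Evaluating `idw` on a regular grid of query points yields the kind of contoured map the web interface serves.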


2017 ◽  
Vol 23 (2) ◽  
pp. 441-470
Author(s):  
Cristian Babau ◽  
Marius Marcu ◽  
Mircea Gabriel Tihu ◽  
Daniel George Telbis ◽  
Vladimir Ioan Creţu

Traffic optimization has become vital for the world we live in. People need to get from a starting point to a destination as fast and as safely as possible. Traffic congestion plays a key role in people's frustration and results in lost time, reduced productivity and wasted resources. In our study we address these issues by proposing a real-time road traffic planning system based on mobile context and crowd-sourcing efforts. The first step toward this goal is real-time traffic characterization using data collected from the mobile sensors of drivers, pedestrians, cyclists, passengers, etc. We have started developing a data collection and analysis system composed of a mobile application that collects user context data and a Web application to view and analyze the data. This system will eventually give users an automatically optimized route to their destination and predict their travel route based on live traffic conditions and historical data.
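One common way to turn live traffic data into an optimized route, as described above, is shortest-path search over a road graph whose edge weights are travel times derived from crowd-sourced speeds. The graph, link lengths and speeds below are invented for illustration; the paper does not specify its routing algorithm.

```python
import heapq

# Hypothetical road graph: each edge is (next node, length in m,
# live speed in m/s from crowd-sourced readings).
edges = {
    "A": [("B", 300.0, 8.0), ("C", 500.0, 14.0)],
    "B": [("D", 400.0, 4.0)],  # congested link: low live speed
    "C": [("D", 300.0, 13.0)],
    "D": [],
}

def fastest_route(src, dst):
    """Dijkstra over live travel times; returns (seconds, path)."""
    pq = [(0.0, src, [src])]
    done = set()
    while pq:
        t, node, path = heapq.heappop(pq)
        if node == dst:
            return t, path
        if node in done:
            continue
        done.add(node)
        for nxt, length, speed in edges[node]:
            if nxt not in done:
                heapq.heappush(pq, (t + length / speed, nxt, path + [nxt]))
    return float("inf"), []
```

Re-running the search as fresh speed readings arrive is what makes the route "live": the congested A-B-D route loses to A-C-D once B-D's measured speed drops.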


2019 ◽  
Vol 3 (1) ◽  
pp. e000418 ◽  
Author(s):  
Emma C Anderson ◽  
Joanna May Kesten ◽  
Isabel Lane ◽  
Alastair D Hay ◽  
Timothy Moss ◽  
...  

Aim: To investigate primary care clinicians’ views of a prototype locally relevant, real-time viral surveillance system to assist diagnostic decision-making and antibiotic prescribing for paediatric respiratory tract infections (RTIs). Clinicians’ perspectives on the content, anticipated use and impact were explored to inform intervention development. Background: Children with RTIs are overprescribed antibiotics. Pressures on primary care and diagnostic uncertainty can bias decisions towards prescribing. We hypothesise that real-time paediatric RTI surveillance data could reduce diagnostic uncertainty and help reduce unnecessary antibiotic prescribing. Methodology: Semistructured one-to-one interviews with 21 clinicians from a range of urban general practitioner surgeries explored the clinical context and views of the prototype system. Transcripts were analysed using thematic analysis. Results: Though clinicians self-identified as rational (not over-) prescribers, cognitive biases influenced antibiotic prescribing decisions. Clinicians sought to avoid ‘anticipated regret’ around not prescribing for a child who then deteriorated. Clinicians were not aware of formal infection surveillance information sources (tending to assume many viruses are around), and perceived the information as novel and potentially useful. Perceptions of the surveillance information as presented included: not relevant to decision-making/management; useful to confirm decisions post hoc; and increasing the risk of missing sick children. Clinicians were wary of using population-level data to influence individual patient decisions, and preferred threat (high-risk) information identified by surveillance over reassuring information about viral RTIs. Conclusions: More work is needed to develop a surveillance intervention if it is to beneficially influence decision-making and antibiotic prescribing in primary care. Key challenges for developing interventions are how to address cognitive biases and how to communicate reassuring information to risk-oriented clinicians.


1979 ◽  
Vol 44 ◽  
pp. 41-47
Author(s):  
Donald A. Landman

This paper describes some recent results of our quiescent prominence spectrometry program at the Mees Solar Observatory on Haleakala. The observations were made with the 25 cm coronagraph/coudé spectrograph system using a silicon vidicon detector. This detector consists of 500 contiguous channels covering approximately 6 or 80 Å, depending on the grating used. The instrument is interfaced to the Observatory’s PDP 11/45 computer system, and has the important advantages of wide spectral response, linearity and signal-averaging with real-time display. Its principal drawback is the relatively small target size. For the present work, the aperture was about 3″ × 5″. Absolute intensity calibrations were made by measuring quiet regions near sun center.
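The signal-averaging the detector performs can be illustrated with a toy model: summing N repeated scans grows the coherent signal N-fold while random noise grows only as sqrt(N), so the signal-to-noise ratio improves by sqrt(N). The 500-channel count matches the detector described above, but the line position, intensities and noise level here are purely illustrative.

```python
import random

random.seed(0)
CHANNELS = 500  # contiguous detector channels, as in the abstract

# Illustrative "true" spectrum: a flat continuum with one emission line.
true_line = [100.0 if 240 <= i <= 260 else 10.0 for i in range(CHANNELS)]

def scan():
    """One noisy readout of all channels (Gaussian noise, sigma=20)."""
    return [v + random.gauss(0.0, 20.0) for v in true_line]

def average(n_scans):
    """Accumulate n_scans readouts and return the per-channel mean;
    residual noise falls to sigma / sqrt(n_scans)."""
    total = [0.0] * CHANNELS
    for _ in range(n_scans):
        for i, v in enumerate(scan()):
            total[i] += v
    return [t / n_scans for t in total]
```

After 64 scans the per-channel noise drops from 20 to about 2.5 counts, leaving the emission line clearly separated from the continuum.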

