Introduction of Electronic Referral from Community Associated with More Timely Review by Secondary Services

2011 ◽  
Vol 02 (04) ◽  
pp. 546-564 ◽  
Author(s):  
S. White ◽  
K.J. Day ◽  
Y. Gu ◽  
M. Pollock ◽  
J. Warren

Summary
Background: Electronic referral (eReferral) from the community into public secondary healthcare services was introduced to 30 referring general medical practices and 28 hospital-based services in late 2007.
Objectives: To measure the extent of uptake of eReferral and its association with changes in referral processing.
Methods: Analysis of transactional data from the eReferral message service and the patient information management system of the affected hospital; interviews of clinical, operational and management stakeholders.
Results: eReferral use rose steadily to 1000 transactions per month in 2008, thereafter showing moderate growth to 1200 per month in 2010. The rate of eReferral from the community in 2010 is estimated at 56% of total referrals to the hospital from general practice, and at 71% of referrals from practices having made at least one referral electronically. Referral latency from letter date to hospital triage improved significantly from 2007 to 2009 (p<0.001), from a paper-referral median of 8 days (inter-quartile range, IQR: 4–14) in 2007 to an eReferral median of 5 days (IQR: 2–9) and a paper-referral median of 6 days (IQR: 2–12) in 2009. Specialists upgraded the referrer-assigned eReferral priority in 19.2% of cases and downgraded it in 18.6%. Clinical users appreciate the improved visibility of referrals (status and content access); however, both general practitioners and specialists point out system usability issues.
Discussion: With eReferrals, a referral's status can be checked, and its content read, by any authorized user at any time. The period of eReferral uptake was associated with a significant speed-up in referral processing without changes in staffing levels. The eReferral system provides a foundation for further innovation at the community-secondary interface, such as electronic decision support and shared care planning systems.
Conclusions: We observed substantial, rapid, voluntary uptake of eReferrals associated with faster, more reliable and more transparent referral processing.

2021 ◽  
Vol 79 (1) ◽  
Author(s):  
Anam Shahil Feroz ◽  
Adeel Khoja ◽  
Sarah Saleem

Abstract
Background: Community health workers (CHWs) are well positioned to play a pivotal role in fighting the pandemic at the community level. The Covid-19 outbreak has caused considerable stress and anxiety among CHWs, who are expected to perform pandemic-related tasks alongside the delivery of essential healthcare services. In addition, movement restrictions, lockdowns, social distancing, and a lack of protective gear have significantly affected CHWs' routine workflow and performance. To optimize CHWs' functioning, there is renewed interest in supporting CHWs with digital technology to ensure an appropriate pandemic response.
Discussion: The current situation has necessitated the use of digital tools for the delivery of Covid-19-related tasks and other essential healthcare services at the community level. Evidence suggests a significant digital transformation to support CHWs in these critical times, including remote data collection and health assessments, the use of short message service (SMS) and voice messages for health education, the use of digital megaphones to encourage behavior change, and digital contact tracing. A few low- and middle-income countries (LMICs), such as Uganda and Ethiopia, have succeeded in operationalizing digital tools to optimize CHWs' functioning for Covid-19 tasks and other essential health services.
Conclusion: In most LMICs, however, challenges remain concerning the feasibility and acceptability of digital tools for CHWs during the Covid-19 pandemic. In most cases, CHWs find it difficult to adopt and use digital health solutions because of a lack of training on new digital tools, weak technical support, internet connectivity issues, and other administrative challenges. To address these challenges, engaging governments will be essential for training CHWs on user-friendly digital health solutions that improve CHWs' routine workflow during the Covid-19 pandemic.


2020 ◽  
Vol 3 (3) ◽  
pp. 33-42
Author(s):  
Ali M. Hasan ◽  
Abdulkareem A. Kadhim

The smart city has transformed energy, environmental, and healthcare services, and it continuously provides new services to all citizens. This paper concerns the design and implementation of a smart meter system as the core of the smart grid in a smart city. A system is proposed in which the electricity supply is monitored by measuring its related parameters (voltage, current, power, energy consumption, and consumption bill) and issuing Short Message Service (SMS) notifications of the consumption. The designed system uses the PZEM-004T sensor, Arduino Mega, Raspberry Pi, and the Node-RED platform. The data related to the measured parameters are successfully transmitted to the data center using the Message Queuing Telemetry Transport (MQTT) protocol, stored in a MySQL database by a Python program, and displayed on the Node-RED platform. Testing and verification of the system under different scenarios show that successful and accurate operation of the system components is achieved. Finally, the designed system can be extended to cover large geographical areas and can be modified to support pre-paid arrangements, in an effort to help reduce electricity consumption in Iraq amid the country's continuing electricity crisis.
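The measurement-to-notification path described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the sensor values, tariff, and field names are invented, and in the real system the JSON payload would be published over MQTT to the data center and mirrored to an SMS notification.

```python
import json

# Sketch of packaging one smart-meter reading for the data center.
# Values and field names are hypothetical; a PZEM-004T would supply the
# voltage/current readings, and the payload would travel over MQTT.
def make_meter_payload(voltage_v, current_a, tariff_per_kwh, hours):
    power_w = voltage_v * current_a          # assumes unity power factor
    energy_kwh = power_w * hours / 1000.0    # energy over the sampling window
    bill = energy_kwh * tariff_per_kwh       # consumption bill for the window
    return json.dumps({
        "voltage_v": voltage_v,
        "current_a": current_a,
        "power_w": round(power_w, 2),
        "energy_kwh": round(energy_kwh, 3),
        "bill": round(bill, 2),
    })

payload = make_meter_payload(220.0, 5.0, 0.10, 1.0)  # one hour at 1.1 kW
```

In the deployed system, a string like `payload` would be the MQTT message body, inserted into MySQL on arrival and rendered on the Node-RED dashboard.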


Author(s):  
Jay Prakash ◽  
T.V. Vijay Kumar

In today's world, business transactional data has become a critical part of all business-related decisions. To this end, complex analytical queries are run on transactional data to extract the information relevant to decision making. These complex queries take a long time to execute, as the data is spread across multiple disparate locations. Materializing views in the data warehouse can speed up the processing of these complex analytical queries. Materializing all possible views is infeasible due to storage space constraints and view maintenance costs. Hence, a subset of relevant views needs to be selected for materialization that reduces the response time of analytical queries. Optimal selection of a subset of views is known to be an NP-complete problem. In this article, a non-Pareto-based genetic algorithm is proposed that selects the Top-K views for materialization from a multidimensional lattice. An experimental comparison of the proposed algorithm with the most fundamental view selection algorithm, HRUA, shows that the former performs comparatively better than the latter. Thus, materializing the views selected by the proposed algorithm would improve the query response time of analytical queries and thereby facilitate decision making.
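As background for the comparison, the greedy HRUA-style baseline can be sketched as repeatedly materializing the view whose addition most reduces total query cost over the lattice. The view sizes and dependence relation below are toy values invented for illustration, not taken from the article's experiments.

```python
# Greedy view selection in the spirit of the HRUA baseline: at each step,
# materialize the view with the largest benefit, i.e. the total cost
# reduction over all views it can answer. Sizes are toy row counts.
def greedy_select(sizes, descendants, top, k):
    materialized = {top}                       # the top view is always available
    cheapest = {w: sizes[top] for w in sizes}  # current cost to answer each view
    for _ in range(k):
        def benefit(v):
            return sum(max(cheapest[w] - sizes[v], 0) for w in descendants[v])
        best = max((v for v in sizes if v not in materialized), key=benefit)
        materialized.add(best)
        for w in descendants[best]:
            cheapest[w] = min(cheapest[w], sizes[best])
    return materialized - {top}

# Toy 3-dimensional lattice: view name = grouped dimensions, value = size.
sizes = {"abc": 100, "ab": 50, "ac": 75, "bc": 80, "a": 20, "b": 30, "c": 40}
descendants = {
    "abc": list(sizes),
    "ab": ["ab", "a", "b"], "ac": ["ac", "a", "c"], "bc": ["bc", "b", "c"],
    "a": ["a"], "b": ["b"], "c": ["c"],
}
picked = greedy_select(sizes, descendants, "abc", 2)  # Top-K with K = 2
```

The proposed genetic algorithm searches the same space of Top-K subsets, but evaluates whole candidate subsets at once rather than building one greedily.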


2018 ◽  
Vol 17 (3) ◽  
pp. 439
Author(s):  
I Putu Adi Pradnyana Wibawa ◽  
IA Dwi Giriantari ◽  
Made Sudarma

The growth of technology drives data growth beyond the limits of conventional database management tools. A hospital information management system is one such system, posing a high-complexity problem related to its data load. Parallel computing is one of the techniques used in high-performance computing (HPC). This research focuses on the design of parallel computing using the message-passing model for the patient-search process in a hospital information management system. The design distributes the computation across a number of CPUs (master and slaves); the configuration of the CPUs follows the stages of Foster's methodology: partitioning, communication, agglomeration, and mapping. Testing compares data-processing time between the sequential and parallel versions, and the parallel design is evaluated using speedup and efficiency calculations. The results of designing and testing parallel computing with the message-passing model show that patient-data processing with the parallel program outperforms a single CPU running the sequential version. The speedup tests indicate an increase in processing speed with 3 CPUs, while efficiency peaks at 2 and 3 CPUs. The decline in speedup and efficiency observed with 7 CPUs is caused by the small amount of data relative to the number of CPUs. The conclusion is that increasing the number of CPUs involved in parallel processing is not proportional to the reduction in processing time: for a given volume of data, there is an ideal number of CPUs for the task.
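The speedup and efficiency metrics used in the evaluation can be sketched directly; the run times below are illustrative placeholders, not the paper's measurements.

```python
# Speedup and efficiency bookkeeping for sequential-vs-parallel tests:
# speedup = T_sequential / T_parallel, efficiency = speedup / #CPUs.
def speedup(t_sequential, t_parallel):
    return t_sequential / t_parallel

def efficiency(t_sequential, t_parallel, n_cpus):
    return speedup(t_sequential, t_parallel) / n_cpus

t_seq = 12.0                      # seconds on 1 CPU (hypothetical)
runs = {2: 6.0, 3: 4.8, 7: 4.5}   # CPUs -> parallel run time (hypothetical)
results = {n: (speedup(t_seq, t), efficiency(t_seq, t, n))
           for n, t in runs.items()}
```

With these toy numbers the pattern the paper reports reproduces itself: speedup still rises at 3 CPUs, but efficiency at 7 CPUs falls well below the 2- and 3-CPU values because the extra CPUs add communication overhead without enough data to share.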


Author(s):  
Brian Cross

A relatively new entry in the field of microscopy is the Scanning X-Ray Fluorescence Microscope (SXRFM). Using this type of instrument (e.g. the Kevex Omicron X-ray Microprobe), one can obtain multiple elemental x-ray images from the analysis of heterogeneous materials. The SXRFM obtains images by collimating an x-ray beam (e.g. 100 μm diameter) and then scanning the sample with a high-speed x-y stage. To speed up image acquisition, data is acquired "on-the-fly" by slew-scanning the stage along the x-axis, like a TV or SEM scan. To reduce the overhead from "fly-back," the images can be acquired by bi-directional scanning of the x-axis, which results in very little overhead for re-positioning the sample stage. The image acquisition rate is dominated by the x-ray acquisition rate; therefore, the total x-ray image acquisition rate of the SXRFM is very comparable to that of an SEM. Although the x-ray spatial resolution of the SXRFM is worse than that of an SEM (say 100 vs. 2 μm), there are several other advantages.


Author(s):  
A. G. Jackson ◽  
M. Rowe

Diffraction intensities from intermetallic compounds are, in the kinematic approximation, proportional to the scattering amplitude from the element doing the scattering. More detailed calculations have shown that site symmetry and occupation by various atom species also affect the intensity in a diffracted beam [1]. Hence, by measuring the intensities of beams, or their ratios, the occupancy can be estimated. Measurement of the intensity values also allows structure calculations to be made to determine the spatial distribution of the potentials doing the scattering. Thermal effects are also present as a background contribution. Inelastic effects such as loss or absorption/excitation complicate the intensity behavior, and dynamical theory is required to estimate the intensity value.

The dynamic range of currents in diffracted beams can be 10^4 or 10^5:1. Hence, detecting such information requires a means of collecting the intensity over a signal-to-noise range beyond that obtainable with a single film plate, which has a S/N of about 10^3:1. Although such a collection system is not currently available, a simple system consisting of instrumentation on an existing STEM can be used as a proof of concept; it has a S/N of about 255:1, limited by the 8-bit pixel attributes used in the electronics. Use of 24-bit pixel attributes would easily allow the desired noise range to be attained in the processing instrumentation. The S/N of the scintillator used by the photoelectron sensor is about 10^6:1, well beyond the S/N goal. The trade-off that must be made is the time for acquiring the signal: the pattern can be obtained in seconds using film plates, compared to 10 to 20 minutes for a pattern acquired using the digital scan. Parallel acquisition would, of course, speed up this process immensely.
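As a sanity check on the quoted figures, the attainable S/N ceiling of an n-bit integer detector can be computed directly. This is a trivial back-of-envelope sketch; the variable names are ours.

```python
# S/N ceiling of an n-bit integer detector: the largest representable
# count relative to a count of 1, i.e. (2**n - 1):1.
def dynamic_range(bits):
    return 2 ** bits - 1

dr_film = 10 ** 3             # single film plate, ~10^3:1 (from the text)
dr_8bit = dynamic_range(8)    # 255:1, as quoted for 8-bit pixel attributes
dr_24bit = dynamic_range(24)  # ~1.7e7:1, above the 10^4-10^5:1 beam range
```

This confirms the text's point: 8 bits (255:1) falls short of even a film plate, while 24 bits comfortably exceeds the 10^4 to 10^5:1 dynamic range of the diffracted beams.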


2013 ◽  
Vol 3 (2) ◽  
pp. 35-40
Author(s):  
Carol Dudding

Whether in our professional or private lives, we are all aware of the system-wide efforts to provide quality healthcare services while containing costs. Telemedicine as a method of service delivery has expanded as a result of changes in reimbursement and service delivery models. The growth and sustainability of telehealth within speech-language pathology and audiology, like any other service, depend on the ability to be reimbursed for services provided. Currently, reimbursement for services delivered via telehealth is variable and depends on numerous factors. An understanding of these factors, and a willingness to advocate for increased reimbursement, can bolster the success of practitioners interested in telehealth as a service delivery method.


2004 ◽  
Vol 63 (1) ◽  
pp. 17-29 ◽  
Author(s):  
Friedrich Wilkening ◽  
Claudia Martin

Children 6 and 10 years of age and adults were asked how fast a toy car had to be to catch up with another car, the latter moving with a constant speed throughout. The speed change was required either after half of the time (linear condition) or half of the distance (nonlinear condition), and responses were given either on a rating scale (judgment condition) or by actually producing the motion (action condition). In the linear condition, the data patterns for both judgments and actions were in accordance with the normative rule at all ages. This was not true for the nonlinear condition, where children’s and adults’ judgment and also children’s action patterns were linear, and only adults’ action patterns were in line with the nonlinearity principle. Discussing the reasons for the misconceptions and for the action-judgment dissociations, a claim is made for a new view on the development of children’s concepts of time and speed.


1986 ◽  
Vol 31 (6) ◽  
pp. 429-430
Author(s):  
Carlfred B. Broderick
