Transforming Biomedical Applications Through Smart Sensing and Artificial Intelligence

Author(s):  
Harini Akshaya T. J. ◽  
Suresh V. ◽  
Carmel Sobia M.

Electronic health records (EHRs) have been adopted in many countries, as they play a major role in healthcare systems: high-quality data can be obtained at very low cost. An EHR is a platform where data are stored digitally and users can access and exchange them securely. The main objective of this chapter is to summarize recent developments in wearable sensors integrated with internet of things (IoT) systems and their application to monitoring patients with chronic diseases and older people in their homes and communities. The records are transmitted by wireless communication devices through gateways and stored in a cloud computing environment.

2021 ◽  
Vol 9 (2) ◽  
pp. 229
Author(s):  
Georgy Mitrofanov ◽  
Nikita Goreyavchev ◽  
Roman Kushnarev

The emerging tasks of characterizing bottom sediments, including the evolution of the seabed, require a significant improvement in the quality of data and in the methods for processing them. Marine seismic data have traditionally been perceived as high quality compared to land data. However, quality is always a relative characteristic and is determined by the problem being solved. A detailed study of complex processes, of the interaction of waves with bottom sediments, and of seabed evolution over short time intervals (not millions of years) demands very high observational accuracy. When the research must also cover large areas, the questions of observation quality and of processing methods need to be substantially revisited. The article provides an example of data obtained during high-precision marine surveys spanning a wide frequency range, from hundreds of hertz to kilohertz. It is shown that these data, while visually of very high quality, exhibit wavelet variations at all analyzed frequencies, reaching tens of percent. Applying the method of factor decomposition in the spectral domain made it possible to significantly improve the quality of the data, reducing the variability of the wavelets several-fold.
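The idea behind factor decomposition in the spectral domain can be sketched on synthetic data: if each trace's log amplitude spectrum is modeled as a common wavelet term plus a per-trace factor, averaging separates the two and the factor can be removed. This is a minimal illustration only; the array sizes, frequency band, and variation level are assumptions, not the authors' survey data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "traces": a common wavelet spectrum scaled by per-trace factors
n_traces, n_freq = 32, 128
freqs = np.linspace(100.0, 1000.0, n_freq)            # Hz, illustrative band
base = np.exp(-((freqs - 400.0) / 250.0) ** 2)        # common wavelet amplitude spectrum
factors = 1.0 + 0.3 * rng.standard_normal(n_traces)   # ~tens-of-percent variation
spectra = (np.abs(factors[:, None]) * base[None, :]
           * np.exp(0.03 * rng.standard_normal((n_traces, n_freq))))

# Factor model in the log-spectral domain:
#   log|S_i(f)| = log|W(f)| + log a_i + noise
log_s = np.log(spectra)
log_wavelet = log_s.mean(axis=0)                      # common-wavelet estimate
log_factor = log_s.mean(axis=1) - log_wavelet.mean()  # per-trace factor estimate

# Remove the per-trace factor to equalize wavelets across traces
corrected = np.exp(log_s - log_factor[:, None])

var_before = spectra.std(axis=0).mean() / spectra.mean()
var_after = corrected.std(axis=0).mean() / corrected.mean()
```

On this synthetic example the relative spread of the wavelets across traces drops several-fold after the correction, mirroring the improvement the abstract reports.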


Author(s):  
W. Ostrowski ◽  
K. Hanus

One of the popular uses of UAVs in photogrammetry is providing archaeological documentation. A wide range of low-cost (consumer-grade) UAVs, together with user-friendly photogrammetric software capable of producing satisfying results, has made it easier to prepare documentation for small archaeological sites. However, solutions of this kind are much more problematic for larger areas. The limited possibilities of autonomous flight make it significantly harder to obtain data for areas too large to be covered during a single mission. Moreover, the platforms used are sometimes not equipped with telemetry systems, which makes navigating and guaranteeing a similar quality of data across separate flights difficult. The simplest solution is to use a better UAV; however, the cost of such devices often exceeds the financial capabilities of archaeological expeditions.

The aim of this article is to present a methodology for obtaining data for medium-scale areas using only a basic UAV. The proposed methodology assumes a simple multirotor with no flight planning system or telemetry. Navigation of the platform is based solely on live-view images sent from the camera attached to the UAV. The presented survey was carried out using a simple GoPro camera which, from the perspective of photogrammetric use, was not the optimal configuration due to its fisheye geometry. Another limitation is the actual operational range of UAVs, which for cheaper systems rarely exceeds 1 kilometre and is often much smaller. The surveyed area must therefore be divided into sub-blocks corresponding to the range of the drone. This is inconvenient, since the blocks must overlap so that they can later be merged during processing, which increases both the length of the required flights and the computing power needed to process the larger number of images.

These issues make prospection highly inconvenient, but not impossible. Our paper presents our experiences through two case studies: surveys conducted in Nepal under the aegis of UNESCO, and works carried out as part of a Polish archaeological expedition in Cyprus, both of which show that the proposed methodology yields satisfying results. The article is an important voice in the ongoing debate between commercial and academic archaeologists on the balance between the required standards of archaeological work and the economic capabilities of archaeological missions.
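The sub-block division described above can be sketched as a simple tiling computation: square blocks sized to the UAV's operational radius, stepped by less than a full block so neighbours overlap for merging. The function, its parameters, and the example dimensions are hypothetical illustrations, not the authors' actual survey plan.

```python
def plan_blocks(area_w, area_h, radius, overlap):
    """Split an area (metres) into square sub-blocks sized to the UAV's
    operational radius, with a fixed overlap so neighbouring blocks can
    be merged during photogrammetric processing."""
    block = 2 * radius           # flyable block edge from one launch point
    step = block - overlap       # advance less than a full block -> overlap
    blocks = []
    y = 0.0
    while y < area_h:
        x = 0.0
        while x < area_w:
            # clip each block to the survey area boundary
            blocks.append((x, y, min(x + block, area_w), min(y + block, area_h)))
            x += step
        y += step
    return blocks

# e.g. a 3 km x 2 km site, 0.8 km effective radius, 200 m overlap
blocks = plan_blocks(3000.0, 2000.0, 800.0, 200.0)
```

Shrinking the overlap reduces flight time and image count, at the cost of weaker tie points between blocks during merging.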


2018 ◽  
Vol 176 ◽  
pp. 01011
Author(s):  
YE Xin ◽  
JI Qian

The shared economy has developed rapidly under the wave of the internet, featuring low cost, low consumption, and high environmental efficiency. Shared-economy models have emerged in housing, catering, and travel. As people pursue a high quality of life and seek social interaction, the shared kitchen platform has arisen at the right moment. Taking the shared kitchen as an example and focusing on patients and their caregivers, this paper investigates and analyzes existing shared kitchens and their service platforms, as well as the spatial system design and human-computer interaction of the shared kitchen. Taking "high efficiency, intelligence, and humanization" as the design principles, we are committed to exploring new directions for modular kitchen design against the background of the shared economy.


2017 ◽  
Vol 8 (4) ◽  
pp. 23-29
Author(s):  
Rajaguru D. ◽  
Puviyarasi T. ◽  
Vengattaraman T.

The Internet of Things (IoT), including robots, sensors, actuators, electronic signalization, and a variety of other internet-enabled physical devices, may enable new advanced smart applications in construction in the very near future. Such applications require real-time responses and are therefore time-critical. Thus, in order to support collaboration, control, monitoring, supply management, safety, and other construction processes, they have to meet dependability requirements, including requirements for high Quality of Service (QoS). Dependability and high QoS can be achieved by using an adequate number and quality of computing resources, such as processing, memory, and networking elements, located geographically close to the smart environments for handheld device computing (HDC).


Sensors ◽  
2019 ◽  
Vol 19 (20) ◽  
pp. 4448 ◽  
Author(s):  
Günther Sagl ◽  
Bernd Resch ◽  
Andreas Petutschnig ◽  
Kalliopi Kyriakou ◽  
Michael Liedlgruber ◽  
...  

Wearable sensors are increasingly used in research as well as for personal and private purposes. A variety of scientific studies are based on physiological measurements from such rather low-cost wearables. But how accurate are these measurements compared to those from the well-calibrated, high-quality laboratory equipment used in psychological and medical research? The answer to this question undoubtedly impacts the reliability of a study's results. In this paper, we demonstrate an approach to quantify the accuracy of low-cost wearables in comparison to high-quality laboratory sensors. To this end, we developed a benchmark framework for physiological sensors that covers the entire workflow from sensor data acquisition to the computation and interpretation of diverse correlation and similarity metrics. We evaluated this framework in a study with 18 participants. Each participant was equipped with one high-quality laboratory sensor and two wearables. These three sensors simultaneously measured physiological parameters such as heart rate and galvanic skin response while the participant cycled on an ergometer following a predefined routine. The results of our benchmarking show that cardiovascular parameters (heart rate, inter-beat interval, heart rate variability) yield very high correlations and similarities. Measuring galvanic skin response, a more delicate undertaking, resulted in lower but still reasonable correlations and similarities. We conclude that the benchmarked wearables provide physiological measurements such as heart rate and inter-beat interval with an accuracy close to that of the professional high-end sensor, while the accuracy varies more for other parameters, such as galvanic skin response.
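The comparison step of such a benchmark can be sketched with two representative metrics, Pearson correlation and a range-normalized RMSE, applied to a wearable signal against a lab reference. The synthetic heart-rate traces, the noise level, and the metric choice are assumptions for illustration, not the paper's actual framework or data.

```python
import numpy as np

def benchmark(reference, wearable):
    """Compare a wearable signal against a lab reference using
    Pearson correlation and a range-normalized RMSE, two examples
    of the correlation/similarity metrics such a framework computes."""
    r = np.corrcoef(reference, wearable)[0, 1]
    rmse = np.sqrt(np.mean((reference - wearable) ** 2))
    nrmse = rmse / (reference.max() - reference.min())
    return r, nrmse

# Synthetic 10-minute ergometer session sampled at 1 Hz (illustrative only)
t = np.linspace(0.0, 600.0, 601)
hr_lab = 80.0 + 40.0 * np.minimum(t / 300.0, 1.0)  # ramp 80 -> 120 bpm, then hold
hr_wearable = hr_lab + np.random.default_rng(1).normal(0.0, 1.0, t.size)

r, nrmse = benchmark(hr_lab, hr_wearable)
```

With the small measurement noise assumed here, the correlation comes out very high and the normalized error small, the pattern the abstract reports for cardiovascular parameters.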


2020 ◽  
Vol 9 (11) ◽  
pp. 3431
Author(s):  
Hans-Michael Hau ◽  
Jürgen Weitz ◽  
Ulrich Bork

The COVID-19 pandemic has tremendously changed private and professional interactions and behaviors worldwide. The pandemic and the actions taken against it have changed our healthcare systems, which in turn has affected medical education and surgical training. In the face of constant disruptions to surgical education and training during this outbreak, structured and innovative concepts and adapted educational curricula are important to ensure a high quality of medical treatment. While efforts were undertaken to prevent viral spread, it is important to analyze and assess the effects of this crisis on medical education, surgical training, and teaching at large, and certainly in the field of surgical oncology. Against this background, we introduce practical and creative recommendations for the continuity of students' and residents' medical and surgical training and teaching, including virtual educational curricula, skills development classes, video-based feedback, and simulation in the specialty field of surgical oncology. In conclusion, the effects of COVID-19 on surgical training and teaching, certainly in the field of surgical oncology, are challenging.


Sensors ◽  
2018 ◽  
Vol 18 (12) ◽  
pp. 4486 ◽  
Author(s):  
Mohan Li ◽  
Yanbin Sun ◽  
Yu Jiang ◽  
Zhihong Tian

In sensor-based systems, the data of an object are often provided by multiple sources. Since the data quality of these sources may differ, it is necessary, when querying the observations, to carefully select the sources so that high-quality data are accessed. One solution is to perform a quality evaluation in the cloud and select a set of high-quality, low-cost data sources (i.e., sensors or small sensor networks) that can answer the queries. This paper studies the min-cost quality-aware query problem, which aims to find high-quality results from multiple sources at minimized cost. A measure of query-result quality is provided, and two methods for answering min-cost quality-aware queries are proposed. How to obtain a reasonable parameter setting is also discussed. Experiments on real-life data verify that the proposed techniques are efficient and effective.


Author(s):  
John M. Mackenzie

Over the past several years, the capabilities of personal computers have advanced at a staggering rate. At the same time, the cost of the hardware has dropped to such a degree that one wonders whether such inexpensive hardware can perform adequately. The purpose of this discussion is to look at the minimum hardware necessary to do quality stereo imaging on CRT display devices and to discuss several important evaluation criteria for producing these stereo images. The most important criterion for producing high-quality stereo pairs lies in the quality of the digitization of the image. Most TV-rate imaging systems, even after multiple frames are averaged, are quite distorted and lack sufficient image detail. Slow-scan imaging systems, such as the one developed in this laboratory, which use a gated integrator and can digitize at over one thousand pixels square with 256 gray levels, produce images extremely close to photographic quality.
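The limitation of frame averaging mentioned above can be illustrated numerically: averaging n frames shrinks uncorrelated noise by roughly the square root of n, but any systematic distortion in the optics or scan geometry is identical in every frame and survives averaging unchanged. The scene, noise level, and frame counts below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
truth = rng.uniform(0.0, 255.0, (64, 64))   # idealized scene, 256 gray levels

def averaged_frame(n_frames, noise_std=20.0):
    """Average n noisy frames of the same scene; uncorrelated noise
    drops roughly as 1/sqrt(n)."""
    frames = truth + rng.normal(0.0, noise_std, (n_frames, 64, 64))
    return frames.mean(axis=0)

err_1 = (averaged_frame(1) - truth).std()    # residual noise, single frame
err_16 = (averaged_frame(16) - truth).std()  # residual noise, 16-frame average
```

Averaging 16 frames cuts the random noise to roughly a quarter, yet a distorted system remains distorted, which is why the slow-scan, gated-integrator approach is preferred for near-photographic quality.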


2017 ◽  
Vol 35 (8_suppl) ◽  
pp. 217-217
Author(s):  
Shaheena Mukhi ◽  
John Srigley ◽  
Corinne Daly ◽  
Mary Agent-Katwala

217 Background: To reduce variability in diagnosing and treating cancer resection cases, six Canadian provinces implemented standardized pathology checklists to transition from narrative to synoptic reporting. In clinical practice, pathologists electronically capture data on resected cancer specimens synoptically for breast, colorectal, lung, prostate, and endometrial cases. Although data were collected in a standardized format, consensus-based indicators were unavailable to coordinate action across Canada. Objectives: We aimed to develop indicators to measure the consistency of high-quality cancer diagnosis, staging, prognosis, and treatment, and to coordinate action. Methods: A literature review was conducted, with the input of clinical experts, to inform the development of indicators. 50 clinicians from x jurisdictions reviewed, selected, and ranked the 33 indicators initially drafted. Clinicians also provided input on the clinical validity of the indicators and set targets based on evidence. They reviewed the baseline data, confirmed the clinical usefulness of the indicators, and assigned them to three domains. Results: 47 indicators were developed and categorized into one of three domains: descriptive indicators, which provide data on intrinsic measures of a patient's tumour, such as stage or tumour type; process indicators, which measure data completeness, timeliness, and compliance; and clinico-pathologic outcome indicators, which examine the surgeon's or pathologist's effect on the diagnostic pathway, such as margin positivity rates or adequacy of lymph node removal. Examples of indicators are: margin status; lymph nodes examined, involved, and retrieved; histologic type and grade distribution; lympho-vascular invasion; and pT3 margin positivity rate.
Conclusions: The indicators have set a framework for measuring consistency and inconsistency in diagnosing and staging cancer, for organizing conversations and multidisciplinary group discussions, and for establishing a culture of quality improvement.


Author(s):  
Nilamadhab Mishra

Progressive data science and knowledge-analytics tasks are gaining popularity across various intellectual applications. The main research challenge is to obtain insight from large-scale IoE data that can be used to produce cognitive actuations for these applications. Today, the time to insight is slow, the quality of insight is poor, and the cost of insight is high, while intellectual applications require low-cost, high-quality, real-time frameworks and algorithms to transform their data into cognitive value at scale. In this chapter, the author discusses the overall data science and knowledge-analytics context of IoE data generated by smart edge computing devices. In an IoE-driven e-BI application, e-consumers use smart edge computing devices that generate huge volumes of IoE data, which poses research challenges for traditional data science and knowledge-analytics mechanisms. Consumer-end IoE data are considered potential sources to be turned, at scale, into e-business goldmines.
