CovNet: A Transfer Learning Framework for Automatic COVID-19 Detection From Crowd-Sourced Cough Sounds

2022 ◽  
Vol 3 ◽  
Author(s):  
Yi Chang ◽  
Xin Jing ◽  
Zhao Ren ◽  
Björn W. Schuller

Since the COronaVIrus Disease 2019 (COVID-19) outbreak, developing a digital diagnostic tool that detects COVID-19 from respiratory sounds via computer audition has become an essential topic, owing to its advantages of being swift, low-cost, and eco-friendly. However, prior studies mainly focused on small-scale COVID-19 datasets. To build a robust model, this study utilises the large-scale multi-sound FluSense dataset to help detect COVID-19 from cough sounds. Because of the gap between FluSense and the COVID-19-related datasets, which consist of coughs only, a transfer learning framework (named CovNet) is proposed and applied rather than simply augmenting the training data with FluSense. CovNet contains (i) a parameter transferring strategy and (ii) an embedding incorporation strategy. Specifically, to validate CovNet's effectiveness, it is used to transfer knowledge from FluSense to COUGHVID, a large-scale cough sound database of COVID-19-negative and COVID-19-positive individuals. The model trained on FluSense and COUGHVID is then applied, under CovNet, to two further small-scale cough datasets for COVID-19 detection: the COVID-19 cough sub-challenge (CCS) database of the INTERSPEECH Computational Paralinguistics challengE (ComParE) and the DiCOVA Track-1 database. By training four simple convolutional neural networks (CNNs) in the transfer learning framework, our approach achieves an absolute improvement of 3.57% over the DiCOVA Track-1 validation baseline in area under the receiver operating characteristic curve (ROC AUC), and an absolute improvement of 1.73% over the ComParE CCS test baseline in unweighted average recall (UAR).
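The parameter-transferring idea can be sketched as follows. This is a minimal NumPy illustration with invented layer names and shapes, not the CovNet architecture itself: the shared feature-extractor weights from a model trained on the data-rich source are copied into the target model, while the classification head is re-initialised for the new task.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "source" model pretrained on a large multi-sound dataset
# (a stand-in for FluSense); layer names and shapes are illustrative only.
source_model = {
    "conv1": rng.standard_normal((16, 1, 3, 3)),
    "conv2": rng.standard_normal((32, 16, 3, 3)),
    "head":  rng.standard_normal((10, 32)),   # 10 source classes
}

def transfer_parameters(source, n_target_classes):
    """Copy the shared feature-extractor weights and re-initialise
    the classification head for the new task (e.g. COVID-19 pos/neg)."""
    target = {k: v.copy() for k, v in source.items() if k != "head"}
    feat_dim = source["head"].shape[1]
    target["head"] = np.zeros((n_target_classes, feat_dim))  # fresh head
    return target

target_model = transfer_parameters(source_model, n_target_classes=2)
assert np.array_equal(target_model["conv1"], source_model["conv1"])
assert target_model["head"].shape == (2, 32)
```

The target model then fine-tunes all layers on the cough data; only the head starts from scratch.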

2020 ◽  
Author(s):  
Iason Katsamenis ◽  
Eftychios Protopapadakis ◽  
Athanasios Voulodimos ◽  
Anastasios Doulamis ◽  
Nikolaos Doulamis

We introduce a deep learning framework that can detect COVID-19 pneumonia in thoracic radiographs and differentiate it from bacterial pneumonia. Deep classification models, such as convolutional neural networks (CNNs), require large-scale datasets in order to be trained and perform properly. Since the number of X-ray samples related to COVID-19 is limited, transfer learning (TL) appears as the go-to method to alleviate the demand for training data and develop accurate automated diagnosis models. In this context, networks can gain knowledge from networks pretrained on large-scale image datasets or from alternative data-rich sources (e.g., bacterial and viral pneumonia radiographs). The experimental results indicate that the TL approach outperforms training without TL on the COVID-19 classification task in chest X-ray images.
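A common TL recipe when target data are scarce is to freeze the pretrained feature extractor and train only a small task-specific head. The sketch below illustrates this with a fixed random projection standing in for pretrained CNN features and a logistic-regression head; the data, dimensions, and labels are all synthetic assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Frozen "pretrained" feature extractor: here just a fixed random
# projection followed by ReLU, standing in for CNN features learned
# on a data-rich source domain. It is never updated.
W_frozen = rng.standard_normal((64, 8))

def features(x):
    return np.maximum(x @ W_frozen, 0.0)

# Only the small task-specific head is trained on the scarce target data.
X = rng.standard_normal((32, 64))          # toy feature vectors
y = (X[:, 0] > 0).astype(float)            # toy binary labels
w_head = np.zeros(8)

for _ in range(200):                       # gradient descent on log-loss
    p = 1.0 / (1.0 + np.exp(-features(X) @ w_head))
    w_head -= 0.1 * features(X).T @ (p - y) / len(y)

acc = np.mean((features(X) @ w_head > 0) == (y == 1))
assert acc >= 0.5                          # better than chance on train data
```

Freezing the extractor keeps the number of trainable parameters small, which is exactly what limits overfitting when only a handful of COVID-19 samples are available.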


2021 ◽  
Vol 11 (2) ◽  
pp. 472
Author(s):  
Hyeongmin Cho ◽  
Sangkyun Lee

Machine learning has been proven to be effective in various application areas, such as object and speech recognition on mobile systems. Since a critical key to machine learning success is the availability of large training datasets, many datasets are being disclosed and published online. From a data consumer's or manager's point of view, measuring data quality is an important first step in the learning process: we need to determine which datasets to use, update, and maintain. However, few practical ways to measure data quality are available today, especially for large-scale, high-dimensional data such as images and videos. This paper proposes two data quality measures that can compute class separability and in-class variability, two important aspects of data quality, for a given dataset. Classical data quality measures tend to focus only on class separability; we suggest that in-class variability is another important data quality factor. We provide efficient algorithms to compute our quality measures based on random projections and bootstrapping, with statistical benefits on large-scale high-dimensional data. In experiments, we show that our measures are compatible with classical measures on small-scale data and can be computed much more efficiently on large-scale, high-dimensional datasets.
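The idea of cheap quality measures on projected data can be sketched as follows. The two scores below are deliberately simple stand-ins (mean-distance-over-spread for separability, average per-class variance for in-class variability), not the paper's exact definitions; the random projection is the standard Johnson–Lindenstrauss construction that makes them affordable in high dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)

def project(X, k, rng):
    """Random projection to k dimensions, so the measures are computed
    on a cheap low-dimensional sketch of the data."""
    R = rng.standard_normal((X.shape[1], k)) / np.sqrt(k)
    return X @ R

def separability(X, y):
    """Toy class-separability score for two classes: distance between
    class means relative to the pooled within-class spread."""
    classes = np.unique(y)
    mus = [X[y == c].mean(axis=0) for c in classes]
    within = np.mean([X[y == c].std() for c in classes])
    return np.linalg.norm(mus[0] - mus[1]) / within

def in_class_variability(X, y):
    """Toy in-class variability: average per-class variance."""
    return float(np.mean([X[y == c].var() for c in np.unique(y)]))

# Two well-separated Gaussian classes in 100 dimensions...
X0 = rng.standard_normal((200, 100))
X1 = rng.standard_normal((200, 100)) + 3.0
X = np.vstack([X0, X1])
y = np.array([0] * 200 + [1] * 200)

# ...versus pure noise with the same arbitrary label split.
X_noise = rng.standard_normal((400, 100))

sep_signal = separability(project(X, 10, rng), y)
sep_noise = separability(project(X_noise, 10, rng), y)
assert sep_signal > sep_noise   # separation survives the projection
```

Bootstrapping (resampling rows and re-computing the score) would then give confidence intervals on these estimates, which is the statistical benefit the abstract alludes to.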


Author(s):  
Jian Song ◽  
Chun-wei Gu

Energy shortage and environmental deterioration are two crucial issues that the developing world has to face. To address these problems, the conversion of low-grade energy is attracting broad attention. Among existing technologies, the Organic Rankine Cycle (ORC) has proven to be one of the most effective methods for utilising low-grade heat sources. The turbine is a key component of an ORC system and plays an important role in system performance. Traditional turbine expanders, namely the axial-flow turbine and the radial-inflow turbine, are typically selected in large-scale ORC systems. In small- and micro-scale systems, however, traditional turbine expanders are not suitable due to large flow losses and high rotational speeds. In this case, the Tesla turbine allows a low-cost and reliable design for the organic expander and could be an attractive option for small-scale ORC systems. A 1-D model of the Tesla turbine is presented in this paper, focusing on the flow characteristics and the momentum transfer. This study improves the 1-D model by taking the nozzle limit expansion ratio into consideration, which is related to the installation angle of the nozzle and the specific heat ratio of the working fluid. The improved model is used to analyse Tesla turbine performance and predict turbine efficiency. Thermodynamic analysis is conducted for a small-scale ORC system. The simulation results reveal that the ORC system can generate a considerable net power output. Therefore, the Tesla turbine can be regarded as a potential choice for small-scale ORC systems.
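One textbook ingredient behind a nozzle's limit expansion ratio is the isentropic critical (choking) pressure ratio, which depends only on the specific heat ratio of the working fluid. The sketch below computes that standard relation; the paper's full expression, which also involves the nozzle installation angle, is not reproduced here, and the organic-fluid gamma value is illustrative.

```python
def critical_pressure_ratio(gamma):
    """Isentropic choking pressure ratio for a converging nozzle:
    p*/p0 = (2 / (gamma + 1)) ** (gamma / (gamma - 1)).
    Below this ratio the nozzle is choked and cannot expand further
    without a diverging section."""
    return (2.0 / (gamma + 1.0)) ** (gamma / (gamma - 1.0))

air = critical_pressure_ratio(1.4)       # diatomic gas, ~0.528
organic = critical_pressure_ratio(1.1)   # illustrative low-gamma organic fluid

assert 0.52 < air < 0.54
assert organic > air   # low-gamma fluids choke at a milder pressure ratio
```

This is one reason working-fluid choice matters in ORC expander design: organic fluids with low specific heat ratios reach choking at less extreme pressure ratios than air.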


2020 ◽  
Author(s):  
Brian Post ◽  
Phillip Chesser ◽  
Alex Roschli ◽  
Lonnie Love ◽  
Katherine Gaul

2015 ◽  
Vol 2 (2) ◽  
pp. 513-536 ◽  
Author(s):  
I. Grooms ◽  
Y. Lee

Abstract. Superparameterization (SP) is a multiscale computational approach wherein a large scale atmosphere or ocean model is coupled to an array of simulations of small scale dynamics on periodic domains embedded into the computational grid of the large scale model. SP has been successfully developed in global atmosphere and climate models, and is a promising approach for new applications. The authors develop a 3D-Var variational data assimilation framework for use with SP; the relatively low cost and simplicity of 3D-Var in comparison with ensemble approaches makes it a natural fit for relatively expensive multiscale SP models. To demonstrate the assimilation framework in a simple model, the authors develop a new system of ordinary differential equations similar to the two-scale Lorenz-'96 model. The system has one set of variables denoted {Yi}, with large and small scale parts, and the SP approximation to the system is straightforward. With the new assimilation framework the SP model approximates the large scale dynamics of the true system accurately.
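The classic two-scale Lorenz-'96 system that the paper's new ODE system resembles can be integrated in a few lines. The sketch below uses the standard textbook form and parameter values (K sites, J fast variables per site, forcing F, coupling h, time-scale ratio c, amplitude ratio b), not the authors' modified system.

```python
import numpy as np

K, J = 8, 4                    # large-scale sites, small-scale vars per site
F, h, c, b = 10.0, 1.0, 10.0, 10.0

def tend(X, Y):
    """Tendencies of the standard two-scale Lorenz-'96 model.
    X has length K; Y is the flattened (K*J,) array of fast variables."""
    Ysum = Y.reshape(K, J).sum(axis=1)
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2))
          - X + F - (h * c / b) * Ysum)
    dY = (c * b * np.roll(Y, -1) * (np.roll(Y, 1) - np.roll(Y, -2))
          - c * Y + (h * c / b) * np.repeat(X, J))
    return dX, dY

rng = np.random.default_rng(0)
X = rng.standard_normal(K)
Y = 0.1 * rng.standard_normal(K * J)

dt = 0.001                     # forward Euler with a small step, for brevity
for _ in range(1000):
    dX, dY = tend(X, Y)
    X, Y = X + dt * dX, Y + dt * dY

assert np.all(np.isfinite(X)) and np.all(np.isfinite(Y))
```

In an SP analogue of this model, each site's fast variables would be simulated on its own embedded domain, and a 3D-Var scheme would assimilate observations of the slow variables X only.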


2021 ◽  
Vol 2115 (1) ◽  
pp. 012026
Author(s):  
Sonam Solanki ◽  
Gunendra Mahore

Abstract In the current process of producing vermicompost on a large scale, the main challenge is to keep the worms alive. This is achieved by maintaining temperature and moisture in their living medium, which is difficult to do throughout the process. Currently, it is accomplished by building infrastructure, but this approach requires a large initial investment and long-run maintenance, and such methods are limited to small-scale production. For large-scale production, a unit is developed that utilises natural airflow together with water and automation. The main aim of this unit is to provide favourable conditions for the worms in large-scale production with very low investment and minimal long-term maintenance. The key innovation of this research is that the technology used in the unit is practical and easy for small farmers to adopt; to ease long-term maintenance, the unit is built from a small number of parts.
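The control logic for such a unit can be sketched as a simple threshold loop. The temperature and moisture bands and the actuator names below are hypothetical placeholders, not values from the paper, which describes the hardware rather than a specific controller.

```python
# Illustrative safe bands for composting worms (hypothetical values).
TEMP_RANGE = (15.0, 30.0)      # degrees Celsius
MOIST_RANGE = (60.0, 80.0)     # percent moisture in the bed

def control_step(temp_c, moisture_pct):
    """Decide which actuators to switch on for one control cycle:
    'fan' drives natural-airflow cooling/drying, 'mist' adds water."""
    actions = set()
    if temp_c > TEMP_RANGE[1]:
        actions |= {"fan", "mist"}     # evaporative cooling
    if moisture_pct < MOIST_RANGE[0]:
        actions.add("mist")            # bed too dry
    if moisture_pct > MOIST_RANGE[1]:
        actions.add("fan")             # dry the bed with airflow
    return actions

assert control_step(34.0, 70.0) == {"fan", "mist"}  # hot bed: cool it
assert control_step(22.0, 55.0) == {"mist"}         # dry bed: wet it
assert control_step(22.0, 70.0) == set()            # in band: do nothing
```

Keeping the controller to bang-bang thresholds like this, rather than continuous regulation, is consistent with the stated goal of few parts and easy adoption by small farmers.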


Sensors ◽  
2020 ◽  
Vol 20 (11) ◽  
pp. 3055
Author(s):  
Olivier Pieters ◽  
Tom De Swaef ◽  
Peter Lootens ◽  
Michiel Stock ◽  
Isabel Roldán-Ruiz ◽  
...  

The study of the dynamic responses of plants to short-term environmental changes is becoming increasingly important in basic plant science, phenotyping, breeding, crop management, and modelling. These short-term variations are crucial in plant adaptation to new environments and, consequently, in plant fitness and productivity. Scalable, versatile, accurate, and low-cost data-logging solutions are necessary to advance these fields and complement existing sensing platforms such as high-throughput phenotyping. However, current data-logging and sensing platforms do not meet the requirements for monitoring these responses. Therefore, a new modular data-logging platform was designed, named Gloxinia. Different sensor boards are interconnected depending on the needs, with the potential to scale to hundreds of sensors in a distributed sensor system. To demonstrate the architecture, two sensor boards were designed: one for single-ended measurements and one for lock-in-amplifier-based measurements, named Sylvatica and Planalta, respectively. To evaluate the performance of the system in small setups, a small-scale trial was conducted in a growth chamber. Expected plant dynamics were successfully captured, indicating proper operation of the system. Though a large-scale trial was not performed, we expect the system to scale well to larger setups. Additionally, the platform is open-source, enabling other users to easily build upon our work and perform application-specific optimisations.


2020 ◽  
Vol 104 (2) ◽  
pp. 1581-1596
Author(s):  
Thomas Heinze

Abstract Dynamics of snow avalanches or landslides can be described by rapid granular flow. Experimental investigations of granular flow at laboratory scale are often required to analyse flow behaviour and to develop adequate mathematical and numerical models. Most investigations use image-based analysis, and additional sensors such as pressure gauges are not always possible. Testing various scenarios and parameter variations, such as different obstacle shapes and positions as well as basal topography and friction, usually requires either the construction of a new laboratory setup for each test or a cumbersome reconstruction. In this work, a highly flexible and modular laboratory setup based on LEGO bricks is presented. The flexibility of the model is demonstrated, and possible extensions for future laboratory tests are outlined. The setup is able to reproduce published laboratory experiments addressing current scientific research topics, such as overflow of a rigid reflector, flow on a bumpy surface, and flow against a rigid wall, using standard image-based analysis. This makes the setup applicable for quick scenario testing, e.g. for hypothesis testing or for low-cost testing prior to large-scale experiments, and it can contribute to the validation of external results and to benchmarks of numerical models. Small-scale laboratory setups are also very useful for demonstration purposes such as education and public outreach, both crucial in the context of natural hazards. The presented setup enables variation of parameters such as slope length, channel width, height and shape, inclination, bed friction, obstacle position and shape, as well as density, composition, amount, and grain size of the flowing mass. Observable quantities are flow type, flow height, flow path, and flow velocity, as well as runout distance, size, and shape of the deposited material. Additional sensors allow further quantitative assessments, such as local pressure values.


2020 ◽  
Author(s):  
Florencia López Boo ◽  
Jane Leer ◽  
Akito Kamei

Expanding small-scale interventions without lowering quality or attenuating impact is a critical policy challenge. Community monitoring offers a low-cost quality assurance mechanism by making service providers accountable to local citizens rather than to distant administrators. This paper provides experimental evidence from a home-visit parenting program implemented at scale by the Nicaraguan government, with two types of monitoring: (a) institutional monitoring and (b) community monitoring. We find a positive intent-to-treat effect on child development, but only among groups randomly assigned to community monitoring. Our findings show promise for the use of community monitoring to ensure quality in large-scale government-run social programs.
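The intent-to-treat (ITT) estimand reported here is the difference in mean outcomes by random assignment, regardless of actual program take-up. The sketch below computes it on simulated data with a hypothetical 0.2 standard-deviation effect; the sample size and effect size are invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def intent_to_treat(y, assigned):
    """ITT estimate: difference in mean outcomes between groups
    randomly *assigned* to treatment and control."""
    return y[assigned == 1].mean() - y[assigned == 0].mean()

# Simulated child-development scores (standardised), with a
# hypothetical 0.2 SD effect under community monitoring.
n = 1000
assigned = rng.integers(0, 2, n)
y = rng.standard_normal(n) + 0.2 * assigned

est = intent_to_treat(y, assigned)
assert -0.1 < est < 0.5   # estimate lies near the true 0.2 SD effect
```

Because assignment is randomised, this simple difference in means is an unbiased estimate of the policy-relevant effect of offering the program, even when not every assigned household participates.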


2018 ◽  
Vol 20 (4) ◽  
pp. 737-742 ◽  

Biomining is the common term for processes that utilise biological systems to facilitate the extraction of metals from ores. Nowadays, a biomining concept can be defined as a two-stage combined biological system (first-stage bioleaching and second-stage biosorption) used to extract and recover metals from secondary sources such as industrial and mining waste, waste electrical and electronic equipment (WEEE), bottom ash, and end-of-life vehicles. Overwhelming demand and limited sources of metals have prompted the search for new sources, so attention has shifted from mining towards recycling secondary resources for metal recovery. Several metallurgical routes exist for metal recovery from secondary sources, including pyrometallurgical, hydrometallurgical, and bio/hydrometallurgical processing. Biomining processes are estimated to be relatively low-cost, environmentally friendly, and suitable for both large-scale and small-scale applications under bio/hydrometallurgical processing. Thus, a process involving physical separation (pre-treatment), biomining (bioleaching and biosorption), and hydrometallurgical steps was evaluated for the recovery of base metals, rare earth elements (REEs), and precious metals from e-waste.

