Design and validation of an e-textile-based wearable system for remote health monitoring

ACTA IMEKO ◽  
2021 ◽  
Vol 10 (2) ◽  
pp. 220
Author(s):  
Armando Coccia ◽  
Federica Amitrano ◽  
Leandro Donisi ◽  
Giuseppe Cesarelli ◽  
Gaetano Pagano ◽  
...  

The paper presents a new e-textile-based system, named SWEET Shirt, for the remote monitoring of biomedical signals. The system includes a textile sensing shirt, an electronic unit for data transmission, a custom-made Android application for real-time signal visualisation and desktop software for advanced digital signal processing. The device allows for the acquisition of electrocardiographic, biceps electromyographic and trunk acceleration signals. The sensors, electrodes and bus structures are all integrated within the textile garment, without any discomfort for users. A wide-ranging set of algorithms for signal processing was also developed for use within the system, allowing clinicians to rapidly obtain a complete and schematic overview of a patient's clinical status. The aim of this work was to present the design and development of the device and to provide a validation analysis of the electrocardiographic measurement and digital processing. The results demonstrate that the information contained in the signals recorded by the novel system is comparable to that obtained via a standard medical device commonly used in clinical environments. Similarly encouraging results were obtained in the comparison of the variables derived from the signal processing.
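As an illustration of the kind of agreement analysis such a validation involves, the following minimal sketch (not the authors' pipeline) extracts R-R intervals from two simultaneously recorded ECGs and computes Bland-Altman statistics; the array names, sampling rate, and detection thresholds are assumptions.

```python
# Minimal sketch of an ECG agreement check between a wearable prototype
# and a reference device. Thresholds and names are illustrative.
import numpy as np
from scipy.signal import find_peaks

def rr_intervals(ecg, fs):
    """R-R intervals in seconds from a single-lead ECG sampled at fs Hz."""
    # Illustrative detector; real ECGs need band-pass filtering and a
    # proper R-peak detector (e.g., Pan-Tompkins) first.
    peaks, _ = find_peaks(ecg, height=0.6 * np.max(ecg), distance=int(0.4 * fs))
    return np.diff(peaks) / fs

def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements."""
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# Hypothetical usage with synchronized 500 Hz recordings:
# rr_shirt = rr_intervals(ecg_shirt, fs=500)
# rr_ref = rr_intervals(ecg_reference, fs=500)
# n = min(len(rr_shirt), len(rr_ref))
# print(bland_altman(rr_shirt[:n], rr_ref[:n]))
```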

Sensors ◽  
2020 ◽  
Vol 20 (22) ◽  
pp. 6691
Author(s):  
Federica Amitrano ◽  
Armando Coccia ◽  
Carlo Ricciardi ◽  
Leandro Donisi ◽  
Giuseppe Cesarelli ◽  
...  

This paper presents a new wearable e-textile-based system, named SWEET Sock, for the remote monitoring of biomedical signals. The system includes a textile sensing sock, an electronic unit for data transmission, a custom-made Android application for real-time signal visualization, and desktop software for advanced digital signal processing. The device allows the acquisition of lower-limb angular velocities and plantar pressure signals, which are post-processed to give a complete and schematic overview of a patient's clinical status with regard to gait and postural assessment. In this work, device performance is validated by evaluating the agreement between the prototype and an optoelectronic system for gait analysis on a set of free-walk acquisitions. Results show good agreement between the systems in the assessment of gait cycle time and cadence, while systematic and proportional errors are pointed out for the swing- and stance-time parameters. Worse results were obtained in the comparison of spatial metrics. The wearability of the system and its comfortable use make it suitable for the continuous remote health monitoring of de-hospitalized patients in the domestic environment and also, thanks to its low invasiveness, for the ergonomic assessment of health workers.
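A minimal sketch of how gait cycle time and cadence can be derived from a lower-limb angular-velocity signal of the kind the system records; the mid-swing-peak heuristic and thresholds below are assumptions, not the SWEET Sock algorithm.

```python
# Illustrative gait metrics from a shank angular-velocity signal: the
# mid-swing peak recurs once per stride, so peak spacing gives cycle time.
import numpy as np
from scipy.signal import find_peaks

def gait_cycle_metrics(gyro_sagittal, fs):
    """gyro_sagittal: angular velocity (deg/s); fs: sampling rate (Hz)."""
    peaks, _ = find_peaks(gyro_sagittal, height=50.0, distance=int(0.5 * fs))
    cycle_times = np.diff(peaks) / fs        # gait cycle (stride) time, s
    cadence = 120.0 / cycle_times.mean()     # steps/min (two steps per stride)
    return cycle_times, cadence
```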


2012 ◽  
Vol 241-244 ◽  
pp. 1751-1755
Author(s):  
Yin Bing Zhu ◽  
Ke Jing Cao ◽  
Bao Li

Auto-search is one of the key steps in digital signal processing for Loran-C receivers. For digitally sampled Loran-C signals, however, the basic search algorithm cannot perform the signal search accurately because of the asynchronism between the sampling clock and the transmitting-station clock. To address this problem, an auto-search algorithm for Loran-C based on subsection correlation is presented after an analysis of the basic search algorithm. The experimental results show that, for the received digital Loran-C signal, several correlation-and-accumulation values of the master and secondary stations exceed the search thresholds, and the maximum correlation-and-accumulation value of the presented algorithm is far higher than that of the basic algorithm. In other words, the presented algorithm can successfully find the arrival times of the master and secondary stations, effectively solve the problem of clock asynchronism, and enhance the search sensitivity of the receiver, which is of great significance for the digital processing of Loran-C signals and the engineering realization of digital Loran-C receivers.
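The paper's exact formulation is not reproduced here, but one common reading of subsection correlation is sketched below: the received record and the local replica are split into corresponding subsections, each pair is correlated over the same lag range, and the results are accumulated, so the clock offset corrupts each short correlation less than it would one long correlation. Function and parameter names are hypothetical.

```python
# Hypothetical subsection-correlation search; peak index ~ arrival time.
import numpy as np

def subsection_correlate(rx, replica, n_sub):
    """Accumulate correlations of n_sub subsection pairs of rx and replica."""
    seg_rx = len(rx) // n_sub
    seg_rep = len(replica) // n_sub
    acc = np.zeros(seg_rx - seg_rep + 1)
    for k in range(n_sub):
        r = rx[k * seg_rx:(k + 1) * seg_rx]
        p = replica[k * seg_rep:(k + 1) * seg_rep]
        acc += np.correlate(r, p, mode="valid")   # short correlation per pair
    return acc

# arrival_index = np.argmax(subsection_correlate(rx, replica, n_sub=8))
# The arrival is accepted when the accumulated peak exceeds the search threshold.
```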


Author(s):  
José Luis Rojo-Álvarez ◽  
Manel Martínez-Ramón ◽  
Gustavo Camps-Valls ◽  
Carlos E. Martínez-Cruz ◽  
Carlos Figuera

Digital signal processing (DSP) of time series using support vector machines (SVMs) has been addressed in the literature with a straightforward application of SVM kernel regression, but the assumption of independently distributed samples made in regression models is not fulfilled by time-series problems. Therefore, a new branch of SVM algorithms has to be developed for the advantageous application of SVM concepts to data with an underlying time-series structure. In this chapter, we summarize our past, present, and future proposals for the SVM-DSP framework, which consists of several principles for creating linear and nonlinear SVM algorithms devoted to DSP problems. First, the statement of linear signal models in the primal problem (primal signal models) allows us to obtain robust estimators of the model coefficients in classical DSP problems. Next, nonlinear SVM-DSP algorithms can be addressed from two different approaches: (a) reproducing kernel Hilbert space (RKHS) signal models, which state the signal model equation in the feature space, and (b) dual signal models, which are based on the nonlinear regression of the time instants with appropriate Mercer kernels. In this way, concepts such as filtering, time interpolation, and convolution are considered and analyzed, and they open the field for the future development of signal processing algorithms following this SVM-DSP framework.
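As a concrete illustration of the dual-signal-model idea (nonlinear regression of the time instants with a Mercer kernel), the following sketch fits an epsilon-SVR of signal amplitude on sampling times and uses it as a nonlinear interpolator. scikit-learn's SVR stands in for the authors' formulation, and the hyperparameters are illustrative.

```python
# Dual-signal-model sketch: SVR over time instants with an RBF kernel,
# applied to a nonuniformly sampled noisy sinusoid.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1.0, 40))                 # nonuniform instants
y = np.sin(2 * np.pi * 3 * t) + 0.1 * rng.standard_normal(t.size)

model = SVR(kernel="rbf", C=10.0, gamma=50.0, epsilon=0.05)
model.fit(t[:, None], y)                               # regress amplitude on time

t_dense = np.linspace(0.0, 1.0, 500)
y_interp = model.predict(t_dense[:, None])             # time interpolation
```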


Sensors ◽  
2020 ◽  
Vol 20 (11) ◽  
pp. 3070 ◽  
Author(s):  
Raúl Caulier-Cisterna ◽  
Manuel Blanco-Velasco ◽  
Rebeca Goya-Esteban ◽  
Sergio Muñoz-Romero ◽  
Margarita Sanromán-Junquera ◽  
...  

In recent years, attention and controversy have surrounded the first commercially available equipment for Electrocardiographic Imaging (ECGI), a new cardiac diagnostic tool that opens up a new field of diagnostic possibilities. The previous knowledge and criteria of cardiologists using intracardiac Electrograms (EGM) should be revisited in light of the newly available spatial–temporal potentials, and digital signal processing should be readapted to this new data structure. Aiming to contribute to the usefulness of ECGI recordings within the current knowledge and methods of cardiac electrophysiology, we previously presented two results: first, spatial consistency can be observed even for very basic cardiac signal processing stages (such as baseline wander removal and low-pass filtering); second, useful bipolar EGMs can be obtained by a digital processing operator that searches for the maximum amplitude and includes a time delay. In addition, this work aims to demonstrate the functionality of ECGI for cardiac electrophysiology from a twofold view, namely, through the analysis of the EGM waveforms and by studying ventricular repolarization properties. The former is scrutinized in terms of the clustering properties of the unipolar and bipolar EGM waveforms in control and myocardial infarction subjects, and the latter is analyzed using the properties of T-wave alternans (TWA) in control and Long-QT syndrome (LQTS) example subjects. Clustered regions of the EGMs were spatially consistent and congruent with the presence of infarcted tissue in unipolar EGMs, and bipolar EGMs with adequate signal processing operators retained this consistency and yielded a larger, yet moderate, number of spatial–temporal regions. TWA was absent in the control subject, in contrast with the LQTS subject, in terms of the alternans amplitude estimated from the unipolar EGMs; however, higher spatial–temporal variation was present in the LQTS torso and epicardium measurements, and this was consistent across three different methods of alternans estimation. We conclude that the spatial–temporal analysis of EGMs in ECGI will pave the way towards enhanced usefulness in clinical practice, and that the atomic signal processing approach should be conveniently revisited to deal with the great amount of information that ECGI conveys to the clinician.
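One reading (not the authors' code) of the bipolar-EGM operator mentioned above is sketched below: subtract a neighboring unipolar EGM at several candidate delays and keep the delay that maximizes the peak-to-peak bipolar amplitude.

```python
# Hypothetical bipolar-EGM operator: maximum-amplitude search over delays.
import numpy as np

def bipolar_egm(u1, u2, max_delay):
    """u1, u2: equal-length unipolar EGMs from neighboring sites (samples)."""
    best, best_amp = None, -np.inf
    for d in range(-max_delay, max_delay + 1):
        b = u1 - np.roll(u2, d)          # np.roll wraps; acceptable for a sketch
        amp = b.max() - b.min()          # peak-to-peak bipolar amplitude
        if amp > best_amp:
            best, best_amp = b, amp
    return best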


2019 ◽  
Vol 214 ◽  
pp. 02006 ◽  
Author(s):  
Nico Madysa

The design of readout electronics for the LAr calorimeters of the ATLAS detector to be operated at the future High-Luminosity LHC (HL-LHC) requires a detailed simulation of the full readout chain in order to find optimal solutions for the analog and digital processing of the detector signals. Due to the long duration of the LAr calorimeter pulses relative to the LHC bunch crossing time, out-of-time signal pileup needs to be taken into account. For this purpose, the simulation framework AREUS has been developed. It models analog-to-digital conversion, gain selection, and digital signal processing at bit precision, including digitization noise and detailed electronics effects. Trigger and object reconstruction algorithms are taken into account in the optimization process. The software implementation of AREUS, the concepts of its main functional blocks, as well as optimization considerations will be presented. Various approaches to introduce parallelism into AREUS will be compared against each other.
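A toy illustration (not AREUS itself) of the core effect it simulates: pulses much longer than the bunch spacing overlap across crossings, and a digital FIR filter applied to the ADC samples must estimate each deposit's amplitude under that out-of-time pileup. The pulse shape and filter below are simplified stand-ins.

```python
# Toy pileup simulation and FIR energy estimation, one sample per crossing.
import numpy as np

def pulse_shape(n, tau=4.0):
    """Toy unipolar pulse over n samples (real LAr pulses are bipolar)."""
    t = np.arange(n)
    return (t / tau) * np.exp(1.0 - t / tau)

rng = np.random.default_rng(1)
n_bc, shape = 200, pulse_shape(30)
deposits = rng.exponential(1.0, n_bc)              # energy per bunch crossing
stream = np.zeros(n_bc + len(shape))
for i, e in enumerate(deposits):                   # overlapping pulses = pileup
    stream[i:i + len(shape)] += e * shape
stream += 0.05 * rng.standard_normal(stream.size)  # electronics noise

fir = shape[:5] / np.dot(shape[:5], shape[:5])     # toy 5-tap energy estimator
estimates = np.convolve(stream, fir[::-1], mode="valid")  # per-crossing estimate
```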


2016 ◽  
pp. 222-235 ◽  
Author(s):  
Terrence D. Lagerlund

Digital computers can perform types of signal processing not readily available with analog devices such as ordinary electrical circuits. This includes making the process of obtaining, storing, retrieving, and viewing clinical neurophysiology data easier; aiding in extracting information from waveforms that is not readily obtainable with visual analysis alone; and improving the quantification of key features of waveforms. These processes are useful in the accurate clinical diagnosis of electroencephalographic (EEG), electromyographic (EMG), and evoked potential studies, and they also lend themselves to serial comparisons between studies performed on the same subject at different times, or between two groups of subjects in scientific investigations. Digital computers may also partially automate the interpretation of clinical neurophysiology studies. This chapter reviews the principles of digitization, the design of digitally based instruments for clinical neurophysiology, and several common uses of digital processing, including averaging, digital filtering, and some types of time-domain and frequency-domain analysis.
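Signal averaging, one of the digital processes mentioned above, can be stated in a few lines: epochs time-locked to each stimulus are averaged, so uncorrelated background EEG shrinks by roughly a factor of sqrt(N) relative to the stimulus-locked response. A minimal sketch with assumed NumPy-array inputs:

```python
# Evoked-potential averaging over stimulus-locked epochs.
import numpy as np

def evoked_average(eeg, stim_indices, pre, post):
    """Average (pre+post)-sample epochs of eeg around each stimulus index."""
    epochs = [eeg[i - pre:i + post] for i in stim_indices
              if i - pre >= 0 and i + post <= len(eeg)]
    return np.stack(epochs).mean(axis=0)
```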


2014 ◽  
Vol 926-930 ◽  
pp. 2992-2995
Author(s):  
Zheng Pu Zhang ◽  
Xing Feng Guo ◽  
Bo Tian

Compressive sensing (CS) is a new type of digital signal processing method. Its key objective is to reconstruct a signal accurately and efficiently from far fewer samples than the Nyquist sampling theorem requires. Compressive sensing theory combines the processes of sampling and compression to reduce the complexity of signal processing, and it is widely used in many fields, with broad application prospects in radar imaging, wireless sensor networks (WSN), radio-frequency communication, medical image processing, image acquisition devices, and so on. One of the important tasks in CS, and one that concerns many researchers, is how to recover signals more accurately and effectively. Compressive sensing started late, and there are many problems and research directions worthy of in-depth study. At present, many researchers have focused on reconstruction algorithms. Reconstruction algorithms are the core of compressive sensing and are of great significance for reconstructing compressed signals and verifying the accuracy of the sampling. This paper introduces the CoSaMP algorithm and then studies and analyzes its behaviour under Gaussian noise as the main content. Finally, taking a given signal and a random signal as examples, we give a series of comparison results.
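Since the paper centers on CoSaMP, a compact sketch of that recovery algorithm (Needell and Tropp) is given below for measurements y = Phi x with a K-sparse x; the iteration count and stopping rule are illustrative.

```python
# CoSaMP: proxy, support merge, least squares, prune, residual update.
import numpy as np

def cosamp(Phi, y, K, n_iter=30, tol=1e-9):
    m, n = Phi.shape
    x = np.zeros(n)
    r = y.astype(float).copy()
    for _ in range(n_iter):
        proxy = Phi.T @ r                                     # signal proxy
        omega = np.argsort(np.abs(proxy))[-2 * K:]            # 2K largest entries
        T = np.union1d(omega, np.flatnonzero(x)).astype(int)  # merged support
        b = np.zeros(n)
        b[T] = np.linalg.lstsq(Phi[:, T], y, rcond=None)[0]   # LS on support
        x = np.zeros(n)
        keep = np.argsort(np.abs(b))[-K:]                     # prune to K terms
        x[keep] = b[keep]
        r = y - Phi @ x                                       # residual update
        if np.linalg.norm(r) <= tol * np.linalg.norm(y):
            break
    return x

# Hypothetical usage:
# rng = np.random.default_rng(0)
# Phi = rng.standard_normal((60, 200)) / np.sqrt(60)
# x_true = np.zeros(200); x_true[[5, 40, 120]] = [1.0, -2.0, 0.5]
# x_hat = cosamp(Phi, Phi @ x_true, K=3)
```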


Geophysics ◽  
2005 ◽  
Vol 70 (4) ◽  
pp. 7JA-30JA ◽  
Author(s):  
Enders A. Robinson

The beginning of digital signal processing took place in the years 1950 to 1954. Using an econometric model, E. A. Robinson in 1951 came up with the method of deconvolution, which he tested on 32 seismic traces. Norbert Wiener, George Wadsworth, Paul Samuelson, and Robert Solow were his advisors. On the basis of this work, the MIT president's office in 1952 set up and sponsored the Geophysical Analysis Group (GAG) in the Department of Geology and Geophysics. GAG was made up of graduate students doing research in digital signal processing. In 1953, a consortium of oil and geophysical companies took over the sponsorship. At first, GAG used the MIT Whirlwind digital computer. In order to do the larger amount of computing required by the consortium, the Computer Service Section of Raytheon Manufacturing Company was enlisted in 1953. The Raytheon people who played key roles were Richard Clippinger, Bernard Dimsdale, and Joseph H. Levin, all of whom had worked on ENIAC, the world's first electronic digital computer. As originally built, ENIAC did not use programs stored in memory as does a modern computer; instead, the programming was done by rewiring the physical components for each new problem. In 1948, Clippinger was responsible for converting ENIAC into the world's first operational stored-program computer. ENIAC had 20 accumulators but no other random access memory (RAM). The programs were stored in the function tables, which acted as programmable read-only memory (PROM). For GAG work in 1953, Raytheon used the British Ferranti Mark 1 computer (the commercial version of the Manchester Mark 1 computer, in whose design Alan Turing played a key role). This computer was installed at the University of Toronto to help in the design of the St. Lawrence Seaway. Raytheon was plagued by frequent breakdowns of the computer but still produced several hundred seismic deconvolutions for the summer GAG meeting in 1953. The consortium was pleased with the geophysical results but was disheartened by the unreliability of the then-current state of digital technology. As a result, GAG was directed to find analog ways to do deconvolution. Instead, GAG found that all of the analog methods, and in particular electric frequency filtering, could be done by digital signal processing. In fact, the digital way provided greater accuracy than the analog way. At the spring meeting in 1954, GAG proposed that all analog processing be thrown out and replaced by digital signal processing. Raytheon was at the meeting and offered to obtain or build all the elements required for digital signal processing, from input to output. The conversion to digital was not done at the time. However, that step did happen in the early 1960s, and exploration geophysics has the distinction of being the first science to experience a total digital revolution. Digital processing today provides seismic images of the interior of the Earth so startling that they compare to images of the stars made by the Hubble telescope. (In fact, the digital method of deconvolution first developed in geophysics made possible the digital correction of the lens of the Hubble telescope.)


2013 ◽  
Vol 300-301 ◽  
pp. 1669-1672
Author(s):  
Yong Li

In order to improve the quality and precision of seismic data, removing or suppressing the interference waves contained in the seismic wavefield is an important step in the digital processing of seismic data. The fast Fourier transform (FFT) decomposes an N-point Discrete Fourier Transform (DFT) into combinations of smaller DFTs, replacing a large number of multiplications with additions and a small number of multiplications, so that the speed of computing the DFT is greatly enhanced. The widespread use of the FFT has made it a powerful tool in digital signal processing. The present paper gives a fairly comprehensive account of the principle of filtering, the characteristics of the FFT algorithm, and its realization.
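The frequency filtering described above amounts to transforming a trace, zeroing the band occupied by the interference wave, and inverse-transforming. A minimal sketch, with illustrative band edges (low-frequency interference such as ground roll is assumed):

```python
# FFT-based band-pass filtering of a single seismic trace.
import numpy as np

def fft_bandpass(trace, fs, f_lo=10.0, f_hi=60.0):
    """Keep only the f_lo..f_hi Hz band of a trace sampled at fs Hz."""
    spectrum = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)
    spectrum[(freqs < f_lo) | (freqs > f_hi)] = 0.0   # zero interference bands
    return np.fft.irfft(spectrum, n=len(trace))
```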

