An Artificial Sweating System for Sweat Sensor Testing Applications

Electronics ◽  
2019 ◽  
Vol 8 (6) ◽  
pp. 606 ◽  
Author(s):  
Andrew Brueck ◽  
Kyle Bates ◽  
Trent Wood ◽  
William House ◽  
Zackary Martinez ◽  
...  

This research proposes a completely automated, computer-controlled fluid mixing and dispensing system, suitable for testing sweat sensing devices, as an alternative to human trials during the development phase of a sweat sensor device. An arm mold was designed and implemented with Dragon Skin silicone and pores to simulate sweating action. The relay-controlled mixing tanks allow different concentrations of fluid solutions to be dispensed through the pores at various rates. The onboard single-board computer controls a dozen electronic relays and switches, and presents an easy-to-use graphical user interface that lets end users conduct experiments without further programming. With the recent advances in sweat sensors, this platform offers a unique way of testing sensing devices during development, allowing researchers to focus on their design parameters one at a time before actual validation through human trials is conducted. The current device can provide sweat rates from 1 µL/min to 500 µL/min. Furthermore, salt concentrations from 10 mM up to 200 mM were repeatedly produced. In an ANOVA test with salt concentrations varying from 40 mM to 60 mM, a p-value of 0.365 indicates that concentration has no significant effect on flow rate. Similarly, p-values of 0.329 and 0.167 for different relative humidity and temperature conditions, respectively, indicate no statistically significant differences. Lastly, when the interactions among all the factors were considered, a p-value of 0.416 indicates that system performance is insensitive to these factors, validating the system's reliability.
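The kind of one-way ANOVA reported here (does salt concentration affect flow rate?) can be sketched in a few lines of Python. The measurements below are illustrative placeholders, not the authors' data:

```python
# One-way ANOVA: does salt concentration (40/50/60 mM) affect flow rate?
# The flow-rate samples are invented for illustration only.
from scipy.stats import f_oneway

flow_40mM = [99.8, 100.2, 100.1, 99.9, 100.0]   # µL/min at 40 mM
flow_50mM = [100.1, 99.7, 100.0, 100.2, 99.8]   # µL/min at 50 mM
flow_60mM = [100.0, 100.1, 99.9, 100.3, 99.7]   # µL/min at 60 mM

f_stat, p_value = f_oneway(flow_40mM, flow_50mM, flow_60mM)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
# A large p-value (> 0.05) fails to reject the null hypothesis that mean
# flow rate is the same across concentrations, as the abstract reports.
```

With near-identical group means, the test returns a large p-value, matching the paper's conclusion that concentration does not drive flow rate.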

2019 ◽  
Author(s):  
Don van den Bergh ◽  
Johnny van Doorn ◽  
Maarten Marsman ◽  
Tim Draws ◽  
Erik-Jan van Kesteren ◽  
...  

Analysis of variance (ANOVA) is the standard procedure for statistical inference in factorial designs. Typically, ANOVAs are executed using frequentist statistics, where p-values determine statistical significance in an all-or-none fashion. In recent years, the Bayesian approach to statistics has increasingly been viewed as a legitimate alternative to the p-value. However, the broad adoption of Bayesian statistics, and Bayesian ANOVA in particular, is frustrated by the fact that Bayesian concepts are rarely taught in applied statistics courses. Consequently, practitioners may be unsure how to conduct a Bayesian ANOVA and interpret the results. Here we provide a guide for executing and interpreting a Bayesian ANOVA with JASP, an open-source statistical software program with a graphical user interface. We explain the key concepts of the Bayesian ANOVA using two empirical examples.
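The workflow described here lives in JASP's GUI; as a rough scriptable analogue, a Bayes factor for an ANOVA-style model comparison can be approximated from BIC values via BF01 ≈ exp((BIC1 − BIC0)/2) (Wagenmakers, 2007). This is a sketch under that approximation, with invented data, not JASP's default analysis:

```python
# BIC-approximate Bayes factor comparing an intercept-only model against a
# one-factor ANOVA model. Data are simulated with no true group effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "group": np.repeat(["a", "b", "c"], 20),
    "y": rng.normal(0.0, 1.0, 60),          # no true group effect
})

null_model = smf.ols("y ~ 1", data=df).fit()          # intercept only
full_model = smf.ols("y ~ C(group)", data=df).fit()   # group effect

# BF01 > 1 favors the null model over the group-effect model.
bf01 = np.exp((full_model.bic - null_model.bic) / 2.0)
print(f"BF01 ~= {bf01:.2f}")
```

Unlike an all-or-none p-value, the Bayes factor quantifies how strongly the data favor one model over the other, which is the conceptual shift the paper teaches.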


2019 ◽  
Vol 11 (2) ◽  
pp. 138-148
Author(s):  
Adri Senen ◽  
Titi Ratnasari ◽  
Dwi Anggaini

The level of reliability of a distribution system is very important to ensure the continuity of electricity supply to consumers. This research calculates the SAIDI and SAIFI reliability indices of the 20 kV distribution system at PLN Pinang for 2017 with the help of the MATLAB R2008a application. The GUI is a MATLAB feature widely used for presenting information and commands, and users can build and format it according to the needs of the system itself. With the help of this software, it is hoped that calculating the system reliability indices, namely SAIDI and SAIFI, will be easier, and that the results can be stored in a better-organized database. The calculation results show the highest and lowest SAIDI and SAIFI values occurring in each month, making it easier to evaluate, analyze, and improve the reliability of the electrical power system network in the future, especially at PT.
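Both indices follow the standard IEEE 1366 definitions and can be computed directly from outage records. A minimal sketch, using illustrative outage data rather than the PLN records:

```python
# SAIFI = total customer interruptions / total customers served
# SAIDI = total customer-hours of interruption / total customers served
# The outage records below are invented for illustration.

outages = [
    # (customers interrupted, interruption duration in hours)
    (1200, 1.5),
    (300, 0.5),
    (4500, 2.0),
]
total_customers = 10_000

saifi = sum(n for n, _ in outages) / total_customers       # interruptions/customer
saidi = sum(n * d for n, d in outages) / total_customers   # hours/customer
print(f"SAIFI = {saifi:.3f} interruptions/customer")       # -> 0.600
print(f"SAIDI = {saidi:.3f} hours/customer")               # -> 1.095
```

Running this per month over a year of outage records yields exactly the monthly highest/lowest comparison the paper reports.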


Author(s):  
Eric C. Marineau ◽  
Marcelo Reggio

This paper presents the characteristics and the usefulness of a MATLAB toolbox for teaching the effects of design parameters on the performance and characteristics of axial and radial compressors, turbines, and pumps. Teaching and learning the working principles of turbomachines is challenging, as it requires understanding and applying various concepts from fluid dynamics, thermodynamics, and dimensional analysis. It is believed that the current toolbox will help students acquire durable and intuitive knowledge, as it provides an effective discovery environment in which users progressively gain insight into the laws and rules of a system by manipulating variables and visualizing the resulting consequences. For each type of component, distinct sets of input variables can be chosen, each one corresponding to a different design problem. Based on mean-radius analysis and using the input parameters provided by the user, the toolbox produces a preliminary design. The results are displayed in different types of graphics, such as velocity triangles, efficiency contours, and a sketch of the designed turbomachine. The toolbox's graphical user interface (GUI) ensures user-friendliness so that the user can focus strictly on the content.
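As a rough illustration of the mean-radius calculations such a toolbox visualizes, the sketch below builds the velocity triangles for a single axial-compressor stage. All input values (blade speed, axial velocity, flow angles) are illustrative assumptions, not taken from the toolbox:

```python
# Mean-radius velocity triangles and Euler work for one axial-compressor stage.
import math

U = 250.0        # blade speed at mean radius [m/s] (assumed)
Cx = 150.0       # axial velocity, taken constant through the stage [m/s]
alpha1 = math.radians(10.0)   # absolute flow angle at rotor inlet
alpha2 = math.radians(40.0)   # absolute flow angle at rotor exit

# Tangential components of the absolute and relative velocities
Ctheta1 = Cx * math.tan(alpha1)
Ctheta2 = Cx * math.tan(alpha2)
Wtheta1 = Ctheta1 - U
Wtheta2 = Ctheta2 - U

# Relative flow angles (the rotor-side legs of the velocity triangles)
beta1 = math.degrees(math.atan2(Wtheta1, Cx))
beta2 = math.degrees(math.atan2(Wtheta2, Cx))

# Euler work equation: specific work done on the fluid by the rotor
delta_h0 = U * (Ctheta2 - Ctheta1)   # [J/kg]
print(f"beta1 = {beta1:.1f} deg, beta2 = {beta2:.1f} deg, "
      f"work = {delta_h0 / 1000:.1f} kJ/kg")
```

Plotting (Cx, Ctheta) and (Cx, Wtheta) as vectors gives exactly the velocity-triangle graphic the toolbox draws.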


Author(s):  
Gordon J. Savage ◽  
Young Kap Son

The application of reliability-based design optimization (RBDO) to degrading systems is challenging because of the continual interplay between calculating time-variant reliability (to ensure reliability policies are met) and moving the design point to optimize various objectives, such as cost, weight, size and so forth. The time needed for Monte Carlo simulation (MCS) is lengthy when reliability calculations are required for each iteration of the design point. The common methods used to date to improve efficiency have some shortcomings: first, most approaches approximate probability via a method that invokes the most-likely failure point (MLFP); second, tolerances are almost always excluded from the list of design parameters (hence only so-called parameter design is performed), and, without tolerances, true monetary costs cannot be determined, especially in manufactured systems. Herein, the efficiency of RBDO for degrading systems is greatly improved by essentially uncoupling the time-variant reliability problem from the optimization problem. First, a meta-model is built to relate time-variant reliability to the design space. Design-of-experiments techniques help select a few judicious training sets. Second, the meta-model is accessed to quickly evaluate objectives and reliability constraints in the optimization process. The set-theory approach (with MCS) is invoked to find the system reliability accurately and efficiently for multiple competing performance measures. For a case study, the seminal roller clutch with degradation due to wear is examined. The meta-model method, using both moving least-squares and kriging (using DACE in MATLAB), is compared to the traditional approach whereby reliability is determined by MCS at each optimization iteration. The case study shows that both means and tolerances are found that correctly minimize a monetary cost objective and yet ensure that reliability policies are met.
The meta-model approach is simple, accurate and very fast, suggesting an attractive means for RBDO of time-variant systems.
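The uncoupling idea can be sketched compactly: run MCS only at a few DOE training points, fit a surrogate over the design space, then optimize against the cheap surrogate. The sketch below uses a toy strength-vs-load limit state and an RBF surrogate in place of the paper's roller clutch and kriging/moving-least-squares models:

```python
# Surrogate-based RBDO sketch: (1) MCS reliability at DOE points,
# (2) fit a surrogate over (mean, tolerance), (3) optimize on the surrogate.
# The limit state and cost function are illustrative assumptions.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

rng = np.random.default_rng(1)

def reliability_mcs(mean, tol, n=20_000):
    """Toy MCS: P(strength > load of 10) with strength ~ N(mean, tol/3)."""
    strength = rng.normal(mean, tol / 3.0, n)
    return float(np.mean(strength > 10.0))

# (1)-(2) DOE training set over (mean, tolerance), then an RBF surrogate
train = np.array([[m, t] for m in (10.5, 11.0, 11.5, 12.0)
                         for t in (0.3, 0.6, 0.9)])
rel = np.array([reliability_mcs(m, t) for m, t in train])
surrogate = RBFInterpolator(train, rel)

# (3) Minimize a monetary cost (raising the mean and tightening the tolerance
# both cost money) subject to a reliability policy R >= 0.999.
cost = lambda x: x[0] + 2.0 / x[1]
cons = {"type": "ineq", "fun": lambda x: surrogate(x[None, :])[0] - 0.999}
res = minimize(cost, x0=np.array([11.5, 0.5]), constraints=cons,
               bounds=[(10.5, 12.0), (0.3, 0.9)])
print("design (mean, tol):", res.x, "cost:", res.fun)
```

The optimizer never calls the MCS inside its loop, which is exactly where the paper's speedup over per-iteration simulation comes from; note that, as in the paper, the tolerance is itself a design variable.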


2020 ◽  
Vol 10 (15) ◽  
pp. 5112 ◽  
Author(s):  
Bong-Sul Lee ◽  
Abera Tullu ◽  
Ho-Yon Hwang

An optimization study of an electric vertical takeoff and landing personal air vehicle (eVTOL PAV) was performed during the conceptual design stage using the design-of-experiments method. In defining the initial problem, a design target parameter was set. The PAV subsystem selection was based on a configuration tradeoff study matrix, which was used to conduct configuration selection effectively. Initial sizing was performed using the PAV sizing program developed by this research team using Microsoft Excel and Visual Basic for Applications (VBA). A screening test was performed to find the parameters with the highest sensitivity among the independent design parameters. The response surface method was used to model the design target parameters, and a regression equation was estimated using the experimental design method. A Monte Carlo simulation was performed to confirm the feasibility of the generated model. To optimize the independent design parameters, a satisfaction function was selected, and the appropriateness of the data was determined using a Pareto plot and p-values.
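The response-surface and Monte Carlo steps can be sketched as follows: fit a quadratic regression to DOE samples of a target over scaled design parameters, then sample the fitted model. The "response" function and all numbers are illustrative stand-ins, not outputs of the PAV sizing program:

```python
# Quadratic response surface over a face-centered DOE, then Monte Carlo
# propagation of design-parameter uncertainty through the fitted model.
import numpy as np

rng = np.random.default_rng(0)

def true_response(x1, x2):          # stand-in for the sizing-program output
    return 100 + 8 * x1 - 5 * x2 - 3 * x1 * x2 + 2 * x1**2

# DOE over the scaled design space [-1, 1]^2 (3-level full factorial)
pts = np.array([(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)], float)
y = true_response(pts[:, 0], pts[:, 1])

# Regression: y ~ b0 + b1*x1 + b2*x2 + b3*x1*x2 + b4*x1^2 + b5*x2^2
X = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0] * pts[:, 1], pts[:, 0]**2, pts[:, 1]**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Monte Carlo: propagate parameter uncertainty through the regression model
x1 = rng.normal(0.2, 0.1, 50_000)
x2 = rng.normal(-0.1, 0.1, 50_000)
y_mc = (beta[0] + beta[1] * x1 + beta[2] * x2 + beta[3] * x1 * x2
        + beta[4] * x1**2 + beta[5] * x2**2)
print(f"predicted response: mean {y_mc.mean():.1f}, std {y_mc.std():.2f}")
```

Because the regression is cheap to evaluate, tens of thousands of Monte Carlo draws take milliseconds, which is what makes the feasibility check in the study practical.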


Author(s):  
M.F. Schmid ◽  
R. Dargahi ◽  
M. W. Tam

Electron crystallography is an emerging field for structure determination, as evidenced by a number of membrane proteins that have been solved to near-atomic resolution. Advances in specimen preparation and in data acquisition with a 400 kV microscope by computer-controlled spot scanning mean that our ability to record electron image data will outstrip our capacity to analyze it. The computed Fourier transform of these images must be processed in order to provide a direct measurement of the amplitudes and phases needed for 3-D reconstruction. In anticipation of this processing bottleneck, we have written a program that incorporates a menu- and mouse-driven procedure for auto-indexing and refining the reciprocal lattice parameters in the computed transform from an image of a crystal. It is linked to subsequent steps of image processing by a system of databases and spawned child processes; data transfer between different program modules no longer requires manual data entry. The progress of the reciprocal lattice refinement is monitored visually and quantitatively. If desired, the processing is carried through the lattice distortion correction (unbending) steps automatically.
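The first step the program automates, locating reciprocal-lattice peaks in the computed transform, can be illustrated with a minimal sketch. The synthetic "crystal" here is a clean 2-D cosine lattice; real micrographs are far noisier and require the refinement the abstract describes:

```python
# Locate the strongest reciprocal-lattice peak in the FFT of a synthetic
# crystal image (a sum of two cosine gratings at 8 and 5 cycles/image).
import numpy as np

n = 256
yy, xx = np.mgrid[0:n, 0:n]
image = np.cos(2 * np.pi * 8 * xx / n) + np.cos(2 * np.pi * 5 * yy / n)

spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image)))
spectrum[n // 2, n // 2] = 0.0        # suppress the DC term

# The strongest remaining peak gives one reciprocal-lattice vector (in cycles
# per image); full auto-indexing would collect all peaks and fit a 2-D lattice.
row, col = np.unravel_index(np.argmax(spectrum), spectrum.shape)
h, k = col - n // 2, row - n // 2
print(f"strongest reciprocal-lattice peak at (h, k) = ({h}, {k})")
```

Refinement then adjusts the fitted lattice vectors to minimize the distance to all detected peaks, which is the step the program monitors visually and quantitatively.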


Author(s):  
R. J. Lee ◽  
J. S. Walker

Electron microscopy (EM), with the advent of computer control and image analysis techniques, is rapidly evolving from an interpretative science into a quantitative technique. Electron microscopy is potentially of value in two general aspects of environmental health: exposure and diagnosis. In diagnosis, electron microscopy is essentially an extension of optical microscopy. The goal is to characterize cellular changes induced by external agents. The external agent could be any foreign material, chemicals, or even stress. The use of electron microscopy as a diagnostic tool is well-developed, but computer-controlled electron microscopy (CCEM) has had only limited impact, mainly because it is fairly new and many institutions lack the resources to acquire the capability. In addition, major contributions to diagnosis will come from CCEM only when image analysis (IA) and processing algorithms are developed that allow the morphological and textural changes recognized by experienced medical practitioners to be quantified. The application of IA techniques to compare cellular structure is still in a primitive state.


Author(s):  
Robert W. Mackin

This paper presents two advances towards the automated three-dimensional (3-D) analysis of thick and heavily-overlapped regions in cytological preparations such as cervical/vaginal smears. First, a high-speed 3-D brightfield microscope has been developed, allowing the acquisition of image data at speeds approaching 30 optical slices per second. Second, algorithms have been developed to detect and segment nuclei in spite of the extremely high image variability and low contrast typical of such regions. The analysis of such regions is inherently a 3-D problem that cannot be solved reliably with conventional 2-D imaging and image analysis methods. High-speed 3-D imaging of the specimen is accomplished by moving the specimen axially relative to the objective lens of a standard microscope (Zeiss) at a speed of 30 steps per second, where the step size is adjustable from 0.2 to 5 μm. The specimen is mounted on a computer-controlled, piezoelectric microstage (Burleigh PZS-100, 68 μm displacement). At each step, an optical slice is acquired using a CCD camera (SONY XC-11/71 IP, Dalsa CA-D1-0256, and CA-D2-0512 have been used) connected to a 4-node array processor system based on the Intel i860 chip.

