From 3D tissue data to impedance using Simpleware ScanFE+IP and COMSOL Multiphysics – a tutorial

2019, Vol 2 (1), pp. 13-32
Author(s): Fred-Johan Pettersen, Jan Olav Høgetveit

Tools such as Simpleware ScanIP+FE and COMSOL Multiphysics allow us to gain a better understanding of bioimpedance measurements without actually doing the measurements. This tutorial covers the steps needed to go from a 3D voxel data set to a model that can be used to simulate a transfer impedance measurement. The geometrical input data used in this tutorial come from an MRI scan of a human thigh, which is converted to a mesh using Simpleware ScanIP+FE. The mesh is merged with electrical properties for the relevant tissues, and a simulation is done in COMSOL Multiphysics. Available numerical output data are the transfer impedance, the contributions from the different tissues to the final transfer impedance, and the voltages at the electrodes. Available volume output data are the normal and reciprocal current densities, potential, sensitivity, and volume impedance sensitivity. The output data are presented as both numbers and graphs. The tutorial will be useful even if data from other sources, such as VOXEL-MAN or CT scans, are used.
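
The volume output data listed above (sensitivity and volume impedance sensitivity) come from the standard reciprocal lead-field relation, so the transfer impedance and the per-tissue contributions can also be recomputed outside COMSOL once the fields are exported. Below is a minimal NumPy sketch of that post-processing step; the array names, the regular voxel grid, and the tissue label map are assumptions for illustration, not files from the tutorial.

# Minimal sketch: transfer impedance and sensitivity from exported fields.
# Assumes J_cc and J_reci (A/m^2) and sigma (S/m) have been exported from
# COMSOL on a regular voxel grid; names, shapes, and values are hypothetical.
import numpy as np

def transfer_impedance(J_cc, J_reci, sigma, voxel_volume, I_cc=1.0, I_reci=1.0):
    """Volume integral Z = sum(S / sigma) * dV, where
    S = (J_cc . J_reci) / (I_cc * I_reci) is the sensitivity field."""
    S = np.einsum('...i,...i->...', J_cc, J_reci) / (I_cc * I_reci)  # sensitivity (1/m^4)
    z_density = S / sigma                                            # volume impedance density
    return S, z_density, np.sum(z_density) * voxel_volume

if __name__ == "__main__":
    # Hypothetical per-tissue contributions from a label map
    # (e.g. 1 = muscle, 2 = fat, 3 = bone from the segmented MRI).
    shape = (40, 40, 40)
    rng = np.random.default_rng(0)
    J_cc = rng.normal(size=shape + (3,))
    J_reci = rng.normal(size=shape + (3,))
    sigma = np.full(shape, 0.35)            # S/m, muscle-like placeholder
    labels = rng.integers(1, 4, size=shape)
    vox = 1e-9                              # voxel volume in m^3 (hypothetical 1 mm^3)
    S, z_density, Z = transfer_impedance(J_cc, J_reci, sigma, voxel_volume=vox)
    print("total transfer impedance [ohm]:", Z)
    for tissue in np.unique(labels):
        print("tissue", tissue, "contribution [ohm]:",
              np.sum(z_density[labels == tissue]) * vox)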

2016, Vol 64 (1), pp. 7-13
Author(s): Onic Islam Shuvo, Md Naimul Islam

One of the major problems with Electrical Impedance Tomography (EIT) is the lack of spatial sensitivity within the measured volume. In this paper, the sensitivity distribution of the tetrapolar impedance measurement system was visualized for a cylindrical phantom consisting of homogeneous and inhomogeneous media. Previously, the sensitivity distribution had been analysed analytically only for homogeneous media with simple geometries, and the distribution was found to be complex [1,2]. For inhomogeneous volume conductors, however, the sensitivity analysis needs to be done using the finite element method (FEM). In this paper, the results of a sensitivity analysis based on the finite element method using the COMSOL Multiphysics simulation software are presented. A cylindrical, non-uniform, inhomogeneous phantom mimicking the human upper arm was chosen for the experiments, in which different parameters of interest were varied. A successful method was found for controlling the region of interest in which the sensitivity is maximum. By refining the finite element mesh and introducing multifrequency input currents (up to 1 MHz), this simulation method can be further improved.
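
One extension mentioned at the end of the abstract is the multifrequency input current (up to 1 MHz). Tissue impedance over that range is commonly described by a single-dispersion Cole model, and a quick sweep such as the sketch below can suggest at which frequencies the different tissues are expected to contrast most. This is an illustrative aside, not code from the paper, and the parameter values are hypothetical placeholders.

# Illustrative sketch (not from the paper): frequency-dependent tissue
# impedance via the single-dispersion Cole model, swept up to 1 MHz.
import numpy as np

def cole_impedance(f, R0, Rinf, tau, alpha):
    """Z(f) = Rinf + (R0 - Rinf) / (1 + (j*2*pi*f*tau)**alpha)."""
    return Rinf + (R0 - Rinf) / (1.0 + (1j * 2 * np.pi * f * tau) ** alpha)

freqs = np.logspace(3, 6, 31)                 # 1 kHz .. 1 MHz
Z = cole_impedance(freqs, R0=100.0, Rinf=25.0, tau=1e-6, alpha=0.8)
for f, z in zip(freqs[::10], Z[::10]):
    print(f"{f:10.0f} Hz  |Z| = {abs(z):6.1f} ohm  "
          f"phase = {np.degrees(np.angle(z)):6.2f} deg")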


2019, Vol 270, pp. 04015
Author(s): Edy Anto Soentoro, Nina Pebriana

Reservoir operations, especially those that regulate the outflow (release) volume, are crucial to fulfilling the purpose for which the reservoir was built. To get the best results, the outflow (release) discharges need to be optimized to meet the objectives of the reservoir operation. A fuzzy rule-based model was used in this study because it can deal with uncertainty and with constraints and objects that lack clear or well-defined boundaries. The objective of this study is to determine the maximum total release volume based on water availability (i.e., a monthly release equal to or greater than the monthly demand). The case study is located at the Darma reservoir. A fuzzy rule-based model was used to optimize the monthly release volume, and the result was compared with that of NLP and with the demand. The Sugeno fuzzy method was used to generate fuzzy rules from a given input-output data set consisting of demand, inflow, storage, and release. The results of this study showed that the release obtained with the Sugeno method and the demand follow the same basic pattern, in which the release fulfills the demand. The overall result showed that the fuzzy rule-based model with the Sugeno method can be used for optimization based on real-life experience from experts who are used to working in the field.
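
For readers unfamiliar with the Sugeno (Takagi-Sugeno-Kang) method mentioned above: each rule has fuzzy antecedents on the inputs (here inflow, storage, demand) and a crisp consequent for the release, and the final release is the firing-strength-weighted average of the rule consequents. The sketch below shows one Sugeno inference step in Python; the membership functions, rule base, and numbers are hypothetical and are not the Darma reservoir rule base.

# Minimal sketch of a Sugeno (TSK) inference step for a monthly release
# decision; inflow and storage are normalized to [0, 1], demand is a crisp
# volume, and every rule's consequent is a simple function of the demand.
def trimf(x, a, b, c):
    """Triangular membership function with corners a, b, c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def sugeno_release(inflow, storage, demand):
    # Antecedent memberships.
    low_in, high_in = trimf(inflow, -0.5, 0.0, 0.6), trimf(inflow, 0.4, 1.0, 1.5)
    low_st, high_st = trimf(storage, -0.5, 0.0, 0.6), trimf(storage, 0.4, 1.0, 1.5)

    # Rules: (firing strength, crisp consequent for the release).
    rules = [
        (min(low_in, low_st),   0.6 * demand),   # dry month, low storage: ration
        (min(low_in, high_st),  1.0 * demand),   # low inflow but full reservoir
        (min(high_in, low_st),  1.0 * demand),   # refill while meeting demand
        (min(high_in, high_st), 1.2 * demand),   # surplus: release extra
    ]
    total_w = sum(w for w, _ in rules)
    return sum(w * z for w, z in rules) / total_w if total_w > 0 else demand

print(sugeno_release(inflow=0.7, storage=0.8, demand=50.0))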


2004, Vol 65 (3), pp. 273-288
Author(s): Dimosthenis Anagnostopoulos, Vassilis Dalakas, Mara Nikolaidou

2011, Vol 21 (03), pp. 247-263
Author(s): J. P. FLORIDO, H. POMARES, I. ROJAS

In function approximation problems, one of the most common ways to evaluate a learning algorithm consists in partitioning the original data set (input/output data) into two sets: a learning set, used for building models, and a test set, used for genuine out-of-sample evaluation. When the partition into learning and test sets does not take into account the variability and geometry of the original data, it may lead to unbalanced and unrepresentative learning and test sets and, thus, to wrong conclusions about the accuracy of the learning algorithm. How the partitioning is made is therefore a key issue, and it becomes more important when the data set is small, owing to the need to reduce the pessimistic effects caused by removing instances from the original data set. Thus, in this work, we propose a deterministic data-mining approach for distributing a data set (input/output data) into two representative and balanced sets of roughly equal size, taking the variability of the data set into consideration, with the purpose of both allowing a fair evaluation of the learning algorithm's accuracy and making reproducible the machine learning experiments that are usually based on random partitions. The sets are generated using a combination of a clustering procedure, especially suited for function approximation problems, and a distribution algorithm that assigns the data within each cluster to the two sets based on a nearest-neighbor approach. In the experiments section, the performance of the proposed methodology is reported in a variety of situations through an ANOVA-based statistical study of the results.
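
The distribution step described above (clustering followed by a nearest-neighbor assignment within each cluster) can be illustrated with a short sketch. The following Python code is a simplified illustration of that idea, not the authors' exact algorithm: it clusters the joint input/output data with k-means and then sends the two mutually closest remaining points of each cluster to opposite sets.

# Simplified cluster-then-distribute split (illustrative, not the paper's method).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import pairwise_distances

def balanced_split(X, y, n_clusters=5, random_state=0):
    data = np.column_stack([X, y])                 # cluster on inputs and output together
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=random_state).fit_predict(data)
    set_a, set_b = [], []
    for c in range(n_clusters):
        idx = np.where(labels == c)[0]
        D = pairwise_distances(data[idx])
        np.fill_diagonal(D, np.inf)
        remaining = list(idx)
        while len(remaining) >= 2:
            # Take the two mutually closest remaining points of this cluster
            # and place one in each set.
            sub = [np.where(idx == r)[0][0] for r in remaining]
            Dsub = D[np.ix_(sub, sub)]
            i, j = np.unravel_index(np.argmin(Dsub), Dsub.shape)
            set_a.append(remaining[i]); set_b.append(remaining[j])
            for k in sorted([i, j], reverse=True):
                remaining.pop(k)
        if remaining:                              # odd cluster size: leftover point
            set_a.append(remaining[0])
    return np.array(set_a), np.array(set_b)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(60, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2
learn_idx, test_idx = balanced_split(X, y)
print(len(learn_idx), len(test_idx))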


2003, Vol 07 (01), pp. 15-23
Author(s): Tomotaka Nakajima, Richard E. Hughes, Kai-Nan An

The goal of this study was to visualize the supraspinatus tendon three-dimensionally using fast spin-echo (FSE) MRI and to validate the accuracy of measuring the dimensions of the supraspinatus tendon based on 3D reconstructed images. Nine cadaver shoulders (51–84 y/o, mean 70.0 y/o) were imaged with conventional T2-weighted spin-echo (CSE), gradient-echo (GRE), and 3D T2-weighted FSE sequences. Each "object" (the supraspinatus muscle, the tendon, and the scapula) was reconstructed three-dimensionally using ANALYZE™ image data processing software. The FSE images revealed significantly higher contrast of the tendon and higher contrast-to-noise ratios of fat-to-tendon and fat-to-muscle. The lengths of the anterior, middle, and posterior portions of the tendon were measured in two ways: (1) from the three-dimensional reconstructed images, and (2) directly from the cadaver specimens using calipers. No statistically significant differences were found between the ANALYZE™ and caliper measurements using a paired t-test for the anterior (p = 0.55), middle (p = 0.57) and posterior (p = 0.44) portions of the supraspinatus. The 3D FSE sequence exhibits higher spatial resolution, requires a shorter acquisition time, and yields a voxel data set. These advantages can prevent blurring artifacts when imaging the supraspinatus tendon in the human body. Tendon length measurements derived from the three-dimensional reconstructions using ANALYZE™ were found to be good estimates of the actual tendon length. Therefore, the combination of an FSE sequence and 3D image data processing provides a method for noninvasive quantitative analysis of supraspinatus tendon morphology. The results lay the groundwork for future quantitative studies of cuff pathology.
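
The statistical comparison reported above is a paired t-test of the two measurement methods on the same nine specimens. A minimal sketch of that test is shown below; the length values are made up for illustration and are not the study's data.

# Paired comparison of 3D-reconstruction vs. caliper tendon lengths (mm).
# The nine values per method are hypothetical placeholders.
import numpy as np
from scipy.stats import ttest_rel

recon_mm   = np.array([24.1, 26.8, 22.5, 25.0, 27.3, 23.9, 26.1, 24.7, 25.5])
caliper_mm = np.array([24.4, 26.5, 22.9, 24.6, 27.0, 24.2, 26.4, 24.3, 25.8])

t_stat, p_value = ttest_rel(recon_mm, caliper_mm)
print(f"mean difference = {np.mean(recon_mm - caliper_mm):+.2f} mm, "
      f"t = {t_stat:.2f}, p = {p_value:.2f}")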


2017
Author(s): Bernardo A. Mello, Yuhai Tu

Deciphering molecular mechanisms in biological systems from system-level input-output data is challenging, especially for complex processes that involve interactions among multiple components. Here, we study the regulation of the multi-domain (P1-5) histidine kinase CheA by the MCP chemoreceptors. We develop a network model to describe the dynamics of the system, treating the receptor complex with CheW and the P3P4P5 domains of CheA as a regulated enzyme with two substrates, P1 and ATP. The model enables us to search the hypothesis space systematically for the simplest possible regulation mechanism consistent with the available data. Our analysis reveals a novel dual-regulation mechanism wherein, besides regulating ATP binding, the receptor activity has to regulate one other key reaction, either P1 binding or the phosphotransfer between P1 and ATP. Furthermore, our study shows that the receptors only control kinetic rates of the enzyme without changing its equilibrium properties. Predictions are made for future experiments to distinguish the remaining two dual-regulation mechanisms. This systems-biology approach of combining modeling with a large input-output data set should be applicable to studying other complex biological processes.
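
To make the "regulated enzyme with two substrates" framing concrete, the sketch below writes a generic rapid-equilibrium two-substrate rate law in which the receptor activity scales a kinetic rate without touching the binding (equilibrium) constants. It is purely illustrative and is not the authors' network model; all parameter values are hypothetical.

# Illustrative only (not the paper's model): a generic two-substrate rate law
# for an enzyme acting on P1 and ATP, with receptor activity scaling k_cat.
def phosphorylation_rate(p1, atp, activity, k_cat=10.0, K_p1=5.0, K_atp=100.0):
    """v = a * k_cat * [P1][ATP] / ((K_P1 + [P1]) * (K_ATP + [ATP]))."""
    return activity * k_cat * p1 * atp / ((K_p1 + p1) * (K_atp + atp))

for a in (0.1, 0.5, 1.0):                 # receptor activity on a hypothetical scale
    print(a, phosphorylation_rate(p1=2.0, atp=500.0, activity=a))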

