A comparative soil liquefaction analysis with a MATLAB® based algorithm: soiLique

2021 ◽  
Vol 25 (3) ◽  
pp. 323-340
Author(s):  
Ekrem Bekin ◽  
Ferhat Ozcep

Soil liquefaction is one of the ground failures induced by earthquakes. Determining the factor of safety and the settlements is the most common analysis used to reduce liquefaction-induced failures and hazards. Scientists have proposed numerous empirical formulas to detect and mitigate liquefaction hazards, and these have been in use for decades. This study presents a user-friendly, interactive program for deterministic soil liquefaction analysis. The algorithm presented in this study, soiLique, is the first MATLAB® program with a graphical user interface that performs deterministic liquefaction analysis and computes the parameters defined by these empirical formulas. One advantage of soiLique is that it allows the physical character of every layer (i.e., fine- or coarse-grained) to be specified, so that liquefaction-prone layers can be addressed directly when necessary. The program not only calculates the parameters relevant to soil liquefaction but also presents the results graphically. The robustness of soiLique is checked against another soil liquefaction analysis program, SoilEngineering, introduced by Ozcep (2010). Calculations were carried out separately using real SPT data and synthetic data such as VS measurements and CPT data; the real SPT data and synthetic VS data were used to compare soiLique with SoilEngineering (Ozcep, 2010). An example CPT analysis is also presented, although it could not be used for the comparison. The comparisons reveal that the outputs of soiLique and the results of SoilEngineering are in good agreement.
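
For readers unfamiliar with the deterministic procedure the abstract refers to, the sketch below illustrates the standard Seed-Idriss simplified factor-of-safety calculation (in its NCEER/Youd et al. 2001 SPT form) that tools of this kind automate. It is not soiLique's actual code or API; the function names and input values are illustrative assumptions.

```python
# Minimal sketch of a deterministic SPT-based liquefaction check.
# This is NOT soiLique itself; it only illustrates the Seed-Idriss simplified
# procedure (NCEER/Youd et al. 2001 form) that such programs typically implement.
import math

def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    """CSR = 0.65 * (a_max/g) * (sigma_v / sigma_v') * rd."""
    # Depth reduction factor r_d (Liao & Whitman style approximation).
    rd = 1.0 - 0.00765 * depth_m if depth_m <= 9.15 else 1.174 - 0.0267 * depth_m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

def cyclic_resistance_ratio(n1_60cs):
    """CRR for M = 7.5 from the clean-sand corrected blow count (N1)60cs (< 30)."""
    n = n1_60cs
    return 1.0 / (34.0 - n) + n / 135.0 + 50.0 / (10.0 * n + 45.0) ** 2 - 1.0 / 200.0

def factor_of_safety(a_max_g, sigma_v, sigma_v_eff, depth_m, n1_60cs, msf=1.0):
    csr = cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m)
    crr = cyclic_resistance_ratio(n1_60cs)
    return crr * msf / csr

# Illustrative single-layer example (all input values are made up).
print(factor_of_safety(a_max_g=0.3, sigma_v=95.0, sigma_v_eff=60.0,
                       depth_m=5.0, n1_60cs=14))
```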

2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Ermanno Cordelli ◽  
Paolo Soda ◽  
Giulio Iannello

Abstract Background Biological phenomena usually evolve over time, and recent advances in high-throughput microscopy have made it possible to collect multiple 3D images over time, generating 3D+t (or 4D) datasets. To extract useful information, spatial and temporal data on the particles in the images must be obtained, but particle tracking and feature extraction need some kind of assistance. Results This manuscript introduces our new freely downloadable toolbox, Visual4DTracker. It is a MATLAB package implementing several useful functionalities to navigate, analyse and proof-read the track of each particle detected in any 3D+t stack. Furthermore, it allows users to proof-read and to evaluate the traces with respect to a given gold standard. The Visual4DTracker toolbox permits users to visualize and save all the generated results through a user-friendly graphical user interface. This tool has been used successfully in three applicative examples. The first processes synthetic data to show all the software functionalities. The second shows how to process a 4D image stack capturing the time-lapse growth of Drosophila cells in an embryo. The third presents a quantitative analysis of insulin granules in living beta-cells, showing that such particles have two main dynamics that coexist inside the cells. Conclusions Visual4DTracker is a software package for MATLAB to visualize, handle and manually track 3D+t stacks of microscopy images containing objects such as cells, granules, etc. With its unique set of functions, it permits the user to analyze and proof-read 4D data in a friendly 3D fashion. The tool is freely available at https://drive.google.com/drive/folders/19AEn0TqP-2B8Z10kOavEAopTUxsKUV73?usp=sharing
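
The abstract does not detail Visual4DTracker's tracking algorithm; the sketch below only illustrates the kind of frame-to-frame particle linking that such 3D+t tools assist the user with, using a simple nearest-neighbour rule. All names and values are illustrative assumptions.

```python
# Generic sketch of frame-to-frame particle linking in a 3D+t dataset.
# Not Visual4DTracker's algorithm; a minimal nearest-neighbour illustration.
import numpy as np

def link_tracks(detections, max_dist=5.0):
    """detections: list over time of (N_t, 3) arrays of particle centroids.
    Returns a list of tracks, each a list of (frame, point) tuples."""
    tracks = [[(0, p)] for p in detections[0]]
    for t in range(1, len(detections)):
        points = list(map(np.asarray, detections[t]))
        for track in tracks:
            last_t, last_p = track[-1]
            if last_t != t - 1 or not points:
                continue  # track already ended, or no candidates left
            dists = [np.linalg.norm(p - last_p) for p in points]
            j = int(np.argmin(dists))
            if dists[j] <= max_dist:
                track.append((t, points.pop(j)))
        # Unmatched detections start new tracks.
        tracks.extend([[(t, p)] for p in points])
    return tracks

# Toy example: two drifting particles followed over three frames.
frames = [np.array([[0, 0, 0], [10, 10, 10]]),
          np.array([[1, 0, 0], [10, 11, 10]]),
          np.array([[2, 0, 0], [10, 12, 10]])]
print(len(link_tracks(frames)))  # -> 2 tracks
```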


2021 ◽  
Vol 11 (11) ◽  
pp. 5283
Author(s):  
Jui-Ching Chou ◽  
Hsueh-Tusng Yang ◽  
Der-Guey Lin

Soil-liquefaction-related hazards can damage structures or lead to extensive loss of life and property. Therefore, the stability and safety of structures against soil liquefaction must be evaluated in earthquake design. In practice, the simplified liquefaction analysis procedure combined with numerical simulation analysis is the most commonly used approach for evaluating the behavior of structures or the effectiveness of mitigation plans. First, the occurrence of soil liquefaction is evaluated using the simplified procedure. If soil liquefaction occurs, the resulting structural damage or the proposed mitigation plan is evaluated using numerical simulation analysis. Rational and comparable results between the simplified liquefaction analysis procedure and the numerical simulation are achieved by ensuring that the liquefaction constitutive model used in the simulation has a liquefaction resistance consistent with the simplified procedure. In this study, two frequently used liquefaction constitutive models (the Finn model and the UBCSAND model) were calibrated by fitting the liquefaction triggering curves of the simplified liquefaction analysis procedures most commonly used in Taiwan (NCEER, HBF, JRA96, and T-Y) using the FLAC program. In addition, the responses of the two calibrated models were compared and discussed to provide guidelines for selecting an appropriate liquefaction constitutive model in future projects.
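
Conceptually, "calibration by fitting a triggering curve" means adjusting constitutive-model parameters until the model's predicted resistance matches a target curve. The sketch below illustrates that idea only; the real workflow drives cyclic element tests of the Finn or UBCSAND model in FLAC, whereas here a made-up one-parameter resistance function stands in for the constitutive model and the NCEER SPT triggering curve is the target.

```python
# Conceptual sketch of calibrating a model parameter against a triggering curve.
# The one-parameter "model_crr" is a stand-in, not the Finn or UBCSAND model.
import numpy as np
from scipy.optimize import curve_fit

def crr_nceer(n1_60cs):
    """NCEER (Youd et al. 2001) CRR curve for M = 7.5, clean sand, N < 30."""
    n = np.asarray(n1_60cs, dtype=float)
    return 1.0 / (34.0 - n) + n / 135.0 + 50.0 / (10.0 * n + 45.0) ** 2 - 1.0 / 200.0

def model_crr(n1_60cs, c1):
    """Hypothetical one-parameter stand-in for the constitutive model's resistance."""
    return c1 * np.exp(np.asarray(n1_60cs, dtype=float) / 13.0) / 100.0

# Fit the stand-in parameter so its curve matches the target triggering curve.
n_values = np.arange(5, 26)
c1_fit, _ = curve_fit(model_crr, n_values, crr_nceer(n_values), p0=[5.0])
print("calibrated parameter:", c1_fit[0])
```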


2021 ◽  
Author(s):  
De-Xing Zhu ◽  
Hong-Ming Liu ◽  
Yang-Yang Xu ◽  
You-Tian Zou ◽  
Xi-Jun Wu ◽  
...  

Abstract In the present work, considering the preformation probability of the two emitted protons in the parent nucleus, we extend the Coulomb and proximity potential model (CPPM) to systematically study the two-proton (2p) radioactivity half-lives of nuclei close to the proton drip line, with the proximity potential chosen as Prox.81, proposed by Blocki et al. in 1981. Furthermore, we apply this model to predict the half-lives of possible 2p radioactive candidates whose 2p radioactivity is energetically allowed or observed but not yet quantified in the evaluated nuclear properties table NUBASE2016. The predicted results are in good agreement with those from other theoretical models and empirical formulas, namely the effective liquid drop model (ELDM), the generalized liquid drop model (GLDM), the Gamow-like model, the Sreeja formula and the Liu formula.
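
As a schematic illustration of the bookkeeping common to barrier-penetration models of 2p radioactivity, the sketch below computes T_1/2 = ln2 / (S_2p * nu * P), with the penetration probability P from a WKB integral. It is not the CPPM: the proximity (Prox.81) term is omitted and only a point-charge Coulomb barrier is kept, so the numbers are purely illustrative.

```python
# Schematic WKB half-life sketch for 2p emission (NOT the CPPM of the paper).
import numpy as np

HBARC = 197.3269718   # MeV fm
E2 = 1.4399764        # e^2 in MeV fm
AMU = 931.494         # MeV/c^2

def wkb_half_life(z_daughter, a_daughter, q_2p_mev, s_2p=1.0, nu=1.0e21):
    """q_2p_mev: released energy; nu: assault frequency [1/s] (assumed value)."""
    mu = (2.0 * a_daughter / (a_daughter + 2.0)) * AMU        # reduced mass, MeV/c^2
    zz = 2.0 * z_daughter                                      # charge product, 2p cluster x daughter
    r_in = 1.2 * (a_daughter ** (1.0 / 3.0) + 2.0 ** (1.0 / 3.0))  # crude inner turning point [fm]
    r_out = zz * E2 / q_2p_mev                                 # outer turning point of the Coulomb barrier
    r = np.linspace(r_in, r_out, 2000)
    v = zz * E2 / r                                            # pure Coulomb potential (no proximity term)
    k = np.sqrt(np.clip(2.0 * mu * (v - q_2p_mev), 0.0, None)) / HBARC
    p = np.exp(-2.0 * np.trapz(k, r))                          # WKB penetration probability
    lam = s_2p * nu * p                                        # decay constant [1/s]
    return np.log(2.0) / lam                                   # half-life [s]

# Illustrative call (the inputs are placeholders, not a real prediction).
print(wkb_half_life(z_daughter=26, a_daughter=43, q_2p_mev=1.4))
```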


Author(s):  
Johannes Palmer ◽  
Aaron Schartner ◽  
Andrey Danilov ◽  
Vincent Tse

Abstract Magnetic Flux Leakage (MFL) is a robust technology with high data coverage. Decades of continuous sizing improvement have led to industry-accepted sizing reliability, and the continuous optimization of sizing processes ensures accurate categorization of metal loss features. However, the resulting selection of critical anomalies is not always optimal: anomalies are sometimes excavated too early or unnecessarily because the feature type in the field (the true metal loss shape) is incorrectly identified, which affects sizing and tolerance. Incorrectly identified feature types can also cause false under-calls. Today, complex empirical formulas together with multifaceted lookup tables fed by pull tests, synthetic data, dig verifications, machine learning, artificial intelligence and, not least, human expertise translate MFL signals into metal loss assessments with a high level of success. Nevertheless, two principal elements limit how far MFL sizing can be optimized. One is the empirical character of the signal interpretation. The other is the implicitly induced simplification of data and results. The reason this route has been followed for so many years is simple: it is methodologically impossible to calculate the source metal loss geometry directly from the signals, and the sheer number of possible relevant geometries is so large that simplification is necessary and inevitable. A further methodological reason is the ambiguity of the signal, which reduces the target of metal loss sizing to the most probable solution; yet even under the best conditions the most probable solution is not necessarily the correct one. This paper describes a novel, fundamentally different approach as a basic alternative to the common MFL-analysis approach described above. A calculation process is presented that overcomes the empirical nature of traditional approaches by using a result optimization method that relies on intensive computing and avoids any simplification. The strategy for overcoming MFL ambiguity is also shown. Detailed blind-test examples, carried out together with the operator, demonstrate the level of detail, repeatability and accuracy of this method, with the potential to reduce tool tolerance, increase sizing accuracy, increase growth rate accuracy, and help optimize the dig program to target critical features with greater confidence.
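
The abstract does not disclose the paper's forward model or optimizer, but the general idea of replacing empirical lookup with result optimization can be sketched as an inverse problem: propose a defect geometry, simulate the expected signal, and adjust the geometry to minimize the misfit with the measurement. The Gaussian "signal" below is purely a stand-in for illustration.

```python
# Conceptual sketch of MFL sizing as forward-model optimization (not the
# paper's actual method): invert a toy axial signal for defect depth and length.
import numpy as np
from scipy.optimize import minimize

x = np.linspace(-50.0, 50.0, 201)   # axial position along the pipe [mm]

def forward_model(depth, length):
    """Toy MFL axial signal for a defect of given depth and length (stand-in)."""
    return depth * np.exp(-(x / (0.5 * length)) ** 2)

def misfit(params, measured):
    depth, length = params
    return np.sum((forward_model(depth, length) - measured) ** 2)

# Synthetic "measurement" from a hidden defect plus noise, then invert for it.
rng = np.random.default_rng(0)
measured = forward_model(depth=2.5, length=30.0) + 0.05 * rng.standard_normal(x.size)
result = minimize(misfit, x0=[1.0, 10.0], args=(measured,), method="Nelder-Mead")
print("recovered depth, length:", result.x)
```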


2020 ◽  
pp. 580-592
Author(s):  
Libi Hertzberg ◽  
Assif Yitzhaky ◽  
Metsada Pasmanik-Chor

This article describes how the last decade has been characterized by the production of huge amounts of different types of biological data, followed by a flood of published bioinformatics tools. However, many of these tools are commercial or require computational skills, and not all of them provide intuitive, highly accessible visualization of the results. The authors have developed GEView (Gene Expression View), a free, user-friendly tool harboring several existing algorithms and statistical methods for the analysis of high-throughput gene, microRNA or protein expression data. It can perform basic analyses such as quality control, outlier detection, batch correction and differential expression analysis through a single intuitive graphical user interface. GEView is unique in its simplicity and in the highly accessible visualization it provides. Together with its basic, intuitive functionality, this allows biomedical scientists with no computational skills to independently analyze and visualize high-throughput data produced in their own labs.
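
The abstract does not state which statistical methods GEView wraps internally; as a generic illustration of the kind of differential-expression step such a GUI automates, the sketch below runs a per-gene two-sample t-test with Benjamini-Hochberg FDR correction.

```python
# Minimal, generic differential-expression sketch (not GEView's internal code).
import numpy as np
from scipy import stats

def differential_expression(group_a, group_b, alpha=0.05):
    """group_a, group_b: (n_genes, n_samples) expression matrices."""
    _, pvals = stats.ttest_ind(group_a, group_b, axis=1)
    # Benjamini-Hochberg adjustment of the per-gene p-values.
    order = np.argsort(pvals)
    ranked = pvals[order] * len(pvals) / (np.arange(len(pvals)) + 1)
    adjusted = np.minimum.accumulate(ranked[::-1])[::-1]
    qvals = np.empty_like(adjusted)
    qvals[order] = np.clip(adjusted, 0.0, 1.0)
    return pvals, qvals, qvals < alpha

# Toy data: 100 genes x 5 samples per group, first 10 genes up-shifted in group A.
rng = np.random.default_rng(1)
a = rng.normal(0.0, 1.0, (100, 5)); a[:10] += 2.0
b = rng.normal(0.0, 1.0, (100, 5))
print(differential_expression(a, b)[2].sum(), "genes called significant")
```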


2019 ◽  
Author(s):  
A Trullo ◽  
J Dufourt ◽  
M Lagha

Abstract Motivation During development, progenitor cells undergo multiple rounds of cell division during which transcriptional programs must be faithfully propagated. Investigating the timing of transcriptional activation, which is a highly stochastic phenomenon, requires the analysis of large amounts of data. In order to perform automatic image analysis of transcriptional activation, we developed software that segments and tracks both small and large objects, leading the user from raw data to results in their final form. Results MitoTrack is a user-friendly, open-access integrated software package that performs the specific dual task of reporting the precise timing of transcriptional activation while keeping the lineage tree history of each nucleus of a living developing embryo. The software works automatically but allows each step to be easily supervised, corrected and validated. Availability and implementation MitoTrack is an open source Python software embedded within a graphical user interface (download here). Supplementary information Supplementary data are available at Bioinformatics online.
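
As an illustration of the "timing of transcriptional activation" readout, the sketch below finds, for each tracked nucleus, the first time point at which its transcription-spot intensity rises above a background-derived threshold. MitoTrack's actual detection logic is not described in the abstract; the threshold rule here is an assumption for illustration only.

```python
# Illustrative activation-time readout from per-nucleus intensity traces
# (not MitoTrack's actual detection logic).
import numpy as np

def activation_times(traces, frame_interval_s, n_baseline=5, n_sigma=3.0):
    """traces: (n_nuclei, n_frames) intensity array; returns seconds or NaN."""
    traces = np.asarray(traces, dtype=float)
    baseline = traces[:, :n_baseline]
    threshold = baseline.mean(axis=1) + n_sigma * baseline.std(axis=1)
    times = np.full(traces.shape[0], np.nan)
    for i, trace in enumerate(traces):
        above = np.flatnonzero(trace > threshold[i])
        if above.size:
            times[i] = above[0] * frame_interval_s
    return times

# Toy example: nucleus 0 activates at frame 8, nucleus 1 never activates.
t = np.zeros((2, 20))
t[0, 8:] = 5.0
print(activation_times(t, frame_interval_s=10.0))  # -> [80., nan]
```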


2007 ◽  
Vol 22 (38) ◽  
pp. 2909-2916
Author(s):  
G. LÓPEZ CASTRO ◽  
J. PESTIEAU

We propose some empirical formulas relating the masses of the heaviest particles in the standard model (the W, Z, H bosons and the t quark) to the charge of the positron e and the Higgs condensate v. The relations for the masses of the gauge bosons, mW = (1+e)v/4 and [Formula: see text], are in good agreement with experimental values. By requiring the electroweak standard model to be free from quadratic divergences at the one-loop level, we find [Formula: see text] and [Formula: see text], or the very simple ratio (mt/mH)^2 = e.
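
The quoted W-mass relation can be checked numerically, reading e as the positron charge in natural units, e = sqrt(4*pi*alpha). The excerpt does not say which values of alpha and v the authors adopt, so standard reference values are assumed in the sketch below.

```python
# Quick numerical check of the quoted relation m_W = (1 + e) v / 4.
# alpha and v are assumed standard values; the paper's exact inputs are not given.
import math

alpha = 1.0 / 137.036   # fine-structure constant (low-energy value, assumed)
v = 246.22              # Higgs condensate / vacuum expectation value [GeV]
e = math.sqrt(4.0 * math.pi * alpha)

m_w = (1.0 + e) * v / 4.0
print(f"e = {e:.4f}, predicted m_W = {m_w:.1f} GeV (measured: about 80.4 GeV)")
```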


2009 ◽  
Vol 53 (3) ◽  
pp. 547-560 ◽  
Author(s):  
K. S. Vipin ◽  
T. G. Sitharam ◽  
P. Anbazhagan
