method model
Recently Published Documents


TOTAL DOCUMENTS

335
(FIVE YEARS 104)

H-INDEX

23
(FIVE YEARS 3)

F1000Research ◽  
2021 ◽  
Vol 10 ◽  
pp. 1196
Author(s):  
Iseult Lynch ◽  
Penny Nymark ◽  
Philip Doganis ◽  
Mary Gulumian ◽  
Tae-Hyun Yoon ◽  
...  

Nanotoxicology is a relatively new field of research that studies nanomaterials to evaluate their potential for harmful effects in parallel with the development of applications. The field spans materials synthesis and characterisation, assessment of fate and behaviour, exposure science, toxicology/ecotoxicology, molecular biology and toxicogenomics, epidemiology, safe-and-sustainable-by-design approaches, and chemoinformatics and nanoinformatics, thus requiring scientists to work collaboratively, often outside their core areas of expertise. This interdisciplinarity can lead to challenges in interpretation and reporting, and calls for a platform for sharing best practice in nanotoxicology research. The F1000Research Nanotoxicology collection, introduced via this editorial, will provide a place to share accumulated best practice via original research reports (including no-effect studies), protocols and methods papers, software reports, and living systematic reviews that can be updated as new knowledge emerges or as the domain of applicability of a method, model or software expands. The aim of the collection is to provide an open-access platform for nanotoxicology researchers that supports an improved culture of data sharing and documentation of evolving protocols, biological and computational models, software tools and datasets, which can be applied and built upon to develop predictive models and to move towards in silico nanotoxicology and nanoinformatics. Submissions will be assessed for fit to the collection and subjected to the F1000Research open peer review process.


2021 ◽  
Vol 9 ◽  
Author(s):  
Anders Bryn ◽  
Trine Bekkby ◽  
Eli Rinde ◽  
Hege Gundersen ◽  
Rune Halvorsen

Information about the distribution of a study object (e.g., a species or habitat) is essential in the face of increasing pressure from land and sea use and from climate change. Distribution models are instrumental for acquiring such information, but they are also encumbered by uncertainties caused by different sources of error, bias and inaccuracy that need to be dealt with. In this paper we identify the most common sources of uncertainty and link them to the different phases of the modeling process. Our aim is to outline the implications of these uncertainties for the reliability of distribution models and to summarize the precautions that need to be taken. We performed a step-by-step assessment of the errors, biases and inaccuracies related to the five main steps of a standard distribution modeling process: (1) ecological understanding, assumptions and problem formulation; (2) data collection and preparation; (3) choice of modeling method, model tuning and parameterization; (4) evaluation of models; and, finally, (5) implementation and use. Our synthesis highlights the need to consider the entire distribution modeling process when the reliability and applicability of the models are assessed. A key recommendation is to evaluate the model properly with a dataset that is collected independently of the training data. We support initiatives to establish international protocols and open geodatabases for distribution models.
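The key recommendation above, evaluating on independently collected data rather than a random split of the training data, can be sketched numerically. This is a minimal illustration on synthetic data: the two covariates, the plain logistic-regression model and all parameter values are assumptions for the sketch, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_survey(n):
    """Simulate presence/absence from two environmental covariates."""
    X = rng.normal(size=(n, 2))                        # e.g. temperature, precipitation
    p = 1 / (1 + np.exp(-(1.5 * X[:, 0] - 1.0 * X[:, 1])))
    return X, (rng.random(n) < p).astype(float)

def fit_logistic(X, y, lr=0.1, steps=500):
    """Plain gradient-descent logistic regression (no intercept, for brevity)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def auc(scores, y):
    """Rank-based AUC: probability that a presence outranks an absence."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores)); ranks[order] = np.arange(1, len(scores) + 1)
    n_pos, n_neg = y.sum(), (1 - y).sum()
    return (ranks[y == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

X_train, y_train = sample_survey(500)   # training survey
X_indep, y_indep = sample_survey(200)   # independently collected evaluation survey
w = fit_logistic(X_train, y_train)
score = auc(X_indep @ w, y_indep)
print(f"AUC on independent data: {score:.2f}")
```

In a real study the independent survey would differ from the training data in sampling design, region or time, which is exactly what makes the resulting AUC a more honest estimate of transferability than a random hold-out split.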


Author(s):  
Huanpei Lyu ◽  
Libin Zhang ◽  
Dapeng Tan ◽  
Fang Xu

Fault-tolerant control should be considered during assembly to ensure the stability and efficiency of the assembly process. This paper proposes a fault-tolerant method to improve stability and efficiency during the assembly of small, complex products. A fault-tolerant method model was first constructed; an adaptive artificial potential field (AAPF) control algorithm was then introduced to control the related assembly tasks based on changes in assembly information. Next, active and passive fault-tolerance methods were integrated using a least-squares support vector machine (LS-SVM). Finally, the assembly of a 2P circuit breaker controller with leakage protection was used as an example to verify the proposed method. The experimental results demonstrated that the AAPF fault-tolerant method shows promising fault-tolerance capability for the assembly of small, complex products: it not only coordinates the number of tasks assigned to each assembly robot, but also effectively reduces the number of tasks that accumulate due to faults. The proposed method can therefore effectively guarantee assembly stability and efficiency for small, complex products.
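The classic artificial potential field idea at the core of the AAPF algorithm can be sketched as follows. The adaptive gain adjustment and the LS-SVM fault-tolerance layer described in the paper are not reproduced here; the gains, influence distance and the toy 2-D scenario are illustrative assumptions.

```python
import numpy as np

def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=0.5, d0=2.0, step=0.05):
    """One gradient step on the combined attractive + repulsive potential."""
    force = k_att * (goal - pos)                 # pull toward the goal
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 0 < d < d0:                           # obstacle within influence range
            force += k_rep * (1.0 / d - 1.0 / d0) / d**3 * diff
    return pos + step * force

pos = np.array([0.0, 0.0])
goal = np.array([5.0, 5.0])
obstacles = [np.array([2.0, 3.0])]               # one obstacle near the path
for _ in range(400):
    pos = apf_step(pos, goal, obstacles)
print(pos)                                       # should end near the goal
```

The adaptive variant in the paper would, roughly speaking, tune gains such as k_att and k_rep online from assembly-state information instead of keeping them fixed.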


2021 ◽  
Vol 2085 (1) ◽  
pp. 012008
Author(s):  
Jimin Yu ◽  
Zhi Yong ◽  
Yousi Wang

Abstract In order to solve the path-tracking problem of a quadrotor UAV, this paper proposes a trajectory tracking control method that combines a model predictive control (MPC) algorithm with a PD controller. The MPC algorithm generates control inputs for formation flight and tracks the specified trajectory, while the PD controller achieves a rapid attitude response and quickly corrects errors. Simulation results verify the effectiveness of the proposed control method.
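The PD part of such a scheme can be illustrated in a few lines: a discrete PD loop drives a single attitude angle toward a reference that, in the paper's architecture, would be supplied by the MPC layer. The gains, the time step and the double-integrator attitude model are illustrative assumptions, not the paper's parameters.

```python
kp, kd = 4.0, 2.5          # proportional and derivative gains (assumed)
dt = 0.01                  # control period [s]
angle, rate = 0.0, 0.0     # state: roll angle [rad] and angular rate [rad/s]
ref = 0.3                  # reference roll angle from the outer MPC layer

for _ in range(1000):      # 10 s of simulated closed-loop response
    u = kp * (ref - angle) - kd * rate   # PD control torque
    rate += u * dt                       # double-integrator attitude model
    angle += rate * dt

print(f"final roll angle: {angle:.3f} rad")  # settles at the 0.3 rad reference
```

The derivative term acting directly on the measured rate (rather than on the error derivative) is a common choice that avoids derivative kick when the reference steps.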


2021 ◽  
Vol 33 (6) ◽  
pp. 1-16
Author(s):  
Guihe He

Since the beginning of the 21st century, with the rapid development of Internet technology, e-commerce network platforms have undergone great changes. With the rapid growth of the e-commerce economy, the widespread credit risks in the trading of e-commerce environmental goods have greatly harmed the healthy development of e-commerce and have gradually become the biggest bottleneck restricting its development in China. Building on a basic model of the e-commerce market, this paper proposes four types of credit mechanism for maintaining social relations in the trading of e-commerce environmental goods, among them a credit mechanism without government intervention. The experimental data show that the Cronbach's alpha coefficient of every latent variable set in the experiment is above 0.6, within the acceptable range, verifying that the proposed method model for maintaining social relations in the trading of e-commerce environmental goods is effective in application.
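The reliability check reported above uses Cronbach's alpha. The following is a small sketch of the statistic on synthetic questionnaire data; the item model and respondent counts are assumptions, and only the 0.6 acceptability threshold comes from the abstract.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scale score
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 1))                # shared latent factor
items = latent + 0.8 * rng.normal(size=(100, 4))  # four correlated items
alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.2f}")                     # above the 0.6 threshold
```

Higher inter-item correlation raises alpha; values above roughly 0.6 to 0.7 are conventionally read as acceptable internal consistency for a latent construct.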


2021 ◽  
Vol 11 (19) ◽  
pp. 9180
Author(s):  
Siangruei Wu ◽  
Yihong Wu ◽  
Haoyun Chang ◽  
Florence T. Su ◽  
Hengchun Liao ◽  
...  

Semantic segmentation of medical images with deep learning models is developing rapidly. In this study, we benchmarked state-of-the-art deep learning segmentation algorithms on our clinical stereotactic radiosurgery dataset. The dataset consists of 1688 patients with various brain lesions (pituitary tumors, meningioma, schwannoma, brain metastases, arteriovenous malformation, and trigeminal neuralgia), which we divided into a training set (1557 patients) and a test set (131 patients). The study demonstrates the strengths and weaknesses of deep learning algorithms in a fairly practical scenario. We compared model performance with respect to sampling method, model architecture, and choice of loss function, identifying suitable settings for their application and shedding light on possible improvements. The evidence from this study leads us to conclude that deep learning is promising for assisting the segmentation of brain lesions, even when the training dataset is highly heterogeneous in lesion types and sizes.
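Performance in segmentation benchmarks of this kind is typically reported with the Dice coefficient, which also underlies the Dice loss commonly found among the loss-function choices being compared. A minimal NumPy sketch on toy binary masks (the 8x8 arrays are illustrative, not data from the study):

```python
import numpy as np

def dice(pred, target, eps=1e-7):
    """Dice overlap between two binary masks, in [0, 1]."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

a = np.zeros((8, 8), dtype=int); a[2:6, 2:6] = 1   # 4x4 ground-truth "lesion"
b = np.zeros((8, 8), dtype=int); b[3:7, 3:7] = 1   # prediction shifted by one pixel
print(round(dice(a, b), 3))
```

Because Dice normalises overlap by the sizes of both masks, it stays informative for small lesions where plain pixel accuracy would be dominated by background, which matters for a dataset as heterogeneous in lesion size as this one.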


2021 ◽  
Vol 18 (18) ◽  
pp. 5097-5115
Author(s):  
Teresa Vogl ◽  
Amy Hrdina ◽  
Christoph K. Thomas

Abstract. Accurately measuring the turbulent transport of reactive and conservative greenhouse gases, heat, and organic compounds between the surface and the atmosphere is critical for understanding trace gas exchange and its response to changes in climate and anthropogenic activities. The relaxed eddy accumulation (REA) method enables measuring the land surface exchange when fast-response sensors are not available, broadening the suite of trace gases that can be investigated. The β factor scales the concentration differences to the flux, and its choice is central to successfully using REA. Deadbands are used to select only certain turbulent motions to compute the flux. This study evaluates a variety of different REA approaches with the goal of formulating recommendations applicable over a wide range of surfaces and meteorological conditions for an optimal choice of the β factor in combination with a suitable deadband. Observations were collected across three contrasting ecosystems offering stark differences in scalar transport and dynamics: a mid-latitude grassland ecosystem in Europe, a loose gravel surface of the Dry Valleys of Antarctica, and a spruce forest site in the European mid-range mountains. We tested a total of four different REA models for the β factor: the first two methods, referred to as model 1 and model 2, derive βp based on a proxy p for which high-frequency observations are available (sensible heat Ts). In the first case a linear deadband is applied, while in the second case we use a hyperbolic deadband. The third method, model 3, employs the approach first published by Baker et al. (1992), which computes βw solely from the vertical wind statistics. The fourth method, model 4, uses a constant βp,const derived from long-term averaging of the proxy-based βp factor. Each β model was optimized with respect to deadband size before intercomparison.
To the best of our knowledge, this is the first study to intercompare these approaches over a range of different sites. With respect to overall REA performance, we found that the βw and the constant βp,const models performed more robustly than the dynamic proxy-dependent approaches. The latter models still performed well when scalar similarity between the proxy (here Ts) and the scalar of interest (here water vapor) showed strong statistical correlation, i.e., during periods when the distribution and temporal behavior of sources and sinks were similar. Concerning the sensitivity of the different β factors to atmospheric stability, we observed that βT slightly increased with increasing stability parameter z/L when no deadband was applied, but this trend vanished with increasing deadband size. βw was unrelated to dynamic stability and displayed generally low variability across all sites, suggesting that βw can be considered a site-independent constant. To explain why the βw approach seems insensitive to changes in atmospheric stability, we separated the contribution of the w′ kurtosis to the flux uncertainty. For REA applications without deeper site-specific knowledge of the turbulent transport and the degree of scalar similarity, we recommend using either the βp,const or the βw model when the uncertainty of the REA flux quantification is not limited by the detection limit of the instrument. For conditions when the REA sampling differences are close to the instrument's detection limit, the βp models with a hyperbolic deadband are the recommended choice.
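The proxy-based calibration described above (model 1, with a linear deadband) can be sketched numerically. In REA the flux of a scalar c is estimated as F_c = β σ_w (c_up − c_down); with a fast proxy such as the sonic temperature Ts, β is obtained by inverting this relation using the directly measured eddy flux of the proxy. The synthetic time series, the correlation strength and the deadband width below are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
w = rng.normal(0.0, 0.4, n)               # vertical wind [m/s], synthetic
Ts = 0.2 * w + rng.normal(0.0, 0.1, n)    # fast proxy scalar correlated with w

sigma_w = w.std()
deadband = 0.5 * sigma_w                  # linear deadband on w
up, down = w > deadband, w < -deadband    # updraft / downdraft sampling

# beta from the proxy: measured eddy flux divided by sigma_w * (Ts_up - Ts_down)
flux_T = np.mean((w - w.mean()) * (Ts - Ts.mean()))
beta_T = flux_T / (sigma_w * (Ts[up].mean() - Ts[down].mean()))
print(f"beta_T = {beta_T:.2f}")
```

The resulting βT would then be applied to the slowly measured reservoir concentrations of the scalar of interest, which is exactly where strong scalar similarity between proxy and target becomes the limiting assumption.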

