iterative model
Recently Published Documents


TOTAL DOCUMENTS: 331 (five years: 91)
H-INDEX: 23 (five years: 3)

2021 ◽  
Author(s):  
Erick Matsen ◽  
Peter L. Ralph

Although the rates at which positions in the genome mutate are known to depend not only on the nucleotide to be mutated, but also on neighboring nucleotides, it remains challenging to do phylogenetic inference using models of context-dependent mutation. In these models, the effects of one mutation may in principle propagate to faraway locations, making it difficult to compute exact likelihoods. This paper shows how to use bounds on the propagation of dependency to compute likelihoods of mutation of a given segment of genome by marginalizing over sufficiently long flanking sequence. This can be used for maximum likelihood or Bayesian inference. Protocols examining residuals and iterative model refinement are also discussed. Tools for efficiently working with these models are provided in an R package that could be used in other applications. The method is used to examine context dependence of mutations since the common ancestor of humans and chimpanzees.
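As an illustration of the marginalization idea, the Python sketch below (not the authors' R package; the rate table and base frequencies are invented placeholders) averages a context-dependent mutation probability for a focal base over all possible single flanking bases on each side.

```python
# Toy sketch: marginalising a context-dependent mutation probability over
# unknown flanking bases. Rates and base frequencies are invented placeholders.
from itertools import product

BASES = "ACGT"
BASE_FREQ = {b: 0.25 for b in BASES}          # assumed flank composition

def context_rate(left, focal, right):
    """Hypothetical context-dependent rate: CpG sites mutate faster."""
    return 10.0 if focal == "C" and right == "G" else 1.0

def marginal_mutation_prob(focal, dt=1e-3):
    """P(focal base mutates within dt), summing out one flanking base per side."""
    return sum(
        BASE_FREQ[left] * BASE_FREQ[right] * context_rate(left, focal, right) * dt
        for left, right in product(BASES, repeat=2)
    )

print(marginal_mutation_prob("C"))            # elevated by the possible CpG context
```

In the paper's setting the flank would be long enough that dependency propagation beyond it is provably negligible; here a single neighbor per side keeps the enumeration trivial.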


2021 ◽  
Vol 9 ◽  
Author(s):  
Ryan P. McClure ◽  
R. Quinn Thomas ◽  
Mary E. Lofton ◽  
Whitney M. Woelmer ◽  
Cayelan C. Carey

Near-term ecological forecasting with iterative model refitting and uncertainty partitioning has great promise for improving our understanding of ecological processes and the predictive skill of ecological models, but to date it has been infrequently applied to predict biogeochemical fluxes. Bubble fluxes of methane (CH4) from aquatic sediments to the atmosphere (ebullition) dominate freshwater greenhouse gas emissions, but it remains unknown how best to make robust near-term CH4 ebullition predictions using models. Near-term forecasting workflows have the potential to address several current challenges in predicting CH4 ebullition rates, including: development of models that can be applied across time horizons and ecosystems, identification of the timescales for which predictions can provide useful information, and quantification of uncertainty in predictions. To assess the capacity of near-term, iterative forecasting workflows to improve ebullition rate predictions, we developed and tested a near-term, iterative forecasting workflow of CH4 ebullition rates in a small eutrophic reservoir throughout one open-water period. The workflow included the repeated updating of a CH4 ebullition forecast model over time with newly collected data via iterative model refitting. We compared the CH4 forecasts from our workflow to both alternative forecasts generated without iterative model refitting and a persistence null model. Our forecasts with iterative model refitting estimated CH4 ebullition rates up to 2 weeks into the future [RMSE = 0.53 loge(mg CH4 m−2 d−1) at the 1-week horizon and 0.48 loge(mg CH4 m−2 d−1) at the 2-week horizon]. Forecasts with iterative model refitting outperformed forecasts without refitting and the persistence null model at both 1- and 2-week forecast horizons. Driver uncertainty and model process uncertainty contributed the most to total forecast uncertainty, suggesting that future workflow improvements should focus on improved mechanistic understanding of CH4 models and drivers. Altogether, our study suggests that iterative forecasting improves week-to-week CH4 ebullition predictions, provides insight into the predictability of ebullition rates into the future, and identifies which sources of uncertainty are the most important contributors to the total uncertainty in CH4 ebullition predictions.
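A minimal sketch of the iterative-refitting idea (not the authors' workflow; the weekly series and the AR(1) model below are synthetic and chosen only for illustration): each week a simple model is refit on all data observed so far, and its 1-week-ahead forecast error is compared against a persistence null model.

```python
# Minimal sketch: weekly refitting of a simple AR(1) forecast model versus a
# persistence null model. The "observed" series here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
obs = 2.0 + np.cumsum(rng.normal(0.0, 0.3, 30))        # fake weekly log-ebullition rates

def fit_ar1(y):
    """Refit y[t+1] ~ a*y[t] + b by least squares on all data seen so far."""
    a, b = np.polyfit(y[:-1], y[1:], 1)
    return a, b

err_refit, err_persist = [], []
for t in range(5, len(obs) - 1):
    a, b = fit_ar1(obs[: t + 1])                       # iterative model refitting
    err_refit.append(a * obs[t] + b - obs[t + 1])      # 1-week-ahead forecast error
    err_persist.append(obs[t] - obs[t + 1])            # persistence null model error

rmse = lambda e: float(np.sqrt(np.mean(np.square(e))))
print("refit RMSE:", rmse(err_refit), "| persistence RMSE:", rmse(err_persist))
```

The study's workflow additionally propagates and partitions driver, parameter, initial-condition, and process uncertainty, which this sketch omits.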


Author(s):  
Ting Su ◽  
Zhuoxu Cui ◽  
Jiecheng Yang ◽  
Yunxin Zhang ◽  
Jian Liu ◽  
...  

Sparse-view CT is a promising approach to reducing the X-ray radiation dose in clinical CT imaging. However, CT images reconstructed with the conventional filtered backprojection (FBP) algorithm suffer from severe streaking artifacts. Iterative reconstruction (IR) algorithms have been widely adopted to mitigate these streaking artifacts, but they may prolong the CT imaging time due to intensive data-specific computations. Recently, model-driven deep learning (DL) CT image reconstruction methods, which unroll the iterative optimization procedure into a deep neural network, have shown exciting prospects in improving image quality and shortening reconstruction time. In this work, we explore a generalized unrolling scheme for such iterative models to further enhance their performance on sparse-view CT imaging. In this scheme, the iteration parameters, the regularizer term, the data-fidelity term, and even the mathematical operations are all treated as learnable and optimized via network training. Results from numerical and experimental sparse-view CT imaging demonstrate that the newly proposed network with the maximum degree of generalization provides the best reconstruction performance.
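The following PyTorch sketch illustrates the general flavour of such an unrolled network (it is not the architecture proposed in the paper): each unrolled iteration applies a data-fidelity gradient step with a learned step size plus a small learned regularizer network, and the matrix `A` is a stand-in for a real sparse-view CT projector.

```python
# Sketch of an unrolled iterative reconstruction network; `A` is a stand-in
# linear forward operator, not a real sparse-view CT projector.
import torch
import torch.nn as nn

class UnrolledRecon(nn.Module):
    def __init__(self, A, n_iters=4):
        super().__init__()
        self.A = A
        self.step = nn.Parameter(torch.full((n_iters,), 0.1))    # learned step sizes
        self.reg = nn.ModuleList([                                # learned regularizer per iteration
            nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                          nn.Conv2d(16, 1, 3, padding=1))
            for _ in range(n_iters)])

    def forward(self, y, x0):
        x = x0
        for k in range(len(self.step)):
            grad = (self.A.T @ (self.A @ x.flatten() - y)).reshape(x.shape)  # data-fidelity gradient
            prior = self.reg[k](x[None, None]).squeeze(0).squeeze(0)          # learned prior update
            x = x - self.step[k] * grad - prior
        return x

# Toy usage: reconstruct a 32x32 image from an undersampled linear measurement.
H = W = 32
A = torch.randn(200, H * W) / (H * W) ** 0.5
y = A @ torch.rand(H, W).flatten()
x_hat = UnrolledRecon(A)(y, torch.zeros(H, W))
```

In the fully generalized scheme described in the abstract, even the update rule itself (not just the step size and regularizer shown here) would be replaced by learned operations.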


2021 ◽  
Author(s):  
◽  
Charles William Barrie

This thesis explores the nature of a landscape design process that could ensure the resilience and sustainability of suburban public space. Utilising a literature review and two large case study projects, the research presents an argument that:
• public landscapes must be seen as multi-dimensional complex systems emerging from the co-evolution of different players in the landscape community with the dynamics of their wider ecosystem; and
• the sustainable design of these spaces is dependent on collaborative decision-making, the engagement and empowerment of the local community, and the restoration of ongoing responsive interaction with the site.
This approach is referred to as 'deep landscape design' and is expanded through a number of guiding principles which, it is hoped, will support designers, council staff and community leaders to implement it. These guiding principles describe a facilitated, nested and iterative model of design in which:
• the physical, ecological and cultural dimensions of landscape can be integrated holistically;
• multiple engagement methods are established, enabling the inclusion of a wide range of community partners; and
• those engaged in the design of the space are able to reflect on the impacts of their decisions and make changes accordingly.
The research suggests that through the inclusion of deep design principles, small projects with a specific focus can initiate a process of increasing community knowledge, skill, and ownership in the design and maintenance of landscapes, a process necessary for the sustainability and resilience of public spaces.


Healthcare ◽  
2021 ◽  
Vol 9 (11) ◽  
pp. 1466
Author(s):  
Estíbaliz Jiménez-Arberas ◽  
Luis-Javier Márquez-Álvarez ◽  
Isabel Fernández-Méndez ◽  
María-Luisa Ruiz-Fernández

Mali is one of the poorest countries in sub-Saharan Africa. Limited infrastructure renders access to health care difficult. There is a need to establish functional ways to improve Malian people’s health and treat disability. From this point of view, our project aims to implement a remote occupational therapy service for the beneficiaries of the Kalana clinic in Mali through international cooperation. Using a spiral iterative model, a proposal for a remote occupational therapy service was developed and refined for a multidisciplinary context. The International Classification of Functioning, Disability, and Health (ICF) was used as a means to work from a multidisciplinary approach to treat all needs. The results are exemplified with a case report and qualitative impressions of the services.


2021 ◽  
Vol 72 ◽  
pp. 533-612
Author(s):  
Benjamin Krarup ◽  
Senka Krivic ◽  
Daniele Magazzeni ◽  
Derek Long ◽  
Michael Cashmore ◽  
...  

In automated planning, the need for explanations arises when there is a mismatch between a proposed plan and the user’s expectation. We frame Explainable AI Planning as an iterative plan exploration process, in which the user asks a succession of contrastive questions that lead to the generation and solution of hypothetical planning problems that are restrictions of the original problem. The object of the exploration is for the user to understand the constraints that govern the original plan and, ultimately, to arrive at a satisfactory plan. We present the results of a user study that demonstrates that when users ask questions about plans, those questions are usually contrastive, i.e. “why A rather than B?”. We use the data from this study to construct a taxonomy of user questions that often arise during plan exploration. Our approach to iterative plan exploration is a process of successive model restriction. Each contrastive user question imposes a set of constraints on the planning problem, leading to the construction of a new hypothetical planning problem as a restriction of the original. Solving this restricted problem results in a plan that can be compared with the original plan, admitting a contrastive explanation. We formally define model-based compilations in PDDL2.1 for each type of constraint derived from a contrastive user question in the taxonomy, and empirically evaluate the compilations in terms of computational complexity. The compilations were implemented as part of an explanation framework supporting iterative model restriction. We demonstrate its benefits in a second user study.
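The loop can be pictured with a small Python sketch (schematic only; `solve` is a placeholder for a call to an actual PDDL planner, and the example questions and constraints are hypothetical): each contrastive question restricts the problem, the restricted problem is re-solved, and the resulting plan is compared with the previous one.

```python
# Schematic sketch of iterative model restriction; `solve` stands in for an
# actual PDDL planner and the example constraints are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Problem:
    domain: str
    constraints: list = field(default_factory=list)    # restrictions imposed so far

def solve(problem):
    """Placeholder planner call: return a plan respecting the restrictions."""
    return ["<plan respecting>"] + problem.constraints

problem = Problem("logistics")
plan = solve(problem)
for question in ["why truck-2 rather than truck-1?", "why deliver pkg-3 last?"]:
    restricted = Problem(problem.domain, problem.constraints + [question])  # restricted problem
    new_plan = solve(restricted)
    # comparing `plan` with `new_plan` yields the contrastive explanation
    problem, plan = restricted, new_plan
```

In the paper, each such restriction is realized as a formal compilation of the constraint into PDDL2.1 rather than the ad hoc list used in this sketch.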


Author(s):  
A. V. Cherepanov ◽  
G. A. Rekhtina

The problem fields of this research are the lack of a classification of training methods in intra-organizational training, the insufficient quality of tool use in intra-organizational training practice, and the use of tools for evaluating and analyzing training activities. These aspects determined the relevant research directions: the competence of the coach, the systematization of training methods and techniques, the features of tool use, and the methodology of tools for the evaluation of training sessions. The most important characteristic of the competence of a corporate coach is a high level of command of the training technology tools. The article identifies the main competences of a corporate coach, such as focus on results, effective communication, effective self-presentation, persuasion and influence, confidence and stress resistance, and creating a motivating educational environment. The authors pay particular attention to the phenomenon of pedagogical artistry and the internal and external conditions of its development. The article systematizes the basic training techniques (informational, stimulation, exercises for the practical performance of work, group-dynamic exercises) and the related training methods. The authors consider the criteria and limitations that a trainer should take into account when choosing training technology tools. The article distinguishes tools by their place in the trainer's arsenal (planning tools; direct implementation of the process; control (monitoring), evaluation, and analysis of the results; post-training support). The authors indicate the following features of the use of training technology tools in intra-organizational training: the formation of target guidelines for each stage of training implementation; determining the tasks and the algorithm for training implementation; drawing up a training program; chronological planning of training units; planning the necessary methodical support of training units; compliance with the principle of training planning based on an iterative model with one or more contours; the use of business and simulation games, exercises, and tasks; using multiple scenarios for conducting classes; presenting information depending on the features of its perception; taking into account the target audience; taking into account the age characteristics of the audience, etc.


2021 ◽  
Vol 2021 ◽  
pp. 1-8
Author(s):  
Jie Li ◽  
Wei Wang ◽  
Shizhi Long ◽  
Xin Liu ◽  
Long Huang ◽  
...  

To explore the effect of the full iterative model reconstruction (IMR) algorithm on chest CT image processing and its value in the clinical diagnosis of lung cancer patients, multislice spiral CT (MSCT) scans were performed on 96 patients with pulmonary nodules. Reconstruction was performed with the hybrid iterative reconstruction (iDose4) and IMR2 algorithms. The image contrast, spatial resolution, density resolution, image uniformity, and noise of the reconstructed CT images were then recorded. The pulmonary nodules were classified as malignant or benign, and the differences in chest CT imaging characteristics between the two groups were compared. Receiver operating characteristic (ROC) curve analysis was used to assess the diagnostic sensitivity, specificity, and area under the curve (AUC) of CT for benign and malignant pulmonary nodules. It was found that the spatial resolution, density resolution, image uniformity, and contrast of the CT images reconstructed with the IMR2 algorithm were remarkably greater than those of the iDose4 algorithm, and the noise was considerably lower than that of the iDose4 algorithm (P < 0.05). Among the 96 patients with pulmonary nodules, 65 had malignant nodules, including 15 squamous cell carcinomas, 31 adenocarcinomas, and 19 small cell carcinomas. There were 31 cases of benign nodules, including 14 cases of hamartoma, 10 cases of tuberculous granuloma, 2 cases of sclerosing hemangioma, and 5 cases of diffuse lymphocyte proliferation. The malignant and benign nodule groups differed significantly in nodule size, nodule morphology, burr sign, lobulation sign, vascular sign, bronchial sign, and pleural depression sign (P < 0.05). The sensitivity, specificity, and AUC of the IMR2-processed chest CT images for lung cancer diagnosis were 85.7%, 82.3%, and 0.815, respectively, significantly higher than those of the original CT images (P < 0.05). In short, chest MSCT based on the IMR2 algorithm can greatly improve the diagnostic efficiency for lung cancer and has practical significance for the timely detection of early lung cancer.
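As a worked example of the reported diagnostic metrics (sensitivity, specificity, AUC), the short Python sketch below computes them with scikit-learn on invented labels and scores; it is not the study's data or code.

```python
# Illustrative computation of sensitivity, specificity, and AUC for a binary
# benign/malignant nodule classification; the labels and scores are invented.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

y_true = np.array([1, 1, 1, 0, 0, 1, 0, 1, 0, 0])      # 1 = malignant nodule
y_score = np.array([0.9, 0.8, 0.4, 0.3, 0.2, 0.7, 0.6, 0.85, 0.1, 0.35])

auc = roc_auc_score(y_true, y_score)
tn, fp, fn, tp = confusion_matrix(y_true, (y_score >= 0.5).astype(int)).ravel()
sensitivity = tp / (tp + fn)                            # true-positive rate
specificity = tn / (tn + fp)                            # true-negative rate
print(f"AUC={auc:.3f}  sensitivity={sensitivity:.3f}  specificity={specificity:.3f}")
```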

