computational aspect — Recently Published Documents

TOTAL DOCUMENTS: 53 (five years: 13)
H-INDEX: 7 (five years: 0)

2021 ◽  
Vol 5 (OOPSLA) ◽  
pp. 1-29
Author(s):  
Stefan Malewski ◽  
Michael Greenberg ◽  
Éric Tanter

Dynamically typed languages offer easy interaction with ad hoc data such as JSON and S-expressions; statically typed languages offer powerful tools for working with structured data, notably algebraic datatypes, which are a core feature of typed languages, functional and otherwise. Gradual typing aims to reconcile dynamic and static typing smoothly. The gradual typing literature has focused extensively on the computational aspect of types, such as type safety, effects, noninterference, or parametricity, but the application of graduality to data structuring mechanisms has been much less explored. While row polymorphism and set-theoretic types have been studied in the context of gradual typing, algebraic datatypes in particular have not, which is surprising considering their wide use in practice. We develop, formalize, and prototype a novel approach to gradually structured data with algebraic datatypes. Gradually structured data bridges the gap between traditional algebraic datatypes and flexible data management mechanisms such as tagged data in dynamic languages or polymorphic variants in OCaml. We illustrate the key ideas of gradual algebraic datatypes through the evolution of a small server application from dynamic to progressively more static checking, formalize a core functional language with gradually structured data, and establish its metatheory, including the gradual guarantees.
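The dynamic-to-static migration described above can be loosely illustrated with Python's own gradual type hints; this is an informal analogy, not the paper's calculus, and all names here (`Ping`, `Get`, `Request`) are invented for illustration:

```python
# Sketch of migrating ad hoc tagged data toward an algebraic-datatype-style
# encoding. Stage 1 is fully dynamic (raw tagged dicts, as from JSON);
# stage 2 models the same protocol as a closed sum type of dataclasses.
from dataclasses import dataclass
from typing import Any, Union

# Stage 1: fully dynamic -- requests are raw tagged dicts.
def handle_dynamic(msg: Any) -> str:
    if msg["tag"] == "ping":
        return "pong"
    return "error"

# Stage 2: progressively more static -- the protocol as a sum type.
@dataclass
class Ping:
    pass

@dataclass
class Get:
    key: str

Request = Union[Ping, Get]  # a closed "algebraic datatype" of requests

def handle_static(msg: Request) -> str:
    if isinstance(msg, Ping):
        return "pong"
    return f"value of {msg.key}"
```

A static checker can now verify exhaustive handling of `Request`, while untyped callers can still pass data through `Any`-typed entry points.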


Energies ◽  
2021 ◽  
Vol 14 (19) ◽  
pp. 6108
Author(s):  
Artun Sel ◽  
Bilgehan Sel ◽  
Umit Coskun ◽  
Cosku Kasnakoglu

In this study, two parameter estimation algorithms are studied and compared. An iterated EKF and a nonlinear optimization algorithm based on line-search methods are implemented to estimate the parameters of a given permanent magnet synchronous motor whose dynamics are assumed to be known and nonlinear. In addition to the parameters, the initial conditions of the dynamical system are also considered unknown, which constitutes one of the differences between the two algorithms. The implementation of the algorithms for this problem is detailed, along with adaptations of the methods to other variations of the problem reported in the literature. As for the computational aspect of the study, a convexity analysis is conducted to obtain a spherical neighborhood of the unknown terms around their correct values in the parameter space. Obtaining such a range is important for determining the convexity properties of the optimization problem underlying the estimation. In this study, an EKF-based parameter estimation algorithm and an optimization-based method are designed for a given nonlinear dynamical system. The design steps are detailed, and the efficacies and shortcomings of both algorithms are discussed in light of the numerical simulations.
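The joint estimation of parameters and initial conditions can be sketched on a toy system. This is not the paper's PMSM model or its algorithms: the system `dx/dt = -a*x` and the crude coordinate search below are stand-ins chosen only to show the principle of fitting both `a` and `x0` to trajectory data:

```python
# Toy joint estimation of a parameter a and the unknown initial
# condition x0 of dx/dt = -a*x, whose solution is x(t) = x0*exp(-a*t).
import math

def simulate(a, x0, ts):
    return [x0 * math.exp(-a * t) for t in ts]

def sse(a, x0, ts, data):
    """Sum of squared errors between model trajectory and data."""
    return sum((m - d) ** 2 for m, d in zip(simulate(a, x0, ts), data))

def estimate(ts, data, a=1.0, x0=1.0, step=0.5, iters=100):
    """Crude coordinate search with step halving, standing in for a
    proper line-search optimizer (or the iterated EKF)."""
    best = sse(a, x0, ts, data)
    for _ in range(iters):
        improved = False
        for da, dx0 in ((step, 0), (-step, 0), (0, step), (0, -step)):
            cand = sse(a + da, x0 + dx0, ts, data)
            if cand < best:
                a, x0, best = a + da, x0 + dx0, cand
                improved = True
        if not improved:
            step *= 0.5  # refine once no move helps
    return a, x0
```

Generating data from known values and recovering them is a standard sanity check before tackling real, noisy measurements.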


PLoS Biology ◽  
2021 ◽  
Vol 19 (8) ◽  
pp. e3001318
Author(s):  
Stefano Scaramuzza ◽  
Daniel Castaño-Díez

Subtomogram averaging (STA) is a powerful image processing technique in electron tomography used to determine the 3D structure of macromolecular complexes in their native environments. It is a fast-growing technique of increasing importance in structural biology. The computational aspect of STA is very complex and depends on a large number of variables. We noticed a lack of detailed guides for STA processing. Moreover, current publications in this field often lack documentation that is practical enough to reproduce the results with reasonable effort, which is necessary for the scientific community to grow. We therefore provide a complete, detailed, and fully reproducible processing protocol that covers all aspects of particle picking and particle alignment in STA. The command line–based workflow is fully based on the popular Dynamo software for STA. Within this workflow, we also demonstrate how large parts of the processing pipeline can be streamlined and automated for increased throughput. This protocol is aimed at users at all levels. It can be used for training purposes, or it can serve as a basis for designing user-specific projects by taking advantage of Dynamo's flexibility to modify and expand the given pipeline. The protocol is validated using Electron Microscopy Public Image Archive (EMPIAR) database entry 10164, immature HIV-1 virus-like particles (VLPs), which exhibit a geometry often seen in electron tomography.
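The core align-then-average loop at the heart of STA can be sketched in miniature. This is a deliberately simplified stand-in, not Dynamo's pipeline: it uses 1D signals in place of 3D subtomograms and ignores rotations, the missing wedge, and CTF correction:

```python
# Minimal align-and-average sketch: shift each "subtomogram" to best
# match the current reference, then average to obtain a new reference.
import numpy as np

def align_shift(vol, ref):
    """Best integer (circular) shift of `vol` onto `ref`, scored by
    cross-correlation."""
    scores = [np.dot(np.roll(vol, s), ref) for s in range(len(vol))]
    return int(np.argmax(scores))

def subtomogram_average(vols, ref, iters=3):
    avg = ref.astype(float)
    for _ in range(iters):
        aligned = [np.roll(v, align_shift(v, avg)) for v in vols]
        avg = np.mean(aligned, axis=0)  # the average becomes the new reference
    return avg
```

Iterating alignment against an improving reference is the same feedback loop that real STA packages run over thousands of 3D particles.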


2021 ◽  
Vol 10 (2) ◽  
Author(s):  
Bernardo Lejano ◽  
James Matthew De Jesus ◽  
Arvin Patrick Yu

Cold-formed steel (CFS) is a good construction material because of its high strength-to-weight ratio: it exhibits efficient load-carrying capability combined with light weight. Although CFS is already used in construction, information on the structural performance of locally produced CFS in the Philippines is scarce. To date, the authors have not found any experimental study done in the Philippines on the structural performance of locally produced CFS. In this study, C-sections and Z-sections are examined, since these members exhibit buckling failures that may be difficult to predict due to the complexity of their section geometry. The objective of this paper is to present the performance of these CFS sections under concentric axial compression, both experimentally and computationally. For the experimental part, the CFS members were subjected to axial compression using a hydraulic jack, and high-speed video cameras were used to capture the different failure modes. For the computational aspect, the provisions of the National Structural Code of the Philippines (NSCP) were used to calculate the compression strength of the members. A total of 80 C-section specimens with 5 different lengths and 5 different thicknesses were tested. It was found that the strength calculations using the NSCP provisions were not consistent with the results of the compression tests. For shorter lengths, distortional buckling was the main failure mode, while for longer lengths, torsional-flexural buckling occurred. All of the predicted strengths were highly conservative. For the Z-section, a total of 180 specimens with 6 different lengths and 6 different thicknesses were tested. Torsional-flexural buckling was observed in the majority of the specimens. Although most of the failure modes were predicted correctly, the strengths predicted using the NSCP were relatively high compared to the experimental results, and thus non-conservative. Finite element method (FEM) analyses using ANSYS were conducted, and the experimental results agreed well with the FEM results.
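The length dependence of the governing failure mode reflects classical column behavior. As a hedged illustration (the textbook Euler formula only, not the NSCP provisions, which additionally cover local, distortional, and torsional-flexural buckling):

```python
# Elastic (Euler) flexural buckling of a pin-ended column:
# P_e = pi^2 * E * I / (K*L)^2. Units here: E in MPa (N/mm^2),
# I in mm^4, L in mm, giving P_e in N.
import math

def euler_buckling_load(E, I, L, K=1.0):
    """Critical axial load; K is the effective-length factor."""
    return math.pi ** 2 * E * I / (K * L) ** 2

def elastic_buckling_governs(P_euler, P_yield):
    """Longer columns fail by buckling when P_e drops below the
    squash (yield) load; shorter columns are governed by yielding
    or sectional modes such as distortional buckling."""
    return P_euler < P_yield
```

Because `P_e` falls with the square of the length, long specimens buckle at loads well below yield, which is consistent with the flexural-type failures observed at longer lengths.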


Following the same methodology as the previous chapter, this chapter outlines a vertical path through geometric topics typical of secondary school. Naturally, the algorithmic and computational aspects in the MatCos 3.X environment are more developed than in classical treatments, and are certainly of interest. In particular, the presentation of conics in both the Euclidean and Cartesian planes is emphasized, based on point-by-point construction algorithms that can be easily implemented in the MatCos 3.X programming environment. Solid geometry, in three dimensions, is likewise characterized by effective construction algorithms for the solid figures presented. Some of these algorithms are general in nature.
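A point-by-point conic construction of the kind described can be sketched as follows, here in Python rather than MatCos 3.X: ellipse points are obtained from the focal definition by intersecting circles of radii r and 2a − r centred at the two foci.

```python
# Point-by-point ellipse construction from the focal definition:
# an ellipse is the locus of points whose distances to the two foci
# sum to 2a. Each point is the intersection of a circle of radius r
# about F1 = (-c, 0) with a circle of radius 2a - r about F2 = (c, 0).
import math

def ellipse_points(a, c, n=20):
    pts = []
    d = 2 * c  # distance between the foci
    for i in range(1, n):
        r = (a - c) + i * (2 * c) / n              # r sweeps (a-c, a+c)
        l = (r**2 - (2*a - r)**2 + d**2) / (2*d)   # foot along the axis
        h2 = r**2 - l**2                           # squared half-chord
        if h2 >= 0:
            y = math.sqrt(h2)
            pts.append((-c + l, y))
            pts.append((-c + l, -y))
    return pts
```

Every generated point satisfies the defining property exactly, which makes the construction a natural classroom bridge between the Euclidean (locus) and Cartesian (equation) views of the conic.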


2020 ◽  
Vol 19 (04) ◽  
pp. 2040008
Author(s):  
Chao Dong ◽  
Milka Montes ◽  
Wael M. Al-Sawai

Xanthine oxidoreductase (XOR) exists in a variety of organisms from bacteria to humans and catalyzes the oxidation of hypoxanthine to xanthine and of xanthine to uric acid. Excessive uric acid can lead to gout and hyperuricemia. In this paper, we review recent computational studies on xanthine oxidase inhibition. Computational methods such as molecular dynamics (molecular mechanics), quantum mechanics, and hybrid quantum mechanics/molecular mechanics (QM/MM) have been employed to investigate the binding affinity of xanthine oxidase with synthesized and naturally isolated inhibitors. The limitations of the different computational methods for xanthine oxidase inhibition studies are also discussed. These computational approaches can help resolve existing arguments on substrate/product orientation in xanthine oxidase inhibition, enabling the design of new inhibitors with higher efficacy.


Author(s):  
Ho Le Huy Phuc ◽  
Le Van Canh ◽  
Phan Duc Hung

This study presents a novel application of a mesh-free method using smoothed radial basis functions for the computational homogenization analysis of materials. The displacement field corresponding to the scattered nodes within the representative volume element (RVE) is split into two parts, a mean term and a fluctuation term, and the fluctuation term is approximated using the integrated radial basis function (iRBF) method. Owing to the stabilized conforming nodal integration (SCNI) technique, the strain rate is smoothed at the discretized nodes; therefore, all constraints in the resulting problems are enforced directly at the nodes. Because the shape functions satisfy the Kronecker-delta property, the periodic boundary conditions, well known as the most appropriate procedure for RVEs, are imposed just as in the finite element method. Several numerical examples are investigated to assess the computational aspect of the iRBF procedure. The good agreement of the results with those reported in other studies demonstrates the accuracy and reliability of the proposed approach. Keywords: homogenization analysis; mesh-free method; radial point interpolation method; SCNI scheme.
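The approximation of a field over scattered nodes by radial basis functions can be sketched in its plainest form. This uses a standard (non-integrated) Gaussian RBF in 1D as a stand-in; the paper's iRBF/SCNI machinery, 2D RVEs, and periodic constraints are considerably more involved:

```python
# Plain Gaussian RBF interpolation over scattered 1D nodes:
# fit weights w so that sum_j w_j * exp(-eps*(x_i - x_j)^2) = f(x_i)
# at every node x_i, then evaluate the expansion anywhere.
import numpy as np

def rbf_fit(nodes, values, eps=1.0):
    """Solve the (symmetric) kernel system for the RBF weights."""
    r2 = (nodes[:, None] - nodes[None, :]) ** 2
    A = np.exp(-eps * r2)  # Gaussian kernel matrix
    return np.linalg.solve(A, values)

def rbf_eval(x, nodes, weights, eps=1.0):
    """Evaluate the RBF expansion at a point x."""
    return float(np.exp(-eps * (x - nodes) ** 2) @ weights)
```

The interpolation reproduces the nodal values exactly, which is the property that (together with a Kronecker-delta construction) lets boundary conditions be imposed directly at nodes.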


2019 ◽  
Author(s):  
Lisa-Katrin Schätzle ◽  
Ali Hadizadeh Esfahani ◽  
Andreas Schuppert

Translational models directly relating drug response-specific processes observed in vitro to their in vivo role in cancer patients constitute a crucial part of the development of personalized medication. Unfortunately, ongoing research is often confined by the irreproducibility of results in other contexts. While the inconsistency of pharmacological data has received great attention recently, the computational aspect of this crisis still deserves closer examination. Notably, studies often focus only on isolated model characteristics instead of examining the overall workflow and the interplay of individual model components. Here, we present a systematic investigation of translational models using the R package FORESEE. Our findings confirm that with the current exploitation of the available data and the prevailing trend of optimizing methods for only one specific use case, modeling solutions will continue to suffer from non-transferability. Instead, the practice of developing translational approaches urgently needs to change to attain clinical relevance in the future.


Author(s):  
Richard Stec ◽  
Antonin Novak ◽  
Premysl Sucha ◽  
Zdenek Hanzalek

Many real-world scheduling problems are characterized by uncertain parameters. In this paper, we study a classical parallel machine scheduling problem where the processing times of jobs follow normal distributions. The objective is to maximize the probability that all jobs are completed before a given common due date. This study focuses on the computational aspect of the problem and proposes a Branch-and-Price approach for solving it. The advantage of our method is that it scales very well with an increasing number of machines and is easy to implement. Furthermore, we propose an efficient lower-bound heuristic. The experimental results show that our method outperforms existing approaches.
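The objective being maximized can be evaluated in closed form for any fixed assignment of jobs to machines: with independent normal processing times, each machine's completion time is itself normal, so the on-time probability is a product of normal CDFs. A sketch of this objective evaluation (not of the Branch-and-Price solver itself):

```python
# Probability that every machine finishes by the common due date d,
# assuming independent normally distributed job processing times.
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def on_time_probability(assignment, mu, var, d):
    """assignment: list of job-index lists, one list per machine.
    mu[j], var[j]: mean and variance of job j's processing time."""
    p = 1.0
    for jobs in assignment:
        m = sum(mu[j] for j in jobs)          # machine load mean
        s = math.sqrt(sum(var[j] for j in jobs))  # load std deviation
        p *= norm_cdf((d - m) / s) if s > 0 else float(m <= d)
    return p
```

Comparing this value across candidate assignments is exactly what any exact or heuristic method for the problem must do, directly or implicitly.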

