A SHORELINE EVOLUTION MODEL BASED ON EQUILIBRIUM FORMULATIONS

Author(s):  
Camilo Jaramillo Cardona ◽  
Jara Martinez Sanchez ◽  
Mauricio Gonzalez Rodriguez ◽  
Raul Medina Santamaria

Traditionally, shoreline hindcasting under changing marine conditions has been carried out with robust shoreline evolution models, such as one-line, multi-line, combined or 3D models. All of them require long data series and many calibration parameters, and they are computationally intensive. This study presents a new shoreline evolution model that integrates cross-shore, planform and rotation equilibrium-based models, applicable over time scales spanning days, months or several years. The new model successfully reproduces the general erosion-accretion trend at both a qualitative and a quantitative level. As the main conclusion, this is a simple equilibrium-based shoreline evolution model that requires few calibration parameters and is computationally efficient and versatile.

Recorded Presentation from the vICCE (YouTube Link): https://youtu.be/zkQ7AoAWmEE
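To give a feel for what an equilibrium-based shoreline evolution law looks like, the sketch below integrates a generic disequilibrium-type shoreline change equation of the kind widely used in cross-shore equilibrium models (dS/dt proportional to the gap between the equilibrium and incident wave energy). This is an illustrative formulation with hypothetical parameters, not the authors' combined cross-shore/planform/rotation model.

```python
import numpy as np

def equilibrium_shoreline(energy, dt, s0, a, b, c_accr, c_eros):
    """Integrate a generic equilibrium-type shoreline change law,
    dS/dt = C * (E_eq(S) - E), with a linear E_eq(S) = a*S + b.
    Illustrative only; 'energy' is an incident wave-energy series,
    dt the time step, s0 the initial shoreline position."""
    s = np.empty(len(energy) + 1)
    s[0] = s0
    for i, e in enumerate(energy):
        e_eq = a * s[i] + b                 # equilibrium energy at current position
        disequilibrium = e_eq - e
        # separate rate constants for accretion and erosion (few calibration parameters)
        c = c_accr if disequilibrium > 0 else c_eros
        s[i + 1] = s[i] + dt * c * disequilibrium
    return s
```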

2021 ◽  
pp. 026461962110364
Author(s):  
Francis William

This study evaluated the adapted science and mathematics books for students with visual impairment in inclusive classrooms in Tanzanian secondary schools. It was conducted in 14 regions using qualitative and quantitative approaches. Data were generated from a sample of 19 heads of school, 103 students, and 77 teachers. The findings show that the books are appropriate for students with visual impairment; however, they lack sufficient tactile illustrations and pictures. Further findings revealed that a lack of braille knowledge prevented some teachers from using the books: the few teachers with braille knowledge had started to use them, while those with limited knowledge had not. Most teachers also reported a lack of appropriate pedagogy for handling special needs in inclusive classrooms. Therefore, although the books are appropriate, much remains to be done to build teachers' capacity to use them, and teachers need to be equipped with a range of inclusive teaching methods. Furthermore, the books should be improved to include more tactile graphics and pictures to make them more reader-friendly for students with visual impairment. Other educational materials, including three-dimensional (3D) models, should be part of the adapted books.


Author(s):  
Chalongrath Pholsiri ◽  
Chetan Kapoor ◽  
Delbert Tesar

Robot Capability Analysis (RCA) is a process in which the force/motion capabilities of a manipulator are evaluated. It is very useful in both the design and operational phases of robotics. Traditionally, ellipsoids and polytopes are used to represent these capabilities both graphically and numerically. Ellipsoids are computationally efficient but tend to underestimate capabilities, while polytopes are accurate but computationally intensive. This article proposes a new approach to RCA called the Vector Expansion (VE) method. The VE method offers accurate estimates of robot capabilities in real time and is therefore well suited to applications such as task-based decision making or online path planning. In addition, the method can identify the joint that is limiting a robot capability at a given time, giving insight into how to improve the performance of the robot. The method is then used to estimate the capabilities of 4-DOF planar robots, and the results are discussed and compared with the conventional ellipsoid method. The proposed method is also successfully applied to the 7-DOF Mitsubishi PA10-7C robot.
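For context, the conventional ellipsoid baseline that the article compares against can be computed directly from the manipulator Jacobian: the velocity manipulability ellipsoid's principal axes and semi-axis lengths are the left singular vectors and singular values of J. The sketch below does this for a hypothetical 4-DOF planar arm; it illustrates the baseline method only, not the Vector Expansion method itself, and the pose and link lengths are made up.

```python
import numpy as np

def planar_jacobian(q, lengths):
    """Position Jacobian of a planar serial arm (conventional baseline,
    not the Vector Expansion method). q: joint angles, lengths: link lengths."""
    cum = np.cumsum(q)                      # absolute link angles
    n = len(q)
    J = np.zeros((2, n))
    for i in range(n):
        J[0, i] = -np.sum(lengths[i:] * np.sin(cum[i:]))
        J[1, i] = np.sum(lengths[i:] * np.cos(cum[i:]))
    return J

# Velocity manipulability ellipsoid: axes = left singular vectors of J,
# semi-axis lengths = singular values of J.
q = np.radians([30.0, 45.0, -20.0, 10.0])   # hypothetical 4-DOF pose
L = np.array([0.4, 0.3, 0.2, 0.1])          # hypothetical link lengths (m)
U, sigma, _ = np.linalg.svd(planar_jacobian(q, L))
print("ellipsoid semi-axes:", sigma)
```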


2020 ◽  
Author(s):  
Ashley Dinauer ◽  
Florian Adolphi ◽  
Fortunat Joos

Abstract. Despite intense focus on the ~ 190 permil drop in atmospheric Δ14C across the deglacial “mystery interval”, the specific mechanisms responsible for the apparent Δ14C excess in the glacial atmosphere have received considerably less attention. The computationally efficient Bern3D earth system model of intermediate complexity, designed for long-term climate simulations, allows us to address a very fundamental but still elusive question concerning the atmospheric Δ14C record: How can we explain the persistence of relatively high Δ14C values during the millennia after the Laschamp event? Large uncertainties in the pre-Holocene 14C production rate, as well as in the older portion of the Δ14C record, complicate our qualitative and quantitative interpretation of the glacial Δ14C elevation. Here we begin with sensitivity experiments that investigate the controls on atmospheric Δ14C in more idealized settings. We show that the long-term process of sedimentation may be much more important to the simulation of Δ14C than had been previously thought. In order to provide a bounded estimate of glacial Δ14C change, the Bern3D model was integrated with five available estimates of the 14C production rate as well as reconstructed and hypothesized paleoclimate forcing. Model results demonstrate that none of the available reconstructions of past changes in 14C production can reproduce the elevated Δ14C levels during the last glacial. In order to increase atmospheric Δ14C to glacial levels, a drastic reduction of air-sea exchange efficiency in the polar regions must be assumed, though discrepancies remain for the portion of the record younger than ~ 33 kyr BP. We end with an illustration of how the 14C production rate would have had to evolve to be consistent with the Δ14C record, by combining an atmospheric radiocarbon budget with the Bern3D model. The overall conclusion is that the remaining discrepancies with respect to glacial Δ14C may be linked to an underestimation of 14C production and/or a biased-high reconstruction of Δ14C over the time period of interest. Alternatively, we appear to still be missing an important carbon cycle process for atmospheric Δ14C.
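The closing idea of combining an atmospheric radiocarbon budget with a carbon-cycle model can be illustrated with a deliberately minimal single-box budget: atmospheric 14C changes through production, radioactive decay, and exchange with the ocean/biosphere. The sketch below is only a toy balance with a first-order uptake term; Bern3D resolves ocean, land and sediment reservoirs explicitly, and the parameters here are placeholders.

```python
import numpy as np

# Minimal single-box atmospheric radiocarbon budget (illustrative only).
LAMBDA_14C = 1.0 / 8267.0   # radiocarbon decay constant (1/yr); mean lifetime ~8267 yr

def integrate_budget(production, uptake_rate, n14c0, dt=1.0):
    """production: 14C production series (atoms/yr, arbitrary units);
    uptake_rate: first-order loss of atmospheric 14C to ocean/biosphere (1/yr);
    n14c0: initial atmospheric 14C inventory."""
    n = np.empty(len(production) + 1)
    n[0] = n14c0
    for i, p in enumerate(production):
        loss = (LAMBDA_14C + uptake_rate) * n[i]   # decay plus air-sea/biosphere uptake
        n[i + 1] = n[i] + dt * (p - loss)
    return n
```

Reducing `uptake_rate` in such a box model (the analogue of reduced air-sea exchange efficiency) raises the steady-state atmospheric inventory for a given production rate, which is the qualitative effect the abstract invokes for the glacial Δ14C elevation.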


2013 ◽  
pp. 389-409
Author(s):  
P. Daphne Tsatsoulis ◽  
Aaron Jaech ◽  
Robert Batie ◽  
Marios Savvides

Conventional access control solutions rely on a single authentication to verify a user’s identity but do nothing to ensure that the authenticated user is indeed the same person using the system afterwards. Without continuous monitoring, unauthorized individuals have an opportunity to “hijack” or “tailgate” the original user’s session. Continuous authentication attempts to remedy this security loophole. Biometrics is an attractive solution for continuous authentication because it is unobtrusive yet highly accurate, allowing the authorized user to go about their routine while intruders are quickly detected and blocked. This chapter outlines the components of a multi-biometric continuous authentication system. Our application employs a biometric hand-off strategy in which a strong biometric robustly identifies the user in the first authentication step and then hands control to a less computationally intensive face recognition and tracking system that continuously monitors the presence of the user. Using multiple biometrics allows the system to benefit from the strengths of each modality. Since face verification accuracy degrades as more time elapses between the training stage and operation time, our proposed hand-off strategy permits continuous, robust face verification with relatively simple and computationally efficient classifiers. We provide a detailed evaluation of verification performance using different pattern classification algorithms and show that the final multi-modal biometric hand-off scheme yields high verification performance.
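The hand-off control flow can be summarized as a small loop: an expensive, robust identification step gates the session, after which a cheap per-frame check runs continuously and re-triggers the strong step when presence is lost. The sketch below uses hypothetical interfaces (`strong_auth`, `light_verify`) purely to show the structure; it is not the chapter's implementation.

```python
import time

def continuous_session(strong_auth, light_verify, interval_s=1.0, max_misses=3):
    """Sketch of a biometric hand-off loop (hypothetical interfaces).
    strong_auth() -> user_id or None: expensive, robust identification.
    light_verify(user_id) -> bool: cheap face verification/tracking check."""
    user = strong_auth()                  # step 1: strong biometric gate
    if user is None:
        return "access denied"
    misses = 0
    while True:                           # step 2: continuous lightweight monitoring
        time.sleep(interval_s)
        if light_verify(user):
            misses = 0
        else:
            misses += 1
            if misses >= max_misses:      # presence lost: fall back to strong biometric
                user = strong_auth()
                if user is None:
                    return "session locked"
                misses = 0
```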


2019 ◽  
pp. 443-468
Author(s):  
Michele Russo ◽  
Anna Maria Manferdini

This contribution presents the results of investigations into the reliability of techniques based on the Structure from Motion (SfM) approach for 3D digitization of built heritage. In particular, we tested the performance of different SfM technologies within an architectural survey context and developed a procedure aimed at easing the work of surveyors called on to produce digital representations of artifacts at different scales of complexity. The reconstructed 3D models were compared with each other and with a gold-standard acquisition. These analyses led to qualitative and quantitative evaluations and to considerations on the time and skills required by all tested technologies. Strengths and weaknesses are highlighted, and the integration of different technologies is presented, since it represents the best solution in many recurrent multi-scalar contexts.
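One common way to quantify the comparison between an SfM reconstruction and a gold-standard acquisition is a cloud-to-cloud nearest-neighbour analysis, once both point clouds are in the same coordinate frame. The sketch below shows that generic procedure; it is an assumption about the kind of metric used, not a description of the authors' exact workflow.

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_error(test_points, reference_points):
    """Nearest-neighbour distances from an SfM point cloud to a
    gold-standard reference cloud (both Nx3 arrays, same frame).
    Returns summary statistics commonly reported for such comparisons."""
    tree = cKDTree(reference_points)
    d, _ = tree.query(test_points, k=1)          # distance to closest reference point
    return {"mean": d.mean(),
            "rmse": np.sqrt((d ** 2).mean()),
            "p95": np.percentile(d, 95)}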


2019 ◽  
Vol 60 (9) ◽  
pp. 1953-1960 ◽  
Author(s):  
Misato Ohtani ◽  
Andreas Wachter

Abstract Post-transcriptional RNA quality control is a vital issue for all eukaryotes to secure accurate gene expression, both on a qualitative and quantitative level. Among the different mechanisms, nonsense-mediated mRNA decay (NMD) is an essential surveillance system that triggers degradation of both aberrant and physiological transcripts. By targeting a substantial fraction of all transcripts for degradation, including many alternative splicing variants, NMD has a major impact on shaping transcriptomes. Recent progress on the transcriptome-wide profiling and physiological analyses of NMD-deficient plant mutants revealed crucial roles for NMD in gene regulation and environmental responses. In this review, we will briefly summarize our current knowledge of the recognition and degradation of NMD targets, followed by an account of NMD’s regulation and physiological functions. We will specifically discuss plant-specific aspects of RNA quality control and its functional contribution to the fitness and environmental responses of plants.


2020 ◽  
Vol 2020 ◽  
pp. 1-14
Author(s):  
Amr Ismail ◽  
Mostafa Herajy ◽  
Elsayed Atlam ◽  
Monika Heiner

Three-dimensional modelling of biological systems is imperative for studying the behaviour of dynamic systems whose components interact in space. However, only a few formal tools offer convenient modelling of such systems. The traditional approach to constructing and simulating 3D models is to build a system of partial differential equations (PDEs). Although this approach may be computationally efficient and has been employed by many researchers over the years, it is not always intuitive, since it does not provide a visual depiction of the modelled systems. Indeed, visual modelling can help form a mental image that ultimately contributes to understanding the problem under study. Coloured Hybrid Petri Nets (HPNC) are a high-level representation of classical Petri nets that offer hybrid as well as spatial modelling of biological systems. In addition to their graphical representation, HPNC models are also scalable. This paper shows how HPNC can be used to construct and simulate systems that require three-dimensional as well as hybrid (stochastic/continuous) modelling. We use calcium diffusion in three dimensions to illustrate our main ideas. More specifically, we show that creating 3D models using HPNC can yield more flexible models, as the structure can easily be scaled up and down by modifying just a few parameters. This convenient model configuration facilitates the design of different experiments without the need to alter the model structure.
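For comparison, the "traditional PDE route" the abstract contrasts with HPNC amounts to discretizing the diffusion equation on a 3D grid. The sketch below performs one explicit finite-difference step of dc/dt = D·∇²c; grid size, diffusion coefficient and boundary conditions (periodic, via np.roll) are illustrative choices, not taken from the paper.

```python
import numpy as np

def diffuse_step(c, D, dx, dt):
    """One explicit finite-difference step of 3D diffusion on a regular,
    periodic grid (illustrative of the traditional PDE approach).
    Stability requires dt <= dx**2 / (6*D)."""
    lap = (
        np.roll(c, 1, 0) + np.roll(c, -1, 0) +
        np.roll(c, 1, 1) + np.roll(c, -1, 1) +
        np.roll(c, 1, 2) + np.roll(c, -1, 2) - 6.0 * c
    ) / dx ** 2
    return c + dt * D * lap

# e.g. calcium released at the centre of a small grid
c = np.zeros((32, 32, 32))
c[16, 16, 16] = 1.0
for _ in range(100):
    c = diffuse_step(c, D=0.2, dx=1.0, dt=0.5)   # dt below dx**2/(6D) ≈ 0.83
```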


Crystals ◽  
2020 ◽  
Vol 10 (6) ◽  
pp. 478 ◽  
Author(s):  
Alexander Kozlov ◽  
Andrew V. Martin ◽  
Harry M. Quiney

X-ray free-electron laser pulses initiate a complex series of changes to the electronic and nuclear structure of matter on femtosecond timescales. These damage processes include widespread ionization, the formation of a quasi-plasma state and the ultimate explosion of the sample due to Coulomb forces. The accurate simulation of these dynamical effects is critical in designing feasible XFEL experiments and interpreting the results. Current molecular dynamics simulations are, however, computationally intensive, particularly when they treat unbound electrons as classical point particles. On the other hand, plasma simulations are computationally efficient but do not model atomic motion. Here we present a hybrid approach to XFEL damage simulation that combines molecular dynamics for the nuclear motion and plasma models to describe the evolution of the low-energy electron continuum. The plasma properties of the unbound electron gas are used to define modified inter-ionic potentials for the molecular dynamics, including Debye screening and drag forces. The hybrid approach is significantly faster than damage simulations that treat unbound electrons as classical particles, enabling simulations to be performed on large sample volumes.
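The Debye-screened inter-ionic interaction mentioned in the abstract has the textbook Yukawa form: a Coulomb term attenuated by exp(-r/λ_D). The sketch below writes down that pair potential only; the actual hybrid simulations also include drag forces and a time-evolving plasma state, and no parameters here are taken from the paper.

```python
import numpy as np

# Debye-screened (Yukawa) inter-ionic pair potential (illustrative form only).
E_CHARGE = 1.602176634e-19      # elementary charge, C
EPS0 = 8.8541878128e-12         # vacuum permittivity, F/m

def screened_coulomb(r, z1, z2, debye_length):
    """Pair potential energy (J) between ions of charge z1*e and z2*e at
    separation r (m), screened by the unbound-electron gas with the given
    Debye length (m)."""
    coulomb = z1 * z2 * E_CHARGE ** 2 / (4.0 * np.pi * EPS0 * r)
    return coulomb * np.exp(-r / debye_length)
```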


Robotica ◽  
1993 ◽  
Vol 11 (2) ◽  
pp. 111-118 ◽  
Author(s):  
Greg R. Luecke ◽  
John F. Gardner

SUMMARY
Almost all industrial robot applications in use today are controlled using a simple and computationally efficient control law: local joint-error feedback. When two or more open-chain manipulators cooperate to manipulate the same object - such as in mechanical grippers, walking machines, and cooperating manipulator systems - closed kinematic chain, redundantly actuated mechanisms are formed. Control approaches for this type of system focus on the more computationally intensive computed-torque or inverse-plant control laws, due to concern over instability caused by the unspecified distribution of control forces in the redundant actuator space, and due to the constrained motion caused by the closed kinematic chains.
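The "local joint-error feedback" referred to above is, in its simplest form, independent-joint PD control: each joint torque depends only on that joint's own position and velocity error, with no dynamic model required. A minimal sketch (gains and trajectories hypothetical):

```python
import numpy as np

def joint_pd_torque(q, qd, q_des, qd_des, kp, kd):
    """Independent-joint PD (local joint-error feedback) control law.
    q, qd: measured joint positions/velocities; q_des, qd_des: desired values;
    kp, kd: per-joint gains. Unlike computed-torque control, no inverse
    dynamic model of the mechanism is needed."""
    q, qd, q_des, qd_des = map(np.asarray, (q, qd, q_des, qd_des))
    return kp * (q_des - q) + kd * (qd_des - qd)
```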


1968 ◽  
Vol 17 (2) ◽  
pp. 333-358 ◽  
Author(s):  
P. Parisi ◽  
M. Di Bacco

Summary
A twin study was undertaken with the twofold aim (a) of studying the hereditary behaviour of digital dermatoglyphic traits at both the qualitative and quantitative level, and (b) of working out a method for discriminating MZ and DZ twins by means of fingerprints. Fingerprints of 50 MZ (25 ♂ and 25 ♀) and 50 DZ (25 ♂ and 25 ♀) twin pairs were examined and analyzed by means of a special methodology and a 7044/K32 IBM computer. The qualitative analysis showed a significantly higher concordance in MZ than in DZ twin pairs, with a certain variability of single-finger concordance values. The quantitative analysis showed significantly higher correlation values in MZ than in DZ twin pairs, with very narrow confidence intervals in the former. Single ridge counts apparently behave as cumulative counts on the five or ten fingers, although with an obviously higher random variability. Digital dermatoglyphics thus appear to show practically complete genetic conditioning, which, rather than acting at a cumulative level for the ten fingers, as is largely believed, appears to act on single-finger quali-quantitative traits. The total finger ridge count, rather than a trait in itself, appears only to be a useful but artificial cumulative value. Applied to the diagnosis of zygosity, it nevertheless provides, by itself, a fairly high overall probability (0.86) of a correct diagnosis.
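The kind of quantitative comparison described (higher within-pair correlation of ridge counts in MZ than in DZ twins) can be illustrated with a simple within-pair Pearson correlation. The data below are synthetic placeholders chosen only to show the contrast; they are not the study's measurements.

```python
import numpy as np

def twin_pair_correlation(counts_a, counts_b):
    """Pearson correlation between co-twins' total finger ridge counts;
    counts_a[i] and counts_b[i] belong to the same twin pair."""
    return np.corrcoef(counts_a, counts_b)[0, 1]

# Hypothetical illustration: MZ pairs correlate more strongly than DZ pairs.
rng = np.random.default_rng(0)
base = rng.normal(130, 25, 50)                                   # first co-twin counts
mz_r = twin_pair_correlation(base, base + rng.normal(0, 5, 50))  # near-identical co-twins
dz_r = twin_pair_correlation(base, 0.5 * base + rng.normal(65, 20, 50))
print(f"MZ r = {mz_r:.2f}, DZ r = {dz_r:.2f}")
```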

