straightforward solution
Recently Published Documents

TOTAL DOCUMENTS: 62 (five years: 22)
H-INDEX: 10 (five years: 2)
2021 ◽  
Author(s):  
Sharut Gupta ◽  
Praveer Singh ◽  
Ken Chang ◽  
Liangqiong Qu ◽  
Mehak Aggarwal ◽  
...  

Abstract Model brittleness is a key concern when deploying deep learning models in real-world medical settings. A model that achieves high performance on one dataset may suffer a significant decline in performance when tested on different datasets. While pooling datasets from multiple hospitals and re-training may provide a straightforward solution, it is often infeasible and may compromise patient privacy. An alternative approach is to fine-tune the model on subsequent datasets after training on the original dataset. Notably, this approach degrades model performance on the original dataset, a phenomenon known as catastrophic forgetting. In this paper, we develop an approach to address catastrophic forgetting based on elastic weight consolidation combined with modulation of batch normalization statistics, under three scenarios: 1) expanding the domain from one imaging system’s data to another imaging system’s; 2) expanding the domain from a large multi-hospital dataset to another single-hospital dataset; 3) expanding the domain from a dataset from one geographic region to a dataset from another geographic region. Focusing on the clinical use cases of mammographic breast density detection and retinopathy of prematurity stage diagnosis, we show that our approach outperforms several other state-of-the-art approaches, and we provide theoretical justification for the efficacy of batch normalization modulation. The results of this study are generally applicable to the deployment of any clinical deep learning model that requires domain expansion.
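The elastic weight consolidation idea mentioned above can be sketched numerically: the fine-tuning loss is augmented with a quadratic penalty that anchors parameters deemed important on the original domain, as measured by the diagonal Fisher information. This is a minimal toy illustration of the penalty term only, not the paper's implementation; all values below are hypothetical.

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam):
    """Elastic weight consolidation penalty: lam/2 * sum_i F_i * (theta_i - theta*_i)^2.

    theta      -- current parameters (flat array), drifting during fine-tuning
    theta_star -- parameters learned on the original domain
    fisher     -- diagonal Fisher information estimated on the original domain
    lam        -- strength of the anchoring to the original task
    """
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

# Toy illustration: a parameter that was important on the original task
# (high Fisher value) is penalised far more strongly for drifting.
theta_star = np.array([1.0, -2.0])
fisher = np.array([10.0, 0.1])   # first parameter matters much more
drift = np.array([0.5, 0.5])
print(ewc_penalty(theta_star + drift, theta_star, fisher, lam=1.0))  # 1.2625
```

During fine-tuning on the new domain, this penalty is simply added to the task loss, so parameters unimportant to the original dataset remain free to adapt.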


Water ◽  
2021 ◽  
Vol 13 (21) ◽  
pp. 3100
Author(s):  
Arianna Catenacci ◽  
Matteo Grana ◽  
Francesca Malpei ◽  
Elena Ficara

Anaerobic co-digestion in wastewater treatment plants increasingly looks like a straightforward solution to many issues arising from the operation of mono-digestion. Process modelling is relevant to predict plant behavior and its sensitivity to operational parameters, and to assess the feasibility of simultaneously feeding a digester with different organic wastes. Still, much work remains to be done to turn anaerobic digestion modelling into a reliable and practical tool. Indeed, the complex biochemical processes described in the ADM1 model require the identification of several parameters and many analytical determinations for substrate characterization. A combined protocol including batch Biochemical Methane Potential tests and analytical determinations is proposed and applied for influent substrate characterization to simulate a pilot-scale anaerobic digester in which co-digestion of waste sludge and expired yogurt was carried out. An iterative procedure was also developed to improve the fit of batch tests for kinetic parameter identification. The results are encouraging: the iterative procedure significantly reduced Theil’s Inequality Coefficient (TIC), used to evaluate the goodness of fit of the model for alkalinity, total volatile fatty acids, pH, COD, volatile solids, and ammoniacal nitrogen. Improvements in the TIC values, compared to the first iteration, ranged between 30 and 58%.
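Theil's Inequality Coefficient used above to score goodness of fit can be computed directly from an observed and a simulated series; a value of 0 indicates a perfect fit and values approaching 1 a poor one. The series below are illustrative numbers, not the study's data.

```python
import numpy as np

def theil_ic(observed, simulated):
    """Theil's Inequality Coefficient:
    TIC = RMSE / (sqrt(mean(obs^2)) + sqrt(mean(sim^2))),
    bounded in [0, 1]; 0 means the simulation matches the data exactly."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    rmse = np.sqrt(np.mean((observed - simulated) ** 2))
    return rmse / (np.sqrt(np.mean(observed ** 2)) + np.sqrt(np.mean(simulated ** 2)))

# e.g. measured vs. simulated total VFA concentrations over four sampling times
obs = [1.2, 1.5, 1.1, 0.9]
sim = [1.3, 1.4, 1.0, 1.0]
print(theil_ic(obs, sim))
```

A fitting loop such as the study's iterative procedure would adjust kinetic parameters, re-simulate, and keep the parameter set that lowers this coefficient for each monitored variable.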


2021 ◽  
Vol 22 (3) ◽  
Author(s):  
Zeinab Torabi ◽  
Somaye Timarchi

Comparison, division and sign detection are considered complicated operations in the residue number system (RNS). A straightforward solution is to convert RNS numbers into binary format and then perform the complicated operations using conventional binary operators. If efficient circuits were provided for comparison, division and sign detection, the application of RNS could be extended to cases involving these operations. For RNS comparison in the 3-moduli set , we have found only one hardware realization. In this paper, an efficient RNS comparator is proposed for the moduli set , which employs a sign detection method and operates more efficiently than its counterparts. The proposed sign detector and comparator utilize dynamic range partitioning (DRP), which has recently been presented for unsigned RNS comparison. The delay and cost of the proposed comparator are lower than those of previous works, making it appropriate for RNS applications with limited delay and cost budgets.
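The "straightforward solution" of converting to binary before comparing can be sketched with the Chinese remainder theorem; this is the costly baseline that a dedicated comparator such as the DRP-based one avoids in hardware. The moduli set {7, 8, 9} below (i.e. {2^n − 1, 2^n, 2^n + 1} with n = 3) is only an example, not the set from the paper.

```python
from math import prod

def rns_to_int(residues, moduli):
    """Chinese-remainder reconstruction: the 'straightforward' RNS-to-binary
    conversion that makes comparison trivial, at high hardware cost."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(Mi, -1, m): modular inverse (Python 3.8+)
    return x % M

def rns_compare(a, b, moduli):
    """Compare two RNS numbers by converting both to binary first.
    Returns -1, 0, or 1."""
    xa, xb = rns_to_int(a, moduli), rns_to_int(b, moduli)
    return (xa > xb) - (xa < xb)

moduli = (7, 8, 9)
a = (25 % 7, 25 % 8, 25 % 9)    # residue representation of 25
b = (42 % 7, 42 % 8, 42 % 9)    # residue representation of 42
print(rns_compare(a, b, moduli))  # -1, since 25 < 42
```

Techniques like dynamic range partitioning aim to decide the ordering from the residues directly, skipping this full reconstruction.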


Author(s):  
Ludovic Chatellier

Lagrangian Particle Tracking (LPT) has become a near-standard approach for performing accurate 3D flow measurements, thanks notably to the technical breakthroughs brought by the Iterative Particle Reconstruction (IPR: Wieneke, 2013) and Shake-the-Box (STB: Schanz et al., 2016) procedures. These decisive advances have triggered a number of studies on the eduction of flow kinematics and dynamics based on particle trajectory analyses. Novara & Scarano (2013), among others, focused on polynomial approximations of the trajectories, which analytically provide the material derivatives used to estimate pressure gradients. In particular, approximations based on second-order polynomial fits of a small number of particle positions are used in commercially available software and among research teams as a straightforward solution to obtain the first- and second-order derivatives with a limited effect of the measurement noise. Additionally, the analyses conducted during the 2020 LPT challenge (Leclaire, 2020; Sciacchitano, 2020) have addressed the performance of methodologies used by different groups with respect to second-order trajectory fits for both multi-pulse and four-pulse (Novara et al., 2016) LPT cases. On more advanced theoretical grounds, Gesemann et al. (2016) have proposed the TrackFit approach using penalized B-splines, with considerations on the time-varying acceleration rate (i.e. jolt or jerk) and the spectral content of noisy particle tracks.
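The second-order polynomial fitting described above can be sketched as a least-squares fit over a short noisy track, with velocity and acceleration read off the fitted coefficients. The time steps, noise level, and motion below are hypothetical, chosen only to illustrate the procedure.

```python
import numpy as np

# Fit x(t) ~ c0 + c1*t + c2*t^2 over a short particle track, so that
# velocity = c1 + 2*c2*t and acceleration = 2*c2 (constant for the fit window).
t = np.linspace(0.0, 0.05, 7)                    # 7 time steps over 50 ms
x_true = 2.0 + 3.0 * t + 0.5 * 9.0 * t**2        # constant acceleration of 9
noise = np.random.default_rng(0).normal(0.0, 1e-5, t.size)
x_meas = x_true + noise                          # noisy measured positions

c2, c1, c0 = np.polyfit(t, x_meas, deg=2)        # highest power first
t_mid = t[t.size // 2]
velocity = c1 + 2.0 * c2 * t_mid                 # first derivative at window centre
acceleration = 2.0 * c2                          # second derivative
print(velocity, acceleration)
```

Because the fit spans several positions, random measurement noise is averaged out of the derivatives, which is precisely why short polynomial windows are a popular pragmatic choice before computing material derivatives and pressure gradients.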


2021 ◽  
Vol 1 (38) ◽  
pp. 99-114
Author(s):  
Yurii Kovbasko

The current shift of linguistic paradigms and the loss of interest in the previously mainstream ‘parts of speech theory’ do not imply that all ambiguities and outstanding issues have been successfully resolved. On the contrary, these issues have been put on pause, as linguists, having come to naught and being unable to put forward a univocal, straightforward solution, started refocusing their scientific pursuits. Nevertheless, the problem of parts of speech overlapping has remained of vital importance, even if it stands in the background of linguistic research. This issue must be addressed from both theoretical and practical perspectives. The present study attempts to give a theoretical overview of the grammatical approaches to prepositions, adverbs, conjunctions, and particles that prevailed in Late Modern English grammar. The analysis is based on 400 English grammar books published over the Late Modern English period, and is divided into three sections in conformity with three historical periods, viz. 1700–1799, 1800–1849, and 1850–1899, respectively. The research presents the major tendencies in the identification of the aforementioned categories, which characterize each historical period in English grammar and explain the current state of affairs in parts of speech theory, providing a theoretical background for further practical research on parts of speech overlapping.


Entropy ◽  
2021 ◽  
Vol 23 (5) ◽  
pp. 530
Author(s):  
Milton Silva ◽  
Diogo Pratas ◽  
Armando J. Pinho

Recently, the scientific community has witnessed a substantial increase in the generation of protein sequence data, triggering emergent challenges of increasing importance, namely efficient storage and improved data analysis. For both applications, data compression is a straightforward solution. However, in the literature, the number of specific protein sequence compressors is relatively low. Moreover, these specialized compressors only marginally improve the compression ratio over the best general-purpose compressors. In this paper, we present AC2, a new lossless data compressor for protein (or amino acid) sequences. AC2 uses a neural network to mix experts with a stacked generalization approach and individual cache-hash memory models for the highest context orders. Compared to the previous compressor (AC), we show gains of 2–9% and 6–7% in reference-free and reference-based modes, respectively. These gains come at the cost of three times slower computations. AC2 also improves memory usage over AC, with requirements about seven times lower, unaffected by the sequences’ input size. As an analysis application, we use AC2 to measure the similarity between each SARS-CoV-2 protein sequence and each viral protein sequence from the whole UniProt database. The results consistently show the highest similarity to the pangolin coronavirus, followed by the bat and human coronaviruses, contributing critical results to a currently controversial subject. AC2 is available for free download under the GPLv3 license.
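As a rough illustration of the mixing-of-experts idea (not AC2's neural-network mixer or its cache-hash models), the sketch below blends two finite-context models of different orders, reweighting each expert by its recent predictive performance and accumulating the resulting code length in bits. The toy amino-acid sequence is hypothetical.

```python
import math
from collections import defaultdict

def make_model(order):
    # Finite-context (Markov) model: counts of next-symbol per context.
    return {"order": order, "counts": defaultdict(lambda: defaultdict(int))}

def predict(model, history, alphabet):
    ctx = history[-model["order"]:] if model["order"] else ""
    counts = model["counts"][ctx]
    total = sum(counts.values()) + len(alphabet)          # Laplace smoothing
    return {a: (counts[a] + 1) / total for a in alphabet}

def update(model, history, symbol):
    ctx = history[-model["order"]:] if model["order"] else ""
    model["counts"][ctx][symbol] += 1

def mixed_code_length(sequence, orders=(1, 2)):
    """Bits needed to code `sequence` under a performance-weighted mixture
    of finite-context experts (an arithmetic coder would realise this size)."""
    alphabet = sorted(set(sequence))
    experts = [make_model(o) for o in orders]
    weights = [1.0 / len(experts)] * len(experts)
    bits = 0.0
    for i, sym in enumerate(sequence):
        history = sequence[:i]
        preds = [predict(e, history, alphabet) for e in experts]
        p = sum(w * pr[sym] for w, pr in zip(weights, preds))
        bits += -math.log2(p)
        # Reweight experts by how well each one predicted the observed symbol.
        weights = [w * pr[sym] for w, pr in zip(weights, preds)]
        s = sum(weights)
        weights = [w / s for w in weights]
        for e in experts:
            update(e, history, sym)
    return bits

seq = "MKVLAMKVLAMKVLA"          # toy repetitive amino-acid sequence
print(mixed_code_length(seq))    # repetition drives the code length down
```

On repetitive input the higher-order expert quickly dominates the mixture, which is the basic mechanism that lets context mixing beat any single fixed-order model.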


2021 ◽  
pp. 147488512098460
Author(s):  
Adrienne de Ruiter

Dehumanisation is a puzzling phenomenon. Nazi propaganda likened the Jews to rats, but also portrayed them as ‘poisoners of culture’. In the Soviet Union, the Stalinist regime called opponents vermin, yet put them on show trials. During the Rwandan genocide, the Hutus identified the Tutsis with cockroaches, but nonetheless raped Tutsi women. These examples reveal tensions in the way in which dehumanisers perceive, portray and treat victims. Dehumanisation seems to require that perpetrators both deny and acknowledge the humanity of their victims in certain ways. Several scholars have proposed solutions to this so-called ‘paradox of dehumanisation’ that question the usefulness of dehumanisation as a concept to explain genocidal violence, claim that dehumanisation is characterised by an unstable belief in the non-human essence of the dehumanised, or contend that dehumanisation revolves around a denial of metaphysical human status. The main aim of this article is to present a novel framework for theorising dehumanisation that offers a more straightforward solution to this paradox based on the idea that perpetrators deny their victims’ human standing in a moral sense without necessarily negating their biological human status or human subjectivity. The article illustrates this framework through examples drawn from Primo Levi’s memoirs of Auschwitz.


Author(s):  
Aun Yichiet ◽  
Jasmina Khaw Yen Min ◽  
Gan Ming Lee ◽  
Low Jun Sheng

The semantic diversity of policies written by people with different levels of IT literacy to achieve certain network security or performance goals creates ambiguity in otherwise straightforward solution implementations. In this project, an intent-aware solution recommender is designed to decode semantic cues in network policies written by various demographics for robust solution recommendations. A novel policy analyzer is designed to extract the intrinsic networking intents from ICT policies and provide context-specific solution recommendations. A custom network-aware intent recognizer is trained on a small keywords-to-intents dataset annotated by domain experts, using NLP algorithms in AWS Comprehend. The bag-of-words model is then used to classify sentences in the policies into a predicted ‘intent’ class. A collaborative filtering recommendation system using crowd-sourced ground truth is designed to suggest optimal architecting solutions to achieve the requirements outlined in ICT policies.
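The bag-of-words intent classification step can be sketched with a tiny multinomial Naive Bayes classifier standing in for the AWS Comprehend pipeline; the training sentences, intent labels, and test sentence below are all hypothetical.

```python
import math
from collections import Counter, defaultdict

# Illustrative keywords-to-intents training data (not the project's dataset).
TRAIN = [
    ("block all inbound traffic on port 23", "security"),
    ("deny remote login except over vpn", "security"),
    ("guarantee low latency for voip calls", "performance"),
    ("prioritise video conferencing bandwidth", "performance"),
]

def train(samples):
    word_counts = defaultdict(Counter)   # per-class bag of words
    class_counts = Counter()
    vocab = set()
    for text, label in samples:
        words = text.split()
        word_counts[label].update(words)
        class_counts[label] += 1
        vocab.update(words)
    return word_counts, class_counts, vocab

def classify(text, model):
    word_counts, class_counts, vocab = model
    total = sum(class_counts.values())
    best, best_score = None, float("-inf")
    for label in class_counts:
        # log P(class) + sum of log P(word | class), Laplace-smoothed
        score = math.log(class_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.split():
            score += math.log((word_counts[label][w] + 1) / denom)
        if score > best_score:
            best, best_score = label, score
    return best

model = train(TRAIN)
print(classify("deny inbound telnet traffic", model))  # security
```

The predicted intent class would then key into the collaborative-filtering stage, which ranks candidate architecting solutions for that intent.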


2021 ◽  
Vol 33 ◽  
pp. 102041
Author(s):  
Abdalazeez Ismail Mohamed Albashir ◽  
Wen Shang ◽  
Mohammed Kamal Hadi ◽  
Junlei Zhang ◽  
Tianyun Zhang ◽  
...  
