Routing Density Analysis of Area-Efficient Ring Oscillator Physically Unclonable Functions

2021 ◽  
Vol 11 (20) ◽  
pp. 9730
Author(s):  
Zulfikar Zulfikar ◽  
Norhayati Soin ◽  
Sharifah Fatmadiana Wan Muhamad Hatta ◽  
Mohamad Sofian Abu Talip ◽  
Anuar Jaafar

Research into ring oscillator physically unclonable functions (RO-PUFs) continues to expand due to their simple structure, ease of response generation, and promise as a security primitive. However, no substantial study has yet developed an FPGA-based RO-PUF design that effectively balances performance and area efficiency. This work proposes a modified RO-PUF in which the ring oscillators are connected directly to the counters. The proposed RO-PUF requires fewer ROs than the conventional structure because it uses the direct pulse count method. This work seeks the ideal routing density of ROs to improve uniqueness. For this purpose, five logic arrangements spanning a wide range of RO routing densities were tested. Upon implementation on the FPGA chip, the routing densities of the ROs varied significantly in terms of wire utilization (higher than 25%) and routing hotspots (higher than 80%). The best uniqueness attained was 52.71%, while the highest reliability was 99.51%. This study improves the uniqueness by 2% after applying scenarios that restrict consideration to ROs within a narrow range of routing density. The best ranges of wire utilization and routing hotspots for individual ROs in this work are 3–5% and 20–50%, respectively. The performance metrics (uniqueness and reliability) of the proposed RO-PUF are much better than those of existing works using a similar FPGA platform (Altera), and as good as recent RO-PUFs realized on Xilinx. Additionally, this work estimates the minimum runtimes needed to reduce error and response bit-flips of the RO-PUF.
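Uniqueness for PUFs is conventionally reported as the mean pairwise inter-chip Hamming distance, with 50% as the ideal; a minimal sketch of that textbook metric (not the authors' code), assuming responses are available as equal-length bitstrings:

```python
from itertools import combinations

def uniqueness(responses):
    """Mean pairwise inter-chip Hamming distance, as a percentage.

    `responses` is a list of equal-length response bitstrings, one per
    chip; the ideal value for a PUF is 50%.
    """
    n = len(responses[0])
    pairs = list(combinations(responses, 2))
    total = sum(sum(a != b for a, b in zip(r1, r2)) / n
                for r1, r2 in pairs)
    return 100.0 * total / len(pairs)

# Complementary responses differ in every bit, identical ones in none,
# so three chips with responses 0101, 1010, 0101 give (100+0+100)/3.
print(uniqueness(["0101", "1010", "0101"]))  # 66.66...
```

A figure such as the reported 52.71% would come out of exactly this kind of averaging over all chip pairs.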

Author(s):  
Svitlana Lobchenko ◽  
Tetiana Husar ◽  
Viktor Lobchenko

This article presents the results of studies of spermatozoa viability at different incubation times, concentrations, and diluents. The spermatozoa were diluted with: 1) their native plasma; 2) medium 199; 3) a mixture of equal volumes of plasma and medium 199. The experiment was designed to generate samples with spermatozoa concentrations of 0.2, 0.1, 0.05, and 0.025 billion/ml, prepared according to the method. The sperm was evaluated after 2, 4, 6, and 8 hours. Such a study offers significant perspective, since it makes it possible to examine various aspects of the subject across a wide range of conditions; to this end, a series of experiments was conducted in this area. The data obtained were statistically processed, allowing the results of each stage of the study to be highlighted. In particular, this article identifies regularities between sperm viability, the type of diluent, and the rate of rarefaction, as evidenced by the data presented in the tables. After incubation, spermatozoa viability remains, at least as a trend, highest when sperm are diluted to a concentration of 0.1 billion/ml, regardless of the diluent used. At this concentration, medium 199 is no better than native plasma, or than its mixture with an equal volume of plasma, for maintaining sperm viability at any incubation time. Most often it is at this concentration that viability shows the lowest coefficient of variation regardless of the diluent used, which may indicate the greatest stability of the result under these conditions. The viability of spermatozoa at a concentration of 0.1 billion/ml decreases statistically significantly only after 6 or even 8 hours of incubation.
If the sperm are incubated for only 2 hours, the concentrations tested do not affect viability, regardless of the diluent used. Key words: boar, spermatozoa, sperm plasma, concentration, incubation, medium 199, activity, viability, rarefaction.


1996 ◽  
Vol 118 (3) ◽  
pp. 439-443 ◽  
Author(s):  
Chuen-Huei Liou ◽  
Hsiang Hsi Lin ◽  
F. B. Oswald ◽  
D. P. Townsend

This paper presents a computer simulation showing how the gear contact ratio affects the dynamic load on a spur gear transmission. The contact ratio can be affected by the tooth addendum, the pressure angle, the tooth size (diametral pitch), and the center distance. The analysis presented in this paper was performed by using the NASA gear dynamics code DANST. In the analysis, the contact ratio was varied over the range 1.20 to 2.40 by changing the length of the tooth addendum. In order to simplify the analysis, other parameters related to contact ratio were held constant. The contact ratio was found to have a significant influence on gear dynamics. Over a wide range of operating speeds, a contact ratio close to 2.0 minimized dynamic load. For low-contact-ratio gears (contact ratio less than two), increasing the contact ratio reduced gear dynamic load. For high-contact-ratio gears (contact ratio equal to or greater than 2.0), the selection of contact ratio should take into consideration the intended operating speeds. In general, high-contact-ratio gears minimized dynamic load better than low-contact-ratio gears.
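The contact ratio that the paper varies via the addendum length can be computed from the standard involute-geometry relation (length of the line of action divided by the base pitch); a sketch of that textbook formula, not the DANST code, with illustrative tooth counts and module:

```python
import math

def contact_ratio(z1, z2, module, pressure_angle_deg, addendum_coeff=1.0):
    """Contact ratio of a standard external spur gear pair.

    z1, z2: tooth counts; module: mm; addendum_coeff: addendum in
    multiples of the module (1.0 for standard teeth; larger values
    lengthen the addendum and raise the contact ratio).
    """
    phi = math.radians(pressure_angle_deg)
    r1, r2 = z1 * module / 2, z2 * module / 2           # pitch radii
    ra1 = r1 + addendum_coeff * module                  # tip radii
    ra2 = r2 + addendum_coeff * module
    rb1, rb2 = r1 * math.cos(phi), r2 * math.cos(phi)   # base radii
    base_pitch = math.pi * module * math.cos(phi)
    line_of_action = (math.sqrt(ra1**2 - rb1**2)
                      + math.sqrt(ra2**2 - rb2**2)
                      - (r1 + r2) * math.sin(phi))
    return line_of_action / base_pitch

# A standard 20-degree, 28/28-tooth pair sits below 2.0; lengthening
# the addendum pushes it into the high-contact-ratio regime.
print(round(contact_ratio(28, 28, 3.18, 20), 2))                       # 1.64
print(round(contact_ratio(28, 28, 3.18, 20, addendum_coeff=1.35), 2))  # 2.11
```

Holding the pressure angle, tooth counts, and center distance fixed while scaling the addendum mirrors how the paper sweeps the contact ratio from 1.20 to 2.40.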


2021 ◽  
Vol 11 (13) ◽  
pp. 5859
Author(s):  
Fernando N. Santos-Navarro ◽  
Yadira Boada ◽  
Alejandro Vignoni ◽  
Jesús Picó

Optimal gene expression is central for the development of both bacterial expression systems for heterologous protein production, and microbial cell factories for industrial metabolite production. Our goal is to fulfill industry-level overproduction demands optimally, as measured by the following key performance metrics: titer, productivity rate, and yield (TRY). Here we use a multiscale model incorporating the dynamics of (i) the cell population in the bioreactor, (ii) the substrate uptake and (iii) the interaction between the cell host and expression of the protein of interest. Our model predicts cell growth rate and cell mass distribution between enzymes of interest and host enzymes as a function of substrate uptake and the following main lab-accessible gene expression-related characteristics: promoter strength, gene copy number and ribosome binding site strength. We evaluated the differential roles of gene transcription and translation in shaping TRY trade-offs for a wide range of expression levels and the sensitivity of the TRY space to variations in substrate availability. Our results show that, at low expression levels, gene transcription mainly defined TRY, and gene translation had a limited effect; whereas, at high expression levels, TRY depended on the product of both, in agreement with experiments in the literature.
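The TRY metrics can be made concrete with their textbook batch-process definitions; a minimal sketch with hypothetical numbers, not the authors' multiscale model:

```python
def try_metrics(product_g_per_l, substrate_consumed_g_per_l, hours):
    """Textbook batch-process definitions of the TRY metrics.

    Titer: final product concentration (g/L); productivity rate:
    titer per unit time (g/L/h); yield: product formed per unit of
    substrate consumed (g/g).
    """
    titer = product_g_per_l
    rate = titer / hours
    yld = product_g_per_l / substrate_consumed_g_per_l
    return titer, rate, yld

# Hypothetical run: 12 g/L of product from 40 g/L of substrate in 24 h.
titer, rate, yld = try_metrics(12.0, 40.0, 24.0)
print(titer, rate, yld)  # 12.0 0.5 0.3
```

The trade-off the paper explores arises because the knobs it models (promoter strength, copy number, RBS strength) move these three quantities in different directions.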


Microbiome ◽  
2021 ◽  
Vol 9 (1) ◽  
Author(s):  
Dieter M. Tourlousse ◽  
Koji Narita ◽  
Takamasa Miura ◽  
Mitsuo Sakamoto ◽  
Akiko Ohashi ◽  
...  

Abstract Background Validation and standardization of methodologies for microbial community measurements by high-throughput sequencing are needed to support human microbiome research and its industrialization. This study set out to establish standards-based solutions to improve the accuracy and reproducibility of metagenomics-based microbiome profiling of human fecal samples. Results In the first phase, we performed a head-to-head comparison of a wide range of protocols for DNA extraction and sequencing library construction using defined mock communities, to identify performant protocols and pinpoint sources of inaccuracy in quantification. In the second phase, we validated performant protocols with respect to their variability of measurement results within a single laboratory (that is, intermediate precision) as well as interlaboratory transferability and reproducibility through an industry-based collaborative study. We further ascertained the performance of our recommended protocols in the context of a community-wide interlaboratory study (that is, the MOSAIC Standards Challenge). Finally, we defined performance metrics to provide best practice guidance for improving measurement consistency across methods and laboratories. Conclusions The validated protocols and methodological guidance for DNA extraction and library construction provided in this study expand current best practices for metagenomic analyses of human fecal microbiota. Uptake of our protocols and guidelines will improve the accuracy and comparability of metagenomics-based studies of the human microbiome, thereby facilitating development and commercialization of human microbiome-based products.


2021 ◽  
Vol 11 (8) ◽  
pp. 3623
Author(s):  
Omar Said ◽  
Amr Tolba

Employment of the Internet of Things (IoT) technology in the healthcare field can contribute to recruiting heterogeneous medical devices and creating smart cooperation between them. This cooperation leads to an increase in the efficiency of the entire medical system, thus accelerating the diagnosis and curing of patients, in general, and rescuing critical cases in particular. In this paper, a large-scale IoT-enabled healthcare architecture is proposed. To achieve a wide range of communication between healthcare devices, not only are Internet coverage tools utilized but also satellites and high-altitude platforms (HAPs). In addition, the clustering idea is applied in the proposed architecture to facilitate its management. Moreover, healthcare data are prioritized into several levels of importance. Finally, NS3 is used to measure the performance of the proposed IoT-enabled healthcare architecture. The performance metrics are delay, energy consumption, packet loss, coverage tool usage, throughput, percentage of served users, and percentage of each exchanged data type. The simulation results demonstrate that the proposed IoT-enabled healthcare architecture outperforms the traditional healthcare architecture.


2021 ◽  
Author(s):  
Danila Piatov ◽  
Sven Helmer ◽  
Anton Dignös ◽  
Fabio Persia

Abstract We develop a family of efficient plane-sweeping interval join algorithms for evaluating a wide range of interval predicates such as Allen’s relationships and parameterized relationships. Our technique is based on a framework, components of which can be flexibly combined in different manners to support the required interval relation. In temporal databases, our algorithms can exploit a well-known and flexible access method, the Timeline Index, thus expanding the set of operations it supports even further. Additionally, employing a compact data structure, the gapless hash map, we utilize the CPU cache efficiently. In an experimental evaluation, we show that our approach is several times faster and scales better than state-of-the-art techniques, while being much better suited for real-time event processing.
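The plane-sweep idea behind such algorithms can be illustrated on the simplest interval predicate, overlap; a minimal sketch assuming half-open `(start, end)` intervals, not the paper's Timeline-Index or gapless-hash-map implementation:

```python
def overlap_join(R, S):
    """Plane-sweep overlap join of two interval lists.

    Intervals are half-open (start, end) tuples with start < end.
    Sweeping all endpoints in start order keeps only the currently
    'active' intervals of the other side, so each overlapping pair is
    reported exactly once without a quadratic nested loop.
    """
    events = ([(s, e, "R", i) for i, (s, e) in enumerate(R)]
              + [(s, e, "S", i) for i, (s, e) in enumerate(S)])
    events.sort()
    active_R, active_S, out = [], [], []
    for s, e, side, i in events:
        if side == "R":
            # Drop S-intervals that ended before this start, then pair
            # the new R-interval with every S-interval still active.
            active_S[:] = [(s2, e2, j) for s2, e2, j in active_S if e2 > s]
            out += [(i, j) for s2, e2, j in active_S]
            active_R.append((s, e, i))
        else:
            active_R[:] = [(s2, e2, j) for s2, e2, j in active_R if e2 > s]
            out += [(j, i) for s2, e2, j in active_R]
            active_S.append((s, e, i))
    return sorted(out)

print(overlap_join([(1, 5), (6, 9)], [(4, 7)]))  # [(0, 0), (1, 0)]
```

The paper's framework generalizes this single predicate to the full set of Allen's relationships by swapping in different endpoint-ordering and pairing components.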


CORROSION ◽  
1976 ◽  
Vol 32 (10) ◽  
pp. 414-417 ◽  
Author(s):  
R. WALKER

Abstract The use of triazole, benzotriazole, and naphthotriazole as corrosion inhibitors for brass is briefly reviewed. The corrosion of 70/30 brass immersed in a wide range of solutions is reported, both with and without the inhibitors. The inhibitor efficiency of benzotriazole is given as a function of solution pH and the concentration used. Triazole was effective only in mildly corrosive solutions; benzotriazole and naphthotriazole were much better. Naphthotriazole was generally better than benzotriazole but is much more expensive, and a higher concentration of benzotriazole can give the same protection as naphthotriazole at a much lower cost.


1995 ◽  
Vol 1 (2) ◽  
pp. 163-190 ◽  
Author(s):  
Kenneth W. Church ◽  
William A. Gale

Abstract Shannon (1948) showed that a wide range of practical problems can be reduced to the problem of estimating probability distributions of words and ngrams in text. It has become standard practice in text compression, speech recognition, information retrieval and many other applications of Shannon's theory to introduce a “bag-of-words” assumption. But obviously, word rates vary from genre to genre, author to author, topic to topic, document to document, section to section, and paragraph to paragraph. The proposed Poisson mixture captures much of this heterogeneous structure by allowing the Poisson parameter θ to vary over documents subject to a density function φ. φ is intended to capture dependencies on hidden variables such as genre, author, topic, etc. (The Negative Binomial is a well-known special case where φ is a Γ distribution.) Poisson mixtures fit the data better than standard Poissons, producing more accurate estimates of the variance over documents (σ²), entropy (H), inverse document frequency (IDF), and adaptation (Pr(x ≥ 2 | x ≥ 1)).
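The parenthetical Negative Binomial claim is easy to check numerically: drawing the Poisson rate θ from a Γ density yields the NB's overdispersed variance, mean + mean²/shape, rather than the plain Poisson's variance = mean. A Monte Carlo sketch with illustrative parameters:

```python
import math
import random

def poisson(lmbda, rng):
    """Knuth's simple Poisson sampler (fine for small lambda)."""
    L, k, p = math.exp(-lmbda), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def gamma_poisson_sample(shape, scale, n, seed=0):
    """Draw n counts whose rate theta varies over 'documents'
    according to a Gamma(shape, scale) density -- i.e. Negative
    Binomial samples, by the mixture argument in the abstract."""
    rng = random.Random(seed)
    return [poisson(rng.gammavariate(shape, scale), rng)
            for _ in range(n)]

shape, scale, n = 2.0, 1.5, 100_000
xs = gamma_poisson_sample(shape, scale, n)
mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n
# A plain Poisson would give var == mean; the mixture is
# overdispersed, with var approaching mean + mean**2 / shape = 7.5.
print(round(mean, 2), round(var, 2))
```

The same overdispersion is what lets the mixture track per-document variance, IDF, and adaptation more accurately than a single Poisson.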


2020 ◽  
Vol 499 (4) ◽  
pp. 4905-4917
Author(s):  
S Contreras ◽  
R E Angulo ◽  
M Zennaro ◽  
G Aricò ◽  
M Pellejero-Ibañez

ABSTRACT Predicting the spatial distribution of objects as a function of cosmology is an essential ingredient for the exploitation of future galaxy surveys. In this paper, we show that a specially designed suite of gravity-only simulations together with cosmology-rescaling algorithms can provide the clustering of dark matter, haloes, and subhaloes with high precision. Specifically, with only three N-body simulations, we obtain the power spectrum of dark matter at z = 0 and 1 to better than 3 per cent precision for essentially all currently viable values of eight cosmological parameters, including massive neutrinos and dynamical dark energy, and over the whole range of scales explored, $0.03 < k/(h\,\mathrm{Mpc}^{-1}) < 5$. This precision holds at the same level for mass-selected haloes and for subhaloes selected according to their peak maximum circular velocity. As an initial application of these predictions, we successfully constrain Ωm, σ8, and the scatter in subhalo abundance matching employing the projected correlation function of mock SDSS galaxies.

