Stochastic methods defeat regular RSA exponentiation algorithms with combined blinding methods

2021 · Vol 15 (1) · pp. 408-433
Author(s): Margaux Dugardin, Werner Schindler, Sylvain Guilley

Abstract Extra-reductions occurring in Montgomery multiplications disclose side-channel information which can be exploited even in stringent contexts. In this article, we derive stochastic attacks to defeat Rivest-Shamir-Adleman (RSA) with Montgomery-ladder regular exponentiation coupled with base blinding. Namely, we leverage precharacterized multivariate probability mass functions of extra-reductions between the (multiplication, square) pair in one iteration of the RSA algorithm and those of the next iteration(s) to build a maximum-likelihood distinguisher. The efficiency of our attack (in terms of required traces) is more than twice that of the state of the art. In addition to this result, we also apply our method to the case of regular exponentiation with both base blinding and modulus blinding. Quite surprisingly, modulus blinding does not make our attack impossible, even for large sizes of the modulus-randomizing element. At the cost of larger sample sizes, our attacks tolerate noisy measurements. Fortunately, effective countermeasures exist.
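To make the distinguisher concrete, here is a minimal Python sketch of a maximum-likelihood test over extra-reduction observations; the PMFs, variable names, and observation format are hypothetical stand-ins for the precharacterized multivariate distributions the paper builds per device.

```python
# Minimal sketch (not the authors' code): decide between two key-bit
# hypotheses from extra-reduction observations. pmf0/pmf1 stand in for
# the precharacterized joint PMFs of an (extra-reduction in multiply,
# extra-reduction in square) pair within one ladder iteration.
import math

def log_likelihood(observations, pmf):
    # observations: list of (xM, xS) with xM, xS in {0, 1} flagging an
    # extra-reduction in the multiplication / squaring of an iteration.
    return sum(math.log(pmf[(xm, xs)]) for xm, xs in observations)

def distinguish(observations, pmf0, pmf1):
    # Maximum-likelihood decision between the two hypotheses.
    ll0 = log_likelihood(observations, pmf0)
    ll1 = log_likelihood(observations, pmf1)
    return 0 if ll0 >= ll1 else 1

# Toy PMFs; real ones are precharacterized per device and key guess.
pmf0 = {(0, 0): 0.50, (0, 1): 0.20, (1, 0): 0.20, (1, 1): 0.10}
pmf1 = {(0, 0): 0.40, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.10}
print(distinguish([(0, 0), (1, 0), (0, 1), (0, 0)], pmf0, pmf1))
```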

2021 · Vol 11 (10) · pp. 4553
Author(s): Ewelina Ziajka-Poznańska, Jakub Montewka

The development of autonomous ship technology is currently in focus worldwide, and the literature on this topic is growing. However, in-depth cost and benefit estimation of such endeavours is in its infancy. With this systematic literature review, we present the state of the art regarding the costs and benefits of operating prospective autonomous merchant ships, with the objective of identifying contemporary research activities concerning the estimation of operating, voyage, and capital costs in prospective autonomous shipping and vessel platooning. Additionally, the paper outlines research gaps and the need for more detailed business models for operating autonomous ships. Results reveal that valid financial models of autonomous shipping are lacking and that significant uncertainty affects the cost estimates, so that only specific case studies can be evaluated reliably. The findings of this paper may be found relevant not only by academia but also by organisations considering undertaking the challenge of implementing Maritime Autonomous Surface Ships in their operations.


2020 · Vol 9 (1) · pp. 303-322
Author(s): Zhifang Zhao, Tianqi Qi, Wei Zhou, David Hui, Cong Xiao, ...

Abstract The behavior of cement-based materials is governed by chemical and physical processes at the nanolevel. Therefore, the application of nanomaterials in civil engineering to develop nano-modified cement-based materials is a promising line of research. In recent decades, a large number of researchers have tried to improve the properties of cement-based materials by employing various nanomaterials and to characterize the mechanism of nano-strengthening. In this study, the state-of-the-art progress of nano-modified cement-based materials is systematically reviewed and summarized. First, this study reviews the basic properties and dispersion methods of nanomaterials commonly used in cement-based materials, including carbon nanotubes, carbon nanofibers, graphene, graphene oxide, nano-silica, nano-calcium carbonate, nano-calcium silicate hydrate, etc. Then the research progress on nano-engineered cementitious composites is reviewed from the viewpoints of accelerating cement hydration, reinforcing mechanical properties, and improving durability. In addition, the market and applications of nanomaterials for cement-based materials are briefly discussed, and their costs are summarized through a market survey. Finally, this study summarizes the existing problems in current research and provides future perspectives accordingly.


2020 · Vol 15 (1) · pp. 4-17
Author(s): Jean-François Biasse, Xavier Bonnetain, Benjamin Pring, André Schrottenloher, William Youmans

Abstract We propose a heuristic algorithm to solve the underlying hard problem of the CSIDH cryptosystem (and other isogeny-based cryptosystems using elliptic curves with endomorphism ring isomorphic to an imaginary quadratic order 𝒪). Let Δ = Disc(𝒪) (in CSIDH, Δ = −4p for p the security parameter) and let 0 < α < 1/2. Our algorithm requires:

- a classical circuit of size $2^{\tilde{O}\left(\log(|\Delta|)^{1-\alpha}\right)}$,
- a quantum circuit of size $2^{\tilde{O}\left(\log(|\Delta|)^{\alpha}\right)}$,
- polynomial classical and quantum memory.

Essentially, we propose to reduce the size of the quantum circuit below the state-of-the-art complexity $2^{\tilde{O}\left(\log(|\Delta|)^{1/2}\right)}$ at the cost of increasing the required classical circuit size. The required classical circuit remains subexponential, which is a superpolynomial improvement over the exponential classical state-of-the-art solutions to these problems. Our method requires polynomial memory, both classical and quantum.
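To illustrate the trade-off numerically, the toy computation below evaluates the two circuit-size exponents for a CSIDH-512-sized discriminant. It deliberately ignores the polylog factors hidden by the Õ notation, so the numbers indicate the shape of the trade-off rather than concrete gate counts.

```python
# Illustration only: exponents of the classical/quantum circuit sizes
# as alpha varies, ignoring the polylog factors inside the O-tilde.
def circuit_exponents(log_delta, alpha):
    # log2(classical size) ~ log_delta**(1 - alpha)
    # log2(quantum size)   ~ log_delta**alpha
    return log_delta ** (1 - alpha), log_delta ** alpha

log_delta = 512  # |Delta| = 4p has ~512 bits for CSIDH-512
# alpha = 0.5 is the boundary case recovering the balanced
# state-of-the-art complexity 2^{O~(log(|Delta|)^{1/2})}.
for alpha in (0.2, 0.35, 0.5):
    c, q = circuit_exponents(log_delta, alpha)
    print(f"alpha={alpha}: classical ~ 2^{c:.0f}, quantum ~ 2^{q:.0f}")
```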


2021 · Vol 15 (3) · pp. 1-28
Author(s): Xueyan Liu, Bo Yang, Hechang Chen, Katarzyna Musial, Hongxu Chen, ...

The stochastic blockmodel (SBM) is a widely used statistical network representation model with good interpretability, expressiveness, generalization, and flexibility, which has become prevalent and important in the field of network science in recent years. However, learning an optimal SBM for a given network is an NP-hard problem. This results in significant limitations when it comes to applications of SBMs in large-scale networks, because of the significant computational overhead of existing SBM models and their learning methods. Reducing the cost of SBM learning and making it scalable to large-scale networks, while maintaining the good theoretical properties of the SBM, remains an unresolved problem. In this work, we address this challenging task from the novel perspective of model redefinition. We propose a redefined SBM with Poisson distribution and a block-wise learning algorithm for it that can efficiently analyse large-scale networks. Extensive validation conducted on both artificial and real-world data shows that our proposed method significantly outperforms the state-of-the-art methods in terms of a reasonable trade-off between accuracy and scalability.
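As a concrete reference point, the sketch below (an assumption-laden simplification, not the authors' block-wise algorithm) computes the profile log-likelihood of a block assignment under a Poisson SBM, i.e. the kind of objective such a learning method improves.

```python
# Minimal sketch of the Poisson SBM objective: each adjacency entry
# A[i, j] is modelled as Poisson with rate lam[z[i], z[j]]. We plug
# in the MLE rates m_rs / (n_r * n_s) to get the profile likelihood.
import numpy as np

def poisson_sbm_loglik(A, z, K):
    # Profile log-likelihood (up to constants) of assignment z.
    n_r = np.bincount(z, minlength=K).astype(float)
    M = np.zeros((K, K))  # total edge weight between block pairs
    for r in range(K):
        for s in range(K):
            M[r, s] = A[np.ix_(z == r, z == s)].sum()
    pairs = np.outer(n_r, n_r)
    with np.errstate(divide="ignore", invalid="ignore"):
        lam = np.where(pairs > 0, M / pairs, 0.0)
        term = np.where(M > 0, M * np.log(lam), 0.0)
    # With the plug-in MLE rates, sum(lam * n_r * n_s) equals M.sum().
    return term.sum() - M.sum()

rng = np.random.default_rng(0)
z = rng.integers(0, 2, size=30)  # a 2-block assignment
A = rng.poisson(np.where(z[:, None] == z[None, :], 1.0, 0.2))
print(poisson_sbm_loglik(A, z, K=2))
```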


2018 · Vol 27 (07) · pp. 1860013
Author(s): Swair Shah, Baokun He, Crystal Maung, Haim Schweitzer

Principal Component Analysis (PCA) is a classical dimensionality reduction technique that computes a low-rank representation of the data. Recent studies have shown how to compute this low-rank representation from most of the data, excluding a small amount of outlier data. We show how to convert this problem into graph search, and describe an algorithm that solves it optimally by applying a variant of the A* algorithm to search for the outliers. The results obtained by our algorithm are optimal in terms of accuracy, and are more accurate than those of the current state-of-the-art algorithms, which are shown not to be optimal. This comes at the cost of running time, which is typically longer than that of the current state of the art. We also describe a related variant of the A* algorithm that runs much faster than the optimal variant and produces a solution that is guaranteed to be near-optimal. This variant is shown experimentally to be more accurate than the current state of the art while having a comparable running time.
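The following best-first search sketch conveys the flavor of casting outlier removal as graph search; the evaluation function here is just the current rank-r residual (a simple lower bound when PCA is left uncentered), not the paper's more refined admissible heuristic, and all names are illustrative.

```python
# Best-first search sketch for "remove k rows so the rank-r residual
# of the rest is minimal". Priority = residual of the current
# remainder, which (for uncentered PCA) never exceeds that of any
# descendant state, so the first completed state popped is optimal.
import heapq
from itertools import count
import numpy as np

def residual(X, r):
    # Squared error of the best rank-r approximation (uncentered).
    s = np.linalg.svd(X, compute_uv=False)
    return float((s[r:] ** 2).sum())

def search_outliers(X, k, r):
    n, tie = len(X), count()  # tie counter keeps the heap comparable
    heap = [(residual(X, r), next(tie), frozenset())]
    seen = {frozenset()}
    while heap:
        err, _, removed = heapq.heappop(heap)
        if len(removed) == k:
            return sorted(removed), err
        for i in range(n):
            if i in removed:
                continue
            nxt = removed | {i}
            if nxt not in seen:
                seen.add(nxt)
                rows = [j for j in range(n) if j not in nxt]
                heapq.heappush(heap, (residual(X[rows], r), next(tie), nxt))

rng = np.random.default_rng(1)
X = rng.normal(size=(15, 4))
X[3] += 10.0  # plant one obvious outlier
print(search_outliers(X, k=1, r=2))  # expected to remove row 3
```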


2021 · Vol 12 (4) · pp. 98-116
Author(s): Noureddine Boukhari, Fatima Debbat, Nicolas Monmarché, Mohamed Slimane

Evolution strategies (ES) are a family of powerful stochastic methods for global optimization that have proved more capable of avoiding local optima than many other optimization methods. Many researchers have investigated different versions of the original evolution strategy, with good results on a variety of optimization problems. However, the rate at which the algorithm converges to the global optimum remains only asymptotic. In order to accelerate convergence, a hybrid approach is proposed that combines the nonlinear simplex method (Nelder-Mead) with an adaptive scheme controlling when the local search is applied, and the authors demonstrate that this combination yields significantly better convergence. The proposed method has been tested on 15 complex benchmark functions, applied to the bi-objective portfolio optimization problem, and compared with other state-of-the-art techniques. Experimental results show that this hybridization improves performance in terms of solution quality and convergence speed.
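As a rough illustration of such a hybrid (a sketch under assumed parameters, not the authors' exact adaptive scheme), the code below couples a (1+1)-ES with a 1/5th-success step-size rule to SciPy's Nelder-Mead, applied periodically as the local search.

```python
# Sketch of an ES / Nelder-Mead hybrid: a (1+1)-ES explores globally,
# and every `period` generations a short simplex run refines locally.
import numpy as np
from scipy.optimize import minimize

def hybrid_es(f, x0, sigma=0.5, iters=200, period=25, seed=0):
    rng = np.random.default_rng(seed)
    x, fx = np.asarray(x0, float), f(x0)
    for t in range(1, iters + 1):
        y = x + sigma * rng.standard_normal(x.size)  # Gaussian mutation
        fy = f(y)
        if fy <= fx:
            x, fx, sigma = y, fy, sigma * 1.22  # success: widen step
        else:
            sigma *= 0.82                        # failure: shrink step
        if t % period == 0:  # periodic Nelder-Mead local refinement
            res = minimize(f, x, method="Nelder-Mead",
                           options={"maxiter": 50, "xatol": 1e-8})
            if res.fun < fx:
                x, fx = res.x, res.fun
    return x, fx

sphere = lambda v: float(np.sum(np.square(v)))
print(hybrid_es(sphere, np.ones(5) * 3.0))
```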


1996 · Vol 33 (1) · pp. 147-157
Author(s): Henrik A. Thomsen, Kenneth Kisbye

State-of-the-art on-line meters for the determination of ammonium, nitrate and phosphate are presented. The on-line meters employ different measuring principles and are available in many designs, differing with respect to size, calibration and cleaning principle, user-friendliness, response time, and reagent and sample consumption. A study of Danish experiences at several plants has been conducted. The list price of an on-line meter is between USD 8,000 and USD 35,000. To this should be added the costs of sample preparation, design, installation and running-in. The yearly operating costs for one meter are in the range of USD 200-2,500, and the manpower consumption is in the range of 1-5 hours/month. The accuracy obtained is only slightly lower than that of collaborative laboratory analyses, which is sufficient for most control purposes.
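For orientation, a back-of-the-envelope yearly cost per meter can be assembled from the ranges above; the labor rate, lifetime, and installation figure in this sketch are assumptions, not numbers from the study.

```python
# Rough yearly cost of ownership per on-line meter. Only the list
# price, operating cost, and manpower ranges come from the abstract;
# labor rate, lifetime, and installation cost are assumed values.
def yearly_cost(list_price, operating, hours_per_month,
                labor_rate=50.0,      # USD/hour (assumed)
                lifetime_years=10,    # straight-line write-off (assumed)
                installation=5000.0): # sample prep + install (assumed)
    capital = (list_price + installation) / lifetime_years
    labor = 12 * hours_per_month * labor_rate
    return capital + operating + labor

print(yearly_cost(8000, 200, 1))    # low end:  ~2,100 USD/year
print(yearly_cost(35000, 2500, 5))  # high end: ~9,500 USD/year
```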


Author(s): Eahsan Shahriary, Amir Hajibabaee

This book offers students and researchers a unique introduction to Bayesian statistics. The authors provide a wonderful journey through the realm of Bayesian probability and inspire readers to become Bayesian statisticians. The book starts with an introduction to probability and covers Bayes' theorem, probability mass functions, probability density functions, the Beta-Binomial conjugate, Markov chain Monte Carlo (MCMC), and the Metropolis-Hastings algorithm. The book is very well written and its topics are to the point, with real-world applications, but it does not provide examples of computation using common open-source software.
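Since the review's one criticism is the absence of computational examples, here is the sort of minimal open-source illustration that would fill the gap: Metropolis-Hastings sampling of a coin's bias under a Beta(2, 2) prior, written in plain Python (not an example from the book itself).

```python
# Metropolis-Hastings for the posterior of a coin's bias theta under
# a Beta(2, 2) prior and a Binomial likelihood (heads out of flips).
import math
import random

def log_post(theta, heads, flips, a=2, b=2):
    # Log of the unnormalized Beta-Binomial posterior density.
    if not 0 < theta < 1:
        return -math.inf
    return ((a - 1 + heads) * math.log(theta)
            + (b - 1 + flips - heads) * math.log(1 - theta))

def metropolis(heads, flips, steps=10_000, width=0.1, seed=1):
    random.seed(seed)
    theta, samples = 0.5, []
    for _ in range(steps):
        prop = theta + random.uniform(-width, width)  # symmetric proposal
        # Accept with probability min(1, post(prop) / post(theta)).
        if math.log(random.random()) < log_post(prop, heads, flips) - log_post(theta, heads, flips):
            theta = prop
        samples.append(theta)
    return samples

draws = metropolis(heads=7, flips=10)
# Posterior is Beta(2+7, 2+3); its mean 9/14 ~ 0.643 should emerge.
print(sum(draws[1000:]) / len(draws[1000:]))
```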


2020 · Vol 67 · pp. 607-651
Author(s): Margarita Paz Castro, Chiara Piacentini, Andre Augusto Cire, J. Christopher Beck

We investigate the use of relaxed decision diagrams (DDs) for computing admissible heuristics for the cost-optimal delete-free planning (DFP) problem. Our main contributions are two novel DD encodings of a DFP task: a multivalued decision diagram that includes the sequencing aspect of the problem, and a binary decision diagram representation of its sequential relaxation. We present construction algorithms for each DD that leverage these different perspectives of the DFP task, and provide theoretical and empirical analyses of the associated heuristics. We further show that relaxed DDs can be used beyond heuristic computation to extract delete-free plans, find action landmarks, and identify redundant actions. Our empirical analysis shows that while DD-based heuristics trail the state of the art, even small relaxed DDs are competitive with the linear programming heuristic for the DFP task, thus revealing novel ways of designing admissible heuristics.
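A schematic of the shortest-path reading of a relaxed DD (hypothetical structure and costs, not the paper's encodings): because a relaxed diagram over-approximates the set of feasible solutions, the minimum-cost root-to-terminal path is an admissible lower bound.

```python
# Lower bound from a relaxed decision diagram, computed as a
# shortest root-to-terminal path via dynamic programming over a
# layered DAG. The node names and arc costs below are hypothetical.
import math

def dd_lower_bound(layers):
    # layers: list of dicts mapping node -> list of (child, cost)
    # arcs into the next layer; layer 0 holds only the root.
    dist = {next(iter(layers[0])): 0.0}
    for layer in layers:
        nxt = {}
        for node, arcs in layer.items():
            d = dist.get(node, math.inf)
            for child, cost in arcs:
                nxt[child] = min(nxt.get(child, math.inf), d + cost)
        dist = nxt
    return min(dist.values())  # admissible: relaxation adds paths

# Tiny hypothetical relaxed DD whose terminal node is "t":
layers = [
    {"root": [("u", 2.0), ("v", 1.0)]},
    {"u": [("t", 0.0)], "v": [("t", 3.0)]},
]
print(dd_lower_bound(layers))  # -> 2.0
```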


2020 · Vol 21 (1)
Author(s): Feixiao Long

Abstract Background Cell nuclei segmentation is a fundamental task in microscopy image analysis, on which multiple biological analyses can be based. Although deep learning (DL) based techniques have achieved state-of-the-art performance in image segmentation tasks, these methods are usually complex and require the support of powerful computing resources. In addition, given the cost of medical exams, it is impractical to allocate advanced computing resources to every dark- or bright-field microscopy setup, even though such microscopy is widely employed in clinical institutions. Thus, it is essential to develop accurate DL-based segmentation algorithms that work with resource-constrained computing. Results An enhanced, light-weight U-Net (called U-Net+) with a modified encoding branch is proposed to work with low-resource computing. Through strictly controlled experiments, the average IoU and precision of U-Net+ predictions are confirmed to outperform other prevalent competing methods, with gains of 1.0% to 3.0% on the first-stage test set of the 2018 Kaggle Data Science Bowl cell nuclei segmentation contest, at shorter inference time. Conclusions Our results preliminarily demonstrate the potential of the proposed U-Net+ in correctly spotting microscopy cell nuclei with resource-constrained computing.
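As a minimal sketch of the light-weight idea (the exact modified encoder of U-Net+ is not reproduced here; the layer sizes and the use of depthwise-separable convolutions are assumptions), a tiny PyTorch U-Net for per-pixel nucleus logits:

```python
# Tiny encoder-decoder in the U-Net style with depthwise-separable
# convolutions to cut parameters, for resource-constrained settings.
import torch
import torch.nn as nn

def sep_conv(c_in, c_out):
    # Depthwise + pointwise convolution: cheaper than a full 3x3 conv.
    return nn.Sequential(
        nn.Conv2d(c_in, c_in, 3, padding=1, groups=c_in),
        nn.Conv2d(c_in, c_out, 1),
        nn.BatchNorm2d(c_out),
        nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self, c=16):
        super().__init__()
        self.enc1 = sep_conv(1, c)
        self.enc2 = sep_conv(c, 2 * c)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(2 * c, c, 2, stride=2)
        self.dec = sep_conv(2 * c, c)   # takes skip + upsampled features
        self.head = nn.Conv2d(c, 1, 1)  # per-pixel nucleus logit

    def forward(self, x):
        s1 = self.enc1(x)                # full resolution
        s2 = self.enc2(self.pool(s1))    # half resolution
        u = self.up(s2)                  # back to full resolution
        return self.head(self.dec(torch.cat([u, s1], dim=1)))

net = TinyUNet()
print(net(torch.randn(1, 1, 64, 64)).shape)  # -> torch.Size([1, 1, 64, 64])
```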

