Cache vs. Key-Dependency: Side Channeling an Implementation of Pilsung

Author(s):  
Daniel Genkin ◽  
Romain Poussier ◽  
Rui Qi Sim ◽  
Yuval Yarom ◽  
Yuanjing Zhao

Over the past two decades, cache attacks have been identified as a threat to the security of cipher implementations. These attacks recover secret information by combining observations of the victim's cache accesses with knowledge of the cipher's internal structure. So far, cache attacks have been applied to ciphers with fixed state transformations, leaving open the question of whether secret, key-dependent transformations enhance security against such attacks. In this paper we investigate this question. We look at an implementation of the North Korean cipher Pilsung, as reverse-engineered by Kryptos Logic. Like AES, Pilsung is a substitution-permutation cipher, but unlike AES, both the substitution and the permutation steps in Pilsung depend on the key and are not known to the attacker. We analyze Pilsung and design a cache-based attack, improving the state of the art by developing techniques for reversing secret-dependent transformations. Our attack, which requires an average of eight minutes on a typical laptop computer, demonstrates that secret transformations do not necessarily protect ciphers against side-channel attacks.
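
To make the leakage model concrete, here is a minimal toy sketch (ours, not the paper's attack code) of the classic table-lookup leakage such attacks exploit: an S-box access at index p ⊕ k reveals its cache-line number, i.e. the high bits of p ⊕ k. Pilsung's tables are key-dependent, so the real attack must first reverse them; this sketch assumes a known table layout purely for illustration.

```python
# Toy model of a first-round cache attack on a table-based cipher.
# Assumption (not from the paper): a 256-entry S-box spans 16 cache
# lines of 16 entries, so a lookup at index (p ^ k) leaks the line
# number (p ^ k) >> 4.
import random

LINE_BITS = 4  # 256 entries / 16 per line -> top 4 bits leak

def leak(p: int, k: int) -> int:
    """Cache line touched by the lookup S[p ^ k] (simulated observation)."""
    return (p ^ k) >> LINE_BITS

def recover_high_bits(samples):
    """Keep only key bytes consistent with every (plaintext, line) pair."""
    candidates = set(range(256))
    for p, line in samples:
        candidates = {k for k in candidates if leak(p, k) == line}
    # All survivors share the same top LINE_BITS bits of the key byte.
    return {k >> LINE_BITS for k in candidates}

secret = 0xA7
samples = [(p, leak(p, secret)) for p in random.sample(range(256), 32)]
print(recover_high_bits(samples))  # -> {10} (0xA), the key byte's high nibble
```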

2016 ◽  
Vol 67 (1) ◽  
pp. 55-68 ◽  
Author(s):  
Nicolas Courtois ◽  
Guangyan Song ◽  
Ryan Castellucci

Abstract In this paper, we study and give the first detailed benchmarks on existing implementations of the secp256k1 elliptic curve, used by at least hundreds of thousands of users of Bitcoin and other cryptocurrencies. Our implementation improves the state of the art by a factor of 2.5, focusing on cases where side-channel attacks are not a concern and a large quantity of RAM is available. As a result, we are able to scan the Bitcoin blockchain for weak keys faster than any previous implementation. We also give examples of passwords we have cracked, showing that brain wallets are not secure in practice even for quite complex passwords.
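
For context, a brain wallet derives the secp256k1 private key directly from a passphrase, typically as SHA-256(passphrase), which is what makes dictionary scanning feasible. The sketch below is a minimal illustration of that derivation and lookup, not the authors' optimized implementation; it assumes the third-party ecdsa package, and the passphrases and target set are made up.

```python
# Brain-wallet derivation sketch: passphrase -> SHA-256 -> private key
# -> secp256k1 public key -> HASH160, then a dictionary lookup.
import hashlib
import ecdsa  # third-party package: pip install ecdsa

def brainwallet_pubkey_hash(passphrase: str) -> bytes:
    """Passphrase -> secp256k1 pubkey -> HASH160 (as used in addresses)."""
    priv = hashlib.sha256(passphrase.encode()).digest()
    sk = ecdsa.SigningKey.from_string(priv, curve=ecdsa.SECP256k1)
    pub = b"\x04" + sk.get_verifying_key().to_string()  # uncompressed SEC
    # ripemd160 availability depends on the local OpenSSL build.
    return hashlib.new("ripemd160", hashlib.sha256(pub).digest()).digest()

# Hypothetical target set standing in for pubkey hashes extracted
# from the blockchain; the candidate list is likewise made up.
targets = {brainwallet_pubkey_hash("correct horse battery staple")}
for guess in ["password", "letmein", "correct horse battery staple"]:
    if brainwallet_pubkey_hash(guess) in targets:
        print("cracked:", guess)
```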


Author(s):  
Chao Sun ◽  
Thomas Espitau ◽  
Mehdi Tibouchi ◽  
Masayuki Abe

The lattice reduction attack on (EC)DSA (and other Schnorr-like signature schemes) with partially known nonces, originally due to Howgrave-Graham and Smart, has been at the core of many concrete cryptanalytic works, side-channel based or otherwise, over the past 20 years. The attack itself has seen limited development, however: improved analyses have been carried out, and the use of stronger lattice reduction algorithms has pushed the range of practically vulnerable parameters further, but the lattice construction based on the signatures and known nonce bits remains the same. In this paper, we propose a new idea to improve the attack based on the same data in exchange for additional computation: carry out an exhaustive search on some bits of the secret key. This turns the problem from a single bounded distance decoding (BDD) instance in a certain lattice into multiple BDD instances in a fixed lattice of larger volume but with the same bound, making the BDD problem substantially easier. Furthermore, the fact that the lattice is fixed lets us use batch/preprocessing variants of BDD solvers that are far more efficient than repeated lattice reductions on non-preprocessed lattices of the same size. As a result, our analysis suggests that our technique is competitive with or outperforms the state of the art for parameter ranges at the limit of what lattice attacks have achieved so far (around 2-bit leakage on 160-bit groups, or 3-bit leakage on 256-bit groups). We also show that variants of this idea can be applied to bits of the nonces (leading to a similar improvement) or to filtering signature data (leading to a data-time trade-off for the lattice attack). Finally, we use our technique to obtain an improved exploitation of the TPM-FAIL dataset, similar to what was achieved in the Minerva attack.
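
The core idea, one fixed, preprocessed lattice and one CVP target per key-MSB guess, can be sketched schematically. The code below uses the textbook Howgrave-Graham–Smart construction via fpylll and is not the paper's batch-BDD solver; the weighting of the last coordinate and the handling of known nonce bits (assumed zero here) are deliberately simplified, and sign conventions vary between write-ups.

```python
# Schematic of the fixed-lattice / multi-target idea: the lattice is
# built and reduced once, and only the CVP target moves with each
# guess of the key's top bits. Inputs sigs = [(r, s, h)], group order
# q, key size nbits, and guessed-bit count g are placeholders.
from fpylll import IntegerMatrix, LLL, CVP

def build_lattice(sigs, q):
    """The lattice depends only on (r_i, s_i) and q, not on the guess."""
    m = len(sigs)
    A = IntegerMatrix(m + 1, m + 1)
    for i, (r, s, _) in enumerate(sigs):
        A[i, i] = q
        A[m, i] = (r * pow(s, -1, q)) % q  # t_i = r_i / s_i mod q
    A[m, m] = 1
    return LLL.reduction(A)                # preprocess once, reuse below

def attack(sigs, q, nbits, g):
    A = build_lattice(sigs, q)
    m = len(sigs)
    for x_hi in range(2 ** g):             # exhaustive search on key MSBs
        shift = x_hi << (nbits - g)
        # Nonce relation k_i = s^-1(h_i + r_i x); folding in the guess
        # leaves a small unknown, so t_i * x_lo is close (mod q) to:
        target = [(-(h + r * shift) * pow(s, -1, q)) % q
                  for (r, s, h) in sigs] + [0]
        v = CVP.closest_vector(A, tuple(target))
        x_lo = v[m] % q                     # last coordinate = candidate
        yield (shift + x_lo) % q            # verify against the public key
```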


Author(s):  
Fabricio Almeida-Silva ◽  
Kanhu C Moharana ◽  
Thiago M Venancio

Abstract In the past decade, over 3000 samples of soybean transcriptomic data have accumulated in public repositories. Here, we review the state of the art in soybean transcriptomics, highlighting the major microarray and RNA-seq studies that investigated soybean transcriptional programs in different tissues and conditions. Further, we propose approaches for integrating such big data using gene coexpression networks, and we outline important web resources that may facilitate soybean data acquisition and analysis, contributing to the acceleration of soybean breeding and functional genomics research.
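
As a concrete illustration of the coexpression-network integration mentioned above, the sketch below builds a simple hard-threshold Pearson network from an expression matrix. Production pipelines (e.g., WGCNA) use soft thresholding and topological overlap instead, and the file name and threshold here are placeholders, not from the reviewed studies.

```python
# Minimal gene coexpression network: correlate genes across samples
# and keep strong links. The quadratic loop is fine for a toy matrix
# but would be vectorized for thousands of genes.
import pandas as pd
import networkx as nx

expr = pd.read_csv("soybean_tpm_matrix.csv", index_col=0)  # genes x samples
corr = expr.T.corr(method="pearson")                       # gene-gene matrix

G = nx.Graph()
threshold = 0.9                                            # hard threshold
for g1 in corr.index:
    for g2 in corr.columns:
        if g1 < g2 and abs(corr.at[g1, g2]) >= threshold:
            G.add_edge(g1, g2, weight=corr.at[g1, g2])

# Modules (candidate co-regulated gene sets) as connected components.
modules = list(nx.connected_components(G))
print(f"{G.number_of_nodes()} genes, {G.number_of_edges()} edges, "
      f"{len(modules)} modules")
```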


1967 ◽  
Vol 71 (677) ◽  
pp. 342-343
Author(s):  
F. H. East

The Aviation Group of the Ministry of Technology (formerly the Ministry of Aviation) is responsible for spending a large part of the country's defence budget, both in research and development on the one hand and production or procurement on the other. In addition, it has responsibilities in many non-defence fields, mainly, but not exclusively, in aerospace.

Few developments have been carried out entirely within the Ministry's own Establishments; almost all have required continuous co-operation between the Ministry and Industry. In the past, the methods of management and collaboration and the relative responsibilities of the Ministry and Industry have varied with time, with the type of equipment to be developed, with the size of the development project and so on. But over the past ten years there has been a growing awareness of the need to put some system into the complex business of translating a requirement into a specification, and a specification into a product, within reasonable bounds of time and cost.


2021 ◽  
Vol 27 (1) ◽  
pp. 7-32
Author(s):  
Bruce A. Seaman

The intellectual development of cultural economics has exhibited some notable similarities to the challenges faced by researchers pioneering in other areas of economics. While this is not really surprising, previous reviews of this literature have not focused on such patterns. Specifically, the methodology and normative implications of the field of industrial organization and antitrust policy suggest a series of stages, identified here as foundation, maturation, reevaluation, and backlash, that offer a way of viewing the development of and controversies surrounding cultural economics. Also, the emerging field of sports economics, which already shares some substantive similarities with the questions addressed in cultural economics, presents a pattern of development in which core questions and principles are identified in a fragmented literature, which then slowly coalesces and becomes consolidated into a more unified literature that essentially reconfirms and extends those earlier core principles. This fragmentation-and-consolidation pattern is also exhibited by the development of cultural economics. While others could surely suggest different parallels in the search for such developmental patterns, this way of organizing one's thinking about the past and future of this field provides a hoped-for alternative perspective on the state of the art of cultural economics.


Resources ◽  
2020 ◽  
Vol 9 (2) ◽  
pp. 15
Author(s):  
Juan Uribe-Toril ◽  
José Luis Ruiz-Real ◽  
Jaime de Pablo Valenciano

Sustainability, local development, and ecology are keywords that cover a wide range of research fields in both the experimental and the social sciences. The transversal nature of this knowledge area creates synergies but also divergences, making a continuous review of the existing literature necessary in order to facilitate research. A growing number of articles have analyzed trends in the literature and the state of the art in many subjects. In this Special Issue of Resources, leading researchers analyze the past and future of the social sciences in Resources from an economic, social, and environmental perspective.


Computers ◽  
2020 ◽  
Vol 9 (2) ◽  
pp. 37 ◽  
Author(s):  
Luca Cappelletti ◽  
Tommaso Fontana ◽  
Guido Walter Di Donato ◽  
Lorenzo Di Tucci ◽  
Elena Casiraghi ◽  
...  

Missing data imputation has been a hot topic in the past decade, and many state-of-the-art works have proposed novel, interesting solutions that have been applied in a variety of fields. In the same period, the successful results achieved by deep learning techniques have opened the way to their application to difficult problems where human skill cannot provide a reliable solution. Not surprisingly, some deep learners, mainly exploiting encoder-decoder architectures, have also been designed and applied to the task of missing data imputation. However, most of the proposed imputation techniques have not been designed to tackle “complex data”, that is, high-dimensional data belonging to datasets with huge cardinality and describing complex problems. More precisely, they often require critical parameters to be set manually, or exploit complex architectures and/or training phases that make their computational load impractical. In this paper, after clustering the state-of-the-art imputation techniques into three broad categories, we briefly review the most representative methods and then describe our data imputation proposals, which exploit deep learning techniques specifically designed to handle complex data. Comparative tests on genome sequences show that our deep learning imputers outperform the state-of-the-art KNN-imputation method when filling gaps in human genome sequences.
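
For reference, the KNN-imputation baseline named in the comparison is available off the shelf in scikit-learn. The minimal sketch below runs it on toy numeric data; it does not reproduce the paper's genome-sequence encoding or the proposed deep imputers.

```python
# KNN imputation baseline: each missing entry is filled with the mean
# of that feature over the k nearest complete-enough rows.
import numpy as np
from sklearn.impute import KNNImputer

X = np.array([[1.0, 2.0, np.nan],
              [3.0, np.nan, 6.0],
              [7.0, 8.0, 9.0],
              [np.nan, 5.0, 4.0]])

imputer = KNNImputer(n_neighbors=2)   # average over the 2 nearest rows
X_filled = imputer.fit_transform(X)   # NaNs replaced feature-wise
print(X_filled)
```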


Author(s):  
Jukka Tyrkkö

This chapter outlines the state of the art in corpus-based language teaching and digital pedagogy, focusing on the differences between using corpora with present-day and with historical data. The basic concepts of corpus-based research, such as representativeness, frequency, and statistical significance, can be introduced to students who are new to corpus methods, and the application of these concepts to the history of English can deepen students’ understanding of how historical varieties of the language are researched. The chapter also addresses some of the key challenges particular to teaching the history of English with corpora, such as dealing with seemingly counterintuitive findings, non-standard features, and small datasets. Finally, following an overview of available historical corpora and corpus tools, several practical examples of corpus-driven activities are discussed in detail, with suggestions and ideas on how a teacher might prepare and run corpus-based lessons.
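
As one example of the “frequency and statistical significance” concepts mentioned above, a common classroom exercise is Dunning's log-likelihood keyness test between two corpora. The sketch below uses two toy word lists as stand-ins for a historical and a present-day English corpus; a real lesson would load actual corpus files, and the tiny counts here are far too small for genuine significance.

```python
# Dunning's log-likelihood (G2) keyness score for a single word,
# comparing its frequency in two corpora.
import math
from collections import Counter

def log_likelihood(a, b, n1, n2):
    """a/b = word counts in corpus 1/2, n1/n2 = corpus sizes."""
    e1 = n1 * (a + b) / (n1 + n2)   # expected count, corpus 1
    e2 = n2 * (a + b) / (n1 + n2)   # expected count, corpus 2
    g2 = 0.0
    if a:
        g2 += 2 * a * math.log(a / e1)
    if b:
        g2 += 2 * b * math.log(b / e2)
    return g2                        # > 3.84 -> p < 0.05 at 1 df

old = Counter("thou art my friend and thou art true".split())
new = Counter("you are my friend and you are right".split())
n1, n2 = sum(old.values()), sum(new.values())
for w in ["thou", "friend"]:
    print(w, round(log_likelihood(old[w], new[w], n1, n2), 2))
```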


2017 ◽  
Vol 24 (4) ◽  
pp. 529-540
Author(s):  
Paul Eisenberg

Purpose – This paper aims to approach fundamental topics of financial crime and the law. What constitutes financial crime? Which field of law is best suited to addressing the threats of transgression by financial executives? What motivates highly rewarded financiers to become white collar criminals?

Design/methodology/approach – To answer these research questions, contemporary theories of criminology in general and of white collar crime in particular, as well as theories on motivation, are critically discussed. The benefits and limitations of the theories in use are exemplified against the background of the London Interbank Offered Rate (LIBOR) scandal.

Findings – The paper criticises that state-of-the-art theories are unable to embrace financial criminality in its entirety. A provoking path for further research might be that of psychopathic disorders among white collar criminals. Thus, white collar crime maintains its challenging character.

Originality/value – This paper provides a thorough testing of multidisciplinary theories that emerged over the past decades against the recent LIBOR scandal. The research questions addressed and the methodologies applied provide a framework for the assessment of the prevailing theories against other financial scandals.


2002 ◽  
Vol 50 ◽  
pp. 317-328 ◽  
Author(s):  
Jenann Ismael

I want to consider some features of the position put forward by Julian Barbour in The End of Time that seem to me of particular philosophical interest. At the level of generality at which I'll be concerned with it, the view is relatively easy to describe. It can be arrived at by thinking of time as decomposing in some natural way into linearly ordered atomic parts, ‘moments’, and combining an observation about the internal structure of moments with an epistemological doctrine about our access to the past. The epistemological doctrine, which I'll call ‘Presentism’, following Butterfield, is the view that our access to the past is mediated by records, or local representations, of it. The observation is that the state of the world at any moment has the structure of what Barbour calls a ‘time capsule’: it constitutes a partial record of its past; it is pregnant with interrelated, mutually consistent representations of its own history.

