Replicating the benefits of Deutschian closed timelike curves without breaking causality

2015 ◽  
Vol 1 (1) ◽  
Author(s):  
Xiao Yuan ◽  
Syed M Assad ◽  
Jayne Thompson ◽  
Jing Yan Haw ◽  
Vlatko Vedral ◽  
...  

Abstract: In general relativity, closed timelike curves can break causality with remarkable and unsettling consequences. At the classical level, they induce causal paradoxes disturbing enough to motivate conjectures that explicitly prevent their existence. At the quantum level, such problems can be resolved through the Deutschian formalism; however, this formalism brings radical benefits, from cloning unknown quantum states to solving problems intractable to quantum computers. Instinctively, one expects these benefits to vanish if causality is respected. Here we show that, by harnessing entanglement, we can efficiently solve NP-complete problems and clone arbitrary quantum states, even when all time-travelling systems are completely isolated from the past. Thus, the many defining benefits of Deutschian closed timelike curves can still be harnessed, even when causality is preserved. Our results unveil a subtle interplay between entanglement and general relativity, and significantly improve our prospects for probing the radical effects that may exist at the interface between relativity and quantum theory.

2021 ◽  
Vol 21 (15&16) ◽  
pp. 1296-1306
Author(s):  
Seyed Mousavi

Our computers today, from sophisticated servers to small smartphones, operate on the same computing model, which requires running a sequence of discrete instructions specified as an algorithm. This sequential computing paradigm has not yet led to a fast algorithm for an NP-complete problem, despite numerous attempts over the past half-century. Unfortunately, even after the introduction of quantum mechanics to the world of computing, we have still followed a similar sequential paradigm, which has not yet yielded such an algorithm either. Here a completely different model of computing is proposed, replacing the sequential paradigm of algorithms with the inherent parallelism of physical processes. Using the proposed model, instead of writing algorithms to solve NP-complete problems, we construct physical systems whose equilibrium states correspond to the desired solutions and let them evolve to search for those solutions. The main requirements of the model are identified, and quantum circuits are proposed for its potential implementation.
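No code accompanies the abstract above. As a rough classical analogy only (not the quantum circuits the paper proposes), the idea of letting a system settle into an equilibrium state that encodes a solution can be sketched with simulated annealing on a 3-SAT "energy" that counts unsatisfied clauses; the instance and parameters below are invented for illustration.

```python
# Illustrative sketch only: a classical relax-toward-equilibrium search for 3-SAT.
# This is an analogy for the "equilibrium state = solution" idea, not the paper's
# quantum-circuit construction; the clause list and schedule are made up.
import math
import random

def energy(assignment, clauses):
    """Number of unsatisfied clauses; an assignment with zero energy satisfies the formula."""
    return sum(1 for clause in clauses
               if not any(assignment[var] == positive for var, positive in clause))

def anneal_3sat(clauses, n_vars, steps=20000, t_start=2.0, t_end=0.01, seed=0):
    rng = random.Random(seed)
    state = [rng.random() < 0.5 for _ in range(n_vars)]
    e = energy(state, clauses)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)   # geometric cooling schedule
        i = rng.randrange(n_vars)                            # propose flipping one variable
        state[i] = not state[i]
        e_new = energy(state, clauses)
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / t):
            e = e_new                                        # accept the move
        else:
            state[i] = not state[i]                          # reject: undo the flip
        if e == 0:
            break                                            # "equilibrium" reached: formula satisfied
    return state, e

# Hypothetical instance: (x0 OR NOT x1 OR x2) AND (NOT x0 OR x1 OR NOT x2).
clauses = [[(0, True), (1, False), (2, True)],
           [(0, False), (1, True), (2, False)]]
print(anneal_3sat(clauses, n_vars=3))
```

Note that this sketch still executes as a sequence of discrete steps on a conventional computer, which is precisely the limitation the proposed physical model aims to sidestep.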


2005 ◽  
Vol 5 (6) ◽  
pp. 449-455
Author(s):  
R. Orus

We perform a mathematical analysis of the classical computational complexity of two genuinely quantum-mechanical problems, inspired by the calculation of expected magnetizations and of the entanglement between subsystems for a quantum spin system. These problems, which we call SES and SESSP respectively, are specified in terms of pure, slightly entangled quantum states of $n$ qubits, and rigorous mathematical proofs that they belong to the NP-Complete complexity class are presented. Both SES and SESSP are therefore computationally equivalent to the $3$-SAT problem, for which an efficient algorithm has yet to be discovered.
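For readers unfamiliar with the quantities involved, the following minimal sketch (not taken from the paper) shows what the SES problem asks about: the expected magnetization $\langle Z_i \rangle$ of one qubit in an $n$-qubit pure state. The direct calculation below works on a dense state vector of length $2^n$ and is therefore exponential in $n$; the slightly entangled formulation instead restricts attention to states with compact descriptions, for which efficient computation becomes the non-trivial question. All names are illustrative.

```python
# Illustrative sketch: brute-force expected magnetization <Z_i> for a dense n-qubit state.
# Exponential in n; NOT the slightly-entangled (compactly parameterized) setting of SES/SESSP.
import numpy as np

def expected_z(state, i, n):
    """<psi| Z_i |psi> for a normalized state vector of length 2**n.

    Basis state k contributes +|amplitude|^2 when qubit i's bit is 0 and -|amplitude|^2
    when it is 1 (qubit 0 taken as the most significant bit)."""
    probs = np.abs(state) ** 2
    signs = np.array([1.0 if ((k >> (n - 1 - i)) & 1) == 0 else -1.0 for k in range(2 ** n)])
    return float(np.dot(signs, probs))

# Tiny check on the Bell state (|00> + |11>)/sqrt(2): each qubit's magnetization is 0.
n = 2
state = np.zeros(2 ** n)
state[0b00] = state[0b11] = 1 / np.sqrt(2)
print(expected_z(state, i=0, n=n))  # 0.0
```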


Author(s):  
Benjamin F. Trump ◽  
Irene K. Berezesky ◽  
Raymond T. Jones

The role of electron microscopy and associated techniques is assured in diagnostic pathology. At the present time, most of the progress has been made on tissues examined by transmission electron microscopy (TEM) and correlated with light microscopy (LM) and by cytochemistry using both plastic and paraffin-embedded materials. As mentioned elsewhere in this symposium, this has revolutionized many fields of pathology including diagnostic, anatomic and clinical pathology. It began with the kidney; however, it has now been extended to most other organ systems and to tumor diagnosis in general. The results of the past few years tend to indicate the future directions and needs of this expanding field. Now, in addition to routine EM, pathologists have access to the many newly developed methods and instruments mentioned below which should aid considerably not only in diagnostic pathology but in investigative pathology as well.


2020 ◽  
Vol 5 (1) ◽  
pp. 6-11 ◽  
Author(s):  
Laurence B. Leonard

Purpose: The current "specific language impairment" and "developmental language disorder" discussion might lead to important changes in how we refer to children with language disorders of unknown origin. The field has seen other changes in terminology; this article reviews many of these changes. Method: A literature review of previous clinical labels was conducted, and possible reasons for the changes in labels were identified. Results: References to children with significant yet unexplained deficits in language ability have been part of the scientific literature since at least the early 1800s. Terms have changed from those with a neurological emphasis to those that do not imply a cause for the language disorder. Diagnostic criteria have become more explicit but have, at certain points, become too narrow to represent the wider range of children with language disorders of unknown origin. Conclusions: The field was not well served by the many changes in terminology that have transpired in the past. A new label at this point must be accompanied by strong efforts to promote its adoption among clinical speech-language pathologists and the general public.


2020 ◽  
Vol 1 (2) ◽  
pp. 157-172
Author(s):  
Thomas Leitch

Building on Tzvetan Todorov's observation that the detective novel ‘contains not one but two stories: the story of the crime and the story of the investigation’, this essay argues that detective novels display a remarkably wide range of attitudes toward the several pasts they represent: the pasts of the crime, the community, the criminal, the detective, and public history. It traces a series of defining shifts in these attitudes through the evolution of five distinct subgenres of detective fiction: exploits of a Great Detective like Sherlock Holmes, Golden Age whodunits that pose as intellectual puzzles to be solved, hardboiled stories that invoke a distant past that the present both breaks with and echoes, police procedurals that unfold in an indefinitely extended present, and historical mysteries that nostalgically fetishize the past. It concludes with a brief consideration of genre readers’ own ambivalent phenomenological investment in the past, present, and future each detective story projects.


Author(s):  
Andrea Gamberini

As it had been in the communal age, so, in the Visconti-Sforza era, law was the instrument that public authority relied upon to subordinate the many actors present and to subjugate their political cultures. Hence the attempt to tighten a vice around competing powers, a vice that was at once legislative, doctrinal, and judicial. And yet it is difficult to escape the impression of an effort whose outcomes were rather more uncertain than they had been in the past. The chapter examines all these aspects of the deployment of legal and other stratagems to consolidate or to wrest power.


Author(s):  
John Hunsley ◽  
Eric J. Mash

Evidence-based assessment relies on research and theory to inform the selection of constructs to be assessed for a specific assessment purpose, the methods and measures to be used in the assessment, and the manner in which the assessment process unfolds. An evidence-based approach to clinical assessment necessitates the recognition that, even when evidence-based instruments are used, the assessment process is a decision-making task in which hypotheses must be iteratively formulated and tested. In this chapter, we review (a) the progress that has been made in developing an evidence-based approach to clinical assessment in the past decade and (b) the many challenges that lie ahead if clinical assessment is to be truly evidence-based.


2021 ◽  
pp. 875529302199636
Author(s):  
Mertcan Geyin ◽  
Brett W Maurer ◽  
Brendon A Bradley ◽  
Russell A Green ◽  
Sjoerd van Ballegooy

Earthquakes occurring over the past decade in the Canterbury region of New Zealand have resulted in liquefaction case-history data of unprecedented quantity. This provides the profession with a unique opportunity to advance the prediction of liquefaction occurrence and consequences. Toward that end, this article presents a curated dataset containing ∼15,000 cone-penetration-test-based liquefaction case histories compiled from three earthquakes in Canterbury. The compiled, post-processed data are presented in a dense array structure, allowing researchers to easily access and analyze a wealth of information pertinent to free-field liquefaction response (i.e. triggering and surface manifestation). Research opportunities using these data include, but are not limited to, the training or testing of new and existing liquefaction-prediction models. The many methods used to obtain and process the case-history data are detailed herein, as is the structure of the compiled digital file. Finally, recommendations for analyzing the data are outlined, including nuances and limitations that users should carefully consider.
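The listing does not reproduce the dataset's file format or field names, so the snippet below is only a hypothetical illustration of how a compiled, dense-array case-history file might be loaded and queried; every file name and key is an assumption, not the published schema.

```python
# Hypothetical usage sketch: loading a compiled liquefaction case-history array and
# filtering records from one event. File name, keys, and fields are invented for
# illustration; consult the dataset's own documentation for the actual structure.
import numpy as np

data = np.load("canterbury_case_histories.npz")     # hypothetical packaged file
pga = data["pga_g"]                                  # peak ground acceleration per case (g)
event = data["event_id"]                             # e.g. 0, 1, 2 for the three earthquakes
manifested = data["surface_manifestation"]           # boolean: surface manifestation observed

# Example query: manifestation rate for cases from event 1 with PGA above 0.2 g.
mask = (event == 1) & (pga > 0.2)
rate = manifested[mask].mean() if mask.any() else float("nan")
print(f"{mask.sum()} cases selected; manifestation rate = {rate:.2f}")
```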


2021 ◽  
pp. 026858092110053
Author(s):  
Koichi Hiraoka

This article reviews research trends in welfare sociology (sociological studies of social security and welfare), one of the many subfields of active research in sociology in Japan. To this end, several research streams formed from the 1970s to the 2000s are described, and some of the most important results produced within these streams over the past two decades are introduced. The latter part of the article offers a broad overview of research trends in Japanese welfare sociology by focusing on the contents of the journal published by the Japan Welfare Sociology Association (JWSA).

