Quadratic-exponential functionals of Gaussian quantum processes

Author(s):  
Igor G. Vladimirov ◽  
Ian R. Petersen ◽  
Matthew R. James

This paper is concerned with exponential moments of integral-of-quadratic functions of quantum processes with canonical commutation relations of position-momentum type. Such quadratic-exponential functionals (QEFs) arise as robust performance criteria in control problems for open quantum harmonic oscillators (OQHOs) driven by bosonic fields. We develop a randomised representation for the QEF using a Karhunen–Loève expansion of the quantum process on a bounded time interval over the eigenbasis of its two-point commutator kernel, with noncommuting position-momentum pairs as coefficients. This representation holds regardless of the particular quantum state and employs averaging over an auxiliary classical Gaussian random process whose covariance operator is specified by the commutator kernel. This allows the QEF to be related to the moment-generating functional of the quantum process and computed for multipoint Gaussian states. For stationary Gaussian quantum processes, we establish a frequency-domain formula for the QEF rate in terms of the Fourier transform of the quantum covariance kernel in composition with trigonometric functions. A differential equation is obtained for the QEF rate with respect to the risk sensitivity parameter for its approximation and numerical computation. The QEF is also applied to large deviations and worst-case mean square cost bounds for OQHOs in the presence of statistical uncertainty with a quantum relative entropy description.
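The randomised representation above rests on a Karhunen–Loève expansion over the eigenbasis of a two-point kernel. As a purely classical, illustrative sketch of that construction (using an Ornstein–Uhlenbeck-type covariance kernel as an assumed stand-in for the paper's commutator kernel), the expansion amounts to an eigendecomposition of the discretized integral operator:

```python
import numpy as np

# Discretize [0, T] and form an illustrative covariance kernel
# K(s, t) = exp(-|s - t|) (Ornstein-Uhlenbeck type); the paper's
# construction uses the two-point commutator kernel instead.
T, n = 1.0, 200
t = np.linspace(0.0, T, n)
dt = t[1] - t[0]
K = np.exp(-np.abs(t[:, None] - t[None, :]))

# Karhunen-Loeve: eigendecomposition of the integral operator with
# kernel K, approximated by the matrix K * dt (quadrature weights).
eigvals, eigvecs = np.linalg.eigh(K * dt)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# A truncated expansion reconstructs the kernel:
# K(s, t) ~ sum_k lambda_k f_k(s) f_k(t), here with m of n terms.
m = 50
K_rec = (eigvecs[:, :m] * eigvals[:m]) @ eigvecs[:, :m].T / dt
err = np.max(np.abs(K - K_rec))
print(err)
```

Because the eigenvalues of a smooth kernel decay rapidly, a modest truncation already reproduces the kernel to small entrywise error.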

2019 ◽  
Vol 17 (07) ◽  
pp. 1950055
Author(s):  
Seid Koudia ◽  
Abdelhakim Gharbi

We address the superposition of causal orders in the quantum switch as a convenient framework for quantum process discrimination in the presence of noise in qubit systems, using a Bayesian strategy. We show that, for different kinds of qubit noise, the indefinite causal order between the unitary to be discriminated and the noise yields an enhancement over the definite-causal-order case, without in general reaching the ultimate bound of discrimination. For entanglement-breaking channels, however, the enhancement is significant: the quantum switch attains the ultimate discrimination bound posed by quantum mechanics. Memory effects accompanying the superposition of causal orders are discussed, and we point out that processes describing an indefinite causal order violate the notion of Markov locality. Accordingly, we suggest how indefinite causal orders could be simulated in more generic scenarios beyond the quantum switch.
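The quantum switch itself has a simple Kraus description: with noise Kraus operators K_i and a unitary U, the switch acts on system-plus-control with operators W_i = (K_i U) ⊗ |0⟩⟨0| + (U K_i) ⊗ |1⟩⟨1|. A minimal sketch, with an assumed bit-flip noise and U = Z chosen only for illustration (not the paper's specific channels):

```python
import numpy as np

# Illustrative setup: unitary U = Z to be discriminated, bit-flip
# noise with Kraus operators K0, K1 (assumptions for this sketch).
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
p = 0.3
K = [np.sqrt(1 - p) * I2, np.sqrt(p) * X]   # bit-flip noise

# Quantum switch Kraus operators on (target tensor control):
# order noise-after-U on control |0>, order U-after-noise on control |1>.
P0 = np.array([[1, 0], [0, 0]], dtype=complex)
P1 = np.array([[0, 0], [0, 1]], dtype=complex)
W = [np.kron(Ki @ Z, P0) + np.kron(Z @ Ki, P1) for Ki in K]

# The switch is trace preserving: sum_i W_i^dag W_i = I.
S = sum(Wi.conj().T @ Wi for Wi in W)
assert np.allclose(S, np.eye(4))

# Applying it with the control in |+> creates the superposition of orders.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_in = np.kron(I2 / 2, np.outer(plus, plus.conj()))
rho_out = sum(Wi @ rho_in @ Wi.conj().T for Wi in W)
print(np.trace(rho_out).real)  # trace preserved, ~1
```

Measuring the control in the |±⟩ basis after this evolution is what exposes the interference between the two causal orders exploited in discrimination.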


Paleobiology ◽  
2017 ◽  
Vol 43 (4) ◽  
pp. 667-692 ◽  
Author(s):  
Corentin Gibert ◽  
Gilles Escarguel

AbstractEstimating biodiversity and its variations through geologic time is a notoriously difficult task, due to several taphonomic and methodological effects that make the reconstructed signal potentially distinct from the unknown, original one. Through a simulation approach, we examine the effect of a major, surprisingly still understudied, source of potential disturbance: the effect of time discretization through biochronological construction, which generates spurious coexistences of taxa within discrete time intervals (i.e., biozones), and thus potentially makes continuous- and discrete-time biodiversity curves very different. Focusing on the taxonomic-richness dimension of biodiversity (including estimates of origination and extinction rates), our approach relies on generation of random continuous-time richness curves, which are then time-discretized to estimate the noise generated by this manipulation. A broad spectrum of data-set parameters (including average taxon longevity and biozone duration, total number of taxa, and simulated time interval) is evaluated through sensitivity analysis. We show that the deteriorating effect of time discretization on the richness signal depends highly on such parameters, most particularly on average biozone duration and taxonomic longevity because of their direct relationship with the number of false coexistences generated by time discretization. With several worst-case but realistic parameter combinations (e.g., when relatively short-lived taxa are analyzed in a long-ranging biozone framework), the original and time-discretized richness curves can ultimately show a very weak to zero correlation, making these two time series independent. Based on these simulation results, we propose a simple algorithm allowing the back-transformation of a discrete-time taxonomic-richness data set, as customarily constructed by paleontologists, into a continuous-time data set. 
We show that the reconstructed richness curve obtained this way fits the original signal much more closely, even when the parameter combination of the original data set is particularly adverse to an effective time-discretized reconstruction.
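The core of the simulation approach, generating a continuous-time richness signal and measuring the distortion introduced by biozone discretization, can be sketched as follows (parameter values are illustrative assumptions, not the study's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not the paper's values):
n_taxa, t_max = 400, 100.0
mean_longevity, biozone_dur = 2.0, 5.0

orig = rng.uniform(0.0, t_max, n_taxa)                # origination times
ext = orig + rng.exponential(mean_longevity, n_taxa)  # extinction times

# Continuous-time richness sampled on a fine grid.
grid = np.linspace(0.0, t_max, 1000)
cont = ((orig[None, :] <= grid[:, None]) &
        (ext[None, :] >= grid[:, None])).sum(axis=1)

# Discrete-time richness: a taxon is recorded in every biozone its
# range overlaps, which generates the spurious coexistences at issue.
edges = np.arange(0.0, t_max + biozone_dur, biozone_dur)
disc = np.array([((orig < hi) & (ext > lo)).sum()
                 for lo, hi in zip(edges[:-1], edges[1:])])

# Discretized richness systematically overestimates the continuous signal,
# the more so the longer the biozones relative to taxon longevity.
print(cont.mean(), disc.mean())
```

Sweeping `mean_longevity` and `biozone_dur` in such a loop is essentially the sensitivity analysis described above: the shorter the taxa live relative to biozone duration, the more false coexistences inflate the discrete counts.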


Author(s):  
Xudong Qin ◽  
Yuxin Deng ◽  
Wenjie Du

Abstract One important application of quantum process algebras is to formally verify quantum communication protocols. With a suitable notion of behavioural equivalence and a decision method, one can determine if an implementation of a protocol is consistent with its specification. Ground bisimulation is a convenient behavioural equivalence for quantum processes because of its associated coinduction proof technique. We exploit this technique to design and implement two on-the-fly algorithms for the strong and weak versions of ground bisimulation to check if two given processes in quantum CCS are equivalent. We then develop a tool that can verify interesting quantum protocols such as the BB84 quantum key distribution scheme.
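The paper's algorithms check ground bisimulation for quantum CCS on the fly. As a purely classical illustration of behavioural-equivalence checking, here is a minimal strong-bisimulation test on finite labelled transition systems via partition refinement, a global method rather than the paper's on-the-fly approach:

```python
# Minimal strong-bisimulation check on finite labelled transition
# systems via partition refinement -- a classical alternative to the
# paper's on-the-fly algorithms, for illustration only.
def bisimilar(trans, s0, t0):
    """trans: dict state -> set of (action, successor) pairs.
    Returns True iff s0 and t0 are strongly bisimilar."""
    states = list(trans)
    block = {s: 0 for s in states}          # start with one block
    while True:
        # Signature of a state: its set of (action, successor-block) pairs.
        sig = {s: frozenset((a, block[v]) for a, v in trans[s])
               for s in states}
        keys, new = {}, {}
        for s in states:
            key = (block[s], sig[s])
            new[s] = keys.setdefault(key, len(keys))
        if new == block:                    # partition is stable
            return block[s0] == block[t0]
        block = new

# Two processes: a.(b + c) versus a.b + a.c -- the classic pair that is
# trace equivalent but not bisimilar, since branching differs.
p = {"p0": {("a", "p1")}, "p1": {("b", "p2"), ("c", "p3")},
     "p2": set(), "p3": set()}
q = {"q0": {("a", "q1"), ("a", "q2")}, "q1": {("b", "q3")},
     "q2": {("c", "q4")}, "q3": set(), "q4": set()}
lts = {**p, **q}
print(bisimilar(lts, "p0", "q0"))  # False: branching differs
```

The weak version used for protocol verification additionally saturates the transition relation over internal moves before refining; the quantum setting further requires comparing the accompanying quantum states, which is what makes ground bisimulation and its coinduction technique convenient.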


2017 ◽  
Vol 2 (5) ◽  
Author(s):  
Julio César Silva Ruiz ◽  
Roberth Zambrano Santos

The Cartesian blackboard: teaching to work in the math lab

Abstract  Rapid changes in science and technology demand greater attention to the subjects taught in schools, including mathematics, which is one of the keys to the educational process. The objective of this work was to find new mechanisms for stimulating the teaching of mathematics, focused on developing skills, with the performance criteria needed for students to solve everyday problems using the logical and critical thinking demanded by the modern world. The study was carried out at the ITSUP Educational Unit in Ecuador. It proposes the use of the Cartesian blackboard to strengthen the teaching process through manipulative activities, in which objects are handled to achieve the class objective. The proposal uses a board with holes forming a plane, supported by two rulers, one horizontal and one vertical, as a physical representation of the Cartesian plane taught at the secondary level. It was concluded that the Cartesian blackboard supports operations requiring the Cartesian plane, such as graphing linear or quadratic functions; greater dynamism in class is achieved, and students' spontaneous desire for knowledge is developed.  Keywords: mathematical methodology, mathematics teaching, mathematics learning, mathematical tool


2020 ◽  
Vol 18 (02) ◽  
pp. 1941002 ◽  
Author(s):  
Philip Taranto

Understanding temporal processes and their correlations in time is of paramount importance for the development of near-term technologies that operate under realistic conditions. Capturing the complete multi-time statistics that define a stochastic process lies at the heart of any proper treatment of memory effects. This is well understood in classical theory, where a hierarchy of joint probability distributions completely characterizes the process at hand. However, attempting to generalize this notion to quantum mechanics is problematic: observing realizations of a quantum process necessarily disturbs the state of the system, breaking an implicit, and crucial, assumption in the classical setting. This issue can be overcome by separating the experimental interventions from the underlying process, enabling an unambiguous description of the process itself and accounting for all possible multi-time correlations for any choice of interrogating instruments. In this paper, using a novel framework for the characterization of quantum stochastic processes, we first solve the long-standing question of unambiguously describing the memory length of a quantum process. This is achieved by constructing a quantum Markov order condition, which naturally generalizes its classical counterpart for the quantification of finite-length memory effects. As measurements are inherently invasive in quantum mechanics, one has no choice but to define Markov order with respect to the interrogating instruments that are used to probe the process at hand: different memory effects are exhibited depending on how one addresses the system, in contrast to the standard classical setting. We then fully characterize the structural constraints imposed on quantum processes with finite Markov order, shedding light on a variety of memory effects that can arise through various examples. 
Finally, we introduce an instrument-specific notion of memory strength that allows for a meaningful quantification of the temporal correlations between the history and the future of a process for a given choice of experimental intervention. These findings are directly relevant to both characterizing and exploiting memory effects that persist for a finite duration. In particular, immediate applications range from developing efficient compression and recovery schemes for the description of quantum processes with memory to designing coherent control protocols that efficiently perform information-theoretic tasks, amongst a plethora of others.


2021 ◽  
Vol 13 (12) ◽  
pp. 6677
Author(s):  
Cláudia Ferreira ◽  
Ana Silva ◽  
Jorge de Brito ◽  
Ilídio S. Dias ◽  
Inês Flores-Colen

Increasing awareness of sustainability and the desire to reduce energy consumption in the construction sector have increased the application of External Thermal Insulation Composite Systems (ETICS) across Europe in recent decades. Nevertheless, the implementation of appropriate maintenance strategies is still neglected. The aim of this study is to analyse the impact of different maintenance strategies. For that purpose, a condition-based maintenance model, based on Petri nets, is used to evaluate three maintenance strategies: MS1 (total replacement only); MS2 (combination of minor intervention and total replacement); and MS3 (combination of cleaning operations, minor intervention, and total replacement). In the end, a multi-criteria analysis is used to discuss the impact of the three maintenance strategies, evaluating the remaining service life, the global costs over time, the ETICS degradation condition, and the number of replacements (end of service life) over the time horizon. For this purpose, a sample of 378 ETICS was analysed, based on in situ visual inspections carried out in Portugal. The results of this study reveal that maintenance plays an important role in increasing the durability of ETICS, and therefore their sustainability. Regular maintenance can extend the service life of ETICS by 88% to 159% (15 to 27 years), improve their global degradation condition, and reduce the impact on users by reducing the number of deeper interventions. Further research is needed to optimise the maintenance strategies (time interval between inspections, stakeholders' performance criteria, and environmental exposure).
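The logic of a condition-based strategy comparison can be sketched with a toy Monte Carlo model (all numbers are illustrative assumptions; the study itself uses a Petri-net model calibrated on the 378 inspections):

```python
import random

random.seed(1)

# Toy condition-based maintenance simulation.  Condition levels run
# from 0 (best) to 4 (end of service life, triggering replacement).
DEGRADE_P = 0.08        # yearly probability of losing one condition level
HORIZON = 60            # years simulated

def simulate(inspect_every=None, n_runs=2000):
    """Mean number of total replacements over the horizon.  If
    inspect_every is set, a minor intervention at each inspection
    restores one condition level whenever condition >= 2 (MS2-like)."""
    total = 0
    for _ in range(n_runs):
        cond = repl = 0
        for year in range(HORIZON):
            if random.random() < DEGRADE_P:
                cond += 1
            if cond >= 4:                     # end of service life
                cond, repl = 0, repl + 1      # total replacement
            elif (inspect_every and (year + 1) % inspect_every == 0
                  and cond >= 2):
                cond -= 1                     # minor intervention
        total += repl
    return total / n_runs

ms1 = simulate()                  # MS1: total replacement only
ms2 = simulate(inspect_every=5)   # MS2-like: minor interventions too
print(ms1, ms2)
```

Even this crude model reproduces the qualitative finding above: periodic minor interventions sharply reduce the number of deep interventions (total replacements) over the horizon.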


Quantum ◽  
2021 ◽  
Vol 5 ◽  
pp. 538
Author(s):  
Marcello Nery ◽  
Marco Túlio Quintino ◽  
Philippe Allard Guérin ◽  
Thiago O. Maciel ◽  
Reinaldo O. Vianna

Guided by the intuition of coherent superposition of causal relations, recent works presented quantum processes without classical common-cause and direct-cause explanation, that is, processes which cannot be written as probabilistic mixtures of quantum common-cause and quantum direct-cause relations (CCDC). In this work, we analyze the minimum requirements for a quantum process to fail to admit a CCDC explanation and present "simple" processes, which we prove to be the most robust ones against general noise. These simple processes can be realized by preparing a maximally entangled state and applying the identity quantum channel, thus not requiring an explicit coherent mixture of common-cause and direct-cause, exploiting the possibility of a process to have both relations simultaneously. We then prove that, although all bipartite direct-cause processes are bipartite separable operators, there exist bipartite separable processes which are not direct-cause. This shows that the problem of deciding whether a process is a direct-cause process is not equivalent to entanglement certification and points out the limitations of entanglement methods for detecting non-classical CCDC processes. We also present a semi-definite programming hierarchy that can detect and quantify the non-classical CCDC robustness of every non-classical CCDC process. Among other results, our numerical methods allow us to show that the simple processes presented here are likely also the most robust against white noise. Finally, we explore the equivalence between bipartite direct-cause processes and bipartite processes without quantum memory, to present a separable process which cannot be realized as a process without quantum memory.
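The "simple" processes above are built from a maximally entangled state and the identity channel. A minimal numerical sketch of the related, weaker entanglement check (via the partial-transpose criterion) illustrates the bipartite operator involved; as the abstract stresses, entanglement certification alone does not decide the direct-cause question:

```python
import numpy as np

# |Phi+> = (|00> + |11>)/sqrt(2), the maximally entangled two-qubit state.
phi = np.zeros(4)
phi[0] = phi[3] = 1.0
rho = np.outer(phi, phi) / 2.0              # |Phi+><Phi+|

# Partial transpose on the second qubit: swap the two column indices
# of the second subsystem (axes 1 and 3 of the (2,2,2,2) tensor).
rho4 = rho.reshape(2, 2, 2, 2)
rho_pt = rho4.transpose(0, 3, 2, 1).reshape(4, 4)
eigs = np.linalg.eigvalsh(rho_pt)
print(eigs.min())  # -0.5: a negative eigenvalue certifies entanglement
```

The partial transpose of |Φ+⟩⟨Φ+| is SWAP/2, with eigenvalues {1/2, 1/2, 1/2, -1/2}; the negative eigenvalue certifies entanglement. The separability-versus-direct-cause gap proved in the paper is exactly why such PPT-style tests cannot replace the semi-definite programming hierarchy for CCDC detection.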


Quantum ◽  
2019 ◽  
Vol 3 ◽  
pp. 171 ◽  
Author(s):  
Martin Kliesch ◽  
Richard Kueng ◽  
Jens Eisert ◽  
David Gross

Quantum process tomography is the task of reconstructing unknown quantum channels from measured data. In this work, we introduce compressed sensing-based methods that facilitate the reconstruction of quantum channels of low Kraus rank. Our main contribution is the analysis of a natural measurement model for this task: We assume that data is obtained by sending pure states into the channel and measuring expectation values on the output. Neither ancillary systems nor coherent operations across multiple channel uses are required. Most previous results on compressed process reconstruction reduce the problem to quantum state tomography on the channel's Choi matrix. While this ansatz yields recovery guarantees from an essentially minimal number of measurements, physical implementations of such schemes would typically involve ancillary systems. A priori, it is unclear whether a measurement model tailored directly to quantum process tomography might require more measurements. We establish that this is not the case. Technically, we prove recovery guarantees for three different reconstruction algorithms. The reconstructions are based on a trace, diamond, and ℓ2-norm minimization, respectively. Our recovery guarantees are uniform in the sense that with one random choice of measurement settings all quantum channels can be recovered equally well. Moreover, stability against arbitrary measurement noise and robustness against violations of the low-rank assumption are guaranteed. Numerical studies demonstrate the feasibility of the approach.
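The measurement model analysed here is easy to simulate: send random pure states through the channel and record expectation values of observables on the output, with no ancillas and no coherent multi-use operations. In this sketch the channel (amplitude damping) and observable (Pauli Z) are assumptions for illustration; the recovery step (trace-, diamond-, or ℓ2-norm minimization) needs a convex solver and is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(7)

# Amplitude-damping channel with damping rate g (illustrative choice).
g = 0.25
K = [np.array([[1, 0], [0, np.sqrt(1 - g)]]),
     np.array([[0, np.sqrt(g)], [0, 0]])]
Z = np.diag([1.0, -1.0])

def channel(rho):
    return sum(Ki @ rho @ Ki.conj().T for Ki in K)

def random_pure(d=2):
    v = rng.normal(size=d) + 1j * rng.normal(size=d)
    return v / np.linalg.norm(v)

# Data: y_k = Tr[ Z * N(|psi_k><psi_k|) ] for random pure inputs psi_k.
data = []
for _ in range(100):
    psi = random_pure()
    rho_out = channel(np.outer(psi, psi.conj()))
    data.append(np.trace(Z @ rho_out).real)
print(len(data), min(data), max(data))
```

Each data point is a linear function of the channel's Choi matrix, so the vector `data` plays the role of the compressed measurement vector from which the low-Kraus-rank channel is recovered.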


2011 ◽  
Vol 133 (4) ◽  
Author(s):  
Chen-Chien Hsu ◽  
Tsung-Chi Lu

In this paper, a quantitative index is proposed to address the performance evaluation and design issues in the digital redesign of continuous-time interval systems. From the perspective of signal energy, a worst-case energy resemblance index (WERI), defined as the ratio of the worst-case continuous signal energy (WCSE) of the continuous-time interval system over the worst-case discrete sequence energy (WDSE) of the redesigned digital system, is established for evaluating the closeness of the system performance between the redesigned digital control system and its continuous-time counterpart. Based on the WERI, performance of the redesigned digital systems can be evaluated for different discretization methods at different sampling times. It is found that no discretization method outperforms the others for all sampling times. Because of the serious nonlinearities and nonconvexity involved, the determination of WCSE and WDSE is first formulated as an optimization problem and subsequently solved via an evolutionary algorithm. To guarantee stability of the redesigned digital system, the largest sampling time allowed is also evolutionarily determined to establish a sampling-time constraint under which robust Schur stability of the redesigned digital system can be ensured. For design purposes, the sampling time required can be determined according to the user-specified WERI, which serves as a performance specification for fine-tuning the performance of the redesigned digital control system.


2020 ◽  
Vol 142 (4) ◽  
Author(s):  
Yasir M. Alfulayyih ◽  
Peiwen Li ◽  
Ammar Omar Gwesha

Abstract An algorithm and model are developed for precise planning of year-round solar energy (SE) collection, storage, and redistribution to meet a specified electrical power demand relying fully on solar energy. The model takes the past 10 years' data on average and worst-case sky coverage (cloud fraction) at a location, in 6-min time windows for every day, to predict the solar and electrical energy harvest. The electrical energy obtained from solar energy during sunny periods must meet the instantaneous energy demand as well as the need for energy storage for nighttime and overcast days, so that no single day has a shortage of energy supply over the entire year and in yearly cycles. The analysis can eventually determine the best starting date of operation, the least solar collection area, and the least energy storage capacity for cost-effectiveness of the system. The algorithm provides a fundamental tool for the design of a general renewable energy harvest and storage system for uninterrupted year-round power supply. As an example, the algorithm was applied to the authors' local city, Tucson, Arizona, USA, for a steady power supply of 1 MW.
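The storage-sizing idea can be sketched with a toy daily balance (the paper works on 6-min windows with 10 years of sky-coverage data; every number below is an illustrative assumption): track the running surplus of harvest over demand, and the deepest deficit sets the minimum storage needed so that no day runs short.

```python
import random

random.seed(3)

# Toy year of daily solar harvest for a steady 1 MW load.
demand_per_day = 24.0          # MWh/day for a constant 1 MW demand
area, yield_clear = 6.0, 8.0   # harvest = area * clear-sky yield * fraction

# Random daily clear-sky fractions (stand-in for sky-coverage records).
clear = [min(1.0, max(0.1, random.gauss(0.7, 0.2))) for _ in range(365)]
harvest = [area * yield_clear * c for c in clear]

# Running storage balance; the most negative excursion sets the minimum
# initial storage (and capacity) needed so no day has a shortage.
balance, min_balance = 0.0, 0.0
for h in harvest:
    balance += h - demand_per_day
    min_balance = min(min_balance, balance)
required_storage = -min_balance
print(round(required_storage, 1))
```

Repeating this balance for each candidate starting date and collection area is, in spirit, how the algorithm picks the best starting date, the least collection area, and the least storage capacity.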

