rigorous framework
Recently Published Documents


TOTAL DOCUMENTS: 103 (five years: 52)
H-INDEX: 13 (five years: 3)

2021 · Vol 933
Author(s): Emma C. Edwards, Dick K.-P. Yue

We propose a scientifically rigorous framework to find realistic optimal geometries of wave energy converters (WECs). For specificity, we assume WECs to be axisymmetric point absorbers in a monochromatic unidirectional incident wave, all within the context of linearised potential theory. We consider separately the problem of a WEC moving and extracting wave energy in heave only, and then the more general case of motion and extraction in combined heave, surge and pitch. We describe the axisymmetric geometries using polynomial basis functions, allowing for discontinuities in slope. Our framework involves ensuring maximum power, specifying practical motion constraints and then minimising surface area (as a proxy for cost). The framework is robust and well-posed, and the optimisation produces feasible WEC geometries. Using the proposed framework, we develop a systematic computational and theoretical approach, and we obtain results and insights for the optimal WEC geometries. The optimisation process is sped up significantly by a new theoretical result for obtaining roots of the heave resonance equation. For both the heave-only and the combined heave-surge-pitch problems, we find that geometries which protrude outward below the waterline are generally optimal. These optimal geometries have up to 73% less surface area and 90% less volume than the optimal cylinders which extract the same power.
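The surface-area objective used as a cost proxy above is a standard surface-of-revolution integral over the radius profile. A minimal sketch of evaluating it for a polynomial profile, with the cylinder sanity check; the profile and dimensions here are illustrative, not the paper's optima:

```python
import numpy as np

def surface_area_of_revolution(coeffs, z_min, z_max, n=4001):
    """Lateral surface area A = 2*pi * integral of r(z) * sqrt(1 + r'(z)^2) dz
    for an axisymmetric body whose radius profile r(z) is the polynomial with
    the given coefficients (highest degree first), via trapezoidal quadrature."""
    z = np.linspace(z_min, z_max, n)
    r = np.polyval(coeffs, z)
    dr = np.polyval(np.polyder(coeffs), z)
    f = 2.0 * np.pi * r * np.sqrt(1.0 + dr**2)
    return float(np.sum((f[:-1] + f[1:]) * np.diff(z)) / 2.0)

# Sanity check: a cylinder of radius 5 m and draft 2 m (constant profile)
# has lateral area 2*pi*5*2.
area_cyl = surface_area_of_revolution([5.0], -2.0, 0.0)
```

Because the profile is a polynomial, the derivative needed for the integrand comes directly from the coefficients, which is what makes this parametrization convenient for optimisation.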


Author(s): Nuel Belnap, Thomas Müller, Tomasz Placek

This book develops a rigorous theory of indeterminism as a local and modal concept. Its crucial insight is that our world contains events or processes with alternative, really possible outcomes. The theory aims at clarifying what this assumption involves, and it does so in two ways. First, it provides a mathematically rigorous framework for local and modal indeterminism. Second, we support that theory by spelling out the philosophically relevant consequences of this formulation and by showing its fruitful applications in metaphysics. To this end, we offer a formal analysis of modal correlations and of causation, which is applicable in indeterministic and non-local contexts as well. We also propose a rigorous theory of objective single-case probabilities, intended to represent degrees of possibility. Third, we link our theory to current physics, investigating how local and modal indeterminism relates to issues in the foundations of physics, in particular, quantum non-locality and spatio-temporal relativity. The book also ventures into the philosophy of time, showing how the theory’s resources can be used to explicate the dynamic concept of the past, present, and future based on local indeterminism.


2021 · pp. 1-39

Anthropogenically induced radiative imbalances in the climate system lead to a slow accumulation of heat in the ocean. This warming is often obscured by natural modes of climate variability such as the El Niño-Southern Oscillation (ENSO), which drive substantial ocean temperature changes as a function of depth and latitude. The use of watermass coordinates has been proposed to help isolate forced signals and filter out fast adiabatic processes associated with modes of variability. However, how much natural modes of variability project into these different coordinate systems has not been quantified. Here we apply a rigorous framework to quantify ocean temperature variability using both a quasi-Lagrangian, watermass-based temperature coordinate and Eulerian depth and latitude coordinates in a free-running climate model under pre-industrial conditions. The temperature-based coordinate removes the adiabatic component of ENSO-dominated interannual variability by definition, but a substantial diabatic signal remains. At slower (decadal to centennial) frequencies, variability in the temperature- and depth-based coordinates is comparable. Spectral analysis of temperature tendencies reveals the dominance of advective processes in latitude and depth coordinates, while the variability in temperature coordinates is related closely to the surface forcing. Diabatic mixing processes play an important role at slower frequencies, where quasi-steady-state balances emerge between forcing and mixing in temperature, advection and mixing in depth, and forcing and advection in latitude. While watermass-based analyses highlight diabatic effects by removing adiabatic variability, our work shows that natural variability has a strong diabatic component that cannot be ignored in the analysis of long-term trends.
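A watermass (temperature) coordinate replaces depth and latitude with the volume of water in each temperature class. A minimal sketch of that volumetric census for a gridded temperature field; the field and grid below are synthetic stand-ins, purely illustrative:

```python
import numpy as np

def watermass_volume_census(temp, cell_volume, bins):
    """Total grid-cell volume falling in each temperature bin: the
    volumetric census underlying a temperature-based coordinate."""
    hist, _ = np.histogram(temp.ravel(), bins=bins,
                           weights=cell_volume.ravel())
    return hist

# Synthetic stand-in for a model temperature field (degrees C) on a
# 10 x 20 x 30 grid with uniform 1 km^3 cells.
rng = np.random.default_rng(0)
temp = rng.uniform(-2.0, 30.0, size=(10, 20, 30))
cell_vol = np.full(temp.shape, 1.0e9)
bins = np.linspace(-2.0, 30.0, 33)
census = watermass_volume_census(temp, cell_vol, bins)
```

Tracking how this census shifts in time, rather than temperature at fixed depth, is what filters out the fast adiabatic (heave-like) signals the abstract describes.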


Quantum · 2021 · Vol 5 · pp. 582
Author(s): Matthias C. Caro, Elies Gil-Fuster, Johannes Jakob Meyer, Jens Eisert, Ryan Sweke

A large body of recent work has begun to explore the potential of parametrized quantum circuits (PQCs) as machine learning models, within the framework of hybrid quantum-classical optimization. In particular, theoretical guarantees on the out-of-sample performance of such models, in terms of generalization bounds, have emerged. However, none of these generalization bounds depend explicitly on how the classical input data is encoded into the PQC. We derive generalization bounds for PQC-based models that depend explicitly on the strategy used for data-encoding. These imply bounds on the performance of trained PQC-based models on unseen data. Moreover, our results facilitate the selection of optimal data-encoding strategies via structural risk minimization, a mathematically rigorous framework for model selection. We obtain our generalization bounds by bounding the complexity of PQC-based models as measured by the Rademacher complexity and the metric entropy, two complexity measures from statistical learning theory. To achieve this, we rely on a representation of PQC-based models via trigonometric functions. Our generalization bounds emphasize the importance of well-considered data-encoding strategies for PQC-based models.
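The trigonometric representation mentioned above expresses a PQC-based model as a truncated Fourier series whose frequency spectrum is fixed by the data-encoding strategy. A toy sketch of evaluating such a model; the frequencies and coefficients are illustrative, not taken from the paper:

```python
import numpy as np

def pqc_fourier_model(x, freqs, coeffs):
    """f(x) = sum_w c_w * exp(i * w * x): the truncated Fourier series that a
    PQC with encoding frequencies `freqs` can realize.  The output is real
    when the coefficients for +w and -w are complex conjugates."""
    return float(np.real(sum(c * np.exp(1j * w * x)
                             for w, c in zip(freqs, coeffs))))

# A single Pauli-rotation encoding gives frequencies {-1, 0, 1}; richer
# encodings enlarge this set and hence the model class (and, per the
# abstract, the generalization bound).
freqs = [-1, 0, 1]
coeffs = [0.5 - 0.25j, 0.1 + 0.0j, 0.5 + 0.25j]
value = pqc_fourier_model(0.0, freqs, coeffs)
```

The size of the accessible frequency set is one concrete way a data-encoding strategy enters a complexity measure, which is the dependence the paper's bounds make explicit.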


Author(s): Saman Fatihi, Surabhi Rathore, Ankit K. Pathak, Deepanshi Gahlot, Mitali Mukerji, ...

2021 · Vol 11 (2)
Author(s): Cynthia Dwork, Weijie Su, Li Zhang

Differential privacy provides a rigorous framework for privacy-preserving data analysis. This paper proposes the first differentially private procedure for controlling the false discovery rate (FDR) in multiple hypothesis testing. Inspired by the Benjamini-Hochberg procedure (BHq), our approach is to first repeatedly add noise to the logarithms of the p-values to ensure differential privacy and to select an approximately smallest p-value serving as a promising candidate at each iteration; the selected p-values are then supplied to BHq, and our private procedure releases only the rejected ones. Moreover, we develop a new technique based on a backward submartingale for proving FDR control of a broad class of multiple testing procedures, including our private procedure and both the BHq step-up and step-down procedures. As a novel aspect, the proof works for arbitrary dependence between the true null and false null test statistics, while FDR control is maintained up to a small multiplicative factor.
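The two-stage shape described above can be sketched as noisy selection of approximately-smallest p-values followed by standard BHq on the selected candidates. The noise calibration below is schematic, not the paper's privacy accounting:

```python
import numpy as np

def noisy_smallest(pvals, k, eps, rng):
    """Peel off k approximately-smallest p-values by repeatedly adding
    Laplace noise to the log p-values and taking the noisy minimum.
    Schematic only: the paper's exact noise calibration is not reproduced."""
    remaining = list(range(len(pvals)))
    chosen = []
    for _ in range(k):
        noisy = [np.log(pvals[i]) + rng.laplace(scale=2.0 * k / eps)
                 for i in remaining]
        chosen.append(remaining.pop(int(np.argmin(noisy))))
    return chosen

def bhq_rejections(pvals, alpha):
    """Benjamini-Hochberg step-up: reject the hypotheses with the i* smallest
    p-values, where i* is the largest i with p_(i) <= alpha * i / m."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    below = np.nonzero(p[order] <= alpha * np.arange(1, m + 1) / m)[0]
    return sorted(order[:below[-1] + 1].tolist()) if below.size else []

rng = np.random.default_rng(1)
pvals = [1e-6, 3e-4, 0.02, 0.04, 0.3, 0.8]
candidates = noisy_smallest(pvals, k=4, eps=1.0, rng=rng)
rejected = bhq_rejections([pvals[i] for i in candidates], alpha=0.05)
```

Working on log p-values keeps the Laplace noise multiplicative in the p-value itself, which is why small p-values can survive the selection step with reasonable probability.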


Author(s): Uriel Urquiza-García, Andrew J Millar

The circadian clock coordinates plant physiology and development. Mathematical clock models have provided a rigorous framework to understand how the observed rhythms emerge from disparate molecular processes. However, models of the plant clock have largely been built and tested against RNA time-series data in arbitrary, relative units. This limits model transferability, refinement from biochemical data and applications in synthetic biology. Here, we incorporate absolute mass units into a detailed model of the clock gene network in Arabidopsis thaliana. We re-interpret the established P2011 model, highlighting a transcriptional activator that overlaps the function of REVEILLE 8/LHY-CCA1-LIKE 5. The U2020 model incorporates the repressive regulation of PRR genes, a key feature of the most detailed clock model KF2014, without greatly increasing model complexity. We tested the experimental error distributions of qRT-PCR data calibrated for units of RNA transcripts/cell and of circadian period estimates, in order to link the models to data more appropriately. The U2019 and U2020 models were constrained using these data types, recreating previously described circadian behaviours with RNA metabolic processes in absolute units. To test their inferred rates, we estimated a distribution of observed, transcriptome-wide transcription rates (Plant Empirical Transcription Rates, PETR) in units of transcripts/cell/hour. The PETR distribution and the equivalent degradation rates indicated that the models' predicted rates are biologically plausible, with individual exceptions. In addition to updated clock models, FAIR data resources and a software environment in Docker, this validation process represents an advance in biochemical realism for models of plant gene regulation.
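Absolute units enable simple consistency checks of inferred rates. For instance, at steady state the transcription rate in transcripts/cell/hour follows directly from transcript abundance and half-life; the numbers below are hypothetical, not the paper's estimates:

```python
import math

def transcription_rate(abundance, half_life_h):
    """At steady state dM/dt = k_tx - k_deg * M = 0, so the transcription
    rate is k_tx = k_deg * M, with k_deg = ln(2) / t_half.  Units:
    transcripts/cell/hour when M is transcripts/cell and t_half is hours."""
    k_deg = math.log(2) / half_life_h
    return k_deg * abundance

# Hypothetical example: 100 transcripts/cell with a 2 h mRNA half-life.
rate = transcription_rate(abundance=100.0, half_life_h=2.0)
```

A model rate falling far outside an empirical distribution such as PETR would flag the corresponding parameter as biologically implausible, which is the kind of test the abstract describes.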


2021
Author(s): Georgeos Hardo, Somenath Bakshi

Stochastic gene expression causes phenotypic heterogeneity in a population of genetically identical bacterial cells. Such non-genetic heterogeneity can have important consequences for population fitness, and cells therefore implement regulation strategies to either suppress or exploit such heterogeneity to adapt to their circumstances. By employing time-lapse microscopy of single cells, the fluctuation dynamics of gene expression may be analysed and their regulatory mechanisms thus deciphered. However, careful consideration of experimental design and data analysis is needed to produce data from which meaningful insights can be derived. In the present paper, the individual steps and challenges involved in a time-lapse experiment are discussed, and a rigorous framework for designing, performing, and extracting single-cell gene expression dynamics data from such experiments is outlined.
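One standard way to quantify fluctuation dynamics from single-cell traces is the normalized autocorrelation, whose decay time reflects the timescale of expression fluctuations. This is a generic analysis choice, not necessarily the authors' pipeline, and the trace below is synthetic:

```python
import numpy as np

def autocorrelation(trace):
    """Normalized autocorrelation of a single-cell expression time series;
    lag 0 is 1 by construction, and the decay encodes fluctuation timescales."""
    x = np.asarray(trace, dtype=float) - np.mean(trace)
    acf = np.correlate(x, x, mode="full")[x.size - 1:]
    return acf / acf[0]

# Example on a short synthetic trace with a period-4 oscillation.
acf = autocorrelation([1.0, 2.0, 3.0, 2.0, 1.0, 2.0, 3.0, 2.0, 1.0])
```

In practice the imaging interval and movie length must bracket this decay time, which is one reason the experimental design choices discussed above matter for the downstream analysis.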

