Effect of gravity curvature on large-scale atomic gravimeters

Author(s):  
Dorothee Tell ◽  
Étienne Wodey ◽  
Christian Meiners ◽  
Klaus H. Zipfel ◽  
Manuel Schilling ◽  
...  

In terrestrial geodesy, absolute gravimetry is a tool to observe geophysical processes over extended timescales. This requires measurement devices of high sensitivity and stability. Atom interferometers connect the free-fall motion of atomic ensembles to absolute frequency measurements and thus feature very high long-term stability. By extending their vertical baseline to several meters, we introduce Very Long Baseline Atom Interferometry (VLBAI) as a gravity reference of higher-order accuracy.

Using state-of-the-art vibration isolation, sensor fusion, and well-controlled atomic sources and environments on a 10 m baseline, we aim for an intrinsic sensitivity σ_g ≤ 5 nm/s² in a first scenario for our Hannover VLBAI facility. At this level, the effects of gravity gradients and curvature along the free-fall region need to be taken into account. We present gravity measurements along the baseline, in agreement with simulations using an advanced model of the building and surroundings [1]. Using this knowledge, we apply a perturbation-theory approach to calculate the resulting contribution to the atomic gravimeter uncertainty, as well as the effective instrumental height of the device depending on the interferometry scheme [2]. Based on these results, we will be able to compare gravity values with nearby absolute gravimeters and, as a first step, verify the performance of the VLBAI gravimeter at a level comparable to classical devices.

The Hannover VLBAI facility is a major research equipment funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation). This work was supported by the DFG Collaborative Research Center 1464 “TerraQ” (Project A02) and is supported by the CRC 1227 “DQ-mat” (Project B07), Germany’s Excellence Strategy EXC-2123 “QuantumFrontiers”, and the computing cluster of the Leibniz University Hannover under the patronage of the Lower Saxony Ministry of Science and Culture (MWK) and the DFG. We acknowledge support from “Niedersächsisches Vorab” through the “Quantum- and Nano-Metrology (QUANOMET)” initiative (Project QT3) and for initial funding of research in the DLR-SI institute, as well as funding from the German Federal Ministry of Education and Research (BMBF) through the funding program Photonics Research Germany.

[1] Schilling et al., “Gravity field modelling for the Hannover 10 m atom interferometer”, Journal of Geodesy 94, 122 (2020).
[2] Ufrecht and Giese, “Perturbative operator approach to high-precision light-pulse atom interferometry”, Physical Review A 101, 053615 (2020).
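The scale of such corrections can be sketched from the standard first-order gravity-gradient phase of a Mach-Zehnder atom interferometer. The following back-of-the-envelope Python sketch uses the textbook first-order expression (sign conventions vary in the literature) with purely illustrative parameters, not the actual Hannover VLBAI design values:

```python
# Illustrative estimate of the gravity-gradient bias in a Mach-Zehnder
# atom interferometer: Delta_phi ~ k*T^2*(g + gamma*(z0 + v0*T + 7/12*g*T^2)).
# Textbook first-order result (sign conventions vary); all parameters are
# assumptions for illustration, not the Hannover facility's actual values.

g = 9.81           # local gravity, m/s^2
gamma = 3.1e-6     # vertical gravity gradient, 1/s^2 (~free-air value)
T = 0.6            # pulse separation time, s
z0, v0 = 0.0, 0.0  # initial position/velocity relative to reference height

# Gradient-induced acceleration bias (phase correction divided by k*T^2)
bias = gamma * (z0 + v0 * T + (7.0 / 12.0) * g * T**2)
print(f"gradient bias: {bias*1e9:.0f} nm/s^2")  # ~6400 nm/s^2 >> 5 nm/s^2
```

Even with these modest numbers the gradient term exceeds the stated 5 nm/s² sensitivity goal by three orders of magnitude, which is why the gradient and curvature along the baseline must be measured and modelled rather than neglected.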

1999 ◽  
Vol 39 (4) ◽  
pp. 55-60 ◽  
Author(s):  
J. Alex ◽  
R. Tschepetzki ◽  
U. Jumar ◽  
F. Obenaus ◽  
K.-H. Rosenwinkel

Activated sludge models are widely used for the planning and optimisation of wastewater treatment plants, and online applications to support the operation of complex treatment plants are under development. A proper model is crucial for all of these applications. The task of parameter calibration has been the focus of several papers and applications. An essential precondition for this task is an appropriately defined model structure, which often receives much less attention. Different model structures for a large-scale treatment plant with circulation flow are discussed in this paper. A more systematic method to derive a suitable model structure is applied to this case, using the results of a numerical hydraulic model. The importance of these efforts is demonstrated by the high sensitivity of the simulation results to the selection of the model structure and the hydraulic conditions. Finally, it is shown that model calibration was possible only by adjusting to the hydraulic behaviour, without any changes to the biological parameters.
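The sensitivity to model structure can be made concrete with the classic tanks-in-series idealisation of reactor hydraulics: for identical total volume and kinetics, the predicted effluent changes markedly with the assumed number of tanks. A minimal Python sketch under assumed first-order removal kinetics (illustrative values, not the plant studied in the paper):

```python
# Steady-state effluent of N equal CSTRs in series with first-order removal.
# Same total residence time tau; only the structural assumption (N) varies.
# Illustrative numbers, not parameters of the plant studied in the paper.

def effluent(c_in, k, tau, n_tanks):
    """Effluent concentration for n_tanks equal tanks in series."""
    return c_in / (1.0 + k * tau / n_tanks) ** n_tanks

c_in, k, tau = 100.0, 0.5, 8.0  # mg/L, 1/h, h
for n in (1, 3, 10):
    print(f"N={n:2d}: effluent = {effluent(c_in, k, tau, n):.1f} mg/L")
# N=1: 20.0 mg/L; N=3: 7.9 mg/L; N=10: 3.5 mg/L -- identical kinetics,
# very different predictions purely from the assumed hydraulic structure.
```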


Diagnostics ◽  
2021 ◽  
Vol 11 (5) ◽  
pp. 869
Author(s):  
Amedeo De Nicolò ◽  
Valeria Avataneo ◽  
Jessica Cusato ◽  
Alice Palermiti ◽  
Jacopo Mula ◽  
...  

Recently, large-scale screening for COVID-19 has presented a major challenge, limiting timely countermeasures. The application of suitable rapid serological tests could therefore provide useful information; however, little evidence regarding their robustness is currently available. In this work, we evaluated and compared the analytical performance of a rapid lateral-flow assay (LFA) and a fast semiquantitative fluorescent immunoassay (FIA) for anti-nucleocapsid (anti-NC) antibodies, with the reverse transcriptase real-time PCR assay as the reference. In 222 patients, the LFA showed poor sensitivity (55.9%) within two weeks from PCR, while later testing was more reliable (sensitivity of 85.7% and specificity of 93.1%). Moreover, in a subset of 100 patients, the FIA showed high sensitivity (89.1%) and specificity (94.1%) after two weeks from PCR. Their coupled application for the screening of 183 patients showed satisfactory concordance (κ = 0.858). In conclusion, rapid serological tests were largely not useful for early diagnosis, but they showed good performance in later stages of infection. They could be useful for back-tracing and/or to identify potentially immune subjects.
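For reference, the figures of merit quoted above follow directly from a 2×2 confusion matrix against the PCR reference. A minimal Python sketch; the counts are invented to reproduce the later-testing sensitivity and specificity quoted above (the paper's raw counts are not given here):

```python
# Sensitivity, specificity, and Cohen's kappa from a 2x2 confusion matrix.
# Counts are illustrative (chosen to match the quoted 85.7%/93.1%),
# not the study's actual data.

tp, fn, fp, tn = 48, 8, 6, 81  # rapid test vs. PCR reference

sensitivity = tp / (tp + fn)   # 48/56 = 85.7%
specificity = tn / (tn + fp)   # 81/87 = 93.1%

n = tp + fn + fp + tn
p_observed = (tp + tn) / n
# Chance agreement: product of the marginal probabilities, summed per class
p_chance = ((tp + fn) * (tp + fp) + (tn + fp) * (tn + fn)) / n**2
kappa = (p_observed - p_chance) / (1 - p_chance)  # ~0.79 for these counts

print(f"sensitivity={sensitivity:.1%} specificity={specificity:.1%} kappa={kappa:.3f}")
```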


Author(s):  
A J Rigby ◽  
N Peretto ◽  
R Adam ◽  
P Ade ◽  
M Anderson ◽  
...  

Abstract Determining the mechanism by which high-mass stars are formed is essential for our understanding of the energy budget and chemical evolution of galaxies. Using the New IRAM KIDs Array 2 (NIKA2) camera on the Institut de Radioastronomie Millimétrique (IRAM) 30-m telescope, we have conducted high-sensitivity, large-scale mapping of a fraction of the Galactic plane in order to search for signatures of the transition between the high- and low-mass star-forming modes. Here, we present the first results from the Galactic Star Formation with NIKA2 (GASTON) project, a Large Programme at the IRAM 30-m telescope that is mapping ≈2 deg² of the inner Galactic plane (GP), centred on ℓ = 23.9°, b = 0.05°, as well as targets in Taurus and Ophiuchus, in 1.15 and 2.00 mm continuum wavebands. In this paper, we describe the first GASTON GP data taken and present initial science results. We conduct an extraction of structures from the 1.15 mm maps using a dendrogram analysis and, by comparison with the compact source catalogues from Herschel survey data, we identify a population of 321 previously undetected clumps. Approximately 80 per cent of these new clumps are 70 μm-quiet and may be considered starless candidates. We find that this new population of clumps is less massive and cooler, on average, than clumps that have already been identified. Further, by classifying the full sample of clumps based upon their infrared-bright fraction, an indicator of evolutionary stage, we find evidence for clump mass growth, supporting models of clump-fed high-mass star formation.
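The abstract does not name the extraction code; as a rough illustration of what a dendrogram-based extraction looks like in practice, the sketch below uses the open-source astrodendro package with placeholder noise thresholds (the actual GASTON parameters and pipeline are described in the paper itself):

```python
# Sketch of a dendrogram-based structure extraction from a continuum map
# using the open-source astrodendro package. The file name and the
# noise-based thresholds are placeholders, not the GASTON pipeline's values.
from astropy.io import fits
from astrodendro import Dendrogram

data = fits.getdata("map_1p15mm.fits")  # hypothetical 1.15 mm map
rms = 2.0e-3                            # assumed map noise, same units as data

d = Dendrogram.compute(
    data,
    min_value=3 * rms,  # ignore emission below 3 sigma
    min_delta=1 * rms,  # require 1 sigma contrast to split structures
    min_npix=20,        # discard structures smaller than roughly a beam
)

# Leaves of the dendrogram are the compact structures (clump candidates)
print(f"{len(d.leaves)} leaf structures extracted")
```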


2021 ◽  
Vol 7 (2) ◽  
pp. 18
Author(s):  
Germana Landi ◽  
Fabiana Zama ◽  
Villiam Bortolotti

This paper is concerned with the reconstruction of relaxation time distributions in Nuclear Magnetic Resonance (NMR) relaxometry. This is a large-scale and ill-posed inverse problem with many potential applications in biology, medicine, chemistry, and other disciplines. However, the large amount of data and the consequently long inversion times, together with the high sensitivity of the solution to the value of the regularization parameter, still represent a major obstacle to the applicability of NMR relaxometry. We present a method for two-dimensional data inversion (2DNMR) which combines Truncated Singular Value Decomposition (TSVD) and Tikhonov regularization in order to speed up the inversion and to reduce the sensitivity to the value of the regularization parameter. The Discrete Picard condition is used to jointly select the SVD truncation and Tikhonov regularization parameters. We evaluate the performance of the proposed method on both simulated and real NMR measurements.
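The combined TSVD-Tikhonov filter is easy to state in numpy; the sketch below shows the generic one-dimensional form of the idea (the paper's 2DNMR formulation and its Picard-based joint parameter selection are more involved):

```python
# Combined TSVD + Tikhonov solution of an ill-posed system K x = b:
# keep the first k singular components and damp them with parameter lam.
# Generic 1D illustration, not the paper's 2D (2DNMR) implementation.
import numpy as np

def tsvd_tikhonov(K, b, k, lam):
    U, s, Vt = np.linalg.svd(K, full_matrices=False)
    coeffs = U.T @ b                        # Picard coefficients u_i^T b
    filt = s[:k] / (s[:k] ** 2 + lam ** 2)  # Tikhonov filter on kept modes
    return Vt[:k].T @ (filt * coeffs[:k])

# The Discrete Picard condition guides the choice of k: keep modes while
# |u_i^T b| decays faster than the singular values s_i; beyond that index,
# the coefficients are noise-dominated and only amplify error if inverted.
```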


2015 ◽  
Author(s):  
Peter Weiland ◽  
Ina Dehnhard

See video of the presentation. The benefits of making research data permanently accessible through data archives are widely recognized: costs can be reduced by reusing existing data, research results can be compared and validated against results from archived studies, fraud can be detected more easily, and meta-analyses can be conducted. Apart from that, authors may gain recognition and reputation for producing the datasets. Since 2003, the accredited research data center PsychData (part of the Leibniz Institute for Psychology Information in Trier, Germany) has documented and archived research data from all areas of psychology and related fields. In the beginning, the main focus was on datasets that provide a high potential for reuse, e.g. longitudinal studies, large-scale cross-sectional studies, or studies that were conducted under historically unique conditions. Presently, more and more journal publishers and project funding agencies require researchers to archive their data and make them accessible to the scientific community. Therefore, PsychData also has to serve this need.

In this presentation we report on our experiences in operating a discipline-specific research data archive in a domain where data sharing is met with considerable resistance. We will focus on the challenges for data sharing and data reuse in psychology, e.g.:

- the large amount of domain-specific knowledge necessary for data curation
- high costs for documenting the data because of a wide range of non-standardized measures
- small teams and little established infrastructure compared with the "big data" disciplines
- studies in psychology not designed for reuse (in contrast to the social sciences)
- data protection
- resistance to sharing data

At the end of the presentation, we will provide a brief outlook on DataWiz, a new project funded by the German Research Foundation (DFG). In this project, tools will be developed to support researchers in documenting their data during the research phase.


2021 ◽  
Author(s):  
Lingfei Wang

Abstract Single-cell RNA sequencing (scRNA-seq) provides unprecedented technical and statistical potential to study gene regulation but is subject to technical variations and sparsity. Here we present Normalisr, a linear-model-based normalization and statistical hypothesis testing framework that unifies single-cell differential expression, co-expression, and CRISPR scRNA-seq screen analyses. By systematically detecting and removing nonlinear confounding from library size, Normalisr achieves high sensitivity, specificity, speed, and generalizability across multiple scRNA-seq protocols and experimental conditions with unbiased P-value estimation. We use Normalisr to reconstruct robust gene regulatory networks from the trans-effects of gRNAs in large-scale CRISPRi scRNA-seq screens and gene-level co-expression networks from conventional scRNA-seq.
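Stripped of Normalisr's specifics, the core linear-model idea can be sketched in a few lines: regress a function of library size out of the expression matrix, then test genes with standard linear-model statistics on the residuals. This is a conceptual sketch only, not Normalisr's actual API or algorithm:

```python
# Conceptual sketch of linear-model normalization and testing for scRNA-seq:
# remove library-size confounding by regression, then run a standard test.
# This is NOT Normalisr's API; it only illustrates the general idea.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_cells, n_genes = 500, 200
counts = rng.poisson(2.0, size=(n_cells, n_genes)).astype(float)

libsize = counts.sum(axis=1)
# Covariates: intercept plus log library size (a simple nonlinear transform)
X = np.column_stack([np.ones(n_cells), np.log(libsize)])

Y = np.log1p(counts)
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)  # fit per-gene linear models
residuals = Y - X @ beta                      # confound-removed expression

# Example test: differential expression between two made-up cell groups
group = rng.integers(0, 2, n_cells).astype(bool)
t, p = stats.ttest_ind(residuals[group], residuals[~group], axis=0)
print(f"smallest p-value across {n_genes} genes: {p.min():.3g}")
```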


1979 ◽  
Vol 88 (1) ◽  
pp. 56-65 ◽  
Author(s):  
Jack L. Paradise ◽  
Clyde G. Smith

As a test for detecting middle ear disease among preschool children, tympanometry — as opposed to audiometry — has three advantageous attributes: a high degree of sensitivity, minimal need for subject cooperation, and total objectivity. For these reasons interest has arisen in tympanometry as a method for screening, i.e., identifying children with previously undetected middle ear disease. However, uncertainty persists concerning the importance of detecting apparently asymptomatic middle ear effusions, and concerning optimal methods, or even the advisability, of treating them. Further, the sensitivity and specificity of tympanometry depend on how the pass-fail cutoff point is defined. Defining this cutoff point so as to achieve high sensitivity may result in excessively low specificity, with the production of large numbers of false-positives who then become overreferrals. Data are presented to show how the validity of the test may be increased to some extent by attention to the gradient of “negative-pressure” tympanograms. At the present time, given the various aforementioned uncertainties, and with adequate validation as to the presence or absence of disease often lacking in reported studies of impedance screening in preschool populations, the cumulative results of these studies do not warrant embarking on large-scale screening programs. What is needed instead is additional research to explore the issue further.
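The cutoff tradeoff described above is easy to make concrete: sweeping a pass-fail threshold across two overlapping score distributions shows sensitivity rising as specificity falls. A small Python sketch with synthetic distributions (purely illustrative, not tympanometric data):

```python
# Sensitivity/specificity tradeoff as a pass-fail cutoff is varied.
# Synthetic score distributions, purely to illustrate the cutoff effect.
import numpy as np

rng = np.random.default_rng(1)
diseased = rng.normal(-200, 80, 1000)  # e.g. middle-ear pressure scores
healthy = rng.normal(-30, 60, 1000)

for cutoff in (-150, -100, -50):
    sens = np.mean(diseased < cutoff)  # fraction of diseased flagged
    spec = np.mean(healthy >= cutoff)  # fraction of healthy passed
    print(f"cutoff {cutoff:4d}: sensitivity {sens:.2f}, specificity {spec:.2f}")
# Raising sensitivity by moving the cutoff toward zero costs specificity,
# producing the false-positive overreferrals discussed above.
```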


Author(s):  
Jiang Zhao ◽  
Jiahao Gui ◽  
Jinsong Luo ◽  
Jing Gao ◽  
Caidong Zheng ◽  
...  

Abstract Graphene-based pressure sensors have received extensive attention in wearable devices. However, reliable, low-cost, and large-scale preparation of structurally stable graphene electrodes for flexible pressure sensors is still a challenge. Herein, for the first time, laser-induced graphene (LIG) powder is prepared into a screen-printing ink, shape-controllable LIG patterned electrodes are obtained on various substrates using a facile screen-printing process, and a novel asymmetric pressure sensor composed of the resulting screen-printed LIG electrodes has been developed. Benefiting from the 3D porous structure of LIG, the as-prepared flexible LIG screen-printed asymmetric pressure sensor has excellent sensing properties, with a high sensitivity of 1.86 kPa⁻¹, a low detection limit of about 3.4 Pa, a short response time, and long cycle durability. This sensing performance allows the flexible asymmetric LIG screen-printed pressure sensor to detect tiny physiological movements (such as the wrist pulse and pronunciation actions) in real time. In addition, the integrated sensor array has a multi-touch function. This work points to an approach for designing shape-controllable LIG screen-printed patterned electrodes on various flexible substrates, meeting the compatibility and modular-integration needs of potential applications in wearable electronics.
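For context on the quoted figures of merit: pressure-sensor sensitivity is conventionally the relative signal change per unit pressure, S = (ΔR/R₀)/ΔP (or its current/capacitance analogue). A quick Python check of what the quoted numbers imply, assuming that conventional definition (an inference, not data from the paper):

```python
# What a sensitivity of 1.86 kPa^-1 implies at the quoted detection limit,
# assuming the conventional definition S = (delta_R / R0) / delta_P.
# This is an inference from the quoted figures, not data from the paper.

S = 1.86        # sensitivity, 1/kPa
P_min = 3.4e-3  # detection limit, kPa (3.4 Pa)

relative_change = S * P_min
print(f"relative signal change at the detection limit: {relative_change:.2%}")
# ~0.63%: the readout must resolve sub-percent changes of the baseline signal.
```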


2018 ◽  
pp. 1-12 ◽  
Author(s):  
Ashley Earles ◽  
Lin Liu ◽  
Ranier Bustamante ◽  
Pat Coke ◽  
Julie Lynch ◽  
...  

Purpose Cancer ascertainment using large-scale electronic health records is a challenge. Our aim was to propose and apply a structured approach for evaluating multiple candidate approaches for cancer ascertainment using colorectal cancer (CRC) ascertainment within the US Department of Veterans Affairs (VA) as a use case. Methods The proposed approach for evaluating cancer ascertainment strategies includes assessment of individual strategy performance, comparison of agreement across strategies, and review of discordant diagnoses. We applied this approach to compare three strategies for CRC ascertainment within the VA: administrative claims data consisting of International Classification of Diseases, Ninth Revision (ICD9) diagnosis codes; the VA Central Cancer Registry (VACCR); and the newly accessible Oncology Domain, consisting of cases abstracted by local cancer registrars. The study sample consisted of 1,839,043 veterans with index colonoscopy performed from 1999 to 2014. Strategy-specific performance was estimated based on manual record review of 100 candidate CRC cases and 100 colonoscopy controls. Strategies were further compared using Cohen’s κ and focused review of discordant CRC diagnoses. Results A total of 92,197 individuals met at least one CRC definition. All three strategies had high sensitivity and specificity for incident CRC. However, the ICD9-based strategy demonstrated poor positive predictive value (58%). VACCR and Oncology Domain had almost perfect agreement with each other (κ, 0.87) but only moderate agreement with ICD9-based diagnoses (κ, 0.51 and 0.57, respectively). Among discordant cases reviewed, 15% of ICD9-positive but VACCR- or Oncology Domain–negative cases had incident CRC. Conclusion Evaluating novel strategies for identifying cancer requires a structured approach, including validation against manual record review, agreement among candidate strategies, and focused review of discordant findings. Without careful assessment of ascertainment methods, analyses may be subject to bias and limited in clinical impact.
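The combination of high sensitivity and specificity with poor positive predictive value is exactly what Bayes' rule predicts when the condition is rare in the screened population. A quick Python illustration using assumed round performance numbers (not the study's exact estimates) and the cohort proportions quoted above:

```python
# Why high sensitivity/specificity can still yield a poor PPV at low
# prevalence (Bayes' rule). Performance numbers are assumed round values,
# not the study's estimates; prevalence uses the cohort counts above as a
# rough proxy (individuals meeting any CRC definition / colonoscopy cohort).

sens, spec = 0.95, 0.98
prevalence = 92_197 / 1_839_043  # ~5.0%

ppv = (sens * prevalence) / (sens * prevalence + (1 - spec) * (1 - prevalence))
print(f"prevalence {prevalence:.1%} -> PPV {ppv:.0%}")
# Even at 98% specificity, ~5% prevalence drives the PPV down to ~72%;
# a lower effective specificity of claims codes pushes it toward the 58% observed.
```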

