Sensitivity below the standard quantum limit in gravitational wave detectors with Michelson-Fabry-Perot readout
2008 ◽ Vol 77 (12) ◽ Author(s): J. Belfi, F. Marin

2017 ◽ Vol 13 (8) ◽ pp. 776-780 ◽ Author(s): Yiqiu Ma, Haixing Miao, Belinda Heyun Pang, Matthew Evans, Chunnong Zhao, et al.

Author(s): M. Heurs

Interferometric gravitational wave detectors (such as Advanced LIGO) employ high-power solid-state lasers to maximize their detection sensitivity and hence their reach into the universe. These sophisticated light sources are ultra-stabilized with regard to output power, emission frequency and beam geometry; this is crucial to obtain low detector noise. However, even when all laser noise is reduced as far as technically possible, the unavoidable quantum noise of the laser remains. This is a consequence of the Heisenberg uncertainty principle, the basis of quantum mechanics: it is fundamentally impossible to simultaneously reduce both the phase noise and the amplitude noise of a laser to arbitrarily low levels. This fact manifests in the detector noise budget as two distinct noise sources, photon shot noise and quantum radiation pressure noise, which together form a lower bound on current gravitational wave detector sensitivities: the standard quantum limit of interferometry. To overcome this limit, various techniques have been proposed, among them different uses of non-classical light and alternative interferometer topologies. This article explains how quantum noise enters and manifests in an interferometric gravitational wave detector, and gives an overview of some of the schemes proposed to overcome this seemingly fundamental limitation, all aimed at the goal of higher gravitational wave event detection rates. This article is part of a discussion meeting issue 'The promises of gravitational-wave astronomy'.
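To make the shot-noise/radiation-pressure trade-off concrete, the sketch below evaluates textbook simple-Michelson expressions for the two quantum-noise contributions and their envelope, the standard quantum limit (SQL). This is a minimal illustration, not the article's method or the full Advanced LIGO noise model: all parameter values (arm length L, mirror mass m, circulating power P, squeeze parameter r) are illustrative assumptions, and published conventions for these formulas differ by factors of order unity.

```python
import numpy as np

# A minimal numerical sketch of the two quantum-noise contributions named in
# the abstract, using simple textbook Michelson formulas for free test masses.
# Parameter values are illustrative (roughly advanced-detector-like), NOT taken
# from the article; conventions vary by O(1) factors between references.

hbar = 1.054571817e-34   # reduced Planck constant (J s)
c = 2.99792458e8         # speed of light (m/s)
lam = 1064e-9            # laser wavelength (m)
L = 4.0e3                # interferometer arm length (m)
m = 40.0                 # test-mass mirror mass (kg)
P = 800.0e3              # circulating arm power (W)
r = 0.7                  # squeeze parameter (10*log10(e^{2r}) ~ 6 dB)

f = np.logspace(0.5, 4, 500)   # 3 Hz .. 10 kHz
Omega = 2.0 * np.pi * f

# Photon shot noise PSD (strain^2/Hz): scales as 1/P, flat in frequency.
S_shot = hbar * c * lam / (2.0 * np.pi * P * L**2) * np.ones_like(f)

# Quantum radiation pressure noise PSD: scales as P, falls as 1/Omega^4.
S_rp = 2.0 * np.pi * hbar * P / (c * lam * m**2 * Omega**4 * L**2)

# Total quantum noise and the SQL. In this convention the SQL is the
# power-independent envelope min_P (S_shot + S_rp) = 2*hbar/(m*Omega^2*L^2),
# reached where the two contributions are equal.
S_total = S_shot + S_rp
S_sql = 2.0 * hbar / (m * Omega**2 * L**2)

# Fixed-quadrature squeezing only trades the two noises against each other:
# phase squeezing lowers shot noise by e^{-2r} but raises radiation pressure
# noise by e^{+2r}, so it cannot beat the SQL at all frequencies.
S_total_sqz = S_shot * np.exp(-2.0 * r) + S_rp * np.exp(2.0 * r)

# Report the strain ASDs (1/sqrt(Hz)) at a few frequencies.
for f0 in (10.0, 100.0, 1000.0):
    i = np.argmin(np.abs(f - f0))
    print(f"{f0:7.0f} Hz:  total {np.sqrt(S_total[i]):.2e}  "
          f"squeezed {np.sqrt(S_total_sqz[i]):.2e}  "
          f"SQL {np.sqrt(S_sql[i]):.2e}  (strain/sqrt(Hz))")
```

Because shot noise scales as 1/P and radiation pressure noise as P, raising the laser power only shifts the crossover frequency; their sum never drops below the SQL envelope. This is why the schemes surveyed in the article, such as frequency-dependent squeezing and alternative topologies, modify the quadrature trade-off itself rather than simply increasing power.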

