The Hubble Constant from Strongly Lensed Supernovae with Standardizable Magnifications

2022 ◽ Vol 924 (1) ◽ pp. 2 ◽ Author(s): Simon Birrer ◽ Suhail Dhawan ◽ Anowar J. Shajib

Abstract The dominant uncertainty in the current measurement of the Hubble constant (H0) with strong gravitational lensing time delays is attributed to uncertainties in the mass profiles of the main deflector galaxies. Strongly lensed supernovae (glSNe) can provide, in addition to measurable time delays, lensing magnification constraints when knowledge about the unlensed apparent brightness of the explosion is imposed. We present a hierarchical Bayesian framework to combine a data set of SNe that are not strongly lensed and a data set of strongly lensed SNe with measured time delays. We jointly constrain (i) H0, using the time delays as an absolute distance indicator, (ii) the lens model profiles, using the magnification ratio of lensed and unlensed fluxes at the population level, and (iii) the unlensed apparent magnitude distribution of the SN population and the redshift–luminosity relation of the relative expansion history of the universe. We apply our joint inference framework to a forecast data set of glSNe and find that a sample of 144 Type Ia glSNe with well-measured time series and imaging data will measure H0 to 1.5%. We discuss strategies to mitigate systematics associated with using absolute flux measurements of glSNe to constrain the mass density profiles. Using the magnification of SN images is a promising and complementary alternative to using stellar kinematics. Future surveys, such as the Rubin and Roman observatories, will be able to discover the necessary number of glSNe, and with additional follow-up observations, this methodology will provide precise constraints on mass profiles and H0.
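
To make the logic of the joint inference concrete, the toy sketch below (not the authors' pipeline; all numbers, the single internal mass-sheet parameter, and the simplified Gaussian likelihoods are illustrative assumptions) shows why standardizable magnifications break the degeneracy between H0 and the deflector mass profile: the time delays constrain only the combination lambda/H0, while the magnified, standardized magnitudes constrain lambda itself.

```python
# Toy sketch of the joint time-delay + magnification inference (illustrative only).
import numpy as np

rng = np.random.default_rng(1)

# --- mock "truth" ---
H0_true, lam_true = 70.0, 1.0                 # km/s/Mpc, internal mass-sheet factor
n_lens = 20
dphi = rng.uniform(0.2, 1.0, n_lens)          # Fermat-potential differences (toy units)
ddt_unit = rng.uniform(2000., 4000., n_lens)  # dimensionless D_dt * H0 / c per system
mu_model = rng.uniform(2., 8., n_lens)        # model-predicted macro magnifications

def time_delays(H0, lam):
    # dt = lam * (D_dt / c) * dphi, with D_dt proportional to 1/H0 (toy units)
    return lam * (ddt_unit / H0) * dphi

def magnified_mags(lam, m_unlensed=19.0):
    # an internal mass sheet rescales magnifications by 1/lam^2
    return m_unlensed - 2.5 * np.log10(mu_model / lam**2)

dt_obs = time_delays(H0_true, lam_true) + rng.normal(0, 1.0, n_lens)   # ~1-day errors
m_obs = magnified_mags(lam_true) + rng.normal(0, 0.12, n_lens)         # SN Ia-like scatter

def log_post(H0, lam):
    chi2_dt = np.sum((dt_obs - time_delays(H0, lam))**2 / 1.0**2)
    chi2_m = np.sum((m_obs - magnified_mags(lam))**2 / 0.12**2)
    return -0.5 * (chi2_dt + chi2_m)

# grid posterior over (H0, lambda)
H0_grid = np.linspace(60, 80, 201)
lam_grid = np.linspace(0.8, 1.2, 201)
lp = np.array([[log_post(h, l) for l in lam_grid] for h in H0_grid])
i, j = np.unravel_index(np.argmax(lp), lp.shape)
print(f"MAP: H0 = {H0_grid[i]:.1f} km/s/Mpc, lambda = {lam_grid[j]:.2f}")
```

Dropping the magnitude term from log_post leaves the H0–lambda direction unconstrained, which is exactly the degeneracy that the standardizable magnifications are meant to break.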

2006 ◽ Vol 2 (S235) ◽ pp. 345-349 ◽ Author(s): Roberto G. Abraham ◽ Patrick J. McCarthy ◽ Erin Mentuch ◽ Karl Glazebrook ◽ Preethi Nair ◽ ...

Abstract We have used the Hubble Space Telescope's Advanced Camera for Surveys to measure the mass density function of morphologically selected early-type galaxies in the Gemini Deep Deep Survey fields, over the redshift range 0.9 < z < 1.6. Our imaging data set covers four well-separated sight-lines, and is roughly intermediate (in terms of both depth and area) between the GOODS/GEMS imaging data and the images obtained in the Hubble Deep Field campaigns. Our images contain 144 galaxies with ultra-deep spectroscopy, and they have been analyzed using a new purpose-written morphological analysis code which improves the reliability of morphological classifications by adopting a ‘quasi-Petrosian’ image thresholding technique. We find that at z = 1 approximately 70% of the stars in massive galaxies reside in early-type systems. This fraction is remarkably similar to that seen in the local Universe. However, we detect very rapid evolution in this fraction over the range 1.0 < z < 1.6, suggesting that in this epoch the strong color-morphology relationship seen in the nearby Universe is beginning to fall into place.
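
As a rough illustration of the thresholding idea mentioned above, the sketch below implements one common reading of a ‘quasi-Petrosian’ pixel threshold: pixels are ranked by flux and kept while their flux exceeds a fraction eta of the mean flux of all brighter pixels. The value of eta, the ranking convention, and the toy image are assumptions, not the authors' implementation.

```python
# Sketch of quasi-Petrosian image thresholding (one plausible reading, not the paper's code).
import numpy as np

def quasi_petrosian_mask(image, eta=0.2):
    """Boolean mask selecting pixels above the quasi-Petrosian threshold."""
    flux = image.ravel()
    order = np.argsort(flux)[::-1]                      # brightest pixels first
    sorted_flux = flux[order]
    # mean flux of all pixels at least as bright as the current one
    cumulative_mean = np.cumsum(sorted_flux) / np.arange(1, flux.size + 1)
    keep_sorted = sorted_flux >= eta * cumulative_mean
    # the condition is monotonic in practice; keep everything up to the last True
    n_keep = np.max(np.nonzero(keep_sorted)[0]) + 1 if keep_sorted.any() else 0
    mask = np.zeros(flux.size, dtype=bool)
    mask[order[:n_keep]] = True
    return mask.reshape(image.shape)

# toy usage: a Gaussian blob on a noisy background
y, x = np.mgrid[-32:32, -32:32]
img = np.exp(-(x**2 + y**2) / (2 * 6.0**2)) + np.random.default_rng(0).normal(0, 0.02, (64, 64))
print("pixels kept:", quasi_petrosian_mask(img, eta=0.2).sum())
```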


2020 ◽ Vol 643 ◽ pp. A165 ◽ Author(s): S. Birrer ◽ A. J. Shajib ◽ A. Galan ◽ M. Millon ◽ T. Treu ◽ ...

The H0LiCOW collaboration inferred, via strong gravitational lensing time delays, a Hubble constant value of H0 = 73.3 (+1.7, −1.8) km s−1 Mpc−1, describing the deflector mass density profiles by either a power law or stars (constant mass-to-light ratio) plus standard dark matter halos. The mass-sheet transform (MST), which leaves the lensing observables unchanged, is considered the dominant source of residual uncertainty in H0. We quantify the potential effect of the MST with a flexible family of mass models that directly encodes it and is hence maximally degenerate with H0. Our calculation is based on a new hierarchical Bayesian approach in which the MST is constrained only by stellar kinematics. The approach is validated on mock lenses generated from hydrodynamic simulations. We first apply the inference to the TDCOSMO sample of seven lenses, six of which are from H0LiCOW, and measure H0 = 74.5 (+5.6, −6.1) km s−1 Mpc−1. Second, in order to further constrain the deflector mass density profiles, we add imaging and spectroscopy for a set of 33 strong gravitational lenses from the Sloan Lens ACS (SLACS) sample. For nine of the 33 SLACS lenses, we use resolved kinematics to constrain the stellar anisotropy. From the joint hierarchical analysis of the TDCOSMO+SLACS sample, we measure H0 = 67.4 (+4.1, −3.2) km s−1 Mpc−1. This measurement assumes that the TDCOSMO and SLACS galaxies are drawn from the same parent population. The blind H0LiCOW, TDCOSMO-only, and TDCOSMO+SLACS analyses are in mutual statistical agreement. The TDCOSMO+SLACS analysis prefers marginally shallower mass profiles than H0LiCOW or TDCOSMO-only. Without relying on the form of the mass density profile used by H0LiCOW, we achieve a ∼5% measurement of H0. While our new hierarchical analysis does not statistically invalidate the mass profile assumptions made by H0LiCOW – and thus the H0 measurement relying on them – it demonstrates the importance of understanding the mass density profile of elliptical galaxies. The uncertainties on H0 derived in this paper can be reduced by physical or observational priors on the form of the mass profile, or by additional data.
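
A minimal numeric sketch of the degeneracy described above (illustrative velocity-dispersion numbers, not TDCOSMO data): under an MST with parameter lambda the imaging observables are unchanged, predicted time delays scale by lambda (so the H0 inferred from fixed observed delays scales as lambda times the fiducial value), and the predicted stellar velocity dispersion scales as sqrt(lambda), which is why kinematics alone constrains lambda in the hierarchical approach.

```python
# Mass-sheet transform bookkeeping with made-up kinematic numbers (illustration only).
import numpy as np

H0_fiducial = 73.3                  # km/s/Mpc, value inferred assuming lambda = 1
sigma_pred = 250.0                  # km/s, model-predicted dispersion at lambda = 1 (hypothetical)
sigma_obs, sigma_err = 240.0, 15.0  # hypothetical measured dispersion and error

lam_grid = np.linspace(0.7, 1.2, 501)
# kinematics likelihood: sigma^2 scales with lambda, so sigma scales with sqrt(lambda)
loglike = -0.5 * ((sigma_obs - np.sqrt(lam_grid) * sigma_pred) / sigma_err) ** 2
post = np.exp(loglike - loglike.max())
post /= post.sum()

lam_mean = np.sum(lam_grid * post)
# for fixed observed delays, the inferred H0 scales as lambda * H0_fiducial
print(f"lambda ~ {lam_mean:.2f}  ->  H0 ~ {lam_mean * H0_fiducial:.1f} km/s/Mpc")
```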


2020 ◽ Vol 499 (4) ◽ pp. 5641-5652 ◽ Author(s): Georgios Vernardos ◽ Grigorios Tsagkatakis ◽ Yannis Pantazis

ABSTRACT Gravitational lensing is a powerful tool for constraining substructure in the mass distribution of galaxies, whether from the presence of dark matter sub-haloes or from physical mechanisms affecting the baryons throughout galaxy evolution. Such substructure is hard to model and is either ignored by traditional, smooth modelling approaches, or treated as well-localized massive perturbers. In this work, we propose a deep learning approach to quantify the statistical properties of such perturbations directly from images, where only the extended lensed source features within a mask are considered, without the need for any lens modelling. Our training data consist of mock lensed images assuming perturbing Gaussian Random Fields permeating the smooth overall lens potential, and, for the first time, using images of real galaxies as the lensed source. We employ a novel deep neural network that can handle arbitrary uncertainty intervals associated with the training data set labels as input, provides probability distributions as output, and adopts a composite loss function. The method succeeds not only in accurately estimating the actual parameter values, but also reduces the predicted confidence intervals by 10 per cent in an unsupervised manner, i.e. without having access to the actual ground truth values. Our results are invariant to the inherent degeneracy between mass perturbations in the lens and complex brightness profiles for the source. Hence, we can robustly quantify the smoothness of the mass density of thousands of lenses, including confidence intervals, and provide a consistent ranking for follow-up science.
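
The sketch below is one plausible reading, not the authors' architecture or loss, of a network that accepts interval-valued labels, outputs a probability distribution per parameter, and trains with a composite loss. The convolutional backbone, the Gaussian output head, and the specific way the label interval enters the loss are all assumptions for illustration.

```python
# Sketch: regression network with interval labels, probabilistic output, composite loss.
import torch
import torch.nn as nn

class IntervalRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        self.head = nn.Linear(32, 2)   # -> (mu, log_sigma) for one perturbation parameter

    def forward(self, x):
        mu, log_sigma = self.head(self.features(x)).unbind(dim=-1)
        return mu, log_sigma.exp()

def composite_loss(mu, sigma, y_lo, y_hi, width_weight=0.1):
    # fold the label interval into a Gaussian likelihood: target = interval midpoint,
    # label scatter = interval half-width, added in quadrature to the predicted sigma,
    # plus a small penalty that discourages overly wide predicted distributions
    y_mid = 0.5 * (y_lo + y_hi)
    s_lab = 0.5 * (y_hi - y_lo)
    var = sigma ** 2 + s_lab ** 2
    nll = 0.5 * (mu - y_mid) ** 2 / var + 0.5 * torch.log(var)
    return (nll + width_weight * sigma).mean()

# toy usage with random "lensed images" and interval-valued labels
model = IntervalRegressor()
x = torch.randn(8, 1, 64, 64)
y_lo = torch.rand(8)
y_hi = y_lo + 0.1 + 0.2 * torch.rand(8)
mu, sigma = model(x)
loss = composite_loss(mu, sigma, y_lo, y_hi)
loss.backward()
print(float(loss))
```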


2021 ◽ Vol 12 (1) ◽ Author(s): Elmar Kotter ◽ Luis Marti-Bonmati ◽ Adrian P. Brady ◽ Nandita M. Desouza

Abstract Blockchain can be thought of as a distributed database that allows tracing the origin of data and who has manipulated a given data set in the past. Medical applications of blockchain technology are emerging. Blockchain has many potential applications in medical imaging, typically making use of the tracking of radiological or clinical data. Clinical applications of blockchain technology include documenting the contributions of different “authors”, including AI algorithms, to multipart reports; documenting the use of AI algorithms towards the diagnosis; enhancing the accessibility of relevant information in electronic medical records; and giving users better control over their personal health records. Applications of blockchain in research include better traceability of image data within clinical trials, better traceability of the contributions of image and annotation data to the training of AI algorithms (thus enhancing privacy and fairness), and potentially making imaging data for AI available in larger quantities. Blockchain also allows for dynamic consenting and has the potential to empower patients by giving them better control over who has accessed their health data. There are also many potential applications of blockchain technology for administrative purposes, such as keeping track of learning achievements or the surveillance of medical devices. This article gives a brief introduction to the basic technology and terminology of blockchain and concentrates on its potential applications in medical imaging.
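
As a bare-bones illustration of the provenance idea described above (a toy hash chain, not a production blockchain or any specific medical-imaging platform; the actors and actions are made up), each record of who did what to a data set carries the hash of the previous record, so later tampering with any entry breaks the chain:

```python
# Toy hash chain for data-provenance records (illustration only).
import hashlib, json, time

def make_block(prev_hash, payload):
    block = {"prev": prev_hash, "time": time.time(), "payload": payload}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    for prev, cur in zip(chain, chain[1:]):
        body = {k: v for k, v in cur.items() if k != "hash"}
        ok = (cur["prev"] == prev["hash"] and
              cur["hash"] == hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest())
        if not ok:
            return False
    return True

chain = [make_block("genesis", {"actor": "scanner-01", "action": "acquired study X"})]
chain.append(make_block(chain[-1]["hash"], {"actor": "ai-algo-v2", "action": "segmented study X"}))
chain.append(make_block(chain[-1]["hash"], {"actor": "radiologist-A", "action": "signed report"}))
print("chain valid:", verify(chain))
chain[1]["payload"]["actor"] = "someone-else"   # tamper with an intermediate record
print("after tampering:", verify(chain))
```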


2018 ◽ Author(s): Uwe Berger ◽ Gerd Baumgarten ◽ Jens Fiedler ◽ Franz-Josef Lübken

Abstract. In this paper we present a new description of the statistical probability density functions (pdfs) of Polar Mesospheric Clouds (PMC) and noctilucent clouds (NLC). The analysis is based on observations of maximum backscatter, ice mass density, ice particle radius, and number density of ice particles measured by the ALOMAR RMR lidar for all NLC seasons from 2002 to 2016. From this data set we derive a new class of pdfs that describes the statistics of PMC/NLC events and differs from previous statistical methods, which used an exponential distribution commonly named the g-distribution. The new analysis successfully describes the probability statistics of the ALOMAR lidar data. It turns out that the former g-function description is a special case of our new approach. In general, the new statistical function can be applied to many different PMC parameters, e.g. maximum backscatter, integrated backscatter, ice mass density, ice water content, ice particle radius, ice particle number density, or albedo measured by satellites. As a main advantage, the new method makes it possible to connect different observational PMC distributions from lidar and satellite data, and also to compare them with distributions from ice model studies. In particular, the statistical distributions of different ice parameters can be compared with each other on the basis of a common assessment, which facilitates, for example, trend analyses of PMC/NLC.
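
To make the "special case" relation concrete, the sketch below fits mock maximum-backscatter values with an exponential (the g-distribution) and with a Weibull distribution, used here purely as a stand-in generalized family whose shape = 1 case reduces to the exponential; the paper's actual distribution family is not reproduced here, and the data are synthetic.

```python
# Illustrative fit of an exponential g-distribution vs. a stand-in generalization (Weibull).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
backscatter = rng.weibull(0.8, 2000) * 5.0     # mock PMC maximum-backscatter values

# exponential (g-distribution): one free scale parameter
loc_e, scale_e = stats.expon.fit(backscatter, floc=0.0)
# generalized fit: Weibull with free shape c; c = 1 reduces to the exponential
c, loc_w, scale_w = stats.weibull_min.fit(backscatter, floc=0.0)

ll_exp = np.sum(stats.expon.logpdf(backscatter, 0.0, scale_e))
ll_wei = np.sum(stats.weibull_min.logpdf(backscatter, c, 0.0, scale_w))
print(f"exponential: scale = {scale_e:.2f}, log-likelihood = {ll_exp:.1f}")
print(f"weibull: shape = {c:.2f} (1 = exponential), log-likelihood = {ll_wei:.1f}")
```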


2018 ◽ Vol 617 ◽ pp. A140 ◽ Author(s): Olivier Wertz ◽ Bastian Orthen ◽ Peter Schneider

The central ambition of modern time-delay cosmography is to determine the Hubble constant H0 with competitive precision. However, the tension with the H0 obtained from the Planck satellite for a spatially flat ΛCDM cosmology suggests that systematic errors may have been underestimated. The most critical of these errors probably comes from the degeneracy between lens models that was first formalized by the well-known mass-sheet transformation (MST). In this paper, we assess to what extent the source position transformation (SPT), a more general invariance transformation which contains the MST as a special case, may affect the time delays predicted by a model. To this aim, we have used pySPT, a new open-source python package fully dedicated to the SPT that we present in a companion paper. For axisymmetric lenses, we find that the time delay ratios between a model and its SPT-modified counterpart simply scale like the corresponding source position ratios, Δt̂/Δt ≈ β̂/β, regardless of the mass profile and the isotropic SPT. Similar behavior (almost) holds for non-axisymmetric lenses in the double-image regime and for opposite image pairs in the quadruple-image regime. In the latter regime, we also confirm that the time delay ratios are not conserved. Beyond the MST effects, the SPT-modified time delays deviate in general by no more than a few percent for particular image pairs, suggesting that the impact of the SPT on time-delay cosmography may not be as crucial as initially suspected. We also reflect upon the relevance of the SPT validity criterion and present arguments suggesting that it should be reconsidered. Even though a new validity criterion would affect the time delays in a different way, we expect from numerical simulations that our conclusions will remain unchanged.
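
The scaling quoted above can be checked directly for the MST special case with a singular isothermal sphere; the short sketch below (illustrative parameter values, not output of pySPT) shows that the Fermat-potential difference between the two images scales by exactly λ while the source position scales to λβ, so Δt̂/Δt = β̂/β.

```python
# Numerical check of dt_hat/dt = beta_hat/beta for a mass-sheet transform of an SIS lens.
import numpy as np

theta_E, beta, lam = 1.0, 0.3, 0.85

def fermat(theta, beta_s, lam_s):
    # MST-transformed potential: psi_hat = lam*psi + (1 - lam)*theta^2/2, psi = theta_E*|theta| (SIS)
    psi = lam_s * theta_E * np.abs(theta) + (1.0 - lam_s) * 0.5 * theta**2
    return 0.5 * (theta - beta_s) ** 2 - psi

# SIS images for a source at beta; the MST leaves the image positions unchanged
imgs = np.array([beta + theta_E, beta - theta_E])
dtau = np.abs(np.diff(fermat(imgs, beta, 1.0)))[0]            # original lens, source beta
dtau_hat = np.abs(np.diff(fermat(imgs, lam * beta, lam)))[0]  # MST lens, source lam*beta

print(f"dt_hat/dt = {dtau_hat / dtau:.3f}, beta_hat/beta = {lam:.3f}")
```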


2015 ◽ Vol 456 (1) ◽ pp. 739-755 ◽ Author(s): Dandan Xu ◽ Dominique Sluse ◽ Peter Schneider ◽ Volker Springel ◽ Mark Vogelsberger ◽ ...

2018 ◽ Author(s): PierGianLuca Porta Mana ◽ Claudia Bachmann ◽ Abigail Morrison

Automated classification methods for disease diagnosis are currently in the limelight, especially for imaging data. Classification does not fully meet a clinician's needs, however: in order to combine the results of multiple tests and decide on a course of treatment, a clinician needs the likelihood of a given health condition rather than the binary classification yielded by such methods. We illustrate how likelihoods can be derived step by step from first principles and approximations, and how they can be assessed and selected, using as a working example fMRI data from a publicly available data set containing schizophrenic and healthy control subjects. We start from the basic assumption of partial exchangeability, and then bring in the notion of sufficient statistics and the "method of translation" (Edgeworth, 1898) combined with conjugate priors. This method can be used to construct a likelihood that can be used to compare different data-reduction algorithms. Despite the simplifications and possibly unrealistic assumptions used to illustrate the method, we obtain classification results comparable to previous, more realistic studies of schizophrenia, while yielding likelihoods that can naturally be combined with the results of other diagnostic tests.
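
A minimal sketch of that pipeline as we read it (our illustration, not the paper's code): a per-subject summary statistic is log-transformed as a stand-in for the "method of translation", each class is modelled as a Gaussian with a normal-inverse-gamma conjugate prior, and a new subject is scored with the resulting Student-t posterior predictive, giving a likelihood ratio that can be combined with other tests. The prior hyperparameters and the mock data are assumptions.

```python
# Conjugate-prior likelihoods for two diagnostic classes (illustration only).
import numpy as np
from scipy import stats

def posterior_predictive(train, m0=0.0, k0=1.0, a0=1.0, b0=1.0):
    """Student-t posterior predictive for a Gaussian model with NIG(m0, k0, a0, b0) prior."""
    x = np.log(train)                      # "translate" the statistic toward normality
    n, xbar, ss = x.size, x.mean(), np.sum((x - x.mean()) ** 2)
    kn, an = k0 + n, a0 + n / 2.0
    mn = (k0 * m0 + n * xbar) / kn
    bn = b0 + 0.5 * ss + k0 * n * (xbar - m0) ** 2 / (2.0 * kn)
    scale = np.sqrt(bn * (kn + 1.0) / (an * kn))
    return stats.t(df=2.0 * an, loc=mn, scale=scale)

rng = np.random.default_rng(3)
controls = rng.lognormal(0.0, 0.3, 40)     # mock reduced fMRI statistic, healthy controls
patients = rng.lognormal(0.4, 0.3, 40)     # mock statistic, patients
new_subject = 1.3

lik_c = posterior_predictive(controls).pdf(np.log(new_subject))
lik_p = posterior_predictive(patients).pdf(np.log(new_subject))
print(f"likelihood ratio (patient : control) = {lik_p / lik_c:.2f}")
```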

