Effects of latency on estimates of the COVID-19 replication number

Author(s):  
Lorenzo Sadun

It is not currently known how long it takes a person infected by the COVID-19 virus to become infectious. Models of the spread of COVID-19 use very different lengths for this latency period, leading to very different estimates of the replication number R, even when models work from the same underlying data sets. In this paper we quantify how much varying the length of the latency period affects estimates of R, and thus the fraction of the population that is predicted to be infected in the first wave of the pandemic. This variation underscores the uncertainty in our understanding of R and raises the possibility that R may be considerably greater than has been assumed by those shaping public policy.
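A back-of-the-envelope illustration of why the assumed latency matters, using the textbook SEIR growth-rate relation R = (1 + r·T_E)(1 + r·T_I); the numbers below are hypothetical and this is not the paper's model or data:

def implied_R(r, latent_days, infectious_days):
    # Reproduction number implied by an observed exponential growth rate r (per day)
    # in a simple SEIR model with mean latent period T_E and mean infectious period T_I.
    return (1.0 + r * latent_days) * (1.0 + r * infectious_days)

if __name__ == "__main__":
    r = 0.25  # hypothetical early-epidemic growth rate, per day
    for latent in (2.0, 4.0, 6.0):  # candidate mean latent periods, days
        print(f"latent {latent:.0f} d -> R ~ {implied_R(r, latent, 5.0):.2f}")

With these illustrative numbers, moving the assumed latent period from 2 to 6 days raises the implied R from about 3.4 to about 5.6, which is the kind of sensitivity the abstract describes.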

The chapter focuses on geospatial data infrastructure. The mass of data needed for public policy planning can come from many sources. The chapter discusses participatory approaches for realizing open and interoperable systems and presents the geospatial data infrastructure (GDI) approach to address this issue: core data sets, standards, institutional and legal arrangements, technology, and capacity building. The environment in which the system is designed shapes the technological solution: the legal and institutional framework, compliance with standards, the availability of human resources, and financial sustainability. The chapter examines experiences at the international level to draw best practices for implementing national and thematic GDIs.


2013
Vol 47 (01)
pp. 165-172
Author(s):  
Gary King

Abstract The social sciences are undergoing a dramatic transformation from studying problems to solving them; from making do with a small number of sparse data sets to analyzing increasing quantities of diverse, highly informative data; from isolated scholars toiling away on their own to larger scale, collaborative, interdisciplinary, lab-style research teams; and from a purely academic pursuit focused inward to having a major impact on public policy, commerce and industry, other academic fields, and some of the major problems that affect individuals and societies. In the midst of all this productive chaos, we have been building the Institute for Quantitative Social Science at Harvard, a new type of center intended to help foster and respond to these broader developments. We offer here some suggestions from our experiences for the increasing number of other universities that have begun to build similar institutions and for how we might work together to advance social science more generally.


Author(s):  
Barbara Prainsack

Abstract Along with the proliferation of digital technologies and the datafication of wider areas of people’s bodies and lives, the meaning of Personalised Medicine has shifted. In contemporary Personalised and ‘Precision’ Medicine, openness typically features in terms of calls for data sharing to ensure the availability of the very data sets required for the personalisation of diagnosis, treatment, and prevention. But there are other, more fundamental ways of considering openness in the context of Personalised and Precision Medicine that set different goals for public policy: (1) in an ontological sense, pertaining to the openness of the category of the ‘person’ in Personalised and Precision Medicine; (2) in a pluralistic sense, regarding the plurality of personal and societal perspectives and values in healthcare; and (3) in an emancipatory sense, counteracting concentrations of power around corporate actors—including consumer tech companies—in the health domain. The enhancement of public benefit and social justice and the protection of privacy are key goals for public policy in this context.


2020
Vol 4 (1)
Author(s):  
Omar Isaac Asensio
Ximin Mi
Sameer Dharur

For a growing class of prediction problems, big data and machine learning (ML) analyses can greatly enhance our understanding of the effectiveness of public investments and public policy. However, the outputs of many ML models are often abstract and inaccessible to policy communities or the general public. In this article, we describe a hands-on teaching case that is suitable for use in a graduate or advanced undergraduate public policy, public affairs, or environmental studies classroom. Students will engage with increasingly popular ML classification algorithms and cloud-based data visualization tools to support policy and planning on the theme of electric vehicle mobility and connected infrastructure. By using these tools, students will critically evaluate and convert large, complex data sets into human-understandable visualizations for communication and decision making. The tools also give users with little technical background the flexibility to engage with streaming data sources in new, creative designs.
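A minimal sketch of the kind of classification step such an exercise might include, assuming hypothetical labeled charging-station reviews and a standard scikit-learn pipeline; this is illustrative only, not the authors' teaching materials:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical user reviews of EV charging stations, labeled 1 = reports a problem.
reviews = ["charger was broken again", "fast and easy to use",
           "station blocked by a gas car", "worked perfectly, good location"]
labels = [1, 0, 1, 0]

# TF-IDF features feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(reviews, labels)
print(model.predict(["screen frozen, could not start a session"]))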


2014
Vol 14 (4)
pp. 253-257
Author(s):  
Andy Williamson

Abstract Big data is more than the sum of its parts; it is about scale, but also about how we interconnect once-disparate data sets and mine and analyse them. It has started to make a significant impact everywhere from financial markets to supermarkets. And, as Andy Williamson explains, it is starting to become an important factor in the development of public policy and for the delivery and analysis of public services. Big data offers us powerful new ways to see the world around us, but this comes with challenges of ownership, privacy and misuse: it can be used for good or to constrain people. We have to ensure that, as the uptake of big data increases, our legislation and practice keep pace, ensuring that mistakes and misuse are prevented.


Author(s):  
John A. Hunt

Spectrum-imaging is a useful technique for comparing different processing methods on very large data sets that are identical for each method. This paper is concerned with comparing methods of electron energy-loss spectroscopy (EELS) quantitative analysis on the Al-Li system. The spectrum-image analyzed here was obtained from an Al-10at%Li foil aged to produce δ' precipitates that can span the foil thickness. Two 1024-channel EELS spectra offset in energy by 1 eV were recorded and stored at each pixel in the 80x80 spectrum-image (25 Mbytes). An energy range of 39-89 eV (20 channels/eV) is represented. During processing the spectra are either subtracted to create an artifact-corrected difference spectrum, or the energy offset is numerically removed and the spectra are added to create a normal spectrum. The spectrum-images are processed into 2D floating-point images using methods and software described in [1].
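A minimal numpy sketch of the two processing routes described above, assuming the 1 eV offset corresponds to 20 channels at 20 channels/eV; the array names and data are hypothetical and this is not the paper's software:

import numpy as np

channels_per_ev = 20
offset = 1 * channels_per_ev  # the two spectra are offset in energy by 1 eV

# Hypothetical pair of 1024-channel spectra recorded at one pixel.
spec_a = np.random.poisson(100, 1024).astype(float)
spec_b = np.random.poisson(100, 1024).astype(float)

# Route 1: subtract the offset spectra to obtain an artifact-corrected difference spectrum.
difference = spec_a - spec_b

# Route 2: numerically remove the offset, then add to obtain a normal spectrum
# (edge channels that shift out of range are simply dropped in this sketch).
normal = spec_a[:-offset] + spec_b[offset:]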


Author(s):  
Mark Ellisman
Maryann Martone
Gabriel Soto
Eliezer Masliah
David Hessler
...  

Structurally-oriented biologists examine cells, tissues, organelles and macromolecules in order to gain insight into cellular and molecular physiology by relating structure to function. The understanding of these structures can be greatly enhanced by the use of techniques for the visualization and quantitative analysis of three-dimensional structure. Three projects from current research activities will be presented in order to illustrate both the present capabilities of computer-aided techniques as well as their limitations and future possibilities. The first project concerns the three-dimensional reconstruction of the neuritic plaques found in the brains of patients with Alzheimer's disease. We have developed a software package, “Synu”, for investigation of 3D data sets, which has been used in conjunction with laser confocal light microscopy to study the structure of the neuritic plaque. Tissue sections of autopsy samples from patients with Alzheimer's disease were double-labeled for tau, a cytoskeletal marker for abnormal neurites, and synaptophysin, a marker of presynaptic terminals.


Author(s):  
Douglas L. Dorset

The quantitative use of electron diffraction intensity data for the determination of crystal structures represents the pioneering achievement in the electron crystallography of organic molecules, an effort largely begun by B. K. Vainshtein and his co-workers. However, despite numerous representative structure analyses yielding results consistent with X-ray determination, this entire effort was viewed with considerable mistrust by many crystallographers. This was no doubt due to the rather high crystallographic R-factors reported for some structures and, more importantly, the failure to convince many skeptics that the measured intensity data were adequate for ab initio structure determinations. We have recently demonstrated the utility of these data sets for structure analyses by direct phase determination based on the probabilistic estimate of three- and four-phase structure invariant sums. Examples include the structure of diketopiperazine using Vainshtein's 3D data, a similar 3D analysis of the room temperature structure of thiourea, and a zonal determination of the urea structure, the latter also based on data collected by the Moscow group.
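For reference, the standard Cochran-type estimate for a three-phase structure invariant, the kind of relation on which such direct phase determination rests, can be written (equal-atom case; this is textbook direct-methods theory, not a formula quoted from the paper):

\Phi_3 = \varphi_{\mathbf h} + \varphi_{\mathbf k} + \varphi_{-\mathbf h-\mathbf k},
\qquad
P(\Phi_3) \propto \exp\!\Bigl(\tfrac{2}{\sqrt{N}}\,\bigl|E_{\mathbf h}E_{\mathbf k}E_{-\mathbf h-\mathbf k}\bigr|\cos\Phi_3\Bigr)

Large normalized structure-factor magnitudes therefore make the invariant sum cluster near zero with high probability, which is what allows individual phases to be estimated directly from measured intensities.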


Author(s):  
W. Shain
H. Ancin
H.C. Craighead
M. Isaacson
L. Kam
...  

Neural prostheses have the potential to restore nervous system functions lost to trauma or disease. Nanofabrication extends this approach to implants for stimulating and recording from single or small groups of neurons in the spinal cord and brain; however, tissue compatibility is a major limitation to their practical application. We are using a cell culture method for quantitatively measuring cell attachment to surfaces designed for nanofabricated neural prostheses. Silicon wafer test surfaces composed of 50-μm bars separated by aliphatic regions were fabricated using methods similar to a procedure described by Kleinfeld et al. Test surfaces contained either a single or double positive charge/residue. Cyanine dyes (diIC18(3)) stained the background and cell membranes (Fig 1); however, identification of individual cells at higher densities was difficult (Fig 2). Nuclear staining with acriflavine allowed discrimination of individual cells and permitted automated counting of nuclei using 3-D data sets from the confocal microscope (Fig 3). For cell attachment assays, LRM55 astroglial cells and astrocytes in primary cell culture were plated at increasing cell densities on test substrates, incubated for 24 hr, fixed, stained, mounted on coverslips, and imaged with a 10x objective.
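A minimal sketch of the kind of automated nucleus counting described above, assuming a thresholded 3-D confocal stack and scipy's connected-component labeling; the data, threshold, and size cutoff are hypothetical, not the authors' pipeline:

import numpy as np
from scipy import ndimage

# Hypothetical 3-D confocal stack (z, y, x) of acriflavine-stained nuclei.
stack = np.random.randint(0, 256, size=(20, 256, 256)).astype(float)

# Threshold the stain, then label connected 3-D objects (6-connected by default).
binary = stack > 200
labels, n_objects = ndimage.label(binary)

# Discard objects below a minimum voxel count so specks are not counted as nuclei.
sizes = ndimage.sum(binary, labels, index=np.arange(1, n_objects + 1))
n_nuclei = int(np.count_nonzero(sizes > 50))
print(n_nuclei)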


Author(s):  
Thomas W. Shattuck
James R. Anderson
Neil W. Tindale
Peter R. Buseck

Individual particle analysis involves the study of tens of thousands of particles using automated scanning electron microscopy and elemental analysis by energy-dispersive X-ray emission spectroscopy (EDS). EDS produces large data sets that must be analyzed using multivariate statistical techniques. A complete study uses cluster analysis, discriminant analysis, and factor or principal components analysis (PCA). The three techniques are used in the study of particles sampled during the FeLine cruise to the mid-Pacific Ocean in the summer of 1990. The mid-Pacific aerosol provides information on long-range particle transport, iron deposition, sea salt ageing, and halogen chemistry. Aerosol particle data sets suffer from a number of difficulties for pattern recognition using cluster analysis. There is a great disparity in the number of observations per cluster and in the range of the variables in each cluster. The variables are not normally distributed, they are subject to considerable experimental error, and many values are zero because of finite detection limits. Many of the clusters show considerable overlap because of natural variability, agglomeration, and chemical reactivity.
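A minimal sketch of one way such a multivariate workflow can be set up, assuming a hypothetical particle-by-element matrix and scikit-learn; the log transform, component count, and cluster count are illustrative choices, not the authors':

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical matrix: rows = particles, columns = EDS element fractions.
rng = np.random.default_rng(0)
X = rng.random((500, 8))

# Log-transform and standardize so variables with large ranges (or many zeros)
# do not dominate, then reduce with PCA and cluster the component scores.
Z = StandardScaler().fit_transform(np.log1p(X))
scores = PCA(n_components=3).fit_transform(Z)
groups = KMeans(n_clusters=5, n_init=10).fit_predict(scores)
print(np.bincount(groups))  # particles per cluster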

