The effects of the universal metering programme on water consumption, welfare and equity

2019
Author(s): Carmine Ornaghi, Mirco Tonin

Abstract: There is consensus that meters are necessary for the promotion of efficient water usage. However, the available evidence on the benefits and costs of metering is scant, and often based on small samples. We use data from the first large-scale compulsory metering programme in England to study its impact on consumption, social efficiency and distributional outcomes. We find a decrease in consumption of 22% following meter installation, considerably higher than the value assumed as a policy target. This result implies that, overall, the benefits of metering outweigh its costs. We also document large heterogeneity in response, with many households showing low sensitivity to the new tariff. This novel finding suggests that selective metering, where only more price-sensitive households receive meters, would deliver even higher social welfare. Looking at distributional effects, we find similar reductions in consumption across income groups, although only high-income households gain financially from the new tariff.
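
The welfare claim rests on simple arithmetic: the value of the water no longer supplied must exceed the annualised cost of installing and reading meters. A minimal sketch of that break-even logic follows; the 22% reduction is the figure reported in the abstract, while every cost and consumption value (and the function name itself) is an illustrative assumption, not a number from the paper.

```python
# Back-of-envelope benefit/cost check for household metering.
# All parameter values are illustrative assumptions, not figures from the paper.

def metering_net_benefit(baseline_use_m3_per_year=150.0,   # assumed household use
                         reduction=0.22,                    # reduction reported in the abstract
                         resource_cost_per_m3=1.5,          # assumed long-run supply cost (per m3)
                         meter_cost=250.0,                  # assumed installed meter cost
                         meter_life_years=15,
                         annual_reading_cost=10.0):         # assumed admin/reading cost per year
    """Return the assumed annual net benefit per metered household."""
    water_saved = baseline_use_m3_per_year * reduction          # m3 avoided per year
    benefit = water_saved * resource_cost_per_m3                # value of avoided supply
    cost = meter_cost / meter_life_years + annual_reading_cost  # annualised metering cost
    return benefit - cost

if __name__ == "__main__":
    print(f"Assumed net benefit: {metering_net_benefit():.2f} per household per year")
```

Under these assumed numbers the benefit side dominates, but the sign of the result clearly depends on the demand response, which is why the heterogeneity finding (and the case for selective metering) matters.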

2019
Vol 61 (1)
pp. 5-13
Author(s): Loretta Lees

Abstract: Gentrification is no longer, if it ever was, a small-scale process of urban transformation. Globally, gentrification is more often practised as large-scale urban redevelopment. It is state-led or state-induced. The results are clear: the displacement and disenfranchisement of low-income groups in favour of wealthier in-movers. So why has gentrification come to dominate policy-making worldwide, and what can be done about it?


2018
Author(s): Matthias May, Kira Rehfeld

Greenhouse gas emissions must be cut to limit global warming to 1.5-2 °C above preindustrial levels. Yet the rate of decarbonisation is currently too low to achieve this. Policy-relevant scenarios therefore rely on the permanent removal of CO₂ from the atmosphere. However, none of the envisaged technologies has demonstrated scalability to the decarbonisation targets for the year 2050. In this analysis, we show that artificial photosynthesis for CO₂ reduction may deliver an efficient large-scale carbon sink. This technology has mainly been developed towards solar fuels, and its potential for negative emissions has been largely overlooked. With high efficiency and low sensitivity to high temperature and illumination conditions, it could, if developed into a mature technology, present a viable approach to filling the gap in the negative emissions budget.
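
The scalability argument is ultimately an energy-balance calculation: insolation times conversion efficiency, divided by the energy needed to fix a mole of CO₂, gives an areal drawdown rate, which in turn sets the land area required for a given removal target. The sketch below works through that arithmetic with round illustrative numbers; none of the parameter values are taken from the paper.

```python
# Order-of-magnitude area estimate for artificial-photosynthesis CO2 removal.
# All numbers are illustrative assumptions chosen for round arithmetic.

INSOLATION_GJ_PER_M2_YR = 7.0      # assumed annual solar input in a sunny region
EFFICIENCY = 0.10                  # assumed solar-to-chemical conversion efficiency
ENERGY_PER_MOL_CO2_KJ = 480.0      # assumed energy stored per mole of CO2 fixed
CO2_MOLAR_MASS_KG = 0.044
REMOVAL_TARGET_GT_PER_YR = 10.0    # assumed negative-emissions requirement

chemical_energy_kj = INSOLATION_GJ_PER_M2_YR * 1e6 * EFFICIENCY   # kJ per m2 per year
mol_co2_per_m2 = chemical_energy_kj / ENERGY_PER_MOL_CO2_KJ
kg_co2_per_m2 = mol_co2_per_m2 * CO2_MOLAR_MASS_KG                 # roughly 64 kg CO2/m2/yr here

area_m2 = REMOVAL_TARGET_GT_PER_YR * 1e12 / kg_co2_per_m2
print(f"Drawdown: {kg_co2_per_m2:.0f} kg CO2/m2/yr -> area of order {area_m2 / 1e6:.0f} km2")
```

With these assumed inputs the required area comes out in the hundreds of thousands of square kilometres, which is the kind of number the efficiency argument is meant to bring down relative to lower-efficiency biological sinks.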


2011
Vol 6 (2)
pp. 252-277
Author(s): Stephen T. Ziliak

Abstract: Student's exacting theory of errors, both random and real, marked a significant advance over the ambiguous reports of plant life and fermentation asserted by chemists from Priestley and Lavoisier down to Pasteur and Johannsen, working at the Carlsberg Laboratory. One reason seems to be that William Sealy Gosset (1876–1937), aka "Student" (he of Student's t-table and test of statistical significance), rejected artificial rules about sample size, experimental design, and the level of significance, and took instead an economic approach to the logic of decisions made under uncertainty. In his job as Apprentice Brewer, Head Experimental Brewer, and finally Head Brewer of Guinness, Student produced small samples of experimental barley, malt, and hops, seeking guidance for industrial quality control and maximum expected profit at the large-scale brewery. In the process Student invented or inspired half of modern statistics. This article draws on original archival evidence, shedding light on several core yet neglected aspects of Student's methods, that is, Guinnessometrics, not discussed by Ronald A. Fisher (1890–1962). The focus is on Student's small-sample, economic approach to real error minimization, particularly in the field and laboratory experiments he conducted on barley and malt from 1904 to 1937. Balanced designs of experiments, he found, are more efficient than randomized ones and have higher power to detect large and real treatment differences in a series of repeated and independent experiments. Student's world-class achievement poses a challenge to every science. Should statistical methods (such as the choice of sample size, experimental design, and level of significance) follow the purpose of the experiment, rather than the other way around? (JEL classification codes: C10, C90, C93, L66)
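
Student's point about balanced designs can be illustrated with a small Monte Carlo comparison: when experimental units (plots, malts, brews) share common variation, pairing treatment and control within each unit detects a real difference far more often than treating the two samples as independent. The sketch below is a modern simulation under assumed effect and noise sizes; it is not a reconstruction of Gosset's barley data.

```python
# Monte Carlo comparison of a balanced (paired) design vs. an unpaired design
# for detecting a real treatment difference with small samples.
# Effect size, noise levels and sample size are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_plots, true_effect, plot_sd, noise_sd = 8, 1.0, 2.0, 1.0
alpha, n_sims = 0.05, 5000

hits_paired = hits_unpaired = 0
for _ in range(n_sims):
    plot_level = rng.normal(0.0, plot_sd, n_plots)          # shared plot-to-plot variation
    control = plot_level + rng.normal(0.0, noise_sd, n_plots)
    treated = plot_level + true_effect + rng.normal(0.0, noise_sd, n_plots)
    hits_paired += stats.ttest_rel(treated, control).pvalue < alpha
    hits_unpaired += stats.ttest_ind(treated, control).pvalue < alpha

print(f"balanced (paired) power:      {hits_paired / n_sims:.2f}")
print(f"unpaired (independent) power: {hits_unpaired / n_sims:.2f}")
```

Because the paired comparison removes the shared plot-level variation before testing, the same eight observations yield substantially higher power, which is the economic logic of Guinnessometrics in miniature.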


Author(s): A. Eroshkin, M. Petrov

The economic and innovative rise of the developing states has stimulated a deep restructuring of the existing system of international relations in the science and technology sphere. As the article points out, one of the main manifestations of this trend is the transformation of the global innovation strategies of transnational corporations. The world's largest TNCs, mostly based in the industrial nations, have begun to transfer growing segments of their R&D programs to the developing countries in order to take advantage of their increased research capacity. As a result, the nature of the projects implemented there by the TNCs is changing. Historically, the TNCs' local R&D activities were adaptive in nature: the emphasis was on modifying the products and services the TNCs offered globally to the specifics of local markets. Currently, a growing number of transnational corporations are implementing large-scale programs in the developing countries aimed at designing new types of products, including those targeted at the low-income groups of consumers that make up the bulk of the population there.


2001
Vol 43 (10)
pp. 287-294
Author(s): S. Hills, A. Smith, P. Hardy, R. Birks

Thames Water is working with the New Millennium Experience Company to provide a water recycling system for the Millennium Dome which will supply 500 m³/d of reclaimed water for WC and urinal flushing. The system will treat water from three sources: rainwater from the Dome roof, greywater from handbasins in the toilet blocks, and groundwater from beneath the Dome site. The treatment technologies will range from "natural" reedbeds for the rainwater to more sophisticated options, including biological aerated filters and membranes for the greywater and groundwater. Pilot-scale trials were used to design the optimum configuration. In addition to the recycling system, water-efficient devices will be installed in three of the core toilet blocks as part of a programme of research into the effectiveness of conservation measures. Data on water usage and customer behaviour will be collected via a comprehensive metering system. Information from the Dome project on the economics and efficiency of on-site recycling at large scale, together with data on water-efficient devices and customer perception and behaviour, will be of great value to the water industry. For Thames Water, the project provides vital input to the development of future water resource strategies.


2019
Author(s): Eduard Klapwijk, Wouter van den Bos, Christian K. Tamnes, Nora Maria Raschle, Kathryn L. Mills

Many workflows and tools that aim to increase the reproducibility and replicability of research findings have been suggested. In this review, we discuss the opportunities that these efforts offer for the field of developmental cognitive neuroscience, in particular developmental neuroimaging. We focus on issues broadly related to statistical power and to flexibility and transparency in data analyses. Critical considerations relating to statistical power include challenges in the recruitment and testing of young populations, how to increase the value of studies with small samples, and the opportunities and challenges of working with large-scale datasets. Developmental studies involve further challenges, such as choices about age groupings, lifespan modelling, analyses of longitudinal change, and data that can be processed and analyzed in a multitude of ways. Flexibility in data acquisition, analysis and description may thereby greatly impact results. We discuss methods for improving transparency in developmental neuroimaging and how preregistration can improve methodological rigor. While outlining challenges and issues that may arise before, during, and after data collection, we highlight solutions and resources that help overcome some of them. Since the number of useful tools and techniques is ever-growing, we emphasize that many practices can be implemented stepwise.
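
One of the power considerations raised above, namely how large a sample is needed to detect a plausible brain-behaviour association, can be made concrete with a short calculation. The sketch below uses the standard Fisher-z approximation for a two-sided correlation test; the effect sizes looped over are illustrative assumptions, not recommendations from the review.

```python
# Approximate sample size needed to detect a correlation r with a two-sided
# test at a given alpha and power (Fisher-z approximation).
# The effect sizes below are illustrative, not estimates from the literature.
from math import atanh, ceil
from scipy.stats import norm

def n_for_correlation(r, alpha=0.05, power=0.80):
    """n is approximately ((z_{1-alpha/2} + z_{power}) / atanh(r))**2 + 3."""
    z_crit = norm.ppf(1 - alpha / 2)
    z_pow = norm.ppf(power)
    return ceil(((z_crit + z_pow) / atanh(r)) ** 2 + 3)

for r in (0.1, 0.2, 0.3, 0.5):          # assumed brain-behaviour effect sizes
    print(f"r = {r:.1f} -> n of about {n_for_correlation(r)}")
```

The steep rise in required n as the assumed effect shrinks is exactly why pooling across sites and re-using large-scale datasets features so prominently in these reproducibility discussions.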


2018
Vol 11 (2)
pp. 205979911878774 ◽  
Author(s): Mark Finnane, Andy Kaladelfos, Alana Piper

Historical data pose a variety of problems for those who seek statistically based understandings of the past. Quantitative historical analysis has been limited by researchers' reliance on rigid statistics collected by individuals or agencies, or else by access to only small samples of raw data. Even digital technologies by themselves have not been enough to overcome the challenges of working with manuscript sources and aligning disaggregated data. However, by coupling the facilities enabled by the web with the enthusiasm of the public for explorations of the past, history has started to make the same strides towards big data evident in other fields. While the use of citizens to crowdsource research data was pioneered within the sciences, a number of projects have similarly begun to draw on the help of citizen historians. This article explores the particular example of the Prosecution Project, which since 2014 has been using crowdsourced volunteers in a research collaboration to build a large-scale relational database of criminal prosecutions throughout Australia from the early 1800s to the 1960s. The article outlines the opportunities and challenges faced by projects seeking to use web technologies to access, store and re-use historical data in an environment that increasingly enables creative collaborations between researchers and other users of social and historical data.
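
The core data structure described here is relational: prosecution records transcribed by volunteers, linked to defendants, courts and offences so they can be queried at scale. A minimal sketch of what such a schema could look like follows, using SQLite; the table and column names are hypothetical illustrations, not the Prosecution Project's actual database design.

```python
# Hypothetical, minimal relational schema for crowdsourced prosecution records.
# Table and column names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE court (
    court_id     INTEGER PRIMARY KEY,
    name         TEXT NOT NULL,
    jurisdiction TEXT                  -- e.g. colony or state
);
CREATE TABLE defendant (
    defendant_id INTEGER PRIMARY KEY,
    family_name  TEXT,
    given_names  TEXT,
    birth_year   INTEGER
);
CREATE TABLE prosecution (
    prosecution_id INTEGER PRIMARY KEY,
    defendant_id   INTEGER REFERENCES defendant(defendant_id),
    court_id       INTEGER REFERENCES court(court_id),
    offence        TEXT,
    trial_date     TEXT,               -- ISO date string
    verdict        TEXT,
    transcribed_by TEXT                -- volunteer identifier, for quality checks
);
""")

# Example query: count prosecutions per offence in one court.
rows = conn.execute("""
    SELECT offence, COUNT(*) FROM prosecution
    WHERE court_id = ? GROUP BY offence ORDER BY COUNT(*) DESC
""", (1,)).fetchall()
```

Keeping a transcriber identifier on each record is one simple way such projects can reconcile or audit crowdsourced entries, which is part of the quality-control challenge the article discusses.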


2020
Vol 20 (3)
pp. 1301-1316
Author(s): Georgia Sotiropoulou, Sylvia Sullivan, Julien Savre, Gary Lloyd, Thomas Lachlan-Cope, et al.

Abstract: In situ measurements of Arctic clouds frequently show that ice crystal number concentrations (ICNCs) are much higher than the number of available ice-nucleating particles (INPs), suggesting that secondary ice production (SIP) may be active. Here we use a Lagrangian parcel model (LPM) and a large-eddy simulation (LES) to investigate the impact of three SIP mechanisms (rime splintering, break-up from ice–ice collisions and drop shattering) on a summer Arctic stratocumulus case observed during the Aerosol-Cloud Coupling And Climate Interactions in the Arctic (ACCACIA) campaign. Primary ice alone cannot explain the observed ICNCs, and drop shattering is ineffective in the examined conditions. Only the combination of rime splintering (RS) and collisional break-up (BR) can explain the observed ICNCs, since both of these mechanisms are weak when activated alone. In contrast to RS, BR is currently not represented in large-scale models; however, our results indicate that it may also be a critical ice-multiplication mechanism. In general, we find low sensitivity of the ICNCs to the assumed INP and cloud condensation nuclei (CCN) conditions, and also to the choice of BR parameterization. Finally, we show that a simplified treatment of SIP, using an LPM constrained by an LES and/or observations, provides a realistic yet computationally efficient way to study SIP effects on clouds. This method can eventually serve as a way to parameterize SIP processes in large-scale models.
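
The secondary-ice argument boils down to source terms added to the parcel's ice-number budget: splinters produced as crystals rime (roughly linear in the existing ice number) and fragments from ice-ice collisions (roughly quadratic, since two ice particles must meet). The toy integration below only illustrates how two individually modest sources can reinforce each other; the rate expressions and constants are deliberately simplified assumptions and are not the parameterizations evaluated in the study.

```python
# Toy ice-number budget illustrating, qualitatively, why rime splintering (RS)
# and collisional break-up (BR) can be weak alone yet effective together.
# All rate constants are simplified assumptions, not the study's parameterizations.
import numpy as np

def integrate_icnc(rs_on, br_on, n0=0.5, dt=1.0, t_end=3600.0,
                   primary=2e-5, rs_coeff=2e-4, br_coeff=2e-4):
    """Euler-integrate dN/dt = primary + rs*N + br*N^2 (N in L^-1, t in s)."""
    n = n0
    for _ in np.arange(0.0, t_end, dt):
        n += dt * (primary + rs_on * rs_coeff * n + br_on * br_coeff * n * n)
    return n

for label, rs, br in (("primary only", 0, 0), ("RS only", 1, 0),
                      ("BR only", 0, 1), ("RS + BR", 1, 1)):
    print(f"{label:12s}: {integrate_icnc(rs, br):5.2f} per litre after 1 h")
```

With these assumed coefficients the combined run ends well above the sum of the individual enhancements, because RS raises the ice number enough for the quadratic BR term to become significant, which mirrors the qualitative interplay described in the abstract.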

