Evaluating the impact of free private well testing outreach on participants' private well stewardship in New Jersey

Author(s):  
Alecia Seliga
Steven E. Spayd
Nicholas A. Procopio
Sara V. Flanagan
Jessie A. Gleason

Abstract Over 1 million people in New Jersey (NJ) are estimated to receive drinking water from private wells. The most commonly detected contaminants in NJ private well water are naturally occurring arsenic and gross alpha (8.3% and 10.9%, respectively). Between 2015 and 2018, three free and voluntary private well testing events tested a total of 571 at-risk wells, and 226 (40%) were identified as having one or more contaminants exceeding drinking water standards. Participants were invited to complete a survey evaluating household characteristics, participant experience, and private well stewardship behavior patterns. Of 529 delivered surveys, 211 (40%) were completed. Among respondents, 63% reported plans to test their private wells in the future. Among households whose wells failed, 45% reported taking mitigative action in response to the event, either by installing a water treatment system or by switching to bottled water. The survey evaluation identified previous knowledge of well contamination risks and discussing test results with a third party as important factors promoting self-reported stewardship behavior. The evaluation provides guidance for outreach organizers developing effective testing events and draws on the private well owners' experience of the outreach events to identify 'best practices' and improvements for future programs.

2015, Vol 6 (1), pp. 142-150
Author(s):
A. van Geen
K. M. Ahmed
E. B. Ahmed
I. Choudhury
M. R. Mozumder
...

Community wells that extend deeper than most private wells are crucial for reducing exposure to groundwater arsenic (As) in rural Bangladesh. This study evaluates the impact of 915 such intermediate (90–150 m) and deep (>150 m) wells on access to safe drinking water across a 180 km2 area where a total of 48,790 tubewells were tested with field kits in 2012–13. Half of the shallow private wells meet the Bangladesh standard of 50 μg/L for As in drinking water, whereas 92% of the intermediate and deep wells meet the more restrictive World Health Organization guideline of 10 μg/L. As a proxy for water access, distance calculations show that 29% of shallow wells with >50 μg/L As are located within walking distance (100 m) of at least one of the 915 intermediate or deep wells. Similar calculations for a hypothetical, more even distribution of deep wells show that 74% of shallow wells with >50 μg/L As could have been located within 100 m of the same number of deep wells. These observations and well-usage data suggest that community wells in Araihazar, and probably elsewhere in Bangladesh, were not optimally allocated by the government because of elite capture.
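The 100 m walking-distance proxy described above reduces to a nearest-neighbour distance check. A minimal sketch, assuming planar coordinates in metres; the coordinates below are illustrative, not the study's well locations:

```python
import math

def fraction_within(unsafe_wells, safe_wells, radius_m=100.0):
    """Fraction of unsafe shallow wells that have at least one safe
    (intermediate/deep) well within radius_m metres."""
    covered = 0
    for ux, uy in unsafe_wells:
        if any(math.hypot(ux - sx, uy - sy) <= radius_m
               for sx, sy in safe_wells):
            covered += 1
    return covered / len(unsafe_wells)

# Hypothetical coordinates in metres, not the study data.
unsafe = [(0, 0), (500, 500), (90, 0), (1000, 1000)]
deep = [(50, 0), (950, 1000)]
print(fraction_within(unsafe, deep))  # 0.75: three of four within 100 m
```

Real well coordinates are in latitude/longitude, so an actual analysis would use a geodesic distance rather than the planar distance assumed here.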


2020, Vol 2020, pp. 1-12
Author(s):
Agune Ashole Alto
Wanzahun Godana
Genet Gedamu

Background. Diarrheal diseases are still one of the major causes of morbidity in under-five children in sub-Saharan Africa. In Ethiopia, diarrhea is responsible for 9% of all deaths and is the major cause of under-five mortality. Objective. To assess the impact of community-led total sanitation and hygiene (CLTSH) on the prevalence of diarrheal disease and associated factors among under-five children in Gamo Gofa Zone. Methods. A community-based comparative cross-sectional study design was used to compare the impact of the CLTSH intervention on under-five diarrheal disease. A multistage sampling method was employed. The data were collected using pretested structured questionnaires. Data quality was ensured through daily supervision and checks for completeness and consistency. The data were coded, entered, and cleaned using Epi Info version 7 and analyzed using SPSS version 20. Bivariate and multivariable analyses were carried out using binary logistic regression. Significance was declared using a p value of <0.05 and AORs with 95% confidence intervals. Results. The response rate of this study was 93.3%. The overall diarrhea prevalence was 27.5% (CI = (24.06, 30.97)): 18.9% (CI = (14.94, 23.2)) in implemented woredas and 36.2% (CI = (30.41, 41.59)) in nonimplemented woredas. Child age of 12–23 months (AOR = 1.6) or greater than 24 months (AOR = 5), availability of handwashing facilities (AOR = 4), disposal of waste in an open field (AOR = 9.7), an unimproved source of drinking water (AOR = 6.5), using only water for handwashing (AOR = 6), starting complementary feeding before 6 months (AOR = 5.6) or after 6 months (AOR = 5.2), and bottle-feeding (AOR = 3.9) were the factors positively associated with diarrhea. Conclusion. The overall prevalence of under-five diarrhea was 27.5%. The prevalence was lower in CLTSH-implemented woredas than in non-CLTSH woredas.
The study showed that handwashing facilities, using only water for handwashing, open refuse disposal, and an unimproved source of drinking water had statistically significant associations with diarrhea occurrence among under-five children in CLTSH-nonimplemented areas. Integrated efforts are needed from the Ministry of Health together with the WASH Project to improve drinking water, handwashing facilities, and solid waste disposal practices.
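The bivariate step behind odds ratios such as those reported above can be illustrated with a crude OR and Wald 95% CI computed from a 2×2 table. A minimal sketch with made-up counts, not the study's data (the study's reported AORs come from multivariable logistic regression, which additionally adjusts for covariates):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: diarrhea vs. an exposure such as open refuse disposal.
or_, lo, hi = odds_ratio_ci(40, 60, 20, 80)
print(f"OR={or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR=2.67 (95% CI 1.42-5.02)
```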


Author(s):  
Lily N Edwards-Callaway
M Caitlin Cramer
Caitlin N Cadaret
Elizabeth J Bigler
Terry E Engle
...  

ABSTRACT Shade is a mechanism to reduce heat load, providing cattle with an environment supportive of their welfare needs. Although heat stress has been extensively reviewed, researched, and addressed in dairy production systems, it has not been investigated in the same manner across the beef cattle supply chain. Like all animals, beef cattle are susceptible to heat stress if they are unable to dissipate heat during times of elevated ambient temperatures. Many factors impact heat stress susceptibility in beef cattle throughout the different supply chain sectors, many of which relate to the production system, e.g., availability of shade, microclimate of the environment, and nutrition management. The results of studies evaluating the effects of shade on production and welfare are difficult to compare due to variation in structural design, construction materials, height, shape, and area of shade provided. Additionally, depending on operation location, shade may or may not be beneficial during all times of the year, which can influence the decision to make shade a permanent part of management systems. Shade has been shown to lessen the physiologic response of cattle to heat stress: shaded cattle exhibit lower respiration rates, body temperatures, and panting scores compared to un-shaded cattle in weather that increases the risk of heat stress. Results from studies investigating the provision of shade indicate that cattle seek shade in hot weather. The impact of shade on behavioral patterns is inconsistent in the current body of research, with some studies indicating that shade provision affects behavior and others reporting no difference between shaded and un-shaded groups. Analysis of performance and carcass characteristics across feedlot studies demonstrated that shaded cattle had increased ADG, HCW, and dressing percentage and improved feed efficiency compared to cattle without shade.
Despite the documented benefits of shade, current industry statistics, although severely limited in scope, indicate low shade implementation rates in feedlots, and data for other supply chain sectors do not exist. Industry guidelines and third-party on-farm certification programs articulate the critical need for protection from extreme weather but are not consistent in providing specific recommendations and requirements. Future efforts should include: updated economic analyses of the costs versus benefits of shade implementation, exploration of producer perspectives and needs relative to shade, consideration of shade impacts in the cow-calf and slaughter plant segments of the supply chain, and integration of indicators of affective (mental) state and preference into research studies to enhance the holistic assessment of cattle welfare.


Water, 2021, Vol 13 (16), pp. 2159
Author(s):
George Bennett
Jill Van Reybrouck
Ceven Shemsanga
Mary Kisaka
Ines Tomašek
...  

This study characterises high-fluoride groundwater in the aquifer system on the flanks of Mount Meru, focusing on parts of the flanks that were only partially covered, or not covered at all, by previous research. Additionally, we analyse the impact of rainwater recharge on groundwater chemistry by monitoring spring discharges during water sampling. The results show that the main groundwater type in the study area is NaHCO3 alkaline groundwater (average pH = 7.8). High F− values were recorded: in 175 groundwater samples, concentrations ranged from 0.15 to 301 mg/L (mean: 21.89 mg/L, median: 9.67 mg/L), with 91% of the samples containing F− above the WHO health-based guideline for drinking water (1.5 mg/L) and 39% containing Na+ above the WHO taste-based guideline of 200 mg/L. The temporal variability in F− concentrations between seasons is due to the impact of local groundwater recharge. We recommend a detailed ecohydrological study of the low-fluoride springs in the high-altitude recharge areas on the eastern and northwestern flanks of Mount Meru inside Arusha National Park. These springs are extracted for drinking purposes, and such a study is required to manage them and any potential enhanced exploitation so as to ensure the sustainability of this water extraction practice. Another strategy for obtaining safe drinking water could be a large-scale filtering system to remove F− from the groundwater.
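The summary statistics reported above (mean, median, and percentage of samples exceeding a guideline value) reduce to a threshold count over the measured concentrations. A minimal sketch over an illustrative sample list, not the 175 study measurements:

```python
from statistics import mean, median

def exceedance_summary(values, threshold):
    """Mean, median, and percentage of samples above a threshold."""
    above = sum(v > threshold for v in values)
    return mean(values), median(values), 100.0 * above / len(values)

# Hypothetical F- concentrations in mg/L, not the study data.
samples = [0.8, 2.1, 9.7, 21.9, 45.0, 1.2, 3.3, 150.0]
m, med, pct = exceedance_summary(samples, threshold=1.5)  # WHO guideline
print(f"mean={m:.2f}, median={med:.2f}, {pct:.0f}% above 1.5 mg/L")
# mean=29.25, median=6.50, 75% above 1.5 mg/L
```

Note that, as in the study's data, the mean sits well above the median because a few very high concentrations skew the distribution.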


2021, Vol 108 (Supplement_2)
Author(s):
H Mistry
B Woolner
A John

Abstract Introduction Open abdominal surgery confers a potentially greater risk of surgical site infection, and local evidence suggests the use of drains can reduce this. Our objectives were to assess local rates of, and risk factors for, surgical site infection, and whether the use of drains can reduce infection rates. Method We retrospectively reviewed patients who underwent laparotomy or open cholecystectomy between 01/01/2018 and 31/12/2018. Data were collected from inpatient and online patient records on demographics; smoking/alcohol status; heart, respiratory, or renal disease; diabetes; steroid use; CEPOD status; drain use; and infection outcome. Results 84 patients were included, of whom 25 had drains inserted. There were 13 documented cases of surgical site infection, none of whom had a drain post-op. Other parameters most prevalent among patients with a surgical site infection included being a current/ex-smoker (8/13), having heart disease (9/13), and undergoing an elective procedure. Conclusions Reducing the risk of surgical site infection can improve morbidity and potentially mortality outcomes. Our audit data suggest a benefit of inserting intra-abdominal or subcutaneous drains. We will create a standard operating procedure for all patients to receive drains post-op and then re-audit to assess the impact on infection rates.


BMJ Open, 2021, Vol 11 (1), pp. e042140
Author(s):
Vanessa J Apea
Yize I Wan
Rageshri Dhairyawan
Zudin A Puthucheary
Rupert M Pearse
...  

Objective: To describe outcomes within different ethnic groups of a cohort of hospitalised patients with confirmed COVID-19 infection, and to quantify and describe the impact of a number of prognostic factors, including frailty and inflammatory markers. Setting: Five acute National Health Service hospitals in east London. Design: Prospectively defined observational study using registry data. Participants: 1737 patients aged 16 years or over admitted to hospital with confirmed COVID-19 infection between 1 January and 13 May 2020. Main outcome measures: The primary outcome was 30-day mortality from the time of first hospital admission with a COVID-19 diagnosis during or prior to admission. Secondary outcomes were 90-day mortality, intensive care unit (ICU) admission, ICU and hospital length of stay, and type and duration of organ support. Multivariable survival analyses were adjusted for potential confounders. Results: 1737 patients were included in our analysis, of whom 511 (29%) had died by day 30. 538 (31%) were from Asian, 340 (20%) from black, and 707 (40%) from white backgrounds. Compared with white patients, those from minority ethnic backgrounds were younger, with differing comorbidity profiles and less frailty. Asian and black patients were more likely to be admitted to the ICU and to receive invasive ventilation (OR 1.54 (95% CI 1.06 to 2.23); p=0.023 and OR 1.80 (95% CI 1.20 to 2.71); p=0.005, respectively). After adjustment for age and sex, patients from Asian (HR 1.49 (95% CI 1.19 to 1.86); p<0.001) and black (HR 1.30 (95% CI 1.02 to 1.65); p=0.036) backgrounds were more likely to die. These findings persisted across a range of risk-factor-adjusted analyses accounting for major comorbidities, obesity, smoking, frailty, and ABO blood group. Conclusions: Patients from Asian and black backgrounds had higher mortality from COVID-19 infection despite controlling for all previously identified confounders and frailty. Higher rates of invasive ventilation indicate greater acute disease severity.
Our analyses suggest that patients of Asian and black backgrounds suffered disproportionate rates of premature death from COVID-19.


2021, Vol 10 (8), pp. 1551
Author(s):
Marta Bodro
Frederic Cofan
Jose Ríos
Sabina Herrera
Laura Linares
...  

In the context of the coronavirus disease 2019 (COVID-19) pandemic, we aimed to evaluate the impact of anti-cytokine therapies (AT) in kidney transplant recipients requiring hospitalization due to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection. This is an observational retrospective study of patients from March to May 2020. Inverse probability of treatment weighting, based on a propensity score for receiving AT, was used in all statistical analyses, and we applied a bootstrap procedure to estimate the 2.5th and 97.5th percentiles of the odds ratio (OR). Outcomes were measured using an ordinal scale determination (OSD). A total of 33 kidney recipients required hospitalization, and 54% of them received at least one AT, mainly tocilizumab (42%), followed by anakinra (12%). There was no statistically significant difference in intensive care unit (ICU) admission, secondary respiratory infections (35% vs. 7%), or mortality (16% vs. 13%) between patients who received AT and those who did not. Nevertheless, patients who received AT had better outcomes during hospitalization in terms of OSD ≥5 (OR 0.31; 2.5th, 97.5th percentiles 0.10, 0.72). These analyses indicate, as a plausible hypothesis, that the use of AT in kidney transplant recipients presenting with COVID-19 could be beneficial, although multicenter randomized controlled trials using these therapies in transplant patients are needed.
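The bootstrap percentile interval for an OR, as used above, amounts to resampling the cohort with replacement, recomputing the OR each time, and taking the 2.5th and 97.5th percentiles of the resulting distribution. A minimal sketch with hypothetical binary outcomes; it omits the study's propensity-score weighting, and the data and group sizes are invented:

```python
import random

def boot_or_percentiles(treated, control, n_boot=2000, seed=1):
    """Bootstrap 2.5th/97.5th percentiles of the odds ratio for a
    binary outcome between two groups of 0/1 outcome indicators."""
    rng = random.Random(seed)
    ors = []
    for _ in range(n_boot):
        t = [rng.choice(treated) for _ in treated]   # resample treated
        c = [rng.choice(control) for _ in control]   # resample control
        a = sum(t) + 0.5                 # +0.5 continuity correction
        b = len(t) - sum(t) + 0.5        # avoids zero-cell resamples
        c1 = sum(c) + 0.5
        d = len(c) - sum(c) + 0.5
        ors.append((a * d) / (b * c1))
    ors.sort()
    return ors[int(0.025 * n_boot)], ors[int(0.975 * n_boot)]

# Hypothetical outcomes (1 = OSD >= 5), not the study's patient data.
treated = [1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
control = [1, 1, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0]
lo, hi = boot_or_percentiles(treated, control)
print(f"OR percentile interval: ({lo:.2f}, {hi:.2f})")
```

An interval lying entirely below 1, as in the study's reported (0.10, 0.72), is what supports a protective association.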

