Comparing predisaster mitigation grant spending with postdisaster assistance spending: Are mitigation investments saving federal dollars?

2020 ◽  
Vol 18 (4) ◽  
pp. 349-354
Author(s):  
Katharina Renken, PhD ◽  
Andrea M. Jackman, PhD ◽  
Mario G. Beruvides, PhD, PE

This work is a companion paper to “Quantifying the Relationship Between Predisaster Mitigation Spending and Major Disaster Declarations for US States and Territories.” Mitigation is a relatively new undertaking within United States disaster policy, especially for local jurisdictions. The Disaster Mitigation Act of 2000 (DMA 2000) requires local jurisdictions to plan for and implement mitigative strategies in order to access federal grant funding options for emergency management. After DMA 2000 went into effect in the mid-2000s, a supporting study by the Multi-Hazard Mitigation Council (MMC 2005) found that, on average, mitigation projects yielded a benefit-cost ratio of 4:1 at the local level.1 This paper evaluates and compares predisaster mitigation spending and postdisaster assistance spending at the state and FEMA Regional levels, hypothesizing that as mitigation spending increases, postdisaster spending should decrease. The results, however, indicate the opposite, with most states showing increases in both types of spending over time.
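A minimal sketch, assuming a tidy table of annual spending by state, of the kind of state-level check the abstract describes; the file and column names below are hypothetical placeholders, not the authors' data.

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical input: one row per state-year with predisaster mitigation
# spending and postdisaster assistance spending in dollars.
df = pd.read_csv("state_spending.csv")

for state, grp in df.groupby("state"):
    r, p = pearsonr(grp["mitigation_spending"], grp["postdisaster_spending"])
    # The hypothesis predicts r < 0; the reported finding is that both series
    # rise over time, so r tends to come out positive instead.
    print(f"{state}: r = {r:.2f} (p = {p:.3f})")
```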

2020 ◽  
Vol 18 (4) ◽  
pp. 341-347
Author(s):  
Katharina Renken, PhD ◽  
Andrea M. Jackman, PhD ◽  
Mario G. Beruvides, PhD, PE

Since the Stafford Act of 1988, the process of obtaining a formal Major Disaster Declaration has been codified for national implementation, with tasks defined from the smallest levels of local government up to the President. The Disaster Mitigation Act of 2000 (DMA 2000) placed additional requirements on local governments to plan for mitigation activities within their jurisdictions. The goal of DMA 2000 was not only to implement more mitigative actions at the local level, but also to initiate a process by which local governments could set up ongoing conversations and collaborative efforts with neighboring jurisdictions to ensure continuous, proactive measures were taken against the impacts of disasters. Given the increased attention paid to mitigation and planning activities, a reasonable expectation would be a decline in the number of major disaster declarations since DMA 2000. However, simple correlation analysis shows that the number of major disaster declarations has continued to increase since DMA 2000. This article is intended as a preliminary study to encourage more detailed future analysis of the impacts of federal policy on local-level disaster prevention.
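A hedged illustration of the simple correlation analysis mentioned above, assuming annual counts of major disaster declarations are available; the loading of FEMA's declaration records is omitted, and the function name is an assumption for this sketch.

```python
import numpy as np
from scipy.stats import pearsonr

def declaration_trend(years: np.ndarray, counts: np.ndarray) -> tuple[float, float]:
    """Pearson correlation between calendar year and annual declaration count."""
    r, p = pearsonr(years, counts)
    return float(r), float(p)

# Usage (counts would be tallied from FEMA's declaration summaries, not shown):
# r, p = declaration_trend(np.arange(2001, 2020), counts)
# The abstract reports that declarations keep increasing post-DMA 2000, i.e. r > 0.
```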


2016 ◽  
Vol 131 (4) ◽  
pp. 1795-1848 ◽  
Author(s):  
Patrick Kline ◽  
Christopher R. Walters

Abstract We use data from the Head Start Impact Study (HSIS) to evaluate the cost-effectiveness of Head Start, the largest early childhood education program in the United States. Head Start draws roughly a third of its participants from competing preschool programs, many of which receive public funds. We show that accounting for the fiscal impacts of such program substitution pushes estimates of Head Start’s benefit-cost ratio well above one under a wide range of assumptions on the structure of the market for preschool services and the dollar value of test score gains. To parse the program’s test score impacts relative to home care and competing preschools, we selection-correct test scores in each care environment using excluded interactions between experimental assignments and household characteristics. We find that Head Start generates larger test score gains for children who would not otherwise attend preschool and for children who are less likely to participate in the program.
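A back-of-the-envelope sketch, with made-up numbers rather than HSIS estimates, of the substitution adjustment the abstract describes: children drawn away from other publicly funded preschools reduce Head Start's net fiscal cost, which pushes the benefit-cost ratio upward.

```python
def adjusted_bcr(benefit: float, gross_cost: float,
                 share_substituting: float, alt_public_cost: float) -> float:
    """Benefit-cost ratio after netting out savings to competing public programs."""
    net_cost = gross_cost - share_substituting * alt_public_cost
    return benefit / net_cost

# Illustrative values only: roughly a third of participants substitute away
# from another publicly funded preschool of comparable cost per child.
print(adjusted_bcr(benefit=9000, gross_cost=8000,
                   share_substituting=1/3, alt_public_cost=7500))  # ~1.64 vs ~1.13 unadjusted
```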


2018 ◽  
Vol 33 (1) ◽  
pp. 95-105 ◽  
Author(s):  
Debalin Sarangi ◽  
Amit J. Jhala

Abstract Due to depressed corn and soybean prices over the last few years in the United States, growers in Nebraska are showing interest in no-tillage (hereafter referred to as no-till) conventional (non–genetically engineered [non-GE]) soybean production. Due to the increasing number of herbicide-resistant weeds in the United States, weed control in no-till non-GE soybean using POST herbicides is a challenge. The objectives of this study were to compare PRE-only, PRE followed by (fb) POST, and PRE fb POST with residual (POST-WR) herbicide programs for Palmer amaranth and velvetleaf control and soybean injury and yield, as well as to estimate the gross profit margins and benefit–cost ratio of herbicide programs. A field experiment was conducted in 2016 and 2017 at Clay Center, NE. The PRE herbicides tested in this study resulted in ≥95% Palmer amaranth and velvetleaf control at 28 d after PRE (DAPRE). Averaged across the programs, the PRE-only program controlled Palmer amaranth 66%, whereas 86% and 97% control was obtained with the PRE fb POST and PRE fb POST-WR programs, respectively, at 28 d after POST (DAPOST). At 28 DAPOST, the PRE fb POST herbicide programs controlled velvetleaf 94%, whereas the PRE-only program resulted in 85% control. Mixing soil-residual herbicides with foliar-active POST programs did not improve velvetleaf control. Averaged across herbicide programs, PRE fb POST programs increased soybean yield by 10% and 41% in 2016 and 2017, respectively, over the PRE-only programs. Moreover, PRE fb POST-WR programs produced 7% and 40% higher soybean yield in 2016 and 2017, respectively, compared with the PRE fb POST programs. The gross profit margin (US$1,184.3 ha−1) was highest under the flumioxazin/pyroxasulfone (PRE) fb fluthiacet-methyl plus S-metolachlor/fomesafen (POST-WR) treatment; however, the benefit–cost ratio was highest (6.1) with the PRE-only program of flumioxazin/chlorimuron-ethyl.
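One simple way to derive a gross profit margin and a benefit-cost ratio for a herbicide program is sketched below; the prices, costs, and yields are placeholders, not the study's inputs or its exact definitions.

```python
def program_economics(treated_yield_kg_ha: float, untreated_yield_kg_ha: float,
                      price_per_kg: float, program_cost_ha: float) -> tuple[float, float]:
    """Gross profit margin (US$/ha) and benefit-cost ratio for a herbicide program."""
    revenue = treated_yield_kg_ha * price_per_kg
    gross_margin = revenue - program_cost_ha
    benefit = (treated_yield_kg_ha - untreated_yield_kg_ha) * price_per_kg  # value of protected yield
    return gross_margin, benefit / program_cost_ha

# Placeholder example: a cheap PRE-only program can post a higher ratio than a
# costlier PRE fb POST-WR program even when the latter delivers a larger margin.
print(program_economics(3200, 1800, 0.40, 90))   # lower cost, higher benefit-cost ratio
print(program_economics(3900, 1800, 0.40, 260))  # higher gross margin, lower ratio
```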


2019 ◽  
Vol 49 (8) ◽  
pp. 897-913 ◽  
Author(s):  
Simone J. Domingue ◽  
Christopher T. Emrich

To date, there has been limited research on disaster aid allocation across multiple regions and disasters within the United States. In addition, there is a paucity of research specifically connecting social indicators of vulnerability to public assistance grants aimed at restoring, rebuilding, and mitigating against future disaster damages. Given these gaps, this article asks whether the Federal Emergency Management Agency’s (FEMA’s) Public Assistance program is characterized by procedural inequities, that is, disparate outcomes for counties with more socially vulnerable populations. Specifically, this article analyzes the county-level distribution of FEMA Public Assistance following major disaster declarations, while controlling for damages sustained, population, household counts, and FEMA Region. Results indicate that FEMA’s Public Assistance program operates well when accounting only for disaster losses across the years; however, findings also show that county social conditions influence funding receipt. Although socioeconomic characteristics were significant drivers of assistance spending, additional vulnerability indicators related to county demographic and built environment characteristics were also important drivers of receipt. Cases of both procedural inequity and equity are highlighted, and implications for equitable disaster recovery are discussed along with recommendations.
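A hypothetical sketch of a county-level model in the spirit the abstract describes: Public Assistance dollars regressed on a social vulnerability indicator while controlling for damages, population, household counts, and FEMA Region. Variable and file names are placeholders, not the study's specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

counties = pd.read_csv("county_public_assistance.csv")  # hypothetical input file
model = smf.ols(
    "pa_dollars ~ social_vulnerability + damages + population + households + C(fema_region)",
    data=counties,
).fit()
# Procedural equity would imply damages dominate and the social terms are near zero.
print(model.summary())
```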


2021 ◽  
Author(s):  
Rui Li ◽  
Hanting Liu ◽  
Christopher Kit Fairley ◽  
Zhuoru Zou ◽  
Li Xie ◽  
...  

Background: Over 86% of older adults aged ≥65 years are fully vaccinated against SARS-CoV-2 in the United States (US). Waning protection from the existing vaccines has prompted new vaccination strategies, such as providing a booster shot to those fully vaccinated. Methods: We developed a decision-analytic Markov model of COVID-19 to evaluate the cost-effectiveness of a booster strategy of Pfizer-BioNTech BNT162b2 (administered 6 months after the 2nd dose) in those aged ≥65 years, from a healthcare system perspective. Findings: Compared with 2 doses of BNT162b2 without a booster, the booster strategy in a 100,000 cohort of older adults would incur an additional cost of $3.4 million but save $6.7 million in direct medical costs over 180 days. This corresponds to a benefit-cost ratio of 1.95 and a net monetary benefit of $3.4 million. Probabilistic sensitivity analysis indicates that with a COVID-19 incidence of 9.1/100,000 person-days, a booster strategy has a high chance (67%) of being cost-effective. The cost-effectiveness of the booster strategy is highly sensitive to the population incidence of COVID-19, with a cost-effectiveness threshold of 8.1/100,000 person-days. This threshold will increase with a decrease in vaccine and booster efficacies. Doubling the vaccination cost or halving the medical cost of COVID-19 treatment alone would not alter the conclusion of cost-effectiveness, but certain combinations of the two might render the booster strategy not cost-effective. Interpretation: Offering BNT162b2 boosters to older adults aged ≥65 years in the US is likely to be cost-effective. Less efficacious vaccines and boosters may still be cost-effective in settings of high SARS-CoV-2 transmission. Funding: National Natural Science Foundation of China; Bill & Melinda Gates Foundation.
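The headline figures reduce to simple arithmetic, sketched below from the numbers stated in the abstract; note that the reported net monetary benefit likely also values health outcomes at a cost-effectiveness threshold, so subtracting costs alone is only an approximation.

```python
booster_cost = 3.4e6     # additional cost of the booster strategy for a 100,000 cohort (USD)
medical_savings = 6.7e6  # direct medical costs averted over 180 days (USD)

bcr = medical_savings / booster_cost   # ~1.97, close to the reported 1.95
nmb = medical_savings - booster_cost   # ~$3.3M, close to the reported $3.4M
print(f"BCR ~ {bcr:.2f}, net monetary benefit ~ ${nmb / 1e6:.1f}M")
```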


Author(s):  
Cary Jim ◽  
Sarah Evans ◽  
Alison Grant

In this paper, we share the initial findings from a multi-disciplinary project by Team D2IE (Digital Divide and Inclusion in Education), the recent first-place winner of the Global XPRIZE Education Open Data Challenge, which investigated how digital infrastructure and internet connectivity vary among K-12 students at the county level across the United States. Two quantitative measures (Student Digital Opportunity and Benefit-Cost Ratio) and three interactive maps were developed from socio-technical and economic perspectives to support decision-making. The maps allow stakeholders to evaluate digital access, usage, cost, and economic benefits at the county level across the country.


2015 ◽  
Vol 2533 (1) ◽  
pp. 141-148 ◽  
Author(s):  
Ranjit Prasad Godavarthy ◽  
Jeremy Mattson ◽  
Elvis Ndembe

The true value of transit systems in rural and small urban areas of the United States has been largely unmeasured, and many of their effects go unidentified. Many studies have documented the benefits of urban transit systems with benefit–cost analysis. However, few have examined the benefits of transit in rural and small urban areas, where there is a great need for public transit, especially for transportation-disadvantaged individuals. This study focused on evaluating the qualitative and quantitative benefits of rural and small urban public transit systems and analyzed the benefit–cost ratio of fixed-route and demand-response services for rural and small urban transit systems in the United States. Data for rural and small urban transit systems from the National Transit Database (NTD) and the rural NTD were used to calibrate transit benefits and costs. Results were presented at a national level to show the effects of transit investments in rural and small urban areas nationally. Transit benefits in the United States for 2011 were found to be $1.6 billion for rural transit and $3.7 billion for small urban transit, not including economic effects. Results showed a benefit–cost ratio of 2.16 for small urban transit and 1.20 for rural transit in the United States. Sensitivity analysis showed that increasing the percentage of forgone trips to 50%, increasing the cost of forgone medical and work trips by 25%, and increasing the percentage of medical trips to 30% increased total transit benefits by 88%, 20%, and 158%, respectively.
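A quick arithmetic sketch relating the reported national benefits and benefit–cost ratios; the implied program costs are simply benefits divided by the ratio and are not figures stated in the abstract.

```python
rural_benefits, rural_bcr = 1.6e9, 1.20
small_urban_benefits, small_urban_bcr = 3.7e9, 2.16

print(f"implied rural transit cost ~ ${rural_benefits / rural_bcr / 1e9:.2f}B")              # ~$1.33B
print(f"implied small urban transit cost ~ ${small_urban_benefits / small_urban_bcr / 1e9:.2f}B")  # ~$1.71B
```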


1971 ◽  
Vol 3 (1) ◽  
pp. 161-166
Author(s):  
George A. Pavelis

The stimulus for this article was an observation that resource development in the United States is of a lumpy or whole project-by-project character. We seem to have looked at resource development proposals in isolation from other worthwhile activities and to have been preoccupied with the magnitude of “benefit-cost ratios” in evaluating and comparing individual resource development activities, projects, or programs. Unless properly interpreted, however, such ratios can mislead planners and legislators to invest capital and other inputs in a way that leads to a less than fully efficient pattern of resource development, even where the objective is only to maximize quantifiable monetary benefits. Accordingly, this analysis examines the “benefit-cost ratio” in the context of an income-producing efficiency objective and elementary production theory. Such other currently emphasized objectives as environmental quality improvement are treated implicitly, though not within a multiobjective framework. For a more complete treatment of these see Miller and Holloway [9] who have illustrated an application of multiobjective resource planning principles recently issued by the Water Resources Council [15]. Other particular papers and reports dealing with multiobjective resource development planning are [3, 4, 5, 7, 9, 13 and 14].
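A standard textbook way to see the article's concern, not its own derivation: under diminishing returns, the scale that maximizes net benefits generally differs from the scale that maximizes the benefit-cost ratio, so ranking lumpy projects by the ratio alone can misallocate a fixed budget.

```latex
% Net benefits B(x) - C(x) are maximized where marginal benefit equals marginal
% cost, whereas the ratio B/C is stationary where the marginal return per dollar
% equals the average return per dollar -- generally a smaller scale.
\[
  \max_x \, \bigl[ B(x) - C(x) \bigr]
  \;\Longrightarrow\;
  B'(x^{*}) = C'(x^{*}),
  \qquad
  \frac{d}{dx}\!\left[\frac{B(x)}{C(x)}\right] = 0
  \;\Longleftrightarrow\;
  \frac{B'(x)}{C'(x)} = \frac{B(x)}{C(x)}.
\]
```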


2013 ◽  
Vol 8 (1) ◽  
pp. 74-99 ◽  
Author(s):  
Patrick J. Wolf ◽  
Michael McShane

School voucher programs have become a prominent aspect of the education policy landscape in the United States. The DC Opportunity Scholarship Program is the only federally funded voucher program in the United States. Since 2004 it has offered publicly funded private school vouchers to nearly four thousand students to attend any of seventy-three different private schools in Washington, DC. An official experimental evaluation of the program, sponsored by the federal government's Institute of Education Sciences, found that the students who were awarded Opportunity Scholarships graduated from high school at a rate 12 percentage points higher than the students in the randomized control group. This article estimates the benefit-cost ratio of the DC Opportunity Scholarship Program, primarily by considering the increased graduation rate that it induced and the estimated positive economic returns to increased educational attainment. We find a benefit-cost ratio of 2.62, or $2.62 in benefits for every dollar spent on the program.
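A sketch of the arithmetic behind such a ratio, with placeholder values rather than the study's inputs: the benefit side is the present value of added lifetime earnings attributable to the induced graduates, divided by total program spending.

```python
def voucher_bcr(extra_graduates: float, pv_earnings_gain_per_graduate: float,
                total_program_cost: float) -> float:
    """Benefit-cost ratio from graduation-induced earnings gains (simplified)."""
    return (extra_graduates * pv_earnings_gain_per_graduate) / total_program_cost

# Made-up inputs for illustration only; a ratio of 2.62 would mean $2.62 of
# benefits for every program dollar, as the abstract states.
print(voucher_bcr(extra_graduates=400, pv_earnings_gain_per_graduate=250_000,
                  total_program_cost=50_000_000))  # = 2.0 with these placeholder values
```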


Author(s):  
Jannatul Ferdousi ◽  
Zabid Al Riyadh ◽  
Md. Iqbal Hossain ◽  
Satya Ranjan Saha ◽  
Mohammad Zakaria

Compiled information on mushroom cultivation in relation to production, performance, problems, and prospects is very important for developing this sector. The aim of this review paper is therefore to compile information on mushroom cultivation in Bangladesh. Mushroom production is increasing due to high domestic demand and export potential. In Bangladesh, 40,000 MT of mushrooms were produced during 2018-19. Oyster, Reishi, Milky, Button, Straw, and Shiitake mushrooms are the species most preferred and cultivated by farmers, but cultivation is largely confined to oyster mushrooms (Pleurotus spp.), which are grown throughout the year. Mostly young, educated people and rural women are adopting mushroom farming on a commercial basis in Bangladesh. The study revealed that mushroom production is relatively easy, requiring only modest technical skill, and is a highly profitable agribusiness, as evidenced by its lucrative benefit-cost ratio (BCR 1.55-4.25). Although mushroom production has increased, growers face several problems during cultivation and marketing, including lack of cultivation houses, unavailability of good spawn, capital shortage, lack of equipment, lack of local markets and promotion, and lack of storage facilities, which need to be addressed for further development of this sector. There is enormous opportunity for expanding mushroom farming throughout the country. Considering the country's limited land and its large unemployed population, strengthening mushroom production could be one of the sustainable options for developing the rural economy. Development of this sector would also improve diversified business and employment opportunities in both rural and semi-urban areas.

