By the Numbers: DAP: Digital Analytics Program

2016 ◽  
Vol 44 (1) ◽  
pp. 7
Author(s):  
Pamela Campbell

Every day, millions of people access government websites. In the thirty days preceding the writing of this article, the National Weather Service and the National Library of Medicine each received more than 50 million visits. According to the US General Services Administration, more than 1,300 dot-gov domains are in use by federal agencies. Given the immense resources involved in building and maintaining these websites, sound decisions about allocating those resources are important, and that means decision makers need good data.
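As a concrete illustration, the sketch below pulls a 30-day traffic summary of the kind DAP publishes. It is a minimal example only: the JSON endpoint and the field names are assumptions modeled on the public analytics.usa.gov data feeds, not a documented contract.

```python
# Minimal sketch: fetch a 30-day traffic summary from the Digital
# Analytics Program's public data feeds. The exact endpoint and the
# "domain"/"visits" field names are assumptions based on the JSON
# files published at analytics.usa.gov and may differ in practice.
import requests

DATA_URL = "https://analytics.usa.gov/data/live/top-domains-30-days.json"  # assumed path

def top_domains(limit: int = 10) -> list[tuple[str, int]]:
    """Return the most-visited federal domains over the last 30 days."""
    resp = requests.get(DATA_URL, timeout=30)
    resp.raise_for_status()
    rows = resp.json().get("data", [])
    return [(row["domain"], row["visits"]) for row in rows[:limit]]

if __name__ == "__main__":
    for domain, visits in top_domains():
        print(f"{domain}: {visits:,} visits")
```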

2010 ◽  
Vol 25 (5) ◽  
pp. 1412-1429 ◽  
Author(s):  
Russ S. Schumacher ◽  
Daniel T. Lindsey ◽  
Andrea B. Schumacher ◽  
Jeff Braun ◽  
Steven D. Miller ◽  
...  

Abstract: On 22 May 2008, a strong tornado—rated EF3 on the enhanced Fujita scale, with winds estimated between 136 and 165 mi h−1 (61 and 74 m s−1)—caused extensive damage along a 55-km track through northern Colorado. The worst devastation occurred in and around the town of Windsor; in total there was one fatality, numerous injuries, and hundreds of homes significantly damaged or destroyed. Several characteristics of this tornado were climatologically unusual for the region, including its intensity, its long track, its direction of motion, and the time of day when it formed. These unusual aspects and the tornado's high impact also raised a number of questions about how information from National Weather Service watches and warnings is communicated to and interpreted by decision makers and the public. First, the study examines the meteorological circumstances responsible for producing such an outlier to the regional severe weather climatology, presenting an analysis of the synoptic and mesoscale environmental conditions that were favorable for significant tornadoes on 22 May 2008. Then, a climatology of significant tornadoes (defined as those rated F2 or higher on the Fujita scale, or EF2 or higher on the enhanced Fujita scale) near the Front Range is shown to put the 22 May 2008 event into context. The study also examines the communication and interpretation of severe weather information in an area that experiences tornadoes regularly but is relatively unaccustomed to significant tornadoes. By conducting interviews with local decision makers, the authors compiled and chronicled the flow of information as the event unfolded. The interviews demonstrate that the initial sources of warning information varied widely. Decision makers' interpretations of the warnings also varied, leading to differing perceptions of the timeliness and clarity of the warning information, and their previous knowledge of the typical local characteristics of tornadoes affected their interpretations of the tornado threat. The interview results highlight the complex series of processes by which severe weather information is communicated after a warning is issued by the National Weather Service. The results support the growing recognition that societal factors are just as important to the effectiveness of weather warnings as the timeliness and content of those warnings, and that these factors deserve research attention alongside the investments devoted to improving detection and warning capabilities.


2004 ◽  
Vol 291 (3-4) ◽  
pp. 297-318 ◽  
Author(s):  
Victor Koren ◽  
Seann Reed ◽  
Michael Smith ◽  
Ziya Zhang ◽  
Dong-Jun Seo

2021 ◽  
Vol 9 ◽  
Author(s):  
Dina Abdel-Fattah ◽  
Sarah Trainor ◽  
Eran Hood ◽  
Regine Hock ◽  
Christian Kienholz

Glacial lake outburst floods (GLOFs) significantly affect downstream communities in Alaska. Notably, GLOFs originating from Suicide Basin, adjacent to Mendenhall Glacier, have impacted populated areas in Juneau, Alaska since 2011. On the Kenai Peninsula, records of GLOFs from Snow Glacier date back to 1949, affecting downstream communities and infrastructure along the Kenai and Snow river systems. The US National Weather Service, US Geological Survey, and University of Alaska Southeast (for Suicide Basin) provide informational products to help the public monitor both the glacier-dammed lakes and the ensuing GLOFs. This two-year study (2018–2019) analyzed how communities affected by these GLOFs use those products. The participants represented a variety of sectors and backgrounds, capturing a diverse set of perspectives and insights, including those of homeowners, emergency responders, tour operators, and staff at federal and state agencies. In addition, feedback and suggestions were collected from interviewees so that the relevant entities could improve or modify the informational products to make them more usable. Findings from this study were also used to inform changes to the US National Weather Service monitoring websites for both Suicide Basin and Snow Glacier. The paper's findings on GLOF information use are relevant for other GLOF-affected communities, from both an information-user and an information-developer perspective.


Author(s):  
Evan S. Bentley ◽  
Richard L. Thompson ◽  
Barry R. Bowers ◽  
Justin G. Gibbs ◽  
Steven E. Nelson

Abstract: Previous work has considered tornado occurrence with respect to radar data, both WSR-88D and mobile research radars, and a few studies have examined techniques to potentially improve tornado warning performance. To date, though, there has been little work on systematic, large-sample evaluation of National Weather Service (NWS) tornado warnings with respect to radar-observable quantities and the near-storm environment. In this work, three full years (2016–2018) of NWS tornado warnings across the contiguous United States were examined, in conjunction with supporting data in the few minutes preceding warning issuance, or preceding tornado formation in the case of missed events. The investigation examines WSR-88D and Storm Prediction Center (SPC) mesoanalysis data associated with these tornado warnings, with comparisons to the current Warning Decision Training Division (WDTD) guidance. Combining low-level rotational velocity and the significant tornado parameter (STP), as in prior work, shows promise as a means to estimate tornado warning performance, as well as relative changes in performance as criteria thresholds vary. For example, low-level rotational velocity peaking in excess of 30 kt (15 m s−1), in a near-storm environment that is not prohibitive for tornadoes (STP > 0), results in an increased probability of detection and reduced false alarms compared with observed NWS tornado warning metrics. Tornado warning false alarms can also be reduced by limiting warnings on weak (<30 kt), broad (>1 n mi) circulations in a poor (STP = 0) environment, by carefully eliminating velocity-data artifacts such as sidelobe contamination, and by applying greater scrutiny to human-based tornado reports in otherwise questionable scenarios.
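To make the criteria concrete, here is a minimal sketch of how a threshold pair such as rotational velocity ≥ 30 kt with STP > 0 converts verified storm records into probability of detection and false alarm ratio. The record fields, thresholds, and scoring are illustrative assumptions, not the authors' dataset or code.

```python
# Sketch: score a hypothetical warning criterion against verified storm
# records. The Storm fields and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Storm:
    vrot_kt: float   # peak low-level rotational velocity, knots
    stp: float       # significant tornado parameter near the storm
    tornado: bool    # whether a tornado actually occurred

def warn(storm: Storm, vrot_min: float = 30.0) -> bool:
    """Warn when rotation is strong and the environment is not
    prohibitive for tornadoes (STP > 0)."""
    return storm.vrot_kt >= vrot_min and storm.stp > 0.0

def score(storms: list[Storm]) -> tuple[float, float]:
    """Return (probability of detection, false alarm ratio)."""
    hits = sum(1 for s in storms if warn(s) and s.tornado)
    misses = sum(1 for s in storms if not warn(s) and s.tornado)
    false_alarms = sum(1 for s in storms if warn(s) and not s.tornado)
    pod = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
    return pod, far

pod, far = score([
    Storm(45.0, 1.2, True),   # strong rotation, favorable environment: hit
    Storm(25.0, 0.5, True),   # weak rotation: missed detection
    Storm(35.0, 0.0, False),  # STP = 0 suppresses a would-be false alarm
])
print(f"POD = {pod:.2f}, FAR = {far:.2f}")
```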


2018 ◽  
Vol 33 (6) ◽  
pp. 1501-1511 ◽  
Author(s):  
Harold E. Brooks ◽  
James Correia

Abstract: Tornado warnings are one of the flagship products of the National Weather Service. We update the time series of various performance metrics to provide baselines over the 1986–2016 period for lead time, probability of detection, false alarm ratio, and warning duration. We use metrics (mean lead time for tornadoes warned in advance, fraction of tornadoes warned in advance) that behave consistently across the official changes in warning-issuance policy, as well as across points in time when unofficial changes took place. The mean lead time for tornadoes warned in advance was relatively constant from 1986 to 2011, while the fraction of tornadoes warned in advance increased through about 2006 and the false alarm ratio slowly decreased. The largest changes in performance occurred in 2012, when the default warning duration decreased and there was an apparent increased emphasis on reducing false alarms. As a result, the lead time, probability of detection, and false alarm ratio all decreased in 2012. Our analysis is based, in large part, on signal detection theory, which separates the quality of the warning system from the threshold for issuing warnings. Threshold changes lead to trade-offs between false alarms and missed detections. Such changes provide further evidence of shifts in what the warning system as a whole considers important, and they highlight the limitations of measuring performance by looking at metrics independently.
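A minimal sketch of the bookkeeping behind such metrics, using invented numbers: mean lead time is computed only over tornadoes warned in advance, so it remains comparable across policy changes, while the false alarm ratio comes from warning verification counts. The field meanings and the simple event list are illustrative assumptions.

```python
# Illustrative bookkeeping for tornado-warning verification metrics.
# The numbers are invented; real verification pairs each tornado with
# the warnings in effect over its damage path.
from statistics import mean

# Lead time in minutes: tornado start time minus warning issuance time.
# None means no warning was in effect for that tornado.
lead_times_min = [12.0, 3.5, None, 8.0, None, 15.0]
n_warnings = 40   # tornado warnings issued
n_verified = 10   # warnings verified by an actual tornado

# "Warned in advance" means a positive lead time.
advance = [t for t in lead_times_min if t is not None and t > 0]

mean_lead_time = mean(advance)  # only over advance-warned tornadoes
fraction_warned = len(advance) / len(lead_times_min)
false_alarm_ratio = 1 - n_verified / n_warnings

print(f"mean lead time: {mean_lead_time:.1f} min")
print(f"fraction warned in advance: {fraction_warned:.2f}")
print(f"false alarm ratio: {false_alarm_ratio:.2f}")
```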


Author(s):  
Andrew M Fielding ◽  
Anne Powell

Medline is the US National Library of Medicine database used for searching the medical biochemistry literature. The database is structured using Medical Subject Headings (MeSH) to classify the content of references; indexing is done manually, with MeSH terms applied as keywords. Searching the database effectively means finding the maximum number of relevant references together with the minimum number of irrelevant ones. This article explains the limitations of Medline and suggests solutions to key problems, so that users can improve their literature-search technique by adopting a structured approach. As with any search, asking the right questions before starting is essential.
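As one structured approach, the sketch below runs a Medline search through NCBI's public E-utilities interface, combining a MeSH term (high precision, since it relies on manual indexing) with a title/abstract term (which catches recent, not-yet-indexed papers). The esearch endpoint and its parameters are documented NCBI features; the query topic itself is an invented example.

```python
# Structured Medline/PubMed search via NCBI E-utilities (esearch).
# The endpoint and parameters are part of the public E-utilities API;
# the query is an invented example combining a MeSH term with a
# title/abstract term to balance precision and recall.
import requests

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def search_pubmed(term: str, max_results: int = 20) -> list[str]:
    """Return PubMed IDs matching a structured query."""
    params = {
        "db": "pubmed",
        "term": term,
        "retmax": max_results,
        "retmode": "json",
    }
    resp = requests.get(ESEARCH, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["esearchresult"]["idlist"]

# MeSH indexing narrows results to articles actually about the topic;
# the [tiab] clause adds very recent papers awaiting indexing.
ids = search_pubmed("hyponatremia[MeSH Terms] OR hyponatremia[tiab]")
print(ids)
```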

