Risk assessment of banknotes as a fomite of SARS-CoV-2 in cash payment transactions

Author(s):
Jack Schijven
Mark Wind
Daniel Todt
John Howes
Barbora Tamele
...

Abstract
Background: The COVID-19 pandemic has triggered concerns and assumptions globally about transmission of the SARS-CoV-2 virus via cash transactions.
Objectives: To assess the risk of contracting COVID-19 through exposure to SARS-CoV-2 via cash acting as a fomite in payment transactions.
Methods: A quantitative microbial risk assessment was conducted for a worst-case scenario assuming an infectious person at the onset of symptoms, when virion concentrations in coughed droplets are at their highest. This person contaminates a banknote by coughing on it and immediately hands it over to another person, who might then be infected by transferring the virions with a finger from the contaminated banknote to a facial mucous membrane. The scenario considered the transfer efficiency of virions from the banknote to fingertips both while the droplets were still wet and after they had dried, with the banknote subsequently touched by pressing a fingertip onto it or rubbing it.
Results: Accounting for the likelihood of the worst-case scenario occurring by considering (1) a local prevalence of 100 COVID-19 cases/100,000 persons, (2) that a maximum of about one fifth of infected persons transmit high virus loads, and (3) the number of cash transactions per person per day, the risk of contracting COVID-19 via person-to-person cash transactions was estimated to be much lower than once per 39,000 days (107 years) for a single person. In the general populace, there would be a maximum of 2.6 expected cases/100,000 persons/day. The risk for a cashier at an average point of sale was estimated to be much less than once per 430 working days (21 months).
Discussion: The worst-case scenario is a rare event; therefore, for a single person, the risk of contracting COVID-19 via person-to-person cash transactions is very low. At a point of sale, the risk to the cashier increases proportionally but is still low.
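The population-level figures quoted in the Results follow from scaling a worst-case per-transaction infection probability by the chance that a given transaction actually involves a freshly contaminated banknote (local prevalence, fraction of high shedders) and by the number of cash transactions per day. The sketch below reproduces that scaling structure only; the per-transaction infection probability and the transaction rate are assumed placeholder values, not parameters from the paper.

```python
# Hedged sketch: scaling a worst-case per-transaction risk to population level.
# All numeric values are illustrative assumptions, not the paper's parameters.

prevalence = 100 / 100_000        # local prevalence of COVID-19 cases
high_shedder_fraction = 0.2       # ~1/5 of infected persons transmit high virus loads
transactions_per_day = 1.0        # assumed cash transactions per person per day
p_infect_worst_case = 0.1         # assumed infection probability GIVEN the worst-case
                                  # scenario (cough on note, immediate hand-over,
                                  # transfer to mucosa) actually plays out

# Daily risk for one person = P(worst-case scenario occurs) x P(infection | scenario)
daily_risk = (prevalence * high_shedder_fraction
              * transactions_per_day * p_infect_worst_case)

print(f"once per {1 / daily_risk:,.0f} days for a single person")
print(f"{daily_risk * 100_000:.1f} expected cases/100,000 persons/day")
```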

2006
Vol. 72 (5)
pp. 3284-3290
Author(s):
Andrew J. Hamilton
Frank Stagnitti
Robert Premier
Anne-Maree Boland
Glenn Hale

ABSTRACT Quantitative microbial risk assessment models for estimating the annual risk of enteric virus infection associated with consuming raw vegetables that have been overhead irrigated with nondisinfected secondary treated reclaimed water were constructed. We ran models for several different scenarios of crop type, viral concentration in effluent, and time since last irrigation event. The mean annual risk of infection was always less for cucumber than for broccoli, cabbage, or lettuce. Across the various crops, effluent qualities, and viral decay rates considered, the annual risk of infection ranged from 10⁻³ to 10⁻¹ when reclaimed-water irrigation ceased 1 day before harvest and from 10⁻⁹ to 10⁻³ when it ceased 2 weeks before harvest. Two previously published decay coefficients were used to describe the die-off of viruses in the environment. For all combinations of crop type and effluent quality, application of the more aggressive decay coefficient led to annual risks of infection that satisfied the commonly propounded benchmark of ≤10⁻⁴, i.e., one infection or less per 10,000 people per year, provided that 14 days had elapsed since irrigation with reclaimed water. Conversely, this benchmark was not attained for any combination of crop and water quality when this withholding period was 1 day. The lower decay rate conferred markedly less protection, with broccoli and cucumber being the only crops satisfying the 10⁻⁴ standard for all water qualities after a 14-day withholding period. Sensitivity analyses on the models revealed that in nearly all cases, variation in the amount of produce consumed had the most significant effect on the total uncertainty surrounding the estimate of annual infection risk. The models presented cover what would generally be considered to be worst-case scenarios: overhead irrigation and consumption of vegetables raw. Practices such as subsurface, furrow, or drip irrigation and postharvest washing/disinfection and food preparation could substantially lower risks and need to be considered in future models, particularly for developed nations where these extra risk reduction measures are more common.
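The chain of such a QMRA model can be written out compactly: first-order viral die-off during the withholding period, a dose from the irrigation water retained on the produce, a single-exposure infection probability from a dose-response relation, and aggregation to an annual risk over repeated consumption. The sketch below uses an exponential dose-response model and invented parameter values purely to show the structure; the published models use crop-specific exposure data, measured effluent qualities, and the two decay coefficients mentioned above.

```python
import math

# Hedged sketch of a QMRA chain for reclaimed-water irrigation; every parameter
# value below is an illustrative assumption, not one of the study's inputs.

c_effluent = 1.0          # enteric viruses per litre of reclaimed water (assumed)
water_retained = 0.05     # litres of irrigation water clinging to one serving (assumed)
decay_k = 0.69            # first-order die-off rate per day (assumed)
withholding_days = 14     # days between last irrigation and consumption
r = 0.5                   # exponential dose-response parameter (assumed)
exposures_per_year = 365  # daily consumption of the raw vegetable (assumed)

# Viruses remaining on the serving after die-off in the field
dose = c_effluent * water_retained * math.exp(-decay_k * withholding_days)

# Single-exposure infection probability (exponential dose-response model)
p_single = 1 - math.exp(-r * dose)

# Annual risk, assuming independent daily exposures
p_annual = 1 - (1 - p_single) ** exposures_per_year

print(f"single-exposure risk: {p_single:.2e}, annual risk: {p_annual:.2e}")
# compare the annual risk against the 1e-4 infections/person/year benchmark
```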


2004
Vol. 50 (2)
pp. 23-30
Author(s):
T. Westrell
C. Schönning
T.A. Stenström
N.J. Ashbolt

Hazard Analysis and Critical Control Points (HACCP) was applied for identifying and controlling exposure to pathogenic microorganisms encountered during normal sludge and wastewater handling at a 12,500 m³/d treatment plant utilising tertiary wastewater treatment and mesophilic sludge digestion. The hazardous scenarios considered were human exposure during treatment, handling, soil application and crop consumption, and exposure via water at the wetland area and during recreational swimming. A quantitative microbial risk assessment (QMRA), including rotavirus, adenovirus, haemorrhagic E. coli, Salmonella, Giardia and Cryptosporidium, was performed in order to prioritise pathogen hazards for control purposes. Human exposures were treated as individual risks but were also related to the endemic situation in the general population. The highest individual health risk from a single exposure was via aerosols for workers at the belt press for sludge dewatering (virus infection risk = 1). The largest impact on the community would arise if children ingested sludge at the unprotected storage site, although in the worst-case situation the largest number of infections would arise through vegetables fertilised with sludge and eaten raw (not allowed in Sweden). Acceptable risk for various hazardous scenarios, treatment and/or reuse strategies could be tested in the model.
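The prioritisation step in such a QMRA amounts to comparing scenarios on two axes: the risk run by an exposed individual and the expected number of infections in the community, obtained by combining the per-exposure risk with the exposure frequency and the number of people exposed. The sketch below illustrates that comparison; the scenario names echo the abstract, but every numeric value is an invented placeholder rather than a result from the study.

```python
# Hedged sketch of prioritising exposure scenarios in a QMRA; all numbers are
# invented placeholders, not the values reported for the Swedish plant.
scenarios = {
    # name: (per-exposure infection risk, persons exposed, exposures per year)
    "worker at belt press (aerosols)":        (1.0,  5,      250),
    "child ingesting sludge at storage site": (0.3,  20,     10),
    "raw vegetables fertilised with sludge":  (1e-3, 50_000, 50),
}

for name, (risk, persons, freq) in scenarios.items():
    annual_risk_per_person = 1 - (1 - risk) ** freq        # repeated independent exposures
    expected_infections = persons * annual_risk_per_person  # community burden per year
    print(f"{name:42s} individual annual risk={annual_risk_per_person:.2f}  "
          f"expected infections/year={expected_infections:,.0f}")
```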


2020
Vol. 42 (11)
pp. 548-557
Author(s):
Eun Sung Baek
Kyoshik Park

Objectives: To conduct a quantitative risk assessment for hazardous chemical storage facilities at the tank terminal in the port area, the entire risk assessment process was performed in accordance with the guidance of the Korea Ministry of Environment.
Methods: The risk of the facility was derived for the worst-case scenario and an alternative scenario and then evaluated with the KORA program. Countermeasures for reducing the risk were suggested using the concept of LOPA (Layer of Protection Analysis).
Results and Discussion: Focusing on the worst-case scenario and the alternative scenario among the scenarios with an off-site effect, the risk can be reduced to satisfy the regulation by applying passive, active, and managerial measures.
Conclusions: According to the results of the risk assessment of the benzene storage tank and tank lorry during port construction, the amount stored inside the tank has a significant off-site impact. The risk posed by benzene needs to be addressed systematically, and the tank terminal storage facilities need to be managed comprehensively.
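LOPA reduces to frequency arithmetic: the mitigated frequency of a release scenario is the initiating-event frequency multiplied by the probabilities of failure on demand (PFD) of the independent protection layers, and the product is compared against a tolerable frequency. The sketch below shows that calculation with assumed values; it is not the KORA workflow, and the frequencies and PFDs are not taken from the study.

```python
# Hedged LOPA-style sketch; the frequencies and PFDs are illustrative assumptions.
initiating_event_frequency = 1e-1   # e.g. tank overfill, events per year (assumed)
tolerable_frequency = 1e-5          # tolerable off-site release frequency per year (assumed)

protection_layers = {
    "operator response to high-level alarm":    1e-1,  # PFD (assumed)
    "independent high-high level trip":         1e-1,  # PFD (assumed)
    "dike / secondary containment":             1e-2,  # PFD (assumed)
}

mitigated_frequency = initiating_event_frequency
for layer, pfd in protection_layers.items():
    mitigated_frequency *= pfd      # each independent layer reduces the frequency

verdict = "meets" if mitigated_frequency <= tolerable_frequency else "exceeds"
print(f"mitigated frequency: {mitigated_frequency:.1e} /yr "
      f"({verdict} the tolerable frequency of {tolerable_frequency:.0e} /yr)")
```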


2008
Author(s):
Sonia Savelli
Susan Joslyn
Limor Nadav-Greenberg
Queena Chen

LWT
2021
Vol. 144
pp. 111201
Author(s):
Prez Verónica Emilse
Victoria Matías
Martínez Laura Cecilia
Giordano Miguel Oscar
Masachessi Gisela
...

Sports
2021
Vol. 9 (6)
pp. 76
Author(s):
Dylan Mernagh
Anthony Weldon
Josh Wass
John Phillips
Nimai Parmar
...

This is the first study to report the whole match, ball-in-play (BiP), ball-out-of-play (BoP), and Max BiP (worst-case scenario phases of play) demands of professional soccer players competing in the English Championship. Effective playing time per soccer game is typically <60 min. When the ball is out of play, players spend time repositioning themselves, which is likely less physically demanding. Consequently, reporting whole match demands may under-report the physical requirements of soccer players. Twenty professional soccer players, categorized by position (defenders, midfielders, and forwards), participated in this study. A repeated measures design was used to collect Global Positioning System (GPS) data over eight professional soccer matches in the English Championship. Data were divided into whole match and BiP data, and BiP data were further sub-divided into different time points (30–60 s, 60–90 s, and >90 s), providing peak match demands. Whole match demands were compared to BiP and Max BiP, with BiP data excluding all match stoppages, providing a more precise analysis of match demands. Whole match metrics were significantly lower than BiP metrics (p < 0.05), and Max BiP for 30–60 s was significantly higher than for the 60–90 s and >90 s periods. No significant differences were found between positions. BiP analysis allows for a more accurate representation of the game and the physical demands imposed on professional soccer players. Through a clearer understanding of maximum game demands in professional soccer, practitioners can design more specific training methods to better prepare players for worst-case scenario passages of play.
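Max BiP values of this kind are typically obtained by sliding a fixed-length window over the ball-in-play samples and keeping the most demanding window. A minimal sketch of that calculation follows; the 10 Hz sampling rate, the distance metric, and the data layout are assumptions for illustration, not details taken from the study.

```python
# Hedged sketch: peak ("worst-case") distance covered in a fixed window within
# each ball-in-play phase. Sampling rate and data layout are assumed.

def max_bip_demand(bip_phases, window_s=60, hz=10):
    """Return the peak distance (m) covered in any `window_s` window across all
    ball-in-play phases, each phase a list of per-sample distances (m) at `hz`."""
    window = window_s * hz
    peak = 0.0
    for phase in bip_phases:
        if len(phase) < window:
            continue                          # phase shorter than the window
        rolling = sum(phase[:window])         # distance in the first window
        peak = max(peak, rolling)
        for i in range(window, len(phase)):
            rolling += phase[i] - phase[i - window]   # slide the window one sample
            peak = max(peak, rolling)
    return peak

# toy usage: two BiP phases of per-sample distances (m) sampled at 10 Hz
phases = [[0.5] * 900, [0.6] * 1200]          # 90 s and 120 s of play
print(max_bip_demand(phases, window_s=60))    # -> 360.0 m in the busiest 60 s window
```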

