Video-recorded Retail Cannabis Trades in a Low-risk Marketplace

2017 ◽  
Vol 55 (1) ◽  
pp. 103-124 ◽  
Author(s):  
Kim Moeller

Objectives: This study examines the monetary value of cannabis retail trades and temporal patterns in an open-air market with low legal risks. Method: Video footage detailing the activities of four sellers, together with transcriptions, was provided by the Copenhagen police. Standard bivariate tests of statistical significance are used to examine the influence of time of day and discretionary activities on trade value and temporal patterns. Results: The average trade was valued at DKK 159 (∼US$24), or DKK 121 when excluding 16 outliers. Both the rate and the monetary value of trades increase as the day progresses. Twice as many trades are made per hour after sunset, and significantly more are made around national holidays and on days preceding extreme weather. Conclusion: Under conditions of low legal risk and easy access to supply, cannabis trades are small and tend to increase in value over the course of a day. The temporal patterns in trade rates follow users' availability of discretionary time.
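
The abstract does not name the specific bivariate tests used, so the sketch below is purely illustrative: a Mann-Whitney U comparison of trade values before and after sunset and a Spearman rank correlation of hourly trade counts with time of day, run on hypothetical placeholder data rather than the study's records.

```python
# Illustrative sketch only: the abstract does not specify which bivariate tests
# were used, and all trade data below are hypothetical placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical trade values (DKK) recorded before and after sunset.
day_trades = rng.gamma(shape=2.0, scale=60, size=120)
night_trades = rng.gamma(shape=2.0, scale=80, size=240)  # higher hourly rate after sunset

# Compare trade value between periods (non-parametric, robust to the outliers noted above).
u_stat, p_value = stats.mannwhitneyu(day_trades, night_trades, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")

# Correlate hour of day with hourly trade counts to test the within-day trend.
hours = np.arange(10, 24)
hourly_counts = np.array([4, 5, 5, 6, 7, 8, 9, 10, 12, 14, 16, 18, 19, 20])
rho, p_trend = stats.spearmanr(hours, hourly_counts)
print(f"Spearman rho = {rho:.2f}, p = {p_trend:.3f}")
```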

2021 ◽  
pp. emermed-2021-211669
Author(s):  
Fraser Todd ◽  
James Duff ◽  
Edward Carlton

Introduction Patients presenting to EDs with chest pain of possible cardiac origin represent a substantial and challenging cohort to risk stratify. Scores such as HE-MACS (History and Electrocardiogram-only Manchester Acute Coronary Syndromes decision aid) and HEAR (History, ECG, Age, Risk factors) have been developed to stratify risk without the need for troponin testing. Validation of these scores remains limited. Methods We performed a post hoc analysis of the Limit of Detection and ECG discharge strategy randomised controlled trial dataset (n=629; June 2018 to March 2019; 8 UK hospitals) to calculate HEAR and HE-MACS scores. A <4% risk of major adverse cardiac events (MACE) at 30 days using HE-MACS and a score of <2 calculated using HEAR defined ‘very low risk’ patients suitable for discharge. The primary outcome of MACE at 30 days was used to assess diagnostic accuracy. Results MACE within 30 days occurred in 42/629 (7%) of the cohort. HE-MACS and HEAR scores identified 85/629 and 181/629 patients as ‘very low risk’, with MACE occurring in 0/85 and 1/181 patients, respectively. The sensitivities of each score for ruling out MACE were 100% (95% CI: 91.6% to 100%) for HE-MACS and 97.6% (95% CI: 87.7% to 99.9%) for HEAR. Presenting symptoms within these scores were poorly predictive, with only diaphoresis reaching statistical significance (OR: 4.99 (2.33 to 10.67)). Conventional cardiovascular risk factors and clinician suspicion were related to the presence of MACE at 30 days. Conclusion HEAR and HE-MACS show potential as rule-out tools for acute myocardial infarction without the need for troponin testing. However, prospective studies are required to further validate these scores.
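
The reported sensitivities follow directly from the counts given above (42 MACE events; 0 missed by HE-MACS, 1 missed by HEAR). The sketch below closely reproduces the reported confidence intervals; the interval method used by the authors is not stated, so the Clopper-Pearson ("beta") interval here is an assumption.

```python
# Sketch of the sensitivity calculation implied by the reported counts.
# The Clopper-Pearson interval is an assumption; the paper does not state its CI method.
from statsmodels.stats.proportion import proportion_confint

def sensitivity_ci(detected, total_events):
    """Sensitivity and exact 95% CI for detecting `total_events` MACE cases."""
    sens = detected / total_events
    lo, hi = proportion_confint(detected, total_events, alpha=0.05, method="beta")
    return sens, lo, hi

for name, detected in [("HE-MACS", 42), ("HEAR", 41)]:  # 0 and 1 missed events, respectively
    sens, lo, hi = sensitivity_ci(detected, 42)
    print(f"{name}: sensitivity {sens:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```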


1998 ◽  
Vol 86 (3_suppl) ◽  
pp. 1335-1338 ◽  
Author(s):  
Greg Atkinson ◽  
Louise Speirs

With informed consent, 6 competitive tennis players performed, in alternating order, 15 "first" (emphasis on speed) serves and 15 "second" (emphasis on accuracy) serves at 09:00, 14:00 and 18:00 hours. Serve velocity was measured by digitisation of video footage of each serve. The Hewitt Tennis Achievement Test was employed to measure the accuracy of serves. The amount of spin imparted on the ball was not measured. First serves were faster than second serves at all times of day. First serves were fastest but least accurate at 18:00 hours, the time of day at which body temperature and grip strength were highest. At 09:00 hours, first serves were just as accurate as second serves, even though the velocity of first serves was higher. No time-of-day effects were found for the speed and accuracy of second serves. These results indicate that time of day does affect the performance of tennis serves, in a way that suggests a nonlinear relationship between velocity and accuracy.


2012 ◽  
Vol 2012 ◽  
pp. 1-7 ◽  
Author(s):  
Matthew I. McKinney ◽  
Yong-Lak Park

Osmia cornifrons Radoszkowski (Hymenoptera: Megachilidae) is utilized as an alternate pollinator to Apis mellifera L. (Hymenoptera: Apidae) in early-season fruit crops. This study was conducted to investigate nesting activities and associated behaviors of O. cornifrons. Osmia cornifrons nesting activity was recorded by using a digital video recorder with infrared cameras. Nesting behavior of ten female O. cornifrons was observed, and the number of nesting trips per hour was recorded. Trends in daily activity were determined with regression analysis, and chi-square analysis was used to determine if O. cornifrons spent a greater amount of time performing certain activities. The percentage of time required to gather nesting resources and complete nest construction activities was recorded from the video footage. Results of this study showed that pollen gathering was the most time-consuming gathering activity, requiring 221.6 ± 28.69 min per cell, and cell provisioning was the most time-consuming intranest activity, requiring 28.9 ± 3.97 min. We also found that O. cornifrons activity was correlated with time of day, temperature, and precipitation. Various nesting behaviors, including cell provisioning and partitioning, oviposition, grooming, resting and sleeping, nest-searching, and repairing behaviors, are described in this paper.
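
As an illustration of the two analyses named above (regression of daily activity and a chi-square test of time allocation across activities), the sketch below uses hypothetical trip counts and activity durations; none of the numbers are the study's data.

```python
# Illustrative sketch of the abstract's two named analyses; all values are hypothetical.
import numpy as np
from scipy import stats

# Hypothetical nesting trips per hour for one female across the active part of the day.
hours = np.arange(8, 19)
trips = np.array([2, 4, 6, 9, 11, 12, 12, 10, 8, 5, 3])
trend = stats.linregress(hours, trips)
print(f"slope = {trend.slope:.2f} trips/h, R^2 = {trend.rvalue**2:.2f}, p = {trend.pvalue:.3f}")

# Chi-square test of whether observation intervals are spread unevenly across activities
# (pollen gathering, mud gathering, cell provisioning, partitioning); counts are placeholders.
intervals_per_activity = [222, 45, 29, 15]
chi2, p = stats.chisquare(intervals_per_activity)
print(f"chi2 = {chi2:.1f}, p = {p:.3g}")
```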


2020 ◽  
Vol 653 ◽  
pp. 167-179
Author(s):  
JLY Spaet ◽  
A Manica ◽  
CP Brand ◽  
C Gallen ◽  
PA Butcher

Understanding and predicting the distribution of organisms in heterogeneous environments is a fundamental ecological question and a requirement for sound management. To implement effective conservation strategies for white shark Carcharodon carcharias populations, it is imperative to define drivers of their movement and occurrence patterns and to protect critical habitats. Here, we acoustically tagged 444 immature white sharks and monitored their presence in relation to environmental factors over a 3 yr period (2016-2019) using an array of 21 Iridium satellite-linked (VR4G) receivers spread along the coast of New South Wales, Australia. Results of generalized additive models showed that all tested predictors (month, time of day, water temperature, tidal height, swell height, lunar phase) had a significant effect on shark occurrence. However, collectively, these predictors only explained 1.8% of deviance, suggesting that statistical significance may be rooted in the large sample size rather than biological importance. On the other hand, receiver location, which captures geographic fidelity and local conditions not captured by the aforementioned environmental variables, explained a sizeable 17.3% of deviance. Sharks tracked in this study hence appear to be tolerant of episodic changes in environmental conditions, and movement patterns are likely related to currently undetermined, location-specific habitat characteristics or biological components, such as local currents, prey availability or competition. Importantly, we show that the performance of VR4G receivers can be strongly affected by local environmental conditions, and we provide an example of how a lack of range-test controls can lead to misinterpretation of acoustic detection data and erroneous conclusions.
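
The deviance-explained comparison above (1.8% for environmental predictors versus 17.3% for receiver location) can be illustrated with a simple presence/absence model. The sketch below uses a binomial GLM from statsmodels as a stand-in for the generalized additive models the authors fitted, and all data-frame columns are hypothetical placeholders.

```python
# Sketch of a "deviance explained" comparison for presence/absence data.
# A binomial GLM stands in for the authors' GAMs; the data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "present": rng.integers(0, 2, n),                # hourly shark presence/absence
    "temp": rng.normal(21, 2, n),                    # water temperature (deg C)
    "tide": rng.normal(0.8, 0.3, n),                 # tidal height (m)
    "receiver": rng.integers(0, 21, n).astype(str),  # receiver location ID (21 receivers)
})

def deviance_explained(fitted):
    # Proportion of null deviance accounted for by the fitted model.
    return 1 - fitted.deviance / fitted.null_deviance

env_model = smf.glm("present ~ temp + tide", data=df, family=sm.families.Binomial()).fit()
loc_model = smf.glm("present ~ C(receiver)", data=df, family=sm.families.Binomial()).fit()
print(f"environmental predictors: {deviance_explained(env_model):.1%} deviance explained")
print(f"receiver location:        {deviance_explained(loc_model):.1%} deviance explained")
```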


2020 ◽  
Vol 41 (2) ◽  
pp. 347-358 ◽  
Author(s):  
Joso Vrkljan ◽  
Dubravka Hozjan ◽  
Danijela Barić ◽  
Damir Ugarković ◽  
Krešimir Krapinec

The purpose of this study was to determine the frequency of wildlife-vehicle collisions (WVC) by animal species, and to deepen knowledge of the temporal patterns of vehicle collisions with roe deer and wild boar. The study analyses data from police reports on vehicle collisions with animals on state roads, by date and time, section of road, and animal species over a 5-year period (2012–2016). These data were analysed to determine the temporal dynamics of vehicle collisions with roe deer and wild boar by month, time of day, and moon phase. On the state roads in the Dinaric area, roe deer are most commonly involved in vehicle collisions (70.1% of all collisions), followed by wild boar (11.0%). Other large species involved in collisions were fallow deer (4.8%), brown bear (1.8%), red deer (0.9%), grey wolf (0.7%), and European mouflon (0.5%). Most collisions with roe deer occurred in the period April–August, with reduced frequency during autumn and winter. For wild boar, there was no association between month and frequency of collisions. At the annual level, collisions with roe deer were significantly more frequent during night (37%) and twilight (41%) than during the day (22%). For wild boar, most collisions occurred during twilight (26%) and night (72%), although the difference between these two periods was not statistically significant. For roe deer, collisions showed no association with lunar phase, whereas wild boar collisions during twilight (dawn or dusk) were more common on days with less moonlight. Since vehicle collisions with wildlife showed certain temporal patterns, these should be taken into consideration in developing statistical models of spatial WVC patterns, and also in planning strategies and countermeasures to mitigate WVCs.


Author(s):  
Robert Busching ◽  
Johnie J. Allen ◽  
Craig A. Anderson

In our modern age, electronic media usage is prevalent in almost every part of the world. People are more connected than ever before, with easy access to highly portable devices (e.g., laptops, smartphones, and tablets) that allow for media consumption at any time of day. Unfortunately, the presence of violence in electronic media content is almost as prevalent as the media itself. Violence can be found in music, television shows, video games, and even YouTube videos. Content analyses have shown that nearly all media contain violence, irrespective of age rating (Linder & Gentile, 2009; Thompson & Haninger, 2001; Thompson, Tepichin, & Haninger, 2006; Yokota & Thompson, 2000). It is therefore important to ask: What are the consequences of pervasive exposure to screen violence? One consequence of media violence exposure, hotly debated by some in the general public, is increased aggressive behavior. This relationship has been investigated in many studies using experimental, longitudinal, or cross-sectional designs. These studies are summarized in meta-analyses, which support the notion that media violence increases the likelihood of acting aggressively. This link can be explained by an increase in aggressive thoughts, a more hostile perception of the environment, and less empathic reactions to victims of aggressive behavior. However, the often-debated notion that media violence allows one to let off steam, leading to a reduced likelihood of aggressive behavior, has failed to receive empirical support. The effects of media violence are not limited to aggressive behavior: as a consequence of violent media usage, attentional problems arise and prosocial behavior decreases.


2007 ◽  
Vol 73 (8) ◽  
pp. 787-791 ◽  
Author(s):  
Michael J. Stumpf ◽  
Fausto Y. Vinces ◽  
Joseph Edwards

The purpose of this article is to determine whether primary anastomosis is a safe option in the surgical management of complications of acute diverticulitis in low-risk patients. Over the past century, the management of diverticulitis has evolved from a three-stage procedure to resection and primary anastomosis. At the beginning of the century, Mayo described drainage and proximal colostomy, a three-stage procedure. This was done by performing a diverting colostomy but leaving the diseased segment of colon, in the hope that the inflammation would subside. Later, the patient returned for resection of the diseased segment. A third procedure was then performed to reverse the colostomy. Around the late 1970s to early 1980s, it was found that patients had better outcomes if the diseased segment was resected during the first operation, the Hartmann procedure. During the late 1990s to early 2000s, some surgeons began performing resection and primary anastomosis in selected groups of patients with diverticulitis. A number of published studies have shown that resection and primary anastomosis has acceptable morbidity and mortality. However, most of these studies are retrospective and do not achieve statistical significance. They also do not attempt to establish guidelines to help decide which patients are good candidates for resection and primary anastomosis. The goal of this study is to establish safe and reasonable practice guidelines that can be applied to a selected group of (low-risk) patients. This study is a retrospective review of all patients treated surgically for complications of acute diverticulitis from 1998 to 2003 at United Hospital Medical Center in Port Chester, New York. Patients were classified as high or low risk based on their age, APACHE II score, American Society of Anesthesiologists class, and Hinchey score. A total of 66 patients were operated on for complications of acute (left-sided) diverticulitis over this 5-year period. Thirty-six of them underwent resection and primary anastomosis and 30 underwent the Hartmann procedure. Of the 36 who underwent resection and primary anastomosis, 19 were considered low risk. There were no complications in this low-risk group who underwent primary anastomosis. Patients who are low risk based on the above criteria can safely undergo resection and primary anastomosis.


2020 ◽  
Vol 4 (Supplement_1) ◽  
Author(s):  
Peter Y Liu ◽  
Paul Takahashi ◽  
Rebecca Yang ◽  
Ali Iranmanesh ◽  
Johannes D Veldhuis

Abstract Introduction: In young men, sleep restriction decreases testosterone and increases afternoon cortisol, leading to anabolic-catabolic imbalance, insulin resistance and metabolic, neurocognitive, reproductive, and other adverse effects. Age-related differences in the hypothalamo-pituitary-testicular/adrenal response to sleep restriction could expose older individuals to greater or lesser risk, but this possibility has not been previously studied. Subjects and Methods: Thirty-five healthy young and older men aged 18-30y (n=17) and 60-80y (n=18) underwent blood sampling in the Mayo Clinic Center for Clinical and Translational Science every 10 minutes for 24 hours from 6PM-6PM under two conditions in random order spaced at least 3 weeks apart: awake (no sleep) or sleep (from 10PM to 6AM). Blood was assayed for LH, testosterone (Te) and cortisol (F), and then analyzed by automated mathematical deconvolution and with cross-approximate entropy statistics to determine hormone secretion and hormone synchrony, respectively. Statistical significance was assessed by repeated-measures ANOVA using a full factorial model that included age, sleep and their interaction. Results: Sleep deprivation had multiple effects on 24-hour (6PM-6PM) Te secretion, with significant reductions in mean concentrations, basal, total and pulsatile secretion, and pulse frequency (each P<0.05), in the absence of detectable changes in LH. These effects were most apparent in older men and differed according to age for some parameters: pulsatile Te secretion (P=0.03) and Te pulse frequency (P=0.02). Time-of-day analyses revealed that sleep restriction significantly reduced Te in the morning (6AM-9AM) and afternoon (3PM-6PM), reduced LH in the morning, and increased F in the afternoon, particularly in older men. Cross-approximate entropy statistics showed that sleep restriction enforced greater LH-Te and Te-LH joint synchrony in the morning (P<0.05 for each), but not in the afternoon. Conclusion: Sleep restriction decreases morning LH secretion and morning and afternoon Te secretion, and increases afternoon F secretion, especially in older men. This combination of findings could plausibly cause metabolic and reproductive ill-health when accumulated over decades of life, and may explain how chronic sleep loss contributes to metabolic and reproductive diseases that are more prevalent in older men. These preliminary data also suggest a time-of-day-dependent uncoupling of the regulatory control of the testicular axis and of cortisol secretion. Direct verification by interventions that manipulate hormones during the morning and late afternoon in appropriately matched cohorts of young and older men is now required.
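
The full-factorial age x sleep design described above (one between-subject factor, one within-subject factor, and their interaction) can be sketched as a mixed repeated-measures ANOVA. The sketch below uses pingouin's mixed_anova as one way to run such a model; the analysis software is not stated in the abstract, and the subjects and secretion values are hypothetical placeholders.

```python
# Sketch of a mixed (between x within) repeated-measures ANOVA matching the design
# described above. Tooling and data are assumptions, not the study's own.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(2)
subjects = [f"S{i:02d}" for i in range(35)]
age_group = ["young"] * 17 + ["older"] * 18
rows = []
for subj, age in zip(subjects, age_group):
    for condition in ("sleep", "no_sleep"):
        te = rng.normal(4.0 if age == "young" else 3.0, 0.8)  # hypothetical Te secretion
        if condition == "no_sleep":
            te -= 0.3 if age == "young" else 0.8              # larger drop in older men
        rows.append({"subject": subj, "age_group": age, "condition": condition, "te": te})
df = pd.DataFrame(rows)

aov = pg.mixed_anova(data=df, dv="te", within="condition",
                     between="age_group", subject="subject")
print(aov[["Source", "F", "p-unc"]])  # main effects of age and sleep, plus interaction
```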


2019 ◽  
Vol 66 (3) ◽  
pp. 299-314
Author(s):  
Jaanri Brugman ◽  
Regan Shane Solomons ◽  
Carl Lombard ◽  
Andrew Redfern ◽  
Anne-Marie Du Plessis

Abstract Introduction A computed tomography (CT) brain scan is an often-utilised emergency department imaging modality to detect emergent intra-cranial pathology in a child with a first seizure. Identifying children at low risk of having a clinically significant intra-cranial abnormality could prevent unnecessary radiation exposure and contrast/sedation-related risks. Objectives To identify clinical variables which could predict clinically significant CT brain abnormalities and to use recursive partitioning analysis to define a low-risk group of children in whom emergent CT brain can be deferred. Methods Retrospective cross-sectional review of 468 children who underwent emergent CT brain after presenting to a low- and middle-income paediatric emergency department following a first seizure. Results In total, 133/468 (28.4%) of CT brain scans had clinically significant abnormalities. Failure to return to neurological baseline and focal neurological deficit persisting >36 h reached statistical significance in a multiple regression analysis. Recursive partitioning analysis, applied to a subgroup without suspected tuberculous meningitis (n = 414), classified 153 children aged between 6 months and 5 years who had a normal neurological baseline, had returned to baseline post-seizure, and were not in status epilepticus as having non-clinically significant scans; 98% were correctly classified. Conclusion Our study reinforces the American Academy of Neurology recommendation that children with persistent post-ictal abnormal neurological status and/or post-ictal focal deficit be prioritised for emergent CT brain. Having excluded children with suspected tuberculous meningitis, the remaining subgroup aged 6 months to 5 years presenting with a non-status first seizure, a normal neurological baseline and return to baseline post-seizure is at very low risk of having a clinically significant CT brain abnormality.
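
Recursive partitioning of the kind described above grows a decision tree over clinical variables to isolate a low-risk subgroup. The sketch below uses a shallow scikit-learn decision tree as a stand-in for whatever partitioning tool the authors used, with hypothetical feature values and outcomes rather than the study's 414-patient subgroup.

```python
# Sketch of a recursive-partitioning (decision tree) analysis over the clinical
# variables named in the abstract. Tooling and data are assumptions.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(3)
n = 414
X = pd.DataFrame({
    "age_months": rng.integers(1, 156, n),
    "normal_baseline": rng.integers(0, 2, n),
    "returned_to_baseline": rng.integers(0, 2, n),
    "status_epilepticus": rng.integers(0, 2, n),
})
# Hypothetical outcome: clinically significant CT abnormality (1) vs not (0).
y = ((X["returned_to_baseline"] == 0) | (rng.random(n) < 0.1)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20, random_state=0)
tree.fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))  # inspect the resulting splits
```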


Neurology ◽  
2018 ◽  
Vol 91 (23 Supplement 1) ◽  
pp. S16.2-S16
Author(s):  
Brandon Doan ◽  
Jeff Pasley ◽  
Tiffany Rodriguez ◽  
Katherine Valencia ◽  
Tim Tolbert

Postural control is impaired following a concussion and is 1 diagnostic method used by medical professionals for return-to-play decisions in potentially concussed athletes. Circadian rhythm (time-of-day) affects human function, including postural control. This research investigated time-of-day influence on 1 postural control diagnostic protocol, the Stability Evaluation Test (SET) on a Neurocom Balance Master. The Georgia Gwinnett College Institutional Review Board approved this research protocol. The research participants were 9 healthy women with an average age of 20.4 years, height of 165.8 cm, and weight of 65.3 kg. The participants completed the SET in the morning (between 7:00 am and 10:00 am) for 1 treatment and in the evening (between 3:00 pm and 7:00 pm) for the other treatment. A SET familiarization session was completed, and treatment order was randomized and balanced to account for order effects. Average postural sway velocity for each of the 6 SET conditions was compared between times of day. There was less postural sway during the morning testing for all conditions, reaching statistical significance (p < 0.05) for 2 of the more challenging balance conditions (Foam Double Leg and Foam Tandem) as well as for the overall SET composite score. While a larger sample size and a broader age and gender range are needed, these results may begin to inform practitioners about the importance of controlling for time of day between baseline and post-injury testing, which may enable more accurate and reliable return-to-play decisions.
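
A morning-versus-evening comparison of this kind is, for each SET condition, a within-subject contrast across the 9 participants. The abstract does not name the statistical test, so the paired t-test below is an assumption, and the sway-velocity values are hypothetical placeholders.

```python
# Sketch of a paired morning-vs-evening comparison for one SET condition (n = 9).
# The test choice and the values are assumptions, not the study's data.
import numpy as np
from scipy import stats

morning_sway = np.array([0.61, 0.58, 0.72, 0.66, 0.59, 0.70, 0.64, 0.68, 0.62])  # deg/s
evening_sway = np.array([0.69, 0.66, 0.78, 0.70, 0.65, 0.79, 0.71, 0.75, 0.70])  # deg/s

t_stat, p_value = stats.ttest_rel(morning_sway, evening_sway)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```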

