Assessment of coral restoration’s contribution to reef recovery using machine learning

2020 ◽  
Author(s):  
Gaétan Morand ◽  
Simon Dixon ◽  
Thomas Le Berre

Abstract Coral restoration has emerged globally as a form of life support for coral reefs, pending urgent mitigation of anthropogenic pressure. Yet its efficiency is difficult to assess, as ambitious transplantation programs handle hundreds of thousands of fragments, whose survival rates are inherently time-intensive to monitor. Because of limited available data, the influence of most environmental and methodological factors is still unknown. We therefore propose a new method that leverages machine learning to track each colony’s individual health and growth across a large sample. This is the first time artificial intelligence techniques have been used to monitor coral at the colony scale, providing an unprecedented amount of data on coral health and growth. Here we show the influence of genus, depth, and initial fragment size, and provide an outlook on coral restoration’s efficiency. Among 77,574 fragments, the individual survival rate was 31% after 2 years (21% after 4 years), which is much lower than most reported results. In the absence of significant anthropogenic pressure, there was a depth limit below which Pocillopora fragments outperformed Acropora fragments, while the opposite was true beyond this threshold. During the mid-2019 heatwave, Pocillopora fragments were 37% more likely to survive than Acropora fragments. Overall, the total amount of live coral steadily increased over time, by more than 3,700 liters a year, as growth compensated for mortality. This supports the use of targeted coral restoration to accelerate reef recovery after mass bleaching events.
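The colony-level aggregation described above can be sketched in outline: given per-fragment monitoring records, survival rates are grouped by genus and depth band. This is a minimal illustration, not the authors’ pipeline; the records, the 4 m depth threshold, and the field layout are all invented for the example.

```python
# Hedged sketch (synthetic data, invented threshold): aggregate per-fragment
# records into 2-year survival rates by genus and shallow/deep band.
from collections import defaultdict

records = [
    # (genus, depth_m, survived_2y) -- all values synthetic
    ("Pocillopora", 2.0, True), ("Pocillopora", 2.5, False),
    ("Pocillopora", 6.0, False), ("Pocillopora", 7.0, False),
    ("Acropora", 2.0, False), ("Acropora", 2.5, False),
    ("Acropora", 6.0, True), ("Acropora", 7.0, True),
]

def survival_rate(rows):
    """Fraction of fragments in a group still alive at 2 years."""
    return sum(r[2] for r in rows) / len(rows)

def by_genus_and_band(records, depth_threshold):
    """Group fragments by genus and shallow/deep band, then compute
    the 2-year survival rate for each group."""
    groups = defaultdict(list)
    for genus, depth, alive in records:
        band = "shallow" if depth < depth_threshold else "deep"
        groups[(genus, band)].append((genus, depth, alive))
    return {k: survival_rate(v) for k, v in groups.items()}

rates = by_genus_and_band(records, depth_threshold=4.0)
```

In the study itself the alive/dead status would come from the machine-learning monitoring step rather than being recorded by hand; the aggregation logic is the part sketched here.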

2021 ◽  
Vol 14 (1) ◽  
Author(s):  
Angela Brenton-Rule ◽  
Daniel Harvey ◽  
Kevin Moran ◽  
Daniel O’Brien ◽  
Jonathon Webber

Abstract Background Podiatrists in New Zealand have a duty of care to assist patients in an emergency, and current cardiopulmonary resuscitation (CPR) certification is a requirement for registration. However, it is unknown how competent and confident podiatrists are in administering CPR and how they would respond in an emergency. Having a health professional with competent CPR knowledge and basic life support skills can improve survival rates from sudden cardiac arrest. Therefore, the aim of this study was to survey New Zealand podiatrists to determine their CPR knowledge and qualifications; beliefs about the application of CPR; and perceptions of their competency in CPR. Methods This cross-sectional study used a web-based survey. Participants were New Zealand registered podiatrists with a current annual practising certificate. The 31-item survey included questions to elicit demographic information, CPR practice and attitudes, and CPR knowledge. Responses were collected between March and August 2020. Results A total of 171 podiatrists responded to the survey. Sixteen percent of the podiatrists (n = 28) had performed CPR in an emergency, with a 50% success rate. Participants were predominantly female (n = 127, 74%) and working in private practice (n = 140, 82%). Nearly half of respondents were younger than 40 years (n = 75, 44%) and had less than 10 years of clinical experience (n = 73, 43%). Nearly all participants (n = 169, 97%) had received formal CPR training in the past two years, with 60% (n = 105) receiving training in the past 12 months. Most respondents (n = 167, 98%) self-estimated their CPR ability as being effective, very effective, or extremely effective. Participants’ knowledge of CPR was variable, with the percentage of correct answers for CPR protocol statements ranging between 20% and 90%. Conclusions This study provides the first insight into New Zealand podiatrists’ CPR knowledge and perceptions. 
Podiatrists were found to have high levels of CPR confidence but demonstrated gaps in CPR knowledge. Currently, New Zealand registered podiatrists require biennial CPR re-certification. However, resuscitation authorities in New Zealand and overseas recommend an annual update of CPR skills. Based on this study’s findings, and in line with Australia and the United Kingdom, the authors recommend a change from biennial to annual CPR re-certification for podiatrists in New Zealand. Trial registration The study was registered with the Australian New Zealand Clinical Trials Registry (ACTRN12620001144909).


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Balamurugan Sadaiappan ◽  
Chinnamani PrasannaKumar ◽  
V. Uthara Nambiar ◽  
Mahendran Subramanian ◽  
Manguesh U. Gauns

Abstract Copepods are the dominant members of the zooplankton community and among the most abundant animals on Earth. It is imperative to obtain insights into copepod-associated bacteriobiomes (CAB) in order to identify the specific bacterial taxa associated with a copepod, and to understand how they vary between different copepods. Analysing the potential genes within the CAB may reveal their intrinsic role in biogeochemical cycles. For this, machine-learning models and PICRUSt2 analysis were deployed to analyse 16S rDNA gene sequences (approximately 16 million reads) of CAB belonging to five copepod genera: Acartia spp., Calanus spp., Centropages sp., Pleuromamma spp., and Temora spp. Overall, we predict 50 sub-OTUs (s-OTUs) (gradient boosting classifiers) to be important across the five copepod genera. Among these, 15 s-OTUs were predicted to be important in Calanus spp. and 20 s-OTUs in Pleuromamma spp. Four bacterial s-OTUs, Acinetobacter johnsonii, Phaeobacter, Vibrio shilonii and Piscirickettsiaceae, were identified as important in Calanus spp., and the s-OTUs Marinobacter, Alteromonas, Desulfovibrio, Limnobacter, Sphingomonas, Methyloversatilis, Enhydrobacter and Coriobacteriaceae were predicted as important in Pleuromamma spp., for the first time. Our meta-analysis revealed that the CAB of Pleuromamma spp. had a high proportion of potential genes responsible for methanogenesis and nitrogen fixation, whereas the CAB of Temora spp. had a high proportion of potential genes involved in assimilatory sulphate reduction and cyanocobalamin synthesis. The CAB of both Pleuromamma spp. and Temora spp. carry potential genes for iron transport.
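The classification step the abstract names can be sketched minimally: a gradient boosting classifier trained on s-OTU relative abundances, whose feature importances rank candidate “important” s-OTUs for a genus pair. The data, labels, and feature count below are synthetic; this is not the authors’ code.

```python
# Hedged sketch with synthetic abundances: rank s-OTU features by
# gradient-boosting importance for separating two copepod genera.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 200
# Three hypothetical s-OTU abundance features; only s-OTU_0 is informative.
informative = rng.normal(0, 1, n)
X = np.column_stack([informative, rng.normal(0, 1, n), rng.normal(0, 1, n)])
y = (informative > 0).astype(int)  # labels illustrative, e.g. 0 = Calanus, 1 = Pleuromamma

clf = GradientBoostingClassifier(n_estimators=50, random_state=0).fit(X, y)
# Features sorted from most to least important; the top entries play the
# role of the "important s-OTUs" reported in the abstract.
ranking = np.argsort(clf.feature_importances_)[::-1]
```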


10.2196/17425 ◽  
2020 ◽  
Vol 22 (3) ◽  
pp. e17425
Author(s):  
Daniel Katz ◽  
Ronak Shah ◽  
Elizabeth Kim ◽  
Chang Park ◽  
Anjan Shah ◽  
...  

Background The incidence of cardiac arrests per year in the United States continues to increase, yet in-hospital cardiac arrest survival rates significantly vary between hospitals. Current methods of training are expensive, time consuming, and difficult to scale, which necessitates improvements in advanced cardiac life support (ACLS) training. Virtual reality (VR) has been proposed as an alternative or adjunct to high-fidelity simulation (HFS) in several environments. No evaluations to date have explored the ability of a VR program to examine both technical and behavioral skills and demonstrate a cost comparison. Objective This study aimed to explore the utility of a voice-based VR ACLS team leader refresher as compared with HFS. Methods This prospective observational study performed at an academic institution consisted of 25 postgraduate year 2 residents. Participants were randomized to HFS or VR training and then crossed groups after a 2-week washout. Participants were graded on technical and nontechnical skills. Participants also completed self-assessments about the modules. Proctors were assessed for fatigue and task saturation, and cost analysis based on local economic data was performed. Results A total of 23 of 25 participants were included in the scoring analysis. Fewer participants were familiar with VR compared with HFS (9/25, 36% vs 25/25, 100%; P<.001). Self-reported satisfaction and utilization scores were similar; however, significantly more participants felt HFS provided better feedback: 99 (IQR 89-100) vs 79 (IQR 71-88); P<.001. Technical scores were higher in the HFS group; however, nontechnical scores for decision making and communication were not significantly different between modalities. 
VR sessions were 21 (IQR 19-24) min shorter than HFS sessions, the National Aeronautics and Space Administration task load index scores for proctors were lower in each category, and VR sessions were estimated to be US $103.68 less expensive in a single-learner, single-session model. Conclusions Utilization of a VR-based team leader refresher for ACLS skills is comparable with HFS in several areas, including learner satisfaction. The VR module was more cost-effective and was easier to proctor; however, HFS was better at delivering feedback to participants. Optimal education strategies likely contain elements of both modalities. Further studies are needed to examine the utility of VR-based environments at scale.


Circulation ◽  
2007 ◽  
Vol 116 (suppl_16) ◽  
Author(s):  
Lynn J White ◽  
Sarah A Cantrell ◽  
Robert Cronin ◽  
Shawn Koser ◽  
David Keseg ◽  
...  

Introduction Long pauses without chest compressions (CC) have been identified in CPR provided by EMS professionals for out-of-hospital cardiac arrest (OOHCA). The 2005 AHA ECC CPR guidelines emphasize CC. The 2005 AHA Basic Life Support (BLS) for Healthcare Professionals (HCP) course introduced a training method with more CPR skills practice during the DVD-based course. The purpose of this before/after study was to determine whether CC rates increased after introduction of the 2005 course. Methods This urban EMS system has 400 cardiac-etiology OOHCA events annually. A convenience sample of 49 continuous electronic ECG recordings of VF patients was analyzed with the impedance channel of the LIFEPAK 12 (Physio-Control, Redmond, WA) and proprietary software. A trained researcher verified the automated analysis. Each CC during the resuscitation attempt and pauses in CC before and after the first defibrillation shock were noted. The time of return of spontaneous circulation (ROSC) was determined by medical record review and onset of regular electrical activity without CC. Medical records were reviewed for outcome to hospital discharge. The EMS patient care protocol for VF was changed on July 1, 2006 to comply with the 2005 AHA ECC guidelines. Cases were grouped by the OOHCA date: 9/2004 to 12/31/2006 (pre) and 7/1/2006 to 4/21/2007 (post). EMS personnel began taking the 2005 BLS for HCP course during spring 2006. Monthly courses over 3 years will recertify 1500 personnel. Results A total of 29 cases were analyzed from the pre group and 20 from the post group. Compressions per minute increased from a mean (±SD) of 47 ± 16 pre to 75 ± 33 post (P < 0.01). The mean number of shocks given per victim decreased from 4.5 ± 4.0 pre to 2.8 ± 1.8 post (P < 0.04). The CC pause before the first shock was unchanged (23.6 ± 18.4 seconds pre to 22.1 ± 17.9 post), but the CC pause following that shock decreased significantly, from 48.7 ± 63.2 to 11.8 ± 22.5 seconds (P = 0.008). 
Rates of ROSC (55% pre, 50% post) and survival to discharge (15% pre, 13% post) were similar. Conclusion Following introduction of the 2005 BLS for HCP course and the EMS protocol change, the quality of CPR delivered to victims of OOHCA improved significantly compared with pre-2006 CPR. The sample size was too small to detect differences in survival rates.
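The pre/post comparison of compression rates can be checked from the summary statistics alone. The sketch below applies Welch’s t-test to the reported means and SDs (47 ± 16, n = 29 vs 75 ± 33, n = 20); the abstract does not state which test the authors used, so the choice of Welch’s test is an assumption for illustration.

```python
# Hedged sketch: Welch's t statistic and degrees of freedom computed
# directly from the summary statistics reported in the abstract.
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic and Welch-Satterthwaite df from summary stats."""
    se1, se2 = sd1**2 / n1, sd2**2 / n2
    t = (mean2 - mean1) / math.sqrt(se1 + se2)
    df = (se1 + se2) ** 2 / (se1**2 / (n1 - 1) + se2**2 / (n2 - 1))
    return t, df

# Compressions per minute: pre 47 +/- 16 (n=29), post 75 +/- 33 (n=20)
t, df = welch_t(47, 16, 29, 75, 33, 20)
```

A t of about 3.5 on roughly 25 degrees of freedom is consistent with the reported P < 0.01.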


Circulation ◽  
2007 ◽  
Vol 116 (suppl_16) ◽  
Author(s):  
Tom P Aufderheide ◽  
Marvin Birnbaum ◽  
Charles Lick ◽  
Brent Myers ◽  
Laurie Romig ◽  
...  

Introduction: Maximizing outcomes after cardiac arrest depends on optimizing a sequence of interventions from collapse to hospital discharge. The 2005 American Heart Association (AHA) Guidelines recommended many new interventions during CPR (‘New CPR’) including use of an Impedance Threshold Device (ITD). Hypothesis: The combination of the ITD and ‘New CPR’ will increase return of spontaneous circulation (ROSC) and hospital discharge (HD) rates in patients with an out-of-hospital cardiac arrest. Methods: Quality assurance data were pooled from 7 emergency medical services (EMS) systems (Anoka Co., MN; Harris Co., TX; Madison, WI; Milwaukee, WI; Omaha, NE; Pinellas Co., FL; and Wake Co., NC) where the ITD (ResQPOD®, Advanced Circulatory Systems; Minneapolis, MN) was deployed for >3 months. Historical or concurrent control data were used for comparison. The EMS systems simultaneously implemented ‘New CPR’ including compression/ventilation strategies to provide more compressions/min and continuous compressions during Advanced Life Support. All sites stressed the importance of full chest wall recoil. The sites have a combined population of ~ 3.2 M. ROSC data were available from all sites; HD data were available as of June 2007 from 5 sites (MN, TX, Milwaukee, NE, NC). Results: A total of 893 patients treated with ‘New CPR’ + ITD were compared with 1424 control patients. The average age of both study populations was 64 years; 65% were male. Comparison of the ITD vs controls (all patients) for ROSC and HD [Odds ratios (OR), (95% confidence intervals), and Fisher’s Exact Test] were: 37.9% vs 33.8% [1.2, (1.02, 1.40), p=0.022] and 15.7% vs 7.9% [2.2, (1.53, 3.07), p<0.001], respectively. Patients with ventricular fibrillation had the best outcomes in both groups. Neurological outcome data are pending. Therapeutic hypothermia was used in some patients (MN, NC) after ROSC. 
Conclusion: Adoption of the ITD + ‘New CPR’ resulted in only a >10% increase in ROSC rates but a doubling of hospital discharge rates, from 7.9% to 15.7% (p<0.001). These data represent a currently optimized sequence of therapeutic interventions during the performance of CPR for patients in cardiac arrest and support the widespread use of the 2005 AHA CPR Guidelines including use of the ITD.
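The ROSC comparison can be approximately reproduced from the reported percentages. The 2×2 counts below are reconstructed (37.9% of 893 ≈ 338; 33.8% of 1424 ≈ 481), so the odds ratio matches the reported 1.2 only approximately; this illustrates the Fisher’s exact test computation, not the study’s own analysis.

```python
# Hedged sketch: odds ratio and Fisher's exact test on ROSC counts
# reconstructed from the abstract's percentages (approximate).
from scipy.stats import fisher_exact

table = [[338, 893 - 338],   # ITD + 'New CPR': ROSC yes / no
         [481, 1424 - 481]]  # controls:        ROSC yes / no
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
```

The reconstructed odds ratio comes out near the reported 1.2; the exact p-value differs slightly from the abstract’s because the counts are rounded and the abstract’s test sidedness is not stated.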


2019 ◽  
Vol 06 (01) ◽  
pp. 17-28 ◽  
Author(s):  
Hoang Pham ◽  
David H. Pham

In real-life applications, we often do not have population data, but we can collect many samples from a large body of data. In this paper, we propose a median-based machine-learning approach and algorithm to estimate the parameter of the Bernoulli distribution. We illustrate the proposed median approach by generating various sample datasets from a Bernoulli population distribution to validate the accuracy of the proposed approach. We also analyze the effectiveness of the median methods using machine-learning techniques, including a correction method and logistic regression. Our results show that the median-based measure outperforms the mean measure in machine-learning applications using sampling-distribution approaches.
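A minimal sketch of the underlying idea, assuming the paper’s setup of repeated samples drawn from a Bernoulli population: the parameter p can be estimated either from the mean or from the median of the sample proportions. The sample counts, seed, and true p below are arbitrary, and this is not the authors’ algorithm.

```python
# Hedged sketch: mean-of-proportions vs median-of-proportions estimators
# for a Bernoulli parameter, on synthetic repeated samples.
import numpy as np

rng = np.random.default_rng(42)
p_true = 0.3
n_samples, sample_size = 500, 50

# Each entry is the success proportion of one sample of size 50.
proportions = rng.binomial(sample_size, p_true, size=n_samples) / sample_size

mean_estimate = proportions.mean()        # mean-based estimate of p
median_estimate = np.median(proportions)  # median-based estimate of p
```

With a symmetric-enough sampling distribution both estimators land close to p; the paper’s claim concerns settings where the median proves more robust.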


2014 ◽  
Author(s):  
James J Roper ◽  
André M. X. Lima ◽  
Angélica M. K. Uejima

Food limitation may interact with nest predation to influence nesting patterns, such as breeding season length and renesting intervals. If so, reproductive effort should change with food availability: when food is limited, birds should make fewer nesting attempts and have shorter seasons than when food is not limiting. Here we experimentally test whether increased food availability results in increased reproductive effort in the Variable Antshrike (Thamnophilus caerulescens) in a fragmented landscape in southern Brazil. We followed nesting pairs in five natural fragments (4, 23, 24, 112, and 214 ha), supplementing food for half of those pairs beginning with their first nest. Nest success was 59% in the largest (214 ha) fragment, compared with 5% in the 112 ha fragment, and no nest was successful in the 24 ha fragment. Birds were seen in the two smallest fragments, but evidence of nesting was never found there. Pairs with supplemented food were more likely to increase clutch size from two to three eggs and tended to renest sooner (by 20 d on average) than control pairs. Fragment size also interacted with breeding: pairs in the largest fragment had greater daily nest survival rates, so their nests tended to last longer and they made fewer nesting attempts than pairs in the 112 ha fragment, though more than pairs in the smallest fragment with nesting (24 ha). Clearly, pairs increased their reproductive effort when food was supplemented, and fragment size seems to influence both predation risk and food abundance.


10.2196/24798 ◽  
2020 ◽  
Vol 4 (11) ◽  
pp. e24798
Author(s):  
Simon Regard ◽  
Django Rosa ◽  
Mélanie Suppan ◽  
Chiara Giangaspero ◽  
Robert Larribau ◽  
...  

Background Victims of out-of-hospital cardiac arrest (OHCA) have higher survival rates and more favorable neurological outcomes when basic life support (BLS) maneuvers are initiated quickly after collapse. Although more than half of OHCAs are witnessed, BLS is infrequently provided, thereby worsening the survival and neurological prognoses of OHCA victims. According to the theory of planned behavior, the probability of executing an action is strongly linked to the intention of performing it. This intention is determined by three distinct dimensions: attitude, subjective normative beliefs, and control beliefs. We hypothesized that there could be a decrease in one or more of these dimensions even shortly after the last BLS training session. Objective The aim of this study was to measure the variation of the three dimensions of the intention to perform resuscitation according to the time elapsed since the last first-aid course. Methods Between January and April 2019, the two largest companies delivering first-aid courses in the region of Geneva, Switzerland, sent invitation emails on our behalf to people who had followed a first-aid course between January 2014 and December 2018. Participants were asked to answer a set of 17 psychometric questions based on a 4-point Likert scale (“I don’t agree,” “I partially agree,” “I agree,” and “I totally agree”) designed to assess the three dimensions of the intention to perform resuscitation. The primary outcome was the difference in each of these dimensions between participants who had followed a first-aid course less than 6 months before taking the questionnaire and those who took the questionnaire more than 6 months and up to 5 years after following such a course. Secondary outcomes were the change in each dimension using cutoffs at 1 year and 2 years, and the change regarding each individual question using cutoffs at 6 months, 1 year, and 2 years. Univariate and multivariable linear regression were used for analyses. 
Results A total of 204 surveys (76%) were analyzed. After adjustment, control beliefs was the only dimension that was significantly lower in participants who took the questionnaire more than 6 months after their last BLS course (P<.001). Resisting diffusion of responsibility, a key element of subjective normative beliefs, was also less likely in this group (P=.001). By contrast, members of this group were less afraid of disease transmission (P=.03). However, fear of legal action was higher in this group (P=.02). Conclusions Control beliefs already show a significant decrease 6 months after the last first-aid course. Short interventions should be designed to restore this dimension to its immediate postcourse state. This could enhance the provision of BLS maneuvers in cases of OHCA.
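A hedged sketch of how such Likert items might be scored and compared between groups: map the 4-point scale to 1–4, average the items belonging to one dimension per respondent, and regress the dimension score on group membership. The items, responses, and group sizes below are invented; the study’s actual 17-item instrument and multivariable models are not reproduced.

```python
# Hedged sketch with invented responses: score a "control beliefs"
# dimension from 4-point Likert items and regress it on whether more
# than 6 months have elapsed since the last course.
import numpy as np

LIKERT = {"I don't agree": 1, "I partially agree": 2,
          "I agree": 3, "I totally agree": 4}

def dimension_score(responses):
    """Mean Likert value over the items belonging to one dimension."""
    return sum(LIKERT[r] for r in responses) / len(responses)

recent = [dimension_score(r) for r in (          # course < 6 months ago
    ["I totally agree", "I agree", "I agree"],
    ["I agree", "I totally agree", "I totally agree"],
)]
late = [dimension_score(r) for r in (            # course > 6 months ago
    ["I partially agree", "I agree", "I don't agree"],
    ["I partially agree", "I partially agree", "I agree"],
)]

# Univariate regression on a 0/1 group indicator: the slope equals the
# difference in group means, mean(late) - mean(recent).
x = np.array([0] * len(recent) + [1] * len(late))
y = np.array(recent + late)
slope = np.polyfit(x, y, 1)[0]
```

A negative slope corresponds to the abstract’s finding that control beliefs are lower in the later group.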


2021 ◽  
Vol 10 (16) ◽  
pp. 3583
Author(s):  
Styliani Syntila ◽  
Georgios Chatzis ◽  
Birgit Markus ◽  
Holger Ahrens ◽  
Christian Waechter ◽  
...  

Our aim was to compare the outcomes of Impella with extracorporeal life support (ECLS) in patients with post-cardiac arrest cardiogenic shock (CS) complicating acute myocardial infarction (AMI). This was a retrospective study of patients resuscitated from out-of-hospital cardiac arrest (OHCA) with post-cardiac arrest CS following AMI (May 2015 to May 2020). Patients were supported either with Impella 2.5/CP or ECLS. Outcomes were compared using propensity score-matched analysis to account for differences in baseline characteristics between groups. A total of 159 patients were included (Impella, n = 105; ECLS, n = 54). Hospital and 12-month survival rates were comparable in the Impella and the ECLS groups (p = 0.16 and p = 0.3, respectively). After adjustment for baseline differences, both groups demonstrated comparable hospital and 12-month survival (p = 0.36 and p = 0.64, respectively). Impella patients had a significantly greater left ventricular ejection fraction (LVEF) improvement at 96 h (p < 0.01 vs. p = 0.44 in ECLS) and significantly fewer device-associated complications than ECLS patients (15.2% versus 35.2%, p < 0.01, for relevant access-site bleeding; 7.6% versus 20.4%, p = 0.04, for limb ischemia needing intervention). In subgroup analyses, Impella was associated with better survival in patients with lower-risk features (lactate < 8.6 mmol/L, time from collapse to return of spontaneous circulation < 28 min, vasoactive score < 46 and Horowitz index > 182). In conclusion, the use of Impella 2.5/CP or ECLS in post-cardiac arrest CS after AMI was associated with comparable adjusted hospital and 12-month survival. Impella patients had greater LVEF improvement than ECLS patients. Device-related access-site complications occurred more frequently with ECLS than with Impella support.
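The propensity score matching named in the abstract can be sketched in two steps: model the probability of receiving one device from baseline covariates, then greedily pair each treated patient with the nearest-scoring control. Data, covariates, and the 1:1 greedy matching rule below are all illustrative, not taken from the study.

```python
# Hedged sketch of propensity score matching on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 159
# Synthetic baseline covariates (e.g. age, lactate) and device group
# (1 = Impella, 0 = ECLS); assignment depends on the covariates.
X = rng.normal(size=(n, 2))
treat = (X[:, 0] + rng.normal(0, 1, n) > 0).astype(int)

# Step 1: estimate each patient's propensity score P(Impella | covariates).
ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]

# Step 2: greedy 1:1 nearest-neighbour matching on the propensity score,
# each control used at most once.
treated = np.flatnonzero(treat == 1)
controls = list(np.flatnonzero(treat == 0))
pairs = []
for t in treated:
    if not controls:
        break
    j = min(controls, key=lambda c: abs(ps[c] - ps[t]))
    pairs.append((t, j))
    controls.remove(j)
```

Outcomes would then be compared within the matched pairs; real analyses typically also add a caliper on the score distance and check covariate balance after matching.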


Critical Care ◽  
2021 ◽  
Vol 25 (1) ◽  
Author(s):  
Lucas M. Fleuren ◽  
Tariq A. Dam ◽  
Michele Tonutti ◽  
Daan P. de Bruin ◽  
Robbert C. A. Lalisang ◽  
...  

Abstract Introduction Determining the optimal timing for extubation can be challenging in the intensive care unit. In this study, we aim to identify predictors of extubation failure in critically ill patients with COVID-19. Methods We used highly granular data from 3464 adult critically ill COVID-19 patients in the multicenter Dutch Data Warehouse, including demographics, clinical observations, medications, fluid balance, laboratory values, vital signs, and data from life support devices. All intubated patients with at least one extubation attempt were eligible for analysis. Transferred patients, patients admitted for less than 24 h, and patients still admitted at the time of data extraction were excluded. Potential predictors were selected by a team of intensive care physicians. The primary and secondary outcomes were extubation without reintubation or death within the next 7 days and within 48 h, respectively. We trained and validated multiple machine learning algorithms using fivefold nested cross-validation. Predictor importance was estimated using Shapley additive explanations, while cutoff values for the relative probability of failed extubation were estimated through partial dependence plots. Results A total of 883 patients were included in the model derivation. The reintubation rate was 13.4% within 48 h and 18.9% at day 7, with mortality rates of 0.6% and 1.0%, respectively. The gradient boosting model performed best (area under the curve of 0.70) and was used to calculate predictor importance. Ventilatory characteristics and settings were the most important predictors. More specifically, a controlled mode duration longer than 4 days, a last fraction of inspired oxygen higher than 35%, a mean tidal volume per kg ideal body weight above 8 ml/kg in the day before extubation, and a shorter duration in assisted mode (< 2 days), all relative to their median values, were associated with extubation failure. 
Additionally, a higher C-reactive protein and leukocyte count, a lower thrombocyte count, a lower Glasgow coma scale and a lower body mass index compared to their medians were associated with extubation failure. Conclusion The most important predictors for extubation failure in critically ill COVID-19 patients include ventilatory settings, inflammatory parameters, neurological status, and body mass index. These predictors should therefore be routinely captured in electronic health records.
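The validation scheme the abstract describes, fivefold nested cross-validation around a gradient boosting classifier, can be sketched with scikit-learn: hyperparameters are tuned in an inner loop while an outer loop estimates generalization performance. The dataset and hyperparameter grid below are synthetic stand-ins, and gain-based feature importances would replace the SHAP analysis used in the study.

```python
# Hedged sketch: fivefold nested cross-validation (inner GridSearchCV,
# outer cross_val_score) with a gradient boosting classifier on
# synthetic data standing in for the ICU predictors.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, cross_val_score

X, y = make_classification(n_samples=300, n_features=8, random_state=0)

inner = GridSearchCV(                      # inner loop: tune hyperparameters
    GradientBoostingClassifier(random_state=0),
    param_grid={"n_estimators": [25, 50]},
    cv=5,
)
scores = cross_val_score(inner, X, y, cv=5, scoring="roc_auc")  # outer loop
mean_auc = scores.mean()
```

Each outer fold refits the inner search from scratch, so `mean_auc` is not biased by the hyperparameter tuning, which is the point of nesting the loops.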

