Assessment of preparation time and 1-year Invisalign aligner attachment survival using flowable and packable composites

2021
Author(s): Shuang Lin, Ling Huang, Jialing Li, Juan Wen, Li Mei, ...

ABSTRACT Objectives To compare preparation time and 1-year Invisalign aligner attachment survival between a flowable composite (FC) and a packable composite (PC). Materials and Methods Fifty-five participants (13 men and 42 women; mean age ± SD, 24.2 ± 5.9 years) were included in the study. Ipsilateral quadrants (ie, maxillary and mandibular right, or vice versa) of attachments were randomly assigned to the FC group (Filtek Z350XT Flowable Restorative) and the PC group (Filtek Z350XT Universal Restorative) by coin toss. The primary outcome was preparation time. The secondary outcome was time to the first damage of an attachment. Preparation times were compared using the paired t-test, and the survival data were analyzed by the Cox proportional hazards model with a shared frailty term, with α = .05. Results The preparation times were significantly shorter with the FC (6.22 ± 0.22 seconds per attachment) than with the PC (32.83 ± 2.16 seconds per attachment; P < .001). The attachment damage rates were 14.79% for the FC and 9.70% for the PC. According to the Cox models, attachment damage was not significantly affected by the attachment material, sex, arch, tooth location, attachment type, presence of overbite, or occurrence of tooth extraction. Conclusions The use of an FC may save time compared with the use of a PC. With regard to attachment survival, there was no significant difference between the two composites. None of the covariates (attachment material, sex, arch, tooth location, attachment type, presence of overbite, or occurrence of tooth extraction) significantly affected attachment damage.
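
The Methods above name two analyses: a paired t-test on preparation times and a Cox model with a shared frailty term for within-patient clustering of attachments. The sketch below illustrates both in Python; all column names are hypothetical, and because lifelines has no built-in shared-frailty Cox model, a cluster-robust variance (cluster_col) stands in for the frailty term as an approximation.

```python
import pandas as pd
from scipy.stats import ttest_rel
from lifelines import CoxPHFitter

# Paired t-test on per-participant mean preparation times (seconds per attachment)
prep = pd.read_csv("prep_times.csv")   # hypothetical columns: patient_id, fc_time, pc_time
res = ttest_rel(prep["fc_time"], prep["pc_time"])
print(f"paired t-test: t = {res.statistic:.2f}, p = {res.pvalue:.3g}")

# Attachment-level time to first damage; cluster-robust SEs approximate the
# shared frailty term used in the study.
surv = pd.read_csv("attachments.csv")  # time_days, damaged, material, sex, arch,
                                       # tooth_location, attachment_type, patient_id
cph = CoxPHFitter()
cph.fit(surv, duration_col="time_days", event_col="damaged", cluster_col="patient_id",
        formula="material + sex + arch + tooth_location + attachment_type")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%", "p"]])
```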

2020
Vol 41 (Supplement_2)
Author(s): A Funada, Y Goto, T Maeda, H Okada, M Takamura

Abstract Background/Introduction An initial shockable rhythm after cardiac arrest is more likely when bystander cardiopulmonary resuscitation (CPR) is initiated early, owing to increased coronary perfusion. However, the relationship between bystander CPR and initial shockable rhythm in patients with out-of-hospital cardiac arrest (OHCA) remains unclear. We hypothesized that chest-compression-only CPR (CC-CPR) before emergency medical service (EMS) arrival has an effect on the likelihood of an initial shockable rhythm equivalent to that of standard CPR (chest compression plus rescue breathing [S-CPR]). Purpose We aimed to examine the rate of initial shockable rhythm and 1-month outcomes in patients who received bystander CPR after OHCA. Methods The study included 59,688 patients (age ≥18 years) who received bystander CPR after an OHCA of presumed cardiac origin witnessed by a layperson, drawn from a prospectively recorded Japanese nationwide Utstein-style database from 2013 to 2017. Patients who received public-access defibrillation before arrival of the EMS personnel were excluded. The patients were divided into CC-CPR (n=51,520) and S-CPR (n=8,168) groups according to the type of bystander CPR received. The primary end point was an initial shockable rhythm recorded by the EMS personnel just after arrival at the site. The secondary end point was the 1-month outcomes (survival and neurologically intact survival) after OHCA. In the statistical analyses, a Cox proportional hazards model was applied to reflect the different bystander CPR durations before/after propensity score (PS) matching. Results The crude rate of initial shockable rhythm in the CC-CPR group (21.3%, 10,946/51,520) was significantly higher than that in the S-CPR group (17.6%, 1,441/8,168; p<0.0001) before PS matching. However, no significant difference in the rate of initial shockable rhythm was found between the 2 groups after PS matching (18.3% [1,493/8,168] vs 17.6% [1,441/8,168], p=0.30). In the Cox proportional hazards model, CC-CPR was more negatively associated with initial shockable rhythm than S-CPR before PS matching (unadjusted hazard ratio [HR], 0.97; 95% confidence interval [CI], 0.94–0.99; p=0.012; adjusted HR, 0.92; 95% CI, 0.89–0.94; p<0.0001). After PS matching, however, no significant difference was found between the 2 groups (adjusted HR of CC-CPR compared with S-CPR, 0.97; 95% CI, 0.94–1.00; p=0.09). No significant differences were found between CC-CPR and S-CPR in the 1-month outcomes after PS matching, as follows, respectively: survival, 8.5% and 10.1% (adjusted odds ratio, 0.89; 95% CI, 0.79–1.00; p=0.07); cerebral performance category 1 or 2, 5.5% and 6.9% (adjusted odds ratio, 0.86; 95% CI, 0.74–1.00; p=0.052). Conclusions Compared with S-CPR, CC-CPR before EMS arrival had an equivalent multivariable-adjusted association with the likelihood of an initial shockable rhythm in patients with layperson-witnessed OHCA of presumed cardiac origin. Funding Acknowledgement Type of funding source: None
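
The propensity-score matching step described in the Methods can be sketched as a logistic regression for the probability of receiving CC-CPR followed by 1:1 nearest-neighbor matching on the estimated score. The covariates and column names below are hypothetical; this is illustrative only, not the study's actual pipeline.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("ohca.csv")            # cc_cpr: 1 = CC-CPR, 0 = S-CPR
covariates = ["age", "sex", "witness_to_cpr_min", "ems_response_min"]

# 1. Estimate propensity scores for receiving CC-CPR
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["cc_cpr"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Match each S-CPR patient to the nearest CC-CPR patient on the propensity score
s_cpr = df[df["cc_cpr"] == 0].reset_index(drop=True)
cc_cpr = df[df["cc_cpr"] == 1].reset_index(drop=True)
nn = NearestNeighbors(n_neighbors=1).fit(cc_cpr[["ps"]])
_, idx = nn.kneighbors(s_cpr[["ps"]])
matched = pd.concat([s_cpr, cc_cpr.iloc[idx.ravel()]], ignore_index=True)

# 3. Compare initial shockable rhythm rates in the matched sample
print(matched.groupby("cc_cpr")["shockable_rhythm"].mean())
```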


Circulation
2020
Vol 142 (Suppl_3)
Author(s): Jian-jun Li, Yexuan Cao, Hui-Wen Zhang, Jing-Lu Jin, Yan Zhang, ...

Introduction: The atherogenicity of residual cholesterol (RC) has been underlined by recent guidelines; RC has been linked to coronary artery disease (CAD), especially in patients with diabetes mellitus (DM). Hypothesis: This study aimed to examine the prognostic value of plasma RC, clinically presented as triglyceride-rich lipoprotein cholesterol (TRL-C) or remnant-like lipoprotein particle cholesterol (RLP-C), in CAD patients with different glucose metabolism status. Methods: Fasting plasma TRL-C and RLP-C levels were directly calculated or measured in 4,331 patients with CAD. Patients were followed for incident MACEs for up to 8.6 years and categorized according to both glucose metabolism status [DM, pre-DM, normal glycaemia regulation (NGR)] and RC levels. A Cox proportional hazards model was used to calculate hazard ratios (HRs) with 95% confidence intervals. Results: During a mean follow-up of 5.1 years, 541 (12.5%) MACEs occurred. The risk for MACEs was significantly higher in patients with elevated RC levels after adjustment for potential confounders. No significant difference in MACEs was observed between the pre-DM and NGR groups (p>0.05). When stratified by glucose metabolism status and RC level, the highest levels of RLP-C, calculated TRL-C, and measured TRL-C were significant and independent predictors of developing MACEs in pre-DM (HR: 2.10, 1.98, and 1.92, respectively; all p<0.05) and DM (HR: 2.25, 2.00, and 2.16, respectively; all p<0.05). Conclusions: In this large cohort study with long-term follow-up, the data demonstrated for the first time that higher RC levels were significantly associated with worse prognosis in DM and pre-DM patients with CAD, suggesting that RC might be a target for patients with impaired glucose metabolism.
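
A minimal sketch of the stratified analysis described above, with hazard ratios for elevated RC within each glucose-metabolism stratum, is shown below; the column names, covariates, and binary high-RC coding are assumptions for illustration.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cad_cohort.csv")   # followup_years, mace, high_rc (0/1),
                                     # glucose_status, age, sex, ldl_c
for status in ["NGR", "pre-DM", "DM"]:
    sub = df[df["glucose_status"] == status]
    cph = CoxPHFitter().fit(sub, duration_col="followup_years", event_col="mace",
                            formula="high_rc + age + sex + ldl_c")
    row = cph.summary.loc["high_rc"]
    print(f"{status}: HR = {row['exp(coef)']:.2f} "
          f"(95% CI {row['exp(coef) lower 95%']:.2f}-{row['exp(coef) upper 95%']:.2f}), "
          f"p = {row['p']:.3g}")
```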


2018
Vol 5 (suppl_1)
pp. S426-S426
Author(s): Christopher M Rubino, Lukas Stulik, Harald Rouha, Zehra Visram, Adriana Badarau, ...

Abstract Background ASN100 is a combination of two co-administered fully human monoclonal antibodies (mAbs), ASN-1 and ASN-2, that together neutralize the six cytotoxins critical to S. aureus pneumonia pathogenesis. ASN100 is in development for prevention of S. aureus pneumonia in mechanically ventilated patients. A pharmacometric approach to dose discrimination in humans was taken in order to bridge from dose-ranging survival studies in rabbits to anticipated human exposures, using an mPBPK model derived from data from rabbits (infected and noninfected) and noninfected humans [IDWeek 2017, Poster 1849]. Survival in rabbits was assumed to be indicative of a protective effect through ASN100 neutralization of S. aureus toxins. Methods Data from studies in rabbits (placebo through 20 mg/kg single doses of ASN100; four strains representing MRSA and MSSA isolates with different toxin profiles) were pooled with data from a PK and efficacy study in infected rabbits (placebo and 40 mg/kg ASN100) [IDWeek 2017, Poster 1844]. A Cox proportional hazards model was used to relate survival to both strain and mAb exposure. Monte Carlo simulation was then applied to generate ASN100 exposures for simulated patients given a range of ASN100 doses and infection with each strain (n = 500 per scenario) using an mPBPK model. Using the Cox model, the probability of full protection from toxins (i.e., predicted survival) was estimated for each simulated patient. Results Cox models showed that survival in rabbits is dependent on both strain and ASN100 exposure in lung epithelial lining fluid (ELF). At the human doses simulated (360–10,000 mg of ASN100), full or substantial protection is expected for all four strains tested. For the most virulent strain tested in the rabbit pneumonia study (a PVL-negative MSSA, Figure 1), the clinical dose of 3,600 mg of ASN100 provides substantially higher predicted effect relative to lower doses, while doses above 3,600 mg are not predicted to provide significant additional protection. Conclusion A pharmacometric approach allowed for the translation of rabbit survival data to infected patients as well as discrimination of potential clinical doses. These results support the ASN100 dose of 3,600 mg currently being evaluated in a Phase 2 S. aureus pneumonia prevention trial. Disclosures C. M. Rubino, Arsanis, Inc.: Research Contractor, Research support. L. Stulik, Arsanis Biosciences GmbH: Employee, Salary. H. Rouha, Arsanis Biosciences GmbH: Employee, Salary. Z. Visram, Arsanis Biosciences GmbH: Employee, Salary. A. Badarau, Arsanis Biosciences GmbH: Employee, Salary. S. A. Van Wart, Arsanis, Inc.: Research Contractor, Research support. P. G. Ambrose, Arsanis, Inc.: Research Contractor, Research support. M. M. Goodwin, Arsanis, Inc.: Employee, Salary. E. Nagy, Arsanis Biosciences GmbH: Employee, Salary.
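
The translational step described above (a Cox model fit to rabbit survival as a function of strain and ELF exposure, then applied to simulated human exposures) could be sketched roughly as follows. The exposure distribution, strain label, and column names are hypothetical, and the actual mPBPK model is not reproduced here.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rabbits = pd.read_csv("rabbit_survival.csv")   # time_h, died, elf_auc, strain
cph = CoxPHFitter().fit(rabbits, duration_col="time_h", event_col="died",
                        formula="elf_auc + strain")   # strain treated as categorical

# Monte Carlo: simulated human ELF exposures for one candidate dose
# (in the study these would come from the mPBPK model).
rng = np.random.default_rng(0)
sim = pd.DataFrame({
    "elf_auc": rng.lognormal(mean=np.log(200), sigma=0.4, size=500),  # assumed distribution
    "strain": "MSSA_PVL_neg",        # must be one of the strains in the rabbit data
})
surv_at_end = cph.predict_survival_function(sim, times=[168])   # predicted survival at 168 h
print("median predicted survival:", surv_at_end.iloc[0].median())
```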


2021
Vol 11
Author(s): Jason C. Sanders, Donald A. Muller, Sunil W. Dutta, Taylor J. Corriher, Kari L. Ring, ...

Objectives To investigate the safety and outcomes of elective para-aortic (PA) nodal irradiation utilizing modern treatment techniques for patients with node-positive cervical cancer. Methods Patients with pelvic lymph node-positive cervical cancer who received radiation were included. All patients received radiation therapy (RT) to either a traditional pelvic field or an extended field to electively cover the PA nodes. Factors associated with survival were identified using a Cox proportional hazards model, and toxicities between groups were compared with a chi-square test. Results 96 patients were identified, with a mean follow-up of 40 months. The incidence of acute grade ≥ 2 toxicity was 31% in the elective PA nodal RT group and 15% in the pelvic field group (chi-square p = 0.067). There was no significant difference in rates of grade ≥ 3 acute or late toxicities between the two groups (p > 0.05). The Kaplan-Meier estimated 5-year OS was not statistically different for those receiving elective PA nodal irradiation compared to a pelvic-only field, 54% vs. 73% respectively (log-rank p = 0.11). Conclusions Elective PA nodal RT can safely be delivered utilizing modern planning techniques without a significant increase in severe (grade ≥ 3) acute or late toxicities, at the cost of a possible small increase in non-severe (grade 2) acute toxicities. In this series there was no survival benefit observed with the receipt of elective PA nodal RT; however, this benefit may have been obscured by the higher-risk features of this population. While prospective randomized trials utilizing a risk-adapted approach are the only way to fully evaluate the benefit of elective PA nodal coverage, such trials are unlikely to be performed; instead, we must rely on interpreting the results of risk-adapted approaches like those used in ongoing clinical trials and on retrospective data.
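
The two comparisons reported above (a chi-square test for acute grade ≥ 2 toxicity and a log-rank comparison of overall survival) could be reproduced along the lines of the sketch below, with a hypothetical data layout.

```python
import pandas as pd
from scipy.stats import chi2_contingency
from lifelines.statistics import logrank_test

df = pd.read_csv("cervical_rt.csv")   # group ('PA' or 'pelvic'), acute_ge2, os_months, death

# Acute grade >= 2 toxicity: chi-square on the 2x2 table
chi2, p, dof, _ = chi2_contingency(pd.crosstab(df["group"], df["acute_ge2"]))
print(f"grade >=2 acute toxicity: chi2 = {chi2:.2f}, p = {p:.3f}")

# Overall survival: log-rank test between the two treatment fields
pa, pelvic = df[df["group"] == "PA"], df[df["group"] == "pelvic"]
res = logrank_test(pa["os_months"], pelvic["os_months"],
                   event_observed_A=pa["death"], event_observed_B=pelvic["death"])
print(f"log-rank p = {res.p_value:.3f}")
```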


2021
Author(s): Miguel I. Paredes, Stephanie Lunn, Michael Famulare, Lauren A. Frisbie, Ian Painter, ...

Background: The COVID-19 pandemic is now dominated by variant lineages; the resulting impact on disease severity remains unclear. Using a retrospective cohort study, we assessed the risk of hospitalization following infection with nine variants of concern or interest (VOC/VOI). Methods: Our study includes individuals with a positive SARS-CoV-2 RT-PCR in the Washington Disease Reporting System and with available viral genome data, from December 1, 2020 to July 30, 2021. The main analysis was restricted to cases with specimens collected through sentinel surveillance. Using a Cox proportional hazards model with mixed effects, we estimated hazard ratios (HR) for the risk of hospitalization following infection with a VOC/VOI, adjusting for age, sex, and vaccination status. Findings: Of the 27,814 cases, 23,170 (83.3%) were sequenced through sentinel surveillance, of which 726 (3.1%) were hospitalized due to COVID-19. Higher hospitalization risk was found for infections with Gamma (HR 3.17, 95% CI 2.15–4.67), Beta (HR 2.97, 95% CI 1.65–5.35), Delta (HR 2.30, 95% CI 1.69–3.15), and Alpha (HR 1.59, 95% CI 1.26–1.99) compared to infections with an ancestral lineage. Following VOC infection, unvaccinated patients showed a similarly elevated hospitalization risk, while vaccinated patients showed no significant difference in risk, both compared to unvaccinated patients with an ancestral lineage. Interpretation: Infection with a VOC results in a higher hospitalization risk, which is attenuated by vaccination. Our findings support promoting hospital preparedness, vaccination, and robust genomic surveillance.
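
A simplified version of the hazard-ratio estimation described in the Methods is sketched below, treating variant as a categorical exposure with the ancestral lineage as the reference level and adjusting for age, sex, and vaccination status. lifelines does not fit mixed-effects Cox models, so the random effects used in the study are omitted, and only a subset of lineages is shown; all column names are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("wdrs_cases.csv")  # days_to_hosp_or_censor, hospitalized,
                                    # variant, age, sex, vaccinated
lineages = ["Ancestral", "Alpha", "Beta", "Gamma", "Delta"]         # subset for brevity
df = df[df["variant"].isin(lineages)].copy()
df["variant"] = pd.Categorical(df["variant"], categories=lineages)  # first level = reference

cph = CoxPHFitter().fit(df, duration_col="days_to_hosp_or_censor", event_col="hospitalized",
                        formula="variant + age + sex + vaccinated")
print(cph.summary.filter(like="variant", axis=0)[
    ["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```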


2019
Vol 53 (10)
pp. 1020-1025
Author(s): Margaret R. Jorgenson, Jillian L. Descourouez, Dou-Yan Yang, Glen E. Leverson, Christopher M. Saddler, ...

Background: Modifiable risk factors associated with Clostridioides difficile infection (CDI) in renal transplant (RTX) recipients have not been clearly established, and peri-transplant risk has not been described. Objective: To evaluate the epidemiology, risk factors, and outcomes of CDI occurring in the first 90 days after RTX (CDI-90). Methods: Observational cohort study/survival analysis of adult RTX recipients from 1/1/2012 to 12/31/2015. The primary outcome was CDI-90 incidence and risk factors. The secondary outcome was post-90-day transplant outcomes. Results: 982 patients met inclusion criteria; 46 with CDI-90 and 936 without (comparator). CDI incidence in the total population was 4.7% at 90 days, 6.3% at 1 year, and 6.4% at 3 years. Incidence of CDI-90 was 5%; time to diagnosis was 19.4 ± 25 days (median 7). Risk factors for CDI-90 were alemtuzumab induction (hazard ratio [HR] 1.5, 95% CI 1.1-2.0, p = 0.005) and age at transplant (HR 1.007/year, 95% CI 1.002-1.012, p = 0.007). However, risk factors for CDI at any time were different: donation-after-circulatory-death (DCD) donor (HR 2.5, 95% CI 1.3-4.9, p = 0.008) and female gender (HR 1.6, 95% CI 1.0-2.7, p = 0.049). On Kaplan-Meier analysis, CDI-90 appeared to have an impact on patient/graft survival; however, when analyzed in a multivariable stepwise Cox proportional hazards model, only age was significantly associated with survival (p = 0.002). Conclusion and Relevance: The incidence of CDI-90 is low, with most cases occurring in the first postoperative month. Risk factors vary temporally with time from transplant: in the early postoperative period, induction agent and age at transplant are significant, but not thereafter. Associations between CDI and negative graft outcomes appear to be largely driven by age. Future studies validating these risk factors, as well as targeted prophylaxis strategies and their effect on long-term graft outcomes and the host microbiome, are needed.
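
The landmark incidence figures quoted above (90 days, 1 year, 3 years) correspond to Kaplan-Meier cumulative incidence estimates, which the sketch below illustrates with hypothetical column names. Note that a simple Kaplan-Meier estimate ignores the competing risks of death and graft loss, so it is only an approximation.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("rtx_cdi.csv")     # days_to_cdi_or_censor, cdi_event
kmf = KaplanMeierFitter().fit(df["days_to_cdi_or_censor"], df["cdi_event"])
for label, t in [("90 days", 90), ("1 year", 365), ("3 years", 1095)]:
    print(f"CDI incidence at {label}: {1 - kmf.predict(t):.1%}")
```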


2010
Vol 18 (2)
pp. 189-205
Author(s): Luke Keele

The Cox proportional hazards model is widely used to model durations in the social sciences. Although this model allows analysts to forgo choices about the form of the hazard, it demands careful attention to the proportional hazards assumption. To this end, a standard diagnostic method has been developed to test this assumption. I argue that the standard test for nonproportional hazards has been misunderstood in current practice. This test detects a variety of specification errors, and these specification errors must be corrected before one can correctly diagnose nonproportionality. In particular, unmodeled nonlinearity can appear as a violation of the proportional hazard assumption for the Cox model. Using both simulation and empirical examples, I demonstrate how an analyst might be led astray by incorrectly applying the nonproportionality test.
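
The standard diagnostic the article refers to is the scaled Schoenfeld residual test. A minimal Python sketch is below, with hypothetical data; as the article argues, a flagged "violation" should prompt a check of covariate functional form (here, an assumed log transform of x1) before concluding that hazards are truly nonproportional.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

df = pd.read_csv("durations.csv")     # duration, event, x1, x2
cph = CoxPHFitter().fit(df, duration_col="duration", event_col="event")

# Schoenfeld-residual test of the proportional hazards assumption
proportional_hazard_test(cph, df, time_transform="rank").print_summary()

# Re-check after addressing possible nonlinearity in x1
df["log_x1"] = np.log(df["x1"])
cph2 = CoxPHFitter().fit(df, duration_col="duration", event_col="event",
                         formula="log_x1 + x2")
proportional_hazard_test(cph2, df, time_transform="rank").print_summary()
```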


2020
Vol 4 (Supplement_1)
Author(s): Sahityasri Thapi, Kiwoon Baeg, Michelle K Kim, Emily Jane Gallagher

Abstract Background: The incidence and prevalence of gastroenteropancreatic neuroendocrine tumors (GEP-NET) are increasing globally and have been associated with diabetes mellitus (DM). In this study we aimed to compare the tumor characteristics, disease-specific survival (DSS), and overall survival (OS) of GEP-NET patients (pts) with and without DM. Methods: Using the Surveillance, Epidemiology, and End Results (SEER) registry linked to Medicare claims, we identified pts diagnosed with GEP-NET between January 1995 and December 2010, aged ≥65 years at the time of GEP-NET diagnosis. We included patients who were in exclusive Medicare coverage without healthcare management organizations and had Medicare Parts A and B coverage for ≥1 year after GEP-NET diagnosis or until death. Within the pts with a GEP-NET diagnosis, we identified those without a diagnosis of DM prior to the GEP-NET diagnosis. We compared baseline sociodemographics, comorbidities, and GEP-NET location, stage, grade, and treatment between pts with and without DM using χ2 analysis. Kaplan-Meier (KM) curves were used to compare OS and DSS up to 10 years between the DM and non-DM groups. We used Cox proportional hazards analysis to compare DSS between the groups, adjusting for confounding variables. Results: We identified a cohort of 1,969 well-characterized GEP-NET patients with accurate tumor stage, grade, comorbidity, and treatment data. 478 (25.7%) had DM and 1,383 (74.3%) did not have DM. There were no statistically significant differences in gender or age at the time of GEP-NET diagnosis between the DM (mean age 74.7 ± SD 6.6 yrs) and non-DM (74.9 ± 7.4 yrs) groups. Significant differences in race were found between the DM (80.6% white, 13.6% black, 1.3% Hispanic) and non-DM (86.8% white, 8.2% black, 1.8% Hispanic) groups (p=0.002). Patients with DM had more gastric (14.7%), duodenal (10.9%), and pancreatic (21.0%) NETs, and fewer jejunal/ileal (12.8%) NETs, compared with the non-DM group (9.7%, 6.4%, 16.9%, and 18.2%, respectively; p<0.0001). Patients with DM had earlier-stage disease than those without DM (p=0.0012), but no difference in tumor grade or treatment was found. KM curves revealed no differences in OS and DSS between the GEP-NET patients with and without DM across all stages. A multivariate-adjusted Cox proportional hazards model found no significant difference in DSS between those with and without DM (HR=0.97, 95% CI: 0.76–1.24). Compared with pts with pancreatic NETs, pts with colon NETs (HR=1.39, 95% CI: 1.04–1.86) had worse survival, while those with jejunal/ileal NETs (HR=0.59, 95% CI: 0.42–0.83) had better survival. Discussion: This is the first study to investigate the effect of DM on survival of pts with GEP-NETs. We found a high prevalence of pre-existing DM in pts with GEP-NETs, but no difference in OS or DSS between pts with and without DM. Interestingly, pts with DM had more foregut GEP-NETs, which may suggest mechanistic links between DM and GEP-NETs at these sites.
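
The Kaplan-Meier comparison of overall survival between the DM and non-DM groups described above could be drawn roughly as follows; the file and column names are hypothetical.

```python
import pandas as pd
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter

df = pd.read_csv("gepnet_seer.csv")   # months_followup, death, dm (0/1)
ax = plt.subplot(111)
for dm_flag, grp in df.groupby("dm"):
    KaplanMeierFitter().fit(grp["months_followup"], grp["death"],
                            label="DM" if dm_flag else "non-DM").plot_survival_function(ax=ax)
ax.set_xlabel("Months from GEP-NET diagnosis")
ax.set_ylabel("Overall survival")
plt.savefig("os_by_dm.png")
```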


2019
Vol 40 (Supplement_1)
Author(s): M Fukunaga, K Hirose, A Isotani, T Morinaga, K Ando

Abstract Background The relationship between atrial fibrillation (AF) and heart failure (HF) is often compared with the proverbial question of which came first, the chicken or the egg. Some patients who show AF at HF admission are restored to sinus rhythm (SR) by discharge. Whether restoration of SR during hospitalization has a preventive effect on rehospitalization has not been well elucidated. Purpose To investigate the impact of restoration of SR during hospitalization on the readmission rate of HF patients showing AF. Methods We enrolled 640 consecutive HF patients hospitalized from January 2015 to December 2015. Patient data were retrospectively collected from medical records. Patients showing AF on admission that had not previously been recognized were defined as having "incident AF"; patients with AF diagnosed before admission were defined as having "prevalent AF". The primary endpoint was a composite of death from cardiovascular disease or hospitalization for worsening heart failure. Secondary endpoints were death from cardiovascular disease, unplanned hospitalization related to heart failure, and any hospitalization. Results During a mean follow-up of 19 months, 139 patients (22%) were categorized as incident AF and 145 patients (23%) as prevalent AF. Among the 239 patients showing AF on admission, 44 were discharged in SR (39 in the incident AF group and 5 in the prevalent AF group). Among incident AF patients, the primary composite endpoint occurred in significantly fewer of those discharged in SR (19% vs. 42% at 1-year and 23% vs. 53% at 2-year follow-up, p=0.005). In a Cox proportional hazards model of risk factors for readmission due to HF, AF only during hospitalization (hazard ratio [HR]=0.37, p<0.01) and prevalent AF (HR=1.67, p=0.04) were significantly associated with readmission. There was no significant difference depending on LVEF. Conclusion Newly diagnosed AF with restoration to SR during hospitalization was a good marker of favorable future prognosis.
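
The composite primary endpoint (cardiovascular death or HF rehospitalization, whichever comes first) and the Cox comparison across AF categories could be set up as in the sketch below; all column names and AF grouping labels are hypothetical.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("hf_cohort.csv")   # days_to_cv_death, days_to_hf_rehosp, days_followup,
                                    # cv_death, hf_rehosp, af_group

# Composite endpoint: time to first of CV death or HF rehospitalization
df["event"] = ((df["cv_death"] == 1) | (df["hf_rehosp"] == 1)).astype(int)
df["time"] = np.where(df["event"] == 1,
                      df[["days_to_cv_death", "days_to_hf_rehosp"]].min(axis=1),
                      df["days_followup"])

# af_group: e.g. 'AF_only_during_hospitalization', 'incident_AF', 'prevalent_AF'
cph = CoxPHFitter().fit(df[["time", "event", "af_group"]],
                        duration_col="time", event_col="event", formula="af_group")
cph.print_summary()
```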


2012
Vol 30 (15_suppl)
pp. 1611-1611
Author(s): Shweta Gupta, Prantesh Jain, Emilio Araujo Mino, Audrey L. French, Fred R Rosen, ...

Background: County Hospital (CCH), with its HIV clinic, the CORE Center (CC), is the largest provider for HIV patients (pts) in Chicago, treating over 5,500 HIV pts yearly. There is a paucity of data on the characteristics of HIV-associated cancers (ca) in the inner city. The CHAMP cohort is a retrospective study of all HIV-associated cancers at CC and CCH over the past 14 years (yrs). We analyzed all of the NADC from this cohort. Methods: All HIV pts with NADC were identified from the CHAMP cohort and retrospectively reviewed for HIV and cancer characteristics, overall survival (OS), and pt demographics. Statistics: Survival data were analyzed using Kaplan-Meier analysis and a Cox proportional hazards model. Results: Of 438 pts identified, 157 had NADC, representing 21 different ca. The average (ave) age was 48 yrs (range 44-57), with prostate ca having the highest age at presentation. Over the past 10 yrs, the number of NADC has risen from 10 to over 20 each yr. Unlike historical controls (HC), where lung ca is most common, anal ca (21%) was most frequent, followed by lung ca (17%). Prostate, head and neck (HNSCC), liver, and colorectal ca were seen in 9%, 9%, 8%, and 7%, respectively. 65% of pts were African American (AA) and 18% Caucasian. 78% of all NADC were in men. 45% of anal ca presented with stage IIIa/b disease, 48% were moderately to poorly differentiated, and median OS was 34 months (mo). CD4 count did not alter OS, but stage predicted better outcomes. 86% of lung ca presented as stage III/IV disease, with an ave CD4 count of 204. Histologically, 36% were SCC, 28% adenosquamous, and 20% adenocarcinoma. OS was 5.5 mo and did not differ by histology, CD4, or age. 68% of HNSCC presented with stage IVa/b, but none with IVc. Ave age was 48 yrs, with an OS of 18 mo. 50% were oropharyngeal, compared to 22% in HC. Conclusions: Based on data by Shields et al., CCH treats just over 1% of the country's NADC population. We demonstrate a rising incidence of NADC over time, dominated by a younger, AA, male population. Each ca presented at an advanced stage in 45-86% of cases, with poorly differentiated tumors in 15-30%. The OS of each cancer is consistent with HC, with the exception of HNSCC. As HIV pts age and become prone to cancers of the elderly, education and screening of inner-city HIV pts will help improve cancer outcomes.
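
The per-cancer survival figures quoted above (e.g., median OS of 34 mo for anal ca and 5.5 mo for lung ca) are the kind of estimate a Kaplan-Meier fit per cancer type produces; a hypothetical sketch follows.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("champ_nadc.csv")   # cancer_type, os_months, died
for cancer, grp in df.groupby("cancer_type"):
    kmf = KaplanMeierFitter().fit(grp["os_months"], grp["died"])
    print(f"{cancer}: median OS = {kmf.median_survival_time_:.1f} mo (n = {len(grp)})")
```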

