severity class: Recently Published Documents

TOTAL DOCUMENTS: 37 (five years: 25)
H-INDEX: 6 (five years: 2)

Author(s): Kamil Bugała, Paweł Rubiś, Mateusz K. Hołda, Małgorzata Konieczyńska, Piotr Bijak, ...

Abstract: Acute decompensated heart failure (ADHF) treatment leads to significant hemodynamic changes. The aim of our study was to quantitatively analyze the dynamics of mitral regurgitation (MR) severity (evaluated by transthoracic echocardiography) that occur during the treatment of ADHF and to correlate these changes with the clinical condition of patients as well as heart failure biochemical markers. The study included 27 consecutive adult patients (40.7% females, mean age 71.19 ± 11.2 years) who required hospitalization due to signs of acute HF. Echocardiographic assessment was performed upon admission and at discharge, together with clinical and laboratory evaluation. Significant reductions in dyspnea intensity [0–100 scale] (81.48 ± 9.07 vs. 45.00 ± 11.04 pts, p < 0.001), body weight (84.98 ± 18.52 vs. 79.77 ± 17.49 kg, p < 0.001), and NT-proBNP level (7520.56 ± 5288.62 vs. 4949.88 ± 3687.86 pg/ml, p = 0.001) were found. The severity of MR parameters decreased significantly (MR volume 44.92 ± 22.83 vs. 30.88 ± 18.77 ml, p < 0.001; EROA 0.37 ± 0.17 vs. 0.25 ± 0.16 cm2, p < 0.001; VC 6.21 ± 1.48 vs. 5.26 ± 1.61 mm, p < 0.001). Left atrial area (35.86 ± 9.11 vs. 32.47 ± 9.37, p < 0.001) and mitral annular diameter (42.33 ± 6.63 vs. 39.72 ± 5.05, p < 0.001) also underwent statistically significant reductions. An increase in LVEF was observed (34.73 ± 13.88 vs. 40.24 ± 13.19%, p < 0.001). In 40.7% of patients, a change in MR severity class (transition from a higher class to a lower one) was observed: 6/8 (75%) patients transitioned from severe to moderate and 6/18 (33.3%) from moderate to mild. Treatment of ADHF leads to a significant reduction in MR severity, together with significant reductions in left atrial and mitral annular dimensions. Quantitative measurement of MR dynamics offers valuable assistance for ADHF management.
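The paired pre/post comparisons quoted above can be illustrated with a short sketch. This is not the authors' code, and the abstract does not state which paired test was used; the EROA values below are invented purely to show the shape of the analysis.

```python
# Hypothetical paired comparison of an echo parameter at admission vs. discharge.
import numpy as np
from scipy import stats

# Invented EROA values (cm^2) for the same patients at the two time points
eroa_admission = np.array([0.41, 0.28, 0.55, 0.33, 0.47, 0.26, 0.39, 0.31])
eroa_discharge = np.array([0.27, 0.20, 0.38, 0.25, 0.33, 0.18, 0.29, 0.22])

# Paired t-test: each patient serves as their own control
t_stat, p_value = stats.ttest_rel(eroa_admission, eroa_discharge)
print(f"admission {eroa_admission.mean():.2f} vs. discharge {eroa_discharge.mean():.2f} cm^2, "
      f"t = {t_stat:.2f}, p = {p_value:.4f}")
```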


2021, Vol 30 (4), pp. 441-448
Author(s): M Aromaa, MM Rajamäki, L Lilja-Maula

To promote successful breeding against brachycephalic obstructive airway syndrome (BOAS), it is important to assess how BOAS signs progress during young adulthood and how evaluation age and ageing affect the results of chosen breeding selection tools. The aims of this study were to assess how veterinary-assessed and owner-reported BOAS signs and exercise test results change as dogs age. Eight English Bulldogs, 25 French Bulldogs, and 31 Pugs that had undergone previous evaluation were re-examined 2–3 years later. An owner questionnaire regarding BOAS signs, a veterinary assessment of BOAS severity, and exercise (ie walk) tests were re-performed. In Pugs, both the 6-min walking distance and the 1,000-m time worsened, and the initial evaluation age had a significant effect on the 1,000-m time. No significant changes were seen in the results of the French Bulldogs, but a negative effect on the 1,000-m time was seen with weight gain. Exercise test statistics were not performed for English Bulldogs due to the low sample size. The veterinary-assessed BOAS severity class remained the same in the majority of dogs, and the BOAS grade worsened mostly in those dogs that were initially evaluated at less than two years of age. Most owners reported no major changes in BOAS severity. BOAS grading and walk tests were easy to repeat, and results remained relatively constant in dogs initially evaluated at over two years of age, supporting the use of these breeding selection tools. However, further, large-scale offspring studies are still needed.


Author(s): Antoine Barro, Joseph Nanama, Zinmanké Coulibaly, Zakaria Dieni, Mirela Cordea

Vegetable cowpea is eaten mainly fresh, in the form of young, immature pods that are tender and sweet like the common bean. However, like grain cowpea, vegetable cowpea suffers yield losses due to the cowpea aphid-borne mosaic virus (CABMV). This study aims to improve yields through the development of vegetable cowpea varieties resistant to CABMV. The study evaluated ten varieties of vegetable cowpea in a greenhouse at the Kamboinsé research station, using a randomized complete block design with three replications, with all plants inoculated with CABMV. Data were collected on resistance parameters. Mechanical inoculation made it possible to observe various symptoms of CABMV, highlighting the existence of variability among the varieties tested. Strong correlations were observed between several variables. The vegetable cowpea varieties IT85F-2089-5, UG-CP-8, IT85F-2805-5 and Telma were identified as resistant, as they belonged to the low severity classes and had a low value of the area under the disease progress curve (AUDPC). On the other hand, the varieties RW-CP-5, UG-CP-6, IT83S-911 and niébé baguette grimpant, which fell into a high severity class, were judged to be susceptible. These resistant varieties will thus be able to contribute to the improvement of production and the protection of cowpea resources in Burkina Faso.
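Since the resistance calls here rest on the area under the disease progress curve (AUDPC), a minimal sketch of the standard trapezoidal AUDPC calculation may help; the severity scores and assessment days below are hypothetical, not data from this study.

```python
# Trapezoidal AUDPC: sums the mean severity of consecutive assessments,
# weighted by the time interval between them.
def audpc(severities, days):
    return sum((severities[i] + severities[i + 1]) / 2 * (days[i + 1] - days[i])
               for i in range(len(severities) - 1))

days = [7, 14, 21, 28]     # hypothetical assessment days after inoculation
resistant = [1, 1, 2, 2]   # hypothetical severity scores (1-5 scale)
susceptible = [2, 3, 4, 5]
print(audpc(resistant, days), audpc(susceptible, days))  # 31.5 vs. 73.5: lower = more resistant
```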


2021
Author(s): Kamil Bugała, Paweł Rubiś, Mateusz K Hołda, Małgorzata Konieczyńska, Piotr Bijak, ...

Abstract Purpose: Acute decompensated heart failure (ADHF) treatment leads to significant hemodynamic changes. The aim of our study was to quantitatively analyze the dynamics of mitral regurgitation (MR) severity (evaluated by transthoracic echocardiography) that occur during the treatment of ADHF and to correlate these changes with the clinical condition of patients as well as heart failure biochemical markers. Methods: The study included 27 consecutive adult patients (40.7% females, mean age 71.19±11.2 years) who required hospitalization due to signs of acute HF. Echocardiographic assessment was performed upon admission and at discharge, together with clinical and laboratory evaluation. Results: Significant reductions in dyspnea intensity [0-100 scale] (81.48±9.07 vs. 45.00±11.04 pts, p<0.001), body weight (84.98±18.52 vs. 79.77±17.49 kg, p<0.001), and NT-proBNP level (7520.56±5288.62 vs. 4949.88±3687.86 pg/ml, p=0.001) were found. The severity of MR parameters decreased significantly (MR volume 44.92±22.83 vs. 30.88±18.77 ml, p<0.001; EROA 0.37±0.17 vs. 0.25±0.16 cm2, p<0.001; VC 6.21±1.48 vs. 5.26±1.61 mm, p<0.001). Left atrial area (35.86±9.11 vs. 32.47±9.37, p<0.001) and mitral annular diameter (42.33±6.63 vs. 39.72±5.05, p<0.001) also underwent statistically significant reductions. An increase in LVEF was observed (34.73±13.88 vs. 40.24±13.19%, p<0.001). In 40.7% of patients, a change in MR severity class (transition from a higher class to a lower one) was observed: 6/8 (75%) patients transitioned from severe to moderate and 6/18 (33.3%) from moderate to mild. Conclusions: Treatment of ADHF leads to a significant reduction in MR severity, together with significant reductions in left atrial and mitral annular dimensions. Quantitative measurement of MR dynamics offers valuable assistance for ADHF management.


2021, pp. 20210279
Author(s): Julie Suhr Villefrance, Lise-Lotte Kirkevang, Ann Wenzel, Michael Væth, Louise Hauge Matzen

Objectives: To compare the severity of external cervical resorption (ECR) observed in periapical (PA) images and cone beam CT (CBCT) using the Heithersay classification system and pulp involvement, and to assess inter- and intraobserver reproducibility for three observers. Methods: CBCT examination was performed on 245 teeth (in 190 patients, mean age 40 years, range 12–82) with ECR diagnosed in PA images. Three observers scored the severity of ECR using the Heithersay classification system (severity class 1–4) and pulp involvement (yes/no) in both PA images and CBCT. Percentage concordance and κ-statistics described observer variation in PA images and CBCT for both inter- and intraobserver reproducibility. Results: For all three observers, the ECR score was the same in the two modalities in more than half of cases (average 59%; obs1: 54%, obs2: 63%, obs3: 61%). However, in 38% (obs1: 44%, obs2: 33%, obs3: 36%) of cases, the observers scored more severe ECR in CBCT than in PA images (p < 0.001). The ECR score changed to a less severe score in CBCT in only 3% (obs1: 1%, obs2: 4%, obs3: 4%). For pulp involvement, 14% (obs1: 7%, obs2: 20%, obs3: 15%) of cases changed from “no” in PA images to “yes” in CBCT. In general, κ values were higher for CBCT than for PA images for both the Heithersay classification score and pulp involvement. Conclusions: ECR was generally scored as more severe in CBCT than in PA images using the Heithersay classification, and more cases had pulp involvement in CBCT.
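The κ-statistics used here for observer agreement follow Cohen's formula, κ = (p_o − p_e)/(1 − p_e). The sketch below computes an unweighted kappa on invented Heithersay scores; the study may well have used a weighted variant, which the abstract does not specify.

```python
# Unweighted Cohen's kappa: chance-corrected agreement between two ratings.
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    n = len(ratings_a)
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

pa_scores   = [1, 2, 2, 3, 4, 1, 3, 2]  # invented Heithersay classes, PA reading
cbct_scores = [2, 2, 3, 3, 4, 1, 4, 2]  # invented classes, CBCT reading
print(f"kappa = {cohen_kappa(pa_scores, cbct_scores):.2f}")
```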


2021
Author(s): Sunil Nepal, W Keith Moser, Zhaofei Fan

Abstract Quantifying the invasion severity of nonnative invasive plant species is vital for the development of appropriate mitigation and control measures. We examined more than 23,250 Forest Inventory and Analysis (FIA) plots from the southern coastal states of the United States to develop an alternative method to classify and map the invasion severity of Chinese tallow (Triadica sebifera). Remeasured FIA plot-level data were used to examine the spatiotemporal changes in the presence probability and cover percentage of tallow. Four invasion severity classes were identified using the product of presence probability and cover percentage, as sketched below. Chinese tallow invasion severity increased over time, with 90 and 123 counties being classified into the highest severity class at the first and second measurements, respectively. Further, the invasibility of major forest-type groups by severity class was examined using the product of the county-level mean presence probability and mean cover percentage of Chinese tallow as a proxy for invasibility. Longleaf/slash pine (Pinus palustris/P. elliottii) forests were highly resilient to Chinese tallow invasion. In contrast, the elm/ash/cottonwood (Ulmus spp./Fraxinus spp./Populus deltoides) and oak/gum/cypress (Quercus spp./Nyssa spp./Taxodium spp.) forest-type groups were vulnerable to invasion. Study Implications: In southern United States forestland, differences in invasion severity and in the vulnerability of forest types to Chinese tallow invasion have been observed across time and space. Our findings provide insight into spatial variation in the severity of Chinese tallow invasion and the relative susceptibility of different forest-type groups in the region, to inform monitoring and management of this invasive species. High invasion severity occurs in the lower Gulf of Mexico coastal region of Texas, Louisiana, and Mississippi and the Atlantic coastal region of South Carolina and Georgia, with the longleaf/slash pine and oak/gum/cypress forest-type groups being most susceptible to Chinese tallow invasion. Based on these results, we recommend that management efforts be tailored to the different invasion severity classes. Forests in the high-severity class need a management program coordinated across different agencies and landowners to curb the growth of tallow populations and prevent the risk of stand replacement. The monitoring of Chinese tallow spread should focus on the longleaf/slash pine, loblolly/shortleaf pine, and oak/gum/cypress groups, because the spread rate was higher in these forest-type groups. A better use of scarce resources could be to treat lands in the moderate- and low-severity classes to reduce propagule pressure and post-invasion spread. For counties with a minimal-severity condition, early detection and eradication measures should be taken in a timely manner to prevent tallow from invading noninvaded neighboring counties. Managers may be able to treat a larger area of these lands for a given investment compared with lands already severely invaded.
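The severity index described above is the product of presence probability and cover percentage, binned into four ordered classes. A hedged sketch of that idea follows; the class thresholds are illustrative assumptions, not the ones used in the study.

```python
# Toy invasion-severity classifier: presence probability x cover percentage,
# cut into four ordered classes by assumed thresholds.
def severity_class(presence_prob, cover_pct, breaks=(0.5, 2.0, 5.0)):
    index = presence_prob * cover_pct
    for cls, upper in enumerate(breaks, start=1):
        if index <= upper:
            return cls
    return len(breaks) + 1  # highest (most severe) class

print(severity_class(0.10, 1.0))   # 1: minimal severity
print(severity_class(0.60, 15.0))  # 4: high severity
```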


2021
Author(s): Juan Andres Paredes, Juan Pablo Edwards Molina, Luis Ignacio Cazón, Florencia Asinari, Joaquín Humberto Monguillot, ...

Peanut smut, caused by the soil-borne pathogen Thecaphora frezii, has increased in incidence in the main Argentine peanut-growing region. Smut affects pods by transforming the kernel into a mass of teliospores, which survive long-term in the soil. This study is the first wide-scale survey to determine the occurrence and distribution of peanut smut in the main growing area of Argentina. The survey was conducted in Córdoba province in commercial peanut fields (n=217) randomly selected from 2015 to 2020. Disease intensity was explored by analyzing the distribution of disease severity classes in peanut fields and assessing the relationships between disease parameters. No field with 0% incidence was recorded, and mean incidence increased from 1.66% in 2015 to 11.47% in 2020. Smut symptoms varied from small sori to the complete transformation of the kernel (severity classes). Severely damaged pods (SDP) were those in severity classes 3 and 4, in which at least one or both kernels were transformed into a mass of spores, producing a high volume of spores that spread among fields and increase the inoculum in the soil. More than 80% of the infected pods in the samples corresponded to SDP. A strong relationship was observed between the disease severity index and incidence (R = 0.99), and between incidence and severity classes 3 and 4 (R = 0.97 and R = 0.93), with linear regression providing a model that explained the data. The results obtained contribute to knowledge of the distribution of T. frezii in the peanut-growing area of Argentina. Severity can be estimated from incidence, and incidence assessment is faster, more accurate, and reproducible. This is a good technical criterion for monitoring the disease annually, and it can also be used to screen materials in breeding programs or to evaluate treatments implemented as management strategies for peanut smut.
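The disease severity index (DSI) mentioned above is not defined in the abstract; a common formulation, given here as an assumption, weights pod counts by their severity class: DSI (%) = Σ(class_i × n_i) / (N × max class) × 100. A sketch with invented counts:

```python
# Assumed DSI formula; the study's exact definition is not given in the abstract.
def disease_severity_index(pods_by_class, max_class=4):
    total = sum(pods_by_class.values())
    weighted = sum(cls * n for cls, n in pods_by_class.items())
    return 100 * weighted / (total * max_class)

sample = {0: 180, 1: 6, 2: 4, 3: 7, 4: 3}  # invented pods per severity class (0 = healthy)
incidence = 100 * (1 - sample[0] / sum(sample.values()))
print(f"incidence = {incidence:.1f}%, DSI = {disease_severity_index(sample):.1f}%")
```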


2021, Vol 11 (1)
Author(s): Emanuela Sozio, Carlo Tascini, Martina Fabris, Federica D’Aurizio, Chiara De Carlo, ...

Abstract: Mid-regional pro-ADM (MR-proADM) is a promising novel biomarker for the evaluation of deteriorating patients and an emerging prognostic factor in patients with sepsis, septic shock and organ failure. It can be induced by bacteria, fungi or viruses. We hypothesized that the assessment of MR-proADM, with or without other inflammatory cytokines, as part of a clinical assessment of COVID-19 patients at hospital admission, may assist in identifying those likely to develop severe disease. A pragmatic retrospective analysis was performed on a complete data set from 111 patients admitted to Udine University Hospital, in northern Italy, from 25th March to 15th May 2020, affected by SARS-CoV-2 pneumonia. Clinical scoring systems (SOFA score, WHO disease severity class, SIMEU clinical phenotype), cytokines (IL-6, IL-1β, IL-8, TNF-α), and MR-proADM were measured. Demographic, clinical and outcome data were collected for analysis. At multivariate analysis, high MR-proADM levels were significantly associated with a negative outcome (death or orotracheal intubation, IOT), with an odds ratio of 4.284 [1.893–11.413], together with increased neutrophil count (OR = 1.029 [1.011–1.049]) and WHO disease severity class (OR = 7.632 [5.871–19.496]). AUROC analysis showed a good discriminative performance of MR-proADM (AUROC: 0.849 [95% CI 0.771–0.730]; p < 0.0001). The optimal value of MR-proADM to discriminate the combined event of death or IOT was 0.895 nmol/l, with a sensitivity of 0.857 [95% CI 0.728–0.987] and a specificity of 0.687 [95% CI 0.587–0.787]. This study shows an association between MR-proADM levels and the severity of COVID-19. The assessment of MR-proADM combined with clinical scoring systems could be of great value in triaging, evaluating possible escalation of therapies, and admission avoidance or inclusion into trials. Larger prospective and controlled studies are needed to confirm these findings.
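A cutoff such as 0.895 nmol/l is typically read off a ROC curve, for example at the maximum of Youden's J (sensitivity + specificity − 1); the abstract does not state how the threshold was chosen, so the method and the synthetic data below are assumptions.

```python
# Synthetic illustration of deriving an optimal biomarker cutoff from a ROC curve.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
outcome = np.r_[np.zeros(70, dtype=int), np.ones(30, dtype=int)]       # 1 = death or IOT
mr_proadm = np.r_[rng.normal(0.7, 0.2, 70), rng.normal(1.1, 0.3, 30)]  # synthetic nmol/l

fpr, tpr, thresholds = roc_curve(outcome, mr_proadm)
best = np.argmax(tpr - fpr)  # Youden's J = sensitivity + specificity - 1
print(f"AUROC = {roc_auc_score(outcome, mr_proadm):.3f}")
print(f"cutoff = {thresholds[best]:.3f} nmol/l, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```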


Critical Care, 2021, Vol 25 (1)
Author(s): Daniele De Luca, Paola Cogo, Martin C. Kneyber, Paolo Biban, Malcolm Grace Semple, ...

Abstract Pediatric (PARDS) and neonatal (NARDS) acute respiratory distress syndrome have different age-specific characteristics and definitions. Trials of surfactant for ARDS in children and neonates were performed well before the PARDS and NARDS definitions and yielded conflicting results, mainly due to heterogeneity in study design reflecting a historic lack of pathobiology knowledge. We reviewed the available clinical and preclinical data to create an expert consensus aiming to inform future research steps and advance knowledge in this area. Eight trials investigated the use of surfactant for ARDS in children and ten in neonates. Oxygenation improved in 7/8 trials in children and 7/10 in neonates, and mortality improved in 3/8 trials in children and 1/10 in neonates. Trials were heterogeneous in patients’ characteristics, surfactant type and administration strategy, and key pathobiological concepts were missed in study design. Consensus with strong agreement was reached on four statements: There are sufficient preclinical and clinical data to support targeted research on surfactant therapies for PARDS and NARDS. Studies should be performed according to the currently available definitions and considering recent pathobiology knowledge. PARDS and NARDS should be considered as syndromes and should be studied preclinically according to key characteristics, such as direct or indirect (primary or secondary) nature, clinical severity, infectious or non-infectious origin or patients’ age. Explanatory designs should be preferred over pragmatic designs for future trials on PARDS and NARDS. Different clinical outcomes need to be chosen for PARDS and NARDS, according to the trial phase and design, trigger type, severity class and/or surfactant treatment policy. We advocate for further well-designed preclinical and clinical studies to investigate the use of surfactant for PARDS and NARDS following these principles.


2021, Vol 10 (4), pp. 876
Author(s): Hiroko Hashimoto, Shimpei Hashimoto, Yoshihiro Shimazaki

Background: There is limited information regarding the association between tooth loss and the medications used for the treatment of rheumatoid arthritis (RA). Here, we examined the association between tooth loss, disease severity, and drug treatment regimens in RA patients. Methods: This study recruited 94 Japanese patients with RA. The severity of RA was assessed using the Steinbrocker classification of class and stage. Data on RA medications were obtained from medical records. We examined the associations between tooth loss, RA severity, and drug treatment regimens using multinomial logistic regression analyses. Results: Patients with 1–19 teeth had significantly higher odds ratios (ORs) of taking methotrexate (MTX) (OR, 8.74; 95% confidence interval (CI), 1.11–68.8) and biologic disease-modifying antirheumatic drugs (bDMARDs) (OR, 21.0; 95% CI, 1.3–339.1) compared to those with 27–28 teeth when adjusted for RA severity (class). Furthermore, patients with 1–19 teeth had significantly higher ORs of taking MTX (OR, 9.71; 95% CI, 1.22–77.1) and bDMARDs (OR, 50.2; 95% CI, 2.55–990.6) compared to those with 27–28 teeth when adjusted for RA severity (stage). Conclusion: RA patients with fewer teeth were more likely to take stronger RA therapies, independent of RA severity and other factors.
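For readers unfamiliar with the odds ratios and confidence intervals quoted throughout this abstract, the sketch below works one through from a 2×2 table; the counts are hypothetical, not the study's data.

```python
# Odds ratio with a Wald 95% CI from a hypothetical 2x2 table:
# rows = tooth-count group, columns = taking MTX (yes/no).
import math

a, b = 18, 6   # 1-19 teeth: on MTX / not on MTX (invented counts)
c, d = 9, 24   # 27-28 teeth: on MTX / not on MTX (invented counts)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo, hi = (math.exp(math.log(odds_ratio) + z * se_log_or) for z in (-1.96, 1.96))
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```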

