Mid-term (2009-2019) demographic dynamics of young beech forest in Albongbunji Basin, Ulleungdo, South Korea

2020, Vol 44 (1)
Author(s): Yong-Chan Cho, Hyung Seok Sim, Songhie Jung, Han-Gyeoul Kim, Jun-Soo Kim, ...

Abstract Background The stem exclusion stage is a stage of forest development that is important for understanding the subsequent understory reinitiation and maturation stages, during which horizontal heterogeneity is formed. Over 11 years (2009–2019), we observed a deciduous broad-leaved forest in its stem exclusion stage in the Albongbunji Basin in Ulleungdo, South Korea, where Fagus engleriana (Engler's beech) is the dominant species, and analyzed changes in its structure (density and size distributions), function (biomass and species richness), and demographics. Results The mean stem density followed a bell-shaped curve, initially increasing, peaking, and subsequently decreasing over time, while the mean biomass followed a sigmoidal pattern, indicating that the rate of biomass accumulation slowed over time. Changes in the density and biomass of Fagus engleriana paralleled those at the community level, indicative of the strong influence of this species on the changing patterns of forest structure and function. Around 2015, a shift between recruitment and mortality rates was observed. Deterministic processes were the predominant cause of tree mortality in our study; however, soil deposition that began in 2017 in some of the quadrats increased the contribution of stochastic processes to tree mortality (15% in 2019). The development of horizontal heterogeneity was observed in forest gaps. Conclusions Our observations showed a dramatic shift between recruitment and mortality rates in the stem exclusion stage, and that disturbance increases the uncertainty of forest development. The minor changes in species composition are likely linked to the regional species pool and the limited role of species' life-history strategies such as shade tolerance and habitat affinity.
Our midterm records of ecological succession capture detailed demographic dynamics and contribute to an improved ecological understanding of the stem exclusion stage.
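The two community-level trajectories described above (sigmoidal biomass, bell-shaped stem density) can be illustrated with simple parametric forms. This is an illustration only; the study fits no such model, and all parameter values below are invented.

```python
# Sketch: a sigmoidal (logistic) biomass trajectory and a bell-shaped
# stem-density trajectory, the two patterns reported for the stand.
from math import exp

def logistic_biomass(t, k=200.0, r=0.3, t_mid=10.0):
    """Sigmoidal biomass accumulation: growth slows as biomass nears k."""
    return k / (1.0 + exp(-r * (t - t_mid)))

def bell_density(t, peak=5000.0, t_peak=8.0, width=6.0):
    """Bell-shaped stem density: recruitment-driven rise, then self-thinning."""
    return peak * exp(-((t - t_peak) / width) ** 2)

# Biomass accumulation decelerates after the inflection point...
early = logistic_biomass(12) - logistic_biomass(10)
late = logistic_biomass(22) - logistic_biomass(20)
# ...and density declines once mortality outpaces recruitment.
print(early > late, bell_density(8) > bell_density(16))
```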

1995, Vol 25 (2), pp. 244-252
Author(s): G.W. Slaughter, J.R. Parmeter Jr.

Enlargement of 53 tree-mortality centers surrounding pine stumps infected for about 35 years by Heterobasidion annosum (Fr.) Bref. was studied in northeastern California. At the end of the study period (1992), the mean area for all centers was 209 m² (range 16–701 m²), and the mean rate of radial enlargement was 0.22 m/year (0.7 ft/year). Rates decreased over time, and center enlargement essentially ceased at most centers 20–30 years after stump infection. However, 7 centers (13%) continued to enlarge into surrounding stands at a mean rate of radial enlargement of 0.39 m/year (1.3 ft/year) and averaged 557 m² (0.14 acre) by 1992. Continued enlargement of centers and extension of mortality beyond center stump root zones were associated with adjacent dead trees larger than 20 cm DBH. Many individual centers had different rates of enlargement at different points along their boundaries, ranging from 0 to 0.78 m/year (2.5 ft/year).
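The metric/imperial pairs quoted in the abstract can be checked with standard conversion factors (1 m ≈ 3.28084 ft; 1 acre ≈ 4,046.86 m²):

```python
# Sketch: verify the unit conversions reported in the abstract.
M_TO_FT = 3.28084
M2_PER_ACRE = 4046.86

def m_per_yr_to_ft_per_yr(rate_m):
    """Convert a radial enlargement rate from m/year to ft/year."""
    return rate_m * M_TO_FT

def m2_to_acres(area_m2):
    """Convert an area from square metres to acres."""
    return area_m2 / M2_PER_ACRE

# Mean rate for all centers: 0.22 m/year ~ 0.7 ft/year
print(round(m_per_yr_to_ft_per_yr(0.22), 1))  # 0.7
# Still-enlarging centers: 0.39 m/year ~ 1.3 ft/year; 557 m^2 ~ 0.14 acre
print(round(m_per_yr_to_ft_per_yr(0.39), 1))  # 1.3
print(round(m2_to_acres(557), 2))  # 0.14
```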


2012, Vol 42 (9), pp. 1687-1696
Author(s): H.C. Thorpe, L.D. Daniels

Tree mortality is a critical driver of stand dynamics, influencing forest structure, composition, and capacity for ecosystem service provision. In recent years, tree mortality has been gaining attention as dramatic occurrences of forest die-off have been linked to climate change. Using permanent sample plot data, we examined tree mortality rates in mature forests in west-central Alberta from 1956 to 2007. We quantified mortality risk at an individual-tree level as a function of size, local competition, and calendar year, a proxy for increasing temperature, and used maximum likelihood methods to estimate species-specific model parameters. Tree size and local competition were both important predictors of mortality risk. However, once these factors were included in our model, no additional variation could be attributed to calendar year, indicating that the trend of increasing tree mortality over time found in our raw data is primarily a result of stand development processes. This finding is supported by the changes in forest structure and composition that we documented over the study period. Stands generally increased in basal area and stem density, and lodgepole pine (Pinus contorta var. latifolia Engelm. ex S. Watson) declined in abundance relative to the more shade-tolerant black spruce (Picea mariana (Mill.) B.S.P.) and white spruce (Picea glauca (Moench) Voss). Our results indicate that warming-related changes did not affect background tree mortality rates in mature forests in the Alberta foothills over the study period. These results also provide critical information for future studies of forest dynamics in the region.
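The individual-tree mortality model described above (risk as a function of size and competition, fit by maximum likelihood) can be sketched in miniature. This is a hedged illustration with synthetic data and a plain logistic link, not the authors' species-specific parameterisation:

```python
# Sketch: individual-tree mortality probability as a logistic function of
# stem size (DBH) and a local competition index, with parameters estimated
# by maximum likelihood via Newton-Raphson. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 4000
dbh = rng.uniform(5, 50, n)       # diameter at breast height, cm
comp = rng.uniform(0, 10, n)      # local competition index (illustrative)
X = np.column_stack([np.ones(n), dbh, comp])
true_beta = np.array([-3.0, -0.05, 0.3])  # intercept, size, competition

p_true = 1.0 / (1.0 + np.exp(-X @ true_beta))
died = (rng.random(n) < p_true).astype(float)

beta = np.zeros(3)
for _ in range(25):               # Newton-Raphson on the log-likelihood
    eta = np.clip(X @ beta, -30, 30)
    p = 1.0 / (1.0 + np.exp(-eta))
    grad = X.T @ (died - p)                       # score vector
    hess = X.T @ (X * (p * (1 - p))[:, None])     # observed information
    beta = beta + np.linalg.solve(hess, grad)

print(np.round(beta, 2))  # estimates close to true_beta
```

The logistic log-likelihood is convex, so Newton's method converges to the global maximum from the zero start.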


2004, Vol 359 (1443), pp. 421-436
Author(s): S. L. Lewis, O. L. Phillips, T. R. Baker, J. Lloyd, Y. Malhi, ...

Several widespread changes in the ecology of old-growth tropical forests have recently been documented for the late twentieth century, in particular an increase in stem turnover (pan-tropical) and an increase in above-ground biomass (neotropical). Whether these changes are synchronous, and whether changes in growth are also occurring, is not known. We analysed stand-level changes within 50 long-term monitoring plots from across South America spanning 1971–2002. We show that: (i) basal area (BA: sum of the cross-sectional areas of all trees in a plot) increased significantly over time (by 0.10 ± 0.04 m² ha⁻¹ yr⁻¹, mean ± 95% CI), as did both (ii) stand-level BA growth rates (sum of the increments of BA of surviving trees and BA of new trees that recruited into a plot) and (iii) stand-level BA mortality rates (sum of the cross-sectional areas of all trees that died in a plot). Similar patterns were observed on a per-stem basis: (i) stem density (number of stems per hectare; 1 hectare is 10⁴ m²) increased significantly over time (0.94 ± 0.63 stems ha⁻¹ yr⁻¹), as did both (ii) stem recruitment rates and (iii) stem mortality rates. In relative terms, the pools of BA and stem density increased by 0.38 ± 0.15% and 0.18 ± 0.12% yr⁻¹, respectively. The fluxes into and out of these pools (stand-level BA growth, stand-level BA mortality, stem recruitment and stem mortality rates) increased, in relative terms, by an order of magnitude more. The gain terms (BA growth, stem recruitment) consistently exceeded the loss terms (BA loss, stem mortality) throughout the period, suggesting that whatever process is driving these changes was already acting before the plot network was established. Large long-term increases in stand-level BA growth and simultaneous increases in stand BA and stem density imply a continent-wide increase in resource availability which is increasing net primary productivity and altering forest dynamics.
Continent-wide changes in incoming solar radiation, and increases in atmospheric concentrations of CO₂ and air temperatures, may have increased resource supply over recent decades, thus causing accelerated growth and increased dynamism across the world's largest tract of tropical forest.
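The relative figures quoted above follow from dividing each absolute flux by the size of the pool it changes. A small sketch, with the pool size back-derived for illustration (it is not stated in the abstract):

```python
# Sketch: convert an absolute annual change into the relative (% per year)
# form used in the abstract.

def relative_rate(absolute_change_per_yr, pool_size):
    """Relative annual change, in % per year."""
    return 100.0 * absolute_change_per_yr / pool_size

# A BA gain of 0.10 m^2 ha^-1 yr^-1 on a plot holding ~26.3 m^2 ha^-1 of BA
# corresponds to the reported 0.38% yr^-1.
print(round(relative_rate(0.10, 26.3), 2))  # 0.38
```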


2020, Vol 41 (S1), pp. s216-s217
Author(s): Cassandra Salgado, Stephanie O’Driscoll, Shruti Puri, Adrienne Lorek, Scott Curry

Background: Acute-care hospitals began reporting methicillin-resistant Staphylococcus aureus (MRSA) LabID facility-wide inpatient events to the NHSN in 2013. Few data are available regarding the epidemiology of these patients. Methods: We conducted a retrospective cohort study of patients who developed hospital-onset Staphylococcus aureus bloodstream infections (HO-SA BSIs) to describe the epidemiology (characteristics and outcomes) from January 2014 through June 2019 and to compare MRSA LabID BSIs to HO-MSSA BSIs. Proportions were compared using χ² tests and continuous variables using the Kruskal-Wallis test (EpiInfo). Results: Overall, 264 HO-SA BSIs occurred over the study period (2.21 per 10,000 patient days), 160 HO-MSSA BSIs (1.34 per 10,000 patient days), and 104 MRSA LabID BSIs (0.869 per 10,000 patient days). These rates have not significantly changed over time (Fig. 1). Most of these patients were men (64%); 42.4% were African-American; mean age was 43.5 years; mean Charlson comorbidity index was 3.2; 67.8% were admitted for medical care (vs surgical); and 13.3% had a previous history of S. aureus infection. Of all HO-SA BSIs, 49.2% were acquired in the ICU, 53.8% were primary BSIs, and 37.9% were catheter associated. Patients were hospitalized a mean of 19.9 days prior to HO-SA BSI, and the mean overall length of stay was 48.5 days. Compared to HO-MSSA BSIs, there were no significant differences in these characteristics among MRSA LabID BSIs except that a significantly greater proportion were catheter associated (46.2% vs 32.5%; OR, 1.78; 95% CI, 1.07–2.96; P = .04). Overall, 101 patients (38.3%) died: 41 with MRSA LabID BSI (39.4%) and 60 with HO-MSSA BSI (37.5%). Mortality rates have not changed significantly over time. The mean number of days to death was 154.2, and 59 patients (22.3%) died during incident hospitalization: 26.9% of MRSA patients and 19.4% of MSSA BSI patients.
Moreover, 28.3% of patients were readmitted within 30 days of discharge from incident hospitalization, and compared to HO-MSSA BSI, this rate was significantly higher among MRSA LabID BSI patients (34.2% vs 24.8%; OR, 2.07; 95% CI, 1.09–3.93; P = .03). Among those who died, 58.4% died during hospitalization, 52.5% died within 30 days, 66.3% died within 60 days, and 74.3% had died within 90 days. Also, 47.5% died as a result of their HO-SA BSI, and compared to HO-MSSA BSI, this rate was significantly higher among those with MRSA LabID BSI (63.4% vs 36.7%; OR, 2.99; 95% CI, 1.31–6.83; P = .02). Conclusions: Among patients with HO-SA BSI, methicillin resistance continues to be associated with higher attributable mortality and, in our study, higher rates of 30-day readmission. There has been no significant change in HO-SA BSI rates (MSSA or MRSA) since reporting for MRSA LabID events began. Furthermore, mortality rates have not changed and remain high for both MRSA BSI and MSSA BSI patients. Given these findings, MSSA LabID event reporting should be considered. Funding: None. Disclosures: None.
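The odds ratios above can be reproduced from the reported percentages and denominators. A minimal sketch, with counts back-derived from the catheter-association comparison (46.2% of 104 MRSA patients vs. 32.5% of 160 MSSA patients):

```python
# Sketch: odds ratio and Wald 95% CI from a 2x2 table.
from math import log, exp, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Wald 95% CI for the 2x2 table [[a, b], [c, d]]:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Counts back-derived from the abstract's percentages (48/104 vs 52/160).
or_, lo, hi = odds_ratio_ci(48, 56, 52, 108)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 1.78 1.07 2.96
```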


2020, Vol 655, pp. 139-155
Author(s): DC Yates, SI Lonhart, SL Hamilton

Marine reserves are often designed to increase density, biomass, size structure, and biodiversity by prohibiting extractive activities. However, the recovery of predators following the establishment of marine reserves and the consequent cessation of fishing may have indirect negative effects on prey populations by increasing prey mortality. We coupled field surveys with empirical predation assays (i.e. tethering experiments) inside and outside of 3 no-take marine reserves in kelp forests along the central California coast to quantify the strength of interactions between predatory fishes and their crustacean prey. Results indicated elevated densities and biomass of invertebrate predators inside marine reserves compared to nearby fished sites, but no significant differences in prey densities. The increased abundance of predators inside marine reserves translated to a significant increase in mortality of 2 species of decapod crustaceans, the dock shrimp Pandalus danae and the cryptic kelp crab Pugettia richii, in tethering experiments. Shrimp mortality rates were 4.6 times greater, while crab mortality rates were 7 times greater inside reserves. For both prey species, the time to 50% mortality was negatively associated with the density and biomass of invertebrate predators (i.e. higher mortality rates where predators were more abundant). Video analyses indicated that macro-invertivore fishes arrived 2 times faster to tethering arrays at sites inside marine reserves and began attacking tethered prey more rapidly. The results indicate that marine reserves can have direct and indirect effects on predators and their prey, respectively, and highlight the importance of considering species interactions in making management decisions.
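The reported times to 50% mortality can be related to predation pressure under a constant-hazard (exponential survival) assumption; this model is an assumption of the illustration, not a method stated in the abstract, and the hazard values are invented:

```python
# Sketch: under exponential survival, the time until half the tethered prey
# are consumed is ln(2) / hazard, so a hazard proportional to predator
# abundance yields the reported negative association with predator density.
from math import log

def time_to_half_mortality(hazard_per_hour):
    """Time (hours) until 50% of tethered prey are consumed."""
    return log(2) / hazard_per_hour

inside_reserve = time_to_half_mortality(0.28)   # higher predator density
outside_reserve = time_to_half_mortality(0.04)  # lower predator density
print(round(inside_reserve, 1), round(outside_reserve, 1))  # 2.5 17.3
# A 7x higher hazard inside gives a 7x shorter time to 50% mortality,
# mirroring the ~7x higher crab mortality rate reported inside reserves.
```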


Neonatology, 2021, pp. 1-9
Author(s): Matthias Fröhlich, Tatjana Tissen-Diabaté, Christoph Bührer, Stephanie Roll

Introduction: In very low birth weight (<1,500 g, VLBW) infants, morbidity and mortality have decreased substantially during the past decades, and both are known to be lower in girls than in boys. In this study, we assessed sex-specific changes over time in length of hospital stay (LOHS) and postmenstrual age at discharge (PAD), in addition to survival, in VLBW infants. Methods: This is a single-center retrospective cohort analysis based on quality assurance data of VLBW infants born from 1978 to 2018. Estimation of sex-specific LOHS over time was based on infants discharged home from neonatal care or deceased. Estimation of sex-specific PAD over time was based on infants discharged home exclusively. Analysis of in-hospital survival was performed for all VLBW infants. Results: In 4,336 of 4,499 VLBW infants admitted from 1978 to 2018 with complete data (96.4%), survival rates improved between 1978–1982 and 1993–1997 (70.8 vs. 88.3%; hazard ratio (HR) 0.20; 95% confidence interval 0.14, 0.30) and remained stable thereafter. Boys had consistently higher mortality rates than girls (15 vs. 12%; HR 1.23 [1.05, 1.45]). Nonsurviving boys died later than nonsurviving girls (adjusted mean survival time 23.0 [18.0, 27.9] vs. 20.7 [15.0, 26.3] days). LOHS and PAD, assessed in 3,166 survivors, decreased continuously over time (1978–1982 vs. 2013–2018: LOHS 82.9 [79.3, 86.5] vs. 60.3 [58.4, 62.1] days; PAD 40.4 [39.9, 40.9] vs. 37.4 [37.1, 37.6] weeks). Girls had shorter LOHS than boys (69.4 [68.0, 70.8] vs. 73.0 [71.6, 74.4] days) and were discharged with lower PAD (38.6 [38.4, 38.8] vs. 39.2 [39.0, 39.4] weeks). Discussion/Conclusions: LOHS and PAD decreased over the last 40 years, while survival rates improved. Male sex was associated with longer LOHS, higher PAD, and higher mortality rates.


2021, pp. 107815522110160
Author(s): Bernadatte Zimbwa, Peter J Gilbar, Mark R Davis, Srinivas Kondalsamy-Chennakesavan

Purpose To retrospectively determine the rate of death occurring within 14 and 30 days of systemic anticancer therapy (SACT), compare this against a previous audit, and benchmark results against other cancer centres; secondly, to determine whether the introduction of immune checkpoint inhibitors (ICIs), not available at the time of the initial audit, impacted mortality rates. Method All adult solid tumour and haematology patients receiving SACT at an Australian Regional Cancer Centre (RCC) between January 2016 and July 2020 were included. Results Over the 55-month period, 1,709 patients received SACT. The proportions of patients dying within 14 and 30 days of SACT were 3.3% and 7.0%, respectively, slightly higher than in our previous audit (1.89% and 5.6%). Mean time to death was 15.5 days. Males accounted for 63.9% of patients, and the mean age was 66.8 years. Of the 119 patients dying within 30 days of SACT, 46.2% started a new line of treatment during that time. Of 98 patients receiving ICIs, 22.5% died within 30 days of commencement. Disease progression was the most common cause of death (79%). The most common place of death was the RCC (38.7%). Conclusion The rate of death observed in our re-audit compares favourably with our previous audit and is still at the lower end of that seen in published studies in Australia and internationally. Cases of patients dying within 30 days of SACT should be regularly reviewed to maintain awareness of this quality-assurance benchmark and provide a feedback process for clinicians.


Author(s): Simona Malaspina, Vesa Oikonen, Anna Kuisma, Otto Ettala, Kalle Mattila, ...

Abstract Purpose This phase 1 open-label study evaluated the uptake kinetics of a novel theranostic PET radiopharmaceutical, 18F-rhPSMA-7.3, to optimise its use for imaging of prostate cancer. Methods Nine men, three with high-risk localised prostate cancer, three with treatment-naïve hormone-sensitive metastatic disease and three with castration-resistant metastatic disease, underwent dynamic 45-min PET scanning of a target area immediately post-injection of 300 MBq 18F-rhPSMA-7.3, followed by two whole-body PET/CT scans acquired from 60 and 90 min post-injection. Volumes of interest (VoIs) corresponding to prostate cancer lesions and reference tissues were recorded. Standardised uptake values (SUV) and lesion-to-reference ratios were calculated for 3 time frames: 35–45, 60–88 and 90–118 min. Net influx rates (Ki) were calculated using Patlak plots. Results Altogether, 44 lesions from the target area were identified. Optimal visual lesion detection started 60 min post-injection. The 18F-rhPSMA-7.3 signal from prostate cancer lesions increased over time, while reference tissue signals remained stable or decreased. The mean (SD) SUV (g/mL) at the 3 time frames were 8.4 (5.6), 10.1 (7) and 10.6 (7.5), respectively, for prostate lesions, 11.2 (4.3), 13 (4.8) and 14 (5.2) for lymph node metastases, and 4.6 (2.6), 5.7 (3.1) and 6.4 (3.5) for bone metastases. The mean (SD) lesion-to-reference ratio increases from the earliest to the 2 later time frames were 40% (10) and 59% (9), respectively, for the prostate, 65% (27) and 125% (47) for metastatic lymph nodes and 25% (19) and 32% (30) for bone lesions. Patlak plots from lesion VoIs signified almost irreversible uptake kinetics. Ki, SUV and lesion-to-reference ratio estimates showed good agreement. Conclusion 18F-rhPSMA-7.3 uptake in prostate cancer lesions was high. Lesion-to-background ratios increased over time, with optimal visual detection starting from 60 min post-injection. 
Thus, 18F-rhPSMA-7.3 emerges as a very promising PET radiopharmaceutical for diagnostic imaging of prostate cancer. Trial Registration NCT03995888 (24 June 2019).
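The Patlak analysis named in the abstract linearises irreversible tracer uptake: plotting C_tissue/C_plasma against the normalised integral of the plasma input gives a late-phase slope equal to the net influx rate Ki. A minimal sketch with synthetic curves (the tracer kinetics below are invented for illustration; only the linearisation step mirrors the abstract's method):

```python
# Sketch of the Patlak graphical analysis used to estimate Ki.
import numpy as np

t = np.linspace(1, 45, 90)                     # minutes post-injection
c_plasma = 10.0 * np.exp(-0.1 * t) + 1.0       # synthetic plasma input
true_ki, v0 = 0.05, 0.3                        # simulated influx rate, blood volume term
int_cp = np.cumsum(c_plasma) * (t[1] - t[0])   # crude running integral of input
c_tissue = true_ki * int_cp + v0 * c_plasma    # irreversible-uptake tissue curve

# Patlak transform: y = C_tissue/C_plasma vs. x = integral(C_plasma)/C_plasma;
# the slope of the late linear phase estimates Ki.
x = int_cp / c_plasma
y = c_tissue / c_plasma
ki_est, intercept = np.polyfit(x[30:], y[30:], 1)
print(round(ki_est, 3))  # recovers the simulated Ki of 0.05
```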


Weed Science, 2020, pp. 1-23
Author(s): Tao Li, Jiequn Fan, Zhenguan Qian, Guohui Yuan, Dandan Meng, ...

Abstract The use of a corn-earthworm coculture (CE) system is an eco-agricultural technology that has been gradually extended due to its high economic output and diverse ecological benefits for urban agriculture in China. However, the effect of CE on weed occurrence has received little attention. A five-year successive experiment (2015 to 2019) was conducted to compare weed occurrence in CE and a corn (Zea mays L.) monoculture (CM). The results show that CE significantly decreased weed diversity, the dominance index, total weed density and biomass, but increased the weed evenness index. The five-year mean number of weed species per plot was 8.4 in CE and 10.7 in CM. Compared to those in CM, the five-year mean density and biomass of total weeds in CE decreased by 59.2% and 66.6%, respectively. The effect of CE on weed occurrence was species specific. The mean density of large crabgrass [Digitaria sanguinalis (L.) Scop.], green foxtail [Setaria viridis (L.) Beauv.], goosegrass [Eleusine indica (L.) Gaertn.], and common purslane (Portulaca oleracea L.) in CE decreased by 94.5, 78.1, 75.0, and 45.8%, whereas the mean biomass decreased by 96.2, 80.8, 76.9, and 41.4%, respectively. Our study suggests that the use of CE could suppress weed occurrence and reduce herbicide inputs in agriculture.
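The diversity and evenness indices compared above are commonly Shannon's H′ and Pielou's J (an assumption here; the abstract does not name its formulas). A minimal sketch with illustrative counts in which the monoculture holds more, less evenly distributed weed species:

```python
# Sketch: Shannon diversity and Pielou evenness from species counts.
from math import log

def shannon_h(counts):
    """Shannon diversity index H' = -sum(p_i * ln p_i)."""
    total = sum(counts)
    ps = [c / total for c in counts if c > 0]
    return -sum(p * log(p) for p in ps)

def pielou_j(counts):
    """Pielou evenness J = H' / ln(S), S = number of species present."""
    s = sum(1 for c in counts if c > 0)
    return shannon_h(counts) / log(s)

monoculture = [60, 40, 30, 25, 20, 15, 6, 4]  # more species, uneven
coculture = [12, 9, 7, 5, 3]                  # fewer species, more even
print(round(shannon_h(monoculture), 2), round(shannon_h(coculture), 2))
print(round(pielou_j(monoculture), 2), round(pielou_j(coculture), 2))
# Direction matches the abstract: lower diversity but higher evenness in CE.
```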

