Nursery Seedbed Density Is Determined by Short-Term or Long-Term Objectives

1987 ◽  
Vol 11 (1) ◽  
pp. 9-14 ◽  
Author(s):  
Jon P. Caulfield ◽  
David B. South ◽  
James N. Boyer

Abstract Lowering nursery seedbed density can increase the proportion of high-quality (grade 1 and 2) seedlings relative to cull (grade 3) seedlings. Outplanting higher grade seedlings can increase survival and volume production. Lowering seedbed density from present levels may therefore increase stand value at rotation age. The relationship between seedbed density and seedling grade is evaluated at four density levels (60, 90, 120, and 150 seedlings/lineal bed foot) for slash (Pinus elliottii Engelm.) and loblolly (Pinus taeda L.) pine, and the impact of grade on growth performance is projected. An economic analysis demonstrates how to determine the present value of the expenditure justified to alter seedbed density to obtain a projected future change in outplanting performance. Potential economic gains ranging from -$4.13 to $27.58 per thousand seedlings were derived by altering seedbed density from a base-level density of 120 seedlings/lineal bed foot. Positive values were associated with decreases in density and negative values with density increases. Site quality of outplanted areas plays a major role in determining the amount of the justifiable expenditure. South. J. Appl. For. 11(1):9-14.
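The economic core of the analysis is a present-value comparison; as an illustrative formula (symbols ours, not the authors'), the expenditure justified per thousand seedlings is

$$E_0 = \frac{\Delta V_R}{(1 + r)^R}$$

where $\Delta V_R$ is the projected change in stand value at rotation age $R$ attributable to the altered grade mix and $r$ is the real discount rate. A positive $\Delta V_R$ (from lowering density) yields a positive justifiable expenditure, a negative one yields the negative values reported, and site quality enters through its effect on $\Delta V_R$.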

2021 ◽  
Vol 39 (15_suppl) ◽  
pp. e14551-e14551
Author(s):  
Charline Lafayolle de la Bruyère ◽  
Julien Peron ◽  
Pierre Jean Souquet

e14551 Background: It remains unclear whether immune-related adverse events (irAEs) and glucocorticoid use impact long-term outcomes in patients treated for solid tumors with immune checkpoint inhibitors (ICI). Methods: All patients treated with a single-agent ICI for any advanced cancer were included in this retrospective multicentric study. Objectives were to assess the impact of grade 3+ irAEs, glucocorticoid use, and interruption of immunotherapy on progression-free survival (PFS) and overall survival (OS). Data were collected retrospectively using a standardized collection form. Adverse events were categorized as irAEs based on the judgement of the treating physicians, using the Common Terminology Criteria for Adverse Events, version 4.0. Only grade 3 and higher irAEs were considered in this study. Because irAEs may occur late in follow-up while progression or death may occur early, an immortal-time bias can arise: patients responding to ICI receive the ICI, and are therefore at risk of irAEs, for a longer period of time. The first occurrence of an irAE was therefore entered into Cox models as a time-varying covariate. Similar methods were used to evaluate the impact of glucocorticoid introduction or ICI interruption. Results: In this 828-patient cohort, 78 patients presented at least one grade 3+ irAE. The first occurrence of a grade 3+ irAE had no significant impact on PFS (HR 0.94; 95% CI 0.70-1.26; p = 0.70) or OS (HR 0.82; 95% CI 0.60-1.12; p = 0.21). Glucocorticoids were required by 65% of patients on anti-CTLA4 and 55% of patients on anti-PD(L)1 therapy, which was associated with significantly shorter PFS (adjusted HR 3.0; 95% CI 1.6-5.4; p = 0.00040) and a trend toward shorter OS. Grade 3+ irAEs led to interruption of the ICI in 82% of patients, which was associated with significantly shorter PFS (adjusted HR 3.5; 95% CI 1.7-6.0; p < 0.00043) and shorter OS (HR 4.5; 95% CI 1.7-12.1; p = 0.0027). Glucocorticoid use was statistically associated with immunotherapy interruption. Conclusions: In our population of patients treated with single-agent ICI, grade 3+ irAEs did not impact long-term outcomes. However, the need for glucocorticoids and the interruption of immunotherapy resulted in poorer long-term outcomes. The impact of grade 3+ irAEs reported in other studies might therefore be explained by the management of the irAEs.
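A minimal sketch of this time-varying-covariate setup (not the authors' code); the lifelines dependency, column names, and all numbers are assumptions for illustration:

```python
# Sketch: entering the first grade 3+ irAE as a time-varying covariate
# in a Cox model to avoid immortal-time bias. Long-format data: one row
# per patient per exposure interval; all values here are hypothetical.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Patient 1 has an irAE at month 4.2, then progresses at month 11.0;
# patient 4 has an irAE at month 2.0 and is censored at month 9.0.
long_df = pd.DataFrame({
    "id":    [1, 1, 2, 3, 4, 4],
    "start": [0.0, 4.2, 0.0, 0.0, 0.0, 2.0],
    "stop":  [4.2, 11.0, 6.5, 12.0, 2.0, 9.0],
    "irae":  [0, 1, 0, 0, 0, 1],   # grade 3+ irAE status during interval
    "event": [0, 1, 1, 0, 0, 0],   # progression/death at interval end
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()  # hazard ratio for `irae`, free of immortal-time bias
```

Splitting each patient's follow-up at the first irAE keeps the pre-irAE person-time in the unexposed group, which is exactly the correction the Methods describe.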


Blood ◽  
2008 ◽  
Vol 112 (11) ◽  
pp. 342-342 ◽  
Author(s):  
David A. Rizzieri ◽  
Robert Storms ◽  
Daniel Nikcevich ◽  
Bercedis Peterson ◽  
Debashish Misra ◽  
...  

Abstract Introduction: Early response rates to non-myeloablative therapy are encouraging; however, long-term remissions remain elusive. Manipulating donor lymphocyte infusions (DLI) to preferentially infuse Natural Killer (NK) cells, which typically comprise 3–5% of a DLI graft, may promote better antitumor and anti-infective surveillance while reducing the risk of graft versus host disease (GVHD). We investigated the feasibility of providing NK cell-enhanced DLI following T cell-depleted non-myeloablative allogeneic transplants. Methods: Patients underwent an alemtuzumab- and fludarabine-based non-myeloablative preparative regimen with a 3-6/6 HLA-matched related donor. At 6 weeks posttransplant, those who engrafted and did not have severe GVHD received NK cell-enhanced DLIs, repeated twice at 8-week intervals. For DLI, NK cells were enriched in a single step using the CliniMACS CD56 Reagent and CliniMACSplus instrument, per the manufacturer's protocols (Miltenyi Biotec Inc, Auburn, CA). The total cell dose infused in patients receiving HLA-mismatched DLI was no more than 0.5 × 10^6 CD3+CD56neg cells/kg patient weight. Patients receiving HLA-matched DLI (6/6) received no more than 10^6 CD3+CD56neg cells/kg patient weight. Analysis: The primary endpoints for feasibility were mortality, occurrence of severe acute GVHD or other unacceptable toxicity, response, and duration of response. Efficacy was measured by Progression Free Survival (PFS) and Overall Survival (OS). NK cell function was used as a primary endpoint for immune recovery and was measured by flow cytometry using methods that we had previously validated using unfractionated PBMC and CD56+-enriched NK cell preparations. Results: The NK cell selections worked well, with only one device failure resulting in low purity. NK cell purity was 92% ± 3.5 and yield 74% ± 16. The resulting cell preparations had low frequencies of CD4+, CD8+, and gamma-delta T cells.

Table 1. Clinical feasibility of enhancing DLIs for NK cells using the Miltenyi system.

Group        Statistic   % Purity   % Yield   CD3+CD56-/kg ×10^5   Total CD56+ ×10^7   CD3+CD56+/kg ×10^6   CD3-CD56+ ×10^6
Matched      Median      95.32      83.44      5.34                 1.12                 1.94                 9.21
             St Dev       7.96      21.35     10.46                 0.65                 2.22                 7.91
Mismatched   Median      97.46      77.80      2.74                 1.44                 3.67                 9.21
             St Dev       3.24      24.05      7.79                 0.61                 2.41                 5.56

Ten patients enrolled had HLA-matched (6/6) sibling donors. Of these, 3 had AML, 2 ALL, 3 follicular lymphoma/CLL, 1 myeloma, and 1 myeloproliferative disorder. At entry, 6 had active disease, 3 were in 2nd CR, and 1 was in 1st CR with high-risk ALL. These patients received a total of 15 NK cell-enhanced DLIs. Infusions were well tolerated, with 2 cases of overall grade 2 GVHD (grade 3 skin; grade 1 gut) and one case of grade 3 GVHD (grade 3 skin; grade 1 gut and liver). Four of 10 remain alive and in continuous complete remission. Fourteen patients enrolled had HLA-mismatched (3-4/6) related donors. Six had AML, 3 transformed AML, 2 ALL, 1 T cell NHL, 1 myeloproliferative disease, and 1 severe aplastic anemia. At the time of transplantation, only 1 subject was in CR1, 7 were in CR2, and 6 were relapsed/refractory. These patients received a total of 27 NK cell-enhanced DLIs. Despite the HLA mismatch, the infusions were well tolerated, with 4 cases of overall grade 1 GVHD (primarily skin), 2 grade 2, and 1 grade 4 (gut and liver). Infections were a concern: 1 patient died of infection and 2 others experienced sepsis; further, 3 had parainfluenza, 1 VZV, and 2 polyoma virus in the bladder. Eight patients remain alive and 7 are in continuous remission.
NK cell function was measured in 22 patients (Figure 1). Figure 1: (A) At 6 to 8 weeks post-transplant, some NK cell function had returned in 7 of 22 patients. Among the other patients, 7 demonstrated low NK cell function (bracket) while 8 others did not recover lymphocytes (not shown). (B) The impact of NK DLI was monitored in 7 patients who had not previously responded. Of those patients, 4 responded within 6 to 8 weeks post-DLI. (C) In one patient, NK cell function returned gradually following a 2nd and 3rd DLI. Conclusion: NK cell-enhanced DLI can be safely delivered following T cell-depleted non-myeloablative allogeneic transplantation. Subsequent infusions may allow for improved function. Longer follow-up is needed to evaluate effects on long-term toxicity and durability of response.
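The feasibility metrics in Table 1 reduce to a few ratios; a minimal sketch of that bookkeeping (not the trial's software), with hypothetical input counts and only the per-kilogram T-cell dose caps taken from the Methods:

```python
# Sketch of DLI product bookkeeping: purity, yield, and the CD3+CD56-
# dose-per-kg check against the protocol caps (0.5e6/kg mismatched,
# 1e6/kg matched). All input counts are invented for illustration.
pre_cd56pos = 2.0e9           # CD56+ cells before selection
post_total = 1.6e9            # total cells in the enriched product
post_cd56pos = 1.5e9          # CD56+ cells recovered
post_cd3pos_cd56neg = 4.0e7   # contaminating CD3+CD56- T cells
weight_kg = 80.0
hla_matched = False           # 3-4/6 mismatched donor

purity = post_cd56pos / post_total          # fraction CD56+ in product
yield_frac = post_cd56pos / pre_cd56pos     # CD56+ recovery
t_dose = post_cd3pos_cd56neg / weight_kg    # CD3+CD56- cells per kg

cap = 1.0e6 if hla_matched else 0.5e6       # protocol dose cap per kg
print(f"purity {purity:.1%}, yield {yield_frac:.1%}, "
      f"T-cell dose {t_dose:.2e}/kg (cap {cap:.1e}/kg)")
assert t_dose <= cap, "product exceeds the protocol T-cell dose cap"
```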


Cancers ◽  
2021 ◽  
Vol 13 (10) ◽  
pp. 2365
Author(s):  
Charline Lafayolle de la Bruyère ◽  
Pierre-Jean Souquet ◽  
Stéphane Dalle ◽  
Pauline Corbaux ◽  
Amélie Boespflug ◽  
...  

It remains unclear whether immune-related adverse events (irAEs) and glucocorticoid use could impact long-term outcomes in patients treated for solid tumors with immune checkpoint inhibitors (ICI). All patients treated with a single-agent ICI for any advanced cancer were included in this retrospective unicentric study. The objectives were to assess the impact of grade ≥3 irAEs, glucocorticoid use and the interruption of immunotherapy on progression-free survival (PFS) and overall survival (OS). In this 828-patient cohort, the first occurrence of grade ≥3 irAEs had no significant impact on PFS or OS. Glucocorticoid administration for the irAEs was associated with a significantly shorter PFS (adjusted HR 3.0; p = 0.00040) and a trend toward shorter OS. ICI interruption was associated with a significantly shorter PFS (adjusted HR 3.5; p < 0.00043) and shorter OS (HR 4.5; p = 0.0027). Glucocorticoid administration and ICI interruption were correlated. In our population of patients treated with single-agent ICI, grade ≥3 irAEs did not impact long-term outcomes. However, the need for glucocorticoids and the interruption of immunotherapy resulted in poorer long-term outcomes. The impact of grade ≥3 irAEs reported in other studies might then be explained by the management of the irAEs.
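The reported correlation between glucocorticoid administration and ICI interruption is a 2×2 association; a hedged sketch of such a check (counts invented, not the study's data):

```python
# Sketch: testing the association between glucocorticoid use and ICI
# interruption among patients with grade >=3 irAEs, via a 2x2 Fisher
# exact test. Counts below are hypothetical.
from scipy.stats import fisher_exact

#            interrupted  continued
table = [[40, 8],    # glucocorticoids given
         [12, 18]]   # no glucocorticoids
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio {odds_ratio:.1f}, p = {p_value:.2g}")
```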


1998 ◽  
Vol 22 (4) ◽  
pp. 201-208 ◽  
Author(s):  
Dwight K. Lauer ◽  
Glenn R. Glover

Abstract Herbicide treatments were used at four flatwood locations in north Florida and south Georgia to compare early pine response to control of herbaceous and shrub vegetation following bedding. Treatments consisted of three levels of shrub control (none, first year, repeated) with and without first year herbaceous vegetation control. All studies were located on spodosols planted with either loblolly (Pinus taeda L.) or slash (Pinus elliottii Engelm.) pine. Responses to shrub control were about twice those to herbaceous control, with height responses of 2.2, 5.0, and 6.9 ft due to first year herbaceous control, shrub control, and the combination of both herbaceous and shrub control, respectively. Pine response did not differ with duration of shrub control because the difference in shrub cover between first year and repeated shrub control treatments was minor in these young stands. Pines averaged 18.3 ft in height and 3.2 in. in dbh 5 yr after planting when both herbaceous and shrub vegetation were controlled with these operational-like site preparation treatments that combine bedding with first year herbicide applications. Shrub occupancy was highest on treatments that did not include shrub control and continued to increase through the first 5 yr. Operational site-preparation treatments that combine bedding with herbicide applications should be considered in situations where shrub vegetation is present because of the long-term impact that shrubs have on pine yield. South. J. Appl. For. 22(4):201-208.


2001 ◽  
Vol 25 (1) ◽  
pp. 7-16 ◽  
Author(s):  
Michael D. Cain ◽  
Michael G. Shelton

Abstract A study was initiated in 1943 to evaluate the long-term productivity of loblolly (Pinus taeda L.) and shortleaf pines (P. echinata Mill.) when managed under four reproduction cutting methods—clearcut, heavy seedtree, diameter-limit, and selection—on the Upper Coastal Plain of southeastern Arkansas. Early volume production reflected retention of residual pines, and the clearcut was the least productive method through the first 36 yr. After 53 yr, there were no statistically significant (P = 0.07) differences among cutting methods in sawlog volume production, which averaged 3,800 ft³/ac. In terms of sawlog volume (bd ft/ac, Doyle scale), total production on clearcut, seedtree, and selection plots exceeded (P < 0.01) that on diameter-limit plots by 37%, but there were no differences in sawlog volume production among the other cutting methods. Results suggest that forest landowners should consider the advantages and disadvantages of each cutting method when planning their long-term objectives. South. J. Appl. For. 25(1):7–16.


1999 ◽  
Vol 29 (6) ◽  
pp. 737-742 ◽  
Author(s):  
Jeremy T Brawner ◽  
Douglas R Carter ◽  
Dudley A Huber ◽  
Timothy L White

Midrotation data from large block plots of resistant and susceptible slash (Pinus elliottii Engelm.) and loblolly pine (Pinus taeda L.) were used in combination with the Georgia pine plantation simulator growth model to provide projected gains per hectare in volume and value generated by resistance to fusiform rust (Cronartium quercuum (Berk.) Miyabe ex Shirai f.sp. fusiforme). The difference in projected volume production between the resistant and susceptible planting stock of slash pine was larger than the difference between resistance levels in loblolly pine. The increases in projected volume and the reductions in percent infection of the resistant stock led to large differences in the value of the resistant and susceptible planting stock. At a 6% real discount rate, plantations of resistant slash pine were on average worth between 40.2 and 89.8% more than plantations of susceptible slash pine. Plantations of resistant loblolly pine were on average worth between 6.1 and 40.3% more than plantations of susceptible loblolly pine. However, the marginal value of rust resistance in loblolly was not significantly different from zero under the assumption that economic differences are due only to volume losses and not to losses from product degrade.
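A minimal sketch of the valuation mechanics implied above; the cash flows and rotation length are hypothetical, with only the 6% real discount rate taken from the abstract:

```python
# Sketch: comparing resistant vs susceptible planting stock by the
# ratio of net present values at a 6% real discount rate. All cash
# flows below are invented for illustration.
def npv(cash_flows, rate=0.06):
    """Discount (year, amount) cash flows to present value."""
    return sum(amount / (1 + rate) ** year for year, amount in cash_flows)

establishment = [(0, -500.0)]                 # $/ha planting cost
resistant = establishment + [(25, 9000.0)]    # harvest revenue, little rust loss
susceptible = establishment + [(25, 6500.0)]  # harvest revenue after rust losses

gain = npv(resistant) / npv(susceptible) - 1.0
print(f"resistant stock worth {gain:.1%} more")
# ~57% with these inputs, inside the 40.2-89.8% range reported for slash pine
```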


2011 ◽  
Vol 70 (1) ◽  
pp. 5-11 ◽  
Author(s):  
Beat Meier ◽  
Anja König ◽  
Samuel Parak ◽  
Katharina Henke

This study investigates the impact of thought suppression over a 1-week interval. In two experiments with 80 university students each, we used the think/no-think paradigm, in which participants initially learn a list of word pairs (cue-target associations). They were then presented with some of the cue words again and were instructed either to respond with the target word or to avoid thinking about it. In the final test phase, their memory for the initially learned cue-target pairs was tested. In Experiment 1, the type of memory test was manipulated (i.e., direct vs. indirect). In Experiment 2, the type of no-think instruction was manipulated (i.e., suppress vs. substitute). Overall, our results showed poorer memory for no-think and control items compared to think items across all experiments and conditions. Critically, however, more no-think than control items were remembered after the 1-week interval in the direct, but not in the indirect test (Experiment 1), and with thought suppression, but not thought substitution, instructions (Experiment 2). We suggest that during thought suppression a brief reactivation of the learned association may lead to reconsolidation of the memory trace and hence to better retrieval of suppressed than control items in the long term.


2003 ◽  
Author(s):  
Teresa Garate-Serafini ◽  
Jose Mendez ◽  
Patty Arriaga ◽  
Larry Labiak ◽  
Carol Reynolds
