Flower Bud Cold Hardiness of ‘Muskoka’ Red Raspberry as Related to Water Content in Late Winter

2006 ◽  
Vol 6 (1) ◽  
pp. 63-76 ◽  
Author(s):  
Leila Keinänen ◽  
Pauliina Palonen ◽  
Leena Lindén


2007 ◽  
Vol 132 (1) ◽  
pp. 73-79 ◽  
Author(s):  
Scott R. Kalberer ◽  
Rajeev Arora ◽  
Norma Leyva-Estrada ◽  
Stephen L. Krebs

Dehardening resistance and rehardening capacity in late winter and spring are important factors contributing to the winter survival of woody perennials. Previously, the authors determined the midwinter hardiness, dehardening resistance, and rehardening capacity of deciduous azalea (Rhododendron L.) floral buds in early winter. The purpose of this study was to investigate how these parameters changed as winter progressed and to compare rehardening responses at three treatment temperatures. Experiments were also conducted to measure bud water content during dehardening and to track chilling accumulation in 10 azalea genotypes. Buds of R. arborescens (Pursh) Torr., R. canadense (L.) Torr., R. canescens (Michx.) Sweet, and R. viscosum (L.) Torr. var. montanum Rehd. were acclimated in the field and dehardened in the laboratory at controlled warm temperatures for various durations. Dehardened buds were rehardened for 24 hours at 2 to 4 °C, 0 °C, or –2 °C. Bud hardiness (LT50) was determined from visual estimates of freeze injury during a controlled freeze–thaw regime. Midwinter bud hardiness in the current study was ≈4 to 8 °C greater than in early winter. R. canadense and R. viscosum var. montanum dehardened to a larger extent in late winter than in the early-winter study, whereas R. arborescens and R. canescens did not. Rehardening capacities were larger in early than in late winter. Whereas rehardening occurred throughout the first 8 days of dehardening (DOD) in the previous early-winter study, in the current study it was observed only after 10 DOD (R. viscosum var. montanum) or 15 DOD (R. arborescens). There was no difference among the rehardening capacities at the three rehardening temperatures for any genotype. Water content decreased throughout dehardening in all four genotypes examined. R. canadense had the lowest chilling requirement (CR) [450 chilling units (CU)], followed by R. atlanticum (Ashe) Rehd., R. austrinum (Small) Rehd., R. canescens, and R. calendulaceum (Michx.) Torr. with intermediate CR (820, 830, 830, and 1000 CU, respectively). The CR of R. arborescens, R. prinophyllum (Small) Millais, R. prunifolium (Small) Millais, R. viscosum var. montanum, and R. viscosum var. serrulatum (Small) Millais exceeded 1180 CU. Results of this study indicate that the dehardening kinetics (magnitude and rate) and the rehardening capacity of azalea buds are influenced by the progression of winter and the depth of endodormancy.
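
The abstract describes two quantities that are straightforward to compute: an LT50 interpolated from injury scores taken across a series of freeze-test temperatures, and chilling units accumulated toward a genotype's chilling requirement. The sketch below illustrates both under stated assumptions; the logistic injury model, the 0 to 7.2 °C chilling window, and all numbers are illustrative, not the authors' protocol or data.

```python
# Hedged sketch: LT50 from freeze-test injury scores and a simple chilling-unit tally.
import numpy as np
from scipy.optimize import curve_fit

def injury_logistic(temp_c, lt50, slope):
    """Percent injury modeled as a logistic function of freeze-test temperature."""
    return 100.0 / (1.0 + np.exp(slope * (temp_c - lt50)))

# Hypothetical freeze-thaw series: visual injury (%) at each test temperature (°C).
test_temps = np.array([-5.0, -10.0, -15.0, -20.0, -25.0, -30.0])
injury_pct = np.array([5.0, 12.0, 30.0, 55.0, 85.0, 97.0])

(lt50, slope), _ = curve_fit(injury_logistic, test_temps, injury_pct, p0=[-18.0, 0.3])
print(f"Estimated LT50: {lt50:.1f} °C")

def chilling_units(hourly_temps_c, low=0.0, high=7.2):
    """One chilling unit (CU) per hour spent within the assumed chilling window."""
    return sum(1 for t in hourly_temps_c if low <= t <= high)

# Hypothetical hourly temperature record (60 days); CU accumulate toward a genotype's
# chilling requirement (e.g., the 450 CU reported for R. canadense).
hourly = np.random.default_rng(0).uniform(-5.0, 12.0, size=24 * 60)
print("Accumulated CU:", chilling_units(hourly))
```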


2012 ◽  
Vol 137 (1) ◽  
pp. 31-37 ◽  
Author(s):  
Mark K. Ehlenfeldt ◽  
Lisa J. Rowland ◽  
Elizabeth L. Ogden ◽  
Bryan T. Vinyard

Cold injury to plants can occur by early fall freezes before cold acclimation, by severe midwinter freezes that exceed the limits of the plant's tolerance, or by hard freezes in late winter or early spring after partial or complete deacclimation. Ideally, blueberry (Vaccinium L.) cultivars for temperate regions should acclimate to cold quickly in the fall, have high midwinter hardiness, deacclimate late and/or slowly during spring or during unseasonably warm spells in winter, and do all of this without adversely delaying the time of fruiting. Until recently, only limited evaluations have been done on the acclimation and deacclimation process in blueberry, although it is an integral part of flower bud survival and, thus, is directly related to potential yield. In this study, we have measured the timing and rate of acclimation and deacclimation in seven blueberry genotypes with different amounts of diverse species germplasm in their backgrounds. The primary differences observed among the seven genotypes were in maximum hardiness levels, the dates at which those maxima were reached, and the dates at which maximum acclimation was no longer sustained and deacclimation started. Highbush cultivars Bluecrop and Legacy (V. corymbosum L.), the rabbiteye cultivar Tifblue [V. ashei Reade (= V. virgatum Aiton)], and two rabbiteye hybrid derivatives (US 1043 and US 1056) all reached maximum or near-maximum cold hardiness by late December, with temperatures causing 50% lethality (LT50) ranging from –22 to –27 °C. The half-high cultivar Northsky and ‘Little Giant’, a hybrid of V. constablaei Gray × V. ashei, both achieved cold acclimation of –28 °C or below (the lowest value we could measure) by the end of November. After reaching their maximum hardiness in late December, ‘Legacy’, ‘Tifblue’, and US 1043 began a sustained and relatively linear deacclimation, whereas US 1056, ‘Bluecrop’, ‘Northsky’, and ‘Little Giant’ sustained their acclimation for longer intervals. ‘Bluecrop’ and US 1056 did not begin to deacclimate until early March, and ‘Little Giant’ and ‘Northsky’ had no LT50 values higher (warmer) than –25 °C until late March. As concerns about climate change increase, knowledge of the ability of breeding germplasm to tolerate greater temperature extremes and fluctuations will prove increasingly valuable.
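
The phrase "sustained and relatively linear deacclimation" suggests a simple way to summarize deacclimation rate: regress LT50 on days elapsed since peak hardiness and report the slope. The sketch below does this with purely hypothetical values; it is not the published LT50 series or the authors' analysis.

```python
# Hedged sketch: deacclimation rate as the slope of LT50 versus days since peak hardiness.
import numpy as np

days_since_peak = np.array([0.0, 14.0, 28.0, 42.0, 56.0, 70.0])
lt50_c = np.array([-26.5, -24.0, -21.8, -19.1, -16.4, -13.9])  # hypothetical, roughly linear rise

slope, intercept = np.polyfit(days_since_peak, lt50_c, 1)
print(f"Deacclimation rate: {slope:+.2f} °C per day (positive = losing hardiness)")
```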


HortScience ◽  
1998 ◽  
Vol 33 (3) ◽  
pp. 512e-512
Author(s):  
A.M. Shirazi

Six Japanese maple (Acer palmatum) cultivars, `Water Fall', `Burgundy Lace', `Crimson Queen', `Oshio-Beni', `SangoKaKu', and `Bloodgood', from Monrovia Nursery were planted in a randomized block design on 4 June 1997 at The Morton Arboretum. Leaf heat tolerance was evaluated by measuring ion leakage of leaf tissue at 25–60 °C in July, Aug., and Sept. 1997. The LT50 (the temperature at which 50% of the tissues were injured) of all cultivars was higher in July (≈53 °C) and lower in September (≈47 °C). Water content of the leaf tissues was higher in July compared with August and September and was not related to the heat tolerance of most cultivars. Stem cold hardiness was assessed by artificial freezing tests in Oct., Dec., and Feb. 1997/98. The lowest survival temperatures (LST) for the cultivars, from most to least hardy, in October and December were: `Burgundy Lace' (–15, –27 °C), `Bloodgood' (–18, –24 °C), `Oshio-Beni' (–15, –24 °C), `Crimson Queen' (–15, –18 °C), `Water Fall' (–9, –18 °C), and `SangoKaKu' (–9, –12 °C), respectively. Growth, dormancy development, spring budbreak, and performance of these cultivars will be compared.
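
The LT50 here comes from an ion-leakage assay: leakage is typically expressed relative to the conductivity of fully killed tissue, rescaled to a 0–100% injury index, and the temperature at 50% injury is interpolated. The sketch below shows one common way to do that; the conductivity readings and the rescaling convention are illustrative assumptions, not the study's protocol.

```python
# Hedged sketch: heat-stress LT50 interpolated from an ion-leakage assay.
import numpy as np

temps_c = np.array([25.0, 35.0, 40.0, 45.0, 50.0, 55.0, 60.0])
cond_initial = np.array([12.0, 15.0, 22.0, 38.0, 61.0, 80.0, 88.0])  # µS/cm after heat treatment (hypothetical)
cond_killed = np.array([95.0, 96.0, 94.0, 97.0, 95.0, 96.0, 95.0])   # µS/cm after autoclaving (total leakage)

rel_leakage = 100.0 * cond_initial / cond_killed  # percent of total electrolytes released
# Rescale so the least-injured sample reads 0% injury and the most-injured reads 100%.
injury = 100.0 * (rel_leakage - rel_leakage.min()) / (rel_leakage.max() - rel_leakage.min())

lt50 = np.interp(50.0, injury, temps_c)  # injury rises monotonically with temperature here
print(f"Heat LT50 ≈ {lt50:.1f} °C")
```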


2006 ◽  
Vol 131 (2) ◽  
pp. 209-213 ◽  
Author(s):  
Pauliina Palonen ◽  
Leena Lindén

`Maurin Makea', `Muskoka', `Ottawa', and `Preussen' red raspberry (Rubus idaeus L.) canes were collected from the field and subjected to different hot water treatments (20, 35, 40, 45, and 50 °C) to determine if endodormancy could be removed by a near-lethal stress. Estimation of the number of days to 50% budbreak (DD50) was found useful for describing the state of bud dormancy in the samples. Bud dormancy was broken in `Ottawa' by immersing the canes in 45 °C water for 2 hours, in `Maurin Makea' by treating the canes in 40 °C water, and in `Preussen' by both the 40 and 45 °C treatments. The influence of this treatment on dormancy and cold hardiness at different times of the winter was further examined using `Ottawa' raspberry. The treatment removed bud dormancy most effectively in October, when the samples were in deepest dormancy. A slight effect was observed in November, but no effect in January. During ecodormancy in February, the treatment delayed budbreak. Hot water treatment reduced the cold hardiness of `Ottawa' canes by 8 to 15 °C, and that of buds by 9 to 13 °C, during both endo- and ecodormancy. Based on the capacity of buds and canes to reacclimate, recovery from the stress treatment was possible at temperatures ≥4 °C. Loss of cold hardiness was caused by the high treatment temperature itself and was not related to the breaking of dormancy in the samples. This finding suggests that dormancy and cold hardiness are physiologically unconnected in raspberry.
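
DD50, the number of days to 50% budbreak under forcing conditions, can be estimated by tracking cumulative budbreak over time and interpolating the day at which half of the buds have broken. The sketch below illustrates that calculation with invented observation days and bud counts; it is not the authors' exact procedure.

```python
# Hedged sketch: DD50 (days to 50% budbreak) by interpolating cumulative budbreak.
import numpy as np

obs_day = np.array([7.0, 14.0, 21.0, 28.0, 35.0, 42.0])     # days in forcing conditions (hypothetical)
buds_broken = np.array([2.0, 8.0, 19.0, 31.0, 38.0, 40.0])  # cumulative buds broken per observation
buds_total = 40.0

pct_budbreak = 100.0 * buds_broken / buds_total
dd50 = np.interp(50.0, pct_budbreak, obs_day)  # cumulative budbreak is non-decreasing
print(f"DD50 ≈ {dd50:.1f} days to 50% budbreak")
```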


2017 ◽  
Vol 47 (8) ◽  
pp. 1116-1122 ◽  
Author(s):  
Rongzhou Man ◽  
Pengxin Lu ◽  
Qing-Lai Dang

Conifer winter damage results primarily from loss of cold hardiness during unseasonably warm days in late winter and early spring, and such damage may increase in frequency and severity under a warming climate. In this study, the dehardening dynamics of lodgepole pine (Pinus contorta Dougl. ex Loud.), jack pine (Pinus banksiana Lamb.), white spruce (Picea glauca (Moench) Voss), and black spruce (Picea mariana (Mill.) B.S.P.) were examined in relation to thermal accumulation during artificial dehardening in winter (December) and spring (March) using relative electrolyte leakage and visual assessment of pine needles and spruce shoots. Results indicated that all four species dehardened at a similar rate and to a similar extent, despite considerably different thermal accumulation requirements. Spring dehardening was comparatively faster, with black spruce slightly hardier than the other conifers at the late stage of spring dehardening. The difference, however, was relatively small and did not afford black spruce significant protection during seedling freezing tests prior to budbreak in late March and early May. The dehardening curves and models developed in this study may serve as a tool to predict cold hardiness from temperature and to understand the potential risks of conifer cold injury during warming–freezing events prior to budbreak.
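
The study relates hardiness loss to thermal accumulation, so a practical use of such dehardening curves is to convert a run of warm days into accumulated degree-days and read off a predicted LT50. The sketch below assumes a base temperature, a logistic curve form, and parameter values purely for illustration; it does not reproduce the fitted models from the paper.

```python
# Hedged sketch: degree-day accumulation and a generic logistic dehardening curve.
import numpy as np

def degree_days(daily_mean_c, base_c=5.0):
    """Thermal accumulation above an assumed base temperature."""
    return float(np.sum(np.clip(np.asarray(daily_mean_c) - base_c, 0.0, None)))

def dehardening_curve(dd, lt50_hardy=-45.0, lt50_dehardened=-8.0, dd50=60.0, k=0.08):
    """LT50 rises from the fully hardy level toward the dehardened level as degree-days accumulate."""
    return lt50_dehardened + (lt50_hardy - lt50_dehardened) / (1.0 + np.exp(k * (dd - dd50)))

daily_means = [6.0, 8.5, 10.0, 12.0, 9.5, 11.0, 13.5]  # hypothetical late-winter warm spell (°C)
dd = degree_days(daily_means)
print(f"Accumulated degree-days: {dd:.1f}")
print(f"Predicted LT50 after the warm spell: {dehardening_curve(dd):.1f} °C")
```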


Agronomy ◽  
2020 ◽  
Vol 10 (8) ◽  
pp. 1067
Author(s):  
Lisa J. Rowland ◽  
Elizabeth L. Ogden ◽  
Bryan T. Vinyard

A diploid blueberry mapping population, used previously to map quantitative trait loci (QTL) for chilling requirement and cold hardiness, was evaluated for several plant development and fruit quality traits. Specifically, the population was phenotyped in a greenhouse for the timing of various stages of flower bud, leaf bud, and fruit development and for fruit quality traits including weight, diameter, color, scar, firmness, flavor, and soluble solids. Phenotypic data were analyzed statistically by analysis of variance, by correlation tests to examine associations among traits, and by estimating heritability. Results indicated that the traits were segregating and most were distributed normally in the population. Many of the development traits were correlated, and the timing of shoot expansion, early bloom, and full bloom was also correlated with the previously evaluated trait of chilling requirement. Some correlations were found among the fruit quality traits as well. For example, weight was highly correlated with diameter, and subjectively measured firmness was moderately correlated with one of the objectively measured firmness traits. In addition, most of the traits showed significant variation across genotypes and across years, and most had moderate to high heritability. Therefore, we conclude that the diploid population should be useful for identifying QTL for many of these traits.
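
Two of the analyses named here, trait correlations and heritability, can be sketched from a genotype-by-replicate trait table: a Pearson correlation between genotype means of two traits, and a broad-sense heritability from one-way ANOVA variance components. Everything below (trait names, simulated values, pooling of years into replicates) is an illustrative assumption, not the published analysis.

```python
# Hedged sketch: trait correlation and broad-sense heritability from simulated phenotypes.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical trait table: rows = genotypes, columns = replicate observations.
n_geno, n_rep = 30, 4
geno_effect = rng.normal(0.0, 0.6, size=(n_geno, 1))                       # genetic variation
fruit_weight = 1.5 + geno_effect + rng.normal(0.0, 0.3, size=(n_geno, n_rep))

# Pearson correlation between genotype means of two related traits.
fruit_diameter = 10.0 + 2.0 * fruit_weight.mean(axis=1) + rng.normal(0.0, 0.4, n_geno)
r, p = stats.pearsonr(fruit_weight.mean(axis=1), fruit_diameter)
print(f"weight-diameter correlation: r = {r:.2f} (p = {p:.3g})")

# One-way ANOVA variance components -> broad-sense heritability on a plot basis.
grand_mean = fruit_weight.mean()
ms_between = n_rep * np.sum((fruit_weight.mean(axis=1) - grand_mean) ** 2) / (n_geno - 1)
ms_within = np.sum((fruit_weight - fruit_weight.mean(axis=1, keepdims=True)) ** 2) / (n_geno * (n_rep - 1))
var_g = max((ms_between - ms_within) / n_rep, 0.0)
h2 = var_g / (var_g + ms_within)
print(f"Broad-sense heritability H^2 ≈ {h2:.2f}")
```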


1989 ◽  
Vol 69 (2) ◽  
pp. 355-366 ◽  
Author(s):  
A. L. BRULE-BABEL ◽  
D. B. FOWLER

Field survival is the most commonly employed method of evaluating the winter hardiness of cereals. However, the inherent difficulties with field trials have stimulated a continued interest in the use of controlled environments and prediction tests for the evaluation of cold hardiness. In the present studies, cold hardiness expression of wheat (Triticum aestivum L.) cultivars acclimated in controlled environments was found to be similar to that reported for field conditions in Saskatchewan, Canada. LT50 and tissue water content measurements on wheat and rye (Secale cereale L.) cultivars acclimated in controlled environments were highly correlated with cultivar field survival ability. Investigation of the relationship between field survival and tissue water content during cold acclimation in controlled environments indicated that, to be effective as a screening method for cold hardiness, measurements of tissue water content should be made on fully acclimated plants for which the acclimation conditions have been rigorously controlled. Level of acclimation was not as critical for cold hardiness screening when LT50 measurements were utilized; however, maximum resolution also required fully acclimated plants. Although a strong relationship (r = −0.80 to −0.89) was found to exist with field survival potential, an inability to detect small, but important, differences without excessive replication would generally restrict the use of LT50 and tissue water content to situations where large homogeneous plant populations were available and only coarse screens for cold hardiness were required.
Key words: Cold acclimation, winter wheat, winter rye, cold hardiness, water content, replication
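
The reported r values summarize how well controlled-environment measurements track field survival across cultivars. A sketch of that comparison is below, using invented cultivar values; both Pearson's r (as reported in the abstract) and a rank correlation are shown, since the abstract frames these measurements mainly as coarse screens.

```python
# Hedged sketch: how well controlled-environment LT50 tracks field survival across cultivars.
import numpy as np
from scipy import stats

lt50_c = np.array([-13.0, -15.5, -17.0, -19.5, -21.0, -23.5])        # controlled-environment LT50 (hypothetical)
field_survival_pct = np.array([22.0, 35.0, 48.0, 60.0, 71.0, 86.0])  # field survival index (hypothetical)

r, _ = stats.pearsonr(lt50_c, field_survival_pct)
rho, _ = stats.spearmanr(lt50_c, field_survival_pct)
print(f"Pearson r (LT50 vs. field survival): {r:.2f}")  # hardier (more negative) LT50 -> higher survival
print(f"Spearman rank correlation: {rho:.2f}")
```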


1979 ◽  
Vol 57 (14) ◽  
pp. 1511-1517 ◽  
Author(s):  
D. W. A. Roberts

Experiments in which winter wheat plants were exposed to two different controlled hardening-temperature regimes (constant 3 °C, and 5.5 °C (day): 3.5 °C (night)) for long periods (up to 15 weeks) indicate that cold hardiness changes with time. The cold hardiness in plants grown from seed at 3 °C drops rapidly immediately after moistening and reaches a minimum 2–3 weeks later. Hardiness then begins to increase and reaches a maximum that lasts approximately from the 7th to the 11th week of growth, after which it slowly declines. The patterns of change in cold hardiness during growth at 3 °C, and 5.5 °C:3.5 °C were almost synchronous if hardiness was plotted against duration of hardening, but were not synchronous if hardiness was plotted against stage of development as measured by the number of leaves produced. A somewhat similar result was obtained if plants grown for 3 weeks at 21 °C before hardening were compared with plants grown from dry seeds under the same hardening conditions. These experiments show that duration of hardening is more important in determining the level of cold resistance and the ability of wheat to retain its cold resistance than is stage of development, as measured by the number of leaves produced at the time cold resistance is measured. When plants seeded outdoors in mid-September were transferred at various dates (0–30 weeks after seeding) during the fall or winter to standardized hardening conditions in a growth cabinet for 0–15 weeks before freezing, their cold resistance changed in a way that suggests that plants in the field undergo the same pattern of changes in cold resistance as plants reared continuously in a growth chamber. This result suggests that the long exposure to hardening temperatures is one of the reasons why wheat in the field has less cold resistance in late winter than in autumn. Loss of carbohydrate reserves during winter may be an additional reason for this phenomenon. Under both growth cabinet and field conditions, increasing cold hardiness coincided with vernalization. Maximum cold hardiness was retained for several weeks after the completion of vernalization. These results suggest that the development of the maximum level of cold resistance may be related to the vernalization process.

