deficiency symptoms
Recently Published Documents


TOTAL DOCUMENTS: 307 (FIVE YEARS: 31)
H-INDEX: 25 (FIVE YEARS: 2)

2021
Author(s): Ryan Merry, Mary Jane Espina, Aaron Lorenz, Robert Stupar

Abstract
Background: Soybean iron deficiency chlorosis (IDC) is an important nutrient stress frequently found in soils with high pH and/or high calcium carbonate content. To advance the understanding of IDC resistance in soybean, a rapid (21-day) controlled-environment assay was developed to investigate the effects of nodulation, pH, and calcium carbonate levels on soybean iron deficiency traits. This system was tested on four genotypes known to exhibit differences in iron efficiency, including two standard IDC check cultivars and a pair of near-isogenic lines exhibiting variation at an IDC resistance quantitative trait locus. Visual score, chlorophyll content, plant height, root dry mass, and shoot dry mass were measured to quantify iron stress.
Results: Calcium carbonate levels and nodulation had the greatest effects on IDC severity. Increasing carbonate levels worsened IDC symptoms, while nodulation reduced symptoms in all genotypes. Higher pH levels increased iron deficiency symptoms in the check genotypes ‘Corsoy 79’ and ‘Dawson’, but did not induce iron deficiency symptoms in the near-isogenic lines. A significant interaction was observed between genotype, nodulation, and calcium carbonate level, indicating that a specific treatment level could distinguish IDC symptoms between genotypes differing in IDC resistance.
Conclusions: IDC symptoms were successfully induced in both the Check Genotypes Experiment and the NIL Experiment, indicating that a liquid CaCO3 source and this assay can be used to induce IDC in controlled environments. However, our results suggest that the treatment levels that best differentiate genotypes for IDC resistance may need to be determined for each experiment, because of the unique ways in which different genotypes display symptoms and respond to iron deficiency conditions.
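
A minimal sketch of how the genotype × nodulation × calcium carbonate interaction reported above might be tested with a three-way ANOVA in statsmodels; the data file and column names are hypothetical placeholders, not the authors' analysis.

```python
# Sketch: three-way ANOVA for a genotype x nodulation x carbonate interaction
# on IDC visual score. File and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("idc_assay.csv")  # hypothetical columns: visual_score, genotype, nodulation, caco3

model = ols(
    "visual_score ~ C(genotype) * C(nodulation) * C(caco3)",
    data=df,
).fit()
anova_table = sm.stats.anova_lm(model, typ=2)  # type II sums of squares, a common choice
print(anova_table)
```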


2021
Vol 23 (06), pp. 36-46
Author(s): Vrunda Kusanur, Veena S Chakravarthi

Soil temperature and humidity directly influence plant growth and the availability of plant nutrients. In this work, we carried out experiments to identify the relationship between climatic parameters and plant nutrients. When the relative humidity was very high, deficiency symptoms appeared on plant leaves and fruits, and recognizing and managing these nutrient deficiencies manually is difficult; little research has been done in this area. The main objective of this research was to propose a machine learning model to manage nutrient deficiencies in plants. The proposed research had two main phases. In the first phase, humidity, temperature, and soil moisture in the greenhouse environment were collected using a wireless sensor network (WSN), and the influence of these parameters on plant growth was studied. During experimentation, it was observed that the transpiration rate decreased significantly and the macronutrient content of the plant leaves decreased when the humidity was 95%. In the second phase, a machine learning model was developed to identify and classify nutrient deficiency symptoms in a tomato plant. A total of 880 images were collected from Bingo images to form a dataset; 80% of the dataset (704 images) was used to train the machine learning model and 20% (176 images) was used to test its performance. K-means clustering was selected for key-point detection and an SVM for classification and prediction of nutrient stress in the plant. The SVM with a linear kernel performed better than the SVM with a polynomial kernel, achieving an accuracy of 89.77%.
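
A minimal sketch of the kind of pipeline described (K-means-based feature extraction followed by an SVM with a linear kernel and an 80/20 split), using scikit-learn; the color-cluster features and placeholder images are illustrative assumptions, not the authors' implementation.

```python
# Sketch of an SVM-based nutrient-deficiency classifier with an 80/20 split,
# roughly mirroring the pipeline in the abstract. Feature extraction via
# K-means over pixel colors is a simplified placeholder.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

def kmeans_color_features(image, n_clusters=3):
    """Cluster pixel colors and return the sorted cluster centers as a feature vector."""
    pixels = image.reshape(-1, 3).astype(float)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(pixels)
    centers = km.cluster_centers_[np.argsort(km.cluster_centers_[:, 1])]  # sort by green channel
    return centers.ravel()

# Placeholder data: `images` is a list of HxWx3 arrays, `labels` the deficiency classes.
images = [np.random.randint(0, 256, (64, 64, 3)) for _ in range(40)]
labels = np.random.randint(0, 2, 40)

X = np.array([kmeans_color_features(img) for img in images])
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.2, random_state=0)

clf = SVC(kernel="linear").fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```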


2021
Vol 5 (Supplement_1), pp. A1041-A1041
Author(s): David T Zava, David W Kimball, Rebecca L Glaser

Abstract
Introduction: High-dose testosterone (T) has been used for treating hormone-sensitive breast cancers for many years. However, a drawback to T therapy is its propensity to convert to estradiol (E2) via aromatase, which can override the growth-inhibitory effects of T and stimulate estrogen-sensitive tumors. Aromatase is higher in some women than others, particularly those with active tumors, truncal obesity, and inflammatory conditions contributing to and caused by cancer. Oral aromatase inhibitors (AI) have been used to prevent conversion of T to E2. While this effectively reduces E2 burden, systemic oral AI require higher dosing, which often leads to severe side effects. As an alternative to oral AI therapy, researchers have found that T combined with a much lower dose of AI in a solitary pellet implant placed in the subcutaneous (SC) tissue, or alternatively into the breast adjacent to a primary tumor, is effective in reducing tumor burden and maintaining a low systemic level of E2, while reducing the adverse side effects of very low E2.
Study Design: In this case study we report the use of an LC-MS/MS method to monitor salivary levels of E2, estrone (E1), T, and the AIs letrozole (LET) and anastrozole (ANZ) following T + AI therapy in a breast cancer patient (intolerant of oral LET) with active (measurable) tumor in the breast and metastatic disease. Steroids and AIs were measured in saliva at baseline, 1 week, and 4 weeks after insertion of subcutaneous pellets containing ‘60 mg T + 4 mg ANZ’ and ‘60 mg T + 6 mg LET’.
Results: Salivary LC-MS/MS values (pg/mL; postmenopausal reference range, followed by baseline, week 1, and week 4 post-treatment values for each steroid and AI) were: E2: range 0.3-0.9; 0.4, <0.3, <0.3. E1: range 0.9-3.1; 1.0, <0.4, <0.4. T: range 7-22; 6, 96, 48. ANZ: range <4; <4, 2063, 24. LET: range <4; <4, 744, 175. Self-reported (pre/post therapy) estrogen deficiency symptoms such as hot flashes, night sweats, vaginal dryness, joint pain, and sleep disturbances were significantly improved after T + AI therapy. In addition, the intramammary tumor decreased in size by >95% by month 5.
Summary: These results show that low-dose SC AI therapy (4-6 mg per 2-3 months ≈ 0.1 mg/day) with T (120 mg per 2-3 months ≈ 1-2 mg/day) increases T to supra-physiological levels, prevents T metabolism to the estrogens E2 and E1, reduces estrogen deficiency symptoms, and has beneficial effects on tumor growth inhibition. Simultaneous testing of the sex steroids E2, E1, and T and the aromatase inhibitors LET and ANZ by LC-MS/MS provides a convenient means to monitor the bioavailable levels of these analytes and adjust them as necessary to optimize therapeutic efficacy and reduce adverse side effects.
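
A quick check of where the daily-equivalent doses quoted in the summary come from, under the assumption that the "2-3 months" pellet release period spans roughly 60-90 days.

```python
# Rough daily-equivalent doses for the pellet amounts quoted in the summary,
# assuming the 2-3 month release period spans roughly 60-90 days (an assumption).
for label, total_mg in [("AI (6 mg pellet)", 6.0), ("T (120 mg total)", 120.0)]:
    lo, hi = total_mg / 90, total_mg / 60
    print(f"{label}: {lo:.2f}-{hi:.2f} mg/day")
# AI (6 mg pellet): 0.07-0.10 mg/day
# T (120 mg total): 1.33-2.00 mg/day
```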


2021
Vol 15 (1), pp. e0008895
Author(s): Marcin P. Joachimiak

A wide variety of symptoms is associated with Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) infection, and these symptoms can overlap with other conditions and diseases. Knowing the distribution of symptoms across diseases and individuals can support clinical actions on timelines shorter than those for drug and vaccine development. Here, we focus on zinc deficiency symptoms, their overlap with symptoms of other conditions, zinc effects on immune health, and mechanistically defined zinc deficiency risk groups. There are well-studied beneficial effects of zinc on the immune system, including decreased susceptibility to and improved clinical outcomes for infectious pathogens, including multiple viruses. Zinc is also an anti-inflammatory and anti-oxidative stress agent, relevant to some severe Coronavirus Disease 2019 (COVID-19) symptoms. Unfortunately, zinc deficiency is common worldwide and not exclusive to the developing world. Lifestyle choices and preexisting conditions alone can result in zinc deficiency, and we compile zinc deficiency risk groups based on a review of the literature. It is also important to distinguish chronic zinc deficiency from deficiency acquired upon viral infection and immune response, as the two call for different supplementation strategies. Zinc is being considered as a prophylactic or adjunct therapy for COVID-19, with 12 clinical trials underway, highlighting the relevance of this trace element for global pandemics. Using the example of zinc, we show that there is a critical need for a deeper understanding of essential trace elements in human health, the resulting deficiency symptoms, and their overlap with other conditions. This knowledge will directly support human immune health by decreasing susceptibility, shortening illness duration, and preventing progression to severe cases in the current and future pandemics.


2020
Author(s): Thomas Christian Bang, Søren Husted, Kristian Holst Laursen, Daniel Pergament Persson, Jan Kofod Schjoerring

Agronomy
2020
Vol 10 (12), pp. 1946
Author(s): Zachary P. Stewart, Ellen T. Paparozzi, Charles S. Wortmann, Prakash Kumar Jha, Charles A. Shapiro

Nebraska soils are generally micronutrient sufficient. However, critical levels for current yields have not been validated. From 2013 to 2015, 26 on-farm paired-comparison strip trials were conducted across Nebraska to test the effect of foliar-applied micronutrients on maize (Zea mays L.) yield and foliar nutrient concentrations. Treatments were applied from V6 to V14 at sites with yields of 10.9 to 16.4 Mg ha−1. Soils ranged from silty clays to fine sands. Soil micronutrient availability and tissue concentrations were all above the critical levels for deficiency. Significant grain yield increases were few. In leaf tissue produced after the foliar applications, micronutrient concentrations increased by 4 to 9 mg Zn kg−1 at 5 of 17 sites with application of 87 to 119 g Zn ha−1, by 12 to 16 mg Mn kg−1 at 2 of 17 sites with application of 87 to 89 g Mn ha−1, and by an average of 8.1 mg Fe kg−1 across 10 sites showing signs of Fe deficiency with application of 123 g foliar Fe ha−1. Foliar B concentration was not affected by B application. Increases in nutrient concentrations were not related to grain yield responses except for Mn (r = 0.54). The mean significant grain yield response to 123 g foliar Fe ha−1 was 0.4 Mg ha−1 for the 10 sites with Fe deficiency symptoms. On average, maize yield response to foliar Fe application can be profitable if Fe deficiency symptoms are observed. Response to other foliar micronutrient applications is not likely to be profitable without solid evidence of a nutrient deficiency.
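
For illustration, a minimal sketch of how a paired strip-trial comparison and the concentration-yield correlation might be computed with SciPy; the yield and concentration arrays are placeholders, not the trial data.

```python
# Sketch of a paired strip-trial analysis: a paired t-test on treated vs.
# untreated strip yields, plus a Pearson correlation between tissue-concentration
# increases and yield responses. All values below are placeholders.
import numpy as np
from scipy import stats

treated = np.array([12.1, 13.0, 11.5, 14.2, 12.8])    # Mg ha-1, foliar micronutrient strips
untreated = np.array([11.9, 12.7, 11.6, 13.8, 12.5])  # Mg ha-1, paired control strips

t_stat, p_value = stats.ttest_rel(treated, untreated)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

conc_increase = np.array([4.0, 9.0, 5.5, 7.2, 6.1])   # mg kg-1 increase in leaf tissue
yield_response = treated - untreated                  # Mg ha-1
r, p = stats.pearsonr(conc_increase, yield_response)
print(f"concentration-yield correlation: r = {r:.2f}, p = {p:.3f}")
```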


HortScience
2020
Vol 55 (11), pp. 1722-1729
Author(s): Christopher J. Currey, Vincent C. Metz, Nicholas J. Flax, Alex G. Litvin, Brian E. Whipker

The objective of this research was to quantify the effects of phosphorus (P) concentration on the growth, development, and tissue mineral nutrient concentrations of four popular culinary herbs commonly grown in containers. Seedlings of sweet basil (Ocimum basilicum ‘Italian Large Leaf’), dill (Anethum graveolens ‘Fernleaf’), parsley (Petroselinum crispum ‘Giant of Italy’), and sage (Salvia officinalis) were individually transplanted into 11.4-cm-diameter containers filled with a soilless substrate comprising Canadian sphagnum peat moss and coarse perlite. Upon transplanting and throughout the experiment, seedlings were irrigated with solutions containing 0, 5, 10, 20, or 40 mg·L−1 P; all other macro- and micronutrient concentrations were the same across P treatments. Plants were grown for 4 weeks in a greenhouse, after which data were collected. Relationships between height or width and P concentration were nonlinear for all four species; height and width increased as P increased above 0 mg·L−1 up to species-specific maxima, beyond which no further increase occurred. The same trend was observed for the branch length of sweet basil and sage, and for the internode length, leaf area, and shoot dry mass of all four species. Although visible P deficiency symptoms were observed on plants provided with 0 mg·L−1 P, there were no signs of P deficiency on plants provided with ≥5 mg·L−1 P, even though tissue P concentrations were below the recommended sufficiency ranges. Based on this research, containerized sweet basil, dill, parsley, and sage can be provided with 5 to 10 mg·L−1 P during production to limit growth and produce plants that are proportional to their containers without visible nutrient deficiency symptoms.
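
A minimal sketch of fitting the kind of plateau response described above (growth increasing with P up to a species-specific maximum, then leveling off) with a linear-plateau model in SciPy; the P levels match the treatments in the abstract, but the height values are invented for illustration.

```python
# Linear-plateau fit: growth increases with P concentration up to a breakpoint
# and then levels off. Height values are illustrative only; the P levels are
# the treatment concentrations from the abstract.
import numpy as np
from scipy.optimize import curve_fit

def linear_plateau(x, intercept, slope, breakpoint):
    """Linear increase up to `breakpoint`, constant plateau beyond it."""
    return np.where(x < breakpoint,
                    intercept + slope * x,
                    intercept + slope * breakpoint)

p_levels = np.array([0, 5, 10, 20, 40], dtype=float)   # mg/L P treatments
height = np.array([8.0, 18.5, 21.0, 21.5, 21.3])       # cm, illustrative values

params, _ = curve_fit(linear_plateau, p_levels, height, p0=[8.0, 2.0, 8.0])
intercept, slope, breakpoint = params
print(f"plateau reached near {breakpoint:.1f} mg/L P, "
      f"maximum height ~ {intercept + slope * breakpoint:.1f} cm")
```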


Sensors
2020
Vol 20 (20), pp. 5893
Author(s): Jinhui Yi, Lukas Krusenbaum, Paula Unger, Hubert Hüging, Sabine J. Seidel, ...

Early, non-invasive, on-site detection of nutrient deficiency is required to enable timely action against major crop losses caused by a lack of nutrients and, hence, to increase the potential yield throughout the growing season, while at the same time preventing excess fertilization with its detrimental environmental consequences. Current non-invasive methods for assessing the nutrient status of crops deal in most cases with nitrogen (N) deficiency only, and the optical sensors used to diagnose N deficiency, such as chlorophyll meters or canopy reflectance sensors, do not measure N itself but instead measure changes in leaf spectral properties that may or may not be caused by N deficiency. In this work, we study how well nutrient deficiency symptoms can be recognized in RGB images of sugar beets. To this end, we collected the Deep Nutrient Deficiency for Sugar Beet (DND-SB) dataset, which contains 5648 images of sugar beets growing on a long-term fertilizer experiment with nutrient deficiency plots comprising N, phosphorus (P), and potassium (K) deficiency, as well as the omission of liming (Ca), full fertilization, and no fertilization at all. We use the dataset to analyse the performance of five convolutional neural networks for recognizing nutrient deficiency symptoms and discuss their limitations.
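
As a rough sketch of how one CNN baseline might be fine-tuned on such a dataset, here is a torchvision example using a pretrained ResNet-18; the directory layout, class count, and hyperparameters are assumptions, not the DND-SB benchmark configuration.

```python
# Sketch of fine-tuning a pretrained CNN for nutrient-deficiency classification
# from RGB images. Directory layout, class count, and hyperparameters are
# illustrative assumptions, not the DND-SB benchmark setup.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_CLASSES = 6  # e.g. N, P, K, Ca omission, full fertilization, unfertilized

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("dnd_sb/train", transform=transform)  # hypothetical path
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)  # replace the classifier head

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```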

