Integrated Evaluation on Soil Nutrients in Spring Maize (Zea mays) Field Subjected to Limited Irrigation

2013
Vol 389
pp. 67-72
Author(s):
Heng Jia Zhang
Jun Hui Li

An experiment was conducted to explore an integrated evaluation of soil nutrients in a spring maize field subjected to limited irrigation (LI) in an oasis region. Soil organic matter (SOM), soil total and available nitrogen (STN and SAN), soil total and available phosphorus (STP and SAP), and soil available potassium (SAK) in the 0~40 cm layer at maize harvest were selected as evaluation factors, and the weighting coefficient of each soil nutrient and the integrated evaluation index (IEI) for soil nutrients were calculated using the membership function of fuzzy mathematics. At harvest, differences in SOM, STN, STP, SAP, and SAK within the 0~40 cm layer were not significant (p>0.05) among the treatments and the control (CK), but a significant difference (p<0.05) was found in SAN: the maximum SAN, maintained in MI5, was 187.3%, 96.8%, and 41.2% higher than in MI2 (the minimum), MI1, and CK, respectively. Compared with the minimum IEI, recorded in MI4, the IEI was 12.4% to 22.3% higher in all the other treatments and CK, with the maximum in the MI3 treatment. Therefore, after the one-year experiment, the optimal irrigation management among all the LI regimes was the MI3 treatment, owing to its maximum IEI.
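
For readers unfamiliar with the fuzzy-membership approach, a minimal sketch of how such an index can be computed is given below. It assumes ascending linear membership functions and weights proportional to each nutrient's coefficient of variation; the abstract does not specify the paper's exact scheme, and all values are invented placeholders. The same computation pattern underlies the CEISN in the spring wheat study further down.

```python
import numpy as np

# Rows: treatments (MI1..MI5 and the control CK); columns: the six
# evaluation factors (SOM, STN, SAN, STP, SAP, SAK). All values are
# invented placeholders, not data from the paper.
X = np.array([
    [14.2, 0.95, 58.1, 0.81, 12.3, 145.0],  # MI1
    [13.8, 0.92, 31.5, 0.79, 11.8, 139.0],  # MI2
    [15.1, 1.01, 77.4, 0.85, 13.0, 152.0],  # MI3
    [13.5, 0.90, 55.0, 0.78, 11.5, 137.0],  # MI4
    [14.8, 0.99, 90.5, 0.83, 12.7, 149.0],  # MI5
    [14.0, 0.94, 64.1, 0.80, 12.1, 143.0],  # CK
])

# Ascending linear membership function, applied per indicator:
# u = (x - min) / (max - min), mapping each nutrient onto [0, 1].
u = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# Weighting coefficients, here taken proportional to each indicator's
# coefficient of variation (an assumed scheme; the abstract does not
# state how the paper derives its weights).
cv = X.std(axis=0) / X.mean(axis=0)
w = cv / cv.sum()

# Integrated evaluation index: weighted sum of memberships per treatment.
iei = u @ w
for name, score in zip(["MI1", "MI2", "MI3", "MI4", "MI5", "CK"], iei):
    print(f"{name}: IEI = {score:.3f}")
```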

Agriculture
2021
Vol 11 (12)
pp. 1219
Author(s):
Xiaodan Wang
Hua Ma
Chunyun Guan
Mei Guan

Rapeseed used as green manure is a rapidly emerging fertilizer with wide application. However, there have been few studies on its decomposition and on its effects on soil nutrients and microorganisms after decay. In this study, 12 candidate rapeseed lines were decomposed in a randomized block field design, with two green-manure-specific varieties as controls. The nitrogen, phosphorus, and potassium contents of the plants, soil nutrients, and microbial changes after degradation were measured. There were substantial differences in the rates of decomposition and cumulative release of nutrients among the lines after 30 days of rolling. The contents of phosphorus and potassium in the soil were 1.23–2.03 and 3.93–6.32 times those before decomposition, respectively. In addition, the relative abundance of soil microorganisms at the phylum level differed significantly after the decomposition of the different rapeseed lines. Most of the top 20 bacterial groups, including Proteobacteria, Actinomycetes, Armatimonadetes, Rokubacteria, and Planctomycetes, correlated significantly with plant decomposition characteristics and soil nutrient content. A principal component analysis showed that soil microorganisms and nutrients are the leading factors for evaluating the decomposition characteristics of green manure rapeseed. Lines 5 (purple leaf mustard) and 8 (Xiafang self-seeding) were more effective than the two controls and can be used as excellent germplasm to promote the breeding of green manure rapeseed.
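
As a rough illustration of the principal component analysis mentioned above, the sketch below standardizes a variable matrix and inspects loadings; the matrix shape (12 lines plus 2 controls by 6 variables) and the random data are placeholders, not the study's measurements.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical data matrix: one row per rapeseed line (12 candidates
# plus 2 controls), one column per measured variable (e.g. decomposition
# rate, soil P, soil K, microbial abundances). Random placeholders only.
rng = np.random.default_rng(0)
X = rng.normal(size=(14, 6))

# Standardize, then project onto the first two principal components.
X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(X_std)

print("explained variance ratio:", pca.explained_variance_ratio_)
# Variables with large |loading| on PC1 are the "leading factors".
print("PC1 loadings:", pca.components_[0])
```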


2013
Vol 773
pp. 837-843
Author(s):
Heng Jia Zhang
Jun Hui Li

A field experiment was carried out in the 2007 and 2008 growing seasons to explore a comprehensive evaluation of soil nutrients in a spring wheat field under regulated deficit irrigation (RDI) in an arid environment. Soil organic matter and soil total and available nitrogen, phosphorus, and potassium in the 0~40 cm layer after two years of RDI management were selected as evaluation factors, and the weighting coefficient of each soil nutrient component and the comprehensive evaluation index for soil nutrients (CEISN) were determined using the membership function of fuzzy mathematics. The results showed that the CEISN was higher under all the RDI managements than in the no-water-deficit control (CK), except in RDI2, where it was 3.5% lower than CK. The CEISN was 6.3%, 6.1%, 6.1%, 5.3%, 4.9%, and 3.8% higher in RDI7, RDI4, RDI5, RDI1, RDI3, and RDI6, respectively, than in CK. Consequently, after the two-year experiment, the optimal water management and sustainable soil nutrient use pattern among all the water deficit regulation regimes was RDI7, owing to its maximum CEISN.


Agro-Science
2021
Vol 21 (1)
pp. 114-116
Author(s):
S. Idris
A. Rilwan
S.A. Abubakar
M. Adamu
Y. Sadiq
...  

Soil testing is key to soil fertility management, as it serves as a fertilizer application guide for farmers, scientists, and consultants. It gives information on soil nutrient status and nutrient-supplying capacity. Laboratory (LB) procedures have been the most reliable approach for soil nutrient analyses; however, they are costly and cannot be carried out in situ. Thus, in-situ testing kits have emerged and become prominent. Nevertheless, the applicability of a soil testing kit must be validated against laboratory tests. This work aimed to examine the reliability and suitability of the Soil Testing Kit® Transchem (SK) in determining selected soil nutrients in the Sahel Savannah, Nigeria. Twenty-five replicate soil samples were collected from 12°47’86’’-12°20’96’’N and 4°38’37’’-4°188’02’’E, Kebbi State, Nigeria, and used to test soil pH, N, P, K, and soil organic carbon (SOC) by SK and LB. The SK uses a colour chart and comparator to rate nutrient status qualitatively as low, medium, or high (and up to very high for P). The LB results were transformed to qualitative data by matching the values against soil rating standards into low, medium, and high. For statistical analysis, each category was assigned a numeric weight: low = 1, medium = 2, and high = 3. The two methods were compared using t-test, regression, and descriptive analyses. Results showed no significant difference between the two methods for soil N, P, and K contents. However, the SK poorly estimated soil pH and SOC. Correlation and regression coefficients (r = 0.915 and R² = 0.838, respectively) indicated the reliability of the SK. It is concluded that the SK can be reliably used for N, P, and K, but not for soil pH and SOC estimation, for soils in the Sahel savannah of Nigeria.
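
A minimal sketch of the comparison described above: qualitative ratings coded low = 1, medium = 2, high = 3, then a paired t-test and a simple regression. The rating vectors are hypothetical, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical paired ratings for one nutrient across 25 samples,
# already coded low = 1, medium = 2, high = 3 as in the abstract.
kit = np.array([2, 1, 3, 2, 2, 1, 3, 2, 2, 3, 1, 2, 2,
                3, 1, 2, 2, 2, 3, 1, 2, 3, 2, 1, 2])
lab = np.array([2, 1, 3, 2, 1, 1, 3, 2, 2, 3, 2, 2, 2,
                3, 1, 2, 1, 2, 3, 1, 2, 3, 2, 2, 2])

# Paired t-test: is the mean difference between methods significant?
t_stat, p_value = stats.ttest_rel(kit, lab)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")  # p > 0.05 -> no difference

# Simple linear regression of kit ratings on laboratory ratings.
res = stats.linregress(lab, kit)
print(f"r = {res.rvalue:.3f}, R^2 = {res.rvalue**2:.3f}")
```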


Plants
2021
Vol 10 (5)
pp. 881
Author(s):
Taimoor Hassan Farooq
Uttam Kumar
Jing Mo
Awais Shakoor
Jun Wang
...  

Intercropping is one of the most widely used agroforestry techniques; it reduces the harmful impacts of external inputs such as fertilizers, controls soil erosion, increases soil nutrient availability, and reduces weed growth. In this study, peanut (Arachis hypogaea L.) was intercropped with tea plants (Camellia oleifera) and compared with the mono-cropping of tea and peanut. Soil health and fertility were examined by analyzing the variability in soil enzymatic activity and soil nutrient availability at different soil depths (0–10 cm, 10–20 cm, 20–30 cm, and 30–40 cm). Results showed that peanut–tea intercropping considerably affected soil organic carbon (SOC), soil nutrient availability, and soil enzymatic responses at different soil depths. The activity of protease, sucrase, and acid phosphatase was higher under intercropping, while the activity of urease and catalase was higher in the peanut monoculture. Under intercropping, total phosphorus (TP) was 14.2%, 34.2%, 77.7%, and 61.9% higher; total potassium (TK) was 13.4%, 20%, 27.4%, and 20% higher; available phosphorus (AP) was 52.9%, 26.56%, 61.1%, and 146.15% higher; and available potassium (AK) was 11.1%, 43.06%, and 46.79% higher than under the mono-cropping of tea in the respective soil layers. Additionally, available nitrogen (AN) was 51.78%, 5.92%, and 15.32% lower in the 10–20 cm, 20–30 cm, and 30–40 cm layers of the intercropping system than in the mono-cropping system of peanut. Moreover, soil enzymatic activity was significantly correlated with SOC and total nitrogen (TN) content across all soil depths and cropping systems. Path analysis of direct effects across depths revealed that SOC directly affected the sucrase, protease, urease, and catalase enzymes in the intercropping system. It was concluded that the increase in soil enzymatic activity under the intercropping pattern improved the rate at which organic matter decomposed and released nutrients into the soil environment. Enzyme activity in the decomposition process plays a vital role in forest soil morphology and function. For efficient land use in the cropping system, it is necessary to develop coherent agroforestry practices. The results of this study revealed that intercropping enhances soil nutrient status and positively impacts soil conservation.


PeerJ
2018
Vol 6
pp. e5741
Author(s):
Sai Gong
Chen Chen
Jingxian Zhu
Guangyao Qi
Shuxia Jiang

Background: Cultivating the wine-cap mushroom (Stropharia rugosoannulata) on forestland has become popular in China. However, the effects of wine-cap Stropharia cultivation on soil nutrients and bacterial communities are poorly understood. Methods: We employed chemical analyses and high-throughput sequencing to determine the impact of cultivating the wine-cap Stropharia on the soil nutrients and bacterial communities of forestland. Results: Cultivation regimes of Stropharia on forestland resulted in consistent increases in soil organic matter (OM) and available phosphorus (AP) content. Among the cultivation regimes, the greatest soil nutrient contents were found in the one-year-interval regime, and the lowest total N and alkaline hydrolysable N contents were observed in the current-year regime. No significant differences in alpha diversity were observed among the cultivation regimes. Specific soil bacterial groups, such as Acidobacteria, increased in abundance after cultivation of Stropharia rugosoannulata. Discussion: Given the numerous positive effects of OM on soil physical and chemical properties, and the consistent increase in OM content across all cultivation regimes, we suggest that mushroom cultivation benefits forest soil nutrient conditions by increasing OM content. Because the one-year-interval regime had the highest soil nutrient content compared with the other regimes, we recommend it for application in farming practice. The spent mushroom compost appeared to be more influential than the hyphae of S. rugosoannulata on the soil nutrients and bacterial communities; however, this requires further study. This research provides insight into the effects of mushroom cultivation on the forest soil ecosystem and suggests a cultivation strategy that reduces its negative impacts.
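
Alpha diversity, as compared across regimes above, is commonly summarized by the Shannon index; a minimal sketch follows, with made-up OTU count vectors standing in for the sequencing data.

```python
import numpy as np

# Shannon index, a standard alpha-diversity measure for sequencing
# data: H = -sum(p_i * ln p_i) over taxon proportions p_i.

def shannon(counts):
    """Shannon diversity from a vector of taxon (e.g. OTU) counts."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log(p))

# Made-up OTU count vectors standing in for two cultivation regimes.
current_year = [120, 80, 40, 30, 10]
one_year_interval = [100, 90, 50, 25, 15]
print(f"H = {shannon(current_year):.3f} vs {shannon(one_year_interval):.3f}")
```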


VASA
2017
Vol 46 (6)
pp. 484-489
Author(s):
Tom Barker
Felicity Evison
Ruth Benson
Alok Tiwari

Abstract. Background: The invasive management of varicose veins has a known risk of post-operative deep venous thrombosis and subsequent pulmonary embolism. The aim of this study was to evaluate the absolute and relative risk of venous thromboembolism (VTE) following commonly used varicose vein procedures. Patients and methods: A retrospective analysis of secondary data from the Hospital Episode Statistics database was performed for all varicose vein procedures performed between 2003 and 2013 and all readmissions for VTE in the same patients within 30 days, 90 days, and one year. The incidence of VTE was compared between procedures using Pearson's chi-squared test. Results: In total, 261,169 varicose vein procedures were performed during the period studied. There were 686 VTEs recorded at 30 days (0.26 % incidence), 884 at 90 days (0.34 % incidence), and 1,246 at one year (0.48 % incidence). The VTE incidence for the different procedures was 0.15–0.35 % at 30 days, 0.26–0.50 % at 90 days, and 0.46–0.58 % at one year. At 30 days there was a significantly lower incidence of VTE for foam sclerotherapy compared with the other procedures (p = 0.01). There was no difference in VTE incidence between procedures at 90 days (p = 0.13) or one year (p = 0.16). Conclusions: Patients undergoing varicose vein procedures have a small but appreciable increase in the risk of VTE compared with the general population, with the effect persisting at one year. Foam sclerotherapy had a lower incidence of VTE than the other procedures at 30 days, but this effect did not persist at 90 days or at one year. There was no other significant difference in the incidence of VTE between open, endovenous, and foam sclerotherapy treatments.
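
A minimal sketch of the incidence and chi-squared comparison reported above. The total procedures and 30-day VTE count match the abstract, but the split across the three procedure groups is hypothetical, chosen only to make the example runnable.

```python
import numpy as np
from scipy.stats import chi2_contingency

# The totals below match the abstract (261,169 procedures, 686 VTEs at
# 30 days); the per-group split is invented for illustration.
groups = {
    "open surgery":       (120_000, 380),
    "endovenous":         (100_000, 250),
    "foam sclerotherapy":  (41_169,  56),
}

table = []
for name, (n, vte) in groups.items():
    print(f"{name}: {100 * vte / n:.2f} % VTE incidence at 30 days")
    table.append([vte, n - vte])  # VTE / no-VTE counts per group

# Pearson's chi-squared test on the 3x2 contingency table.
chi2, p, dof, _ = chi2_contingency(np.array(table))
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```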


1997
Vol 78 (05)
pp. 1327-1331
Author(s):
Paul A Kyrle
Andreas Stümpflen
Mirko Hirschl
Christine Bialonczyk
Kurt Herkner
...  

Summary: Increased thrombin generation occurs in many individuals with inherited defects in the antithrombin or protein C anticoagulant pathways and is also seen in patients with thrombosis without a defined clotting abnormality. Hyperhomocysteinemia (H-HC) is an important risk factor for venous thromboembolism (VTE). We prospectively followed 48 patients with H-HC (median age 62 years, range 26-83; 18 males) and 183 patients (median age 50 years, range 18-85; 83 males) without H-HC for a period of up to one year. Prothrombin fragment F1+2 (F1+2) was determined in the patients' plasma as a measure of thrombin generation during, and at several time points after, discontinuation of secondary thromboprophylaxis with oral anticoagulants. While on anticoagulants, patients with H-HC had significantly higher F1+2 levels than patients without H-HC (mean 0.52 ± 0.49 nmol/l, median 0.4, range 0.2-2.8, versus 0.36 ± 0.2 nmol/l, median 0.3, range 0.1-2.1; p = 0.02). Three weeks and 3, 6, 9, and 12 months after discontinuation of oral anticoagulants, up to 20% of the patients with H-HC and 5 to 6% of those without H-HC had higher F1+2 levels than a corresponding age- and sex-matched control group. 16% of the patients with H-HC and 4% of the patients without H-HC either had F1+2 levels above the upper limit of normal controls on at least two occasions or had (an) elevated F1+2 level(s) followed by recurrent VTE. No statistically significant difference in F1+2 levels was seen between patients with and without H-HC. We conclude that permanent hemostatic system activation is detectable in a proportion of patients with H-HC after discontinuation of oral anticoagulant therapy following VTE. Furthermore, secondary thromboprophylaxis with conventional doses of oral anticoagulants may not be sufficient to suppress hemostatic system activation in patients with H-HC.


2020
Vol 16 (3)
Author(s):
Apar Pokharel
Naganawalachullu Jaya Prakash Mayya
Nabin Gautam

Introduction: A deviated nasal septum is one of the most common causes of nasal obstruction. The objective of this study was to compare the surgical outcomes of conventional septoplasty and endoscopic septoplasty in the management of deviated nasal septum. Methods: A prospective comparative study was conducted on 60 patients who presented to the Department of ENT, College of Medical Sciences, over a period of one year. The severity of symptoms was assessed subjectively using the NOSE score and objectively using a modified Gertner plate. Results: There was significant improvement in functional outcomes, namely the NOSE score and the area over the Gertner plate, among patients who underwent endoscopic septoplasty. A significant difference in the incidence of post-operative nasal synechiae and haemorrhage was seen in the conventional group compared with the endoscopic group. Conclusions: Endoscopic surgery is an evolutionary step towards solving the problems related to a deviated nasal septum. It is a safe, effective, and conservative alternative to conventional septal surgery.


2011
pp. 70-76
Author(s):  

Objectives: To evaluate the effects, after one year, of an early intervention program for 33 disabled children in Hue city in 2010. Objects and Methods: Practical work and assessment of developmental levels across different skills were conducted for the children with developmental delay under 6 years old enrolled in the program. Results: With the Portage checklist used as the tool for implementing the intervention in the community and assessing development in social, cognitive, motor, self-help, and language skills for children with developmental delay, a significant difference (p ≤ 0.05) was found in the developmental level of all areas between the first assessment (January 2010) and the second assessment (December 2010), 12 months later. Comparing skills across different types of disabilities, there was a significant difference (p ≤ 0.05) in social, cognitive, and language skills at the first assessment and in social, cognitive, motor, and language skills at the second assessment. Conclusion: The home-based early intervention program for children with developmental delay achieved considerable progress in improving the children's developmental skills and in enhancing the parents' abilities to support their children at home.


Author(s):  
Tewogbade Adeoye Adedeji
Simeon Adelani Adebisi
Nife Olamide Adedeji
Olusola Akanni Jeje
Rotimi Samuel Owolabi

Background: Human immunodeficiency virus (HIV) infection impairs renal function, thereby affecting renal phosphate metabolism. Objectives: We prospectively estimated the prevalence of phosphate abnormalities (mild, moderate to life-threatening hypophosphataemia, and hyperphosphataemia) before initiation of antiretroviral therapy (ART). Methods: A cross-sectional analysis was performed on 170 consecutive newly diagnosed ART-naïve, HIV-infected patients attending our HIV/AIDS clinics over a period of one year. Fifty (50) screened HIV-negative blood donors were used for comparison (controls). Blood and urine were collected simultaneously for phosphate and creatinine assays to estimate fractional phosphate excretion (FEPi %) and glomerular filtration rate (eGFR). Results: The median eGFR differed significantly between patients and controls (47.89 ml/min/1.73 m² versus 60 ml/min/1.73 m², p < 0.001), denoting moderate chronic kidney disease in the patients. Of the 170 patients, 78 (45.9%) had normal plasma phosphate (0.6-1.4 mmol/L) and 85 (50%) had hyperphosphataemia. Grades 1, 2, and 3 hypophosphataemia were observed in 3 (1.8%), 3 (1.8%), and 1 (0.5%) patient(s), respectively; none had grade 4 hypophosphataemia. Overall, the patients had a significantly higher median plasma phosphate than the controls (1.4 mmol/L, IQR 1.0-2.2, versus 1.1 mmol/L, IQR 0.3-1.6; p < 0.001), indicating hyperphosphataemia in the patients, and a significantly lower median urine phosphate than the controls (1.5 mmol/L, IQR 0.7-2.1, versus 8.4 mmol/L, IQR 3.4-16; p < 0.001), suggesting that the hyperphosphataemia arises from phosphate retention; the median FEPi% was non-significantly lower than in the controls (0.96%, IQR 0.3-2.2, versus 1.4%, IQR 1.2-1.6; p > 0.05). Predictors of FEPi% were age (odds ratio, OR 0.9, p = 0.009) and weight (OR 2.0, p < 0.001); CD4+ cell count predicted urine phosphate among males (p = 0.029). Conclusion: HIV infection likely induces renal insufficiency with reduced renal phosphate clearance. Thus, hyperphosphataemia is highly prevalent; mild to moderate hypophosphataemia occurs, but its life-threatening form (grade 4) is rare among ART-naïve HIV patients.
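
The abstract reports FEPi % without stating its formula; below is a sketch using the conventional spot-sample calculation for fractional excretion of phosphate. The phosphate medians are taken from the abstract, while the creatinine values are hypothetical, chosen for illustration (the result lands near the patients' reported median of 0.96%).

```python
# Conventional spot-sample formula for fractional phosphate excretion:
# FEPi(%) = (urine PO4 x plasma creatinine) /
#           (plasma PO4 x urine creatinine) x 100.
# The abstract does not state its formula, so this is the standard one.

def fepi_percent(urine_po4, plasma_po4, urine_cr, plasma_cr):
    """Fractional phosphate excretion in %, all inputs in mmol/L."""
    return (urine_po4 * plasma_cr) / (plasma_po4 * urine_cr) * 100

# Using the abstract's median urine and plasma phosphate (1.5 and
# 1.4 mmol/L) with hypothetical creatinine values for illustration:
print(f"FEPi = {fepi_percent(1.5, 1.4, 9.0, 0.08):.2f} %")
```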

