Control of liverwort (Marchantia polymorpha L.) growth in nursery plants with mulches of Sphagnum moss and blackcurrant stem pieces

2020, Vol 29 (3)
Author(s): Liisa Särkkä, Risto Tahvonen

Liverwort (Marchantia polymorpha L.) is a problematic weed on container surfaces in nurseries, because it hampers water and nutrient access to growing plants. No chemical herbicide against it is available in the EU, and mulches are the most common non-chemical weed control method. Mulches of Sphagnum moss and 1-cm blackcurrant stem pieces were tested. The mulches’ effect on liverwort control lasted two years on highbush blueberry and blackcurrant, and one year on rhododendron; the trial with blackcurrant stem pieces ran for one year. Blueberry and rhododendron demand acidic growing media, creating an acute need for liverwort control. Sieved moss was applied in two different layers on top of the pots. The prevention rate of liverwort growth in blueberry was 95–99% in July–August and 78–90% in October, depending on weather conditions; in rhododendron and blackcurrant it was 90–95%. The control effect was diminished in more decomposed moss. No significant difference between thickness and coarseness of moss mulch layers was observed. Blackcurrant stem pieces controlled liverwort growth by almost 100%.

Author(s): Ileana BOGDAN, Teodor RUSU, Ştefania GADEA, Ilarie IVAN, Paula MORARU, ...

The paper presents the results of 26 weed control variants in maize (grouped into six distinct strategies) that were tested in the 2010 agricultural year in a one-factor stationary experiment. Three of the strategies were based on post-emergence weed control methods, two on pre-emergence methods, and one on a combination of both. The main goal was to establish an optimal weed control system for the maize crop. Weed infestation of maize in the Luduş area has increased, owing to the weed seed reserve in the arable layer, weather conditions that allow weeds to emerge in successive flushes, and the development of problem species during the maize vegetation period, when no tillage is performed.


2012, Vol 60 (1), pp. 159-164
Author(s): Marian Wesołowski, Elżbieta Harasim

The objective of the study was to determine the time of occurrence of the emergence, budding, fruiting and seed shedding stages, as well as the degree of advancement of the white goosefoot fruiting and diaspore shedding stages, in fodder beet, spring wheat and faba bean crops under mechanical and chemical weed control. Phenological observations were conducted in the years 2000-2002 at 10-day intervals, starting from the day of crop sowing, on alluvial soil made of light loam. Plots under chemical weed control were treated with the following herbicides: fodder beet - lenacil 80%; spring wheat - MCPA 30% + dicamba 4%; faba bean - linuron 50%. It was shown that the times of occurrence and the scale of the studied phenological stages of white goosefoot depended on the crop species, the in-crop weed control method and the pattern of weather conditions in the study years. White goosefoot had the most favourable growth conditions in the fodder beet crop. The herbicides in the fodder beet and faba bean crops delayed emergence and the time of occurrence of successive white goosefoot growth stages. These agents also decreased the degree of diaspore shedding by the weed species studied. The largest number of white goosefoot specimens shed fruits on the mechanically weed-controlled plots. Diaspore dissemination was promoted by a warm and moist growing season.


2021, Vol 13 (16), pp. 8820
Author(s): YunEui Choi, Eunhye Ji, Jinhyung Chon

Creating green infrastructure that is effective for reducing fine dust is a significant challenge for urban landscape planners. In this study, a fine dust reduction planting model that can be applied to socially vulnerable areas was developed, and its effects were verified. Using PM10, PM2.5, temperature, relative humidity, wind direction, and wind speed measured over approximately one year, the changes in fine dust concentration under different weather conditions were investigated. The analysis showed a significant difference in the concentration of fine dust inside and outside the planting zone (p < 0.05). In addition, there was a significant difference between the fine dust reduction effect of the multilayered planting model and that of the single planting model (p < 0.05). The paper’s main findings are as follows: (1) When the green cover rate is over 50%, the concentration of fine dust is lower inside the planting zones than outside. (2) Multilayered planting zones are more effective in reducing the concentration of fine dust than single-structured planting zones. (3) Multilayered planting zones reduce the concentration of fine dust by changing the microclimate. The results of this study can be used as basic data for small urban planting designs to reduce fine dust for children’s health in socially vulnerable areas.


VASA, 2017, Vol 46 (6), pp. 484-489
Author(s): Tom Barker, Felicity Evison, Ruth Benson, Alok Tiwari

Abstract. Background: The invasive management of varicose veins has a known risk of post-operative deep venous thrombosis and subsequent pulmonary embolism. The aim of this study was to evaluate the absolute and relative risk of venous thromboembolism (VTE) following commonly used varicose vein procedures. Patients and methods: A retrospective analysis of secondary data from the Hospital Episode Statistics database was performed for all varicose vein procedures performed between 2003 and 2013 and all readmissions for VTE in the same patients within 30 days, 90 days, and one year. Comparison of the incidence of VTEs between procedures was performed using Pearson’s chi-squared test. Results: In total, 261,169 varicose vein procedures were performed during the period studied. There were 686 VTEs recorded at 30 days (0.26 % incidence), 884 at 90 days (0.34 % incidence), and 1,246 at one year (0.48 % incidence). The VTE incidence for different procedures was between 0.15–0.35 % at 30 days, 0.26–0.50 % at 90 days, and 0.46–0.58 % at one year. At 30 days there was a significantly lower incidence of VTEs for foam sclerotherapy compared to other procedures (p = 0.01). There was no difference in VTE incidence between procedures at 90 days (p = 0.13) or one year (p = 0.16). Conclusions: Patients undergoing varicose vein procedures have a small but appreciable increased risk of VTE compared to the general population, with the effect persisting at one year. Foam sclerotherapy had a lower incidence of VTE compared to other procedures at 30 days, but this effect did not persist at 90 days or at one year. There was no other significant difference in the incidence of VTE between open, endovenous, and foam sclerotherapy treatments.
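As a reading aid (not part of the original abstract), the between-procedure comparison reported above rests on the standard Pearson chi-squared statistic for a procedures-by-outcome contingency table of observed counts O and expected counts E:

\chi^{2} = \sum_{i}\frac{(O_{i}-E_{i})^{2}}{E_{i}}, \qquad E_{i} = \frac{\text{row total} \times \text{column total}}{\text{grand total}}

with (rows − 1)(columns − 1) degrees of freedom; the quoted p-values (e.g. p = 0.01 at 30 days) follow from referring this statistic to the chi-squared distribution.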


1997, Vol 78 (05), pp. 1327-1331
Author(s): Paul A Kyrle, Andreas Stümpflen, Mirko Hirschl, Christine Bialonczyk, Kurt Herkner, ...

Summary: Increased thrombin generation occurs in many individuals with inherited defects in the antithrombin or protein C anticoagulant pathways and is also seen in patients with thrombosis without a defined clotting abnormality. Hyperhomocysteinemia (H-HC) is an important risk factor for venous thromboembolism (VTE). We prospectively followed 48 patients with H-HC (median age 62 years, range 26-83; 18 males) and 183 patients (median age 50 years, range 18-85; 83 males) without H-HC for a period of up to one year. Prothrombin fragment F1+2 (F1+2) was determined in the patients’ plasma as a measure of thrombin generation during, and at several time points after, discontinuation of secondary thromboprophylaxis with oral anticoagulants. While on anticoagulants, patients with H-HC had significantly higher F1+2 levels than patients without H-HC (mean 0.52 ± 0.49 nmol/l, median 0.4, range 0.2-2.8, versus 0.36 ± 0.2 nmol/l, median 0.3, range 0.1-2.1; p = 0.02). Three weeks and 3, 6, 9 and 12 months after discontinuation of oral anticoagulants, up to 20% of the patients with H-HC and 5 to 6% of those without H-HC had higher F1+2 levels than a corresponding age- and sex-matched control group. Sixteen percent of the patients with H-HC and 4% of the patients without H-HC had either F1+2 levels above the upper limit of normal controls on at least two occasions, or (an) elevated F1+2 level(s) followed by recurrent VTE. No statistically significant difference in the F1+2 levels was seen between patients with and without H-HC. We conclude that permanent hemostatic system activation is detectable in a proportion of patients with H-HC after discontinuation of oral anticoagulant therapy following VTE. Furthermore, secondary thromboprophylaxis with conventional doses of oral anticoagulants may not be sufficient to suppress hemostatic system activation in patients with H-HC.


Author(s): A.J. Cresswell

This paper, as well as being a testimonial to the benefit the writer has received from the Grassland Association, shows how scientific knowledge has been used to increase lucerne seed yields by growing resistant cultivars specifically for seed production, as opposed to growing for hay, silage or grazing. It shows how new cultivars can be multiplied quickly by growing two crops in one year, one in each hemisphere, using low seeding rates, wide plant spacing and very good weed control. Increased flowering of the crop has been achieved by the use of boron and the choice of closing time; better pollination has been achieved by the use of more efficient bees, two varieties of which have been imported from North America. Weed and insect pest control and the use of a desiccant at harvest are contributing to a four-fold increase in seed yield, which should double again soon.


2020, Vol 16 (3)
Author(s): Apar Pokharel, Naganawalachullu Jaya Prakash Mayya, Nabin Gautam

Introduction: A deviated nasal septum is one of the most common causes of nasal obstruction. The objective of this study was to compare the surgical outcomes of conventional septoplasty and endoscopic septoplasty in the management of deviated nasal septum. Methods: A prospective comparative study was conducted on 60 patients who presented to the Department of ENT, College of Medical Sciences, over a period of one year. The severity of symptoms was assessed subjectively using the NOSE score and objectively using a modified Gertner plate. Results: There was significant improvement in functional outcomes, such as the NOSE score and the area over the Gertner plate, among patients who underwent endoscopic septoplasty. A significant difference in the incidence of post-operative nasal synechiae and haemorrhage was seen in the conventional group compared to the endoscopic group. Conclusions: Endoscopic surgery is an evolutionary step towards solving the problems related to the deviated nasal septum. It is a safe, effective and conservative alternative to conventional septal surgery.


2011, pp. 70-76
Author(s):

Objectives: To evaluate the effects of an early intervention program after one year for 33 disabled children in Hue city in 2010. Subjects and Methods: Practical work and assessment of developmental levels across different skill areas were conducted for children with developmental delay under 6 years of age who were enrolled in the program. Results: With the Portage checklist used as a tool for implementing the intervention in the community and assessing development in Social, Cognition, Motor, Self-help and Language skills for children with developmental delay, a significant difference (p ≤ 0.05) in developmental level was found across all areas between the first assessment (January 2010) and the second assessment (December 2010), 12 months later. Comparing skills among different types of disabilities, there were significant differences (p ≤ 0.05) in social, cognition and language skills in the first assessment, and in social, cognition, motor and language skills in the second assessment. Conclusion: The home-based Early Intervention Program for children with developmental delay has achieved considerable progress in improving the children's development skills and enhancing the parents' abilities to support their children at home.


HortScience, 1998, Vol 33 (6), pp. 938-940
Author(s): Monica Ozores-Hampton

Author(s): Tewogbade Adeoye Adedeji, Simeon Adelani. Adebisi, Nife Olamide Adedeji, Olusola Akanni Jeje, Rotimi Samuel Owolabi

Background: Human immunodeficiency virus (HIV) infection impairs renal function, thereby affecting renal phosphate metabolism. Objectives: We prospectively estimated the prevalence of phosphate abnormalities (mild, moderate to life-threatening hypophosphataemia, and hyperphosphataemia) before initiating antiretroviral therapy (ART). Methods: A cross-sectional analysis was performed on 170 consecutive newly diagnosed ART-naïve, HIV-infected patients attending our HIV/AIDS clinics over a period of one year. Fifty (50) screened HIV-negative blood donors were used for comparison (controls). Blood and urine were collected simultaneously for phosphate and creatinine assays to estimate fractional phosphate excretion (FEPi %) and glomerular filtration rate (eGFR). Results: Median eGFR differed significantly between patients and controls (47.89 ml/min/1.73 m² versus 60 ml/min/1.73 m², p < 0.001), indicating moderate chronic kidney disease in the patients. Of the 170 patients, 78 (45.9%) had normal plasma phosphate (0.6-1.4 mmol/L) and 85 (50%) had hyperphosphataemia. Grade 1, 2 and 3 hypophosphataemia was observed in 3 (1.8%), 3 (1.8%) and 1 (0.5%) patient(s), respectively; none had grade 4 hypophosphataemia. Overall, the patients had a significantly higher median plasma phosphate than the controls (1.4 mmol/L (IQR: 1.0–2.2) versus 1.1 mmol/L (IQR: 0.3–1.6), p < 0.001), implying hyperphosphataemia in the patients, and a significantly lower median urine phosphate than the controls (1.5 mmol/L (IQR: 0.7–2.1) versus 8.4 mmol/L (IQR: 3.4–16), p < 0.001), indicating that the hyperphosphataemia arises from phosphate retention, but a non-significantly lower median FEPi % than the controls (0.96% (IQR: 0.3–2.2) versus 1.4% (IQR: 1.2–1.6), p > 0.05). Predictors of FEPi % were age (odds ratio, OR 0.9, p = 0.009) and weight (OR 2.0, p < 0.001); CD4+ cell count predicted urine phosphate among males (p = 0.029). Conclusion: HIV infection likely induces renal insufficiency with reduced renal phosphate clearance. Thus, hyperphosphataemia is highly prevalent, and mild to moderate hypophosphataemia occurs, but its life-threatening form (grade 4) is rare among ART-naïve HIV patients.
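For reference (the abstract reports FEPi % without spelling out the calculation), fractional phosphate excretion is conventionally estimated from simultaneously collected spot urine and plasma samples as

FE_{Pi}(\%) = \frac{U_{Pi} \times P_{Cr}}{P_{Pi} \times U_{Cr}} \times 100

where U and P denote urine and plasma concentrations of phosphate (Pi) and creatinine (Cr); the paired blood and urine collection described in the Methods supplies all four terms.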

