Identification and pathogenicity of Botryosphaeria parva associated with grapevine decline in Kurdistan region - Iraq

2012
Vol 65 (1)
pp. 71-78
Author(s):  
Raed A. Haleem ◽  
Samir K. Abdullah ◽  
Jaladet M. S. Jubrael

During a survey of fungi associated with decline symptoms on grapevine cultivars growing in the Kurdistan region of Iraq, several isolates of Botryosphaeria species were encountered. All isolates were identified as Botryosphaeria parva Pennycook and Samuels. A pathogenicity test of isolate DKI 1 was performed on two cultivars, Taefi and Rashmew. Under greenhouse conditions, one-year-old rooted grape cuttings were inoculated with the pathogen by two methods: injecting a spore suspension into green shoots, and inoculating wounded shoots with a mycelial mat. The greatest canker length (15.0 mm) was produced after four months on shoots of the Taefi cultivar inoculated with a mycelial mat of the pathogen. Under field conditions, two inoculation methods were adopted, wounding green shoots and drilling a hole in the arms of mature vines, each followed by inoculation with a mycelial mat. The greatest canker length (11.17 mm) was obtained after five months on wounded shoots of the Rashmew cultivar, a significant difference from the Taefi cultivar. The pathogen reduced the fresh and dry weight of green shoots and roots compared with the non-inoculated control. This is the first report of B. parva in Iraq.

1993
Vol 7 (4)
pp. 966-971
Author(s):  
Robert T. Macdonald ◽  
J. Christopher Hall ◽  
James J. O'Toole ◽  
Clarence J. Swanton

Experiments were conducted under controlled-environment and field conditions to evaluate the influence of growth stage and fluroxypyr dose on control of field bindweed. In controlled-environment studies, fluroxypyr effectively controlled 8- to 12-leaf field bindweed. Shoot number, length, and dry weight, and root dry weight decreased as herbicide dose increased. The estimated ED50 (effective dose for 50% reduction) values for shoot and root dry weight were 50 and 33 g ai/ha, respectively. The ED50 for shoot length was 98 g ai/ha. Fluroxypyr was applied at rates from 0.2 to 0.4 kg/ha under field conditions to field bindweed at selected stages of growth. Regardless of dose, fluroxypyr applied at the late flowering stage of growth controlled field bindweed better than when applied at the bud or early flower stage. Corn grain yield increased as a function of fluroxypyr dose in 1988 but not in 1987. Dry weight of field bindweed roots and shoots harvested one year after treatment decreased with increasing rates of fluroxypyr. These studies demonstrate the potential of fluroxypyr for the control of field bindweed.
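The abstract reports ED50 values without describing the fitting procedure; a dose-response curve fit is the usual route. Below is a minimal sketch, assuming a three-parameter log-logistic model and hypothetical dose and dry-weight values (not the study's data):

```python
# Sketch: estimating ED50 by fitting a log-logistic dose-response model.
# Doses and responses below are hypothetical, not the published data.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, upper, ed50, slope):
    """Response falls from `upper` toward 0; equals upper/2 at dose = ed50."""
    return upper / (1.0 + (dose / ed50) ** slope)

doses = np.array([12.5, 25.0, 50.0, 100.0, 200.0, 400.0])   # g ai/ha
shoot_dw = np.array([9.8, 8.1, 5.2, 2.9, 1.4, 0.6])         # g/plant

popt, _ = curve_fit(log_logistic, doses, shoot_dw, p0=[10.0, 50.0, 1.0])
upper, ed50, slope = popt
print(f"Estimated ED50 for shoot dry weight: {ed50:.1f} g ai/ha")
```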


HortScience
1994
Vol 29 (5)
pp. 551a-551
Author(s):  
R.R. Tripepi ◽  
M.W. George ◽  
A.G. Campbell

Pulp and paper sludge is a byproduct of paper production, yet this fibrous material may be suitable as an alternative to peat moss in container media. Newsprint mill sludge was composted for 6 weeks and cured before use. One-year-old seedlings of lilac (Syringa vulgaris L.) and amur maple (Acer ginnala Maxim.), as well as rooted cuttings of cistena plum (Prunus × cistena Hansen), were planted in 3-liter pots that contained a bark:sand (2:1 by vol) mix, 25% or 50% peat-amended media, or 25% or 50% sludge-amended media. After 14 weeks outdoors, shoot dry weight and changes in plant height were measured. All species planted in sludge-amended media grew as well as those potted in peat-amended media or the bark:sand mix. In fact, some species grew best in sludge-amended media. Lilac seedlings planted in 25% sludge produced almost double the shoot dry weight and were 80% taller than plants in the bark:sand mix or 25% peat. Maple plants grown in 50% sludge produced over 100% or 35% more shoot dry weight than those grown in 25% or 50% peat-amended media, respectively. Plum cuttings potted in 25% sludge grew at least 53% taller than plants grown in either peat-amended medium. These results indicate that composted newsprint sludge can be used as a peat moss substitute in a container medium for the landscape plants tested.


VASA
2017
Vol 46 (6)
pp. 484-489
Author(s):  
Tom Barker ◽  
Felicity Evison ◽  
Ruth Benson ◽  
Alok Tiwari

Abstract. Background: The invasive management of varicose veins carries a known risk of post-operative deep venous thrombosis and subsequent pulmonary embolism. The aim of this study was to evaluate the absolute and relative risk of venous thromboembolism (VTE) following commonly used varicose vein procedures. Patients and methods: A retrospective analysis of secondary data from the Hospital Episode Statistics database was performed for all varicose vein procedures performed between 2003 and 2013 and all readmissions for VTE in the same patients within 30 days, 90 days, and one year. Comparison of the incidence of VTEs between procedures was performed using Pearson's Chi-squared test. Results: In total, 261,169 varicose vein procedures were performed during the period studied. There were 686 VTEs recorded at 30 days (0.26 % incidence), 884 at 90 days (0.34 % incidence), and 1,246 at one year (0.48 % incidence). The VTE incidence for different procedures was between 0.15–0.35 % at 30 days, 0.26–0.50 % at 90 days, and 0.46–0.58 % at one year. At 30 days there was a significantly lower incidence of VTEs for foam sclerotherapy compared to other procedures (p = 0.01). There was no difference in VTE incidence between procedures at 90 days (p = 0.13) or one year (p = 0.16). Conclusions: Patients undergoing varicose vein procedures have a small but appreciable increase in VTE risk compared to the general population, an effect that persists at one year. Foam sclerotherapy had a lower incidence of VTE compared to other procedures at 30 days, but this effect did not persist at 90 days or at one year. There was no other significant difference in the incidence of VTE between open, endovenous, and foam sclerotherapy treatments.
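The between-procedure comparison uses a Pearson chi-squared test on counts of VTE events per procedure. A minimal sketch with an illustrative contingency table (the counts are placeholders, not the Hospital Episode Statistics figures):

```python
# Sketch: Pearson chi-squared test of VTE incidence across procedure types.
# Event counts are illustrative placeholders, not the published figures.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: procedure; columns: [VTE within 30 days, no VTE within 30 days]
table = np.array([
    [30, 19970],    # foam sclerotherapy (hypothetical counts)
    [250, 99750],   # open surgery (hypothetical counts)
    [200, 79800],   # endovenous ablation (hypothetical counts)
])

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-squared = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```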


1997
Vol 78 (05)
pp. 1327-1331
Author(s):  
Paul A Kyrle ◽  
Andreas Stümpflen ◽  
Mirko Hirschl ◽  
Christine Bialonczyk ◽  
Kurt Herkner ◽  
...  

Summary: Increased thrombin generation occurs in many individuals with inherited defects in the antithrombin or protein C anticoagulant pathways and is also seen in patients with thrombosis without a defined clotting abnormality. Hyperhomocysteinemia (H-HC) is an important risk factor for venous thromboembolism (VTE). We prospectively followed 48 patients with H-HC (median age 62 years, range 26-83; 18 males) and 183 patients (median age 50 years, range 18-85; 83 males) without H-HC for a period of up to one year. Prothrombin fragment F1+2 (F1+2) was determined in the patients' plasma as a measure of thrombin generation during, and at several time points after, discontinuation of secondary thromboprophylaxis with oral anticoagulants. While on anticoagulants, patients with H-HC had significantly higher F1+2 levels than patients without H-HC (mean 0.52 ± 0.49 nmol/l, median 0.4, range 0.2-2.8, versus 0.36 ± 0.2 nmol/l, median 0.3, range 0.1-2.1; p = 0.02). Three weeks and 3, 6, 9, and 12 months after discontinuation of oral anticoagulants, up to 20% of the patients with H-HC and 5 to 6% of those without H-HC had higher F1+2 levels than a corresponding age- and sex-matched control group. 16% of the patients with H-HC and 4% of the patients without H-HC either had F1+2 levels above the upper limit of normal controls on at least two occasions or had an elevated F1+2 level followed by recurrent VTE. No statistically significant difference in F1+2 levels was seen between patients with and without H-HC. We conclude that permanent hemostatic system activation is detectable in a proportion of patients with H-HC after discontinuation of oral anticoagulant therapy following VTE. Furthermore, secondary thromboprophylaxis with conventional doses of oral anticoagulants may not be sufficient to suppress hemostatic system activation in patients with H-HC.
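The abstract does not name the test behind p = 0.02; given the skewed F1+2 distributions (medians and ranges are reported alongside means), a rank-based comparison such as the Mann-Whitney U test is one plausible choice. A minimal sketch on simulated values:

```python
# Sketch: comparing F1+2 levels between two groups with skewed distributions.
# The test choice (Mann-Whitney U) is an assumption; the abstract does not
# name it. Values are simulated, not patient data.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
f12_hhc = rng.lognormal(mean=np.log(0.4), sigma=0.5, size=48)     # H-HC group
f12_no_hhc = rng.lognormal(mean=np.log(0.3), sigma=0.5, size=183) # no H-HC

stat, p = mannwhitneyu(f12_hhc, f12_no_hhc, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.3f}")
```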


2014
Vol 1 (1)
pp. 25-29
Author(s):  
Rahim Mohammadian ◽  
Behnam Tahmasebpour ◽  
Peyvand Samimifar

A factorial experiment based on a completely randomized design with three replicates was conducted at the Khosroshahr research farm, Tabriz, in 2006 to evaluate the effects of planting date and density on calendula and peppermint herbs. The factors studied were three planting dates (10 May, 25 May, and 10 June) and four densities (25, 35, 45, and 55 plants per square meter). Analysis of variance showed a significant difference at the 1% probability level between the effects of planting date and plant density on leaf number, plant height, and plant dry weight, but the interaction of planting date and density was not significant for these traits. In the comparison of trait means, the maximum total dry weight was obtained at the density of 55 plants/m². Very high density per square meter, however, reduces plant growth and mass, while flower number per unit area increases as density increases. Overall, we conclude that planting on 10 June at a density of 45 plants/m² is the most suitable combination, producing a favorable yield with high essence content for these crops.
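For a completely randomized factorial design like this, the standard analysis is a two-way ANOVA with main effects for planting date and density plus their interaction. A minimal sketch on simulated data (the measurements below are placeholders, not the Khosroshahr field results):

```python
# Sketch: two-way ANOVA for a 3 (planting date) x 4 (density) factorial
# design with 3 replicates. Data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(1)
rows = [
    {"date": d, "density": k,
     "dry_weight": 10.0 + 0.1 * k + rng.normal(0.0, 1.0)}
    for d in ["10 May", "25 May", "10 June"]
    for k in [25, 35, 45, 55]
    for _ in range(3)          # three replicates per treatment cell
]
df = pd.DataFrame(rows)

model = ols("dry_weight ~ C(date) * C(density)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # main effects and interaction
```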


2020
Vol 16 (3)
Author(s):  
Apar Pokharel ◽  
Naganawalachullu Jaya Prakash Mayya ◽  
Nabin Gautam

Introduction: Deviated nasal septum is one of the most common causes of nasal obstruction. The objective of this study was to compare surgical outcomes in patients undergoing conventional septoplasty and endoscopic septoplasty for the management of deviated nasal septum. Methods: A prospective comparative study was conducted on 60 patients who presented to the Department of ENT, College of Medical Sciences, during a period of one year. The severity of symptoms was assessed subjectively using the NOSE score and objectively using a modified Gertner plate. Results: There was significant improvement in functional outcomes, namely the NOSE score and the area over the Gertner plate, among patients who underwent endoscopic septoplasty. A significant difference in the incidence of post-operative nasal synechiae and haemorrhage was seen in the conventional group compared with the endoscopic group. Conclusions: Endoscopic surgery is an evolutionary step towards solving the problems related to deviated nasal septum. It is a safe, effective, and conservative alternative to conventional septal surgery.


2011
pp. 70-76
Author(s):  

Objectives: To evaluate the effects of a one-year early intervention program for 33 children with disabilities in Hue city in 2010. Subjects and Methods: Intervention was carried out in practice, with assessment of developmental levels across different skill areas, in children under 6 years old with developmental delay who were enrolled in the program. Results: With the Portage checklist used as the tool for implementing the intervention in the community and for assessing developmental skills in the Social, Cognition, Motor, Self-help, and Language domains for children with developmental delay, there was a significant difference (p ≤ 0.05) in the developmental level of all domains between the first assessment (January 2010) and the second assessment (December 2010) after 12 months. Comparing skills across different types of disabilities, there was a significant difference (p ≤ 0.05) in social, cognition, and language skills at the first assessment, and in social, cognition, motor, and language skills at the second assessment. Conclusion: The home-based early intervention program for children with developmental delay achieved considerable progress in improving the children's developmental skills and in enhancing the parents' abilities to support their children at home.


HortScience
1998
Vol 33 (3)
pp. 485b-485
Author(s):  
Lisa M. Barry ◽  
Michael N. Dana

Nurse crops are often recommended in prairie restoration planting. This work investigated several alternative nurse crops to determine their utility in prairie planting. Nurse crops were composed of increasing densities (900, 1800, or 2700 seeds/m²) of partridge pea, spring oats, spring barley, Canada wild rye, or equal mixtures of partridge pea and one of the grasses. The experimental design was a randomized complete block set in two sites, with three blocks per site and 48 treatments per block. Each 3 × 3-m plot contained 1 m² planted in Dec. 1995 or Mar. 1996 with an equal mix of seven prairie species. The nurse crops were sown over each 9-m² area in April 1996. Plots lacking nurse crops served as controls. Evaluated data consisted of weed pressure rankings and weed and prairie plant dry weights. Nurse crop treatments had a significant effect on weed pressure in both sites. Barley (1800 and 2700 seeds/m²) as well as partridge pea + barley (2700 seeds/m²) were most effective at reducing weed pressure. When weed and prairie plant biomass values were compared, a significant difference was observed for site quality and planting season. Prairie plant establishment was significantly greater in the poorly drained, less-fertile site, and spring-sown plots in both sites had significantly higher prairie biomass values. Overall, after two seasons, there was no advantage in using nurse crops over the control. Among nurse crop treatments, oats were most effective in reducing weed competition and enhancing prairie plant growth.


Author(s):  
Tewogbade Adeoye Adedeji ◽  
Simeon Adelani. Adebisi ◽  
Nife Olamide Adedeji ◽  
Olusola Akanni Jeje ◽  
Rotimi Samuel Owolabi

Background: Human immunodeficiency virus (HIV) infection impairs renal function, thereby affecting renal phosphate metabolism. Objectives: We prospectively estimated the prevalence of phosphate abnormalities (mild, moderate to life-threatening hypophosphataemia, and hyperphosphataemia) before initiating antiretroviral therapy (ART). Methods: A cross-sectional analysis was performed on 170 consecutive newly diagnosed ART-naïve, HIV-infected patients attending our HIV/AIDS clinics over a period of one year. Fifty (50) screened HIV-negative blood donors were used for comparison (controls). Blood and urine were collected simultaneously for phosphate and creatinine assays to estimate fractional phosphate excretion (FEPi %) and estimated glomerular filtration rate (eGFR). Results: eGFR showed a significant difference between the patients' and controls' medians (47.89 mL/min/1.73 m² versus 60 mL/min/1.73 m², p < 0.001), which denotes moderate chronic kidney disease in the patients. Of the 170 patients, 78 (45.9%) had normal plasma phosphate (0.6–1.4 mmol/L) and 85 (50%) had hyperphosphataemia. Grade 1, 2, and 3 hypophosphataemia was observed in 3 (1.8%), 3 (1.8%), and 1 (0.5%) patient(s), respectively. None had grade 4 hypophosphataemia. Overall, the patients had a significantly higher median plasma phosphate than the controls, 1.4 mmol/L (IQR: 1.0–2.2) versus 1.1 mmol/L (IQR: 0.3–1.6), p < 0.001, implying hyperphosphataemia in the patients; a significantly lower median urine phosphate than the controls, 1.5 mmol/L (IQR: 0.7–2.1) versus 8.4 mmol/L (IQR: 3.4–16), p < 0.001, indicating that the hyperphosphataemia results from phosphate retention; but a non-significantly lower median FEPi% than the controls, 0.96% (IQR: 0.3–2.2) versus 1.4% (IQR: 1.2–1.6), p > 0.05. Predictors of FEPi% were age (odds ratio, OR 0.9, p = 0.009) and weight (OR 2.0, p < 0.001); CD4+ cell count predicted urine phosphate among males (p = 0.029). Conclusion: HIV infection likely induces renal insufficiency with reduced renal phosphate clearance. Hyperphosphataemia is therefore highly prevalent; mild to moderate hypophosphataemia occurs, but its life-threatening form (grade 4) is rare among ART-naïve HIV patients.
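The abstract does not spell out its FEPi% calculation; the standard fractional-excretion formula from paired spot samples is FEPi% = (urine phosphate × plasma creatinine) / (plasma phosphate × urine creatinine) × 100. A minimal sketch, assuming that formula and unit-consistent creatinine measurements:

```python
# Sketch: fractional excretion of phosphate (FEPi %) from a paired spot
# blood/urine sample. This assumes the standard fractional-excretion
# formula; phosphate and creatinine units must each match between plasma
# and urine so the ratios cancel.
def fractional_phosphate_excretion(urine_p, plasma_p, urine_cr, plasma_cr):
    """FEPi % = (U_P * P_Cr) / (P_P * U_Cr) * 100."""
    return (urine_p * plasma_cr) / (plasma_p * urine_cr) * 100.0

# Illustrative values only: phosphate in mmol/L, creatinine in umol/L.
fepi = fractional_phosphate_excretion(
    urine_p=1.5, plasma_p=1.4, urine_cr=9000.0, plasma_cr=90.0)
print(f"FEPi = {fepi:.2f} %")   # ~1.07 %, within the range reported above
```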

