Spatial-temporal Patterns and Risk Factors for Human Leptospirosis in Thailand, 2012-2018

Author(s):  
Sudarat Chadsuthi ◽  
Karine Chalvet-Monfray ◽  
Suchada Geawduanglek ◽  
Phrutsamon Wongnak ◽  
Julien Cappelle

Abstract Leptospirosis is a globally important zoonotic disease. The disease is particularly important in tropical and subtropical countries. Infections in humans can be caused by exposure to infected animals or to contaminated soil or water that provide suitable environments for Leptospira. To explore spatial clustering, the Global Moran’s I index was calculated for incidences per 100,000 population at the province level during 2012–2018, using monthly and annual data. High-risk and low-risk provinces were identified using local indicators of spatial association (LISA). Risk factors for leptospirosis were evaluated using a generalized linear mixed model (GLMM) with zero-inflation, with spatial and temporal correlation terms added to account for the spatial and temporal structures. The Global Moran’s I index showed significant positive values, indicating that cases were not randomly distributed throughout the study period. The high-risk provinces were almost all in the lower north-east and south of Thailand. For yearly reported cases, the significant risk factors from the final best-fitted model were population density, elevation, and primary rice arable area. Interestingly, our study showed that leptospirosis cases were associated with large areas of rice production but were less prevalent in areas of high rice productivity. For monthly reported cases, the model using temperature range fitted better than the model using percentage of flooded area. The significant risk factors from the model using temperature range were the temporal correlation term, average soil moisture, normalized difference vegetation index, and temperature range. Temperature range, which is strongly negatively correlated with the percentage of flooded area, was a significant risk factor for the monthly data. Flood exposure controls should therefore be used to reduce the risk of leptospirosis infection. These results could be used to develop a leptospirosis warning system to support public health organizations in Thailand.
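The clustering analysis described in this abstract (global Moran's I plus LISA classification of high-risk and low-risk provinces) can be illustrated with standard spatial-statistics tooling. Below is a minimal sketch assuming a province-level GeoDataFrame with an incidence column; the file name and column names are hypothetical, and this is not the authors' exact pipeline.

```python
# Minimal sketch: global Moran's I and LISA cluster labelling for province-level
# incidence. The file name and the "incidence" column are hypothetical placeholders.
import geopandas as gpd
from libpysal.weights import Queen
from esda.moran import Moran, Moran_Local

provinces = gpd.read_file("thailand_provinces_incidence.gpkg")  # hypothetical input
w = Queen.from_dataframe(provinces)  # contiguity-based spatial weights
w.transform = "r"                    # row-standardise

# Global Moran's I: a significant positive value indicates spatial clustering
mi = Moran(provinces["incidence"], w, permutations=999)
print(f"Moran's I = {mi.I:.3f}, pseudo p = {mi.p_sim:.3f}")

# Local indicators of spatial association (LISA): flag high-high / low-low provinces
lisa = Moran_Local(provinces["incidence"], w, permutations=999)
labels = {1: "high-high", 2: "low-high", 3: "low-low", 4: "high-low"}
provinces["cluster"] = [
    labels[q] if p < 0.05 else "not significant"
    for q, p in zip(lisa.q, lisa.p_sim)
]
print(provinces["cluster"].value_counts())
```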

2020 ◽  
Vol 5 (3) ◽  
pp. 145-154
Author(s):  
Mohsen Shariati ◽  
Mahsa Jahangiri-rad ◽  
Fatima Mahmud Muhammad ◽  
Jafar Shariati ◽  
...  

Background: Iran detected its first COVID-19 case in February 2020 in Qom province, from where the disease rapidly spread to other cities in the country. Iran, one of the countries with the highest numbers of infected people, had officially reported 1812 deaths out of 23049 confirmed cases, which were used in this analysis. Materials and Methods: A map of calculated incidence rates (CIR) for COVID-19 in Iran within the study period was prepared with GIS 10.6. Spatial autocorrelation (Global Moran’s I) and hot spot analysis were used to assess COVID-19 spatial patterns. The ordinary least squares (OLS) method was used to estimate the relationship between COVID-19 and the risk factors. The next step was to explore Geographically Weighted Regression (GWR) models that might better explain the variation in COVID-19 cases based on environmental and socio-demographic factors. Results: The spatial autocorrelation (Global Moran’s I) result showed that COVID-19 cases in the studied area were clustered. For statistically significant positive z-scores, the larger the z-score, the more intense the clustering of high values (hot spots), as in Semnan, Qom, Isfahan, Mazandaran, Alborz, and Tehran. Hot spot analysis likewise detected hot-spot clustering at the 99% confidence level for Semnan, Qom, Isfahan, Mazandaran, Alborz, and Tehran. The risk factors were removed from the model step by step; finally, only the distance from the epicenter was retained in the model. GWR increased the explanatory value of this risk factor with better spatial precision (adjusted R-squared = 0.44). Conclusion: The highest CIR was concentrated around Qom, and the greater the distance from the center of prevalence (Qom), the fewer the patients. Hot spot analysis also implies that the provinces neighbouring the prevalence centers exhibited hot spots at the 99% confidence level. Furthermore, the results of the OLS analysis showed a significant correlation between CIR and the distance from the epicenter (Qom). GWR provides spatial granularity, offering an opportunity to better understand the relationship between environmental spatial heterogeneity and COVID-19 risk as reflected in the CIR, which would make it possible to better plan managerial policies for public health.
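As a companion to the OLS-then-GWR workflow outlined above, the sketch below shows one common way to fit a GWR of incidence rate on distance from the epicenter using the mgwr package; the input file and column names are hypothetical, and the bandwidth and model options are illustrative rather than the authors' settings.

```python
# Minimal sketch: geographically weighted regression (GWR) of province-level CIR on
# distance from the epicenter. File and column names are hypothetical placeholders.
import numpy as np
import geopandas as gpd
from mgwr.gwr import GWR
from mgwr.sel_bw import Sel_BW

provinces = gpd.read_file("iran_provinces_cir.gpkg")     # hypothetical input
coords = np.column_stack((provinces.centroid.x, provinces.centroid.y))
y = provinces[["cir"]].values                            # cumulative incidence rate
X = provinces[["dist_from_qom"]].values                  # distance from epicenter

bw = Sel_BW(coords, y, X).search()     # data-driven bandwidth selection
results = GWR(coords, y, X, bw).fit()  # local regression at each province
results.summary()                      # diagnostics, including local fit statistics
```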


2017 ◽  
Vol 8 (4) ◽  
Author(s):  
Matheus Supriyanto Rumetna ◽  
Eko Sediyono ◽  
Kristoko Dwi Hartomo

Abstract. Bantul Regency is a part of the Special Region of Yogyakarta Province that has experienced land use changes. This research aims to assess the changes in the form and extent of land use, to analyze the pattern of land use changes, and to assess the conformity of land use with the RTRW (spatial plan) in Bantul Regency in 2011-2015. The analytical methods employed include Geoprocessing techniques and analysis of the distribution pattern of land use changes with Spatial Autocorrelation (Global Moran's I). The results show that land use in 2011 comprised thirty-one classifications, while in 2015 there were thirty-four classifications. The distribution pattern of land use change shows that land use change in 2011-2015 follows a Complete Spatial Randomness pattern. Land use conforming with the area functions designated in the RTRW covers 24030.406 ha (46.995406%), and non-conforming land use covers 27103.115 ha (53.004593%) of the total area of Bantul Regency. Keywords: Geographical Information System, Land Use, Geoprocessing, Global Moran's I, Bantul Regency.
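A permutation-based global Moran's I is a standard way to test the Complete Spatial Randomness conclusion reported above. The sketch below assumes a polygon layer with a binary land-use-change indicator; the file and column names are hypothetical.

```python
# Minimal sketch: testing a land-use-change indicator against Complete Spatial
# Randomness (CSR) with a permutation-based global Moran's I.
import geopandas as gpd
from libpysal.weights import Queen
from esda.moran import Moran

parcels = gpd.read_file("bantul_landuse_change.gpkg")  # hypothetical input
w = Queen.from_dataframe(parcels)
w.transform = "r"

mi = Moran(parcels["changed"], w, permutations=999)
# A pseudo p-value above 0.05 means the observed I is consistent with CSR.
print(f"I = {mi.I:.3f}, expected under CSR = {mi.EI:.3f}, pseudo p = {mi.p_sim:.3f}")
```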


2021 ◽  
Vol 26 (1) ◽  
Author(s):  
Mukemil Awol ◽  
Zewdie Aderaw Alemu ◽  
Nurilign Abebe Moges ◽  
Kemal Jemal

Abstract Background In Ethiopia, despite considerable improvement in immunization coverage, the burden of defaulting from immunization among children is still high, with marked variation among regions. However, the geographical variation and contextual factors of defaulting from immunization were poorly understood. Hence, this study aimed to identify the spatial pattern of, and factors associated with, defaulting from immunization. Methods An in-depth analysis of the 2016 Ethiopian Demographic and Health Survey (EDHS 2016) data was used. A total of 1638 children nested in 552 enumeration areas (EAs) were included in the analysis. The Global Moran’s I statistic and Bernoulli purely spatial scan statistics were employed to identify geographical patterns and detect spatial clusters of defaulting from immunization, respectively. Multilevel logistic regression models were fitted to identify factors associated with defaulting from immunization. A p value < 0.05 was used to identify factors significantly associated with defaulting from child immunization. Results Spatial heterogeneity of defaulting from immunization was observed (Global Moran’s I = 0.386379, p value < 0.001), and four significant SaTScan clusters of areas with high defaulting from immunization were detected. The most likely primary SaTScan cluster was seen in the Somali region, and secondary clusters were detected in the Afar, Southern Nations, Nationalities, and Peoples’ (SNNP), Oromiya, Amhara, and Gambella regions. In the final model of the multilevel analysis, individual- and community-level factors accounted for 56.4% of the variance in the odds of defaulting from immunization. Children of mothers with no formal education (AOR = 4.23; 95% CI: 1.17, 15.78) and children living in the Afar, Oromiya, Somali, SNNP, Gambella, and Harari regions had higher odds of defaulting from immunization at the community level. Conclusions A clustered pattern of areas with high defaulting from immunization was observed in Ethiopia. Both individual- and community-level characteristics were statistically significant factors in defaulting from immunization. Therefore, the Federal Ethiopian Ministry of Health should prioritize the areas with high defaulting from immunization and consider the identified factors in immunization interventions.
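One way to sketch the multilevel logistic model described here (children nested in enumeration areas) is with a Bayesian mixed-effects GLM from statsmodels; the data file, column names, and choice of fixed effects below are hypothetical placeholders, not the authors' specification.

```python
# Minimal sketch: two-level logistic regression for defaulting from immunization,
# with a random intercept for each enumeration area (EA).
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

df = pd.read_csv("edhs2016_children.csv")  # hypothetical extract of EDHS 2016 data

model = BinomialBayesMixedGLM.from_formula(
    "defaulted ~ maternal_education + region + wealth_index",  # hypothetical fixed effects
    {"ea": "0 + C(ea_id)"},                                    # random intercept per EA
    df,
)
result = model.fit_vb()  # variational Bayes fit
print(result.summary())
```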


Curationis ◽  
1978 ◽  
Vol 1 (3) ◽  
Author(s):  
J.V. Larsen

It has recently been demonstrated that about 56 percent of patients delivering in a rural obstetric unit had significant risk factors, and that 85 percent of these could have been detected by meticulous antenatal screening before the onset of labour. These figures show that the average rural obstetric unit in South Africa is dealing with a large percentage of high-risk patients. In this work, it is hampered by: 1. Communications problems, i.e. bad roads, long distances, and unpredictable telephones. 2. A serious shortage of medical staff, resulting in primary obstetric care being delivered by midwives with minimal medical supervision.


2008 ◽  
Vol 18 (2) ◽  
pp. 357-362 ◽  
Author(s):  
W.-G. Lu ◽  
F. Ye ◽  
Y.-M. Shen ◽  
Y.-F. Fu ◽  
H.-Z. Chen ◽  
...  

This study was designed to analyze the outcomes of chemotherapy with the EMA-CO regimen as a primary and secondary protocol for high-risk gestational trophoblastic neoplasia (GTN) in China. Fifty-four patients with high-risk GTN received 292 EMA-CO treatment cycles between 1996 and 2005. Forty-five patients were primarily treated with EMA-CO, and nine were treated with it secondarily after failure of other combination chemotherapy. Adjuvant surgery and radiotherapy were used in selected patients. Response, survival and related risk factors, as well as chemotherapy complications, were retrospectively analyzed. Thirty-five of the forty-five patients (77.8%) receiving EMA-CO as first-line treatment achieved complete remission, as did 77.8% (7/9) of those receiving it as secondary treatment. The overall survival rate was 87.0% in all high-risk GTN patients: 93.3% (42/45) with primary therapy and 55.6% (5/9) with secondary therapy. The survival rates were significantly different between the two groups (χ2 = 6.434, P = 0.011). Univariate analysis showed that the metastatic site and the number of metastatic organs were significant risk factors, but binomial logistic regression analysis revealed that only the number of metastatic organs was an independent risk factor for survival. No life-threatening toxicity or secondary malignancy was found. The EMA-EP regimen was used for ten patients who were resistant to EMA-CO and three who relapsed after EMA-CO; of those, 11 patients (84.6%) achieved complete remission. We conclude that the EMA-CO regimen is an effective and safe primary therapy for high-risk GTN, but not an appropriate second-line protocol. The number of metastatic organs is an independent prognostic factor for patients with high-risk GTN. The EMA-EP regimen is a highly effective salvage therapy for those failing EMA-CO.
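The reported survival comparison between primary and secondary EMA-CO treatment (42/45 vs 5/9) appears consistent with a 2x2 chi-square test using the Yates continuity correction; the short check below reproduces χ2 ≈ 6.434 and P ≈ 0.011, although the abstract does not state which correction the authors applied.

```python
# Quick check of the reported survival comparison using a 2x2 chi-square test.
# scipy applies the Yates continuity correction by default for 2x2 tables.
from scipy.stats import chi2_contingency

table = [[42, 3],   # primary EMA-CO: survived, died
         [5, 4]]    # secondary EMA-CO: survived, died
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}, dof = {dof}")  # ~6.434, ~0.011
```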


Author(s):  
І. К. Чурпій

To optimize therapeutic tactics and improve the treatment of peritonitis, a retrospective analysis determined the following significant risk factors: female gender, age 60–90 years, time to hospitalization of more than 48 hours, a history of myocardial infarction, stroke, or cardiac arrhythmia, biliary, fecal or fibrinous purulent exudate, terminal-phase course, operations with resection of the intestine, and postoperative complications such as pulmonary embolism, myocardial infarction, pleurisy, and early intestinal obstruction. Changes in the electrolyte composition of the blood and albumin below 35% are high-risk prognostic factors for the course of peritonitis and require immediate correction in the pre- and postoperative periods. The combination of three or more risk factors across different systems creates a negative outlook for further treatment and for the patient's life.


2020 ◽  
Vol 20 (1) ◽  
Author(s):  
Simon P. Kigozi ◽  
Ruth N. Kigozi ◽  
Catherine M. Sebuguzi ◽  
Jorge Cano ◽  
Damian Rutazaana ◽  
...  

Abstract Background As global progress to reduce malaria transmission continues, it is increasingly important to track changes in malaria incidence rather than prevalence. Risk estimates for Africa have largely underutilized available health management information systems (HMIS) data to monitor trends. This study uses national HMIS data, together with environmental and geographical data, to assess spatial-temporal patterns of malaria incidence at facility catchment level in Uganda over a recent 5-year period. Methods Data reported by 3446 health facilities in Uganda between July 2015 and September 2019 were analysed. To assess the geographic accessibility of the health facility network, AccessMod was employed to determine a three-hour cost-distance catchment around each facility. Using confirmed malaria cases and total catchment population by facility, an ecological Bayesian conditional autoregressive spatial-temporal Poisson model was fitted to generate monthly posterior incidence rate estimates, adjusted for caregiver education, rainfall, land surface temperature, night-time light (an indicator of urbanicity), and vegetation index. Results An estimated 38.8 million (95% Credible Interval [CI]: 37.9–40.9) confirmed cases of malaria occurred over the period, with a national mean monthly incidence rate of 20.4 (95% CI: 19.9–21.5) cases per 1000, ranging from 8.9 (95% CI: 8.7–9.4) to 36.6 (95% CI: 35.7–38.5) across the study period. Strong seasonality was observed, with June–July experiencing the highest peaks and February–March the lowest. There was also considerable geographic heterogeneity in incidence, with health facility catchment relative risk during peak transmission months ranging from 0 to 50.5 (95% CI: 49.0–50.8) times higher than the national average. Both districts and health facility catchments showed significant positive spatial autocorrelation; health facility catchments had a global Moran’s I = 0.3 (p < 0.001) and districts Moran’s I = 0.4 (p < 0.001). Notably, significant clusters of high-risk health facility catchments were concentrated in the Acholi, West Nile, Karamoja, and East Central – Busoga regions. Conclusion Findings showed clear countrywide spatial-temporal patterns, with clustering of malaria risk across districts and health facility catchments within high-risk regions, which can facilitate targeting of interventions to the areas at highest risk. Moreover, despite high and perennial transmission, the seasonality of malaria incidence highlights the potential for optimal and timely implementation of targeted interventions.
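The catchment-level model described here is a Bayesian conditional autoregressive (CAR) Poisson regression. The sketch below shows one generic way to encode a proper CAR random effect in PyMC via its precision matrix; the adjacency matrix, counts, populations, and covariates are hypothetical inputs, and this is not the authors' exact specification (which was spatial-temporal and fitted monthly).

```python
# Minimal sketch: Poisson regression with a proper CAR spatial random effect for
# catchment-level malaria counts. W is a symmetric binary adjacency matrix, y the
# confirmed case counts, pop the catchment populations, X standardized covariates.
import numpy as np
import pymc as pm

def build_car_poisson(W, y, pop, X):
    n = W.shape[0]
    D = np.diag(W.sum(axis=1))  # diagonal matrix of neighbour counts
    with pm.Model() as model:
        intercept = pm.Normal("intercept", 0.0, 5.0)
        beta = pm.Normal("beta", 0.0, 1.0, shape=X.shape[1])
        tau = pm.Gamma("tau", 1.0, 1.0)         # precision of the spatial effect
        alpha = pm.Uniform("alpha", 0.0, 0.99)  # strength of spatial dependence
        # Proper CAR prior written through its precision matrix tau * (D - alpha * W)
        phi = pm.MvNormal("phi", mu=np.zeros(n), tau=tau * (D - alpha * W))
        log_rate = intercept + pm.math.dot(X, beta) + phi + np.log(pop / 1000.0)
        pm.Poisson("cases", mu=pm.math.exp(log_rate), observed=y)
    return model
```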


2016 ◽  
Vol 78 (11-3) ◽  
Author(s):  
Noor Khairiah A. Karim ◽  
Rohayu Hami ◽  
Nur Hashamimi Hashim ◽  
Nizuwan Azman ◽  
Ibrahim Lutfi Shuaib

Risk factors for breast cancer among women, such as genetic factors, family history and lifestyle, can be used to classify women into high-, intermediate- and average-risk groups. Determining these risk factors may help in preventing breast cancer occurrence. In addition, breast cancer screening, which includes mammography, promotes early detection. Breast magnetic resonance imaging (MRI) has been recommended as a supplemental screening tool in high-risk women. The aim of this study was to identify the significant risk factors for breast cancer among women and to determine the usefulness of breast MRI as an adjunct to mammography in the detection of breast cancer in high-risk women. This retrospective cohort study was conducted using data from patients who underwent mammography for screening or diagnostic purposes at the Advanced Medical and Dental Institute, Universiti Sains Malaysia, from 2007 until 2015. Data from 289 subjects were successfully retrieved and analysed based on their breast cancer risk factors. Data from 120 high-risk subjects who underwent both mammography and breast MRI were further analysed. There were two significant risk factors for breast cancer in the study population: family history of breast cancer (p-value = 0.012) and previous history of breast or ovarian cancer (p-value < 0.001). Breast MRI demonstrated high sensitivity (90%) while mammography demonstrated high specificity (80%) in the detection of breast cancer in the 120 subjects. The number of breast cancers detected using breast MRI (46; 38.3%) was higher than with mammography (24; 20.0%). However, breast MRI was found to be non-significant as an adjunct tool to mammography in detecting breast cancer in high-risk women (p-value = 0.189). A comprehensive screening guideline and surveillance of women at high risk would be useful and should be implemented to increase the cancer detection rate at an early stage.
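The sensitivity and specificity figures quoted above follow directly from a modality's 2x2 confusion matrix. The sketch below illustrates the calculation with hypothetical counts chosen only to reproduce the reported 90% and 80% figures; the study's actual counts are not given in the abstract.

```python
# Minimal sketch: sensitivity and specificity from a 2x2 confusion matrix.
# The counts are hypothetical placeholders, not the study's data.
def sensitivity_specificity(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)  # proportion of cancers correctly detected
    specificity = tn / (tn + fp)  # proportion of non-cancers correctly ruled out
    return sensitivity, specificity

sens, spec = sensitivity_specificity(tp=45, fn=5, tn=56, fp=14)  # hypothetical counts
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")     # 90%, 80%
```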


2015 ◽  
Vol 2015 ◽  
pp. 1-7
Author(s):  
Kyung-Hee Kim ◽  
Min-Hee Kim ◽  
Ye-Jee Lim ◽  
Ihn Suk Lee ◽  
Ja-Seong Bae ◽  
...  

Background. The measurement of stimulated thyroglobulin (sTg) after total thyroidectomy and remnant radioactive iodine (RAI) ablation is the gold standard for monitoring disease status in patients with papillary thyroid carcinomas (PTCs). The aim of this study was to determine whether sTg measurement during follow-up can be avoided in intermediate- and high-risk PTC patients. Methods. A total of 346 patients with PTCs with an intermediate or high risk of recurrence were analysed. All of the patients underwent total thyroidectomy as well as remnant RAI ablation and sTg measurements. Preoperative and postoperative parameters were included in the analysis. Results. Among the preoperative parameters, age below 45 years and preoperative Tg above 19.4 ng/mL were significant risk factors for predicting detectable sTg during follow-up. Among the postoperative parameters, thyroid capsular invasion, lymph node metastasis, and ablative Tg above 2.9 ng/mL were independently correlated with a detectable sTg range. The combination of ablative Tg less than 2.9 ng/mL with pre- and postoperative independent risk factors for detectable sTg increased the negative predictive value for detectable sTg up to 98.5%. Conclusions. Based on pre- and postoperative parameters, a substantial proportion of patients with PTCs in the intermediate- and high-risk classes could avoid aggressive follow-up measures.
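The 98.5% figure above is a negative predictive value (NPV), i.e. the proportion of patients with a negative combined risk profile who indeed had undetectable stimulated Tg on follow-up. A short illustration with hypothetical counts:

```python
# Minimal sketch: negative predictive value (NPV). The counts are hypothetical
# placeholders chosen only to illustrate the formula behind the reported ~98.5%.
def negative_predictive_value(true_negatives, false_negatives):
    return true_negatives / (true_negatives + false_negatives)

npv = negative_predictive_value(true_negatives=128, false_negatives=2)
print(f"NPV = {npv:.1%}")  # 98.5%
```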


Blood ◽  
2011 ◽  
Vol 118 (21) ◽  
pp. 3811-3811
Author(s):  
Drorit Merkel ◽  
Kalman Filanovsky ◽  
Ariel Aviv ◽  
Moshe E. Gatt ◽  
Yair Herishanu ◽  
...  

Abstract 3811 Background: Azacitidine is an effective therapy for high-risk myelodysplastic syndrome (MDS). Neutropenic fever is a common life-threatening complication during azacitidine therapy; however, predicting it is challenging. Despite a number of large-scale prospective studies, there are no established indications for primary or secondary prophylactic antibiotics or for the use of granulocyte colony-stimulating factor (G-CSF) (Pierre Fenaux et al., Leukemia Research 2010). We used a retrospective survey of 98 high-risk MDS and AML patients treated with azacitidine to develop a model predicting infection during each cycle of azacitidine therapy. Methods: We retrospectively studied 82 high-risk MDS and 16 AML patients treated with 456 azacitidine cycles between September 2008 and July 2011 at 11 institutions in Israel. Information on complete blood count, creatinine and liver enzymes was documented prior to initiation of each cycle. Results: The patients' median age was 71 (range 27–92) and 57 (58%) were male. Poor-risk cytogenetic abnormalities were detected in 30.8% (25 of 82 patients with available cytogenetics) and 65 (67%) were transfusion dependent. The median interval between the initial diagnosis and the initiation of azacitidine therapy was 187 days (range 4 days – 18 years). Azacitidine was administered as first-line therapy in 24 (24%) of patients, 37 (38%) had failed growth factors, 5 (5%) were relapsing after allogeneic transplantation, and 32 (33%) had been given other chemotherapies prior to azacitidine therapy. Dose and schedule data for azacitidine were available for 98% (446/456) of cycles. The prevalence of 7-day cycles of 75 mg/m2, 5-day cycles of 75 mg/m2, and attenuated doses was 50.4%, 30%, and 16.9%, respectively. Adverse events were obtained from patients' charts. Thirteen major bleeding and 78 infection episodes (2.85% and 16.9% of all cycles) were recorded. Due to the low number of bleeding events, we focused on factors predicting infection episodes. Infection rates of 22.7%, 14.2% and 6.9% correlated with azacitidine dose (75 mg/m2 for 7 days, 75 mg/m2 for 5 days, and lower doses, respectively). Excluding 87 cycles of doses lower than 75 mg/m2 for 5 days, predictors of infection were evaluated in 369 cycles. Nine parameters were included in the final analysis: age, sex, cytogenetics, being transfusion dependent prior to the first cycle, time from diagnosis to the first cycle, azacitidine dose, and neutrophil, platelet and creatinine values prior to each cycle. The odds ratio for infection related to total neutrophil count was higher than that for ANC, so we used neutrophil count as a predictor. For each cycle we considered full 7-day versus 5-day schedule, neutrophils above or below 500 cells/mcl, platelets above or below 20,000 cells/mcl, and creatinine level prior to the first day of the cycle. In univariate analysis, neutrophils below 500, platelets below 20,000, creatinine level, azacitidine dose and being transfusion dependent were correlated with infection. In a multivariate analysis (table 1), transfusion dependency and platelets lower than 20,000 were the only significant parameters. The risk of infection was higher when a full seven-day cycle was administered but did not reach statistical significance (p=0.07). Conclusions: Transfusion dependency prior to the first cycle and platelets lower than 20,000 prior to each cycle are the main significant risk factors for infection during azacitidine therapy. Neutropenia and age are known risk factors for infection in general but were not significant in our study.
We assume that in high-risk MDS patients, where most of the patients are old and neutropenic, thrombocytopenia is a surrogate marker of disease status that makes the patient more prone to infection. Therefore, physicians should consider these two parameters prior to every azacitidine cycle as guidance in the debate over concurrent prophylactic antibiotics, G-CSF or a tolerable dose of azacitidine. Our findings should be confirmed in a larger sample set but may pave the road for prospective studies of infection prophylaxis during azacitidine therapy. Disclosures: No relevant conflicts of interest to declare.
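The per-cycle risk analysis summarised above (univariate screening followed by multivariate modelling of infection) can be sketched with an ordinary logistic regression; the data file and column names below are hypothetical placeholders, and details such as accounting for repeated cycles per patient are not reproduced.

```python
# Minimal sketch: per-cycle logistic regression for infection during azacitidine
# therapy, using the predictors highlighted in the abstract. Column names are
# hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

cycles = pd.read_csv("azacitidine_cycles.csv")  # hypothetical per-cycle dataset

model = smf.logit(
    "infection ~ transfusion_dependent + platelets_below_20k"
    " + neutrophils_below_500 + full_7day_dose",
    data=cycles,
)
result = model.fit()
print(result.summary())
print(np.exp(result.params))  # odds ratios for each predictor
```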

