Abstract 17083: Hunting For Clot: Strokes In Patients With Cardiac Amyloid

Circulation ◽  
2021 ◽  
Vol 144 (Suppl_2) ◽  
Author(s):  
Douglas Kyrouac ◽  
Kartik Agusala

Case Presentation: A 76-year-old female with a history of amyloid light chain (AL) cardiac amyloidosis and prior stroke presented with acute bilateral leg weakness. MRI of the brain revealed subacute infarcts in the right basal ganglia and precentral gyrus, as well as chronic left cerebellar infarctions. CT angiogram of the head and neck did not show any significant intracranial or carotid artery disease. All current and previous admission ECGs and telemetry showed sinus rhythm with no atrial arrhythmias. Transthoracic echocardiogram with bubble study noted an LV ejection fraction of 62%, normal left atrial size, and no evidence of an intracardiac shunt. Despite the lack of traditional indicators, a cardioembolic source was suspected given the high risk of intracardiac thrombi in AL amyloid patients. A TEE was performed and confirmed a large left atrial appendage (LAA) thrombus with severely reduced velocities and spontaneous echo contrast (Figure 1). The patient was started on apixaban for secondary stroke prevention. Discussion: Amyloid proteins deposit throughout heart tissue, distorting mechanical function and electrical conduction, resulting in reduced blood flow and a risk of intracardiac thrombi roughly 10 times that of nonamyloid patients. Patients with cardiac amyloidosis have a high prevalence of atrial arrhythmias, particularly atrial fibrillation, but remain at significant risk of thrombi (3-20%) even in the absence of these arrhythmias. Risk factors for clot include AL-type amyloid and features of worsened hemodynamics such as LV systolic and diastolic dysfunction, elevated heart rates, and lower systolic blood pressure. This case demonstrates the importance of aggressively “hunting” for intracardiac thrombi in amyloid patients presenting with stroke, even without perceived risk factors, especially given the protective effect of starting anticoagulation.

Curationis ◽  
1978 ◽  
Vol 1 (3) ◽  
Author(s):  
J.V. Larsen

It has recently been demonstrated that about 56 percent of patients delivering in a rural obstetric unit had significant risk factors, and that 85 percent of these could have been detected by meticulous antenatal screening before the onset of labour. These figures show that the average rural obstetric unit in South Africa is dealing with a large percentage of high-risk patients. In this work, it is hampered by: 1. Communication problems: i.e. bad roads, long distances, and unpredictable telephones. 2. A serious shortage of medical staff, resulting in primary obstetric care being delivered by midwives with minimal medical supervision.


2008 ◽  
Vol 18 (2) ◽  
pp. 357-362 ◽  
Author(s):  
W.-G. Lu ◽  
F. Ye ◽  
Y.-M. Shen ◽  
Y.-F. Fu ◽  
H.-Z. Chen ◽  
...  

This study was designed to analyze the outcomes of chemotherapy for high-risk gestational trophoblastic neoplasia (GTN) with the EMA-CO regimen as primary and secondary protocol in China. Fifty-four patients with high-risk GTN received 292 EMA-CO treatment cycles between 1996 and 2005. Forty-five patients were primarily treated with EMA-CO, and nine were treated secondarily after failure of other combination chemotherapy. Adjuvant surgery and radiotherapy were used in selected patients. Response, survival and related risk factors, as well as chemotherapy complications, were retrospectively analyzed. Thirty-five of forty-five patients (77.8%) receiving EMA-CO as first-line treatment achieved complete remission, as did 77.8% (7/9) of those receiving it as secondary treatment. The overall survival rate was 87.0% among all high-risk GTN patients: 93.3% (42/45) with primary therapy and 55.6% (5/9) with secondary therapy. The survival rates differed significantly between the two groups (χ² = 6.434, P = 0.011). Univariate analysis showed that the metastatic site and the number of metastatic organs were significant risk factors, but binomial logistic regression analysis revealed that only the number of metastatic organs was an independent risk factor for survival. No life-threatening toxicity or secondary malignancy was found. The EMA-EP regimen was used for ten patients who were resistant to EMA-CO and three who relapsed after EMA-CO; of these, 11 patients (84.6%) achieved complete remission. We conclude that the EMA-CO regimen is an effective and safe primary therapy for high-risk GTN, but not an appropriate second-line protocol. The number of metastatic organs is an independent prognostic factor for patients with high-risk GTN. The EMA-EP regimen is a highly effective salvage therapy for those failing EMA-CO.


Author(s):  
І. К. Чурпій

To optimize therapeutic tactics and improve the treatment of peritonitis, a retrospective analysis identified the following significant risk factors: female gender, age 60-90 years, time to hospitalization of more than 48 hours, a history of myocardial infarction, stroke, or cardiac arrhythmia, biliary, fecal, or fibrinopurulent exudate, terminal-phase course, operations involving intestinal resection, and postoperative complications such as pulmonary embolism, myocardial infarction, pleurisy, and early intestinal obstruction. Changes in the electrolyte composition of the blood and albumin below 35% indicate a high-risk prognostic course of peritonitis that requires immediate correction in the pre- and postoperative periods. The combination of three or more risk factors across different systems creates a negative outlook for further treatment and for the patient's life.


2016 ◽  
Vol 78 (11-3) ◽  
Author(s):  
Noor Khairiah A. Karim ◽  
Rohayu Hami ◽  
Nur Hashamimi Hashim ◽  
Nizuwan Azman ◽  
Ibrahim Lutfi Shuaib

The risk factors of breast cancer among women, such as genetic, family history, and lifestyle factors, can be divided into high, intermediate, and average risk. Determining these risk factors may help in preventing breast cancer occurrence. In addition, breast cancer screening, which includes mammography, can promote early detection. Breast magnetic resonance imaging (MRI) has been recommended as a supplemental screening tool in high-risk women. The aim of this study was to identify the significant risk factors of breast cancer among women and to determine the usefulness of breast MRI as an adjunct to mammography in the detection of breast cancer in high-risk women. This retrospective cohort study was conducted using data from patients who underwent mammography for screening or diagnostic purposes at the Advanced Medical and Dental Institute, Universiti Sains Malaysia, from 2007 until 2015. Data from 289 subjects were successfully retrieved and analysed based on their breast cancer risk factors. Data from the 120 subjects who had high risk and underwent both mammography and breast MRI were further analysed. There were two significant risk factors of breast cancer in the study population: family history of breast cancer (p-value = 0.012) and previous history of breast or ovarian cancer (p-value < 0.001). Breast MRI demonstrated high sensitivity (90%) while mammography demonstrated high specificity (80%) in the detection of breast cancer in all 120 subjects. The number of breast cancers detected using breast MRI [46 (38.3%)] was higher than with mammography [24 (20.0%)]. However, breast MRI was found to be non-significant as an adjunct tool to mammography in detecting breast cancer in high-risk women (p-value = 0.189). A comprehensive screening guideline and surveillance of women at high risk would be useful and should be implemented to increase the cancer detection rate at an early stage.
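The sensitivity and specificity figures quoted above follow directly from the standard confusion-matrix definitions; a minimal sketch (function names are illustrative, not taken from the study):

```python
def sensitivity(true_pos, false_neg):
    # Proportion of actual cancers the test correctly flags (true-positive rate).
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    # Proportion of cancer-free subjects the test correctly clears (true-negative rate).
    return true_neg / (true_neg + false_pos)

# Example: 9 of 10 cancers detected, 8 of 10 healthy subjects cleared.
print(sensitivity(9, 1), specificity(8, 2))  # 0.9 0.8
```

A test with high sensitivity (like MRI here) misses few cancers; a test with high specificity (like mammography here) raises few false alarms, which is why the two are often combined.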


2015 ◽  
Vol 2015 ◽  
pp. 1-7
Author(s):  
Kyung-Hee Kim ◽  
Min-Hee Kim ◽  
Ye-Jee Lim ◽  
Ihn Suk Lee ◽  
Ja-Seong Bae ◽  
...  

Background. The measurement of stimulated thyroglobulin (sTg) after total thyroidectomy and remnant radioactive iodine (RAI) ablation is the gold standard for monitoring disease status in patients with papillary thyroid carcinomas (PTCs). The aim of this study was to determine whether sTg measurement during follow-up can be avoided in intermediate- and high-risk PTC patients. Methods. A total of 346 patients with PTCs with an intermediate or high risk of recurrence were analysed. All of the patients underwent total thyroidectomy as well as remnant RAI ablation and sTg measurements. Preoperative and postoperative parameters were included in the analysis. Results. Among the preoperative parameters, age below 45 years and preoperative Tg above 19.4 ng/mL were significant risk factors for predicting detectable sTg during follow-up. Among the postoperative parameters, thyroid capsular invasion, lymph node metastasis, and ablative Tg above 2.9 ng/mL were independently correlated with a detectable sTg range. The combination of ablative Tg less than 2.9 ng/mL with pre- and postoperative independent risk factors for detectable sTg increased the negative predictive value for detectable sTg up to 98.5%. Conclusions. Based on pre- and postoperative parameters, a substantial proportion of patients with PTCs in the intermediate- and high-risk classes could avoid aggressive follow-up measures.


Blood ◽  
2011 ◽  
Vol 118 (21) ◽  
pp. 3811-3811
Author(s):  
Drorit Merkel ◽  
Kalman Filanovsky ◽  
Ariel Aviv ◽  
Moshe E. Gatt ◽  
Yair Herishanu ◽  
...  

Abstract 3811 Background: Azacitidine is an effective therapy for high-risk myelodysplastic syndrome (MDS). Neutropenic fever is a common, life-threatening complication during azacitidine therapy, but predicting it is challenging. Despite a number of large-scale prospective studies, there are no established indications for primary or secondary prophylactic antibiotics or for the use of granulocyte colony-stimulating factor (G-CSF) (Pierre Fenaux et al., Leukemia Research 2010). We used a retrospective survey of 98 high-risk MDS and AML patients treated with azacitidine to develop a model predicting infection during each cycle of azacitidine therapy. Methods: We retrospectively studied 82 high-risk MDS and 16 AML patients treated with 456 azacitidine cycles between 9.2008 and 7.2011 at 11 institutions in Israel. Complete blood count, creatinine, and liver enzymes were documented prior to initiation of each cycle. Results: Patients' median age was 71 (range 27–92) and 57 (58%) were male. Poor cytogenetic abnormalities were detected in 30.8% (25 of 82 patients with available cytogenetics) and 65 (67%) were transfusion dependent. The median interval between initial diagnosis and initiation of azacitidine therapy was 187 days (range 4 days – 18 years). Azacitidine was administered as first-line therapy in 24 (24%) of patients, 37 (38%) had failed growth factors, 5 (5%) were relapsing after allogeneic transplantation, and 32 (33%) had received other chemotherapies prior to azacitidine therapy. Dose and schedule data were available for 98% (446/456) of cycles. The prevalence of 7-day cycles of 75 mg/m², 5-day cycles of 75 mg/m², and attenuated doses was 50.4%, 30%, and 16.9%, respectively. Adverse events were obtained from patients' charts. Thirteen major bleeding episodes and 78 infection episodes (2.85% and 16.9% of all cycles) were recorded.
Due to the low number of bleeding events, we focused on factors predicting infection episodes. Infection rates were 22.7%, 14.2%, and 6.9% for the 75 mg/m² × 7-day, 75 mg/m² × 5-day, and attenuated-dose schedules, respectively. Excluding 87 cycles with doses lower than 75 mg/m² for 5 days, predictors of infection were evaluated in 369 cycles. Nine parameters were included in the final analysis: age, sex, cytogenetics, transfusion dependence prior to the first cycle, time from diagnosis to the first cycle, azacitidine dose, and neutrophil, platelet, and creatinine values prior to each cycle. The odds ratio for infection related to total neutrophil count was higher than that for ANC, so we used neutrophil count as a predictor. For each cycle we considered full 7-day vs 5-day schedule, neutrophils above or below 500 cells/µl, platelets above or below 20,000 cells/µl, and creatinine level prior to the first day of the cycle. In univariate analysis, neutrophils below 500, platelets below 20,000, creatinine level, azacitidine dose, and transfusion dependence were correlated with infection. In multivariate analysis (table 1), transfusion dependence and platelets lower than 20,000 were the only significant parameters. The risk of infection was higher when a full seven-day cycle was administered but did not reach statistical significance (p = 0.07). Conclusions: Transfusion dependence prior to the first cycle and platelets lower than 20,000 prior to each cycle are the main significant risk factors for infection during azacitidine therapy. Neutropenia and age are known risk factors for infection in general but were not significant in our study. We assume that in high-risk MDS patients, where most patients are old and neutropenic, thrombocytopenia is a surrogate marker of disease status that makes the patient more prone to infection.
Therefore, physicians should consider these two parameters prior to every azacitidine cycle as guidance in the debate over concurrent prophylactic antibiotics, G-CSF, or a tolerable dose of azacitidine. Our findings should be confirmed in a larger sample set but may pave the road for prospective studies of infection prophylaxis during azacitidine therapy. Disclosures: No relevant conflicts of interest to declare.


Blood ◽  
2015 ◽  
Vol 126 (23) ◽  
pp. 3741-3741
Author(s):  
Ursula Creutzig ◽  
Claudia Rossig ◽  
Michael Dworzak ◽  
Arendt von Stackelberg ◽  
Wilhelm Woessmann ◽  
...  

Background. The risk of early death (ED) due to bleeding and/or leukostasis is high in AML patients with initial hyperleukocytosis (white blood cell count [WBC] > 100 000/µl) and highest in those with hyperleukocytosis and mono- or myelomonocytic leukemia (FAB M4/M5) (Creutzig, et al 1987). Within the AML-BFM studies, emergency strategies for children with AML and high risk for bleeding and leukostasis included exchange transfusion (ET) or leukapheresis (LPh). To determine whether these interventions reduced the rate of ED, 1251 AML-BFM patients from the trials AML-BFM 98 and 04 were analyzed. Risk factors for ED and the interventions performed were verified, focusing on patients with hyperleukocytosis. Patients. 238 of 1251 (19%) AML patients <18 years of age (FAB M3 excluded) presented with hyperleukocytosis. Twenty-three of 1251 (1.8%) patients died of bleeding and leukostasis within 15 days from diagnosis; 18 (78%) of these 23 ED patients had hyperleukocytosis. Seventy-two patients received ET and 17 LPh (including 14 patients with WBC counts < 100 000/µl). 149 patients with hyperleukocytosis did not receive ET/LPh. The median age of patients receiving ET was significantly lower compared to those with LPh (3.5 years vs 12.6 years, p = 0.015). WBC counts were similar in both treatment groups (ET median 224 000/µl vs LPh 218 000/µl, p = 0.20). Results. The percentage of ED by bleeding/leukostasis increased with higher WBC counts and was highest in the 105 patients with WBC > 200 000/µl (14.3%). The ED rates were even higher in patients with FAB M4/M5 and hyperleukocytosis > 200 000/µl compared to others with WBC > 200 000/µl (M4/M5 13/65 [20%] vs. others 2/40 [5%], p = 0.04). Patients with WBC counts > 200 000/µl did slightly better with ET or LPh compared to those without ET/LPh (ED rate 7.5% vs 21.2%, p = 0.055). Patients with WBC between 100 000/µl and 200 000/µl received ET/LPh less frequently compared to those with WBC > 200 000/µl (22/133 [17%] vs. 53/105 [50%]). ET/LPh was mainly given in case of clinical symptoms of bleeding or leukostasis, coagulopathies, or insufficient reduction (or increase) of WBC counts despite low-dose chemotherapy. 15/80 (19%) patients with FAB M4/M5 and WBC 100 000-200 000/µl received ET/LPh, and none of these patients died early. ET/LPh was even given in 14 patients with WBC < 100 000/µl because of clinical symptoms of bleeding or leukostasis, coagulopathies, or rising WBC counts. In multivariate analysis, WBC > 200 000/µl was the strongest independent risk factor for ED (hazard ratio = 15.0, 95% confidence interval 4.9-46.3, p < 0.0001). FAB M4/M5 subtypes, general condition grade 4, and initial bleeding were also significant risk factors. Application of ET/LPh showed a non-significant trend toward a reduced ED rate by bleeding/leukostasis (p = 0.13). The assessment of the general clinical condition of the patients plays a major role in the decision for ET/LPh. However, this possibly selective cofactor could not be fully included in our calculation because of a lack of standardized assessments and documentation of the clinical reasons for decision making, in particular in patients without ET/LPh. There was no difference in ED rates between ET and LPh (2/17 vs 5/72, p = 0.61). Compared to LPh, ET can be given without time delay. ET is easier to perform, especially in young children, who need smaller exchange volumes, as well as in adolescents, where it can also be applied as a partial ET. ET also corrects metabolic disturbances and avoids deterioration of coagulation. Conclusion. Our data confirm the high risk of bleeding/leukostasis in patients with hyperleukocytosis.
Although we could only show a trend toward a clinical benefit of ET/LPh in this retrospective analysis - probably due to some negative selection bias in patients with the intervention - we strongly advocate ET/LPh in AML patients with WBC > 200 000/µl, and in particular in those with FAB M4/M5 subtypes or with clinical symptoms of bleeding, leukostasis, or coagulopathies even with lower WBC (100 000/µl - 200 000/µl). Creutzig, U. et al. (1987) Early deaths due to hemorrhage and leukostasis in childhood acute myelogenous leukemia: Associations with hyperleukocytosis and acute monocytic leukemia. Cancer, 60, 3071-3079. Disclosures: No relevant conflicts of interest to declare.


2018 ◽  
Vol 21 (6) ◽  
pp. E489-E496 ◽  
Author(s):  
Sophie Z Lin ◽  
Todd C Crawford ◽  
Alejandro Suarez-Pierre ◽  
J Trent Magruder ◽  
Michael V Carter ◽  
...  

Background: Atrial fibrillation (AF) is common after cardiac surgery and contributes to increased morbidity and mortality. Our objective was to derive and validate a predictive model for AF after CABG, incorporating novel echocardiographic and laboratory values. Methods: We retrospectively reviewed patients at our institution without preexisting dysrhythmia who underwent on-pump, isolated CABG from 2011-2015. The primary outcome was new-onset AF lasting > 1 hour on continuous telemetry or requiring medical treatment. Patients with a preoperative echocardiographic measurement of left atrial diameter were included in a risk model and were randomly divided into derivation (80%) and validation (20%) cohorts. The Predictors of AF After CABG (PAFAC) score was derived from a multivariable logistic regression model by multiplying the adjusted odds ratios of significant risk factors (P < .05) by a factor of 4 to derive an integer point system. Results: 1307 patients underwent isolated CABG, including 762/1307 with a preoperative left atrial diameter measurement. 209/762 patients (27%) developed new-onset AF, including 165/611 (27%) in the derivation cohort. We identified four risk factors independently associated with postoperative AF, which comprised the PAFAC score: age > 60 years (5 points), White race (5 points), baseline GFR < 90 mL/min (4 points), and left atrial diameter > 4.5 cm (4 points). Scores ranged from 0-18. The PAFAC score was then applied to the validation cohort, and the predicted incidence of AF strongly correlated with the observed incidence (r = 0.92). Conclusion: The PAFAC score is easy to calculate and can be used upon ICU admission to reliably identify patients at high risk of developing AF after isolated CABG.
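The point system described above is simple enough to encode directly; a minimal sketch of the published cut-offs (the function and parameter names are illustrative, not from the paper):

```python
def pafac_score(age_years, white_race, gfr_ml_min, la_diameter_cm):
    """Sum the PAFAC points described in the abstract (possible range 0-18)."""
    score = 0
    if age_years > 60:
        score += 5  # age > 60 years
    if white_race:
        score += 5  # White race
    if gfr_ml_min < 90:
        score += 4  # baseline GFR < 90 mL/min
    if la_diameter_cm > 4.5:
        score += 4  # left atrial diameter > 4.5 cm
    return score

# Example: a 72-year-old White patient with GFR 75 mL/min and LA diameter 4.8 cm.
print(pafac_score(72, True, 75, 4.8))  # 18
```

Multiplying adjusted odds ratios by a constant and rounding to integers is a common way to turn a logistic regression into a bedside score; the thresholds here are exactly those the abstract reports.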


2021 ◽  
Author(s):  
Sudarat Chadsuthi ◽  
Karine Chalvet-Monfray ◽  
Suchada Geawduanglek ◽  
Phrutsamon Wongnak ◽  
Julien Cappelle

Leptospirosis is a globally important zoonotic disease, particularly in tropical and subtropical countries. Infections in humans can be caused by exposure to infected animals or to contaminated soil or water, which are suitable environments for Leptospira. To explore cluster areas, the Global Moran's I index was calculated for incidences per 100,000 population at the province level during 2012-2018, using monthly and annual data. High-risk and low-risk provinces were identified using local indicators of spatial association (LISA). The risk factors for leptospirosis were evaluated using a generalized linear mixed model (GLMM) with zero-inflation. We also added spatial and temporal correlation terms to take the spatial and temporal structures into account. The Global Moran's I index showed significant positive values, indicating that the distribution was not random throughout the study period. The high-risk provinces were almost all in the lower north-east and south parts of Thailand. For yearly reported cases, the significant risk factors from the final best-fitted model were population density, elevation, and primary rice arable areas. Interestingly, our study showed that leptospirosis cases were associated with large areas of rice production but were less prevalent in areas of high rice productivity. For monthly reported cases, the model using temperature range fitted better than the model using the percentage of flooded area. The significant risk factors from the model using temperature range were temporal correlation, average soil moisture, normalized difference vegetation index, and temperature range. Temperature range, which is strongly negatively correlated with the percentage of flooded area, was a significant risk factor for the monthly data. Flood exposure controls should be used to reduce the risk of leptospirosis infection.
These results could be used to develop a leptospirosis warning system to support public health organizations in Thailand.
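For reference, the Global Moran's I statistic used above has the closed form I = (N/W) Σᵢⱼ wᵢⱼ(xᵢ − x̄)(xⱼ − x̄) / Σᵢ(xᵢ − x̄)². A minimal, dependency-free sketch (not the authors' implementation, which would additionally need the actual province adjacency weights):

```python
def morans_i(values, weights):
    # values: one observation per region (e.g. incidence per 100,000).
    # weights: dict mapping (i, j) index pairs to spatial weights,
    #          e.g. 1.0 for neighbouring regions, listed in both directions.
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_total = sum(weights.values())
    num = sum(w * dev[i] * dev[j] for (i, j), w in weights.items())
    den = sum(d * d for d in dev)
    return (n / w_total) * (num / den)

# Four regions in a line (0-1-2-3), rook adjacency in both directions.
chain = {(0, 1): 1.0, (1, 0): 1.0, (1, 2): 1.0,
         (2, 1): 1.0, (2, 3): 1.0, (3, 2): 1.0}
print(morans_i([1, 1, 8, 8], chain))  # clustered values -> positive I
print(morans_i([1, 8, 1, 8], chain))  # alternating values -> negative I
```

Significantly positive I (like-valued provinces adjacent to each other) is what the abstract reports; a value near the expectation −1/(N−1) would indicate a spatially random pattern.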

