Modifiable factors that influence colorectal cancer lymph node sampling and examination

2009 ◽  
Vol 27 (15_suppl) ◽  
pp. 4047-4047
Author(s):  
M. E. Valsecchi ◽  
J. Leighton ◽  
W. Tester

4047 Background: Colorectal cancer is the fourth most common malignancy in the United States. The single most important prognostic factor is lymph node involvement. Multiple guidelines recommend that a minimum of 12 nodes be sampled in order to ensure accurate staging and treatment. However, this standard of care is not always achieved. The objective of this study was to identify potentially modifiable factors that may explain this gap between the optimal approach and routine practice. Methods: The medical charts of all patients treated for stage I-III colorectal cancer between 1999 and 2007 at Albert Einstein Medical Center were reviewed. The association between multiple variables and the reporting of ≥12 lymph nodes was examined using logistic regression models. Results: A total of 337 patients were included; 173 (51%) had ≥12 lymph nodes retrieved, with a mean of 12.7 (SD ±7.6). Demographic characteristics: 78% were older than 60 years; 161 patients (47.8%) were male; 27% were white, 67% black, and 6% other race. On univariate analysis, the following variables were statistically associated with ≥12 lymph nodes reported: colon size (20.6±14.7 vs. 29.9±23.1 cm, P<.001); mesocolon thickness (3.8±0.9 vs. 4.2±0.9 cm, P<.001); tumor size (4.14±2.3 vs. 4.6±2.1 cm, P=.03); site of tumor (right vs. left, P<.001); pathologist (P=.06); pathologist's assistant (P=.006); type of surgery (right or subtotal colectomy vs. others, P<.001); and individual surgeon (P=.009). The results of the multivariate logistic regression analysis, adjusted for age, sex, and race, are presented in the Table. Conclusions: This study showed that multiple factors influence the number of lymph nodes sampled. The roles of the surgeon, the pathologist, and especially the pathologist's assistant are potentially improvable with appropriate education. [Table: see text] No significant financial relationships to disclose.
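The univariate-then-multivariate workflow described above can be sketched in code. The following is a minimal illustration on synthetic data (the variable names and effect sizes are invented for illustration, not taken from the study), fitting a multivariate logistic model by Newton-Raphson and reading off adjusted odds ratios:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the study variables (names and effects are invented):
n = 400
colon_len = rng.normal(25.0, 8.0, n)                 # specimen length in cm
right_sided = rng.integers(0, 2, n).astype(float)    # 1 = right-sided tumor
true_logit = -3.0 + 0.08 * colon_len + 0.9 * right_sided
y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

# Design matrix with intercept; fit by Newton-Raphson, which is what
# glm()/statsmodels' Logit do internally for logistic regression.
X = np.column_stack([np.ones(n), colon_len, right_sided])
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (y - p)                          # score vector
    hess = X.T @ (X * (p * (1 - p))[:, None])     # observed information
    beta += np.linalg.solve(hess, grad)

odds_ratios = np.exp(beta[1:])   # adjusted ORs: per cm, and right vs. left side
```

In practice one would use statsmodels' `Logit` or R's `glm` rather than a hand-rolled solver; the explicit loop is only to make the mechanics behind the reported ORs visible.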

2017 ◽  
Vol 102 (3-4) ◽  
pp. 102-108
Author(s):  
Shiki Fujino ◽  
Norikatsu Miyoshi ◽  
Masayuki Ohue ◽  
Masayoshi Yasui ◽  
Keijiro Sugimura ◽  
...  

In colorectal cancer (CRC), the possibility of lymph node (LN) metastasis is an important consideration when deciding on treatment. We developed a nomogram for predicting LN metastasis in submucosal (SM) CRC. The medical records of 509 patients with SM CRC treated from 1984 to 2012 were retrospectively investigated. All patients underwent curative surgical resection at the Osaka Medical Center for Cancer and Cardiovascular Diseases; 113 patients with inadequate data were excluded. Using a group of 293 patients who underwent surgery from 1984 to 2008, a logistic regression model was used to develop a prediction model for LN metastasis. The prediction model was then validated in a separate group of 103 patients who underwent surgery from 2009 to 2012. Univariate analysis of pathologic factors showed the influence of low histologic grade (muc, por, sig; P < 0.001), positive lymphatic invasion (P < 0.001), positive vascular invasion (P = 0.036), and tumor SM invasion depth (P = 0.098) on LN metastasis. Using these variables, a nomogram predicting LN metastasis was constructed with a logistic regression model, with an area under the curve (AUC) of 0.717. The prediction model was validated on the external dataset in the independent patient group, with an AUC of 0.920. We developed a novel and reliable nomogram predicting LN metastasis by integrating 4 pathologic factors. This prediction model may help clinicians decide on personalized treatment following endoscopic resection.
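Externally validating a prediction model, as done here with the 2009-2012 cohort, amounts to scoring the new patients with the frozen model and recomputing the AUC. A minimal sketch on simulated data follows; the four binary factors, their weights, and the cohort sizes are assumptions for illustration, not the published nomogram:

```python
import numpy as np

def auc(scores, labels):
    """AUC as the probability that a random positive outscores a random negative
    (ties count half), i.e. the Mann-Whitney interpretation of the ROC area."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

def simulate(n, seed):
    rng = np.random.default_rng(seed)
    X = rng.integers(0, 2, (n, 4))   # grade, lymphatic inv., vascular inv., SM depth
    logit = -2.5 + X @ np.array([1.2, 1.5, 0.7, 0.5])
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)
    return X, y

X_dev, y_dev = simulate(293, 0)   # derivation cohort (analogue of 1984-2008)
X_val, y_val = simulate(103, 1)   # validation cohort (analogue of 2009-2012)

weights = np.array([1.2, 1.5, 0.7, 0.5])   # frozen model, i.e. the nomogram points
auc_dev = auc(X_dev @ weights, y_dev)
auc_val = auc(X_val @ weights, y_val)
```

The key design point is that `weights` is fixed before the validation data are touched; refitting on the validation cohort would turn external validation back into internal fitting.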


2021 ◽  
Vol 36 (Supplement_1) ◽  
Author(s):  
Sara Fernandes ◽  
Beatriz Donato ◽  
Adriana Paixão Fernandes ◽  
Luís Falcão ◽  
Mário Raimundo ◽  
...  

Abstract Background and Aims Anemia is a well-known complication of chronic kidney disease (CKD) and seems to contribute to the deterioration of kidney function. Experimental data suggest that anemia produces hypoxia of tubular cells, which leads to tubulointerstitial damage and thus CKD progression. Another proposed mechanism is that red blood cells have antioxidant properties that protect tubulointerstitial cells from oxidative-stress damage and glomerulosclerosis. Few observational studies have evaluated the association between anemia and progression of CKD. Therefore, our aim was to evaluate the association of anemia with CKD progression and related outcomes in an outpatient non-dialysis CKD (ND-CKD) population. Method We conducted a retrospective, patient-level cohort analysis of all adult ND-CKD patients evaluated in an outpatient nephrology clinic over a 6-year period, with a follow-up time of at least 12 months. Anemia was defined according to the WHO definition (hemoglobin [Hb] <13.0 g/dL in men and <12.0 g/dL in women). Progression of CKD was defined by one of the following criteria: decline in eGFR (CKD-EPI) of more than 5 mL/min/1.73 m²/year, doubling of serum creatinine, or need for renal replacement therapy. Demographic and clinical data were also assessed. Results Of 3008 patients referred to the nephrology clinic, 49.9% had anemia (mean age 71.9±15.9 years; 50.4% male; 92% white; mean follow-up time 2.3±1.2 years). The mean Hb was 11.8±1.9 g/dL. Important cardiovascular comorbidities in patients with anemia were arterial hypertension (86.7%), obesity (65.5%), diabetes mellitus (DM) (52%), and dyslipidemia (46%). In univariate analysis, mortality was associated with anemia (36.9 vs 13.0%, p<0.001), obesity (30.1 vs 21.8%, p<0.001), and DM (30.1 vs 21.1%, p<0.001). Of the patients with anemia, 738 met the criteria for CKD progression.
In univariate analysis, CKD progression was associated with anemia (49.6 vs 43.9%, p=0.002), male gender (49.5 vs 43.6%, p=0.001), DM (49.6 vs 44.8%, p=0.009), and hypertension (47.9 vs 42.3%, p=0.0018). In multivariate logistic regression analysis, anemia emerged as an independent predictor of CKD progression (OR 1.435, 95% CI 1.21-1.71, p<0.001). Comparing Hb intervals (Hb ≤10 g/dL; Hb 10-12 g/dL; Hb ≥12 g/dL) in the multivariate logistic regression analysis, Hb ≤10 g/dL was not associated with CKD progression, whereas an Hb value between 10 and 12 g/dL was (OR 1.486, 95% CI 1.23-1.80, p<0.001), compared with the group with Hb ≥12 g/dL. In multivariate logistic regression analysis, the independent predictors of mortality were older age (OR per 1-year increase: 1.048, 95% CI 1.04-1.06, p<0.001), arterial hypertension (OR 0.699, 95% CI 0.51-0.96, p=0.0029), obesity (OR 0.741, 95% CI 0.60-0.91, p=0.004), and Hb value (OR per 1 g/dL decrease: 1.301, 95% CI 1.23-1.38, p<0.001). Cardiovascular events were correlated with Hb levels between 10 and 12 g/dL (univariate analysis: OR 2.021, 95% CI 1.27-3.22, p=0.003) but not with Hb ≤10 g/dL (univariate analysis: OR 1.837, 95% CI 0.96-3.51, p=0.066), with the Hb ≥12 g/dL group as reference. Anemia was strongly associated with hospitalizations (multivariate logistic regression: OR per 1 g/dL Hb decrease: 1.256, 95% CI 1.12-1.32, p<0.001); this association was also observed in the groups with Hb ≤10 g/dL (OR 3.591, 95% CI 2.67-4.84, p<0.001) and Hb 10-12 g/dL (OR 1.678, 95% CI 1.40-2.02, p<0.001). Conclusion Our study suggests that anemia at first consultation increases the risk of rapid CKD progression and overall mortality. These findings could guide the design of future studies to determine whether anemia correction can slow the progression of CKD.
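The "OR per 1 g/dL decrease" quoted above is the exponentiated (negated) logistic coefficient for Hb, so it can be rescaled to other decrements. A small sketch using only the figures reported in the abstract; the Wald-style back-calculation of the standard error is an assumption about how the interval was constructed:

```python
import math

# Reported in the abstract: OR per 1 g/dL Hb decrease = 1.301 (95% CI 1.23-1.38).
or_per_1g_dec = 1.301
beta = -math.log(or_per_1g_dec)   # logistic coefficient per 1 g/dL Hb *increase*

# Back out the implied standard error, assuming a symmetric Wald interval
# on the log-odds scale: CI = exp(log(OR) +/- 1.96*SE).
se = (math.log(1.38) - math.log(1.23)) / (2 * 1.96)

# Rescaling: the OR for a 2 g/dL decrease is exp(-2*beta) = 1.301**2.
or_per_2g_dec = math.exp(-2 * beta)
```

This is why per-unit ORs compound multiplicatively: a 2 g/dL drop carries roughly a 69% higher odds of death under this model, not 2 × 30%.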


2018 ◽  
Vol 6 (7_suppl4) ◽  
pp. 2325967118S0012
Author(s):  
Tetsuya Matsuura ◽  
Toshiyuki Iwame ◽  
Koichi Sairyo

Objectives: With the incidence of Little League elbow increasing, pitch limit recommendations for preventing throwing injuries have been developed in the United States and Japan. In 1995, the Japanese Society of Clinical Sports Medicine announced limits of 50 pitches per day and 200 pitches per week to prevent throwing injuries in players younger than 12 years. However, the relationship between pitch limit recommendations and elbow injuries among pitchers has not been adequately studied. The aim of our study was to evaluate the association between pitch counts and elbow injuries in youth pitchers. Methods: A total of 149 pitchers without prior elbow pain were observed prospectively for 1 season to study injury incidence in relation to specific risk factors. Average age was 10.1 years (range, 7-11 years). One year later, all pitchers were surveyed by questionnaire. Subjects were asked whether they had experienced any episodes of elbow pain during the season. The questionnaire was also used to gather data on pitch counts per day and per week, age, number of training days per week, and number of games per year. We investigated the following risk factors for elbow injury: pitch counts, age, position, number of training days per week, and number of games per year. Data were analyzed with multivariate logistic regression models and presented as odds ratios (OR) with profile-likelihood 95% confidence intervals (CI). The likelihood-ratio test was also performed. A two-tailed P value of less than .05 was considered significant. All analyses were done in the SAS software package (version 8.2). Results: Of the 149 subjects, 66 (44.3%) reported episodes of pain in the throwing elbow during the season. (1) Pitch count per day: Univariate analysis showed that elbow pain was significantly associated with more than 50 pitches per day.
Multivariate analysis showed that more than 50 pitches per day (OR, 2.44; 95% CI, 1.22-4.94) and more than 70 games per year (OR, 2.47; 95% CI, 1.24-5.02) were risk factors significantly associated with elbow pain. Age and number of training days per week were not significantly associated with elbow pain. (2) Pitch count per week: Univariate analysis showed that elbow pain was significantly associated with more than 200 pitches per week. Multivariate analysis showed that more than 200 pitches per week (OR, 2.04; 95% CI, 1.03-4.10) and more than 70 games per year (OR, 2.41; 95% CI, 1.22-4.87) were risk factors significantly associated with elbow pain. Age was not significantly associated with elbow pain. Conclusion: A total of 44.3% of youth baseball pitchers had elbow pain during the season. Multivariable logistic regression revealed that elbow pain was associated with more than 50 pitches per day, more than 200 pitches per week, and more than 70 games per year. Previous studies have found that the position most strongly associated with injury is pitcher. Our data suggest that compliance with pitch limit recommendations, including limits of 50 pitches per day and 200 pitches per week, may be protective against elbow injuries. Those who played more than 70 games per year had a notably increased risk of injury. With increasing demands on youth pitchers to play more, there is less time for repair of bony and soft tissues in the elbow. In conclusion, among youth pitchers, limits of 50 pitches per day, 200 pitches per week, and 70 games per year may protect against elbow injuries.
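The adjusted ORs above come from a logistic model, but the underlying exposure-outcome contrast can be illustrated with a plain 2x2 table and a Woolf (log-OR) confidence interval. The counts below are hypothetical: they respect the cohort's totals (66 of 149 with pain) and were chosen to land near the reported OR of 2.44, but they are not the paper's data:

```python
import math

# Hypothetical 2x2 table (illustrative only):
#                   elbow pain   no pain
# >50 pitches/day       32          23
# <=50 pitches/day      34          60
a, b, c, d = 32, 23, 34, 60

or_hat = (a * d) / (b * c)                 # cross-product odds ratio
se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # Woolf SE of log(OR)
lo = math.exp(math.log(or_hat) - 1.96 * se_log)
hi = math.exp(math.log(or_hat) + 1.96 * se_log)
```

Note the interval is built symmetrically on the log scale, which is why published ORs like 2.44 (1.22-4.94) look asymmetric around the point estimate on the natural scale.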


2009 ◽  
Vol 20 (3) ◽  
pp. e43-e48 ◽  
Author(s):  
Marianna Ofner-Agostini ◽  
Andrew Simor ◽  
Michael Mulvey ◽  
Alison McGeer ◽  
Zahir Hirji ◽  
...  

BACKGROUND: The clinical features associated with Gram-negative bacterial isolates with extended-spectrum beta-lactamase (ESBL)- and AmpC-mediated resistance identified in Canadian hospitals are largely unknown. The objective of the present study was to determine the demographics, risk factors and outcomes of patients with ESBL- or AmpC-mediated resistant organisms in Canadian hospitals. METHODS: Patients with clinical cultures of Escherichia coli or Klebsiella species were matched with patients with a similar organism that was susceptible to third-generation cephalosporins. Molecular identification of the AmpC or ESBL was performed using a combination of polymerase chain reaction and sequence analysis. Univariate and multivariate logistic regression analyses were performed to identify variables associated with becoming a case. RESULTS: Eight Canadian hospitals identified 106 cases (ESBL/AmpC) and 106 controls. All risk factors identified in the univariate analysis as predictors of being an ESBL/AmpC case at the 0.20 P-value level were included in the multivariate analysis. No significant differences in outcomes were observed (unfavourable responses 17% versus 15% and mortality rates 13% versus 7%, P not significant). Multivariate logistic regression found an association of becoming an ESBL/AmpC case with previous admission to a nursing home (OR 8.28, P=0.01) or acute care facility (OR 1.96, P=0.03), length of stay before infection (OR 3.05, P=0.004), and previous use of first-generation cephalosporins (OR 2.38, P=0.02) or third-generation cephalosporins (OR 4.52, P=0.01). Appropriate antibiotics were more likely to be given to controls (27.0% versus 13.3%, P=0.05), and the number of days to appropriate antibiotics was longer for cases (median 2.8 days versus 1.2 days, P=0.05). CONCLUSION: Patient medical history, features of the present admission, and antibiotic use should be considered for all E coli or Klebsiella species patients pending susceptibility testing results.


2020 ◽  
Author(s):  
Lisa M. Kuhns ◽  
Brookley Rogers ◽  
Katie Greeley ◽  
Abigail L. Muldoon ◽  
Niranjan Karnik ◽  
...  

Abstract Background: Despite recent reductions, youth substance use continues to be a concern in the United States. Structured primary care substance use screening among adolescents is recommended, but not widely implemented. The purpose of this study was to describe the distribution and characteristics of adolescent substance use screening in outpatient clinics in a large academic medical center and to assess related factors (i.e., patient age, race/ethnicity, gender, and insurance type) to inform and improve the quality of substance use screening in practice. Methods: We abstracted a random sample of 127 records of patients aged 12-17 and coded clinical notes (e.g., converted open-ended notes to discrete values) to describe screening cases and related characteristics (e.g., which substances were screened, and how). We then analyzed descriptive patterns within the data to calculate screening rates and characteristics of screening, and used multiple logistic regression to identify related factors. Results: Among 127 records, rates of screening by providers were 72% (each) for common substances (alcohol, marijuana, tobacco). The primary method of screening was use of clinical mnemonic cues rather than standardized screening tools. A total of 6% of patients reported substance use during screening. Older age and racial/ethnic minority status were associated with provider screening in multiple logistic regression models. Conclusions: Despite recommendations, low rates of structured screening in primary care persist. Failure to use a standardized screening tool may contribute to low screening rates and biased screening. These findings may be used to inform implementation of standardized and structured screening in the clinical environment.


2021 ◽  
Vol 7 (1) ◽  
Author(s):  
Yisen Zhang ◽  
Chao Wang ◽  
Zhongbin Tian ◽  
Wei Zhu ◽  
Wenqiang Li ◽  
...  

Abstract Background The aim of this study was to comprehensively evaluate the risk factors of periprocedural ischemic stroke associated with endovascular treatment of intracranial aneurysms using a real-world database. Methods From August 2016 to March 2017, 167 patients were enrolled. Univariate analysis and multivariate logistic regression analysis were used to examine the risk factors for periprocedural ischemic stroke. Results Among the 167 cases, periprocedural ischemic stroke occurred in 20 cases (11.98%). After univariate analysis, the ischemic group had a higher proportion of large (≥ 10 mm) aneurysms than the control group (45.0% vs. 23.1%, p = 0.036). The incidence of periprocedural ischemic stroke was higher in cases treated by flow diverter (21.6%) or stent-assisted coiling (11.8%) than in cases treated by coiling only (2.7%), and the differences were statistically significant (p = 0.043). After multivariate logistic regression analysis, treatment modality was the independent risk factor for periprocedural ischemic stroke. Compared with the coiling-only procedure, flow diverter therapy was associated with a significantly higher rate of periprocedural ischemic stroke (OR 9.931; 95% CI 1.174–84.038; p = 0.035). Conclusions Aneurysm size and treatment modality were associated with periprocedural ischemic stroke. Larger aneurysms were associated with increased risk of periprocedural ischemic stroke. Flow diverter therapy was associated with significantly more periprocedural ischemic stroke than the coiling procedure alone.


2021 ◽  
Vol 12 (2) ◽  
pp. e0012
Author(s):  
Steven Fuchs ◽  
◽  
Itamar Ashkenazi ◽  
◽  

Background: Adequate lymphadenectomy is an important factor affecting survival in gastric cancer patients. Retrieval and examination of at least 15 lymph nodes is recommended in order to properly stage gastric malignancies. The objectives of this study were to evaluate the proportion of patients undergoing inadequate lymphadenectomies and possible risk factors for inadequate surgery. Methods: This was a retrospective study that included patients, 18 years and older, who underwent gastrectomies with oncologic intent in the Hillel Yaffe Medical Center. We analyzed the association of demographic, clinical, and pathological variables with an adequate number of lymph nodes. Results: The retrieval of fewer than 15 lymph nodes was reported in 51% (53/104) of patients undergoing gastrectomies with oncologic intent. The extent of surgery was the only variable associated with inadequate lymphadenectomy on univariate analysis: subtotal/proximal versus total gastrectomy (P=0.047). Differences observed for previous surgery (P=0.193), T stage (P=0.053), N stage (P=0.051), and lymphovascular invasion (P=0.14) did not reach significance. Subtotal/proximal gastrectomy resulted in inadequate resection of lymph nodes in 56% of the patients, while this occurred in only 30% of the patients undergoing total gastrectomy (relative risk 1.865; 95% CI 0.93, 3.741). Logistic regression confirmed that only subtotal/proximal versus total gastrectomy was associated with an inadequate number of lymph nodes resected (P=0.043). Discussion and Conclusion: In this study we analyzed the association of patient, tumor, and surgery-related factors with adequate lymphadenectomy in patients undergoing gastrectomies for possible gastric cancer. Larger extent of surgery (total, rather than subtotal/proximal gastrectomy) was the only indicator positively associated with adequate lymphadenectomy.


2021 ◽  
Vol 27 ◽  
pp. 107602962110379
Author(s):  
Xiao Li ◽  
Shu-Ling Hou ◽  
Xi Li ◽  
Li Li ◽  
Ke Lian ◽  
...  

This study investigated the risk factors for thromboembolism (TE) in lymphoma patients undergoing chemotherapy and their clinical significance. A total of 304 lymphoma patients who received chemotherapy from January 2012 to July 2019 were retrospectively analyzed, including 111 patients with and 193 patients without TE. The clinical characteristics and related laboratory test results were compared between the 2 groups using univariate analysis, and the risk factors for TE in lymphoma patients undergoing chemotherapy were analyzed using multivariate logistic regression. Univariate analysis revealed an increased risk of TE among lymphoma patients undergoing chemotherapy in the following categories: female patients, patients with body mass index <18.5 or >24, patients aged ≥60 years, patients with platelet abnormalities before chemotherapy, patients with a single hospital stay, and Ann Arbor stage III/IV patients. Multivariate logistic regression analysis revealed that platelet count abnormality before chemotherapy, Ann Arbor stage III/IV, and female sex were independent risk factors for TE among lymphoma patients after chemotherapy (P < .05). For lymphoma patients treated with chemotherapy, the risk of TE in women, patients with platelet abnormalities before chemotherapy, and patients at Ann Arbor stage III/IV was significantly higher than in other patients. For these patients, we recommend prophylactic anticoagulant therapy.


2019 ◽  
Vol 47 (1-2) ◽  
pp. 88-94 ◽  
Author(s):  
Changyi Wang ◽  
Linghui Deng ◽  
Shi Qiu ◽  
Haiyang Bian ◽  
Lu Wang ◽  
...  

Background and Objective: Hemorrhagic transformation (HT) is a major complication of acute ischemic stroke (AIS). Serum albumin is known for its neuroprotective effects and is a marker of improved AIS patient outcomes. However, it is not known whether there is a relationship between serum albumin and HT. Methods: AIS patients admitted to the Department of Neurology of West China Hospital from 2012 to 2016 were prospectively and consecutively enrolled. Baseline characteristics were collected. HT during hospitalization was diagnosed by brain imaging. Multivariate logistic regression analysis was performed to determine the relationship between serum albumin and HT. Confounding factors were identified by univariate analysis. Stratified logistic regression analysis was performed to identify effect modifiers. Results: A total of 1996 AIS patients were recruited, of whom 135 (6.8%) developed HT. Serum albumin negatively correlated with HT. Patients in the upper serum albumin tertile (42.6–54.1 g/L) had a 46% lower risk of HT than patients in the lower tertile (19.3–39.1 g/L) after adjustment for potential confounders (OR 0.54, 95% CI 0.29–0.99, p = 0.04). Risk of HT decreased stepwise with higher serum albumin tertile (p for trend = 0.04). There was a significant interaction between serum albumin and age (p = 0.02), with no significant correlation between serum albumin and HT in patients over 60 years of age. Conclusions: Higher serum albumin is associated with lower HT risk in a dose-dependent manner in AIS patients younger than 60 years.
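The tertile analysis above (HT risk falling stepwise as albumin rises) can be mimicked on simulated data: cut the exposure at its empirical tertiles and compare the odds of HT in the upper versus lower tertile. All numbers below are simulated assumptions for illustration, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated albumin values (g/L) and HT outcomes; the inverse
# dose-response relationship is built in by assumption.
n = 2000
albumin = rng.normal(41.0, 4.0, n)
p_ht = 1 / (1 + np.exp(1.5 + 0.12 * (albumin - 41.0)))   # risk falls as albumin rises
ht = (rng.random(n) < p_ht).astype(int)

# Cut at the empirical tertile boundaries: 0 = lower, 1 = middle, 2 = upper.
tertile = np.digitize(albumin, np.quantile(albumin, [1 / 3, 2 / 3]))

def odds(mask):
    events = ht[mask].sum()
    return events / (mask.sum() - events)

# Crude (unadjusted) OR, upper vs. lower tertile; the study's OR of 0.54
# was additionally adjusted for confounders in a logistic model.
or_upper_vs_lower = odds(tertile == 2) / odds(tertile == 0)
```

A "p for trend" like the one reported is typically obtained by entering the tertile index (0, 1, 2) as a single ordinal term in the logistic model rather than comparing groups pairwise.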


2013 ◽  
Vol 79 (3) ◽  
pp. 296-300 ◽  
Author(s):  
John Trombold ◽  
Russell W. Farmer ◽  
Michael McCafferty

Colon and rectal cancer is the second most common cause of cancer death in the United States. Screening effectively decreases colorectal cancer mortality. This study aims to evaluate the impact of colorectal cancer screening within a Veterans Affairs medical center, including treatment outcomes. Institutional Review Board approval was obtained for a retrospective analysis of all colorectal cancer cases identified through the Tumor Registry of the Robley Rex VA Medical Center from 2000 to 2009. Data collected included age at diagnosis, race, risk factors, diagnosis by screening versus symptomatic evaluation, screening test, tumor location and stage, operation performed, operative mortality, and survival. A value of P < 0.05 on Fisher's exact test, χ2 test, analysis of variance, or Cox regression analysis was considered significant. Three hundred fifty-four patients with colorectal cancer (255 colon, 99 rectal) were identified. One hundred twenty-one patients (34%) were diagnosed by screening. In comparison with those diagnosed by symptomatic evaluation (n = 233), these patients had earlier-stage cancers, were more likely to have a curative-intent procedure, and had improved 5-year survival rates. Older patients (older than 75 years) were more likely to present with symptoms. High-risk patients were more likely to have colonoscopic screening than fecal occult blood testing. Stage IV disease was more common in black patients than in nonblack patients. Thirty-day operative mortality for curative-intent procedures was 2.1 per cent for colectomy and 0 per cent for rectal resection. Screening for colorectal cancer in the veteran population allows for better survival, detection at an earlier stage, and a higher likelihood of resection.

