Predicting Daily Maintenance Dose of Fluindione, an Oral Anticoagulant Drug

1996 ◽  
Vol 75 (05) ◽  
pp. 731-733 ◽  
Author(s):  
V Cazaux ◽  
B Gauthier ◽  
A Elias ◽  
D Lefebvre ◽  
J Tredez ◽  
...  

Summary Owing to large inter-individual variation, the dose of vitamin K antagonist required to achieve the desired hypocoagulability is hardly predictable for a given patient, and the time needed to reach therapeutic equilibrium may be excessively long. This work reports a simple method for predicting the daily maintenance dose of fluindione after the third intake. In a first step, 37 patients received 20 mg of fluindione once a day, at 6 p.m., for 3 consecutive days. On the morning of the 4th day an INR was performed. During the following days the dose was adjusted to target an INR between 2 and 3. There was a good correlation (r = 0.83, p<0.001) between the INR measured on the morning of day 4 and the daily maintenance dose determined later by successive approximations. This allowed us to derive a decision algorithm that predicts the effective maintenance dose of fluindione from the day-4 INR. The usefulness and safety of this approach were tested in a second, prospective study of 46 patients receiving fluindione according to the same initial scheme. The predicted dose was compared with the effective dose soon after equilibrium was reached, and again 30 and 90 days later. To within 5 mg (one quarter of a tablet), the predicted dose matched the effective dose in 98%, 86% and 81% of patients at the three time points, respectively. The mean time needed to reach therapeutic equilibrium fell from 13 days in the first study to 6 days in the second. No hemorrhagic complication occurred. Thus the strategy formerly developed to predict the daily maintenance dose of warfarin from the prothrombin time ratio or the thrombotest performed 3 days after starting the treatment may also be applied to fluindione and the INR measurement.
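The idea of such a decision algorithm can be sketched as a simple lookup from the day-4 INR to a maintenance dose in 5 mg steps (one quarter of a 20 mg tablet). The INR cut-offs and doses below are hypothetical illustrations only, not the published algorithm:

```python
def predict_maintenance_dose(inr_day4: float) -> float:
    """Return a predicted daily fluindione dose (mg), in 5 mg steps,
    from the INR measured on the morning of day 4.
    Cut-offs and doses are illustrative, not the study's values."""
    if inr_day4 < 1.3:
        return 30.0   # weak response: increase above the 20 mg test dose
    elif inr_day4 < 1.7:
        return 25.0
    elif inr_day4 < 2.3:
        return 20.0   # near-target response: keep the test dose
    elif inr_day4 < 3.0:
        return 15.0
    else:
        return 10.0   # strong response: reduce the dose

print(predict_maintenance_dose(2.0))  # near-target INR keeps the 20 mg test dose
```

In practice the published mapping would be read off the regression of maintenance dose on day-4 INR reported above (r = 0.83).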

2020 ◽  
Author(s):  
FuMei Chen ◽  
Ke Wang ◽  
KangLi Xu ◽  
Li Wang ◽  
TianXiang Zhan ◽  
...  

Abstract Objective To investigate predictors of postoperative acute intracranial hemorrhage (AIH) and recurrence of chronic subdural hematoma (CSDH) after burr hole drainage. Methods A multicenter retrospective study was conducted of patients who underwent burr hole drainage for CSDH between January 2013 and March 2019. Results A total of 448 CSDH patients were enrolled. CSDH recurrence occurred in 60 patients, a recurrence rate of 13.4%; the mean interval between initial burr hole drainage and recurrence was 40.8±28.3 days. Postoperative AIH developed in 23 patients, an incidence of 5.1%; the mean interval between initial burr hole drainage and postoperative AIH was 4.7±2.9 days. In the multiple logistic regression analyses, bilateral hematoma, hyperdense hematoma and anticoagulant drug use were independent predictors of recurrence, and preoperative headache was an independent risk factor for postoperative AIH, whereas intraoperative irrigation reduced the incidence of postoperative AIH. Conclusions This study found that bilateral hematoma, hyperdense hematoma and anticoagulant drug use were independently associated with CSDH recurrence. Headache at presentation was the strongest predictor of postoperative AIH, and intraoperative irrigation decreased its incidence.
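The study's multivariable analysis adjusts each predictor for the others; as a minimal illustration of the usual first analytic step, an unadjusted odds ratio for one candidate predictor can be computed from a 2x2 table. The counts below are hypothetical, not the study's data:

```python
def odds_ratio(exposed_event: int, exposed_no_event: int,
               unexposed_event: int, unexposed_no_event: int) -> float:
    """Unadjusted odds ratio from a 2x2 table: (a/b) / (c/d)."""
    return (exposed_event / exposed_no_event) / (unexposed_event / unexposed_no_event)

# Hypothetical counts for bilateral hematoma vs. recurrence:
# 20/100 bilateral cases recurred, 40/348 unilateral cases recurred.
or_bilateral = odds_ratio(20, 80, 40, 308)
print(round(or_bilateral, 2))
```

An OR above 1 suggests an association; the independent predictors reported above are those whose association survived adjustment in the multiple logistic regression.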


2017 ◽  
Vol 23 (1) ◽  
pp. 3-7
Author(s):  
Marzena Mrozowska ◽  
Paweł Kukołowicz

Abstract Aim: The aim of the study was to compare several methods of dose prescription: the mean dose, the median dose, the effective dose and the generalized Equivalent Uniform Dose (gEUD). Background: The dose distribution in the planning target volume (PTV) is never fully homogeneous. Depending on the dose prescription method, different biologically equivalent doses are delivered for the same prescribed dose. The latest ICRU Report 83 proposes prescribing the dose to the median dose in the PTV; several other methods are also in common use. It is therefore important to know how the doses actually delivered differ between prescription methods. Materials and methods: The study was performed on three groups of patients treated radically with external beams in Brzozow over the 2012-2013 period: patients with breast, lung and prostate cancer, with 10 patients in each group. For each patient all metrics, i.e. the mean dose, the median dose, the effective dose and the gEUD, were calculated, and the influence of dose homogeneity in the PTV on the results was evaluated. The gEUD was used as the reference dose prescription method. Results: For all patients, an almost perfect correlation between the median dose and the gEUD was obtained; the correlation between the other metrics and the gEUD was worse. The median dose is almost always slightly higher than the gEUD, but the ratio of the two values never exceeded 1.013. Conclusion: The median dose appears to be a good and simple method of dose prescription.
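The metrics being compared are straightforward to compute from a dose distribution. The gEUD formula, gEUD = ((1/N) * sum(d_i ** a)) ** (1/a), is the standard Niemierko form; the parameter value a = -10 below is a typical choice for target volumes and an assumption of this sketch, not a value taken from the paper:

```python
from statistics import mean, median

def geud(doses: list[float], a: float = -10.0) -> float:
    """Generalized Equivalent Uniform Dose. Negative 'a' (targets)
    weights cold spots heavily; a -> 1 recovers the mean dose."""
    n = len(doses)
    return (sum(d ** a for d in doses) / n) ** (1.0 / a)

# Illustrative PTV voxel doses (Gy), not patient data:
doses = [58.0, 59.5, 60.0, 60.5, 61.0]
print(mean(doses), median(doses), round(geud(doses), 3))
```

Consistent with the paper's finding, the median dose of a nearly homogeneous distribution sits slightly above its gEUD, because the negative exponent pulls the gEUD toward the coldest voxels.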


1974 ◽  
Vol 13 (02) ◽  
pp. 193-206
Author(s):  
L. Conte ◽  
L. Mombelli ◽  
A. Vanoli

Summary We put forward a method, for use in the field of nuclear medicine, for calculating internally absorbed doses in patients. The simplicity and flexibility of this method allow a rapid estimation of risk both to the individual and to the population. The calculation of absorbed doses is based on the concept of the mean absorbed fraction, taking into account the anatomical and functional variability that is highly important when calculating internal doses in children. With this aim in mind we prepared tables that take anatomical differences into consideration and permit the calculation of the mean absorbed doses in the whole body, in the organs accumulating radioactivity, in the gonads and in the marrow, for the radionuclides most widely used in nuclear medicine. Comparing our results with doses obtained using the MIRD method shows that, when the errors inherent in this type of calculation are taken into account, the results of the two methods are in close agreement.
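The mean-absorbed-fraction approach can be written compactly as D = Ã · Σᵢ(Δᵢ · φᵢ) / m, where Ã is the cumulated activity, Δᵢ the mean energy emitted per decay for emission i, φᵢ the absorbed fraction for the target organ, and m the target mass. A minimal sketch of this formula (the structure is standard; any numbers plugged in would be illustrative, not the paper's tabulated values):

```python
def mean_absorbed_dose(cum_activity_mbq_s: float,
                       emissions: list[tuple[float, float]],
                       target_mass_kg: float) -> float:
    """Mean absorbed dose (Gy) from the mean-absorbed-fraction formula.

    cum_activity_mbq_s: cumulated activity A-tilde in MBq*s
    emissions: list of (mean energy per decay in J, absorbed fraction phi)
    target_mass_kg: mass of the target region in kg
    """
    decays = cum_activity_mbq_s * 1e6            # 1 MBq*s = 1e6 decays
    energy_j = decays * sum(e * phi for e, phi in emissions)
    return energy_j / target_mass_kg             # Gy = J/kg
```

Age-dependent tables, as prepared by the authors, would supply φᵢ and m for each organ and body size; the formula itself is what makes the per-patient calculation rapid.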


1966 ◽  
Vol 53 (2) ◽  
pp. 177-188 ◽  
Author(s):  
P. Lund-Johansen ◽  
T. Thorsen ◽  
K. F. Støa

ABSTRACT A comparison has been made between (A) a relatively simple method for measuring the aldosterone secretion rate, based on paper chromatography and direct densitometry of the aldosterone spot, and (B) a more elaborate isotope-derivative method. The mean secretion rate in 9 normal subjects was 112 ± 26 μg per 24 hours (method A) and 135 ± 35 μg per 24 hours (method B). The »secretion rate« in one adrenalectomized subject after the intravenous injection of 250 μg of aldosterone was 230 μg per 24 hours (method A) and 294 μg per 24 hours (method B). There was no significant difference in the mean values, and the correlation between the two methods was good (r = 0.80). It is concluded that the densitometric method, being more rapid and less expensive than the isotope-derivative method, is suitable for clinical as well as research purposes. Method A also measures the urinary excretion of the aldosterone 3-oxo-conjugate, which is of interest in many pathological conditions. The densitometric method is clearly the less sensitive of the two; a prerequisite for its use is an aldosterone secretion of 20–30 μg per 24 hours. Lower values are, however, rare in adults.
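The between-method agreement quoted above (r = 0.80) is a Pearson correlation over paired measurements. A minimal computation of that coefficient, on hypothetical paired secretion rates rather than the study's data:

```python
def pearson_r(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient between paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical method-A vs method-B pairs (ug/24 h):
a = [90.0, 105.0, 112.0, 120.0, 140.0]
b = [100.0, 130.0, 135.0, 138.0, 170.0]
print(round(pearson_r(a, b), 2))
```

Note that a high r shows the methods rank subjects consistently; it does not by itself establish that their absolute values agree, which is why the abstract also compares the means.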


2008 ◽  
Vol 47 (04) ◽  
pp. 175-177 ◽  
Author(s):  
J. Dolezal

Summary Aim: To assess, as a six-year retrospective study, the radiation exposure of the nuclear medicine staff at our department and the quality of its radiation protection. Therapeutic radionuclides (131I, 153Sm, 186Re, 32P, 90Y) and diagnostic ones (99mTc, 201Tl, 67Ga, 111In) were used. Material, method: The effective dose was evaluated over the period 2001–2006 for nuclear medicine physicians (n = 5), technologists (n = 9) and radiopharmacists (n = 2). Workers wore a personal film dosimeter and a thermoluminescent ring dosimeter (1-month monitoring periods) measuring the personal dose equivalents Hp(10) and Hp(0.07); wearing the dosimeters was obligatory within the framework of a nationwide service for personal dosimetry. The total administered activity of all radionuclides during these six years at our department was 17,779 GBq (99mTc 14,708 GBq, 131I 2,490 GBq, others 581 GBq). The administered activity of 99mTc remained similar, but the administered activity of 131I in 2006 increased by 200% compared with 2001. Results: The mean and one standard deviation (SD) of the personal annual effective dose (mSv) in 2001, 2002, 2003, 2004, 2005 and 2006, respectively, was 1.9 ± 0.6, 1.8 ± 0.8, 1.2 ± 0.8, 1.4 ± 0.8, 1.3 ± 0.6 and 0.8 ± 0.4 for nuclear medicine physicians, and 1.9 ± 0.8, 1.7 ± 1.4, 1.0 ± 1.0, 1.1 ± 1.2, 0.9 ± 0.4 and 0.7 ± 0.2 for nuclear medicine technologists. The mean personal annual effective dose (mSv) for radiopharmacists (n = 2, so an SD estimate makes little sense) was 3.2, 1.8, 0.6, 1.3, 0.6 and 0.3. Although the administered activity of 131I increased, the mean personal effective dose per year decreased during the six years. Conclusion: In all three professional groups of nuclear medicine workers a decreasing radiation exposure was found, although the administered activity of 131I increased during this six-year period. Our observations suggest successful radiation protection measures at our department.


1991 ◽  
Vol 21 (3) ◽  
pp. 265-269 ◽  
Author(s):  
J. Cuvellier ◽  
P. Meynadier ◽  
P. Pujo ◽  
O. Sublemontier ◽  
J-P Visticot ◽  
...  

2021 ◽  
pp. 107815522110160
Author(s):  
Bernadatte Zimbwa ◽  
Peter J Gilbar ◽  
Mark R Davis ◽  
Srinivas Kondalsamy-Chennakesavan

Purpose To retrospectively determine the rate of death within 14 and 30 days of systemic anticancer therapy (SACT), compare it with a previous audit, and benchmark the results against other cancer centres; and secondly, to determine whether the introduction of immune checkpoint inhibitors (ICI), not available at the time of the initial audit, affected mortality rates. Method All adult solid tumour and haematology patients receiving SACT at an Australian Regional Cancer Centre (RCC) between January 2016 and July 2020 were included. Results Over the 55-month period, 1709 patients received SACT. The proportions dying within 14 and 30 days of SACT were 3.3% and 7.0% respectively, slightly higher than in our previous audit (1.89% and 5.6%). Mean time to death was 15.5 days. Males accounted for 63.9% of patients and the mean age was 66.8 years. Of the 119 patients who died within 30 days of SACT, 46.2% had started a new line of treatment during that time. Of 98 patients receiving ICI, 22.5% died within 30 days of commencement. Disease progression was the most common cause of death (79%), and the most common place of death was the RCC (38.7%). Conclusion The rate of death observed in our re-audit compares favourably with our previous audit and remains at the lower end of that seen in published studies in Australia and internationally. Deaths within 30 days of SACT should be reviewed regularly to maintain awareness of this quality-assurance benchmark and to provide a feedback process for clinicians.


2021 ◽  
pp. 1-7
Author(s):  
Naomi Vather-Wu ◽  
Matthew D. Krasowski ◽  
Katherine D. Mathews ◽  
Amal Shibli-Rahhal

Background: Expert guidelines recommend annual monitoring of 25-hydroxyvitamin D (25-OHD) and maintaining 25-OHD ≥30 ng/mL in patients with dystrophinopathies. Objective: We hypothesized that 25-OHD remains stable, and requires less frequent monitoring, in patients taking stable maintenance doses of vitamin D. Methods: We performed a retrospective cohort study, using the electronic health record to identify 26 patients with dystrophinopathies with a baseline 25-OHD ≥30 ng/mL and at least one additional 25-OHD measurement. These patients had received a stable dose of vitamin D for ≥3 months before their baseline 25-OHD measurement and throughout follow-up. The main outcome measured was the mean time subjects spent with a 25-OHD ≥30 ng/mL. Results: Only 19% of patients dropped their 25-OHD to <30 ng/mL, with a mean time to drop of 33 months and a median nadir 25-OHD of 28 ng/mL. Conclusions: These results suggest that measuring 25-OHD every 2–2.5 years may be sufficient in patients with a baseline 25-OHD ≥30 ng/mL who are on a stable maintenance dose of vitamin D. Other patients may require more frequent assessments.


Electronics ◽  
2021 ◽  
Vol 10 (8) ◽  
pp. 876
Author(s):  
Igor Gonçalves ◽  
Laécio Rodrigues ◽  
Francisco Airton Silva ◽  
Tuan Anh Nguyen ◽  
Dugki Min ◽  
...  

Surveillance monitoring systems are highly necessary for preventing many social problems in smart cities, and the internet of things (IoT) now offers a variety of technologies to capture and process massive, heterogeneous data. Because (i) advanced analyses of video streams are performed on powerful recording devices; (ii) surveillance monitoring services require high availability, in the sense that the service must remain connected, for example over a network connection faster than conventional ones; and (iii) the trustworthy dependability of a surveillance system depends on various factors, it is not easy to identify which components or devices in a system architecture have the greatest impact on dependability for a specific surveillance system in a smart city. In this paper, we developed stochastic Petri net models for a surveillance monitoring system, varying several parameters to obtain the highest dependability. Two main dependability metrics of a surveillance system, reliability and availability, were analyzed in a comprehensive manner. The analysis results show that varying the number of long-term evolution (LTE) base stations contributes to an increase in the number of nines (#9s) of availability, and that varying the mean time to failure (MTTF) of the surveillance cameras has a high impact on the reliability of the system. The findings of this work can assist system architects in planning better-optimized systems in this field based on the proposed models.
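The "number of nines" metric used above follows from the standard steady-state availability formula A = MTTF / (MTTF + MTTR), with #9s = -log10(1 - A). A minimal sketch (the MTTF/MTTR values are illustrative, not the paper's model parameters):

```python
import math

def availability(mttf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability: fraction of time the component is up."""
    return mttf_hours / (mttf_hours + mttr_hours)

def number_of_nines(a: float) -> float:
    """Availability expressed as a number of nines, e.g. 0.999 -> 3."""
    return -math.log10(1.0 - a)

# Illustrative camera: fails about once a year, ~8.8 h mean repair time.
a = availability(mttf_hours=8760.0, mttr_hours=8.76)
print(round(a, 4), round(number_of_nines(a), 2))
```

This is why the MTTF of the cameras dominates reliability: in the Petri net model, lengthening MTTF directly shrinks the downtime fraction 1 - A and so raises the #9s.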

