Switching from conventional therapy to burosumab injection has the potential to prevent nephrocalcinosis in patients with X-linked hypophosphatemic rickets

Author(s):  
Daisuke Harada ◽  
Kaoru Ueyama ◽  
Kyoko Oriyama ◽  
Yoshihito Ishiura ◽  
Hiroko Kashiwagi ◽  
...  

Abstract Objectives X-linked hypophosphatemic rickets (XLH) is a congenital fibroblast growth factor (FGF)23-related metabolic bone disease that is treated with active vitamin D and phosphate as conventional therapies. Complications of these therapies include nephrocalcinosis (NC) caused by excessive urine calcium and phosphate concentrations. Recently, an anti-FGF23 antibody, burosumab, was developed and reported to be effective in poorly controlled or severe XLH patients. This study aimed to reveal the impact of switching treatments in relatively well-controlled XLH children with a Rickets Severity Scale score below 2.0. Methods The effects of the two treatments in eight relatively well-controlled XLH children with a mean age of 10.4 ± 1.9 years were compared retrospectively for the same treatment duration (31 ± 11 months) before and after the baseline. Results Actual doses of alfacalcidol and phosphate as conventional therapy were 150.9 ± 43.9 ng/kg and 27.5 ± 6.3 mg/kg per day, respectively. Renal echography revealed spotty NC in 8/8 patients, but no aggravation of NC was detected after switching treatments. Switching treatments increased TmP/GFR (p=0.002) and %TRP (p<0.001), and improved the high urine calcium/creatinine ratio to the normal range (p<0.001), although both treatments controlled disease markers equally. Additionally, the low intact parathyroid hormone seen during conventional therapy increased to within the normal range after switching treatments. Conclusions Our results suggest that a high dose of alfacalcidol was needed to control the disease, but it caused hypercalciuria and NC. We conclude that switching treatments in relatively well-controlled XLH children improved renal phosphate reabsorption and decreased urine calcium excretion, and may have the potential to prevent NC.
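The renal indices reported in the abstract, %TRP and TmP/GFR, are standard derived quantities rather than direct measurements. The following is a minimal sketch (not taken from the paper), assuming paired serum/urine phosphate and creatinine values in matching units; the Walton-Bijvoet nomogram approximation used here for TmP/GFR above a TRP of 0.86 is the commonly cited form:

```python
def trp(serum_p, urine_p, serum_cr, urine_cr):
    """Tubular reabsorption of phosphate (TRP) as a fraction, 0-1.
    Each phosphate/creatinine pair must share units (e.g. mg/dl)."""
    return 1.0 - (urine_p * serum_cr) / (serum_p * urine_cr)

def tmp_gfr(serum_p, trp_value):
    """Tubular maximum phosphate reabsorption per GFR (same units as serum_p),
    using the Walton-Bijvoet nomogram approximation for TRP > 0.86."""
    if trp_value <= 0.86:
        return trp_value * serum_p
    return 0.3 * trp_value / (1.0 - 0.8 * trp_value) * serum_p

# Hypothetical illustrative values, not data from the study:
r = trp(serum_p=2.5, urine_p=40.0, serum_cr=0.6, urine_cr=60.0)
print(f"%TRP = {r * 100:.0f}%, TmP/GFR = {tmp_gfr(2.5, r):.2f} mg/dl")
```

Higher values of both indices after switching, as reported above, indicate improved renal phosphate conservation.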

Blood ◽  
2004 ◽  
Vol 104 (11) ◽  
pp. 4907-4907 ◽  
Author(s):  
Monika Engelhardt ◽  
Daniel Räpple ◽  
Andreas Weis ◽  
Emanuel Bisse ◽  
Gabriele Ihorst

Abstract So far, data based on small patient (pt) populations suggest that the measurement of serum FLC from MM pts undergoing high-dose chemotherapy (HDCT) with stem cell transplantation (SCT) may be a sensitive marker for monitoring therapy success and for early detection of relapse. To further evaluate the impact of FLCs on the assessment of treatment efficacy of standard- (ST) and HDCT with SCT, we performed a prospective analysis on serial serum specimens from 86 MM and 9 control pts. Measurement of FLC concentration was performed with the commercially available Freelite™ kit (Binding Site). For statistical analysis, pts’ clinical history, age at diagnosis, sex, current state of disease, karyotype and serum parameters, such as ß2-microglobulin, calcium levels and serum creatinine, were evaluated. In the control group (NHL=6, AML=1, non-hematological disease=2), median concentrations of kappa(k)- and lambda(l)-FLC were 9.8 mg/l and 12.8 mg/l, respectively, corresponding to reference intervals for healthy individuals with normal kappa(k)/lambda(l)-ratios. In MM, 40 (46.5%) pts displayed kappa(k)-FLC levels above the upper range of 19 mg/l, 26 (30%) had lambda(l)-FLC levels above the upper range of 26 mg/l and 9 pts (10.4%) had both elevated kappa(k)- and lambda(l)-FLC serum levels. An abnormal kappa(k)/lambda(l)-ratio was observed in 45 (52.3%) MM pts. Pts with a known kappa(k)-paraprotein (n=58) had a median kappa(k)-FLC concentration of 38 mg/l, but lambda(l)-FLC within the normal range. For pts with a known lambda(l)-paraprotein (n=27), reciprocal findings (76.4 mg/l for lambda(l)-FLC vs kappa(k)-FLC in the normal range) were observed. Pts with responsive disease (CR, PR and SD) had both kappa(k)- and lambda(l)-FLC levels within the normal range, whereas newly diagnosed pts (ED) and those with PD had kappa(k)-FLC levels approx. 3-times the normal range, with lambda(l)-FLC levels at the upper limit of normal.
Pts receiving ST as compared with HDCT had higher FLC levels. This is also observed in pts with amyloidosis, renal impairment or PD. Our results suggest that serum FLC assay allows monitoring of the therapy response and early detection of relapse. Determination of FLCs is also important, when evaluating new therapeutic substances, and for detection of prognostic patterns for better risk-based stratification of treatment.
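The categorisation used in this abstract (elevated kappa, elevated lambda, abnormal ratio) amounts to three threshold checks. Below is a minimal sketch using the upper reference limits quoted above (19 mg/l and 26 mg/l); the kappa/lambda ratio interval of 0.26-1.65 is the widely cited Freelite reference range and is an assumption here, since the abstract does not state which interval was applied:

```python
KAPPA_UPPER = 19.0    # mg/l, upper reference limit quoted in the abstract
LAMBDA_UPPER = 26.0   # mg/l, upper reference limit quoted in the abstract
RATIO_RANGE = (0.26, 1.65)  # kappa/lambda reference interval (assumption)

def flc_screen(kappa, lambda_):
    """Return the kappa/lambda ratio and abnormality flags matching
    the categories used in the abstract."""
    flags = []
    if kappa > KAPPA_UPPER:
        flags.append("kappa elevated")
    if lambda_ > LAMBDA_UPPER:
        flags.append("lambda elevated")
    ratio = kappa / lambda_
    if not RATIO_RANGE[0] <= ratio <= RATIO_RANGE[1]:
        flags.append("abnormal ratio")
    return ratio, flags

# Control-group medians from the abstract: 9.8 mg/l kappa, 12.8 mg/l lambda
ratio, flags = flc_screen(9.8, 12.8)
print(f"ratio={ratio:.2f}, flags={flags}")
```

Running this on the control-group medians yields a ratio inside the assumed interval with no flags, consistent with the normal ratios reported for the control group.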


2017 ◽  
Vol 104 (4) ◽  
pp. 334-343
Author(s):  
M Tokodi ◽  
E Csábi ◽  
Á Kiricsi ◽  
E Kollár ◽  
AH Molnár ◽  
...  

Purpose This study aimed to compare the impact of active allergic rhinitis on the physical and cognitive abilities of trained allergic athletes with those of untrained allergic patients. Methods Cognitive, respiratory, and fitness functions were assessed before and after allergen exposure. Participants in both groups were provoked intranasally with ragweed allergen. Results The athlete group showed significantly higher average values of peak inspiratory flow and fitness index both before and after provocation. In the neuropsychological assessments, athletes performed significantly better in complex working memory capacity after allergen provocation. After the single acute allergen exposure, the size of the nasal cavity and the nasal inspiratory peak flow decreased significantly in both groups. The physical performance of neither group changed after provocation. The executive functions and complex working memory capacity of the athletes improved significantly as a result of provocation. Conclusions A single high-dose allergen exposure might increase mental concentration, an effect that was more pronounced in the athlete group. This study indicates that acute allergen exposure does not affect physical performance and may result in increased mental focus in patients with allergy, notwithstanding the declining respiratory functions.


2014 ◽  
Vol 31 (3) ◽  
pp. 167-173 ◽  
Author(s):  
J. Kelly ◽  
F. Kelly ◽  
K. Santlal ◽  
S. O’Ceallaigh

Objectives To examine the impact of a change in local prescribing policy on adherence to evidence-based prescribing guidelines for antipsychotic medication in a general adult psychiatric hospital. Methods All adult in-patients had their clinical record and medication sheet reviewed. The antipsychotic prescribed, the dose prescribed and the documented indications for prescribing were recorded. This was done before and after the implementation of the change in hospital antipsychotic prescribing policy. Results There were no significant differences in age, sex, Mental Health Act status, psychiatric diagnosis or documented indications for prescribing multiple or high-dose antipsychotics between the two groups. There was an increase in the preferential prescribing of multiple second-generation antipsychotics (p=0.01) in the context of a significant reduction in the prescribing of multiple antipsychotics overall (p=0.02). There were no significant reductions in the prescribing of mixed generations of antipsychotics (p=0.12), high-dose antipsychotics (p=1.00) or as-required (PRN) antipsychotics (p=0.74). Conclusions Changes in local prescribing policy can improve adherence to quality prescribing guidelines and produce clinically significant improvements in patterns of prescribing in a general adult psychiatric hospital.


2019 ◽  
Vol 58 (1) ◽  
pp. 33-39
Author(s):  
Camille Chenevier-Gobeaux ◽  
Marie Rogier ◽  
Imane Dridi-Brahimi ◽  
Eugénie Koumakis ◽  
Catherine Cormier ◽  
...  

Abstract Background Measuring 24-h urine calcium concentration is essential to evaluate calcium metabolism and excretion. Manufacturers recommend acidifying the urine before measurement to ensure calcium solubility, but the literature offers conflicting information on this pre-analytical treatment. The objectives of the study were (1) to compare pre-acidification (during urine collection) versus post-acidification (in the laboratory), and (2) to evaluate the impact of acidification on urinary calcium measurements in a large cohort. Methods We evaluated the effects of pre- and post-acidification on 24-h urine samples collected from 10 healthy volunteers. We further studied the impact of acidification on the calcium results for 567 urine samples from routine laboratory practice, including 46 hypercalciuric (≥7.5 mmol/24 h) samples. Results Calciuria values in healthy volunteers ranged from 0.6 to 12.5 mmol/24 h, and no statistically significant difference was found between the non-acidified, pre-acidified and post-acidified conditions. A comparison of the values (ranging from 0.21 to 29.32 mmol/L) for the 567 urine samples before and after acidification identified 25 samples (4.4%) with analytical differences outside the limits of acceptance. The bias observed for these deviant values ranged from −3.07 to 1.32 mmol/L; no patient was re-classified as hypercalciuric after acidification, and three patients with hypercalciuria were classified as normocalciuric after acidification. These three deviant patients represent 6.5% of the hypercalciuric patients. Conclusions Our results indicate that neither pre- nor post-acidification of urine is necessary prior to routine calcium analysis.
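The re-classification check described in the Results reduces to comparing each sample's diagnostic class before and after acidification against the study's 7.5 mmol/24 h threshold. A minimal sketch of that logic (the example values are hypothetical, not study data):

```python
HYPERCALCIURIA_LIMIT = 7.5  # mmol/24 h, threshold used in the study

def classify(calcium_24h):
    """Diagnostic class of a 24-h urine calcium result."""
    return "hypercalciuric" if calcium_24h >= HYPERCALCIURIA_LIMIT else "normocalciuric"

def reclassified(before, after):
    """True if acidification moved the sample across the diagnostic threshold."""
    return classify(before) != classify(after)

# Hypothetical sample: 7.9 mmol/24 h before acidification, 7.1 after
print(reclassified(7.9, 7.1))  # crosses the threshold downward
```

Applied across the cohort, this is the check that identified the three hypercalciuric patients re-classified as normocalciuric after acidification.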


Cells ◽  
2020 ◽  
Vol 9 (4) ◽  
pp. 842 ◽  
Author(s):  
Franz Felix Konen ◽  
Ulrich Wurster ◽  
Torsten Witte ◽  
Konstantin Fritz Jendretzky ◽  
Stefan Gingele ◽  
...  

Background: Kappa free light chains (KFLC) are a promising new biomarker for detecting neuroinflammation. However, the impact of pre-analytical effects on KFLC concentrations has not been investigated. Methods: KFLC concentrations were measured in serum and cerebrospinal fluid (CSF) of patients with newly diagnosed multiple sclerosis (MS) or clinically isolated syndrome (CIS) before (n = 42) or after therapy with high-dose methylprednisolone (n = 65). In prospective experiments, KFLC concentrations were analyzed in the same patients in serum before and after treatment with high-dose methylprednisolone (n = 16), plasma exchange (n = 12), immunoadsorption (n = 10), or intravenous immunoglobulins (n = 10). In addition, the influence of storage time, sampling method, and contamination of CSF with blood was investigated. Results: Patients diagnosed with MS/CIS and treated with methylprednisolone showed significantly lower KFLC concentrations in serum than untreated patients. Repeated longitudinal investigations revealed that serum KFLC concentrations decreased continuously after each application of methylprednisolone. In contrast, other immune therapies and further pre-analytical conditions did not influence KFLC concentrations. Conclusion: Our results show prominent effects of steroids on KFLC concentrations. In contrast, various other pre-analytical conditions did not influence KFLC concentrations, indicating the stability of this biomarker.


2022 ◽  
Vol 23 (2) ◽  
pp. 934
Author(s):  
Rocío Fuente ◽  
María García-Bengoa ◽  
Ángela Fernández-Iglesias ◽  
Helena Gil-Peña ◽  
Fernando Santos ◽  
...  

X-linked hypophosphatemia (XLH), the most common form of hereditary hypophosphatemic rickets, is caused by inactivating mutations of the phosphate-regulating endopeptidase gene (PHEX). XLH is mainly characterized by short stature, bone deformities and rickets, while hypophosphatemia, normal or low vitamin D levels and low renal phosphate reabsorption are the principal biochemical findings. The cause of growth impairment in patients with XLH is not yet completely understood, making the study of growth plate (GP) alterations necessary. New treatment strategies targeting FGF23 have shown promising results in normalizing growth velocity and improving the skeletal outcomes of XLH patients. However, further studies are necessary to evaluate how this treatment affects the GP, as well as its long-term effects and its impact on adult height.


Blood ◽  
2008 ◽  
Vol 112 (11) ◽  
pp. 3335-3335
Author(s):  
H. Kent Holland ◽  
Scott R. Solomon ◽  
Asad Bashey ◽  
Lawrence E. Morris

Abstract Previous randomized trials evaluating the benefit of high-dose melphalan and ASCT for MM patients receiving conventional induction chemotherapy have demonstrated that patients who achieve either a CR or VgPR post-transplant have superior PFS and overall survival (OS) compared to those who do not. We report on our 6-year experience comparing the outcomes of newly diagnosed MM patients who received either conventional VAD or IMiD-based induction pre-transplant therapy, and the impact of response, in 112 consecutive patients transplanted between 5/2002 and 2/2008. The median age was 58 years (33–74). Disease status was Stage III in 92 (82%), Stage II in 11 (10%) and Stage I in 9 (8%) patients. Conventional VAD induction chemotherapy was administered pre-transplant in 36 patients and IMiD-based (thalidomide, lenalidomide) therapy in 76 patients. Single ASCT was administered in 78 patients and tandem ASCT in 34 patients (8 received conventional therapy and 26 received IMiDs, p=NS). Pre-transplant achievement of CR/VgPR was observed in 1/36 (1%) of patients receiving VAD chemotherapy and in 15/76 (20%) of patients receiving IMiDs (p=.017). Post-transplant, 5/36 (14%) patients who received conventional VAD therapy and 39/76 (51%) who received IMiDs achieved CR/VgPR (p=.002). Patients who entered a CR/VgPR had both superior PFS and OS (median PFS 1,118 days vs. 670 days, p=.02/HR 0.49; median OS not reached vs. 1,387 days, p=.05/HR 0.36). Single vs. tandem transplant had no impact on outcome. These observations support the use of IMiDs as pre-transplant induction therapy over historical VAD chemotherapy.


1991 ◽  
Vol 65 (05) ◽  
pp. 504-510 ◽  
Author(s):  
Raffaele De Caterina ◽  
Rosa Sicari ◽  
Walter Bernini ◽  
Guido Lazzerini ◽  
Giuliana Buti Strata ◽  
...  

Summary Ticlopidine (T) and aspirin (ASA) are two antiplatelet drugs, both capable of prolonging bleeding time (BT), with different mechanisms of action. A synergism in BT prolongation has been reported and is currently considered an argument against recommending their combination. However, a profound suppression of platelet function might be a desirable counterpart of a marked prolongation of BT, with a possible use in selected clinical situations. We therefore studied ex vivo platelet function (aggregation by ADP 0.5-1-2.5 μM; adrenaline 0.75-2.5 μM; collagen 1.5-150 μg/ml; arachidonic acid 1 mM; PAF 1 μM; adrenaline 0.17 μM + ADP 0.62 μM; serum thromboxane B2 (TXB2) generation) and BT (Mielke) in 6 patients with stable coronary artery disease receiving such a combination. Patients underwent sequential laboratory evaluations at baseline, after 7 days of T 250 mg b.i.d., before and after the intravenous administration of ASA 500 mg, respectively, and, finally, after a minimum of 7 days of sole ASA oral administration (50 mg/day). The experimental design, therefore, allowed a comparison of T and ASA effects (2nd and 4th evaluation), and an assessment of the combination effect (3rd evaluation). Platelet aggregation in response to all doses of ADP was depressed more by T than by ASA. Conversely, responses to adrenaline and arachidonate were affected more by ASA than by T. For all other agents, differences were not significant. The T + ASA combination was more effective (p <0.05) than either treatment alone in depressing responses to high-dose collagen (% over control, mean ± SEM: T: 95 ± 3; ASA: 96 ± 5; T + ASA: 89 ± 4). Serum TXB2 (basal, ng/ml: 380 ± 54) did not change with T (372 ± 36), dropped to <1 ng/ml on ASA injection and slightly re-increased to 9.1 ± 3.1 ng/ml on oral low-dose ASA. BT (basal 7.4 ± 0.6 min) was affected similarly by T (9.2 ± 0.8) or ASA (9.7 ± 0.9) alone, but increased to 15.0 ± 0.7 min on combination treatment (106% increase over control).
Thus, the strong synergism in BT prolongation by ASA-T combination has a counterpart in the inhibition of platelet function in response to strong stimuli such as high-dose collagen, not otherwise affected significantly by single-drug treatment. This effect is a possible rationale for the clinical evaluation of T + ASA combination.


2018 ◽  
Vol 15 (1) ◽  
pp. 55-72
Author(s):  
Herlin Hamimi ◽  
Abdul Ghafar Ismail ◽  
Muhammad Hasbi Zaenal

Zakat, one of the five pillars of Islam, has faith-based, social and economic functions. Muslims who are able to pay zakat are required to give at least 2.5 per cent of their wealth. Poverty is prevalent in disadvantaged regions because difficult access to information and communication has led to a very wide gap in wealth and resources. The instrument of zakat provides a paradigm for achieving equitable wealth distribution and healthy circulation. Zakat potentially offers a better life and improves human quality, not only in economic terms but also in spiritual terms, such as improved religiosity. This study aims to examine the role of zakat in alleviating humanitarian issues in disadvantaged regions such as Sijunjung, one of the zakat-beneficiary and impoverished areas in Indonesia. The researchers applied the CIBEST method to capture the impact on zakat beneficiaries, in material and spiritual terms, before and after becoming members of the Zakat Community Development (ZCD) Program. The overall analysis shows that zakat has a positive impact on the development of disadvantaged regions and enhances the quality of life of the community. Average mustahik household incomes improved after becoming members of the ZCD Program. The CIBEST model demonstrates that the material, spiritual and absolute poverty indices decreased by 10, 5 and 6 per cent, respectively, while the welfare index increased by 21 per cent. These findings have significant implications for improving the quality of life in disadvantaged regions in Sijunjung. Zakat is therefore one of the instruments for raising the status of disadvantaged areas to the level of other areas.
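The CIBEST model places each household in one of four quadrants by comparing its material condition against a material poverty line and its spiritual score against a spiritual threshold. The following is a minimal sketch of that quadrant logic only; the default spiritual threshold of 3 (on the model's 1-5 scale) and all variable names and example values are assumptions, not figures from this study:

```python
def cibest_quadrant(income, material_line, spiritual_score, spiritual_line=3.0):
    """Place a household in a CIBEST quadrant.

    income          -- household income (any consistent currency unit)
    material_line   -- material poverty line for that household
    spiritual_score -- household spiritual score (1-5 scale assumed)
    spiritual_line  -- spiritual threshold (3.0 assumed)
    """
    material_ok = income >= material_line
    spiritual_ok = spiritual_score >= spiritual_line
    if material_ok and spiritual_ok:
        return "welfare"
    if spiritual_ok:              # materially poor only
        return "material poverty"
    if material_ok:               # spiritually poor only
        return "spiritual poverty"
    return "absolute poverty"     # poor on both dimensions

# Hypothetical household: income below the line, spiritual score above it
print(cibest_quadrant(income=900, material_line=1200, spiritual_score=3.6))
```

The study's reported index changes (material, spiritual and absolute poverty down; welfare up) correspond to households migrating between these quadrants after joining the ZCD Program.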

