optimal time point
Recently Published Documents

TOTAL DOCUMENTS: 44 (FIVE YEARS: 21)
H-INDEX: 7 (FIVE YEARS: 1)
2021 · Vol 8 (1)
Author(s): Kristian Valind, Jonas Jögi, David Minarik, Gustav Brolin, Elin Trägårdh

Abstract
Background: In prostate-specific membrane antigen (PSMA) positron emission tomography with computed tomography (PET-CT), there is significant renal uptake. The standard in renal cortical functional imaging is scintigraphy with technetium-99m labeled dimercaptosuccinic acid (DMSA). Using [68Ga]Ga-PSMA-11 PET for renal imaging has been suggested, but [18F]PSMA-1007 has not been explored. The aims of this study were to establish the optimal time point for renal imaging after [18F]PSMA-1007 injection, to investigate the reproducibility of split renal uptake measurements, and to determine the margin for reduction in administered activity.
Methods: Twelve adult male patients with prostate cancer underwent [18F]PSMA-1007 PET-CT at 8 time points up to 5.5 h post-injection (p.i.). List-mode data were binned to durations of 10 to 120 s per bed position (bp). The left renal percentage of total renal uptake (LRU%) was measured, and the difference between the highest and lowest measurement per patient ("delta max") was calculated. Images acquired at 1 h, 2 h, and 5.5 h p.i. with durations of 10 to 120 s/bp were rated for image quality.
Results: Imaging at 2 h p.i. with 60 s/bp yielded acceptable quality in all cases. Increasing the acquisition time to 15 min for a single bp would allow reducing the administered activity to 0.27 MBq/kg, resulting in an effective dose of 0.4 mSv for a 1-year-old child weighing 10 kg. The median delta max of LRU% measurements was 2.7% (range 1.8–7.3%).
Conclusions: Renal [18F]PSMA-1007 PET-CT is feasible, with imaging at 2 h p.i., acceptable split renal uptake variability, and an effective dose and acquisition time comparable to those of [99mTc]Tc-DMSA scintigraphy.
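The split-uptake metrics and dose figures in this abstract follow from simple arithmetic. As a purely illustrative aid (not the authors' analysis code), the Python sketch below shows how LRU%, "delta max", and the weight-based activity reduction could be computed; the serial kidney-uptake values are hypothetical placeholders, and only the 0.27 MBq/kg, 10 kg, and 0.4 mSv figures come from the abstract.

```python
# Minimal sketch (not the authors' code): split renal uptake metrics and the
# dose arithmetic quoted in the abstract. Kidney uptake values below are
# hypothetical placeholders.

def left_renal_percentage(left_uptake: float, right_uptake: float) -> float:
    """LRU%: left kidney uptake as a percentage of total renal uptake."""
    return 100.0 * left_uptake / (left_uptake + right_uptake)

def delta_max(lru_measurements: list[float]) -> float:
    """'Delta max': spread between the highest and lowest LRU% per patient."""
    return max(lru_measurements) - min(lru_measurements)

# Dose arithmetic from the abstract: 0.27 MBq/kg in a 10 kg 1-year-old child.
weight_kg = 10
activity_mbq = 0.27 * weight_kg                      # = 2.7 MBq administered
effective_dose_msv = 0.4                             # reported effective dose
implied_coeff = effective_dose_msv / activity_mbq    # ~0.15 mSv/MBq (implied)

# Hypothetical serial LRU% readings for one patient:
lru_series = [48.9, 49.6, 51.2, 50.1]
print(f"delta max = {delta_max(lru_series):.1f} percentage points")
print(f"administered activity = {activity_mbq:.1f} MBq, "
      f"implied dose coefficient ~ {implied_coeff:.2f} mSv/MBq")
```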


2021 · Vol 17 (7) · pp. 609-610
Author(s): Yusuke Watanabe, Yoshinobu Murasato, Nobuaki Suzuki, Ken Kozuma

2021
Author(s): Dongdong Xia, Qiuhe Wang, Wei Bai, Enxin Wang, Zhexuan Wang, ...

Abstract
Background: The objective response rate (ORR) under mRECIST criteria after transarterial chemoembolization (TACE) has been identified as a surrogate endpoint of overall survival (OS). However, its optimal time point remains controversial and may be influenced by tumor burden. We aimed to investigate the surrogacy of initial/best ORR in relation to tumor burden.
Methods: A total of 1549 eligible treatment-naïve patients with unresectable hepatocellular carcinoma (HCC), Child-Pugh score ≤7, and performance status score ≤1 undergoing TACE between January 2010 and May 2016 at 17 academic hospitals were retrospectively analyzed.
Results: Both initial and best ORRs interacted with tumor burden as defined by our previously proposed "six-and-twelve" criteria. Both initial and best ORRs equivalently predicted and correlated with OS in the low (adjusted HR: 2.55 and 2.95, respectively, both P<0.001; R=0.84, P=0.035 and R=0.97, P=0.002, respectively) and intermediate tumor burden strata (adjusted HR: 1.81 and 2.22, respectively, both P<0.001; R=0.74, P=0.023 and R=0.90, P=0.002, respectively). In the high tumor burden stratum, only best ORR exhibited qualified prognostic value (adjusted HR: 2.61, P<0.001) with a satisfactory correlation (R=0.70, P=0.035), whereas initial ORR did not (adjusted HR: 1.08, P=0.357; R=0.22, P=0.54).
Conclusions: ORR after TACE as a surrogate of OS is closely associated with tumor burden. For patients with low or intermediate tumor burden, initial ORR should be preferred over best ORR because of its earlier availability and similar sensitivity, whereas for patients with high tumor burden, best ORR has optimal sensitivity. The timing of OR assessment should be tailored to tumor burden in future clinical trials and practice.
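As an illustration of the kind of stratified analysis described here (not the study's actual code), the sketch below fits a Cox model of OS on response status within each "six-and-twelve" tumor-burden stratum using the lifelines package; the column names, adjustment covariates, and binary responder indicator are hypothetical placeholders.

```python
# Illustrative sketch only: adjusted hazard ratio of objective response for OS,
# estimated separately within each tumor-burden stratum. Column names and
# covariates are hypothetical and not taken from the study.
import pandas as pd
from lifelines import CoxPHFitter

def hr_by_stratum(df: pd.DataFrame) -> dict:
    """Fit a Cox model of OS on initial response within each tumor-burden stratum."""
    results = {}
    for stratum, sub in df.groupby("tumor_burden"):   # e.g. 'low'/'intermediate'/'high'
        cph = CoxPHFitter()
        cph.fit(
            sub[["os_months", "death", "initial_responder", "child_pugh", "log_afp"]],
            duration_col="os_months",
            event_col="death",
        )
        # hazard_ratios_ is a Series of exp(coef) indexed by covariate name
        results[stratum] = float(cph.hazard_ratios_["initial_responder"])
    return results
```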


Acarologia · 2021 · Vol 61 (2) · pp. 403-411
Author(s): David Rodríguez, Ricardo Palacios, Jorge Martínez, Jorge A. Guisantes, Idoia Postigo

Currently, several mite growth culture media used in the production of allergenic extracts contain animal-derived components that limit their use in diagnostic and/or therapeutic applications. The aim of this study was to evaluate the growth of D. pteronyssinus and D. farinae mites in a semi-synthetic medium without animal-derived proteins, in order to produce highly reproducible allergenic extracts for diagnostic and therapeutic purposes that are more consistent with the regulations of health authorities. Both species of mites showed optimal growth in the semi-synthetic culture medium. The highest expression of the allergens Der p 1 and Der f 1 was observed in the last phases of mite growth. Semi-synthetic media without animal-derived proteins thus supported excellent growth rates of house dust mites in culture. Adjusting the cultivation time to identify the optimal time point for processing the extracts is therefore decisive.


Author(s): Anne-Sophie Schuurman, Anirudh Tomer, K. Martijn Akkerhuis, Ewout J. Hoorn, Jasper J. Brugts, ...

Abstract
Background: High mortality and rehospitalization rates demonstrate that improving risk assessment in heart failure patients remains challenging. The individual temporal evolution of kidney biomarkers is associated with poor clinical outcome in these patients and hence may carry the potential to move towards a personalized screening approach.
Methods: In 263 chronic heart failure patients included in the prospective Bio-SHiFT cohort study, glomerular and tubular biomarker measurements were serially obtained according to a pre-scheduled, fixed trimonthly scheme. The primary endpoint (PE) comprised cardiac death, cardiac transplantation, left ventricular assist device implantation, or heart failure hospitalization. Personalized scheduling of glomerular and tubular biomarker measurements was compared with fixed scheduling in individual patients by means of a simulation study based on the clinical characteristics of the Bio-SHiFT study. For this purpose, the repeated biomarker measurements and the PE were jointly modeled. For personalized scheduling, the fitted joint model was used to determine the optimal time point of the next measurement from the patient's individual risk profile and the maximum information gain on the patient's prognosis. We compared each schedule's capability of enabling timely intervention before the occurrence of the PE and the number of measurements needed.
Results: Compared with the pre-defined trimonthly scheduling approach, personalized scheduling of glomerular and tubular biomarker measurements showed similar performance with regard to prognostication but required a median of 0.4–2.7 fewer measurements per year.
Conclusion: Personalized scheduling is expected to reduce the number of patient visits and healthcare costs. Thus, it may contribute to efficient monitoring of chronic heart failure patients and could provide novel opportunities for timely adaptation of treatment.
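The scheduling rule described above, choosing the next measurement time from the patient's individual risk profile and the expected information gain, can be sketched conceptually. The Python sketch below is purely illustrative and is not the authors' implementation; the risk and information-gain functions are hypothetical placeholders standing in for predictions from the fitted joint model.

```python
# Conceptual sketch of personalized measurement scheduling. The two prediction
# functions are hypothetical stand-ins for joint-model output, not the study's
# actual model.
import numpy as np

def predicted_risk(history: list[float], t: float) -> float:
    """Placeholder: predicted risk of the PE by time t, given the biomarker
    history observed so far (would come from the fitted joint model)."""
    return 1.0 - np.exp(-0.01 * t * (1.0 + 0.1 * np.mean(history)))

def expected_information_gain(history: list[float], t: float) -> float:
    """Placeholder: expected reduction in prognostic uncertainty if the next
    measurement is taken at time t."""
    return np.log1p(t) * (1.0 - predicted_risk(history, t))

def next_measurement_time(history: list[float],
                          candidate_times: np.ndarray,
                          risk_threshold: float = 0.10) -> float:
    """Pick the candidate time that maximizes information gain while keeping
    the predicted risk of the PE below the chosen threshold."""
    safe = [t for t in candidate_times if predicted_risk(history, t) < risk_threshold]
    if not safe:                      # risk already too high: measure immediately
        return float(candidate_times[0])
    return float(max(safe, key=lambda t: expected_information_gain(history, t)))

# Hypothetical usage: months ahead as candidate follow-up times.
history = [1.2, 1.4, 1.9]             # hypothetical serial biomarker values
candidates = np.arange(1, 13)         # 1 to 12 months ahead
print(next_measurement_time(history, candidates))
```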


Author(s): Soha Hassan, Amira Ali, Dennis Sohn, Uli Floegel, Reiner Jaenicke, ...

This study investigates whether chronotherapeutic treatment of hepatocellular carcinoma (HCC) may improve treatment efficacy and mitigate side effects on the healthy liver (HL). HCC was induced in Per2::luc mice, which were irradiated at four time points of the day. Proliferation and DNA double-strand breaks were investigated in irradiated and non-irradiated organotypic slice cultures (OSC) and ex vivo samples by detection of Ki67 and γ-H2AX. OSC proved useful for determining dose-dependent effects on proliferation and DNA damage but appeared unsuited for testing the chronotherapeutic approach. Irradiation of ex vivo samples was most effective at the proliferation peaks of HCC, at ZT02 (early inactivity phase) and ZT20 (late activity phase). Irradiation effects on HL were minimal at ZT20. Ex vivo samples revealed disrupted daily variation and down-regulation of all investigated clock genes except Per1 in non-irradiated HCC compared with HL. Irradiation affected rhythmic clock gene expression in HL and HCC at all ZTs except ZT20. Irradiation at ZT20 had no effect on total leukocyte numbers. Our results indicate ZT20 as the optimal time point for irradiation of HCC in mice. Translational studies are now needed to evaluate whether the late activity phase is the optimal time point for irradiation of HCC in humans.


Author(s): Ricardo Rio-Tinto, Jorge Canena

Postcholecystectomy leaks may occur in 0.3–2.7% of patients. Bile leaks associated with laparoscopy are often more complex and difficult to treat than those occurring after open cholecystectomy. Furthermore, their incidence has remained unchanged despite improvements in laparoscopic training and technological developments. The management of biliary leaks has evolved from surgery into a minimally invasive endoscopic procedural approach, namely, endoscopic retrograde cholangiopancreatography (ERCP), which decreases or eliminates the pressure gradient between the bile duct and the duodenum, thus creating a preferential transpapillary bile flow and allowing the leak to seal. For simple leaks, the success rate of endotherapy is remarkably high. However, there are more severe and complex leaks that require multiple endoscopic interventions, and clear strategies for endoscopic treatment have not emerged. Therefore, there is still some debate regarding the optimal time point at which to intervene, which technique to use (sphincterotomy alone or in association with the placement of stents, whether metallic or plastic stents should be used, and, if plastic stents are used, whether they should be single or multiple), how long the stents should remain in place, and when to consider treatment failure. Here, we review the types and classification of postoperative biliary injuries, particularly leaks, as well as the evidence for endoscopic treatment of the latter.

