wear time
Recently Published Documents

TOTAL DOCUMENTS: 198 (FIVE YEARS: 83)
H-INDEX: 24 (FIVE YEARS: 4)

Author(s): Chrisanda Sanchez, Jennifer Coto, Daniela Berrios, Ivette Cejas

Purpose: This study examined changes in datalogging for children attending an auditory–oral educational program with integrated audiology services versus children attending a mainstream or nonspecialized program. Method: Eighty children participated in this study, half of whom were enrolled in the auditory–oral educational program and half in a nonspecialized or mainstream setting. Datalogging for cochlear implant and hearing aid users was obtained via retrospective medical and educational chart review from 2016 to 2019. Results: At post-enrollment, children attending the auditory–oral educational program had significantly increased device wear time (measured as average hours/day) compared to the control group. Children using hearing aids who were enrolled in the specialized educational program showed the largest improvement in overall wear time, averaging an increase of 5 hr/day of device use from pre- to post-enrollment. Conclusions: This is the first study to document the effect of specialized educational programs on device use. Clinical and educational programs should collaborate to provide integrated services to lessen family burden and increase a child's device use and retention.
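The group comparison above reduces to a pre-/post-enrollment change in logged hours per day. A minimal sketch with invented group means (the study reports only the 5 hr/day gain, not these exact numbers):

```python
# Hypothetical datalogging means (hours/day); values are illustrative only.
pre  = {"specialized": 6.0, "mainstream": 6.5}   # at enrollment
post = {"specialized": 11.0, "mainstream": 7.0}  # at follow-up

# Pre-to-post change in average daily wear time, per group.
change = {group: post[group] - pre[group] for group in pre}
assert change["specialized"] > change["mainstream"]
```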


2021, Vol. Publish Ahead of Print
Author(s): Samir Sabharwal, Richard L. Skolasky, Jason M. Souza, Benjamin K. Potter, Jonathan A. Forsberg

2021, Vol. 23 (Supplement_4), pp. iv3-iv3
Author(s): Seema Dadhania, Lillie Pakzad-Shahabi, Kerlann Le Calvez, Waqar Saleem, James Wang, ...

Abstract Aims In patients with high-grade glioma (HGG), we know that quality of life (QoL) and physical function decline with progressive disease (PD), and fatigue is a strong predictor of survival in recurrent disease. Despite notable technical advances in therapy in the past decade, survival has not improved. The role of physical function as a predictor of QoL and treatment tolerance, and as an early indicator of worsening morbidity (e.g. tumour recurrence), is an area of growing importance. Recent advances in wearable technology give us the opportunity to gather high-quality, continuous and objective data. BrainWear is a feasibility study collecting longitudinal physical activity (PA) data from patients with primary and secondary brain tumours; we hypothesise that changes in PA over time are a potentially sensitive biomarker for PD both at diagnosis and at relapse. Method Here we show early analysis of this novel dataset of 42 HGG patients and will present: 1) feasibility and acceptability; 2) how digitally captured PA changes through treatment and at PD/hospitalization; 3) the correlation between patient-reported outcomes (PROs) and PA data; 4) how PA in HGG patients compares with healthy UK Biobank participants. PA data are collected via a wrist-worn accelerometer. Raw accelerometer data are processed using the UK Biobank Accelerometer Analysis pipeline in Python 3.7 and evaluated for good-quality wear time. Overall activity is represented as vector magnitude in milligravity units (mg), and a machine-learning classifier assigns daily activity to five groups (walking, tasks-light, moderate, sedentary and sleep). Descriptive statistics summarise baseline characteristics, and unadjusted means are used to present vector magnitude and accelerometer-predicted functional behaviours (in h/day) by age, sex, radiotherapy and weekend days. Mixed-effects models for repeated measures are used for longitudinal evaluation of PA.
Results Between October 2018 and March 2021, 42 patients with a suspected HGG were recruited: 16 females and 26 males with a median age of 59. Forty patients had surgery and 35 had adjuvant primary radiotherapy, 23 of whom had a 6-week course. They have provided 3458 days of accelerometer data, 80% of which has been classified as good-quality wear time. There are no statistical differences in mean activity between genders; patients >60 years show a statistical difference in time spent doing moderate activity compared to those <60 years; and there are significant differences in mean vector magnitude and walking between radiotherapy and non-radiotherapy days. In patients having a 6-week RT course, time spent in daily moderate activity falls 4-fold between week 1 and the second week following RT completion (70 minutes to 16 minutes). Comparing HGG patients with healthy UK Biobank participants shows significant differences in all measures of PA. Conclusion Here we present preliminary analysis of this highly novel dataset in adult high-grade glioma patients and show that digital remote health monitoring is feasible and acceptable, with 80% of data classified as high-quality wear time, suggesting good patient adherence. We are able to objectively describe how PA changes through standard treatments, understand the inter- and intra-patient variation in PA, and ask whether there are correlates with patient-centred measures, clinical measures and early indicators of worsening disease. We will present further data on changes in PA prior to hospitalisation and at disease progression, and discuss some of the challenges of running a digital health trial. The passive and objective nature of wearable activity monitors gives clinicians the opportunity to evaluate and monitor the patient in motion, rather than the episodic snapshot we currently see, and in turn has the potential to improve our clinical decision making and, potentially, outcomes.
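The overall-activity measure named in the methods (vector magnitude in mg) is conventionally the Euclidean norm of the triaxial signal minus 1 g of gravity. A simplified illustration of that convention, not the UK Biobank pipeline's actual code:

```python
import math

def vector_magnitude_mg(x, y, z):
    """Overall activity for one (x, y, z) sample in g: Euclidean norm minus
    1 g of gravity, truncated at zero, expressed in milligravity units (mg)."""
    enmo_g = max(math.sqrt(x * x + y * y + z * z) - 1.0, 0.0)
    return enmo_g * 1000.0

# A stationary wrist reads ~1 g of total acceleration, i.e. ~0 mg of activity.
assert vector_magnitude_mg(0.0, 0.0, 1.0) == 0.0
```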


Author(s): Jourdan T. Holder, René H. Gifford

Purpose Despite the recommendation for cochlear implant (CI) processor use during all waking hours, variability in average daily wear time remains high. Previous work has shown that objective wear time is significantly correlated with speech recognition outcomes. We aimed to investigate the causal link between daily wear time and speech recognition outcomes and assess one potential underlying mechanism, spectral processing, driving the causal link. We hypothesized that increased CI use would result in improved speech recognition via improved spectral processing. Method Twenty adult CI recipients completed two study visits. The baseline visit included auditory perception testing (speech recognition and spectral processing measures), questionnaire administration, and documentation of data logging from the CI software. Participants watched an educational video, and they were informed of the compensation schedule. Participants were then asked to increase their daily CI use over a 4-week period during everyday life. Baseline measures were reassessed following the 4-week period. Results Seventeen out of 20 participants increased their daily CI use. On average, participants’ speech recognition improved by 3.0, 2.4, and 7.0 percentage points per hour of increased average daily CI use for consonant–nucleus–consonant words, AzBio sentences, and AzBio sentences in noise, respectively. Questionnaire scores were similar between visits. Spectral processing showed significant improvement and accounted for a small amount of variance in the change in speech recognition values. Conclusions Improved consistency of processor use over a 4-week period yielded significant improvements in speech recognition scores. Though a significant factor, spectral processing is likely not the only mechanism driving improvement in speech recognition; further research is warranted.
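The per-hour gains reported above are effectively slopes of score change against change in daily use. A minimal least-squares sketch with invented numbers (the study's per-participant data are not reproduced here):

```python
def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical participants: change in average daily CI use (h/day) and the
# corresponding change in word recognition (percentage points).
delta_hours = [1.0, 2.0, 3.0, 4.0]
delta_score = [3.0, 6.0, 9.0, 12.0]
assert abs(slope(delta_hours, delta_score) - 3.0) < 1e-9
```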


2021, pp. 1-6
Author(s): Nicholas P. Giuliani

Purpose A retrospective analysis was conducted to explore how tinnitus, one or more neurologic conditions, unaided speech intelligibility index, and other comorbidities impact the average number of hours hearing aids are worn each day by U.S. Military Veterans. Method Medical records and a hearing aid database were queried to obtain information regarding active medical problems and average daily hearing aid wear time. Multiple linear regression was used to explore these relationships for 215 male Veterans whose records were available from 2009 to 2020. To be analyzed, Veterans must have possessed their hearing aid(s) for at least 3 consecutive months. Results An active problem of subjective tinnitus was associated with increased hearing aid wear time (positive association), and one or more active neurologic conditions were associated with decreased hearing aid wear time (negative association). A high unaided speech intelligibility index (greater access to speech sounds without hearing aids) was also associated with decreased hearing aid wear time (negative association). Conclusions Many complex audiologic and medical concerns may affect hearing aid wear time in U.S. Military Veterans. The associations identified here, and their severity, should therefore be explored prospectively to confirm their effect on hearing aid wear time. The information from this and future studies may lead to clinical recommendations with the goal of increasing daily hearing aid use in this and other populations.
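The multiple linear regression described above can be sketched as follows. The predictors mirror the abstract (tinnitus, neurologic conditions, unaided SII), but the data are simulated with the reported signs of association baked in, and the solver is a plain normal-equations implementation rather than the study's statistical package:

```python
import random

def lstsq(X, y):
    """Ordinary least squares via the normal equations (X'X b = X'y),
    solved by Gaussian elimination with partial pivoting.
    Educational sketch, not production code."""
    k = len(X[0])
    # Build the augmented normal-equation matrix [X'X | X'y].
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)]
         + [sum(row[i] * yi for row, yi in zip(X, y))] for i in range(k)]
    for i in range(k):
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            A[r] = [a - f * b for a, b in zip(A[r], A[i])]
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (A[i][k] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta

# Simulated records with the abstract's directions of effect: tinnitus raises
# wear time; neurologic conditions and a high unaided SII lower it.
# Coefficients and noise level are invented.
random.seed(0)
X, y = [], []
for _ in range(200):
    tin = random.randint(0, 1)    # subjective tinnitus (0/1)
    neuro = random.randint(0, 1)  # one or more neurologic conditions (0/1)
    sii = random.random()         # unaided speech intelligibility index
    X.append([1.0, tin, neuro, sii])
    y.append(8.0 + 1.5 * tin - 2.0 * neuro - 3.0 * sii + random.gauss(0, 0.5))

b0, b_tin, b_neuro, b_sii = lstsq(X, y)
assert b_tin > 0 and b_neuro < 0 and b_sii < 0  # signs match the abstract
```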


2021, Vol. 23 (Supplement_2), pp. ii13-ii14
Author(s): S Dadhania, L Pakzad-Shahabi, S Mistry, K Le-Calvez, W Saleem, ...

Abstract BACKGROUND In patients with high-grade glioma (HGG), QoL and physical function decline with progressive disease (PD). Objective assessment of physical functioning is challenging as patients spend most of their time away from the hospital. Wearable technology allows measurement of objective, continuous activity data in a non-obtrusive manner. BrainWear is a phase II feasibility study collecting longitudinal physical activity (PA) data from patients with primary and secondary brain tumours. MATERIAL AND METHODS All participants agreed to wear an Axivity AX3 triaxial accelerometer and completed the EORTC QLQ-C30 and BN20, the Montreal Cognitive Assessment (MoCA) and the Multidimensional Fatigue Inventory (MFI) questionnaires. Accelerometers were changed at 14-day intervals, and PRO questionnaires were completed at pre-specified study intervals. Age- and sex-matched controls were identified from the UK Biobank 7-day accelerometer study. Raw accelerometer data were processed using the UK Biobank accelerometer software; high-quality wear time was defined as ≥72 hours of data within a 7-day collection, with data in each 1-hour period of a 24-hour cycle over multiple days. We analysed variation in activity by patient demographics and treatment days. The Wilcoxon signed-rank test was used to compare participant activity between radiotherapy treatment days and non-treatment days, mixed-effects models were used to evaluate longitudinal changes in activity, and k-means clustering was used to characterise clusters of PA behaviours. RESULTS We have collected 3458 days of accelerometer data from 42 HGG patients with a median age of 59, 80% of which has been classified as high quality. Patients >60 years spend more time doing moderate activity than those <60 years (52 vs 33 minutes/day, p=0.012), and there are significant differences in mean vector magnitude (17.12 vs 16.85 mg, p=0.013) and walking (91 vs 72 minutes/day) between radiotherapy and non-radiotherapy days.
In patients having a 6-week RT course, time spent in daily moderate activity falls 4-fold between week 1 and the second week after RT completion (70 minutes to 16 minutes/day). Comparing HGG patients to healthy controls shows a significant difference in time spent across all activities (p<0.05). K-means clustering shows three distinct clusters, with 87% of HGG patients falling into the very inactive or moderately active groups. CONCLUSION Digital remote health monitoring is feasible and acceptable, with 80% of data classified as high-quality wear time, suggesting good patient adherence. Triaxial accelerometer data capture objective evidence of a significant reduction in moderate daily activity at the time of expected peak RT side-effects, and patients walk almost 30% less on non-RT days. HGG patients show significantly lower levels of activity compared to matched healthy controls.
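The paired comparison of treatment vs non-treatment days uses a Wilcoxon signed-rank test; a minimal exact version for a small paired sample can be sketched in the standard library. The walking-minute values below are invented for illustration, not study data:

```python
from itertools import product

def wilcoxon_exact(xs, ys):
    """Two-sided exact Wilcoxon signed-rank test for small paired samples:
    rank |differences| (average ranks for ties), sum the ranks of positive
    differences, and enumerate all sign assignments for the null distribution."""
    diffs = [x - y for x, y in zip(xs, ys) if x != y]
    ordered = sorted(abs(d) for d in diffs)
    # Average rank for each tied |difference| value.
    rank = {v: (ordered.index(v) + 1 + len(ordered) - ordered[::-1].index(v)) / 2
            for v in set(ordered)}
    ranks = [rank[abs(d)] for d in diffs]
    w = sum(r for r, d in zip(ranks, diffs) if d > 0)
    null = [sum(r for r, keep in zip(ranks, signs) if keep)
            for signs in product([0, 1], repeat=len(ranks))]
    if w >= sum(ranks) / 2:
        p_one = sum(v >= w for v in null) / len(null)
    else:
        p_one = sum(v <= w for v in null) / len(null)
    return w, min(1.0, 2 * p_one)

# Hypothetical daily walking minutes: RT days vs non-treatment days.
rt_days     = [95, 88, 102, 90, 86, 99, 93, 85, 97, 91]
non_rt_days = [70, 65, 80, 72, 68, 75, 71, 66, 74, 69]
stat, p = wilcoxon_exact(rt_days, non_rt_days)
assert p < 0.05  # every pair moves the same way -> significant
```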


2021, Vol. 10 (17), pp. 3811
Author(s): Boldizsar Kovacs, Haran Burri, Andres Buehler, Sven Reek, Christian Sticherling, ...

Background: The wearable cardioverter defibrillator (WCD) uses surface electrodes to detect arrhythmia before initiating a treatment sequence. However, it is also prone to inappropriate detection due to artefacts. Objective: The aim of this study is to assess the alarm burden in patients and its impact on clinical outcomes. Methods: Patients from the nationwide Swiss WCD Registry were included. Clinical characteristics and data were obtained from the WCDs. Arrhythmia recordings ≥30 s in length were analysed and categorized as VT/VF, atrial fibrillation (AF), supraventricular tachycardia (SVT) or artefact. Results: A total of 10,653 device alarms were documented in 324 of 456 patients (71.1%) over a mean WCD wear time of 2.0 ± 1.6 months. Episode duration was 30 s or more in 2996 alarms (28.2%). One hundred and eleven (3.7%) were VT/VF episodes. The remaining recordings were inappropriate detections (2736 (91%) due to artefacts; 117 (3.7%) AF; 48 (1.6%) SVT). Two hundred and seven patients (45%) had three or more alarms per month. Obesity was significantly associated with three or more alarms per month (p = 0.01, 27.7% vs. 15.9%). A high alarm burden was not associated with a lower average daily wear time (20.8 h vs. 20.7 h, p = 0.785) or a decreased implantable cardioverter defibrillator implantation rate after stopping WCD use (48% vs. 47.3%, p = 0.156). Conclusions: In patients using WCDs, alarms emitted by the device and impending inappropriate shocks were frequent and most commonly caused by artefacts. A high alarm burden was associated with obesity but did not lead to decreased adherence.
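Computing a per-patient alarm burden from categorised episodes, as described above, is a simple aggregation. A hedged sketch with hypothetical patient IDs and episodes (the registry data are not reproduced):

```python
from collections import Counter

# Hypothetical (patient, category) episode log and per-patient wear months.
episodes = [
    ("pt01", "artefact"), ("pt01", "artefact"), ("pt01", "AF"),
    ("pt02", "VT/VF"),
    ("pt03", "artefact"), ("pt03", "SVT"), ("pt03", "artefact"), ("pt03", "artefact"),
]
wear_months = {"pt01": 1.0, "pt02": 2.0, "pt03": 1.0}

# Flag the "high burden" group used in the abstract: >= 3 alarms per month.
alarms = Counter(pt for pt, _ in episodes)
high_burden = {pt for pt, n in alarms.items() if n / wear_months[pt] >= 3}
assert high_burden == {"pt01", "pt03"}
```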


2021
Author(s): Larissa C Hunt, Josef Fritz, Michael Herf, Céline Vetter

Wearable light sensors are increasingly used in intervention and population-based studies investigating the consequences of environmental light exposure on human physiology. An important step in such analyses is the reliable detection of non-wear time. We observed in light data that days with less wear time also have lower variability in the light signal, and we sought to test whether the standard deviation of the change between subsequent samples can detect this condition. In this study, we propose and validate an easy-to-implement algorithm designed to discriminate between days with a non-wear time >4 h ('invalid days') vs. ≤4 h ('valid days'), and investigate to what extent values of commonly used, physiologically meaningful light variables differ between invalid days, valid days, and algorithm-selected non-wear days. We used 83 days of light data from a field study with high participant compliance, complemented by 47 days of light data where free-living individuals were instructed not to wear the sensor for varying amounts of time. Light data were recorded every two minutes using the pendant-worn f.luxometer light sensor; validity was derived from daily logs where participants recorded all non-wear time. The algorithm-derived score discriminated well between valid and invalid days (area under the curve (AUC): 0.77, 95% CI: 0.67-0.87). The best cut-off value (i.e., highest Youden index) correctly recognized valid days with a probability of 87% ('sensitivity') and invalid days with a probability of 63% ('specificity'). Values of various light variables derived from algorithm-selected days only (median: 264.3 (Q1: 153.6, Q3: 420.0) for 24-h light intensity (in lux); 496.0 (404.0, 582.0) for time spent above 50 lux) were close to those derived from self-reported valid days only. However, these values did not significantly differ when derived across all days compared to self-reported valid days.
Our results suggest that our proposed algorithm discriminates well between valid and invalid days. However, in high compliance cohorts, distortions in aggregated light measures of individual-level environmental light recordings across days appear to be small, making the application of our algorithm optional, but not necessary.
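A minimal sketch of the day-scoring idea under stated assumptions: the score is the standard deviation of successive differences in the light trace (flat, non-worn days score low), and the cut-off is chosen by maximising Youden's J over labelled days. The function names and synthetic lux values are ours, not the study's:

```python
import statistics

def day_score(lux_samples):
    """Variability score for one day: standard deviation of successive
    differences in the light signal. Flat (non-worn) days score low."""
    diffs = [b - a for a, b in zip(lux_samples, lux_samples[1:])]
    return statistics.pstdev(diffs)

def best_cutoff(scores, is_valid):
    """Cut-off maximising Youden's J (sensitivity + specificity - 1),
    where days scoring >= cutoff are classified as valid."""
    best_j, best_t = -1.0, None
    for t in sorted(set(scores)):
        tp = sum(s >= t and v for s, v in zip(scores, is_valid))
        fn = sum(s < t and v for s, v in zip(scores, is_valid))
        tn = sum(s < t and not v for s, v in zip(scores, is_valid))
        fp = sum(s >= t and not v for s, v in zip(scores, is_valid))
        j = tp / (tp + fn) + tn / (tn + fp) - 1.0
        if j > best_j:
            best_j, best_t = j, t
    return best_t

worn_day   = [120, 300, 80, 450, 200, 90, 600, 150]  # varying light exposure
unworn_day = [5, 5, 6, 5, 5, 6, 5, 5]                # sensor lying flat
assert day_score(worn_day) > day_score(unworn_day)
```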


2021
Author(s): Cansın Kutay, Hülya Kılıçoğlu, Gülşilay Sayar

ABSTRACT Objectives To assess objective compliance levels in skeletal Class II patients with mandibular retrognathia wearing monoblock and twin-block appliances. Materials and Methods A prospective clinical study was conducted with 30 patients between 10 and 15 years old, equally divided into two study groups. Group 1 was treated with monoblock and group 2 with twin-block appliances. The patients were instructed to wear their appliance for 15 hours per day. Wear times were monitored by a microsensor (TheraMon; MC Technology, Hargelsberg, Austria) for an average of six appointments. Patients were also instructed to record their wear times on a chart, and this record was taken as the subjective wear time. Statistical analysis was performed with the data derived from both the patients' charts and the monitoring records. Results The mean wear time was 10.67 ± 3.93 hours, less than the 15 hours prescribed by the orthodontist, with no difference between the two appliances (P > .05). The regular use rate, which included the days with a wear time of 8 hours or more per day, was 75%. Compliance levels decreased by 35% throughout the six control appointments. Patients declared wear times that exceeded their objective wear time by an average of 3.76 hours. Conclusions Despite their different designs, there was no significant difference between the monoblock and twin-block appliances in terms of compliance.
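The compliance measures reported above (mean wear time, regular-use rate, subjective over-reporting) can all be computed from per-day records. A sketch with invented numbers, not the study's 30-patient dataset:

```python
# Hypothetical per-day records for one patient: sensor-logged hours vs the
# hours the patient wrote on the chart.
objective_hours  = [12.0, 9.5, 7.0, 14.0, 8.0, 6.5, 10.0, 11.0]
subjective_hours = [15.0, 14.0, 12.0, 16.0, 13.0, 11.0, 15.0, 15.0]

n = len(objective_hours)
mean_wear = sum(objective_hours) / n
# "Regular use": fraction of days with >= 8 h of objective wear.
regular_use_rate = sum(h >= 8 for h in objective_hours) / n
# Average gap between self-reported and sensor-logged wear time.
overreport = sum(s - o for s, o in zip(subjective_hours, objective_hours)) / n

assert mean_wear < 15.0          # below the prescribed 15 h/day
assert regular_use_rate == 0.75
assert overreport > 0            # charts report more wear than the sensor
```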

