Semel Institute, University of California Los Angeles, CA, USA
Received: June 30, 2016; Accepted: August 02, 2016; Published: August 09, 2016
Citation: Arevian AC, Patel H, Chen ST (2016) Personalized Audio Assessment and Temporal Patterns of Dementia-Related Behavioral Disturbances. J Alzheimers Dis Parkinsonism 6:252. doi:10.4172/2161-0460.1000252
Copyright: ©2016 Arevian AC, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Keywords: Vocally disruptive behaviors; Agitation; Alzheimer’s disease; Audio recording; Automated event detection; Temporal dynamics; Assessment
Behavioral disturbances are common clinical features of many neuropsychiatric disorders, including dementia; they are challenging to assess and treat [1,2], diminish the quality of life of patients and caregivers [3] and precipitate institutionalization [4]. A key challenge in improving care and developing more effective interventions is the accurate assessment of behavioral disturbances [5], which at present is largely limited to observer-rated scales administered at a specific point in time. In this case report, we describe a novel method of assessing dementia-related vocally disruptive behaviors (VDB) [6] that uses audio recordings and automated analysis to objectively measure VDB, their temporal patterns and their response to treatment. This method of assessment could potentially be developed and implemented in care settings to more effectively measure specific types of behavioral disturbances and guide treatment for individual patients.
An 83-year-old man with a history of dementia and major depressive disorder was admitted to a university-based inpatient geriatric psychiatry unit for progressive VDB manifested as repeated loud grunting. His Mini-Mental State Examination score on admission was 21/30. Outpatient medications included citalopram 20 mg daily, lithium 450 mg nightly, olanzapine 15 mg nightly and buspirone 30 mg twice daily. During hospitalization, he received 12 electroconvulsive therapy (ECT) treatments along with olanzapine 5 mg three times daily, mirtazapine 30 mg nightly and trazodone 50 mg nightly. He was discharged on these medications as well as valproate 125 mg three times daily.
A key barrier identified in caring for this patient was the difficulty of assessing both symptom burden and response to treatment interventions (such as ECT or medication administration), making the selection and dosing of treatments challenging. To address this, we placed an inexpensive digital audio recorder (Sony ICD-PX820) in the patient’s breast pocket or on his nightstand to capture 19,605 min of audio over 18 days (76% of the total time). We then used off-the-shelf software (Song Scope, Wildlife Acoustics), designed to detect bird songs in nature [7], to train an automated audio classifier that identified a total of 3,179 behavioral events from the recordings. We assessed the accuracy of the algorithm by comparing its predicted counts with our manual counts of VDB events in 22 fifteen-minute samples, yielding a sensitivity (true positive rate) of 92% and a positive predictive value of 85%. We were then able to visualize the pattern of behaviors throughout the recording period (Figure 1A), showing decreased VDB over time. Interestingly, we observed a three-peak intraday pattern that was preserved throughout the period but with a reduction in amplitude in the second half of the recordings (Figure 1B). This temporal pattern was similar to the pattern of medication administration events for antipsychotic or sedating medications (Figure 1B, dotted line).
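To illustrate the validation step only (this is a minimal sketch, not the analysis code used in this report), automated detections can be matched against manually annotated events within the sampled windows and the aggregate true positives, false positives and false negatives used to derive sensitivity and positive predictive value. The 2-second matching tolerance and all function and variable names below are assumptions for illustration.

```python
# Sketch: comparing automated detections to manual annotations in sampled windows.
# Event times are assumed to be in seconds from the start of each sample window;
# the 2-second matching tolerance is an illustrative choice, not from the report.

def match_events(predicted, manual, tolerance_s=2.0):
    """Greedily match predicted events to manual events within a time tolerance."""
    unmatched_manual = sorted(manual)
    true_positives = 0
    for p in sorted(predicted):
        for m in unmatched_manual:
            if abs(p - m) <= tolerance_s:
                unmatched_manual.remove(m)
                true_positives += 1
                break
    false_positives = len(predicted) - true_positives
    false_negatives = len(manual) - true_positives
    return true_positives, false_positives, false_negatives

def summarize(samples):
    """Aggregate TP/FP/FN over all sample windows; return sensitivity and PPV."""
    tp = fp = fn = 0
    for predicted, manual in samples:
        a, b, c = match_events(predicted, manual)
        tp, fp, fn = tp + a, fp + b, fn + c
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    ppv = tp / (tp + fp) if (tp + fp) else 0.0
    return sensitivity, ppv

# Toy data for two 15-minute windows (times in seconds), for illustration only:
samples = [([12.0, 95.5, 300.2], [11.5, 96.0, 450.0]),
           ([60.0, 61.0], [60.3])]
sens, ppv = summarize(samples)
print(f"sensitivity={sens:.2f}, PPV={ppv:.2f}")
```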
Figure 1: Temporal patterns of behavioral activity. A) Count of vocalization events detected by the automated classifier, binned into 15-min intervals and plotted across all recorded days. B) 2 h moving average of the count of vocalization events per 15-min interval, shown separately for the first half of recordings (days 1-10, grey line) and the second half (days 11-18, black line), together with the count of medication administration events (dotted line).
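Summaries of the kind shown in Figure 1 can in principle be reproduced from a list of detection timestamps. The sketch below, offered under the assumption that event timestamps are available as a pandas datetime series (variable names, the day-11 split and the 8-bin smoothing window are illustrative choices consistent with the figure description), bins events into 15-min intervals and computes a 2 h moving average by time of day for each half of the recording period.

```python
# Sketch: binning detected vocalization events and computing a 2 h moving average,
# analogous to Figure 1. `event_times` is assumed to be a sequence of detection
# timestamps; all names are illustrative.
import pandas as pd

def temporal_summary(event_times, split_day=11):
    events = pd.Series(1, index=pd.DatetimeIndex(event_times)).sort_index()

    # Panel A analogue: event counts per 15-minute bin across the recording period.
    counts_15min = events.resample("15min").sum()

    # Panel B analogue: 2-hour moving average (8 x 15-minute bins), then averaged
    # by time of day, separately for the first and second halves of the recordings.
    smoothed = counts_15min.rolling(window=8, min_periods=1).mean()
    day_number = (smoothed.index.normalize() - smoothed.index.normalize().min()).days + 1
    halves = {}
    for label, mask in [("first_half", day_number < split_day),
                        ("second_half", day_number >= split_day)]:
        subset = smoothed[mask]
        halves[label] = subset.groupby(subset.index.time).mean()
    return counts_15min, halves
```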
A promising new direction for the assessment and treatment of behavioral disturbances is real-time sensing in conjunction with automated detection of behavioral events [8], including signals derived from audio [9]. In this case report, we demonstrate that audio recording and automated analysis using off-the-shelf components enabled an objective characterization of this patient’s VDB over an extended period of time and allowed us to explore the relationships between this behavior, treatment interventions and time of day. Interpretation of these relationships is limited by multiple factors that could influence them, including the use of concomitant medications, ECT, environmental cues and circadian rhythms [10]. Although case reports have inherent limitations, our findings suggest the potential for clinical use of automated audio assessments and highlight the need for systematic controlled studies to investigate their utility. Future studies may explore the use of these automated assessments, possibly in combination with other biomarkers, to better understand disease pathology and the temporal relationships among behaviors, treatment interventions and changes in biological state, as well as to inform precision medicine approaches to the care of dementia-related behaviors.