Morse code-based communication system focused on amyotrophic lateral sclerosis patients

Author(s):  
Franklin Rosado ◽  
Iliana Rumbo ◽  
Felicia Daza ◽  
Hernis Mercado ◽  
Diana Mier
2019 ◽  
Vol 400 (5) ◽  
pp. 651-661 ◽  
Author(s):  
Chang Liu ◽  
Kun Hong ◽  
Huifang Chen ◽  
Yanping Niu ◽  
Weisong Duan ◽  
...  

Abstract Aberrant microglial activation and neuroinflammation are a pathological hallmark of amyotrophic lateral sclerosis (ALS). Fractalkine (CX3CL1) is mostly expressed on neuronal cells, whereas the fractalkine receptor (CX3CR1) is predominantly expressed on microglia. Many progressive neuroinflammatory disorders show disruption of the CX3CL1/CX3CR1 communication system, but the exact role of CX3CL1/CX3CR1 in ALS pathology remains unknown. F1 nontransgenic/CX3CR1+/− females were bred with SOD1G93A/CX3CR1+/− males to produce F2 SOD1G93A/CX3CR1−/− and SOD1G93A/CX3CR1+/+ offspring. We analyzed end-stage (ES) SOD1G93A/CX3CR1−/− mice and progression-matched SOD1G93A/CX3CR1+/+ mice. Our study showed that male SOD1G93A/CX3CR1−/− mice died sooner than male SOD1G93A/CX3CR1+/+ mice. SOD1G93A/CX3CR1−/− mice demonstrated more neuronal cell loss, more microglial activation, and exacerbated SOD1 aggregation at the end stage of ALS. The NF-κB pathway was activated; the autophagy-lysosome degradation pathway and autophagosome maturation were impaired. Our results indicated that the absence of CX3CL1/CX3CR1 signaling in the central nervous system (CNS) may worsen neurodegeneration. The CX3CL1/CX3CR1 communication system has anti-inflammatory and neuroprotective effects and plays an important role in maintaining autophagy activity. This effort may lead to new therapeutic strategies for neuroprotection and provide a therapeutic target for ALS patients.


There are many people in this world who have lost the ability to communicate with others due to an unforeseen accident or illness. This work aims to help users who are paralyzed and/or suffering from motor neuron diseases (MND) such as amyotrophic lateral sclerosis (ALS) and primary lateral sclerosis by making them more independent. Patients suffering from these diseases are unable to move their arms and legs, lose their body balance, and lose the ability to speak. Here we propose an IoT-based communication controller, built on the concept of Morse code, which controls the smartphone of the user. This paper proposes a solution that gives the user the ability to communicate with other people using the machine as an intermediary. The device requires minimal input from the user.
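As a rough illustration of the Morse code concept such a controller relies on, the sketch below shows how a sequence of dots and dashes (for example, produced by short and long presses of a single switch) could be decoded into text. This is a minimal sketch under assumed conventions; the abstract does not describe the device's actual input hardware or software, and the function names here are hypothetical.

# Minimal sketch (not the authors' implementation): decoding a Morse sequence
# where a short press is "." and a long press is "-"; letters are separated by
# spaces and words by "/".
MORSE_TO_CHAR = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode_morse(message: str) -> str:
    """Decode a Morse string into plain text."""
    words = []
    for word in message.strip().split("/"):
        letters = [MORSE_TO_CHAR.get(code, "?") for code in word.split()]
        words.append("".join(letters))
    return " ".join(words)

if __name__ == "__main__":
    # ".... . .-.. .--. / -- ." decodes to "HELP ME"
    print(decode_morse(".... . .-.. .--. / -- ."))

In a real controller the decoded text would then be forwarded to the smartphone (e.g., displayed or spoken aloud); that integration layer is outside the scope of this sketch.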


2021 ◽  
Author(s):  
Kuniaki Ozawa ◽  
Masayoshi Naito ◽  
Naoki Tanaka ◽  
Shiryu Wada

People with severe physical impairment such as amyotrophic lateral sclerosis (ALS) in a completely locked-in state (CLIS) suffer from an inability to express their thoughts to others. To solve this problem, many brain-computer interface (BCI) systems have been developed, but they have not proven sufficient for CLIS. In this paper, we propose a word communication system: a BCI with partner assist, in which partners play an active role in helping patients express a word. We report here that five ALS patients in late stages (one in CLIS and four almost in CLIS) succeeded in expressing their own words (in Japanese) in response to wh-questions that could not be answered “yes/no.” Each subject sequentially selected vowels (maximum three) contained in the word that he or she wanted to express, by using a “yes/no” communication aid based on near-infrared spectroscopy. Then, a partner entered the selected vowels into a dictionary with vowel entries, which returned candidate words having those vowels. When there were no appropriate words, the partner changed one vowel and searched again or started over from the beginning. When an appropriate word was selected, it was confirmed by the subject via “yes/no” answers. Two subjects confirmed the selected word six times out of eight (credibility of 91.0% by a statistical measure); two subjects, including the one in CLIS, did so five times out of eight (74.6%); and one subject did so three times out of four (81.3%). We have thus taken the first step toward a practical word communication system for such patients.
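The dictionary-lookup step the partner performs can be sketched as follows. This is a toy illustration under stated assumptions: the entries are romanized stand-ins for a real Japanese dictionary, the function names are hypothetical, and the actual system's indexing and search behavior may differ.

# Minimal sketch (hypothetical, not the authors' software): retrieving candidate
# words from the sequence of up to three vowels the subject selected via
# "yes/no" answers. Japanese has five vowels (a, i, u, e, o).
from collections import defaultdict

VOWELS = set("aiueo")

def vowel_signature(word: str, max_vowels: int = 3) -> tuple:
    """Return the first few vowels of a word, in order of appearance."""
    return tuple(c for c in word if c in VOWELS)[:max_vowels]

def build_index(words: list[str]) -> dict:
    """Group words by their vowel signature for fast lookup."""
    index = defaultdict(list)
    for w in words:
        index[vowel_signature(w)].append(w)
    return index

if __name__ == "__main__":
    # Hypothetical romanized entries standing in for a real Japanese dictionary.
    index = build_index(["mizu", "ocha", "itai", "samui", "atsui"])
    # Subject selected the vowels "i" then "u"; the partner retrieves candidates.
    print(index[("i", "u")])   # ['mizu'] ("water")

If no candidate fits, the partner can change one vowel in the signature and query again, mirroring the recovery step described in the abstract.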


2020 ◽  
Vol 63 (1) ◽  
pp. 59-73 ◽  
Author(s):  
Panying Rong

Purpose: The purpose of this article was to validate a novel acoustic analysis of oral diadochokinesis (DDK) in assessing bulbar motor involvement in amyotrophic lateral sclerosis (ALS). Method: An automated acoustic DDK analysis was developed, which filtered out the voice features and extracted the envelope of the acoustic waveform reflecting the temporal pattern of syllable repetitions during an oral DDK task (i.e., repetitions of /tɑ/ at the maximum rate on 1 breath). Cycle-to-cycle temporal variability (cTV) of envelope fluctuations and syllable repetition rate (sylRate) were derived from the envelope and validated against 2 kinematic measures, tongue movement jitter (movJitter) and alternating tongue movement rate (AMR) during the DDK task, in 16 individuals with bulbar ALS and 18 healthy controls. After the validation, cTV, sylRate, movJitter, and AMR, along with an established clinical speech measure, speaking rate (SR), were compared in their ability to (a) differentiate individuals with ALS from healthy controls and (b) detect early-stage bulbar declines in ALS. Results: cTV and sylRate were significantly correlated with movJitter and AMR, respectively, across individuals with ALS and healthy controls, confirming the validity of the acoustic DDK analysis in extracting the temporal DDK pattern. Among all the acoustic and kinematic DDK measures, cTV showed the highest diagnostic accuracy (0.87), with 80% sensitivity and 94% specificity in differentiating individuals with ALS from healthy controls, outperforming the SR measure. Moreover, cTV showed a large increase during the early disease stage, which preceded the decline of SR. Conclusions: This study provided preliminary validation of a novel automated acoustic DDK analysis in extracting a useful measure, namely cTV, for early detection of bulbar ALS. This analysis overcame a major barrier in the existing acoustic DDK analysis, which is continuous voicing between syllables that interferes with syllable structures. This approach has potential clinical applications as a novel bulbar assessment.
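One way such envelope-based DDK measures could be computed is sketched below. The pipeline, parameters, and formulas are assumptions for illustration only: the amplitude envelope is taken via a Hilbert transform and low-pass filter, syllables are located as envelope peaks, and cTV is approximated as the coefficient of variation of inter-peak intervals. The published analysis may define and compute these quantities differently.

# Minimal sketch (not the published analysis): syllable repetition rate and a
# cycle-to-cycle temporal variability estimate from a /ta/ repetition recording.
# Filter cutoff, peak thresholds, and the cTV formula are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks, hilbert

def ddk_measures(signal: np.ndarray, fs: int):
    # Amplitude envelope via the Hilbert transform, low-pass filtered so only
    # the slow syllable-level fluctuations remain (~10 Hz cutoff, assumed).
    envelope = np.abs(hilbert(signal))
    b, a = butter(2, 10.0, btype="low", fs=fs)
    envelope = filtfilt(b, a, envelope)

    # One peak per syllable; the minimum spacing assumes < ~8 syllables/s.
    peaks, _ = find_peaks(envelope,
                          height=0.3 * envelope.max(),
                          distance=int(0.125 * fs))
    intervals = np.diff(peaks) / fs                 # cycle durations in seconds

    syl_rate = len(peaks) / (len(signal) / fs)      # syllables per second
    ctv = (np.std(intervals) / np.mean(intervals)   # variability of cycle durations
           if len(intervals) > 1 else float("nan"))
    return syl_rate, ctv

Given a mono recording loaded as a NumPy array with its sampling rate, ddk_measures(signal, fs) returns the two summary values; thresholds would need tuning for real recordings.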

