MAITREYEE WAIRAGKAR

About

Maitreyee Wairagkar

I am a Postdoctoral Researcher in translational neuroengineering in the Neuroprosthetics Lab at UC Davis and the BrainGate Consortium. I build neurotechnologies that restore lost movement and speech to people with nervous system injury or disease (e.g., stroke, ALS, dementia, Parkinson’s disease) via brain-computer interfaces (BCIs), and that provide neurorehabilitation and assistive technology. My research focuses on breaking barriers between humans and technology by developing intuitive modes of interaction using brain signals, movement, and natural language. My work spans the multidisciplinary areas of human-centric AI, machine learning, signal processing, time series analysis of neurophysiological, inertial-sensor, and speech signals, neural decoding, natural language processing, and social robots, all in the service of building neurotechnologies for healthcare. I am primarily interested in understanding brain signals and other human physiological signals to build life-enabling technologies.

Currently, in my postdoctoral research, I am developing an intracortical BCI to restore lost speech in people with severe brain injury or disease by decoding their speech-related neural activity, recorded with Utah multielectrode arrays implanted in the brain, and translating it into continuously synthesised speech, thus enabling people to speak using their brain signals.

Previously, I was a Postdoctoral Researcher in affective robotics in the Biomechatronics Lab at Imperial College London and the UK Dementia Research Institute, where I developed affective social robots and conversational AI to support people with dementia by improving their engagement, providing personalised interventions, and interactively assessing their health and wellbeing. I was also a venture lead in the MedTech SuperConnector™ accelerator, where I led the development of BrainBot, a social robot platform for mental health and telemedicine.

I received my PhD in Cybernetics and MEng in AI and Cybernetics from the University of Reading, UK. During my PhD research in the Brain Embodiment Lab, I studied changes in the temporal dynamics of broadband brain signals (EEG) during voluntary movement. I developed a novel approach to modelling broadband EEG (rather than brain waves in narrow frequency bands) using a non-stationary time series model to predict movement intention for motor control BCIs.

I was a postgraduate research assistant at the University of Reading, where I developed interactive neurorehabilitation tools providing combined motor and language therapy for stroke and brain injury in the home environment. This technology was transferred for commercialisation. I was also a postgraduate research assistant on the SPHERE project at the University of Reading and the University of Southampton, where I worked on modelling motion kinematics and classifying movements from wearable inertial sensors for people with Parkinson's disease.

I enjoy collaborating with multidisciplinary teams of medical practitioners, patients, designers, and industry experts to find technological solutions to real-world health challenges. I also love to share my research with the general public via science outreach. I have presented live demos of my BCI and neurorehabilitation technology at the Science Museum London, the Royal Institution, hospitals, schools, and universities. My BCI was featured in the Royal Institution Christmas Lectures.

BrainGate personal profile
Current affiliations
UC Davis UK DRI
Previous affiliations
Imperial College London UK DRI University of Reading MedTech SuperConnector
University of Southampton

Research


Brain-to-Speech: Restoring Lost Speech via Intracortical Brain-Computer Interfaces (BCIs)

speech-BCI

The ability to speak is a key determinant of quality of life, but it is disrupted in people with brain injury and neurodegenerative diseases such as ALS. Brain-computer interfaces (BCIs) can potentially restore speech to individuals who have lost the ability to speak by interpreting their speech-related neural activity. Current intracortical BCIs enable users to communicate via point-and-click and handwriting with high accuracy, but these modes of communication are slow and do not capture the full expressive range of speech. Intelligible speech synthesis from brain signals has not yet been demonstrated. In my current postdoctoral research in the Neuroprosthetics Lab at UC Davis, I am building a speech BCI to enable people to speak by decoding their neural activity and translating it into speech.

This research is part of the BrainGate clinical trial. High-resolution intracortical neural activity is recorded from the speech motor cortex of human participants using chronically implanted Utah multielectrode arrays. My research focuses on developing a neural decoder to synthesise speech directly from this intracortical neural activity. I am integrating deep learning models with signal processing to uncover the neural dynamics and correlates of speech production and translate these into synthesised speech. The decoder instantaneously synthesises voice from intracortical neural activity and provides real-time closed-loop audio feedback, improving over prior state-of-the-art research in speech synthesis from brain activity. This approach is suitable for real-time BCI applications to restore lost speech, which is the focus of my ongoing work.
[More info]
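The core idea of causal (streaming) decoding can be illustrated with a minimal sketch: map each incoming bin of neural features to acoustic features as it arrives, so audio can be synthesised with near-zero algorithmic latency. This toy example uses synthetic data and a simple ridge-regression decoder purely for illustration; the actual system uses deep learning models and produces voice, not just acoustic features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data (hypothetical): 1000 time bins of spike-band
# features from 64 channels, linearly related to 20 acoustic features.
T, n_ch, n_ac = 1000, 64, 20
W_true = rng.standard_normal((n_ch, n_ac)) * 0.1
neural = rng.poisson(3.0, size=(T, n_ch)).astype(float)
acoustic = neural @ W_true + rng.standard_normal((T, n_ac)) * 0.05

# Fit a ridge-regression decoder on a training portion (closed form).
lam = 1.0
Xtr, Ytr = neural[:800], acoustic[:800]
W = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(n_ch), Xtr.T @ Ytr)

# Causal streaming inference: each bin is decoded as soon as it arrives,
# without look-ahead, which is what permits real-time audio feedback.
decoded = np.array([x @ W for x in neural[800:]])
err = np.mean((decoded - acoustic[800:]) ** 2)
```

The decisive property is that inference touches only the current bin (and, in richer models, past context), never future samples.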



An instantaneous voice synthesis neuroprosthesis
Maitreyee Wairagkar, Nicholas S Card, Tyler Singer-Clark, Xianda Hou, Carrina Iacobacci, Leigh R Hochberg, David M Brandman*, Sergey D Stavisky*,
bioRxiv, 2024 [In Review] [Preprint]

An accurate and rapidly calibrating speech neuroprosthesis
Nicholas S Card, Maitreyee Wairagkar, Carrina Iacobacci, Xianda Hou, Tyler Singer-Clark, Francis R Willett, Erin M Kunz, Chaofei Fan, Maryam Vahdati Nia, Darrel R Deo, Eun Young Choi, Matthew F Glasser, Leigh R Hochberg, Jaimie M Henderson, Kiarash Shahlaie, David M Brandman*, Sergey D Stavisky*,
The New England Journal of Medicine, 2024 [Link] [Preprint]

Synthesizing Speech by Decoding Intracortical Neural Activity from Dorsal Motor Cortex
Maitreyee Wairagkar, Leigh R Hochberg, David M Brandman*, Sergey D Stavisky*,
11th International IEEE EMBS Conference on Neural Engineering (NER), 2023 [Link]

Decoding intracortical neural activity from motor cortex to synthesise speech
Maitreyee Wairagkar, Leigh R Hochberg, David M Brandman*, Sergey D Stavisky*,
Society for Neuroscience - Neuroscience 2022 [Open Access Link]



Predicting Movement Intention from Broadband EEG for Brain-Computer Interfaces (BCIs)

BCI

Brain-computer interfaces (BCIs) provide a direct mode of interaction with external devices using brain signals. Movement is our fundamental mode of interaction with the environment, so detecting movement intention reliably from brain signals is important for developing intuitive motor control BCIs. During my doctoral research at the University of Reading, I investigated the temporal dynamics of broadband EEG to identify robust markers of movement intention. Brain activity is composed of oscillatory and broadband arrhythmic components and undergoes complex changes during voluntary movement. Traditionally, characterisation of movement from EEG has focused mostly on narrowband oscillatory processes, such as event-related (de)synchronisation in sensorimotor rhythms, and on slow non-oscillatory event-related potentials, such as motor-related cortical potentials. However, the temporal dynamics of broadband arrhythmic EEG remained unexplored, as broadband EEG was considered background noise.

I discovered new neural correlates of movement intention in the long-range temporal correlations of broadband EEG. These are complementary to the conventional correlates above and provide previously inaccessible motor information from EEG, enabling earlier prediction of movement intention before its onset and improved classification accuracies. I developed a novel approach to modelling these long- and short-range temporal correlations in broadband EEG using a non-stationary ARFIMA time series model and machine learning classifiers to predict movement intention for robust BCIs.
[More info]
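Long-range temporal correlations of the kind described above are commonly quantified with detrended fluctuation analysis (DFA), whose scaling exponent alpha distinguishes uncorrelated signals (alpha near 0.5) from long-range correlated ones (alpha above 0.5). The sketch below is a minimal DFA implementation on synthetic white noise for illustration; it is not the ARFIMA modelling pipeline itself.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: return scaling exponent alpha."""
    y = np.cumsum(x - np.mean(x))              # integrated signal profile
    fluct = []
    for n in scales:
        n_seg = len(y) // n
        rms = []
        for i in range(n_seg):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)       # linear detrend per segment
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        fluct.append(np.mean(rms))
    # alpha is the slope of log F(n) vs log n
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

rng = np.random.default_rng(0)
noise = rng.standard_normal(10000)             # uncorrelated surrogate "EEG"
alpha = dfa(noise, np.array([16, 32, 64, 128, 256, 512]))
```

For uncorrelated noise alpha comes out near 0.5; broadband EEG with long-range temporal correlations yields larger exponents, which is what carries the movement-intention information.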



Dynamics of Long-Range Temporal Correlations in Broadband EEG During Different Motor Execution and Imagery Tasks
Maitreyee Wairagkar, Yoshikatsu Hayashi, Slawomir J Nasuto
Frontiers in Neuroscience, 2021 [Open Access Link]

Modeling the ongoing dynamics of short and long-range temporal correlations in broadband EEG during movement
Maitreyee Wairagkar, Yoshikatsu Hayashi, Slawomir J Nasuto
Frontiers in Systems Neuroscience, 2019 [Open Access Link]

[PhD Thesis] Ongoing temporal dynamics of broadband EEG during movement intention for BCI
Maitreyee Wairagkar
University of Reading, 2019 [Open Access Link]

Exploration of neural correlates of movement intention based on characterisation of temporal dependencies in electroencephalography
Maitreyee Wairagkar, Yoshikatsu Hayashi, Slawomir J Nasuto
PLOS ONE, 2018 [Open Access Link]



Affective Social Robots for Dementia Care

Social Robots

Social robots are anthropomorphised robots capable of using natural language and facial expressions for engaging interactions. Dementia is a neurodegenerative disorder leading to a progressive decline in cognitive abilities that requires continuous care and support. Social robots can promote independence, improve cognition and social interaction, provide assistance and interventions to maintain quality of life, and be used for telemedicine and remote care of people in the early stages of dementia and with mild cognitive impairment. In my postdoctoral research at Imperial College London and the UK Dementia Research Institute, I developed different types of conversational AI, affective social robots, and a framework for interactive robotic interventions for dementia care. I conducted longitudinal experiments with people with dementia to investigate how such robots can monitor their health and wellbeing automatically and non-intrusively by analysing human-robot interactions using machine learning and natural language processing. Early results showed that interactions with robots give insight into users' health and wellbeing, which will help personalise the robot's functionality for adaptive support. I also studied physiological responses in EEG during human-robot interaction.

I was a venture lead in the MedTech SuperConnector™ accelerator, where I led the development of BrainBot, an affective social robot that interacts with users through natural language and human-like facial expressions. BrainBot also provides an interactive robotic telemedicine platform for clinicians to deliver remote therapy. We are testing the platform for remote cognitive engagement therapy in collaboration with SCARF India to assess its use as an affordable tool for dementia care and remote therapy.
[More info]



Emotive Response to a Hybrid-Face Robot and Translation to Consumer Social Robots
Maitreyee Wairagkar, Maria R. Lima, Daniel Bazo, et al.
IEEE Internet of Things Journal, 2021 [Open Access Link] [Preprint]

Robotic telemedicine for mental health: a multimodal approach to improve human-robot engagement
Maria R Lima, Maitreyee Wairagkar, Nirupama Natarajan, et al.
Frontiers in Robotics and AI , 2021 [Open Access Link]

Acceptability of Social Robots and Adaptation of Hybrid-Face Robot for Dementia Care in India: A Qualitative Study
Nirupama Natarajan, Sridhar Vaitheswaran, Maria R Lima, Maitreyee Wairagkar, et al.
The American Journal of Geriatric Psychiatry , 2021 [Link] [Open Access Link]
(Featured in the American Journal of Geriatric Psychiatry editorial)

Conversational Affective Social Robots for Ageing and Dementia Support: A Review
Maria R Lima, Maitreyee Wairagkar, Manish Gupta, et al.
IEEE Transactions on Cognitive and Developmental Systems, 2021 [In Press]



Neurorehabilitation Tools for Combined Motor and Language Therapy (MaLT)

MaLT

Neurorehabilitation is an essential component of recovery after stroke and brain injury. The functional connectivity and structural proximity of elements of the language and motor systems result in frequent co-morbidity after brain injury; however, language and motor functions are often treated in isolation. Due to the care burden in rehabilitation centres and hospitals, it is not possible to provide the necessary amount of high-intensity therapy to patients. Hence, in this project, as a research assistant at the University of Reading, I developed interactive combined motor and language therapy tools (MaLT) for long-term home-based rehabilitation, in collaboration with language therapists, assistive technology researchers, physiotherapists and clinicians from the NHS, and patient and carer groups. MaLT comprises a suite of Kinect-based interactive games targeting both language and upper-limb motor therapy, and it records patient performance so therapists can assess progress. The games target four major language therapy tasks (single-word comprehension, initial phoneme identification, rhyme identification, and naming) across eight levels of difficulty, programmatically generating appropriate questions to provide unique gameplay every time. MaLT was tested with stroke survivors at home and in an NHS hospital with a positive response.
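Programmatic question generation of the kind the games use can be sketched as follows. This is a hypothetical toy generator for a single-word comprehension question (the word bank, category labels, and function names are all illustrative, not MaLT's actual content or code).

```python
import random

# Hypothetical word bank mapping words to semantic categories.
WORDS = {"cat": "animal", "dog": "animal", "apple": "fruit", "pear": "fruit",
         "car": "vehicle", "bus": "vehicle", "shirt": "clothing", "hat": "clothing"}

def make_comprehension_question(rng, n_options=4):
    """Pick a target word and distractors, shuffled into answer options."""
    target = rng.choice(list(WORDS))
    distractors = rng.sample([w for w in WORDS if w != target], n_options - 1)
    options = distractors + [target]
    rng.shuffle(options)                      # target position varies each time
    return {"prompt": f"Point to the {target}", "target": target, "options": options}

rng = random.Random(42)
q = make_comprehension_question(rng)          # a fresh question every call
```

Because targets, distractors, and option order are drawn at random, no two gameplay sessions present the same sequence of questions.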

MaLT technology has been transferred for commercialisation to Evolv and is now included in their suite of virtual rehabilitation tools. I have also developed a mobile app for speech and language therapy (SpeLT).



MaLT– Combined Motor and Language Therapy tool for Brain Injury Patients using Kinect
Maitreyee Wairagkar, Rachel McCrindle, Holly Robson, et al.
Methods of Information in Medicine, 2017 [Link] [Open Access Link]

Combined Language and Motor Therapy for Brain Injury Patients
Maitreyee Wairagkar, Rachel McCrindle, Holly Robson, et al.
Proceedings of the 3rd Workshop on ICTs for Improving Patients Rehabilitation Research Techniques, 2015 [Link]



Modelling Movement Kinematics With Wearable Inertial Sensors for Parkinson's Disease

Modelling Movement Kinematics

Studying the kinematics of movement can provide insight into the assessment of Parkinson's disease, monitoring its progression, and developing rehabilitation strategies. Wearable inertial sensors are a cost-effective means of assessing movement in clinical settings and the home environment. As a research assistant on the SPHERE project at the University of Reading, I developed a new approach that integrates modelling and classification of sit-to-stand movement kinematics using an extended Kalman filter, logistic regression, and unsupervised machine learning, with only two inertial sensors placed on the shank and back. Sit-to-stand transitions are an important part of activities of daily living and play a key role in functional mobility; they are often affected in older adults due to frailty and in people with motor impairments. This model was successfully used to characterise and compare sit-to-stand angular kinematics in younger healthy adults, older healthy adults, and people with Parkinson's disease.

On the SPHERE project at the University of Southampton, I developed an algorithm to estimate motion intensity and energy expenditure for assessing mobility in people with Parkinson's disease or stroke during different activities using wearable inertial sensors. We used this to monitor movement continuously for multiple days in free-living conditions in the home environment, to study how mobility is affected by different physiological conditions.
[More info]
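The Kalman-filter idea behind the sit-to-stand model can be illustrated with a toy example: fuse a drifting, biased gyroscope rate with a noisy accelerometer-derived inclination angle to track trunk tilt. This sketch uses a simple linear Kalman filter on synthetic data for clarity (the published approach uses an extended Kalman filter with a richer kinematic model); all signals and parameters here are hypothetical.

```python
import numpy as np

def kalman_tilt(gyro, accel_angle, dt, q_angle=1e-4, q_bias=1e-6, r=0.04):
    """Linear Kalman filter fusing gyro rate with accelerometer angle.
    State: [tilt angle, gyro bias]."""
    x = np.zeros(2)
    P = np.eye(2)
    F = np.array([[1.0, -dt], [0.0, 1.0]])    # angle integrates rate minus bias
    B = np.array([dt, 0.0])
    Q = np.diag([q_angle, q_bias])
    H = np.array([[1.0, 0.0]])                # we measure the angle only
    est = []
    for w, z in zip(gyro, accel_angle):
        x = F @ x + B * w                     # predict from gyro rate
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r                   # update with noisy accel angle
        K = (P @ H.T) / S
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        est.append(x[0])
    return np.array(est)

# Synthetic tilt trajectory standing in for a sit-to-stand transition.
dt = 0.01
t = np.arange(0, 10, dt)
true_angle = 0.5 * np.sin(t)
rng = np.random.default_rng(1)
gyro = 0.5 * np.cos(t) + 0.05 + rng.standard_normal(t.size) * 0.02   # rate + bias
accel_angle = true_angle + rng.standard_normal(t.size) * 0.2          # noisy angle
est = kalman_tilt(gyro, accel_angle, dt)
rmse_kf = np.sqrt(np.mean((est - true_angle) ** 2))
rmse_raw = np.sqrt(np.mean((accel_angle - true_angle) ** 2))
```

The fused estimate tracks the angle far more accurately than either sensor alone, which is what makes angular kinematics recoverable from just two body-worn sensors.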



[Preprint, in review] A Novel Approach for Modelling and Classifying Sit-to-Stand Kinematics using Inertial Sensors
Maitreyee Wairagkar, Emma Villeneuve, Rachel King, et al.
arXiv preprint, 2021 [Open Access Link]

Preprints


Journal Articles [Google Scholar]


Datasets


Selected Conference Proceedings

Talks, Presentations and Science Outreach

CV

Work and Research Experience


Education


Awards and Honours


Research Grants and Bursaries


Teaching and Mentoring


Scientific Reviewer


Scientific and Academic Service

Latest News & Events

3rd Oct 2024
I will present my work on a voice neuroprosthesis for restoring lost speech to a person with ALS at SfN 2024 on Saturday, 5th Oct. Poster board H30, 1-5 pm.

1st Sep 2024
We are conducting a tutorial on "Intracortical speech neuroprosthesis: fundamentals and techniques" at Interspeech 2024, Kos, Greece. Join us this afternoon to learn about decoding brain signals into text and voice to help people with ALS speak.

25th Jan 2023
I am honoured to have received the India UK Achievers Honours in the category of Education, Science and Innovation, and to have been selected as one of the “75 at 75” inspiring achievers. The honours celebrate the achievements of 75 Indians educated in UK universities who have excelled in different fields through their impactful work, commemorating India's 75 years of independence and India-UK education ties.

11th Oct 2022
Update: I am honoured and delighted to be the runner-up for the Nature Inspiring Women in Science Award 2022 in the Scientific Achievement category.

28th Sep 2022
I have been shortlisted as one of the top six finalists for the prestigious Nature Inspiring Women in Science Award in the Scientific Achievement category.

24th Sep 2022 Upcoming Talk
I will give a talk on 'Assistive neurotechnologies for communication and rehabilitation' at the University of California Neurotrauma Symposium on 28th Sep 2022.

17th Jul 2022 Upcoming Talk
I will give a talk on 'Ongoing long-range temporal correlations in broadband EEG and intracortical neural activity during voluntary movement' on 18th Jul 2022, 15:00 CET, online at the Neurocybernetics workshop, IEEE COMPENG 2022.

1st Apr 2022
I have joined the Neuroprosthetics Lab at UC Davis as a postdoctoral scholar. I will be working on developing intracortical BCIs to restore lost speech in people with severe brain injury and neurological disorders.

11th Mar 2022
My profile was showcased by Imperial College London as part of Early Career Researcher Profile Series on the occasion of the Women at Imperial Week 2022.

2nd Mar 2022
I have been awarded the UK Dementia Research Institute Pilot Programme Award of £43,300 for my project 'Automated Assessment of Emotional Wellbeing of People with Dementia'. This grant is designed to encourage UK DRI Early Career Researchers to take the next step towards independence and advance research in neurodegenerative diseases.

10th Feb 2022
I am honoured to be recognised by the Department of Mechanical Engineering, Imperial College London for exceptional ongoing individual research achievement.

14th Jan 2022
I was selected for NEUROHACK, a competitive international hackathon organised by the DEMON Network from 11th-14th Jan to promote brain health and combat neurodegeneration. Our team successfully developed an ML prediction model for dementia diagnosis in the USA and India.

26th Nov 2021 Upcoming Talk
I will give a talk on 'AI and Ethics in Healthcare' on 27th Nov 2021, 15:30 IST, online at the Centre for Corporate Law Studies, ILNU. [Webinar link]

13th Nov 2021
Our social robotics research group in the Biomechatronics Lab, Imperial College London received an unrestricted gift from Google, in support of excellent academic research, for our work on improving engagement in human-robot interaction and facilitating mental health telemedicine.

7th Oct 2021
New video on our dementia technology research in the Care Research and Technology Centre was released today at the UK Dementia Research Institute Connectome 2021.

29th Sep 2021
Our work on social robotics for dementia care has been featured in the editorial of The American Journal of Geriatric Psychiatry.

In News and Media

Videos

Decoding intracortical neural activity to synthesise speech for BCIs to restore lost speech


Motor Imagery BCI to control VR and Soft Robotic rehab device


Neurofeedback BCI


MaLT - Combined Motor and Language Therapy tool for Brain Injury


Conversational AI for Dementia Care

[starts at 08:09]

UK DRI CR&T New Horizons in Dementia Care from Helix Centre on Vimeo.

Our BCI featured in Royal Institution Christmas Lectures 2017 (BBC FOUR)

[starts at 53:01]

BCI live demo in Science Museum Lates


BCI live demo in Royal Institution Lates