Maitreyee Wairagkar

I am a Postdoctoral Researcher in affective robotics in the Biomechatronics Lab at Imperial College London and the UK Dementia Research Institute. My research focuses on breaking down barriers between humans and technology by developing intuitive modes of interaction using brain signals, movement, and natural language. I work on human-centric AI, machine learning, signal processing, time series analysis of physiological, inertial sensor, and speech signals, neuroengineering, and natural language processing to build brain-computer interfaces (BCIs), social robots, and neurorehabilitation and assistive technologies for healthcare. I am primarily interested in understanding brain signals and other human physiological signals to build life-enabling technologies.

I received my PhD in Cybernetics and my MEng in AI and Cybernetics from the University of Reading, UK. During my PhD research in the Brain Embodiment Lab, I studied changes in the temporal dynamics of broadband brain signals (EEG) during voluntary movement. I developed a novel approach to modelling broadband EEG (rather than narrowband brain waves) using a non-stationary time series model to predict movement intention for motor control BCI.

Currently, in my postdoctoral research, I am developing affective social robots and conversational AI to support people with dementia by improving their engagement, providing personalised interventions, and interactively assessing their health and wellbeing. I was also a venture lead in the MedTech SuperConnector™ accelerator, where I led the development of BrainBot, a social robot platform for mental health and telemedicine.

Previously, I was a research assistant at the University of Reading, where I developed interactive neurorehabilitation tools providing combined motor and language therapy for stroke and brain injury patients in the home environment. This technology was transferred for commercialisation. I was also a research assistant on the SPHERE project at the University of Reading and the University of Southampton, where I worked on modelling motion kinematics and classifying movements from wearable inertial sensors for people with Parkinson's disease.

I enjoy collaborating with multidisciplinary teams of medical practitioners, patients, designers, and industry experts to find technological solutions to real-world health challenges. I also love sharing my research with the general public through science outreach. I have presented live demos of my BCI and neurorehabilitation technology at the Science Museum London, the Royal Institution, hospitals, schools, and universities. My BCI was featured in the Royal Institution Christmas Lectures.

Imperial College London personal homepage
Current affiliations
Imperial College London UK DRI
Previous affiliations
University of Reading MedTech SuperConnector
University of Southampton


Predicting Movement Intention from Broadband EEG for Brain-Computer Interface (BCI)


Brain-computer interfaces (BCIs) provide a direct mode of interaction with external devices using brain signals. Movement is our fundamental mode of interaction with the environment, so reliably detecting movement intention from brain signals is important for developing intuitive motor control BCIs. During my doctoral research at the University of Reading, I investigated the temporal dynamics of broadband EEG to identify robust markers of movement intention. Brain activity is composed of oscillatory and broadband arrhythmic components and undergoes complex changes during voluntary movement. Traditionally, characterisation of movement from EEG has focused on narrowband oscillatory processes, such as event-related (de)synchronisation in sensorimotor rhythms, and on slow non-oscillatory event-related potentials, such as motor-related cortical potentials. The temporal dynamics of broadband arrhythmic EEG, however, remained unexplored, as broadband EEG was considered background noise.

I discovered new neural correlates of movement intention in the long-range temporal correlations of broadband EEG. These are complementary to the conventional correlates above and provide previously inaccessible motor information from EEG, leading to earlier prediction of movement intention before its onset and improved classification accuracy. I developed a novel approach to modelling these long- and short-range temporal correlations in broadband EEG using the non-stationary ARFIMA time series model and machine learning classifiers to predict movement intention for robust BCI.
[More info]
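Long-range temporal correlations of the kind described above are commonly quantified with detrended fluctuation analysis (DFA), whose scaling exponent distinguishes uncorrelated from long-memory signals. The following is a minimal NumPy sketch of DFA for illustration only; it is not the analysis code from this research, and the function name is my own.

```python
import numpy as np

def dfa_exponent(signal, scales):
    """Estimate the DFA scaling exponent of a 1-D signal.

    alpha ~ 0.5 indicates an uncorrelated signal; alpha > 0.5 indicates
    long-range temporal correlations (long memory).
    """
    # Integrate the mean-removed signal to obtain the "profile"
    profile = np.cumsum(signal - np.mean(signal))
    fluctuations = []
    for s in scales:
        n_windows = len(profile) // s
        # Split the profile into non-overlapping windows of length s
        windows = profile[:n_windows * s].reshape(n_windows, s)
        x = np.arange(s)
        rms = []
        for w in windows:
            # Remove the linear trend in each window, keep the RMS residual
            coeffs = np.polyfit(x, w, 1)
            rms.append(np.sqrt(np.mean((w - np.polyval(coeffs, x)) ** 2)))
        fluctuations.append(np.mean(rms))
    # The DFA exponent is the slope of log(fluctuation) vs. log(scale)
    alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return alpha
```

For white noise the estimated exponent comes out near 0.5, which is a quick sanity check before applying the method to EEG segments.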

Dynamics of Long-Range Temporal Correlations in Broadband EEG During Different Motor Execution and Imagery Tasks
Maitreyee Wairagkar, Yoshikatsu Hayashi, Slawomir J Nasuto
Frontiers in Neuroscience, 2021 [Open Access Link]

Modeling the ongoing dynamics of short and long-range temporal correlations in broadband EEG during movement
Maitreyee Wairagkar, Yoshikatsu Hayashi, Slawomir J Nasuto
Frontiers in Systems Neuroscience, 2019 [Open Access Link]

[PhD Thesis] Ongoing temporal dynamics of broadband EEG during movement intention for BCI
Maitreyee Wairagkar
University of Reading, 2019 [Open Access Link]

Exploration of neural correlates of movement intention based on characterisation of temporal dependencies in electroencephalography
Maitreyee Wairagkar, Yoshikatsu Hayashi, Slawomir J Nasuto
PLOS ONE, 2018 [Open Access Link]

Affective Social Robots for Dementia Care

Social Robots

Social robots are anthropomorphised robots capable of using natural language and facial expressions for engaging interactions. Dementia is a neurodegenerative disorder leading to a progressive decline in cognitive abilities and requiring continuous care and support. Social robots can promote independence, improve cognition and social interaction, provide assistance and interventions to maintain quality of life, and support telemedicine and remote care for people in the early stages of dementia and with mild cognitive impairment. In my postdoctoral research at Imperial College London and the UK Dementia Research Institute, I am developing different types of conversational AI, affective social robots, and a framework for interactive robotic interventions for dementia care. I am conducting longitudinal experiments with people with dementia to investigate how such robots can monitor their health and wellbeing automatically and non-intrusively by analysing human-robot interactions with machine learning and natural language processing. Early results show that interactions with the robot give insight into users' health and wellbeing, which will help personalise the robot's functionality for adaptive support. I am also studying physiological responses to human-robot interaction in EEG.

I was a venture lead in the MedTech SuperConnector™ accelerator, where I led the development of BrainBot, an affective social robot that interacts with users through natural language and human-like facial expressions. It also provides an interactive robotic telemedicine platform that clinicians can use for remote therapy. We are testing the platform for remote cognitive engagement therapy in collaboration with SCARF India to assess its use as an affordable tool for dementia care and remote therapy.
[More info]

Emotive Response to a Hybrid-Face Robot and Translation to Consumer Social Robots
Maitreyee Wairagkar, Maria R. Lima, Daniel Bazo, et al.
IEEE Internet of Things Journal, 2021 [Open Access Link] [Preprint]

Robotic telemedicine for mental health: a multimodal approach to improve human-robot engagement
Maria R Lima, Maitreyee Wairagkar, Nirupama Natarajan, et al.
Frontiers in Robotics and AI, 2021 [Open Access Link]

Acceptability of Social Robots and Adaptation of Hybrid-Face Robot for Dementia Care in India: A Qualitative Study
Nirupama Natarajan, Sridhar Vaitheswaran, Maria R Lima, Maitreyee Wairagkar, et al.
The American Journal of Geriatric Psychiatry, 2021 [Link] [Open Access Link]
(Featured in the American Journal of Geriatric Psychiatry editorial)

Conversational Affective Social Robots for Ageing and Dementia Support: A Review
Maria R Lima, Maitreyee Wairagkar, Manish Gupta, et al.
IEEE Transactions on Cognitive and Developmental Systems, 2021 [In Press]

Neurorehabilitation Tools for Combined Motor and Language Therapy (MaLT)


Neurorehabilitation is an essential component of recovery after stroke and brain injury. The functional connectivity and structural proximity of the language and motor systems result in frequent co-morbidity after brain injury; however, language and motor functions are often treated in isolation. Because of the care burden in rehabilitation centres and hospitals, it is not possible to provide patients with the necessary amount of high-intensity therapy. Hence, as a research assistant at the University of Reading, I developed interactive combined motor and language therapy tools (MaLT) for long-term home-based rehabilitation in collaboration with language therapists, assistive technology researchers, physiotherapists and clinicians from the NHS, and patient and carer groups. MaLT comprises a suite of Kinect-based interactive games targeting both language and upper-limb motor therapy, recording patient performance so that therapists can assess progress. The games target four major language therapy tasks (single-word comprehension, initial phoneme identification, rhyme identification, and naming) at eight levels of difficulty, generating appropriate questions programmatically so that every gameplay session is unique. MaLT was tested with stroke survivors at home and in an NHS hospital, with a positive response.
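The idea of programmatic question generation can be sketched as follows. This is a simplified illustration, not MaLT code: the word bank is invented, only two of the eight difficulty levels are shown, and the initial letter is used as a crude stand-in for the initial phoneme.

```python
import random

# Hypothetical graded word bank; the real MaLT games use therapist-curated
# vocabularies across eight difficulty levels.
WORD_BANK = {
    1: ["cat", "cup", "dog", "door", "sun", "sock"],
    2: ["candle", "cushion", "dolphin", "dungeon", "saddle", "sunset"],
}

def initial_phoneme_question(level, rng=random):
    """Generate one initial-sound identification question at a difficulty level."""
    words = WORD_BANK[level]
    target = rng.choice(words)
    # One correct option sharing the target's initial letter...
    correct = rng.choice([w for w in words if w != target and w[0] == target[0]])
    # ...plus two distractors that start differently
    distractors = rng.sample([w for w in words if w[0] != target[0]], 2)
    options = [correct] + distractors
    rng.shuffle(options)
    return {
        "prompt": f"Which word starts with the same sound as '{target}'?",
        "target": target,
        "options": options,
        "answer": correct,
    }
```

Because the target, correct option, and distractors are drawn afresh each time, two play sessions at the same level rarely see the same question, which is the property described above.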

The MaLT technology has been transferred to Evolv for commercialisation and is now included in their suite of virtual rehabilitation tools. I have also developed a mobile app for speech and language therapy (SpeLT).

MaLT– Combined Motor and Language Therapy tool for Brain Injury Patients using Kinect
Maitreyee Wairagkar, Rachel McCrindle, Holly Robson, et al.
Methods of Information in Medicine, 2017 [Link] [Open Access Link]

Combined Language and Motor Therapy for Brain Injury Patients
Maitreyee Wairagkar, Rachel McCrindle, Holly Robson, et al.
Proceedings of the 3rd Workshop on ICTs for Improving Patients Rehabilitation Research Techniques, 2015 [Link]

Modelling Movement Kinematics With Wearable Inertial Sensors for Parkinson's Disease

Modelling Movement Kinematics

Studying movement kinematics can provide insight for assessing Parkinson's disease, monitoring its progression, and developing rehabilitation strategies. Wearable inertial sensors are a cost-effective means of assessing movement in clinical settings and the home environment. As a research assistant on the SPHERE project at the University of Reading, I developed a new approach that integrates modelling and classification of sit-to-stand movement kinematics using an extended Kalman filter, logistic regression, and unsupervised machine learning, with only two inertial sensors placed on the shank and back. Sit-to-stand transitions are an important part of activities of daily living and play a key role in functional mobility; they are often affected in older adults due to frailty and in people with motor impairments. This model was successfully used to characterise and compare sit-to-stand angular kinematics in younger healthy adults, older healthy adults, and people with Parkinson's disease.
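Estimating a segment's tilt angle is the core of this kind of inertial kinematics work: the gyroscope tracks fast angular changes but drifts, while the accelerometer anchors the long-term gravity direction. The sketch below fuses the two with a complementary filter; the published approach uses an extended Kalman filter, so this is a deliberately simpler stand-in for illustration, with an invented function name.

```python
import numpy as np

def tilt_angle(acc_y, acc_z, gyro_x, fs, k=0.98):
    """Estimate tilt angle (rad) about the x-axis from one inertial sensor.

    acc_y, acc_z: accelerometer axes (gravity-referenced), gyro_x: angular
    velocity (rad/s), fs: sampling rate (Hz). A complementary filter blends
    the integrated gyroscope (weight k) with the accelerometer tilt (1 - k).
    """
    dt = 1.0 / fs
    theta = np.zeros(len(gyro_x))
    # Initialise from the accelerometer's gravity direction
    theta[0] = np.arctan2(acc_y[0], acc_z[0])
    for i in range(1, len(gyro_x)):
        acc_theta = np.arctan2(acc_y[i], acc_z[i])
        # Gyro integration for short-term accuracy, accel for drift correction
        theta[i] = k * (theta[i - 1] + gyro_x[i] * dt) + (1 - k) * acc_theta
    return theta
```

A shank- or back-mounted angle trace of this kind can then be fed to a classifier (e.g. logistic regression over the transition segment) to label sit-to-stand events, in the spirit of the pipeline described above.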

On the SPHERE project at the University of Southampton, I developed an algorithm to estimate motion intensities and energy expenditure from wearable inertial sensors to assess mobility during different activities in people with Parkinson's disease and stroke. We used it to monitor movement continuously over multiple days in free-living conditions in the home environment, to study how mobility is affected by different physiological conditions.
[More info]
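A common building block for accelerometer-based motion-intensity estimates is the signal magnitude area (SMA), the mean absolute acceleration across axes within a time window. The sketch below shows that feature only; it is an illustrative fragment, not the published algorithm, whose exact features are not reproduced here.

```python
import numpy as np

def signal_magnitude_area(acc, fs, window_s=1.0):
    """Per-window signal magnitude area, a standard motion-intensity proxy.

    acc: (n_samples, 3) accelerometer array with gravity removed.
    fs: sampling rate (Hz). Returns one SMA value per non-overlapping window.
    """
    w = int(window_s * fs)
    n = (len(acc) // w) * w  # drop the incomplete trailing window
    windows = np.abs(acc[:n]).reshape(-1, w, 3)
    # Sum of absolute accelerations over all three axes, normalised by window length
    return windows.sum(axis=(1, 2)) / w
```

Higher SMA values correspond to more vigorous movement, so thresholding or regressing on such features over multi-day free-living recordings gives the kind of continuous mobility profile described above.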

[Preprint, in review] A Novel Approach for Modelling and Classifying Sit-to-Stand Kinematics using Inertial Sensors
Maitreyee Wairagkar, Emma Villeneuve, Rachel King, et al.
arXiv preprint, 2021 [Open Access Link]

Journal Articles [Google Scholar]


Selected Conference Proceedings

Talks, Presentations and Science Outreach


Work and Research Experience


Awards and Honours

Research Grants and Bursaries

Teaching and Mentoring

Scientific Reviewer

Scientific and Academic Service

Latest News & Events

26th Nov 2021 Upcoming Talk
I will give a talk on 'AI and Ethics in Healthcare' on 27th Nov 2021 at 15:30 IST, online, at the Centre for Corporate Law Studies, ILNU. [Webinar link]

13th Nov 2021
Our social robotics research group in the Biomechatronics Lab, Imperial College London, received an unrestricted gift from Google, which supports excellent research in academia, for our work on improving engagement in human-robot interaction and facilitating mental health telemedicine.

7th Oct 2021
A new video on our dementia technology research in the Care Research and Technology Centre was released today at the UK Dementia Research Institute Connectome 2021.

29th Sep 2021
Our work on social robotics for dementia care has been featured in the editorial of The American Journal of Geriatric Psychiatry.

In News and Media


Motor Imagery BCI to control VR and Soft Robotic rehab device

Neurofeedback BCI

MaLT - Combined Motor and Language Therapy tool for Brain Injury

Conversational AI for Dementia Care

[starts at 08:09]

UK DRI CR&T New Horizons in Dementia Care from Helix Centre on Vimeo.

Our BCI featured in Royal Institution Christmas Lectures 2017 (BBC FOUR)

[starts at 53:01]

BCI live demo in Science Museum Lates

BCI live demo in Royal Institution Lates