This collaborative research project aims to develop a new class of dialog-based, home robotic healthcare assistants that provide a new level of in-home, real-time care for elderly and depressed patients, lowering total costs and improving quality of life. The project will build an emotive, physical avatar, called a companionbot, capable of engaging humans in a way that is unobtrusive and suspends disbelief. The companionbot will integrate human language technology, vision, other sensory processing, and emotive robotic technology to proactively recognize and dialog with isolated and elderly patients suffering from depression. It will engage these users in proactive, companionable dialog based on context. This will require the first multimodal integration of a user model, an environment model, and temporal processing with spoken dialog understanding and generation, producing dynamic dialog and emotive interaction that go beyond traditional scripted dialog and emotion. Object recognition, facial expression recognition, and human activity recognition will augment natural language processing to provide the current and historical context essential to dynamic dialog.
A video of our eBear (expressive bear-like robot) can be seen HERE.
A video of our lifelike robotic face (MaskBot/ExpressionBot) can be seen HERE.
This research is supported by grant IIS-111568 from the National Science Foundation.