As a neuroengineering lab, we conduct research on interfacing with the brain to read out neural information, both to understand brain mechanisms and to develop technologies for human use. Our research comprises a number of specific projects with different aims.
The intracortical brain-computer interface (BCI) project aims to understand how neuronal activity encodes information and to develop a system that harnesses intracortical neuronal signals to actuate external devices. Specifically, we are developing a bi-directional BCI that controls a robotic arm by reading motor cortical activity while, at the same time, delivering somatosensory feedback by writing sensory codes directly into somatosensory neurons. We anticipate that this type of BCI will provide independence and an enhanced quality of life to severely disabled persons.
Another research project builds a similar BCI system but uses non-invasive brain signals, such as electroencephalography (EEG), to control everyday home appliances. This non-invasive BCI system targets a broader population, allowing people to gain brain control over a variety of devices such as a TV set, a refrigerator, or a smart LED system. We aim to achieve this goal by combining BCIs with augmented reality (AR) technology.
Other research projects aim to develop “neuro-tools” for various purposes. A neuromarketing project seeks to establish a set of tools for understanding the cognitive and affective states of consumers by reading and analyzing brain activity. A biometrics project develops novel biometrics based on individual brain responses elicited by unique stimuli. A multi-modal empathy assessment project attempts to build a systematic approach to evaluating a person’s empathic capacity by measuring behavioral and neural activity, as well as eye gaze, during interaction with others. An automated home service project aims to develop a system that measures a user’s daily activities from wearable devices, estimates the user’s mental and physical states, and provides automated home services in accordance with those states. Finally, a tactile learning project is concerned with building a model that learns tactile percepts from artificial tactile sensor signals by mimicking the way the human somatosensory system works.
Taking all of these together, our lab ultimately aims to understand the brain better and to utilize brain information more effectively for human life.