Research
My research interests include human-robot interaction, mechanical design and modeling, embedded systems, databases, big data, machine learning, and natural language processing. These interests span many disciplines: robotics is by nature a combination of computer science, mechanical engineering, and electrical engineering. I am especially interested in practical applications of robotics research, which means taking my broad interests across all three fields and applying them to problems worth solving.
Below are several examples of my research. For a more complete list, please see my publications page.
Human-Guided Robot Self-Exploration for Object Affordance Learning
Our work focuses on robots deployed in human environments. These robots, which will need specialized object-manipulation skills, should leverage end users to efficiently learn the affordances (e.g., pull-able, open-able, push-able) of objects in their environment. This approach is promising because people naturally focus on showing the salient aspects of objects. This work characterizes the benefits of self-exploration and supervised affordance learning and shows that a combined approach is the most efficient and successful.
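As a rough illustration of the combined approach, the sketch below seeds affordance estimates with human-guided examples and then refines them through autonomous trials. Everything here (the execute_action stub, the action set, the success-count model) is a hypothetical placeholder, not the system from the papers.

import random
from collections import defaultdict

ACTIONS = ["push", "pull", "open"]  # hypothetical exploratory action set

def execute_action(obj, action):
    # Placeholder for the robot physically trying an action on an object;
    # a coin flip stands in for the observed outcome.
    return random.random() < 0.5

def learn_affordances(objects, human_demos, n_self_trials=50):
    """Seed affordance estimates with human-guided examples, then refine
    them with autonomous self-exploration."""
    # counts[obj][affordance] = [successes, attempts]
    counts = defaultdict(lambda: defaultdict(lambda: [0, 0]))

    # Phase 1: human guidance highlights salient object/action pairs.
    for obj, action, succeeded in human_demos:
        stats = counts[obj][action + "-able"]
        stats[0] += int(succeeded)
        stats[1] += 1

    # Phase 2: self-exploration fills in less-covered pairs.
    for _ in range(n_self_trials):
        obj, action = random.choice(objects), random.choice(ACTIONS)
        succeeded = execute_action(obj, action)
        stats = counts[obj][action + "-able"]
        stats[0] += int(succeeded)
        stats[1] += 1

    # Estimated probability that each object supports each affordance.
    return {obj: {aff: s / n for aff, (s, n) in affs.items()}
            for obj, affs in counts.items()}

demos = [("drawer", "pull", True), ("drawer", "open", True), ("ball", "push", True)]
print(learn_affordances(["drawer", "ball", "box"], demos))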
Chu and Thomaz, "Analyzing Differences between Teachers when Learning Object Affordances via Guided-Exploration", IJRR 2017
Chu et al., "Learning Object Affordances by Leveraging the Combination of Human-Guidance and Self-Exploration", HRI 2016 - Nominated for Best Technical Advance in HRI Paper Award
Multimodal Real-time Contingency Detection for HRI
Our goal is to develop robots that naturally engage people in social exchanges. In this work, we focus on the problem of recognizing that a person is responsive to a robot’s request for interaction. Inspired by human cognition, our approach is to treat this as a contingency detection problem.
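As a rough sketch of the idea (not the system from the paper), contingency detection can be framed as asking whether a person's behavior changes shortly after the robot's request, relative to their baseline behavior. The window lengths, threshold, and simple OR-style fusion below are illustrative assumptions.

import numpy as np

def detect_contingency(signal, event_idx, baseline_len=30,
                       response_len=30, z_thresh=2.0):
    """signal: 1-D array of one behavior feature (e.g., motion energy) per frame.
    event_idx: frame at which the robot issued its request."""
    baseline = signal[event_idx - baseline_len:event_idx]
    response = signal[event_idx:event_idx + response_len]
    # How far post-request behavior deviates from baseline variability.
    z = (response.mean() - baseline.mean()) / (baseline.std() + 1e-8)
    return z > z_thresh

def detect_multimodal(signals, event_idx):
    """Fuse per-channel decisions (e.g., motion, audio) with a simple OR."""
    return any(detect_contingency(s, event_idx) for s in signals)

# Toy example: flat motion before frame 60, increased motion afterwards.
motion = np.concatenate([np.random.normal(0.2, 0.05, 60),
                         np.random.normal(0.8, 0.05, 30)])
print(detect_multimodal([motion], event_idx=60))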
Chu et al., "Multimodal Real-time Contingency Detection for HRI", IROS 2014
Robotic Learning of Haptic Adjectives Through Physical Interaction
Image Credit: Katherine J. Kuchenbecker
To perform useful tasks in everyday human environments, robots must be able to both understand and communicate the sensations they experience during haptic interactions with objects. Toward this goal, we augmented the Willow Garage PR2 robot with a pair of SynTouch BioTac sensors to capture rich tactile signals during the execution of four exploratory procedures on 60 household objects. In a parallel experiment, human subjects blindly touched the same objects and selected binary haptic adjectives from a predetermined set of 25 labels. We developed several machine-learning algorithms to discover the meaning of each adjective from the robot’s sensory data. The most successful algorithms were those that intelligently combined static and dynamic components of the data recorded during all four exploratory procedures. The best of our approaches produced an average adjective classification F1 score of 0.77, higher than that of an average human subject.
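As a rough sketch of this evaluation setup (not the paper's actual models or features), one binary classifier can be trained per adjective on feature vectors that concatenate static and dynamic components, then scored with per-adjective F1. The SVM choice, feature dimensionality, and random data below are all illustrative assumptions.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import f1_score

def adjective_f1_scores(X, Y, adjectives):
    """X: (n_objects, n_features) combined static+dynamic features.
    Y: (n_objects, n_adjectives) binary labels from the human subjects."""
    scores = {}
    for j, adj in enumerate(adjectives):
        clf = SVC(kernel="rbf")  # stand-in classifier, one per adjective
        pred = cross_val_predict(clf, X, Y[:, j], cv=5)
        scores[adj] = f1_score(Y[:, j], pred)
    return scores

# Illustrative random data: 60 objects, 25 adjectives, 128-D features.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 128))
Y = rng.integers(0, 2, size=(60, 25))
scores = adjective_f1_scores(X, Y, [f"adj{i}" for i in range(25)])
print(np.mean(list(scores.values())))  # average F1 across adjectives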
For more details, please see the original project's website.
Chu et al. "Robotic Learning of Haptic Adjectives Through Physical Interaction", RAS 2015
Chu et al, "Using Robotic Exploratory Procedures to Learn the Meaning of Haptic Adjectives", ICRA 2013 - Best Cognitive Robotics Paper Award