Multi-Modal Perception and Behavior Adaptation Models for Human State Understanding and Interaction Improvement in Robotic Touch

Huy Quyen Ngo 1, Rana Soltani Zarrin 2
1 The Robotics Institute
2 Honda Research Institute

Abstract

Robots that can physically interact with humans in a safe, comfortable, and intuitive manner can help in a variety of settings. However, users' perceptions greatly affect the acceptability of such robots. The ability of a system to understand a user's perception of the physical interaction, and to adapt the robot's behaviors based on that perception and the interaction context, can facilitate acceptance of these robots. In this paper we propose a perception-based interaction adaptation framework. One main component of this framework is a multi-modal perception model which is grounded in the existing literature and is intended to provide a quantitative estimation of the human state, defined as the person's perceptions of the physical interaction, using human, robot, and context information. This model is intended to be comprehensive across many physical Human-Robot Interaction (pHRI) scenarios. The estimated human state is fed to a context-aware behavior adaptation framework which recommends robot behaviors to improve the human state using a learned behavior cost model and an optimization formulation. We show the potential and feasibility of such a human state estimation model by evaluating a reduced model on data collected through a user study. Additionally, through feature analysis, we aim to shed light on future interaction designs for pHRI.
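The adaptation loop the abstract describes (estimate the human state from multi-modal features, score candidate robot behaviors with a learned cost model, and recommend the lowest-cost behavior) could be sketched as follows. This is a hypothetical illustration under assumed interfaces: the feature names, the linear state estimator, and the toy cost function are all illustrative assumptions, not the authors' actual models.

```python
import numpy as np

def estimate_human_state(human_feats, robot_feats, context_feats, weights):
    """Combine multi-modal features into a scalar human-state estimate
    (e.g., perceived comfort of the physical interaction).
    A linear model stands in for the paper's perception model."""
    x = np.concatenate([human_feats, robot_feats, context_feats])
    return float(weights @ x)

def recommend_behavior(candidates, state, cost_fn):
    """Pick the candidate robot behavior with the lowest learned cost
    given the current estimated human state (a discrete stand-in for
    the paper's optimization formulation)."""
    costs = [cost_fn(b, state) for b in candidates]
    return candidates[int(np.argmin(costs))]

# Toy "learned" cost: penalize contact force more heavily when the
# estimated state suggests discomfort (state < 0). Purely illustrative.
def toy_cost(behavior, state):
    return behavior["force"] * (1.0 if state >= 0 else 2.0)

state = estimate_human_state(
    np.array([0.2, -0.5]),          # hypothetical human features (e.g., expression)
    np.array([0.8]),                # hypothetical robot feature (e.g., contact force)
    np.array([1.0]),                # hypothetical context indicator
    np.array([0.5, 1.0, -0.3, 0.1]),  # assumed learned weights
)
behaviors = [{"force": 0.9}, {"force": 0.4}]
best = recommend_behavior(behaviors, state, toy_cost)
```

In this toy run the estimated state is negative (suggesting discomfort), so the lower-force behavior receives the lower cost and is recommended.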