Human-Machine Interaction (HMI) is a multidisciplinary research area focused on interaction modalities between humans and machines. For instance, when computers are considered, the field is generally referred to as Human-Computer Interaction (HCI); when the focus is on robots, it is known as Human-Robot Interaction (HRI).
In these fields, new and exciting scenarios are opened by recent technological advances in user input devices, which provide a means for natural interaction modes. Gestures on touch and multi-touch screens, hand and body poses, speech, and gaze tracking are just a few examples of the new ways people can interact with devices.
An up-to-date analysis of the current and future trends in HMI can be found in the Guest Editor's Introduction of the September 2014 issue of Computing Now on "Human-Computer Interaction: Present and Future Trends".
The research activities carried out by the GRAINS group in this area concern the study, design, implementation, and assessment of affordable natural user interfaces (NUIs) and related issues. Many applications can benefit from personalized user input methods for interacting with a system. The research group designed and investigated reconfigurable interfaces based on touch and multi-touch devices. Applications range from the manipulation of real-time 3D generated content to the control of teleoperated robots such as quadrotors and unmanned vehicles.
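As an illustration of how a touch interface can drive a teleoperated vehicle, the sketch below maps a one-finger drag on a touch surface to planar velocity commands. The coordinate convention, gains, and dead zone are illustrative assumptions, not the group's actual interface.

```python
# Hypothetical sketch: mapping a one-finger drag (normalized screen
# coordinates in [0, 1]) to planar velocity commands for a quadrotor.
# Gains and dead zone are illustrative assumptions.

def drag_to_velocity(start, current, max_speed=1.0, dead_zone=0.05):
    """Convert a drag vector into (vx, vy) velocity commands in m/s."""
    dx = current[0] - start[0]
    dy = current[1] - start[1]
    # Ignore tiny movements so the vehicle hovers when the finger is nearly still.
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return (0.0, 0.0)
    # Clamp each axis to [-1, 1] before scaling to the speed limit;
    # screen y grows downward, so it is negated for forward motion.
    clamp = lambda v: max(-1.0, min(1.0, v))
    return (clamp(dx * 2.0) * max_speed, clamp(-dy * 2.0) * max_speed)
```

For example, `drag_to_velocity((0.5, 0.5), (0.9, 0.2))` yields a command moving right and forward; because the mapping is a plain function of screen coordinates, the same interface can be reconfigured by swapping the gain and dead-zone parameters.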
Another challenging research topic faced by the group concerns fully natural interfaces, where part of or the whole human body can act as the controller. In particular, the studies concern the design of methodologies to map human hand and body poses to computer commands for the control of kinematic chains. Additional challenges arise when non-anthropomorphic shapes are considered, as in the animation of virtual characters in computer graphics and the control of robotic armatures and semi-autonomous vehicles.
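The core of such pose-to-command mappings can be sketched as a retargeting step: normalized human joint values (e.g. finger flexion estimated by a tracking device) are remapped onto the joint limits of a target kinematic chain, which need not be anthropomorphic. The joint names, limits, and the linear mapping below are illustrative assumptions, not the group's actual methodology.

```python
# Hypothetical retargeting sketch: each normalized pose value in [0, 1]
# is linearly remapped onto the target joint's limit range (in radians).
# Joint names and limits are illustrative, not an actual robot model.

def retarget(pose, joint_limits):
    """Map normalized pose values onto target joint ranges.

    pose:         dict of joint name -> value in [0, 1]
    joint_limits: dict of joint name -> (lo, hi) in radians
    returns:      dict of joint name -> commanded angle in radians
    """
    commands = {}
    for joint, value in pose.items():
        lo, hi = joint_limits[joint]
        v = max(0.0, min(1.0, value))  # clamp noisy tracking data
        commands[joint] = lo + v * (hi - lo)
    return commands
```

For instance, `retarget({"elbow": 0.5}, {"elbow": (0.0, 2.0)})` commands the target joint to the middle of its range; for a non-anthropomorphic armature, the same scheme would simply route a human joint onto a structurally different target joint.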