Keya Ghonasgi

Gamified environments for human training

Robotics research has advanced significantly in recent decades, demonstrating the potential to greatly improve human quality of life through human-robot interaction (HRI). This field has largely focused on the engineering of complex robotic systems, such as the da Vinci surgical system with its advanced haptic technology and exoskeletons for human movement augmentation. However, the research so far has primarily addressed the design and control challenges of the robots, without considering how human motor learning and adaptation change the behavior of the combined human-robot system. Neuroscience research has identified many factors that affect human learning, such as cognitive load and salience. The study of human-robot interaction should therefore account for these adaptation and learning effects, and researchers in this field should understand the different methods of assessing human behavior.


The design of a testing environment for evaluating human behavior during HRI relies on a balance between two conflicting factors: 1) cognitive load, and 2) salience. Here, cognitive load refers to the amount of cognitive effort the person must expend to perform the assessment task, ranging from a relatively simple pointing task while wearing an exoskeleton to complicated knot-tying in a surgical application. Salience refers to the user's investment in the task and the information they receive about their task performance. Motor learning researchers have suggested that designing an engaging and informative task environment is key to shaping human behavior in the human-robot system (Krakauer et al. 2019). Our work with the Harmony exoskeleton focuses on assessing human behavior during training while wearing an exoskeleton. To this end, we have designed two different task environments and used them to assess human-exoskeleton interaction. I gave a talk about these two environments and their relative advantages at the IROS 2023 workshop on Ergonomic Human-Robot Collaboration.



The first task environment, called Reach Ninja, is a 2D reaching game loosely based on the popular smartphone game Fruit Ninja. The game is designed to be dynamic and fast-paced: it is played on a computer monitor while the player's hand movement is tracked with a webcam. This environment has been used both for online experiments conducted via video conferencing and for in-person experiments with an exoskeleton robot.
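The post does not describe how the webcam tracking is implemented, but a minimal sketch of the general idea, mapping a tracked hand landmark to a 2D game cursor, might look like the following. The use of OpenCV and MediaPipe here is an illustrative assumption, not necessarily what Reach Ninja uses.

```python
# Minimal webcam hand-tracking loop that maps the hand to a 2D cursor.
# OpenCV and MediaPipe are illustrative choices for this sketch.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1,
                                 min_detection_confidence=0.5)
cap = cv2.VideoCapture(0)  # default webcam

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV captures BGR.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        # Use the wrist landmark (index 0) as the cursor position,
        # normalized to [0, 1] in image coordinates.
        wrist = results.multi_hand_landmarks[0].landmark[0]
        cursor_x, cursor_y = wrist.x, wrist.y
        # A game loop would map (cursor_x, cursor_y) to screen pixels
        # and check collisions with on-screen targets here.
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
hands.close()
```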



The second environment is a dynamic virtual-reality rendering of the Japanese ball-and-cup toy, Kendama. The goal of this task is to swing a ball, attached to the cup by a string, so that the ball lands in the cup. Users have trained on this task while wearing a virtual reality headset and an exoskeleton robot.
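The post does not detail how the virtual ball is simulated, but the core dynamics are essentially those of a pendulum whose pivot (the cup) is driven by the user's hand. Below is a minimal sketch of that idea; the parameters, the 2D simplification, and the semi-implicit Euler integrator are all illustrative assumptions rather than the actual implementation.

```python
# Minimal 2D ball-on-string (pendulum with a moving pivot) simulation,
# sketching the kind of dynamics a virtual Kendama ball might follow.
import math

g = 9.81              # gravity (m/s^2)
string_len = 0.4      # string length (m), hypothetical value
dt = 1.0 / 90.0       # typical VR frame time (s)

def step(theta, omega, cup_accel=0.0):
    """Advance the ball by one frame.

    theta: ball angle from the vertical below the cup (rad)
    omega: angular velocity (rad/s)
    cup_accel: horizontal acceleration of the cup, driven by the
        user's hand; accelerating the cup swings the ball, which is
        how the user builds up the momentum needed to catch it.
    """
    # Pendulum dynamics expressed in the cup's (non-inertial) frame.
    alpha = -(g / string_len) * math.sin(theta) \
            - (cup_accel / string_len) * math.cos(theta)
    omega += alpha * dt          # semi-implicit Euler update
    theta += omega * dt
    return theta, omega

theta, omega = math.pi / 4, 0.0
for _ in range(900):             # ~10 s of simulated play
    theta, omega = step(theta, omega, cup_accel=0.0)
```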



