
  • Article (No Access)

    HUMAN–HUMANOID INTERACTION: IS A HUMANOID ROBOT PERCEIVED AS A HUMAN?

    As humanoid robots become more commonplace in our society, it is important to understand the relation between humans and humanoid robots. In human face-to-face interaction, the observation of another individual performing an action facilitates the execution of a similar action, and interferes with the execution of a different action. This phenomenon has been explained by the existence of shared neural mechanisms for the execution and perception of actions, which would be automatically engaged by the perception of another individual's action. In one interference experiment, null interference was reported when subjects observed a robotic arm perform the incongruent task, suggesting that this effect may be specific to interacting with other humans. This experimental paradigm, designed to investigate motor interference in human interactions, was adapted to investigate how similar the implicit perception of a humanoid robot is to a human agent. Subjects performed rhythmic arm movements while observing either a human agent or humanoid robot performing either congruent or incongruent movements. The variance of the executed movements was used as a measure of the amount of interference in the movements. Both the human and humanoid agents produced a significant interference effect. These results suggest that observing the action of humanoid robots and human agents may rely on similar perceptual processes. Our findings suggest that experimental paradigms adopted from cognitive psychology can be used to derive measures for quantifying the degree of the implicit perception of a robot as human.
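
    A minimal sketch of the variance-based interference measure described above (illustrative Python, not the authors' analysis code; the array shapes and the use of an incongruent-to-congruent variance ratio are assumptions):

        import numpy as np

        def interference_index(congruent, incongruent):
            """Compare movement variance between observation conditions.

            Each argument has shape (n_trials, n_samples) and holds the
            position component orthogonal to the instructed movement
            direction.  A ratio above 1 indicates that observing the
            incongruent movement (by a human or humanoid agent) increased
            the variance of the executed movement, i.e. interference.
            """
            var_congruent = np.mean(np.var(congruent, axis=1))
            var_incongruent = np.mean(np.var(incongruent, axis=1))
            return var_incongruent / var_congruent

        # Synthetic example: incongruent-condition trajectories are noisier.
        rng = np.random.default_rng(0)
        print(interference_index(rng.normal(0, 1.0, (20, 500)),
                                 rng.normal(0, 1.5, (20, 500))))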

  • Article (No Access)

    MODELING AND TESTING PROXEMIC BEHAVIOR FOR HUMANOID ROBOTS

    Humanoid robots that share the same space with humans need to be socially acceptable and effective as they interact with people. In this paper we focus our attention on the definition of a behavior-based robotic architecture that (1) allows the robot to navigate safely in a cluttered and dynamically changing domestic environment and (2) encodes embodied non-verbal interactions: the robot respects the user's personal space (PS) by choosing the appropriate distance and direction of approach. The model of the PS is derived from human–robot interaction tests and is described in a convenient mathematical form. The robot's target location is dynamically inferred through the solution of a Bayesian filtering problem. The validation of the overall behavioral architecture shows that the robot is able to exhibit appropriate proxemic behavior.
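
    The paper derives its own personal-space model from human–robot interaction tests; the sketch below is a rough illustration of the general idea only, assuming an anisotropic Gaussian cost, which is a common choice but not necessarily the paper's formulation:

        import numpy as np

        def personal_space_cost(robot_xy, person_xy, person_heading,
                                sigma_front=1.2, sigma_side=0.8):
            """Anisotropic Gaussian personal-space cost (illustrative only).

            The cost peaks at the person's position and decays with
            distance; sigma_front > sigma_side keeps it elevated further
            out in front of the person, so a frontal approach is
            penalised more than a lateral one at the same distance.
            """
            dx, dy = np.subtract(robot_xy, person_xy)
            c, s = np.cos(person_heading), np.sin(person_heading)
            forward = c * dx + s * dy      # offset in the person's frame
            lateral = -s * dx + c * dy
            return np.exp(-(forward**2 / (2 * sigma_front**2)
                            + lateral**2 / (2 * sigma_side**2)))

        # A candidate approach point with a lower value intrudes less on
        # the personal space of a person at the origin facing along +x.
        print(personal_space_cost((1.0, 0.5), (0.0, 0.0), person_heading=0.0))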

  • Article (No Access)

    MODELING THE HUMAN BLINK: A COMPUTATIONAL MODEL FOR USE WITHIN HUMAN–ROBOT INTERACTION

    This paper describes findings from a human-to-human interaction experiment that examines communicative non-verbal facial behaviour. The aim was to develop a more comfortable and effective model of social human–robot communication. Analysis of the data revealed a strong co-occurrence between human blink production and non-verbal communicative behaviours: instigation and completion of own speech, instigation of interlocutor speech, looking at or away from the interlocutor, instigation and completion of facial expressions, and mental communicative state changes. Seventy-one percent of the 2007 analysed blinks co-occurred with these behaviours within a time window of ±375 ms, well beyond their chance co-occurrence probability of 23%. Thus between 48% and 71% of blinks are directly related to human communicative behaviour and are not simply "physiological" (e.g., for cleaning or humidifying the eye). Female participants were found to blink twice as often as male participants in the same communicative scenario and to have a longer average blink duration. These results provide the basis for the implementation of a blink generation system as part of a social cognitive robot for human–robot interaction.
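
    A minimal sketch of the kind of co-occurrence analysis reported above (illustrative only; the annotation scheme and the authors' actual pipeline are not reproduced here):

        def cooccurrence_rate(blink_times, event_times, window=0.375):
            """Fraction of blinks within +/- window seconds of any event.

            blink_times and event_times are onset times in seconds.  The
            paper reports 71% co-occurrence against a 23% chance level
            for a +/- 375 ms window.
            """
            if not blink_times:
                return 0.0
            hits = sum(
                any(abs(b - e) <= window for e in event_times)
                for b in blink_times
            )
            return hits / len(blink_times)

        blinks = [0.4, 2.1, 5.0, 7.8]        # blink onsets (s)
        events = [0.5, 5.2, 9.0]             # e.g. speech onsets, gaze shifts
        print(cooccurrence_rate(blinks, events))   # 0.5 in this toy example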

  • Article (No Access)

    MULTIMODAL AFFECT MODELING AND RECOGNITION FOR EMPATHIC ROBOT COMPANIONS

    Affect recognition for socially perceptive robots relies on representative data. While many of the existing affective corpora and databases contain posed and decontextualized affective expressions, affect resources for designing an affect recognition system in naturalistic human–robot interaction (HRI) must include context-rich expressions that emerge in the same scenario of the final application. In this paper, we propose a context-based approach to the collection and modeling of representative data for building an affect-sensitive robotic game companion. To illustrate our approach we present the key features of the Inter-ACT (INTEracting with Robots–Affect Context Task) corpus, an affective and contextually rich multimodal video corpus containing affective expressions of children playing chess with an iCat robot. We show how this corpus can be successfully used to train a context-sensitive affect recognition system (a valence detector) for a robotic game companion. Finally, we demonstrate how the integration of the affect recognition system in a modular platform for adaptive HRI makes the interaction with the robot more engaging.
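
    As a schematic illustration of a context-sensitive valence detector of this kind (a toy sketch: the features, labels, and classifier below are assumptions, not the Inter-ACT feature set or the paper's model):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Each row combines non-verbal cues (e.g. smile probability, head
        # lowering) with a game-context feature (e.g. current chess
        # advantage); the label is coarse valence (1 = positive).
        X = np.array([
            [0.8, 0.1, +1.0],
            [0.1, 0.7, -1.0],
            [0.7, 0.2, +0.5],
            [0.2, 0.6, -0.5],
        ])
        y = np.array([1, 0, 1, 0])

        detector = LogisticRegression().fit(X, y)
        print(detector.predict([[0.6, 0.3, 0.2]]))  # predicted valence class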

  • Article (Open Access)

    Biological Motion Aids Gestural Communication by Humanoid Social Robots

    Advances in social robotics have led to increased interest in designing robots that can communicate effectively with humans through nonverbal gestures. One approach to enhancing the naturalness and expressiveness of robot gestures is to incorporate biological motion, which mimics the velocity and acceleration profiles observed in human gestures. This paper examines the use of biological motion for gestural communication by humanoid social robots, focusing on how biological motion affects the perceived warmth of robot gestures and their effectiveness in fostering engagement during interaction. The study involved implementing the minimum jerk model of biological motion on a Pepper humanoid social robot and conducting user studies to evaluate the impact of biological motion on human–robot interaction. The results show that incorporating biological motion cues can significantly increase the perceived warmth of robot gestures and improve the overall effectiveness of gestural communication, resulting in more natural and engaging human–robot interaction.
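
    The minimum jerk model mentioned above is usually written as the fifth-order polynomial x(t) = x0 + (xf - x0) * (10 tau^3 - 15 tau^4 + 6 tau^5) with tau = t/T; a minimal sketch follows (how the resulting profile is streamed to Pepper's joint controllers is not shown, and the parameter values are illustrative):

        import numpy as np

        def minimum_jerk(start, goal, duration, n_samples=100):
            """Minimum-jerk position profile between two joint angles.

            x(t) = x0 + (xf - x0) * (10 tau^3 - 15 tau^4 + 6 tau^5),
            tau = t/T, which gives the bell-shaped velocity profile
            characteristic of human reaching movements.
            """
            tau = np.linspace(0.0, 1.0, n_samples)
            shape = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
            return tau * duration, start + (goal - start) * shape

        t, angles = minimum_jerk(start=0.0, goal=1.2, duration=1.5)
        print(angles[0], angles[-1])   # 0.0 ... 1.2 rad, zero end velocities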

  • Article (No Access)

    Portmanteau word-play for vocabulary enhancement with humanoid robot support

    Word-play is a powerful learning and motivational tool often used by educators for teaching reading, which is a complex activity. In this paper, we introduce a system that exploits a Pepper humanoid robot acting as a playfellow in a word-play game based on portmanteau words. The robot is able to play with children using a conversation engine, a portmanteau creation engine, and a definition engine. In this manner, Pepper can integrate itself into a group of children, and it can support a teacher in her activities. The humanoid can take part in a word-based round-game in which it plays the role of either answerer or generator of new words.
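
    One simple way a portmanteau creation engine might blend two words (an illustrative heuristic only; the paper does not describe its engine at this level of detail here):

        def portmanteau(word_a, word_b, min_overlap=2):
            """Blend two words on a shared letter sequence.

            Looks for the longest suffix of word_a that is also a prefix
            of word_b; if none is found, simply splices the two halves.
            """
            for size in range(min(len(word_a), len(word_b)), min_overlap - 1, -1):
                if word_a.endswith(word_b[:size]):
                    return word_a + word_b[size:]
            return word_a[:len(word_a) // 2] + word_b[len(word_b) // 2:]

        print(portmanteau("breakfast", "fastfood"))   # breakfastfood
        print(portmanteau("smoke", "fog"))            # smog (fallback splice)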

  • Chapter (No Access)

    Portmanteau word-play for vocabulary enhancement with humanoid robot support

    Word-play is a powerful learning and motivational tool often used by educators for teaching reading, which is a complex activity. In this paper, we introduce a system that exploits a Pepper humanoid robot acting as a playfellow in a word-play game based on portmanteau words. The robot is able to play with children using a conversation engine, a portmanteau creation engine, and a definition engine. In this manner, Pepper can integrate itself into a group of children, and it can support a teacher in her activities. The humanoid can take part in a word-based round-game in which it plays the role of either answerer or generator of new words.

  • Chapter (No Access)

    RETARGETING SYSTEM FOR A SOCIAL ROBOT IMITATION INTERFACE

    This paper presents a novel retargeting method, included in an interface that allows a social robot to imitate the gestures performed by a human demonstrator. The input to this interface is human pose data obtained from any motion capture system. A general 3D human model adopts the perceived pose, and this pose is then retargeted to the particular robotic platform being used. The retargeting module combines two different strategies to translate data from human to robot. Experimental results show that this combined approach is able to preserve the characteristics of both static and dynamic gestures in different scenarios. The system has been tested on two different robotic platforms: the Fujitsu HOAP-1 humanoid robot and NOMADA, a new social robot currently under development in our research group.
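
    A minimal sketch of the retargeting idea (illustrative only: the joint names, per-joint scaling, and clamping below are assumptions and do not reproduce the paper's combined strategies):

        import numpy as np

        def retarget_pose(human_angles, robot_limits, scale=None):
            """Map human joint angles onto a robot with its own joint limits.

            human_angles : dict joint name -> angle (rad) from the generic
                           human model driven by motion capture.
            robot_limits : dict joint name -> (min, max) for the target robot.
            Joints without a robot counterpart are skipped; the rest are
            optionally rescaled and clamped to the robot's range.
            """
            scale = scale or {}
            robot_pose = {}
            for joint, angle in human_angles.items():
                if joint not in robot_limits:
                    continue
                lo, hi = robot_limits[joint]
                robot_pose[joint] = float(np.clip(angle * scale.get(joint, 1.0), lo, hi))
            return robot_pose

        human = {"shoulder_pitch": 1.9, "elbow_roll": -0.4}
        limits = {"shoulder_pitch": (-2.0, 2.0), "elbow_roll": (-1.5, 0.0)}
        print(retarget_pose(human, limits))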

  • Chapter (No Access)

    DESIGNING A ROBOTIC INTERFACE FOR CHILDREN: THE MONARCH ROBOT EXAMPLE

    The development of an empathic link between oneself and the Other is a fundamental part of interpersonal relationships. It determines the establishment of the effective social and affective bonds that are the grounding basis of successful communication and cooperation, on which the cohesion of human societies depends and on which harmonious global personal development also rests.

    The design of efficient robotic interfaces for interaction with people, namely with children, depends on the development of expressive elements in the appearance of robots and in the way they address and interact with people, i.e., on the definition of a set of social behaviours identified as communication enhancers.

    The present paper reflects on how these assumptions determined the process that led to the construction of the MOnarCH robots and some of their design options.