Title Children Interpretation of Emotional Body Language Displayed by a Robot

Authors Beck A., Cañamero L., Damiano L., Sommavilla G., Tesser F., Cosi P.

In Social Robotics: Third International Conference, ICSR 2011, Proceedings (Amsterdam, 24-25 November 2011). Lecture Notes in Artificial Intelligence, vol. 7072, pp. 62-70. B. Mutlu, C. Bartneck, J. Ham, V. Evers, T. Kanda (eds.). Springer, 2011.

URL http://www.springerlink.com/content/l26718l752k226ph

DOI 10.1007/978-3-642-25504-5_7

Abstract
(English)
Previous results show that adults are able to interpret different key poses displayed by the robot, and that changing the head position affects the expressiveness of the key poses in a consistent way. Moving the head down decreases arousal (the level of energy), valence (positive or negative) and stance (approaching or avoiding), whereas moving the head up produces an increase along these dimensions [1]. Hence, changing the head position should send intuitive signals that could be used during an interaction. The ALIZ-E target group is children between the ages of 8 and 11. Existing results suggest that they are able to interpret human emotional body language [2, 3]. Based on these results, an experiment was conducted to test whether the findings of [1] also apply to children. If so, body postures and head positions could be used to convey emotions during an interaction.
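The directional mapping reported in the abstract (head up raises arousal, valence and stance; head down lowers them) can be sketched as a simple adjustment rule. The following Python sketch is illustrative only: the `Affect` type, the `adjust_for_head_position` function and the step size are hypothetical and are not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Affect:
    arousal: float  # level of energy
    valence: float  # positive or negative
    stance: float   # approaching or avoiding

def adjust_for_head_position(base: Affect, head_up: bool, step: float = 0.1) -> Affect:
    """Shift all three affective dimensions in the direction reported in [1]:
    head up -> increase, head down -> decrease. The magnitude `step` is an
    arbitrary placeholder, not a value from the paper."""
    delta = step if head_up else -step
    return Affect(base.arousal + delta, base.valence + delta, base.stance + delta)

# Example: a neutral key pose with the head moved down reads as lower
# arousal, valence, and stance.
neutral = Affect(arousal=0.0, valence=0.0, stance=0.0)
print(adjust_for_head_position(neutral, head_up=False))
```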

Subjects Affective computing


Download 2011-A2-025.pdf

