Publications catalog - books

Affective Computing and Intelligent Interaction: 2nd International Conference, ACII 2007 Lisbon, Portugal, September 12-14, 2007 Proceedings

Ana C. R. Paiva ; Rui Prada ; Rosalind W. Picard (eds.)

Conference: 2nd International Conference on Affective Computing and Intelligent Interaction (ACII). Lisbon, Portugal. September 12-14, 2007

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Not available.

Availability

Detected institution | Year of publication | Browse | Download | Request
Not detected | 2007 | SpringerLink | – | –

Información

Resource type:

books

Print ISBN

978-3-540-74888-5

Electronic ISBN

978-3-540-74889-2

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

2007

Publication rights information

© Springer-Verlag Berlin Heidelberg 2007

Table of contents

Collection and Annotation of a Corpus of Human-Human Multimodal Interactions: Emotion and Others Anthropomorphic Characteristics

Aurélie Zara; Valérie Maffiolo; Jean-Claude Martin; Laurence Devillers

In order to design affective interactive systems, experimental grounding is required for studying expressions of emotion during interaction. In this paper, we present the EmoTaboo protocol for the collection of multimodal emotional behaviours occurring during human-human interactions in a game context. First annotations revealed that the collected data contain various multimodal expressions of emotions and other mental states. In order to reduce the influence of language via a predetermined set of labels and to take into account differences between coders in their capacity to verbalize their perception, we introduce a new annotation methodology based on 1) a hierarchical taxonomy of emotion-related words, and 2) the design of the annotation interface. Future directions include the implementation of such an annotation tool and its evaluation for the annotation of multimodal interactive and emotional behaviours. We will also extend our first annotation scheme to several other characteristics that are interdependent with emotions.

- Affective Databases, Annotations, Tools and Languages | Pp. 464-475
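
The annotation methodology in this entry hinges on a hierarchical taxonomy of emotion-related words. A minimal sketch of how such a taxonomy might be represented and queried, with all categories and words assumed for illustration rather than taken from the paper:

```python
# Illustrative sketch only: a tiny hierarchical taxonomy of
# emotion-related words, of the general kind the EmoTaboo paper
# proposes for annotation. The categories and words below are
# assumptions, not the authors' actual taxonomy.

TAXONOMY = {
    "positive": {
        "joy": ["amused", "delighted", "elated"],
        "affection": ["fond", "warm", "tender"],
    },
    "negative": {
        "anger": ["irritated", "annoyed", "furious"],
        "sadness": ["gloomy", "dejected", "downcast"],
    },
}

def path_for(word: str):
    """Return the taxonomy path (top level, family) for a label word."""
    for top, families in TAXONOMY.items():
        for family, words in families.items():
            if word in words:
                return (top, family)
    return None

# Coders who differ in how finely they verbalize their perception can
# annotate at different depths and still be compared at a coarser level.
print(path_for("furious"))  # ('negative', 'anger')
```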

Using Actor Portrayals to Systematically Study Multimodal Emotion Expression: The GEMEP Corpus

Tanja Bänziger; Klaus R. Scherer

Emotion research is intrinsically confronted with a serious difficulty in accessing pertinent data. For both practical and ethical reasons, genuine and intense emotions are difficult to induce in the laboratory; and sampling sufficient data to capture an adequate variety of emotional episodes requires extensive resources. For researchers interested in emotional expressivity and nonverbal communication of emotion, this situation is further complicated by the pervasiveness of expressive regulations. Given that emotional expressions are likely to be regulated in most situations of our daily lives, spontaneous emotional expressions are especially difficult to access. We argue in this paper that, in view of the needs of current research programs in this field, well-designed corpora of acted emotion portrayals can play a useful role. We present some of the arguments motivating the creation of a multimodal corpus of emotion portrayals (Geneva Multimodal Emotion Portrayal, GEMEP) and discuss its overall benefits and limitations for emotion research.

- Affective Databases, Annotations, Tools and Languages | Pp. 476-487

The HUMAINE Database: Addressing the Collection and Annotation of Naturalistic and Induced Emotional Data

Ellen Douglas-Cowie; Roddy Cowie; Ian Sneddon; Cate Cox; Orla Lowry; Margaret McRorie; Jean-Claude Martin; Laurence Devillers; Sarkis Abrilian; Anton Batliner; Noam Amir; Kostas Karpouzis

The HUMAINE project is concerned with developing interfaces that will register and respond to emotion, particularly pervasive emotion (forms of feeling, expression and action that colour most of human life). The HUMAINE Database provides naturalistic clips which record that kind of material, in multiple modalities, and labelling techniques that are suited to describing it.

- Affective Databases, Annotations, Tools and Languages | Pp. 488-500

User-Centered Control of Audio and Visual Expressive Feedback by Full-Body Movements

Ginevra Castellano; Roberto Bresin; Antonio Camurri; Gualtiero Volpe

In this paper we describe a system allowing users to express themselves through their full-body movement and gesture and to control in real time the generation of an audio-visual feedback. The system analyses in real time the user's full-body movement and gesture, extracts expressive motion features and maps their values onto real-time control of acoustic parameters for rendering a music performance. At the same time, visual feedback generated in real time is projected on a screen in front of the users as their coloured silhouette, depending on the emotion their movement communicates. Human movement analysis and visual feedback generation were done with the EyesWeb software platform and the music performance rendering with pDM. Evaluation tests were done with human participants to test the usability of the interface and the effectiveness of the design.

- Affective Sound and Music Processing | Pp. 501-510
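
This entry describes extracting expressive motion features and mapping them onto acoustic performance parameters. A minimal sketch of such a mapping; the feature names, ranges, and mapping rules are assumptions for illustration, not the paper's actual EyesWeb/pDM mappings:

```python
# Illustrative sketch: mapping normalized expressive motion features
# (0..1) onto acoustic parameters for a music performance renderer.
# All names, ranges and rules below are assumed for illustration.

def map_motion_to_sound(quantity_of_motion: float, smoothness: float) -> dict:
    """Map normalized motion features to acoustic performance parameters."""
    return {
        # More overall movement -> faster, louder performance.
        "tempo_scale": 0.8 + 0.4 * quantity_of_motion,
        "sound_level": 0.5 + 0.5 * quantity_of_motion,
        # Smoother movement -> more legato articulation.
        "articulation": "legato" if smoothness > 0.5 else "staccato",
    }

print(map_motion_to_sound(quantity_of_motion=0.9, smoothness=0.2))
# {'tempo_scale': 1.16, 'sound_level': 0.95, 'articulation': 'staccato'}
```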

Towards Affective-Psychophysiological Foundations for Music Production

António Pedro Oliveira; Amílcar Cardoso

This paper describes affective and psychophysiological foundations used to help control affective content in music production. Our work includes the proposal of a knowledge base grounded on the state of the art in areas of Music Psychology. This knowledge base relates affective states (happiness, sadness, etc.) to high-level music features (rhythm, melody, etc.) to assist in the production of affective music. A computer system uses this knowledge base to select and transform chunks of music. The methodology underlying this system is essentially founded on Affective Computing topics. Psychophysiological measures will be used to detect the listener's affective state.

- Affective Sound and Music Processing | Pp. 511-522
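
A minimal sketch of the kind of knowledge base described in this entry, using textbook Music Psychology associations (fast tempo and major mode for happiness, slow tempo and minor mode for sadness) as assumed example rules; the paper's actual knowledge base is not given in the abstract:

```python
# Illustrative sketch of a knowledge base relating affective states to
# high-level music features. The rules are common Music Psychology
# associations used here as assumed examples.

KNOWLEDGE_BASE = {
    "happiness": {"tempo": "fast", "mode": "major", "pitch_range": "wide"},
    "sadness":   {"tempo": "slow", "mode": "minor", "pitch_range": "narrow"},
}

def features_for(affective_state: str) -> dict:
    """Look up the high-level music features associated with a state."""
    return KNOWLEDGE_BASE.get(affective_state, {})

# A production system could use these features to select and transform
# chunks of music toward the intended affective content.
print(features_for("sadness"))
# {'tempo': 'slow', 'mode': 'minor', 'pitch_range': 'narrow'}
```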

Sound Design for Affective Interaction

Anna DeWitt; Roberto Bresin

Different design approaches contributed to what we see today as the prevalent design paradigm for Human Computer Interaction, though they have mostly been applied to the visual aspect of interaction. In this paper we present a proposal for sound design strategies that can be used in applications involving affective interaction. For testing our approach we propose the sonification of the Affective Diary, a digital diary with a focus on the emotions, affects, and bodily experience of the user. We applied results from studies in music and emotion to sonic interaction design. This is one of the first attempts at introducing different physics-based models for the real-time complete sonification of an interactive user interface on portable devices.

- Affective Sound and Music Processing | Pp. 523-533
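
The physics-based models mentioned in this entry are not detailed in the abstract. A minimal sketch of one common physics-based sonification primitive, an impact sound rendered as a sum of exponentially decaying sinusoidal modes, with all frequencies, decay rates, and amplitudes assumed:

```python
# Illustrative sketch of a physics-based sound model of the general
# kind used in sonification: modal synthesis of an impact as a sum of
# exponentially decaying sinusoids. Parameter values are assumed, not
# taken from the Affective Diary sonification.
import math

def impact_sound(duration=0.5, sample_rate=44100,
                 modes=((440.0, 8.0, 1.0), (880.0, 12.0, 0.5))):
    """Synthesize an impact; each mode is (frequency_hz, decay, amplitude)."""
    n = int(duration * sample_rate)
    samples = []
    for i in range(n):
        t = i / sample_rate
        s = sum(a * math.exp(-d * t) * math.sin(2 * math.pi * f * t)
                for f, d, a in modes)
        samples.append(s)
    return samples

audio = impact_sound()
print(len(audio))  # 22050 samples of a decaying, bell-like impact
```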

Explanatory Style for Socially Interactive Agents

Sejin Oh; Jonathan Gratch; Woontack Woo

Recent years have seen an explosion of interest in computational models of socio-emotional processes, both as a means to deepen understanding of human behavior and as a mechanism to drive a variety of training and entertainment applications. In contrast with work on emotion, where research groups have developed detailed models of emotional processes, models of personality have emphasized shallow surface behavior. Here, we build on computational appraisal models of emotion to better characterize dispositional differences in how people come to understand social situations. Known as explanatory style, this dispositional factor plays a key role in social interactions and certain socio-emotional disorders, such as depression. Building on appraisal and attribution theories, we model key conceptual variables underlying the explanatory style, and enable agents to exhibit different explanatory tendencies according to their personalities. We describe an interactive virtual environment that uses the model to allow participants to explore individual differences in the explanation of social events, with the goal of encouraging the development of perspective taking and emotion-regulatory skills.

- Affective Interactions: Systems and Applications | Pp. 534-545
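
Explanatory style is classically characterized in attribution theory by the internality, stability, and globality of a person's explanations for events. A minimal sketch of classifying a style from those attribution dimensions; the thresholding rule is an assumed simplification, not the paper's computational appraisal model:

```python
# Illustrative sketch: classifying an agent's explanatory style from
# classic attribution-theory dimensions. The averaging and threshold
# are assumed simplifications for illustration.
from dataclasses import dataclass

@dataclass
class Attribution:
    """An agent's explanation of a negative event, each dimension 0..1."""
    internal: float  # caused by me vs. external circumstances
    stable: float    # will persist vs. transient
    global_: float   # affects everything vs. this situation only

def explanatory_style(a: Attribution) -> str:
    """Internal, stable, global explanations of negative events are the
    hallmark of a pessimistic explanatory style in attribution theory."""
    score = (a.internal + a.stable + a.global_) / 3
    return "pessimistic" if score > 0.5 else "optimistic"

print(explanatory_style(Attribution(internal=0.9, stable=0.8, global_=0.7)))
# pessimistic
```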

Expression of Emotions in Virtual Humans Using Lights, Shadows, Composition and Filters

Celso de Melo; Ana Paiva

Artists use words, lines, shapes, color, sound and their bodies to express emotions. Virtual humans use postures, gestures, face and voice to express emotions. Why are they limiting themselves to the body? The digital medium affords the expression of emotions using lights, camera, sound and the pixels on the screen itself. Thus, leveraging accumulated knowledge from the arts, this work proposes a model for the expression of emotions in virtual humans which goes beyond embodiment and explores lights, shadows, composition and filters to convey emotions. First, the model integrates the OCC emotion model for emotion synthesis. Second, the model defines a pixel-based lighting model which supports extensive expressive control of lights and shadows. Third, the model explores the visual arts techniques of composition in layers and filtering to manipulate the virtual human's pixels themselves. Finally, the model introduces a markup language to define mappings between emotional states and multimodal expression.

- Affective Interactions: Systems and Applications | Pp. 546-557
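
This entry describes a markup language mapping emotional states to lights, shadows, and filters. A minimal sketch of an analogous mapping; all parameter names and values are assumed for illustration, not the paper's actual model (only the emotion labels "joy" and "distress" are standard OCC categories):

```python
# Illustrative sketch: a mapping from emotional states to lighting and
# filter parameters, analogous in spirit to the markup mappings the
# paper introduces. All staging values below are assumptions.

EXPRESSION_MAP = {
    "joy": {
        "key_light": {"intensity": 1.0, "color": (1.0, 0.95, 0.8)},
        "shadows": "soft",
        "filter": "warm_saturate",
    },
    "distress": {
        "key_light": {"intensity": 0.4, "color": (0.5, 0.55, 0.7)},
        "shadows": "hard",
        "filter": "desaturate",
    },
}

def staging_for(emotion: str) -> dict:
    """Return the lights/shadows/filter staging for an emotional state."""
    return EXPRESSION_MAP.get(emotion, EXPRESSION_MAP["joy"])

print(staging_for("distress")["filter"])  # desaturate
```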

Pogany: A Tangible Cephalomorphic Interface for Expressive Facial Animation

Christian Jacquemin

A head-shaped input device is used to produce expressive facial animations. The physical interface is divided into zones, and each zone controls an expression on a smiley or on a virtual 3D face. Through contact with the interface, users can generate basic or blended expressions. To evaluate the interface and to analyze the behavior of the users, we performed a study consisting of three experiments in which subjects were asked to reproduce simple or more subtle expressions. The results show that the subjects easily accept the interface and become engaged in a pleasant affective relationship that makes them feel as if they were sculpting the virtual face. This work shows that anthropomorphic interfaces can be used successfully for intuitive affective expression.

- Affective Interactions: Systems and Applications | Pp. 558-569
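
This entry maps touch zones on a physical head to basic and blended expressions. A minimal sketch of zone-weighted expression blending; the zone layout and weighting scheme are assumptions for illustration, not Pogany's actual mapping:

```python
# Illustrative sketch: blending basic facial expressions from contact
# with zones of a head-shaped interface. The zone layout and the
# normalization scheme are assumed for illustration.

# Each touch zone contributes to one basic expression.
ZONE_TO_EXPRESSION = {
    "mouth_corners": "smile",
    "brow": "frown",
    "eyelids": "surprise",
}

def blended_expression(contact_pressure: dict) -> dict:
    """Normalize per-zone contact pressure into expression blend weights."""
    raw = {}
    for zone, pressure in contact_pressure.items():
        expr = ZONE_TO_EXPRESSION.get(zone)
        if expr is not None:
            raw[expr] = raw.get(expr, 0.0) + pressure
    total = sum(raw.values()) or 1.0
    return {expr: w / total for expr, w in raw.items()}

# Firm contact at the mouth corners plus light contact at the brow
# yields a blend dominated by 'smile'.
print(blended_expression({"mouth_corners": 0.8, "brow": 0.2}))
# {'smile': 0.8, 'frown': 0.2}
```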

SuperDreamCity: An Immersive Virtual Reality Experience That Responds to Electrodermal Activity

Doron Friedman; Kana Suji; Mel Slater

In this paper we describe an artistic exhibition that took place in our highly immersive virtual-reality laboratory. We allowed visitors to explore a virtual landscape based on the content of night dreams, where navigation inside the landscape was driven by online feedback from their electrodermal response. We analyze a subset of the physiology data captured from participants and describe a new method for analyzing dynamic physiological experiences based on hidden Markov models.

- Affective Interactions: Systems and Applications | Pp. 570-581
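
This entry mentions analyzing physiological experiences with hidden Markov models. A minimal sketch of fitting an HMM to a skin-conductance trace, using the third-party hmmlearn package and synthetic data as assumed stand-ins for whatever formulation and recordings the paper actually uses:

```python
# Illustrative sketch: fitting a hidden Markov model to an
# electrodermal activity (skin conductance) trace to recover latent
# arousal states. hmmlearn and the synthetic signal are assumptions;
# the paper's actual HMM formulation is not given in the abstract.
import numpy as np
from hmmlearn import hmm

# Synthetic skin-conductance signal: a calm baseline with an aroused
# episode of elevated, noisier conductance in the middle.
rng = np.random.default_rng(0)
calm = rng.normal(2.0, 0.1, size=200)
aroused = rng.normal(5.0, 0.5, size=100)
signal = np.concatenate([calm, aroused, calm]).reshape(-1, 1)

# Two hidden states (e.g., calm vs. aroused) with Gaussian emissions.
model = hmm.GaussianHMM(n_components=2, covariance_type="diag", n_iter=50)
model.fit(signal)
states = model.predict(signal)

# The decoded state sequence segments the experience over time.
print(states[:5], states[250:255])  # baseline state vs. aroused state
```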