Publications catalog - books



Affective Computing and Intelligent Interaction: 2nd International Conference, ACII 2007 Lisbon, Portugal, September 12-14, 2007 Proceedings

Ana C. R. Paiva ; Rui Prada ; Rosalind W. Picard (eds.)

Conference: 2nd International Conference on Affective Computing and Intelligent Interaction (ACII). Lisbon, Portugal. September 12-14, 2007

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Not available.

Availability

Detected institution: none. Publication year: 2007. Available at: SpringerLink.

Information

Resource type:

books

Print ISBN

978-3-540-74888-5

Electronic ISBN

978-3-540-74889-2

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

2007

Publication rights information

© Springer-Verlag Berlin Heidelberg 2007

Table of contents

Expressive Face Animation Synthesis Based on Dynamic Mapping Method

Panrong Yin; Liyue Zhao; Lixing Huang; Jianhua Tao

In this paper, we present a framework for an expressive speech-driven face animation system. It systematically addresses audio-visual data acquisition, expressive trajectory analysis and audio-visual mapping. Based on this framework, we learn the correlation between neutral facial deformation and expressive facial deformation with a Gaussian Mixture Model (GMM). A hierarchical structure is proposed to map the acoustic parameters to lip FAPs. The synthesized neutral FAP streams are then extended with expressive variations according to the prosody of the input speech. The quantitative evaluation of the experimental results is encouraging, and the synthesized face shows realistic quality.

- Affective Facial Expression and Recognition | Pp. 1-11
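The GMM-based audio-visual mapping described above can be sketched as Gaussian mixture regression: fit a GMM on joint input-output vectors, then predict outputs via the mixture's conditional mean. The feature shapes, component count, and training data below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

def fit_joint_gmm(X, Y, n_components=4, seed=0):
    """Fit a GMM on joint [input, output] vectors, e.g. acoustic
    features paired with lip FAPs (hypothetical training data)."""
    Z = np.hstack([X, Y])
    return GaussianMixture(n_components=n_components,
                           covariance_type="full",
                           random_state=seed).fit(Z)

def gmm_regress(gmm, x, dx):
    """Predict E[y | x] under the joint GMM:
    sum_k w_k(x) * (mu_yk + S_yx,k S_xx,k^-1 (x - mu_xk))."""
    weights, cond_means = [], []
    for k in range(gmm.n_components):
        mu, S = gmm.means_[k], gmm.covariances_[k]
        mu_x, mu_y = mu[:dx], mu[dx:]
        S_xx, S_yx = S[:dx, :dx], S[dx:, :dx]
        # responsibility of component k given the observed input x
        weights.append(gmm.weights_[k]
                       * multivariate_normal.pdf(x, mu_x, S_xx))
        cond_means.append(mu_y + S_yx @ np.linalg.solve(S_xx, x - mu_x))
    w = np.array(weights)
    w /= w.sum()
    return sum(wk * ck for wk, ck in zip(w, cond_means))
```

The same conditional-mean machinery would serve both stages the abstract mentions (acoustic parameters to lip FAPs, neutral to expressive deformation), with different feature vectors plugged in.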

Model of Facial Expressions Management for an Embodied Conversational Agent

Radosław Niewiadomski; Catherine Pelachaud

In this paper we present a model of facial behaviour encompassing interpersonal relations for an Embodied Conversational Agent (ECA). Although previous solutions to this problem exist in the ECA domain, our approach is the first to use a variety of facial expressions (i.e. expressed, masked, inhibited, and fake expressions). Moreover, our rules of facial behaviour management are consistent with the predictions of politeness theory as well as with experimental data (i.e. annotation of a video corpus). Knowing the affective state of the agent and the type of relations between interlocutors, the system automatically adapts the facial behaviour of the agent to the social context. We also present an evaluation study of our model, in which we analysed the perception of interpersonal relations from the facial behaviour of our agent.

- Affective Facial Expression and Recognition | Pp. 12-23

Facial Expression Synthesis Using PAD Emotional Parameters for a Chinese Expressive Avatar

Shen Zhang; Zhiyong Wu; Helen M. Meng; Lianhong Cai

Facial expression plays an important role in face-to-face communication in that it conveys nonverbal information and emotional intent beyond speech. In this paper, an approach for facial expression synthesis with an expressive Chinese talking avatar is proposed, where a layered parametric framework is designed to synthesize intermediate facial expressions using PAD emotional parameters [5], which describe the human emotional state with three nearly orthogonal dimensions. The Partial Expression Parameter (PEP) is proposed to depict facial expression movements in specific face regions; PEPs act as mid-level expression parameters between the low-level Facial Animation Parameters (FAPs) [11] and the high-level PAD emotional parameters. A pseudo facial expression database is established by cloning real human expressions to the avatar, and the corresponding emotion state for each expression is annotated with a PAD score. An emotion-expression mapping model is trained on the database to map the emotion state (PAD) to the facial expression configuration (PEP). Perceptual evaluation shows that the input PAD values are consistent with human perception of the synthetic expressions, which supports the effectiveness of our approach.

- Affective Facial Expression and Recognition | Pp. 24-35
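The trained emotion-expression mapping from PAD states to PEP configurations could, in its simplest form, be a regression fit on annotated pairs. The sketch below uses a least-squares linear map on synthetic stand-in data; the dimensionalities and the linearity are assumptions for illustration, not the paper's actual model.

```python
import numpy as np

# Synthetic stand-in for the paper's annotated database: PAD states
# (pleasure, arousal, dominance in [-1, 1]) paired with partial
# expression parameters (PEPs). The "true" mapping here is invented.
rng = np.random.default_rng(0)
W_true = rng.normal(size=(3, 5))
PAD = rng.uniform(-1.0, 1.0, size=(200, 3))
PEP = PAD @ W_true + 0.01 * rng.normal(size=(200, 5))

# Train the emotion -> expression mapping by least squares
# (only an illustration; the paper's model may well be non-linear).
W, *_ = np.linalg.lstsq(PAD, PEP, rcond=None)

def pad_to_pep(pad):
    """Map a PAD emotional state to PEP expression parameters."""
    return np.asarray(pad, dtype=float) @ W
```

At synthesis time, the predicted PEP vector would then be expanded into low-level FAPs for the avatar's face regions.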

Reconstruction and Recognition of Occluded Facial Expressions Using PCA

Howard Towner; Mel Slater

We describe three methods for reconstructing incomplete facial expressions using principal component analysis: projection onto the model plane, single-component projection, and replacement by the conditional mean, with the facial expressions represented by feature points. We establish that one method gives better reconstruction accuracy than the others, and apply it to a systematic reconstruction problem: the reconstruction of occluded top and bottom halves of faces. The results indicate that occluded-top expressions can be reconstructed with little loss of expression recognition; occluded-bottom expressions are reconstructed less accurately but still give performance comparable to human rates of facial expression recognition.

- Affective Facial Expression and Recognition | Pp. 36-47
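Of the three reconstruction schemes named above, replacement by the conditional mean admits a compact sketch: treat the feature points as jointly Gaussian and fill occluded coordinates with their conditional expectation given the visible ones. Using the regularised sample covariance directly (rather than an explicit PCA low-rank model) is a simplifying assumption for illustration.

```python
import numpy as np

def conditional_mean_reconstruct(X_train, x_obs, obs_idx):
    """Reconstruct missing feature points by the Gaussian conditional
    mean (a sketch; the paper's exact formulation may differ).

    X_train: complete training expressions, shape (n_samples, n_features)
    x_obs:   observed values at the unoccluded indices
    obs_idx: indices of unoccluded features
    """
    n_feat = X_train.shape[1]
    mis_idx = np.setdiff1d(np.arange(n_feat), obs_idx)
    mu = X_train.mean(axis=0)
    # regularised sample covariance of the feature points
    S = np.cov(X_train, rowvar=False) + 1e-6 * np.eye(n_feat)
    S_oo = S[np.ix_(obs_idx, obs_idx)]
    S_mo = S[np.ix_(mis_idx, obs_idx)]
    # conditional mean: mu_m + S_mo S_oo^-1 (x_o - mu_o)
    x_mis = mu[mis_idx] + S_mo @ np.linalg.solve(S_oo, x_obs - mu[obs_idx])
    x_full = np.empty(n_feat)
    x_full[obs_idx] = x_obs
    x_full[mis_idx] = x_mis
    return x_full
```

For the paper's occlusion problem, `obs_idx` would cover the feature points of the visible half of the face and `mis_idx` the occluded half.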

Recognizing Affective Dimensions from Body Posture

Andrea Kleinsmith; Nadia Bianchi-Berthouze

The recognition of affective human communication may be used to provide developers with a rich source of information for creating systems that are capable of interacting well with humans. Posture has been acknowledged as an important modality of affective communication in many fields. Behavioral studies have shown that posture can communicate discrete emotion categories as well as affective dimensions. In the affective computing field, while models for the automatic recognition of discrete emotion categories from posture have been proposed, to our knowledge, there are no models for the automatic recognition of affective dimensions from static posture. As a continuation of our previous study, the two main goals of this study are: i) to build automatic recognition models to discriminate between levels of affective dimensions based on low-level postural features; and ii) to investigate both the discriminative power and the limitations of the postural features proposed. The models were built on the basis of human observers’ ratings of posture according to affective dimensions directly (instead of emotion category) in conjunction with our posture features.

- Affective Body Expression and Recognition | Pp. 48-58

Detecting Affect from Non-stylised Body Motions

Daniel Bernhardt; Peter Robinson

In this paper we present a novel framework for analysing non-stylised motion in order to detect implicitly communicated affect. Our approach uses a segmentation technique that divides complex motions into a set of automatically derived motion primitives. The parsed motion is then analysed in terms of dynamic features which are shown to encode affective information. To adapt our algorithm to personal movement idiosyncrasies, we developed a new approach for deriving unbiased motion features. We evaluated our approach using a comprehensive database of affectively performed motions. The results show that removing personal movement bias can significantly benefit automated affect recognition from body motion. The resulting recognition rate is similar to that of humans who took part in a comparable psychological experiment.

- Affective Body Expression and Recognition | Pp. 59-70
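The idea of removing personal movement bias can be approximated by normalising motion features within each subject, so that a classifier sees affect-related variation rather than individual movement style. Per-person z-scoring, shown below, is a generic stand-in, not the authors' exact derivation of unbiased features.

```python
import numpy as np

def remove_personal_bias(features, person_ids):
    """Z-score motion features within each subject (a simple proxy
    for the paper's unbiased motion features).

    features:   array of shape (n_samples, n_features)
    person_ids: per-sample subject identifiers
    """
    features = np.asarray(features, dtype=float)
    person_ids = np.asarray(person_ids)
    out = np.empty_like(features)
    for pid in np.unique(person_ids):
        mask = person_ids == pid
        mu = features[mask].mean(axis=0)
        sd = features[mask].std(axis=0) + 1e-12  # guard constant features
        out[mask] = (features[mask] - mu) / sd
    return out
```

After this step, each subject's features are centred on their own movement baseline, so what remains is relative variation across trials.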

Recognising Human Emotions from Body Movement and Gesture Dynamics

Ginevra Castellano; Santiago D. Villalba; Antonio Camurri

We present an approach for the recognition of acted emotional states based on the analysis of body movement and gesture expressivity. According to research showing that distinct emotions are often associated with different qualities of body movement, we use non-propositional movement qualities (e.g. amplitude, speed and fluidity of movement) to infer emotions, rather than trying to recognise different gesture shapes expressing specific emotions. We propose a method for the analysis of emotional behaviour based on both direct classification of time series and a model that provides indicators describing the dynamics of expressive motion cues. Finally, we show and interpret the recognition rates for both proposals using different classification algorithms.

- Affective Body Expression and Recognition | Pp. 71-82
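Movement qualities such as amplitude, speed and fluidity can be computed from a tracked trajectory. The cues below are simple illustrative stand-ins for the expressive motion cues the paper analyses; the trajectory format, frame rate, and the jerkiness-based fluidity proxy are all assumptions.

```python
import numpy as np

def motion_cues(traj, dt=1 / 30):
    """Compute simple expressivity cues from a 2-D point trajectory
    (e.g. a tracked hand), sampled at interval dt seconds.

    Returns amplitude (largest spatial extent), mean speed, and a
    fluidity proxy (high when acceleration changes are small)."""
    traj = np.asarray(traj, dtype=float)
    vel = np.diff(traj, axis=0) / dt          # frame-to-frame velocity
    speed = np.linalg.norm(vel, axis=1)       # scalar speed per frame
    acc = np.diff(speed) / dt                 # change of speed
    return {
        "amplitude": float(np.ptp(traj, axis=0).max()),
        "mean_speed": float(speed.mean()),
        # smooth motion -> low RMS acceleration -> fluidity near 1
        "fluidity": float(1.0 / (1.0 + np.sqrt((acc ** 2).mean()))),
    }
```

Sequences of such per-gesture cues could then feed either of the two pipelines the abstract contrasts: direct time-series classification, or classification of summary dynamics indicators.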

Person or Puppet? The Role of Stimulus Realism in Attributing Emotion to Static Body Postures

Marco Pasch; Ronald Poppe

Knowledge of the relation between body posture and the perception of affect is limited. Existing studies of emotion attribution to static body postures vary in method, response modalities and nature of the stimulus. Integration of such results proves difficult, and it remains to be investigated how the relation can best be researched. In this study we focus on the role of stimulus realism. We conducted an experiment in which computer-generated body postures in two realism conditions were shown to participants. Results indicate that higher realism does not always result in increased agreement, but clearly influences the outcome for distinct emotions.

- Affective Body Expression and Recognition | Pp. 83-94

Motion Capture and Emotion: Affect Detection in Whole Body Movement

Elizabeth Crane; Melissa Gross

Bodily expression of felt emotion was associated with emotion-specific changes in gait parameters and kinematics. Four emotions (anger, sadness, contentment, joy) and a no-emotion condition were elicited in forty-two undergraduates (22 female, 20 male; 20.1±2.7 yrs) while video and whole-body motion capture data (120 Hz) were acquired. Participants completed a self-report of felt emotion after each trial. To determine whether the walkers’ felt emotions were recognizable in their body movements, video clips of the walkers were shown to 60 undergraduates (29 female, 31 male; 20.9±2.7 yrs). After viewing each video clip, observers selected one of 10 emotions that they thought the walker experienced during the trial. This study provides evidence that emotions can be successfully elicited in the laboratory setting, that emotions can be recognized in the body movements of others, and that body movements are affected by felt emotions.

- Affective Body Expression and Recognition | Pp. 95-101

Does Body Movement Engage You More in Digital Game Play? and Why?

Nadia Bianchi-Berthouze; Whan Woong Kim; Darshak Patel

In past years, computer game designers have tried to increase player engagement by improving the believability of characters and environment. Today, the focus is shifting toward improving the game controller. This study seeks to understand engagement on the basis of the body movements of the player. Initial results from two case-studies suggest that an increase in body movement imposed, or allowed, by the game controller results in an increase in the player’s engagement level. Furthermore, they lead us to hypothesize that an increased involvement of the body can afford the player a stronger affective experience. We propose that the contribution of full-body experience is three-fold: (a) it facilitates the feeling of presence in the digital environment (fantasy); (b) it enables the affective aspects of human-human interaction (communication); and (c) it unleashes the regulatory properties of emotion (affect).

- Affective Body Expression and Recognition | Pp. 102-113