Publications catalog - books



Affective Computing and Intelligent Interaction: 1st International Conference, ACII 2005, Beijing, China, October 22-24, 2005, Proceedings

Jianhua Tao; Tieniu Tan; Rosalind W. Picard (eds.)

In conference: 1st International Conference on Affective Computing and Intelligent Interaction (ACII). Beijing, China. October 22-24, 2005

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Not available.

Availability

Institution detected | Year of publication | Browse | Download | Request
Not detected | 2005 | SpringerLink | |

Information

Resource type:

books

Print ISBN

978-3-540-29621-8

Electronic ISBN

978-3-540-32273-3

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

Publication rights information

© Springer-Verlag Berlin Heidelberg 2005

Table of contents

Real-Life Emotion Representation and Detection in Call Centers Data

Laurence Vidrascu; Laurence Devillers

Since the early studies of human behavior, emotions have attracted the interest of researchers in Neuroscience and Psychology. More recently, emotion has become a growing field of research in computer science. We explore how to represent and automatically detect a subject's emotional state. In contrast with most previous studies conducted on artificial data, this paper addresses some of the challenges faced when studying real-life, non-basic emotions. Real-life spoken dialogs from call-center services have revealed the presence of many blended emotions. A soft emotion vector is used to represent emotion mixtures. This representation makes it possible to obtain much more reliable annotation and to select, for model training, the part of the corpus free of conflicting blended emotions. A correct detection rate of about 80% is obtained between Negative and Neutral emotions, and between Fear and Neutral emotions, using paralinguistic cues on a corpus of 20 hours of recordings.

- Affective Interaction and Systems and Applications | Pp. 739-746
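The soft emotion vector mentioned in the abstract above can be sketched as follows. This is a minimal illustration under assumed label names (`fear`, `neutral`, etc.) and an invented rule for flagging conflicting blends; it is not the authors' implementation.

```python
from collections import Counter

# Hypothetical valence groups; the paper's actual label set differs.
NEGATIVE = {"anger", "fear", "sadness"}
NEUTRAL = {"neutral"}

def soft_emotion_vector(annotations):
    """Turn per-annotator labels into a soft vector (label -> proportion)."""
    counts = Counter(annotations)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

def is_conflictual(vector):
    """Call a mixture 'conflictual' if its mass spans opposing valence groups."""
    has_neg = any(label in NEGATIVE for label in vector)
    has_neu = any(label in NEUTRAL for label in vector)
    return has_neg and has_neu and max(vector.values()) < 1.0

# Keep only segments whose annotations agree enough for model training.
segments = [["fear", "fear", "neutral"], ["fear", "fear", "fear"]]
vectors = [soft_emotion_vector(s) for s in segments]
train = [v for v in vectors if not is_conflictual(v)]
```

The filtering step mirrors the abstract's idea of training only on the part of the corpus without conflicting blended emotions.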

Affective Touch for Robotic Companions

Walter Dan Stiehl; Cynthia Breazeal

As robotic platforms are designed for human robot interaction applications, a full body sense of touch, or “sensitive skin,” becomes important. The Huggable is a new type of therapeutic robotic companion based upon relational touch interactions. The initial use of neural networks to classify the affective content of touch is described.

- Affective Interaction and Systems and Applications | Pp. 747-754
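The classification step described for the Huggable might look schematically like the following. The paper uses neural networks; this sketch substitutes a nearest-centroid classifier, and the sensor features and touch labels are invented for illustration.

```python
import math

# Invented 3-feature vectors: (mean pressure, contact area, duration in s).
TRAINING = {
    "tickle": [(0.2, 0.1, 0.5), (0.3, 0.15, 0.6)],
    "pat":    [(0.6, 0.4, 0.3), (0.7, 0.45, 0.25)],
    "hug":    [(0.9, 0.9, 2.0), (0.85, 0.95, 2.5)],
}

def centroid(points):
    """Component-wise mean of a set of feature vectors."""
    return tuple(sum(xs) / len(xs) for xs in zip(*points))

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify_touch(features):
    """Assign the affective touch class with the nearest centroid."""
    return min(CENTROIDS, key=lambda lbl: math.dist(CENTROIDS[lbl], features))
```

A neural network, as in the paper, would learn a nonlinear decision boundary over such features instead of comparing to per-class means.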

Dynamic Mapping Method Based Speech Driven Face Animation System

Panrong Yin; Jianhua Tao

In this paper, we design and develop a speech-driven face animation system based on a dynamic mapping method. The face animation is synthesized by unit concatenation and is synchronized with the real speech. Units are selected according to cost functions based on the voice spectrum distance between training and target units. The visual distance between two adjacent training units is also used to obtain better mapping results. Finally, the Viterbi method is used to find the best face animation sequence. The experimental results show that the synthesized lip movement has a good, natural quality.

- Affective Interaction and Systems and Applications | Pp. 755-763
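The Viterbi-based unit selection described in the abstract above can be sketched generically as follows. The cost functions are passed in as parameters and stand for the abstract's spectral distance (target cost) and visual distance between adjacent units (concatenation cost); the data layout is an assumption, not the authors' code.

```python
def viterbi_unit_selection(target_feats, candidates, target_cost, concat_cost):
    """Pick one training unit per target frame, minimizing target (spectral)
    cost plus concatenation (visual) cost, via Viterbi dynamic programming.

    candidates[i] is the list of candidate units for frame i.
    """
    n = len(target_feats)
    # best[i][u] = (cost of best path ending in unit u at frame i, backpointer)
    best = [{u: (target_cost(target_feats[0], u), None) for u in candidates[0]}]
    for i in range(1, n):
        layer = {}
        for u in candidates[i]:
            tc = target_cost(target_feats[i], u)
            prev, prev_cost = min(
                ((p, best[i - 1][p][0] + concat_cost(p, u))
                 for p in candidates[i - 1]),
                key=lambda x: x[1])
            layer[u] = (prev_cost + tc, prev)
        best.append(layer)
    # Trace back the lowest-cost path.
    u = min(best[-1], key=lambda x: best[-1][x][0])
    path = [u]
    for i in range(n - 1, 0, -1):
        u = best[i][u][1]
        path.append(u)
    return list(reversed(path))
```

In the paper's setting, units would be audiovisual snippets; here any hashable unit identifier works, with the two cost callbacks supplying the distances.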

Affective Intelligence: A Novel User Interface Paradigm

Barnabas Takacs

This paper describes an advanced human-computer interface that combines real-time, reactive, high-fidelity virtual humans with artificial vision and communicative intelligence to create a closed-loop interaction model and achieve an affective interface. The system, called the Virtual Human Interface (VHI), uses a photo-real facial and body model as a virtual agent to convey information beyond speech and actions. Specifically, the VHI uses a dictionary of nonverbal signals, including body language, hand gestures and subtle emotional displays, to support verbal content in a reactive manner. Furthermore, its built-in facial tracking and artificial vision system allows the virtual human to maintain eye contact, follow the motion of the user, and even recognize when somebody joins the user in front of the terminal and act accordingly. Additional sensors allow the virtual agent to react to touch, voice and other modalities of interaction. The system has been tested in a real-world scenario in which a virtual child reacted to visitors in an exhibition space.

- Affective Interaction and Systems and Applications | Pp. 764-771

Affective Guide with Attitude

Mei Yii Lim; Ruth Aylett; Christian Martyn Jones

The Affective Guide System is a mobile context-aware and spatial-aware system offering the user an affective multimodal interaction interface. The system takes advantage of current mobile and wireless technologies. It includes an 'affective guide with attitude' that links its memories and the visitor's interests to spatial locations, so that stories are relevant to what can be immediately seen. This paper presents a review of related work, the system in detail, challenges, and the future work to be carried out.

- Affective Interaction and Systems and Applications | Pp. 772-779

Human Vibration Environment of Wearable Computer

Zhiqi Huang; Dongyi Chen; Shiji Xiahou

Wearable computers have very broad application prospects; to put them into practice, one of the key technologies to be solved is improving their anti-vibration capability. The new human-computer interaction mode that the wearable computer offers means that the human body is its working environment. Our research therefore begins by studying the vibration environment in which a wearable computer works. The vibration a wearable computer receives can be divided into two kinds: first, external vibration transmitted through the human body; second, vibration caused by human movement. In this paper, two environments in which wearable computers often work have been studied, and we find that the vibration caused by human movement is more intense than the external vibration transmitted through the human body.

- Affective Interaction and Systems and Applications | Pp. 788-794

An Online Multi-stroke Sketch Recognition Method Integrated with Stroke Segmentation

Jianfeng Yin; Zhengxing Sun

In this paper a novel multi-stroke sketch recognition method is presented. The method integrates stroke segmentation and sketch recognition into a single approach, in which both are unified as a problem of "fitting to a template" with minimal fitting error, and a nested Dynamic Programming algorithm is designed to accelerate the optimization.

- Affective Interaction and Systems and Applications | Pp. 803-810
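The "fitting to a template with minimal fitting error" idea above can be sketched with a small dynamic program that splits a stroke into k segments. The chord-distance fitting error and the exact DP layout are crude stand-ins, assumed for illustration, not the paper's algorithm.

```python
def fit_error(points, i, j):
    """Distance of the interior of points[i..j] to the chord from points[i]
    to points[j] (a crude stand-in for the paper's fitting error)."""
    (x1, y1), (x2, y2) = points[i], points[j]
    dx, dy = x2 - x1, y2 - y1
    norm = (dx * dx + dy * dy) ** 0.5 or 1.0
    return sum(abs(dy * (x - x1) - dx * (y - y1)) / norm
               for x, y in points[i + 1:j])

def segment_stroke(points, k):
    """DP: split the stroke into k segments minimizing total fitting error.
    Returns (minimal error, breakpoint indices)."""
    n = len(points)
    INF = float("inf")
    cost = [[INF] * n for _ in range(k + 1)]
    back = [[0] * n for _ in range(k + 1)]
    cost[0][0] = 0.0
    for seg in range(1, k + 1):
        for j in range(1, n):
            for i in range(j):
                c = cost[seg - 1][i] + fit_error(points, i, j)
                if c < cost[seg][j]:
                    cost[seg][j] = c
                    back[seg][j] = i
    # Recover the breakpoints by walking the backpointers.
    cuts, j = [n - 1], n - 1
    for seg in range(k, 0, -1):
        j = back[seg][j]
        cuts.append(j)
    return cost[k][n - 1], list(reversed(cuts))
```

For an L-shaped stroke, two segments with a cut at the corner yield zero error; the nested structure (an inner fitting-error evaluation inside an outer segmentation DP) echoes the abstract's nested Dynamic Programming.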

A Model That Simulates the Interplay Between Emotion and Internal Need

Xiaoxiao Wang; Jiwei Liu; Zhiliang Wang

This paper proposes an emotion model that simulates the interplay between emotion and the internal needs of a human being, both of which influence human intelligence activities. The model includes emotion, internal-need and decision modules. The aim of the model is to explore the role of emotion-like processes in intelligent machines. Simulations are conducted to test the performance of the emotion model.

- Affective Interaction and Systems and Applications | Pp. 811-818
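The three-module structure described above (emotion, internal need, decision) can be caricatured in a few lines. Every variable and threshold here is hypothetical; the paper's actual modules are richer.

```python
class EmotionNeedModel:
    """Toy interplay of emotion and internal need driving decisions."""

    def __init__(self):
        self.energy = 1.0    # internal need: satiated at 1.0, depleted at 0.0
        self.valence = 0.0   # emotion: -1 (negative) .. +1 (positive)

    def step(self, stimulus):
        """Internal need decays over time; unmet need drags emotion negative,
        while a positive external stimulus pushes it positive."""
        self.energy = max(0.0, self.energy - 0.1)
        need_pressure = 1.0 - self.energy
        self.valence = max(-1.0, min(1.0, stimulus - 0.5 * need_pressure))
        return self.decide()

    def decide(self):
        """Decision module: act on the dominant drive."""
        if self.energy < 0.3:
            return "seek_resource"
        return "engage" if self.valence >= 0.0 else "withdraw"
```

The point of the sketch is the feedback loop: the need module biases the emotion module, and both feed the decision module, as the abstract describes.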

Affective Dialogue Communication System with Emotional Memories for Humanoid Robots

M. S. Ryoo; Yong-ho Seo; Hye-Won Jung; H. S. Yang

Memories are vital in human interactions. To interact sociably with a human, a robot should not only recognize and express emotions like a human, but also share emotional experience with humans. We present an affective human-robot communication system for a humanoid robot, AMI, which we designed to enable high-level communication with a human through dialogue. AMI communicates with humans by preserving emotional memories of users and topics, and it naturally engages in dialogue with humans. Humans therefore perceive AMI to be more human-like and friendly. Thus, interaction between AMI and humans is enhanced.

- Affective Interaction and Systems and Applications | Pp. 819-827
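The emotional memories of users and topics that AMI preserves could be modeled along these lines. The store layout, valence scale and attitude thresholds are all invented for illustration.

```python
class EmotionalMemory:
    """Hypothetical per-(user, topic) emotional memory for a dialogue robot."""

    def __init__(self):
        self.memories = {}   # (user, topic) -> running-average valence

    def record(self, user, topic, valence, weight=0.3):
        """Blend a new emotional experience into the stored memory."""
        key = (user, topic)
        old = self.memories.get(key, 0.0)
        self.memories[key] = (1 - weight) * old + weight * valence

    def attitude(self, user, topic):
        """Bias dialogue by the remembered feeling toward this user/topic."""
        v = self.memories.get((user, topic), 0.0)
        if v > 0.2:
            return "enthusiastic"
        if v < -0.2:
            return "cautious"
        return "neutral"
```

A dialogue manager could consult `attitude` when choosing responses, so that past emotional experience shapes how the robot engages each user, in the spirit of the abstract.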

Scenario-Based Interactive Intention Understanding in Pen-Based User Interfaces

Xiaochun Wang; Yanyan Qin; Feng Tian; Guozhong Dai

Interactive intention understanding is important for Pen-based User Interfaces (PUI). Many works on this topic have been reported, focusing on handwriting or sketch recognition algorithms at the lexical layer. However, these algorithms cannot fully solve the problem of intention understanding, nor can they give pen-based software high usability. Hence, a scenario-based interactive intention understanding framework is presented in this paper, which is used to simulate human cognitive mechanisms and cognitive habits. By providing an understanding environment supporting the framework, we can apply it to practical PUI systems. An evaluation of the Scientific Training Management System for the Chinese National Diving Team shows that the framework is effective in improving the usability and enhancing the intention-understanding capacity of that system.

- Affective Interaction and Systems and Applications | Pp. 828-835