Publications catalog - books



Intelligent Virtual Agents: 7th International Conference, IVA 2007 Paris, France, September 17-19, 2007 Proceedings

Catherine Pelachaud ; Jean-Claude Martin ; Elisabeth André ; Gérard Chollet ; Kostas Karpouzis ; Danielle Pelé (eds.)

In conference: 7th International Workshop on Intelligent Virtual Agents (IVA). Paris, France. September 17-19, 2007

Abstract/Description - provided by the publisher

Not available.

Keywords - provided by the publisher

User Interfaces and Human Computer Interaction; Artificial Intelligence (incl. Robotics); Information Systems Applications (incl. Internet); Computers and Education

Availability
Detected institution    Year of publication    Browse    Download    Request
Not detected            2007                   SpringerLink

Information

Resource type:

books

Print ISBN

978-3-540-74996-7

Electronic ISBN

978-3-540-74997-4

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

Publication rights information

© Springer-Verlag Berlin Heidelberg 2007

Table of contents

Proactive Authoring for Interactive Drama: An Author’s Assistant

Mei Si; Stacy C. Marsella; David V. Pynadath

Interactive drama allows people to participate actively in a dynamically unfolding story, by playing a character or by exerting directorial control. One of the central challenges faced in the design of interactive dramas is how to ensure that the author's goals for the user's narrative experience are achieved in the face of the user's actions in the story. This challenge is especially significant when a variety of users are expected. To address this challenge, we present an extension to Thespian, an authoring and simulation framework for interactive dramas. Each virtual character is controlled by a decision-theoretic, goal-driven agent. In our previous work on Thespian, we provided a semi-automated authoring approach that allows authors to configure virtual characters' goals by specifying story paths. In this work, we extend Thespian into a more proactive authoring framework to further reduce authoring effort. The approach works by simulating potential users' behaviors, generating corresponding story paths, filtering the generated paths to identify those that seem problematic, and prompting the author to verify the virtual characters' behaviors in them. The author can correct virtual characters' behaviors by modifying story paths. As new story paths are designed by the author, the system incrementally adjusts the virtual characters' configurations to reflect the author's design ideas. Overall, this enables interactive testing and refinement of an interactive drama. The details of this approach are presented in this paper, followed by preliminary results of applying it in authoring an interactive drama.

- Applications | Pp. 225-237
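The simulate-filter-verify authoring loop described in the abstract above can be sketched roughly as follows. This is a minimal illustration of the general idea only; the function names, data structures, and filtering criterion are all hypothetical and do not reflect the actual Thespian API.

```python
# Hypothetical sketch of a proactive authoring loop: simulate user
# behaviors, filter the generated story paths for problematic ones,
# and prompt the author to verify or correct them.

def simulate_user_behaviors(user_models):
    """Generate candidate story paths by simulating each user model."""
    paths = []
    for user in user_models:
        # Placeholder: a real system would run the decision-theoretic
        # agents forward to produce a sequence of story events.
        paths.append([f"{user}:step{i}" for i in range(3)])
    return paths

def looks_problematic(path):
    """Filter heuristic flagging paths the author should verify."""
    # Placeholder criterion standing in for a real plausibility check.
    return any("step2" in step for step in path)

def authoring_loop(user_models, author_review):
    """One authoring round: simulate, filter, prompt, collect fixes."""
    corrections = []
    for path in simulate_user_behaviors(user_models):
        if looks_problematic(path):
            corrected = author_review(path)  # author verifies or edits
            if corrected != path:
                corrections.append(corrected)  # later refits agent goals
    return corrections
```

Each returned correction would then drive the incremental refitting of the characters' goal configurations mentioned in the abstract.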

The Effects of an Embodied Conversational Agent’s Nonverbal Behavior on User’s Evaluation and Behavioral Mimicry

Nicole C. Krämer; Nina Simons; Stefan Kopp

Given that recent studies on embodied conversational agents demonstrate the importance of their behavior, an experimental study is presented that assessed the effects of different nonverbal behaviors of an embodied conversational agent on users' experiences and evaluations as well as on their behavior. Fifty participants conducted a conversation with different versions of the virtual agent Max, whose nonverbal communication was manipulated with regard to eyebrow movements and self-touching gestures. In a 2×2 between-subjects design, each behavior was varied at two levels: presence versus absence of the behavior. Results show that self-touching gestures have positive effects on the experiences and evaluations of the user compared to no self-touching gestures, whereas eyebrow raising evoked less positive experiences and evaluations than no eyebrow raising. The nonverbal behavior of the participants was not affected by the agent's nonverbal behavior.

- Evaluation | Pp. 238-251

Spatial Social Behavior in Second Life

Doron Friedman; Anthony Steed; Mel Slater

We have developed software bots that inhabit the popular online social environment Second Life (SL). Our bots can wander around, collect data, engage in simple interactions, and carry out simple automated experiments. In this paper we use our bots to study spatial social behavior. We found an indication that SL users display distinct spatial behavior when interacting with other users. In addition, in an automated experiment carried out by our bot, we found that users whose avatars were approached by our bot tended to respond by moving their avatar, further indicating the significance of proxemics in SL.

- Evaluation | Pp. 252-263

Generating Embodied Descriptions Tailored to User Preferences

Mary Ellen Foster

We describe two user studies designed to measure the impact of using the characteristic displays of a speaker expressing different user-preference evaluations to select the head and eye behaviour of an animated talking head. In the first study, human judges were reliably able to identify positive and negative evaluations based only on the motions of the talking head. In the second study, subjects generally preferred positive displays to accompany positive sentences and negative displays to accompany negative ones, and showed a particular dislike for negative facial displays accompanying positive sentences.

- Evaluation | Pp. 264-271

Scrutinizing Natural Scenes: Controlling the Gaze of an Embodied Conversational Agent

Antoine Picot; Gérard Bailly; Frédéric Elisei; Stephan Raidt

We present a system for controlling the eye gaze of a virtual embodied conversational agent able to perceive the physical environment in which it interacts. The system is inspired by known components of the human visual attention system and reproduces its limitations in terms of visual acuity, sensitivity to movement, short-term memory, and object pursuit. The aim of this coupling between animation and visual scene analysis is to provide a sense of presence and mutual attention to human interlocutors. After a brief introduction to this research project and a focused state of the art, we detail the components of our system and compare simulation results with eye gaze data collected from viewers observing the same natural scenes.

- Gaze Models | Pp. 272-282
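One way to picture gaze control under the human-like limits listed in the abstract above (acuity falloff, sensitivity to movement, limited short-term memory) is a salience-based target selector. This sketch is purely illustrative; the scoring function, weights, and field names are assumptions, not the authors' model.

```python
# Hypothetical salience-based gaze-target selection with human-like limits:
# acuity falls off with eccentricity, motion attracts attention, and a small
# short-term memory inhibits recently fixated targets.

def acuity(eccentricity_deg):
    """Acuity decreases with angular distance from the current gaze."""
    return 1.0 / (1.0 + eccentricity_deg)

def salience(obj, gaze_deg, recently_seen):
    """Score an object: nearby and moving objects score higher."""
    score = acuity(abs(obj["angle"] - gaze_deg)) + obj["motion"]
    if obj["name"] in recently_seen:
        score *= 0.3  # inhibition of return via short-term memory
    return score

def pick_target(objects, gaze_deg, recently_seen):
    """Choose the next fixation target as the most salient object."""
    return max(objects, key=lambda o: salience(o, gaze_deg, recently_seen))
```

In such a scheme, a moving object off to the side can win over a static object at the center of gaze, while inhibition of return keeps the agent from fixating the same target repeatedly.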

Attentive Presentation Agents

Tobias Eichner; Helmut Prendinger; Elisabeth André; Mitsuru Ishizuka

The paper describes an infotainment application in which life-like characters present two MP3 players in a virtual showroom. The key feature of the system is that the presenter agents analyze the user's gaze behavior in real time and may thus adapt the presentation flow accordingly. In particular, a user's (non-)interest in interface objects, as well as preferences in decision situations, is estimated automatically using eye gaze as the only input modality. A formal study was conducted that compared two versions of the application. Results indicate that attentive presentation agents support successful grounding of deictic agent gestures and natural gaze behavior.

- Gaze Models | Pp. 283-295

The Rickel Gaze Model: A Window on the Mind of a Virtual Human

Jina Lee; Stacy Marsella; David Traum; Jonathan Gratch; Brent Lance

Gaze plays a large number of cognitive, communicative and affective roles in face-to-face human interaction. To build a believable virtual human, it is imperative to construct a gaze model that generates realistic gaze behaviors. However, it is not enough to merely imitate a person's eye movements. The gaze behaviors should reflect the internal states of the virtual human, and users should be able to infer those states by observing the behaviors. In this paper, we present a gaze model driven by cognitive operations: the model processes the virtual human's reasoning, dialog management, and goals to generate behaviors that reflect the agent's inner thoughts. It has been implemented in our virtual human system and operates in real time. The gaze model introduced in this paper was originally designed and developed by Jeff Rickel but has since been extended by the authors.

- Gaze Models | Pp. 296-303

Embodied Creative Agents: A Preliminary Social-Cognitive Framework

Stéphanie Buisine; Améziane Aoussat; Jean-Claude Martin

The goal of this paper is to open discussion about industrial creativity as a potential application field for Embodied Conversational Agents. We introduce the domain of creativity and focus especially on a collective creativity tool, brainstorming: we present the related research in Psychology, which has identified several key cognitive and social mechanisms that influence the brainstorming process and its outcome. However, some dimensions remain unexplored, such as the influence of the partners' personality or the facilitator's personality on idea generation. We propose to explore these issues, among others, using Embodied Conversational Agents. The idea seems original given that Embodied Agents have never been included in brainstorming computer tools. We draw up some hypotheses and a research program, and conclude on the potential benefits for knowledge of the creativity process on the one hand, and for the field of Embodied Conversational Agents on the other.

- Emotions | Pp. 304-316

Feel the Difference: A Guide with Attitude!

Mei Yii Lim; Ruth Aylett

This paper describes a mobile, context-aware 'intelligent affective guide with attitude' that guides visitors touring an outdoor attraction. Its behaviour is regulated by a biologically inspired architecture of emotion, allowing it to adapt to the user's needs and feelings. In addition to giving an illusion of life, the guide emulates a real guide's behaviour by presenting stories based on the user's interests, its own interests, its beliefs and its current memory activation. A brief description of the system, focusing on its core element, the guide's emotional architecture, is given, followed by findings from an evaluation with real users.

- Emotions | Pp. 317-330

It’s All in the Anticipation

Carlos Martinho; Ana Paiva

Since the beginnings of character animation, anticipation has been an effective part of the repertoire of tricks used to create believable animated characters. However, anticipation has had only a secondary role in the creation of synthetic virtual life forms. In this paper, we describe how a simple anticipatory mechanism that generates an affective signal from the mismatch between sensed and predicted values can help in the creation of consistent, believable behaviour for intelligent virtual characters.

- Emotions | Pp. 331-338
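The core idea in the abstract above, an affective signal derived from the mismatch between sensed and predicted values, can be sketched with a simple predictor. This is only an illustration under assumed names; the choice of exponential smoothing as the predictor is mine, not the paper's actual mechanism.

```python
# Hypothetical sketch of an anticipatory affective signal: the prediction
# error between a sensed value and an exponentially smoothed prediction.

class Anticipator:
    def __init__(self, alpha=0.5):
        self.alpha = alpha      # smoothing factor for the predictor
        self.predicted = None   # last predicted sensor value

    def step(self, sensed):
        """Return the affective signal (mismatch), then update the prediction."""
        if self.predicted is None:
            self.predicted = sensed       # first observation: no surprise
        signal = sensed - self.predicted  # mismatch drives the affective signal
        self.predicted += self.alpha * signal  # move prediction toward sensed
        return signal
```

A steady input yields no signal, while a sudden change produces a large mismatch that decays as the prediction catches up, which matches the intuition of anticipation shaping affect.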