Publications catalog - books



Perception and Interactive Technologies: International Tutorial and Research Workshop, PIT 2006, Kloster Irsee, Germany, June 19-21, 2006, Proceedings.

Elisabeth André; Laila Dybkjær; Wolfgang Minker; Heiko Neumann; Michael Weber (eds.)

Conference: International Tutorial and Research Workshop on Perception and Interactive Technologies for Speech-Based Systems (PIT), Kloster Irsee, Germany, June 19-21, 2006

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Artificial Intelligence (incl. Robotics); Image Processing and Computer Vision; User Interfaces and Human Computer Interaction

Availability

Detected institution: Not detected
Year of publication: 2006
Browse: SpringerLink

Information

Resource type:

books

Print ISBN

978-3-540-34743-9

Electronic ISBN

978-3-540-34744-6

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

Publication rights information

© Springer-Verlag Berlin Heidelberg 2006

Table of contents

Talking with Higgins: Research Challenges in a Spoken Dialogue System

Gabriel Skantze; Jens Edlund; Rolf Carlson

This paper presents the current status of the research in the Higgins project and provides background for a demonstration of the spoken dialogue system implemented within the project. The project represents the latest development in the ongoing dialogue systems research at KTH. The practical goal of the project is to build collaborative conversational dialogue systems in which research issues such as error handling techniques can be tested empirically.

- System Demonstrations | Pp. 193-196

Location-Based Interaction with Children for Edutainment

Matthias Rehm; Elisabeth André; Bettina Conradi; Stephan Hammer; Malte Iversen; Eva Lösch; Torsten Pajonk; Katharina Stamm

Our mixed-reality installation features two cooperating characters that integrate multiple users into the interaction via location-based tracking, allowing for dynamic storylines.

- System Demonstrations | Pp. 197-200

An Immersive Game – Augsburg Cityrun

Klaus Dorfmueller-Ulhaas; Dennis Erdmann; Oliver Gerl; Nicolas Schulz; Volker Wiendl; Elisabeth André

We present a platform for creating immersive 3D games, including a new interface for navigating through virtual scenes. One innovative part of this application is a crowd simulation with emergent behaviour of virtual characters. While the user has to move very quickly through a crowd of nearly two hundred virtual characters, they are supported by a precise, fast-operating, and unobtrusive navigation interface.

- System Demonstrations | Pp. 201-204

Gaze-Contingent Spatio-temporal Filtering in a Head-Mounted Display

Michael Dorr; Martin Böhme; Thomas Martinetz; Erhardt Barth

The spatio-temporal characteristics of the human visual system vary widely across the visual field. Recently, we have developed a display capable of simulating arbitrary visual fields on high-resolution natural videos in real time by means of gaze-contingent spatio-temporal filtering. While such a system can also be a useful tool for psychophysical research, our main motivation is to develop gaze-guidance techniques. Because the message an image sequence conveys depends on the exact pattern of eye movements an observer makes, we propose that in future information and communication systems, images will be augmented with a recommendation of where to look and how to view them. Ultimately, we want to incorporate gaze-guidance technology into mobile applications; such technology, integrated into a head-mounted display (HMD), could use computer vision techniques to enhance human visual performance.

In our demonstration, we will show a first implementation of such a device in the form of a system that implements our gaze-contingent spatio-temporal filtering algorithm in an HMD with video-see-through. Subjects will be able to walk around, seeing their natural visual environment inside the HMD. We will demonstrate that we then can manipulate what the subjects see in real time.
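To illustrate the general idea behind gaze-contingent filtering (this is a generic sketch, not the authors' implementation, and all names and parameters below are invented for illustration): the region around the gaze point is kept sharp while the periphery is blended toward a filtered copy, weighted by distance from the gaze position.

```python
import numpy as np

def box_blur(img, radius):
    """Separable box blur (edges wrap around; acceptable for a sketch)."""
    out = img.astype(float).copy()
    for axis in (0, 1):
        acc = np.zeros_like(out)
        for shift in range(-radius, radius + 1):
            acc += np.roll(out, shift, axis=axis)
        out = acc / (2 * radius + 1)
    return out

def gaze_contingent(img, gaze_yx, fovea_radius):
    """Keep the region around the gaze point sharp; blur the periphery."""
    blurred = box_blur(img, 3)
    ys, xs = np.indices(img.shape)
    dist = np.hypot(ys - gaze_yx[0], xs - gaze_yx[1])
    # blend weight: 0 inside the fovea, ramping to 1 (fully blurred) outside
    w = np.clip((dist - fovea_radius) / fovea_radius, 0.0, 1.0)
    return (1.0 - w) * img + w * blurred

img = np.arange(100, dtype=float).reshape(10, 10)  # toy grayscale frame
out = gaze_contingent(img, gaze_yx=(5, 5), fovea_radius=2)
```

A real gaze-contingent display would apply this per video frame, with the gaze point updated from the eye tracker, and with temporal as well as spatial filtering as the abstract describes.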

- System Demonstrations | Pp. 205-207

A Single-Camera Remote Eye Tracker

André Meyer; Martin Böhme; Thomas Martinetz; Erhardt Barth

Many eye-tracking systems either require the user to keep their head still or involve cameras or other equipment mounted on the user’s head. While acceptable for research applications, these limitations make the systems unsatisfactory for prolonged use in interactive applications. Since the goal of our work is to use eye trackers for improved visual communication through gaze guidance [1,2] and for Augmentative and Alternative Communication (AAC) [3], we are interested in less invasive eye tracking techniques.

- System Demonstrations | Pp. 208-211

Miniature 3D TOF Camera for Real-Time Imaging

Thierry Oggier; Felix Lustenberger; Nicolas Blanc

In the past, measuring the scene in all three dimensions has been either very expensive, slow, or extremely computationally intensive. The latest progress in the field of microtechnologies enables the breakthrough for time-of-flight (TOF) based distance-measuring devices. This paper describes the basic principle of TOF measurements and a first specific implementation in a state-of-the-art 3D camera, the "SwissRanger SR-3000" [T. Oggier et al., 2005]. Acquired image sequences will be presented as well.
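For reference, the basic TOF principle the abstract mentions reduces to distance = c·t/2 for a round trip of light; continuous-wave TOF cameras such as the SwissRanger recover the delay from the phase shift of a modulated light signal. A minimal numeric sketch (function names invented here, not from the paper):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_delay(t_round_trip_s):
    """Direct pulsed TOF: light covers the distance twice, so halve it."""
    return C * t_round_trip_s / 2.0

def distance_from_phase(phase_rad, mod_freq_hz):
    """Continuous-wave TOF: distance from the phase shift of a signal
    modulated at mod_freq_hz. Unambiguous only up to C / (2 * f), since
    the phase is measured modulo 2*pi."""
    return (C / (2.0 * mod_freq_hz)) * (phase_rad / (2.0 * math.pi))

# A 20 MHz modulation gives an unambiguous range of about 7.49 m;
# a measured phase shift of pi corresponds to half of that.
d = distance_from_phase(math.pi, 20e6)
```

This also shows why higher modulation frequencies improve precision but shrink the unambiguous measurement range.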

- System Demonstrations | Pp. 212-216