Publications catalogue - books

Universal Access in Human-Computer Interaction. Ambient Interaction: 4th International Conference on Universal Access in Human-Computer Interaction, UAHCI 2007, Held as Part of HCI International 2007, Beijing, China, July 22-27, 2007, Proceedings, Part II

Constantine Stephanidis (ed.)

Conference: 4th International Conference on Universal Access in Human-Computer Interaction (UAHCI). Beijing, China. July 22-27, 2007

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

User Interfaces and Human Computer Interaction; Multimedia Information Systems; Information Storage and Retrieval; Computer Communication Networks; Software Engineering; Logics and Meanings of Programs

Availability
Institution detected | Year of publication | Browse | Download | Request
Not detected | 2007 | SpringerLink

Information

Resource type:

books

Print ISBN

978-3-540-73280-8

Electronic ISBN

978-3-540-73281-5

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

2007

Publication rights information

© Springer-Verlag Berlin Heidelberg 2007

Table of contents

Multimodal Augmented Reality in Medicine

Matthias Harders; Gerald Bianchi; Benjamin Knoerlein

The driving force of our current research is the development of medical training systems using augmented reality techniques. To provide multimodal feedback for the simulation, haptic interfaces are integrated into the framework. In this setting, high accuracy and stability are prerequisites. Misalignment of overlaid virtual objects would greatly compromise manipulative fidelity and the sense of presence, and thus reduce the overall training effect. Therefore, our work targets the precise integration of haptic devices into the augmented environment and the stabilization of the tracking process. This includes a distributed system architecture able to handle multiple users in a collaborative augmented world. In this paper we provide an overview of related work in medical augmented reality and introduce the system we have developed.

- Part III: Virtual and Augmented Environments | Pp. 652-658
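
The abstract stresses precise registration and a stabilized tracking process but does not spell out the method. Purely as a hedged illustration of the stabilization idea, the sketch below applies exponential smoothing to a stream of tracked 3D positions to damp frame-to-frame jitter before an overlay is rendered; the class and parameter names are invented, not the authors'.

```python
# Hedged illustration only: exponential smoothing of tracked positions to damp
# jitter in an AR overlay. The abstract does not specify the authors' method.
import numpy as np

class PoseSmoother:
    def __init__(self, alpha: float = 0.3):
        # alpha in (0, 1]: higher tracks faster but passes through more jitter.
        self.alpha = alpha
        self._state = None  # last smoothed position

    def update(self, raw_position: np.ndarray) -> np.ndarray:
        """Blend the new raw measurement with the previous smoothed estimate."""
        if self._state is None:
            self._state = raw_position.astype(float)
        else:
            self._state = self.alpha * raw_position + (1 - self.alpha) * self._state
        return self._state

# Noisy measurements of a static point settle near its true location.
rng = np.random.default_rng(0)
smoother = PoseSmoother(alpha=0.2)
for _ in range(50):
    noisy = np.array([1.0, 2.0, 0.5]) + rng.normal(scale=0.01, size=3)
    estimate = smoother.update(noisy)
print(estimate)  # close to [1.0, 2.0, 0.5], with reduced frame-to-frame jitter
```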

New HCI Based on a Collaborative 3D Virtual Desktop for Surgical Planning and Decision Making

Pascal Le Mer; Dominique Pavy

Today, the diagnosis of cancer and the choice of therapy involve strongly structured meetings among specialized practitioners. These complex, non-standardized meetings generally take place at a single location and require substantial preparation time. In this context, we assume that efficient collaborative tools could help reduce decision time and improve the reliability of the chosen treatments. The European project Odysseus investigates how to design a Collaborative Decision Support System (CDSS) for surgical planning. We present here an activity analysis and the first outcomes of a participatory design method involving end users. In particular, a new Graphical User Interface (GUI) concept is proposed, which makes use of Virtual Reality technologies to overcome issues encountered with common collaborative tools.

- Part III: Virtual and Augmented Environments | Pp. 659-665

VR, HF and Rule-Based Technologies Applied and Combined for Improving Industrial Safety

Konstantinos Loupos; Luca Vezzadini; Wytze Hoekstra; Waleed Salem; Paul Chung; Matthaios Bimpas

Industrial safety is a major concern in industrial environments today, and industries currently spend large amounts of resources to improve safety at all levels by reducing the risks of equipment damage, human injury, and even fatalities. This paper describes how Virtual Reality, Human Factors, and rule-based technologies are used and combined within the VIRTHUALIS Integrated Project for industrial training, safety management, and accident investigation. The paper focuses mainly on the VR system specification and its basic modules, while also presenting the main system modules that make up the tool as a whole.

- Part III: Virtual and Augmented Environments | Pp. 676-680

Adaptive Virtual Reality Games for Rehabilitation of Motor Disorders

Minhua Ma; Michael McNeill; Darryl Charles; Suzanne McDonough; Jacqui Crosbie; Louise Oliver; Clare McGoldrick

This paper describes the development of a Virtual Reality (VR) based therapeutic training system aimed at encouraging stroke patients with upper limb motor disorders to practice physical exercises. The system contains a series of physically-based VR games. Physically-based simulation provides realistic motion of virtual objects by modelling their behaviour and their responses to external force and torque according to the laws of physics. We present opportunities for applying physics simulation techniques in VR therapy and discuss their potential therapeutic benefits for motor rehabilitation. A framework for physically-based VR rehabilitation systems is described, consisting of functional tasks and game scenarios designed to encourage patients' physical activity in highly motivating, physics-enriched virtual environments where factors such as gravity can be scaled to adapt to individual patients' abilities and in-game performance.

- Part III: Virtual and Augmented Environments | Pp. 681-690
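
The adaptive mechanism named in the abstract is the scaling of factors such as gravity to a patient's abilities. As a minimal sketch under assumed parameters (the paper's actual physics engine is not described in this entry), the semi-implicit Euler step below shows how a per-patient gravity_scale changes the effort needed to lift a virtual object:

```python
# Minimal sketch of physically-based motion with a per-patient gravity scale,
# as suggested by the abstract; the system's actual physics engine is not
# described in this catalogue entry, and all parameter values are assumptions.
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])  # m/s^2

def step(position, velocity, applied_force, mass=1.0, gravity_scale=1.0, dt=1/60):
    """One semi-implicit Euler step: a = F/m plus scaled gravity.

    gravity_scale < 1 makes objects 'lighter', easing the task for patients
    with limited upper-limb strength; in-game performance could drive it.
    """
    acceleration = applied_force / mass + gravity_scale * GRAVITY
    velocity = velocity + acceleration * dt
    position = position + velocity * dt
    return position, velocity

# The same weak upward force fails under full gravity but lifts at half gravity.
for scale in (1.0, 0.5):
    pos, vel = np.zeros(3), np.zeros(3)
    for _ in range(60):  # one simulated second at 60 Hz
        pos, vel = step(pos, vel, applied_force=np.array([0.0, 7.0, 0.0]),
                        gravity_scale=scale)
    print(f"gravity_scale={scale}: height after 1 s = {pos[1]:.2f} m")
```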

Controlling an Anamorphic Projected Image for Off-Axis Viewing

Jiyoung Park; Myoung-Hee Kim

We modify a projected image so as to compensate for changes in the viewer's location. We use the concept of a virtual camera in the viewing space to achieve a transformable display with improved visibility. The 3D space and virtual camera are initialized, and the image is then translated, rotated, scaled, and projected. The user can modify the position and size of the image freely within the allowable projection area, and can also change its orientation as seen from their viewpoint, which can be off the axis of projection.

- Part III: Virtual and Augmented Environments | Pp. 691-698
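
The abstract describes pre-transforming the image (translate, rotate, scale, project) so it looks correct from an off-axis viewpoint. One common way to realize such a pre-warp, offered here only as a hedged sketch and not as the authors' exact virtual-camera computation, is a planar homography fitted to four corner correspondences; the corner coordinates below are invented:

```python
# Hedged sketch: estimate the homography that carries the source image corners
# to the (hypothetical) projector-space quad from which the picture appears
# rectangular at the viewer's off-axis position. Corner coordinates are made up.
import numpy as np

def homography(src, dst):
    """3x3 homography mapping four src points to four dst points (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, p):
    """Apply H to a 2D point via homogeneous coordinates."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

src = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]        # image corners
dst = [(100, 50), (1800, 120), (1750, 1000), (60, 1040)]  # assumed target quad
H = homography(src, dst)
print(warp_point(H, (960, 540)))  # where the image centre must be drawn
```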

An Anthropomorphic AR-Based Personal Information Manager and Guide

Andreas Schmeil; Wolfgang Broll

The use of personal electronic equipment has increased significantly in recent years. Augmented Reality (AR) technology enables mobile devices to provide a very rich user experience by combining mobile computing with connectivity and location awareness. In this paper we discuss the approach and development of an Augmented Reality-based personal assistant that combines the familiar interface of a human person with the functionality of a location-aware digital information system. The paper discusses the main components of the system, including the anthropomorphic user interface, as well as the results of an initial prototype evaluation.

- Part III: Virtual and Augmented Environments | Pp. 699-708

Merging of Next Generation VR and Ambient Intelligence – From Retrospective to Prospective User Interfaces

Oliver Stefani; Ralph Mager; Evangelos Bekiaris; Maria Gemou; Alex Bullinger

In this paper we present current and future approaches to merging intelligent interfaces with immersive Virtual Environments (VEs). The aim of this paper is to substantiate the introductory presentation in the session "Facing Virtual Environments with innovative interaction techniques" at HCI 2007. Although VEs and multimodal interfaces have tried to make Human-Computer Interaction as natural as possible, they have shown serious usability problems. We describe concepts that support users' cognitive and perceptual capabilities, in which the Virtual Environment adapts dynamically and in real time to the user's physiological constitution, previous behaviour, and desires. With our concept, human performance can be significantly enhanced by adapting interfaces and environments to the user's mental condition and information management capacity, avoiding health and usability problems caused by stress, workload, and fatigue. We intend to encourage discussion of this topic among the experts gathered in this session.

- Part III: Virtual and Augmented Environments | Pp. 709-714

Steady-State VEPs in CAVE for Walking Around the Virtual World

Hideaki Touyama; Michitaka Hirose

Steady-state visual evoked potentials induced by a virtual panorama and two virtual objects were recorded from two subjects in an immersive virtual environment. Linear discriminant analysis of 1.0-second single-trial EEG segments yielded an average recognition rate of 74.2% in inferring three gaze directions. The possibility of online interaction with 3D images in the CAVE will be addressed for walking applications or the remote control of a robotic camera.

- Part III: Virtual and Augmented Environments | Pp. 715-717
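
The abstract explicitly names linear discriminant analysis on 1.0-second single-trial EEG segments for a three-class gaze inference. Below is a minimal, self-contained sketch of that classification step using scikit-learn on synthetic features; the authors' electrode montage and preprocessing are not given in this entry and are not reproduced:

```python
# Minimal sketch: three-class linear discriminant analysis, as named in the
# abstract, on synthetic SSVEP-like features. Real use would replace the fake
# features with, e.g., spectral power at each stimulus flicker frequency.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_per_class, n_features = 60, 8  # trials per gaze target, features per trial

# Synthetic data: each gaze target boosts a different feature dimension.
X, y = [], []
for label in range(3):
    trials = rng.normal(size=(n_per_class, n_features))
    trials[:, label] += 1.5  # class-specific SSVEP power bump
    X.append(trials)
    y.extend([label] * n_per_class)
X, y = np.vstack(X), np.array(y)

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.1%}")  # well above 33% chance
```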

An Eye-Gaze Input System Using Information on Eye Movement History

Kiyohiko Abe; Shoichi Ohi; Minoru Ohyama

We have developed an eye-gaze input system for people with severe physical disabilities such as amyotrophic lateral sclerosis. The system uses a personal computer and a home video camera to detect eye gaze under natural light. It also compensates for measurement errors caused by head movements; in other words, it can detect eye gaze with a high degree of accuracy. We have also developed a new gaze selection method based on a user's eye movement history. Using this method, users can rapidly enter text by eye gaze.

- Part IV: Interaction Techniques and Devices | Pp. 721-729
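
The abstract attributes fast text entry to a gaze selection method based on the user's eye movement history. As a simplified stand-in for that idea, not the authors' actual criterion, the sketch below selects a target only once the recent gaze history has dwelt on it for a fixed number of frames:

```python
# Simplified stand-in for history-based gaze selection: a target fires only
# after the recent gaze history has stayed on it for a dwell period. The
# authors' actual criterion is not given in this entry.
from collections import deque
from typing import Optional

class DwellSelector:
    def __init__(self, dwell_frames: int = 30):  # e.g., 0.5 s at 60 Hz
        self.history = deque(maxlen=dwell_frames)

    def update(self, gazed_target: Optional[str]) -> Optional[str]:
        """Feed one frame's gaze target; return a target once dwell completes."""
        self.history.append(gazed_target)
        if (len(self.history) == self.history.maxlen
                and gazed_target is not None
                and all(t == gazed_target for t in self.history)):
            self.history.clear()  # avoid retriggering on the next frame
            return gazed_target
        return None

selector = DwellSelector(dwell_frames=3)
for frame_target in ["A", "A", "B", "B", "B", "B"]:
    chosen = selector.update(frame_target)
    if chosen:
        print(f"selected: {chosen}")  # fires once "B" is held for 3 frames
```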

Nonverbally Smart User Interfaces: Postural and Facial Expression Data in Human Computer Interaction

G. Susanne Bahr; Carey Balaban; Mariofanna Milanova; Howard Choe

We suggest that User Interfaces (UIs) can be designed to serve as cognitive tools based on a model of nonverbal human interaction. Smart User Interfaces (SUIs) have the potential to support the human user when and where appropriate and thus indirectly facilitate higher mental processes without the need for end-user programming or external actuation. Moreover, graphical, nonverbally sensitive SUIs are expected to be less likely to interfere with ongoing activity and disrupt the user. We present two non-invasive methods for assessing postural and facial-expression components and propose a contextual analysis to guide SUI actuation and supportive action. The approach is illustrated in a possible redesign of the Microsoft helper agent “Clippit”®.

- Part IV: Interaction Techniques and Devices | Pp. 740-749
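
The proposed contextual analysis decides when a nonverbally smart interface should act on postural and facial-expression signals rather than interrupt blindly. The toy rule below illustrates that gating logic with invented signal names and thresholds; the paper's actual model is not reproduced here:

```python
# Toy gating rule from nonverbal signals: intervene only when the estimated
# state suggests the user is stuck, not merely concentrating. Signal names
# and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class NonverbalState:
    leaning_forward: float  # 0..1, postural engagement estimate
    brow_furrow: float      # 0..1, facial-expression tension estimate
    idle_seconds: float     # time since the user's last input

def should_offer_help(state: NonverbalState) -> bool:
    engaged = state.leaning_forward > 0.6 and state.idle_seconds < 5
    if engaged:
        return False  # do not disrupt ongoing activity
    # Tension plus prolonged inactivity suggests the user may be stuck.
    return state.brow_furrow > 0.7 and state.idle_seconds > 20

print(should_offer_help(NonverbalState(0.8, 0.9, 2)))   # False: engaged
print(should_offer_help(NonverbalState(0.2, 0.8, 30)))  # True: likely stuck
```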