Publications catalogue - books



Artificial Mind System: Kernel Memory Approach

Tetsuya Hoya

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Not available.

Availability

Detected institution: Not detected
Year of publication: 2005
Browse: SpringerLink

Information

Resource type:

books

Print ISBN

978-3-540-26072-1

Electronic ISBN

978-3-540-32403-4

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

Publication rights information

© Springer-Verlag Berlin/Heidelberg 2005

Table of contents

Introduction

Tetsuya Hoya

“What is mind?” When asked such a question, you will probably be confused, because you do not know exactly how to answer, even though you frequently use the word “mind” in daily conversation to describe your conditions, experiences, feelings, mental states, and so on. Nevertheless, many have so far tackled the question of how science can handle the mind and its operation.

Pp. 1-8

From Classical Connectionist Models to Probabilistic/Generalised Regression Neural Networks (PNNs/GRNNs)

Tetsuya Hoya

This chapter begins by briefly summarising some well-known classical connectionist/artificial neural network models, such as multi-layered perceptron neural networks (MLP-NNs), radial basis function neural networks (RBF-NNs), self-organising feature maps (SOFMs), associative memory, and Hopfield-type recurrent neural networks (HRNNs). These models are shown to normally require iterative and/or complex parameter-approximation procedures, and it is highlighted why interest in such approaches for modelling psychological functions and developing artificial intelligence (in a more realistic sense) has generally waned. (An illustrative sketch of the non-iterative GRNN estimator follows this entry.)

Keywords: Associative Memory; Radial Basis Function Neural Network; Recurrent Neural Network; Probabilistic Neural Network; Deterioration Rate.

Pp. 9-29
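As a concrete illustration of the contrast drawn in the abstract above, the following is a minimal sketch of a GRNN-style estimator in the spirit of Specht's formulation: the stored training patterns themselves serve as Gaussian kernel centres, so prediction needs no iterative weight optimisation. The function name, the single shared bandwidth sigma, and the toy data are illustrative assumptions, not notation from the book.

    import numpy as np

    def grnn_predict(X_train, y_train, x, sigma=0.5):
        """GRNN-style estimate at x (Specht-type formulation).

        The stored training patterns act directly as Gaussian kernel
        centres, so no iterative weight optimisation is required;
        the bandwidth sigma is the only free parameter here.
        """
        # squared Euclidean distances from x to every stored pattern
        d2 = np.sum((X_train - x) ** 2, axis=1)
        # Gaussian kernel activations
        k = np.exp(-d2 / (2.0 * sigma ** 2))
        # kernel-weighted average of the stored target values
        return np.dot(k, y_train) / (np.sum(k) + 1e-12)

    # usage: "training" is just storing the patterns; prediction is one pass
    X = np.linspace(0, 2 * np.pi, 50).reshape(-1, 1)
    y = np.sin(X).ravel()
    print(grnn_predict(X, y, np.array([1.0])))   # roughly sin(1.0) ~ 0.84

The only quantity one might tune is the bandwidth sigma, which controls how broadly each stored pattern generalises.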

The Kernel Memory Concept – A Paradigm Shift from Conventional Connectionism

Tetsuya Hoya

This chapter describes the general concept of kernel memory (KM), which serves as the basis not only for representing the general notion of “memory” but also for modelling the psychological functions related to the artificial mind system developed in later chapters. (A schematic kernel-unit sketch follows this entry.)

Keywords: Kernel Function; Gaussian Kernel; Spike Train; MIMO System; Link Weight.

Pp. 31-58
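This record does not spell out the book's kernel memory formalism, so the following is only a schematic sketch under stated assumptions: each kernel unit stores a template pattern, responds through a Gaussian kernel, and passes its activation to other units through link weights. The class name, fields, and propagation rule are illustrative, not the author's definitions.

    import numpy as np

    class KernelUnit:
        """Schematic kernel-memory node (illustrative, not the book's
        exact formulation): a stored template pattern, a Gaussian kernel
        response, and weighted links to other units."""

        def __init__(self, template, sigma=1.0):
            self.template = np.asarray(template, dtype=float)
            self.sigma = sigma
            self.links = {}        # other KernelUnit -> link weight
            self.activation = 0.0

        def activate(self, x):
            """Gaussian kernel response to an input pattern x."""
            d2 = np.sum((np.asarray(x, dtype=float) - self.template) ** 2)
            self.activation = np.exp(-d2 / (2.0 * self.sigma ** 2))
            return self.activation

        def propagate(self):
            """Signal each linked unit receives: activation times link weight."""
            return {unit: w * self.activation for unit, w in self.links.items()}

    # usage: two units joined by a single weighted link
    a, b = KernelUnit([0.0, 0.0]), KernelUnit([1.0, 1.0])
    a.links[b] = 0.8
    a.activate([0.1, -0.1])
    print(a.propagate()[b])        # ~ 0.8 * (near-maximal kernel response)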

The Self-Organising Kernel Memory (SOKM)

Tetsuya Hoya

The previous chapter discussed various topological representations in terms of the kernel memory concept, together with some illustrative examples. In this chapter, a novel unsupervised algorithm for training the link weights between the KFs is given by extending Hebb’s original neuropsychological principle, whereby the self-organising kernel memory (SOKM) is proposed. (An illustrative construction/update sketch follows this entry.)

Keywords: Construction Phase; Link Weight; Topological Representation; Pattern Vector; Single Kernel.

Pp. 59-80
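The abstract above states only that the link weights between kernel functions are trained by an unsupervised extension of Hebb's principle and, per the keywords, that there is a construction phase. The sketch below is therefore an assumption-laden illustration, not the SOKM algorithm as published: a new kernel is inserted when no existing kernel responds above a novelty threshold, and links between co-active kernels are strengthened by a simple correlational (Hebbian) update; thresholds, rates, and the update form are placeholders.

    import numpy as np

    def gaussian(x, c, sigma):
        return np.exp(-np.sum((x - c) ** 2) / (2.0 * sigma ** 2))

    def sokm_step(centres, W, x, sigma=1.0, eta=0.1, theta=0.5):
        """One unsupervised step of a self-organising kernel memory sketch.

        centres : list of kernel centre vectors
        W       : dict {(i, j): link weight} between kernel units
        Construction phase: if no kernel responds above theta, the input
        itself is adopted as a new kernel centre.  Hebbian phase: kernels
        that are co-active strengthen their mutual link.  All rates and
        thresholds here are illustrative placeholders.
        """
        acts = np.array([gaussian(x, c, sigma) for c in centres]) if centres else np.array([])
        if acts.size == 0 or acts.max() < theta:
            centres.append(np.array(x, dtype=float))      # grow the memory
            return centres, W
        for i in range(len(centres)):                      # Hebbian-style update
            for j in range(i + 1, len(centres)):
                if acts[i] > theta and acts[j] > theta:
                    W[(i, j)] = W.get((i, j), 0.0) + eta * acts[i] * acts[j]
        return centres, W

    # usage on a small stream of patterns
    centres, W = [], {}
    for x in [[0.0, 0.0], [1.0, 1.0], [0.5, 0.5]]:
        centres, W = sokm_step(centres, W, np.array(x))
    print(len(centres), W)         # two kernels, one strengthened link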

The Artificial Mind System (AMS), Modules, and Their Interactions

Tetsuya Hoya

The previous two chapters were devoted to establishing the kernel memory concept, a novel artificial neural network concept, as the foundation of the artificial mind system (AMS).

Keywords: Mutual Interaction; Attention Module; Primary Output; Kernel Memory; Emotion Module.

Pp. 81-94

Sensation and Perception Modules

Tetsuya Hoya

In any creature, the mechanisms of sensation and perception are both indispensable for survival, e.g. for finding edible plants and fruits in the forest, or for protection against attack by approaching enemies. To fulfil these aims, two different kinds of information processing are considered to occur in the brain: 1) extraction of useful features from the flood of information arriving from the sensory organs, and 2) perception of the current surroundings based upon the features so detected in 1), in order to plan the next actions to be taken. The sensation mechanism is responsible for the former, whereas the latter is the role of the perception mechanism.

Keywords: Speech Signal; Noise Reduction; Sensation Module; Blind Signal; Memory Module.

Pp. 95-116

Learning in the AMS Context

Tetsuya Hoya

In this chapter, we dig further into the notion of “learning” within the AMS context. In conventional connectionist models, the term “learning” almost always refers merely to establishing input-output relations via parametric changes within the model, with the parameter tuning typically performed by some iterative algorithm given a finite (and mostly static) set of variables (i.e. both the training patterns and the target signals); a generic sketch of this setting follows this entry. However, this interpretation is rather microscopic and hence still quite distant from the general notion of learning, since it ends with nothing more than such parameter tuning and gives no clear notion of how to describe learning at a macroscopic level, e.g. how to explain the higher-order functions/phenomena occurring within the brain (see e.g. Roy, 2000).

Keywords: Feature Extraction; Target Response; Memory Module; Link Weight; Competitive Learning.

Pp. 117-133
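To make the “microscopic” notion of learning criticised above concrete, here is a generic example, not drawn from the book, of iterative parameter tuning: a fixed set of training patterns and target signals, and a parametric model whose weights are adjusted by gradient descent until the input-output relation is fitted.

    import numpy as np

    # A fixed, static training set: patterns X and target signals t.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    t = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

    # "Learning" in the narrow sense: iteratively tuning the parameters w
    # of a linear model by gradient descent on the mean squared error,
    # until the desired input-output relation is established.
    w = np.zeros(3)
    eta = 0.05
    for epoch in range(200):
        grad = 2.0 * X.T @ (X @ w - t) / len(t)
        w -= eta * grad
    print(w)    # close to [1.5, -2.0, 0.5]: the mapping has been fitted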

Memory Modules and the Innate Structure

Tetsuya Hoya

As the philosopher Miguel de Unamuno (1864-1936) wrote in “Tragic Sense of Life” (Unamuno, 1978), “We live in memory and by memory, and our spiritual life is at bottom simply the effort of our memory to persist, to transform itself into hope ... into our future.” Memory is thus an indispensable element in any description of the mind. In psychological study (Squire, 1987), the notion of “learning” is defined as the process of acquiring new information, whereas “memory” refers to the persistence of learning in a state that can be revealed at a later time (see also Gazzaniga et al., 2002), i.e. the outcome of learning. Thus, the principles of learning, as described in the previous chapter, and memory within the AMS context are closely tied to each other.

Keywords: Independent Component Analysis; Memory Module; Link Weight; Phonological Loop; Target Speech.

Pp. 135-168

Language and Thinking Modules

Tetsuya Hoya

In this chapter, we focus upon the two modules closely tied to the concept of “action planning”: 1) the language module and 2) the thinking module.

Keywords: Concept Formation; Memory Module; Link Weight; Language Module; Mental Lexicon.

Pp. 169-187

Modelling Abstract Notions Relevant to the Mind and the Associated Modules

Tetsuya Hoya

This chapter is devoted to the remaining four modules within the AMS, i.e. the 1) attention, 2) emotion, 3) intention, and 4) intuition modules, and their mutual interactions with the other associated modules. The four modules so modelled represent the respective abstract notions related to the mind.

Keywords: Attentive State; Generalised Regression Neural Network; Memory Module; Link Weight; Memory Search.

Pp. 189-235