Publications catalog - books



Advances in Natural Computation: 1st International Conference, ICNC 2005, Changsha, China, August 27-29, 2005, Proceedings, Part I

Lipo Wang ; Ke Chen ; Yew Soon Ong (eds.)

Conference: 1st International Conference on Natural Computation (ICNC), Changsha, China, August 27-29, 2005

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Artificial Intelligence (incl. Robotics); Image Processing and Computer Vision; Computation by Abstract Devices; Algorithm Analysis and Problem Complexity; Pattern Recognition; Evolutionary Biology

Availability

Detected institution: Not detected
Publication year: 2005
Browse: SpringerLink

Information

Resource type:

books

Print ISBN

978-3-540-28323-2

Electronic ISBN

978-3-540-31853-8

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

Publication rights information

© Springer-Verlag Berlin Heidelberg 2005

Table of contents

An Application of Pattern Recognition Based on Optimized RBF-DDA Neural Networks

Guoyou Li; Huiguang Li; Min Dong; Changping Sun; Tihua Wu

An algorithm of Dynamic Decay Adjustment Radial Basis Function (RBF-DDA) neural networks is presented. It adaptively determines the number of hidden-layer nodes and the center values of the data, resolving the problem of setting RBF parameters at random and improving the generalization ability of the RBF network. When applied to an image pattern recognition system, experimental results show that the recognition rate of the improved RBF neural network still reaches 97.4% even under strong disturbance, verifying the good performance of the improved algorithm.

- Neurodynamics | Pp. 397-404
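The constructive DDA idea (commit a new prototype when no same-class prototype covers a pattern; shrink conflicting prototypes of other classes) can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the thresholds `THETA_PLUS` and `THETA_MINUS` and the width `init_sigma` are assumed illustrative values, and the readout is a plain weighted-sum classifier.

```python
import math

THETA_PLUS, THETA_MINUS = 0.4, 0.2  # commit / shrink thresholds (illustrative)

def rbf(x, center, sigma):
    """Gaussian RBF activation."""
    d2 = sum((a - b) ** 2 for a, b in zip(x, center))
    return math.exp(-d2 / (2 * sigma ** 2))

def dda_train(samples, init_sigma=1.0):
    """One DDA epoch: prototypes are added on demand and radii shrunk on conflict."""
    prototypes = []  # each: dict(center, sigma, weight, label)
    for x, label in samples:
        covered = [p for p in prototypes
                   if p["label"] == label and rbf(x, p["center"], p["sigma"]) >= THETA_PLUS]
        if covered:  # reinforce the best-matching same-class prototype
            max(covered, key=lambda p: rbf(x, p["center"], p["sigma"]))["weight"] += 1
        else:        # commit a new prototype centered on the pattern
            prototypes.append({"center": x, "sigma": init_sigma, "weight": 1, "label": label})
        # shrink other-class prototypes so their activation at x drops below THETA_MINUS
        for p in prototypes:
            if p["label"] != label and rbf(x, p["center"], p["sigma"]) >= THETA_MINUS:
                d = math.dist(x, p["center"])
                if d > 0:
                    p["sigma"] = min(p["sigma"], d / math.sqrt(-2 * math.log(THETA_MINUS)))
    return prototypes

def dda_classify(prototypes, x):
    """Weighted-sum readout: label with the largest summed activation wins."""
    scores = {}
    for p in prototypes:
        scores[p["label"]] = scores.get(p["label"], 0.0) \
            + p["weight"] * rbf(x, p["center"], p["sigma"])
    return max(scores, key=scores.get)
```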

Effect of Noises on Two-Layer Hodgkin-Huxley Neuronal Network

Jun Liu; Zhengguo Lou; Guang Li

The stochastic resonance (SR) effect has been discovered in non-dynamical threshold systems such as sensory systems. This paper presents a network simulating the basic structure of a sensory system to study SR. The neuronal network consists of two layers of Hodgkin-Huxley (HH) neurons. Compared with a single HH model, subthreshold stimulating signals do not modulate the output signal-to-noise ratio, so a fixed level of environmental noise can induce SR for various stimulating signals. Numerical experimental results also show that noise does not always deteriorate the detection of suprathreshold input signals.

- Neurodynamics | Pp. 411-419
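The basic SR mechanism the abstract relies on can be demonstrated with a far simpler system than a two-layer HH network: a subthreshold sine wave crosses a static threshold only with the help of noise. The sketch below is a generic caricature, not the paper's HH model; the threshold, signal period, and noise levels are arbitrary choices.

```python
import math, random

def threshold_detector(amplitude, noise_sd, threshold=1.0, n=2000, seed=0):
    """Count upward threshold crossings of a sine signal plus Gaussian noise."""
    rng = random.Random(seed)
    crossings, prev = 0, 0.0
    for i in range(n):
        s = amplitude * math.sin(2 * math.pi * i / 100) + rng.gauss(0.0, noise_sd)
        if prev < threshold <= s:  # upward crossing of the firing threshold
            crossings += 1
        prev = s
    return crossings
```

With `amplitude < threshold` and zero noise the detector stays silent; adding moderate noise lets the signal drive threshold crossings, which is the essence of SR in non-dynamical threshold systems.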

Modeling of Short-Term Synaptic Plasticity Using Dynamic Synapses

Biswa Sengupta

This work presents a minimal time-continuous model of the target-cell-specific, use-dependent short-term synaptic plasticity (STP) observed in pyramidal cells that can account for both short-term depression and facilitation. It provides a concise, portable description useful for predicting synaptic responses to complex patterns of stimulation, for studies of circuit dynamics, and for comparing dynamic properties across synaptic pathways between or within preparations. The model computes postsynaptic responses under either facilitation or depression in the synapse, exhibiting the characteristics of dynamic synapses found in short-term synaptic plasticity, for any arbitrary presynaptic spike train in the presence of realistic background synaptic noise. It thus allows the specific effect of a spike train on a neuronal lattice, both small-scale and large-scale, to be examined, revealing short-term plastic behavior in neurons.

- Neurodynamics | Pp. 429-438
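A standard way to capture use-dependent depression and facilitation of this kind is the Tsodyks-Markram dynamic-synapse model. The sketch below is that textbook model in event-driven form, not necessarily the paper's exact formulation; the parameters `U`, `tau_f`, `tau_d` (milliseconds) are illustrative.

```python
import math

def tm_synapse(spike_times, U=0.2, tau_f=600.0, tau_d=300.0):
    """Event-driven Tsodyks-Markram synapse (times in ms).

    u: utilization (facilitation variable), x: available resources (depression).
    Returns the release amplitude u*x for each presynaptic spike."""
    u, x, t_prev = 0.0, 1.0, None
    amps = []
    for t in spike_times:
        if t_prev is not None:
            dt = t - t_prev
            u *= math.exp(-dt / tau_f)                    # facilitation decays
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_d)   # resources recover
        u += U * (1.0 - u)     # each spike raises utilization
        amps.append(u * x)     # fraction of resources released
        x -= u * x             # depletion of resources
        t_prev = t
    return amps
```

A small baseline release probability `U` makes successive amplitudes grow (facilitation), while a large `U` exhausts resources and makes them shrink (depression), for the same spike train.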

Stochastic Neuron Model with Dynamic Synapses and Evolution Equation of Its Density Function

Wentao Huang; Licheng Jiao; Yuelei Xu; Maoguo Gong

In most neural network models, neurons are viewed as the only computational units, while the synapses are treated as passive scalar parameters (weights). It has, however, long been recognized that biological synapses can exhibit rich temporal dynamics. These dynamics may have important consequences for computing and learning in biological neural systems. This paper proposes a novel stochastic model of a single neuron with synaptic dynamics, characterized by several stochastic differential equations. From this model, we obtain the evolution equation of its density function. Furthermore, we give an approach to reduce the evolution equation of the high-dimensional density function to that of a one-dimensional function.

- Neurodynamics | Pp. 449-455
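The density-evolution idea can be illustrated by simulating an ensemble of one-dimensional stochastic membranes with the Euler-Maruyama scheme. The sketch below uses a simple Ornstein-Uhlenbeck equation dV = -(V/tau) dt + sigma dW as a stand-in, far simpler than the paper's model with synaptic dynamics; the ensemble of paths approximates the density whose evolution equation (here, a Fokker-Planck equation with stationary variance sigma^2 * tau / 2) the paper derives analytically.

```python
import math, random

def simulate_ou_density(n_paths=1000, t_end=50.0, dt=0.1,
                        tau=10.0, sigma=0.5, seed=1):
    """Euler-Maruyama ensemble for dV = -(V/tau) dt + sigma dW.

    Returns the ensemble of membrane values at t_end; their histogram
    approximates the stationary density of the Fokker-Planck equation."""
    rng = random.Random(seed)
    vs = [0.0] * n_paths
    sq = math.sqrt(dt)
    for _ in range(int(t_end / dt)):
        vs = [v - (v / tau) * dt + sigma * sq * rng.gauss(0.0, 1.0) for v in vs]
    return vs
```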

Learning Algorithm for Spiking Neural Networks

Hesham H. Amin; Robert H. Fujii

Spiking Neural Networks (SNNs) use inter-spike time coding to process input data. In this paper, a new learning algorithm for SNNs that uses the inter-spike times within a spike train is introduced. The learning algorithm utilizes the spatio-temporal pattern produced by the spike train input mapping unit and adjusts synaptic weights during learning. The approach was applied to classification problems.

- Neurodynamics | Pp. 456-465
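The core representational idea, treating a spike train as its vector of inter-spike intervals, can be shown with a toy nearest-prototype classifier on ISI vectors. This is only a caricature of learning from inter-spike times; the paper's input mapping unit and synaptic weight-adjustment rule are not reproduced here, and the prototype labels below are hypothetical.

```python
def isis(spike_times):
    """Inter-spike-interval vector of an ordered spike-time list."""
    return [b - a for a, b in zip(spike_times, spike_times[1:])]

def classify_by_isi(train, prototypes):
    """Assign the label whose prototype ISI vector is nearest in squared
    Euclidean distance to the train's ISI vector."""
    v = isis(train)
    return min(prototypes,
               key=lambda lbl: sum((a - b) ** 2 for a, b in zip(v, prototypes[lbl])))
```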

Implementing Fuzzy Reasoning by IAF Neurons

Zhijie Wang; Hong Fan

The implementation of the intersection and union operations of fuzzy reasoning is explored with three Integrate-and-Fire (IAF) neurons, two serving as inputs and one as output. We prove that if the parameter values of the neurons are set appropriately for the intersection operation, the firing rate of the output neuron is equal to or lower than the lower of the two input rates. We also prove that if the parameter values are set appropriately for the union operation, the firing rate of the output neuron is equal to or higher than the higher of the two input rates. The characteristics of the intersection and union operations implemented by IAF neurons are discussed.

- Neurodynamics | Pp. 476-479
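The rate bounds can be caricatured in discrete time bins: an output neuron with a high threshold fires only on coincident input spikes (intersection-like, rate at most the smaller input rate), while a low-threshold output fires on any input spike (union-like, rate at least the larger input rate). This binned sketch is a stand-in for the paper's continuous-time IAF analysis, not a reproduction of it.

```python
def output_rate_and(train_a, train_b):
    """High-threshold (coincidence-detecting) output: spikes only when both
    inputs spike in the same bin, so its count <= min of the input counts."""
    return sum(1 for a, b in zip(train_a, train_b) if a and b)

def output_rate_or(train_a, train_b):
    """Low-threshold output: either input alone triggers a spike,
    so its count >= max of the input counts."""
    return sum(1 for a, b in zip(train_a, train_b) if a or b)
```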

A Method for Quantifying Temporal and Spatial Patterns of Spike Trains

Shi-min Wang; Qi-Shao Lu; Ying Du

Spike trains are treated as exact time-dependent stepwise functions called response functions. Five variables, defined at sequential moments with equal intervals, are introduced to characterize features of the response function; these features reflect the temporal patterns of the spike train. The variables have an obvious geometric meaning in expressing the response and a reasonable coding meaning in describing the spike train, since the well-known 'firing rate' is among them. The dissimilarity, or distance, between spike trains can be simply defined by means of these variables. Reconstruction of a spike train from these variables demonstrates that the information carried by the spikes is preserved. If the spikes of a neuron ensemble are taken as a spatial sequence in each time bin, spatial patterns of spikes can also be quantified with a group of variables similar to the temporal ones.

- Neurodynamics | Pp. 480-489
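The simplest of the variables, a binned firing rate, and a distance between spike trains built from it can be sketched directly. The bin width and the Euclidean distance below are illustrative choices, not the paper's exact five-variable construction.

```python
def binned_rates(spike_times, t_end, bin_width):
    """Per-bin firing rate of a spike train; the cumulative spike count over
    time is the stepwise 'response function' the abstract describes."""
    n_bins = int(t_end / bin_width)
    counts = [0] * n_bins
    for t in spike_times:
        if 0 <= t < t_end:
            counts[int(t / bin_width)] += 1
    return [c / bin_width for c in counts]

def train_distance(a, b, t_end, bin_width):
    """Euclidean distance between two trains' binned-rate vectors."""
    ra, rb = binned_rates(a, t_end, bin_width), binned_rates(b, t_end, bin_width)
    return sum((x - y) ** 2 for x, y in zip(ra, rb)) ** 0.5
```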

Study on Circle Maps Mechanism of Neural Spikes Sequence

Zhang Hong; Fang Lu-ping; Tong Qin-ye

To date, the problem of neural coding remains a puzzle: the intrinsic information carried in irregular neural spike sequences is not yet known, and its solution would directly influence the study of neural information mechanisms. In this paper, the coding mechanism of neural spike sequences evoked by input stimuli of various frequencies is investigated through analysis of the H-H equation with the methods of nonlinear dynamics. In many sense organs of biological systems, the signals of external stimuli, continuously varying physical or chemical signals, are transformed into frequency signals of membrane potential, and these frequency signals are then transformed into irregular neural codes. This paper analyzes in detail the neuronal response to stimuli of various periods and identifies a possible coding rule.

- Neurodynamics | Pp. 499-507
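Locking between a periodic stimulus frequency and spike timing is classically summarized by a circle map's rotation number (the average winding rate of the phase). The sketch below uses the standard sine circle map as a stand-in for the map the paper extracts from the H-H equation; `omega` and `k` are illustrative parameters.

```python
import math

def rotation_number(omega, k, n=5000, theta0=0.1):
    """Average winding rate of the sine circle map
    theta -> theta + omega + (k / 2*pi) * sin(2*pi*theta)  (mod 1)."""
    theta, lift = theta0, 0.0
    for _ in range(n):
        step = omega + (k / (2 * math.pi)) * math.sin(2 * math.pi * theta)
        lift += step               # unwrapped phase advance
        theta = (theta + step) % 1.0
    return lift / n
```

At `k = 0` the map is a pure rotation and the rotation number equals `omega`; as `k` grows, plateaus (mode-locked frequency ratios) appear, which is the kind of structure the paper studies in the spike response.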

Doubly Regularized Kernel Regression with Heteroscedastic Censored Data

Jooyong Shim; Changha Hwang

A doubly regularized likelihood estimation procedure is introduced for heteroscedastic censored regression. The proposed procedure provides estimates of both the conditional mean and the variance of the response variables, obtained in a two-step iterative fashion. The generalized cross-validation function and the generalized approximate cross-validation function are used alternately to estimate the tuning parameters in each step. Experimental results are presented that indicate the performance of the proposed estimation procedure.

- Statistical Neural Network Models and Support Vector Machines | Pp. 521-527
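The two-step alternation, fit the mean, fit the variance on squared residuals, then re-weight the mean fit, can be sketched with a simple Nadaraya-Watson smoother standing in for the penalized kernel estimator. This sketch ignores censoring entirely and fixes the bandwidth by hand rather than by the cross-validation functions, so it only illustrates the iteration structure.

```python
import math

def nw_smooth(xs, ys, bandwidth, weights=None):
    """Weighted Nadaraya-Watson kernel smoother; returns a fitted function."""
    if weights is None:
        weights = [1.0] * len(xs)
    def fit(x):
        ks = [w * math.exp(-((x - xi) ** 2) / (2 * bandwidth ** 2))
              for xi, w in zip(xs, weights)]
        return sum(k * y for k, y in zip(ks, ys)) / sum(ks)
    return fit

def hetero_fit(xs, ys, bandwidth=0.5, n_iter=3):
    """Alternate between a mean fit and a variance fit on squared residuals,
    re-weighting the mean fit by the inverse estimated variance."""
    weights = None
    for _ in range(n_iter):
        mean = nw_smooth(xs, ys, bandwidth, weights)
        res2 = [(y - mean(x)) ** 2 for x, y in zip(xs, ys)]
        var = nw_smooth(xs, res2, bandwidth)
        weights = [1.0 / max(var(x), 1e-8) for x in xs]
    return mean, var
```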

A Prediction Interval Estimation Method for KMSE

Changha Hwang; Kyung Ha Seok; Daehyeon Cho

The kernel minimum squared error estimation (KMSE) model can be viewed as a general framework that includes kernel Fisher discriminant analysis (KFDA), least squares support vector machine (LS-SVM), and kernel ridge regression (KRR) as particular cases. For continuous real-valued output, the equivalence of KMSE and LS-SVM is shown in this paper. We apply standard methods for computing prediction intervals in nonlinear regression to the KMSE model. Simulation results show that LS-SVM performs better in terms of prediction intervals and mean squared error (MSE). An experiment on a real data set indicates that KMSE compares favorably with other methods.

- Statistical Neural Network Models and Support Vector Machines | Pp. 536-545
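The LS-SVM/KRR fit that KMSE generalizes reduces, for a Gaussian kernel, to solving the linear system (K + lambda*I) alpha = y and predicting with f(x) = sum_i alpha_i k(x, x_i). The sketch below does exactly that with a naive solver; the bias term of the full LS-SVM formulation and the prediction-interval computation are omitted, and `lam` and `gamma` are illustrative values.

```python
import math

def solve(A, b):
    """Naive Gaussian elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def kernel_ridge(xs, ys, lam=1e-3, gamma=1.0):
    """Solve (K + lam*I) alpha = y for a Gaussian kernel; return the predictor."""
    k = lambda a, b: math.exp(-gamma * (a - b) ** 2)
    n = len(xs)
    K = [[k(xs[i], xs[j]) + (lam if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, ys)
    return lambda x: sum(a * k(x, xi) for a, xi in zip(alpha, xs))
```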