Publication catalog - books
Foundations of Intelligent Systems: 13th International Symposium, ISMIS 2002 Lyon, France, June 27-29, 2002 Proceedings
Mohand-Saïd Hacid; Zbigniew W. Raś; Djamel A. Zighed; Yves Kodratoff (eds.)
In conference: 13th International Symposium on Methodologies for Intelligent Systems (ISMIS). Lyon, France. June 27-29, 2002
Abstract/Description – provided by the publisher
Not available.
Keywords – provided by the publisher
Artificial Intelligence (incl. Robotics); Information Storage and Retrieval; Information Systems Applications (incl. Internet); User Interfaces and Human Computer Interaction; Database Management; Computers and Society
Availability
Detected institution | Publication year | Browse | Download | Request |
---|---|---|---|---|
Not detected | 2002 | SpringerLink | | |
Information
Resource type:
books
Print ISBN
978-3-540-43785-7
Electronic ISBN
978-3-540-48050-1
Publisher
Springer Nature
Country of publication
United Kingdom
Publication date
2002
Publication rights information
© Springer-Verlag Berlin Heidelberg 2002
Table of contents
Cooperation of Multiple Strategies for Automated Learning in Complex Environments
Floriana Esposito; Stefano Ferilli; Nicola Fanizzi; Teresa Maria Altomare Basile; Nicola Di Mauro
This work presents a new version of the incremental learning system INTHELEX, whose multistrategy learning capabilities have been further enhanced. To improve the effectiveness and efficiency of the learning process, pure induction and abduction have been augmented with abstraction and deduction. Results demonstrating the benefits brought by the addition of each strategy are also reported. INTHELEX will be the learning component in the architecture of the EU project COLLATE, which deals with cultural heritage documents.
- Intelligent Information Systems | Pp. 574-582
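As an illustration of how such strategies can cooperate, here is a minimal propositional sketch (my own simplification, not the actual INTHELEX system, which learns first-order logic theories; abduction and abstraction are omitted): deduction saturates each incoming example with background rules, while induction revises the theory incrementally, generalizing on uncovered positives and retracting over-general rules on covered negatives. All attribute names and rules below are hypothetical.

```python
# Hypothetical propositional sketch of an incremental multistrategy loop;
# not the actual INTHELEX system, which works on first-order logic theories.

# Background knowledge used by the deduction step (illustrative only).
BACKGROUND = [({"has_title", "has_author"}, "front_page")]

def deduce(attrs):
    """Deduction: saturate an example with attributes implied by background rules."""
    ex = set(attrs)
    changed = True
    while changed:
        changed = False
        for body, head in BACKGROUND:
            if body <= ex and head not in ex:
                ex.add(head)
                changed = True
    return frozenset(ex)

def covers(theory, ex):
    return any(rule <= ex for rule in theory)

def learn_incrementally(stream):
    """Induction: revise the theory example by example instead of restarting."""
    theory, negatives = [], []
    for attrs, label in stream:
        ex = deduce(attrs)
        if label:
            if covers(theory, ex):
                continue
            # Generalize an existing rule by set intersection (a propositional
            # least general generalization), but only if no stored negative
            # example would become covered.
            for i, rule in enumerate(theory):
                lgg = rule & ex
                if lgg and not any(lgg <= n for n in negatives):
                    theory[i] = lgg
                    break
            else:
                theory.append(ex)  # no safe generalization: start a new rule
        else:
            negatives.append(ex)
            # Specialization, crudely: retract rules covering the negative
            # (INTHELEX instead refines rules with discriminating literals).
            theory = [r for r in theory if not r <= ex]
    return theory

stream = [({"has_title", "has_author", "has_abstract"}, True),
          ({"has_stamp"}, False),
          ({"has_title", "has_author", "has_logo"}, True)]
print(learn_incrementally(stream))
```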
Classifier Fusion Using Local Confidence
Eunju Kim; Wooju Kim; Yillbyung Lee
Combined classifiers can show better performance than the best single classifier used in isolation, while involving little additional computational effort. This is because different classifiers can offer complementary information about the pattern, and a group decision can exploit the benefit of combining multiple classifiers when making the final decision. In this paper we propose a new combining method that harnesses the local confidence of each classifier in the combining process. The method learns the local confidence of each classifier from training data; when unknown data is given, the learned knowledge is used to evaluate the outputs of the individual classifiers. An empirical evaluation on five real data sets shows that this method achieves promising performance and outperforms both the best single classifiers and the other known combining methods we tried.
- Learning and Knowledge Discovery | Pp. 583-591
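A minimal sketch of confidence-weighted fusion, assuming (this is my stand-in, not necessarily the paper's exact estimator) that a classifier's local confidence at a query point is its accuracy on the query's k nearest training neighbors; each classifier's vote is then weighted by that confidence.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier


def local_confidence_fusion(classifiers, X_train, y_train, X_test, k=10):
    """Fuse predictions, weighting each classifier by its local accuracy."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    # Per-classifier correctness on each training example: (n_clf, n_train).
    correct = np.stack([clf.predict(X_train) == y_train for clf in classifiers])
    neighbors = nn.kneighbors(X_test, return_distance=False)  # (n_test, k)
    preds = np.stack([clf.predict(X_test) for clf in classifiers])

    classes = np.unique(y_train)
    fused = np.empty(len(X_test), dtype=y_train.dtype)
    for i in range(len(X_test)):
        # Local confidence: each classifier's accuracy on the k neighbors.
        conf = correct[:, neighbors[i]].mean(axis=1)
        # Confidence-weighted vote over the class labels.
        votes = {c: conf[preds[:, i] == c].sum() for c in classes}
        fused[i] = max(votes, key=votes.get)
    return fused


X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
members = [DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr),
           LogisticRegression(max_iter=1000).fit(X_tr, y_tr),
           GaussianNB().fit(X_tr, y_tr)]
y_hat = local_confidence_fusion(members, X_tr, y_tr, X_te)
print("fused accuracy:", (y_hat == y_te).mean())
```

Estimating local accuracy on the same data the members were trained on is optimistic; held-out or cross-validated predictions would give sounder confidence estimates.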
Feature Selection for Ensembles of Simple Bayesian Classifiers
Alexey Tsymbal; Seppo Puuronen; David Patterson
A popular method for creating an accurate classifier from a set of training data is to train several classifiers and then combine their predictions. Ensembles of simple Bayesian classifiers have traditionally not been a focus of research. However, the simple Bayesian classifier has much broader applicability than previously thought. Besides its high classification accuracy, it also has advantages in terms of simplicity, learning speed, classification speed, storage space, and incrementality. One way to generate an ensemble of simple Bayesian classifiers is to use different feature subsets, as in the random subspace method. In this paper we present a technique for building ensembles of simple Bayesian classifiers in random subspaces. We also consider a hill-climbing-based refinement cycle, which improves the accuracy and diversity of the base classifiers. We conduct a number of experiments on a collection of real-world and synthetic data sets. In many cases the ensembles of simple Bayesian classifiers have significantly higher accuracy than the single “global” simple Bayesian classifier. We consider several methods for integrating the simple Bayesian classifiers; dynamic integration utilizes ensemble diversity better than static integration.
- Learning and Knowledge Discovery | Pp. 592-600
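A minimal sketch of the overall recipe, using scikit-learn's GaussianNB as the simple Bayesian base classifier: random feature subspaces, a hill-climbing cycle that swaps single features in and out of each subspace while validation accuracy improves, and static majority-vote integration (the paper's dynamic integration is not reproduced here). Subspace size and member count are illustrative choices.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=800, n_features=30, n_informative=10,
                           random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)
n_members, subspace_size = 10, 8


def fit_and_score(features):
    """Train a naive Bayes member on one feature subspace."""
    clf = GaussianNB().fit(X_tr[:, features], y_tr)
    return clf, clf.score(X_val[:, features], y_val)


ensemble = []
for _ in range(n_members):
    # Random subspace: a random feature subset for this ensemble member.
    features = list(rng.choice(X.shape[1], size=subspace_size, replace=False))
    clf, best = fit_and_score(features)
    # Hill-climbing refinement: swap one feature at a time, keeping any
    # swap that improves validation accuracy, until no swap helps.
    improved = True
    while improved:
        improved = False
        for i in range(subspace_size):
            for f in range(X.shape[1]):
                if f in features:
                    continue
                trial = features[:i] + [f] + features[i + 1:]
                cand, acc = fit_and_score(trial)
                if acc > best:
                    features, clf, best = trial, cand, acc
                    improved = True
    ensemble.append((clf, features))

# Static integration: plain majority vote (binary labels here are 0/1).
votes = np.stack([clf.predict(X_val[:, f]) for clf, f in ensemble])
majority = (votes.mean(axis=0) > 0.5).astype(int)
print("ensemble accuracy:", (majority == y_val).mean())
```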
Data Squashing for Speeding Up Boosting-Based Outlier Detection
Shutaro Inatani; Einoshin Suzuki
In this paper, we apply data squashing to speed up boosting-based outlier detection. One person's noise is another person's signal, and outlier detection is accordingly gaining increasing attention in data mining. To reduce the computational time of AdaBoost-based outlier detection, we first compress the given data set using a simplified version of BIRCH. The effectiveness of our approach, in terms of detection accuracy and computational time, is investigated through experiments with two real-world data sets from drug stores in Japan and an artificial data set of unlawful access to a computer network.
- Learning and Knowledge Discovery | Pp. 601-611
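A minimal sketch of the squashing step, with scikit-learn's Birch standing in for the paper's simplified BIRCH: the data set is compressed into weighted subcluster centers, and a weight-aware boosted learner is then trained on the much smaller compressed set. The outlier scorer below (AdaBoost separating the weighted centers from a uniform background sample, a density-by-classification proxy) is my stand-in, not the paper's detector.

```python
import numpy as np
from sklearn.cluster import Birch
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)
# Artificial data: one dense cluster plus a few planted outliers.
X = np.vstack([rng.normal(0.0, 1.0, size=(5000, 2)),
               rng.uniform(-6.0, 6.0, size=(20, 2))])

# Squashing step: compress the data into BIRCH subcluster centers, each
# weighted by the number of points it absorbs.
birch = Birch(threshold=0.5, n_clusters=None).fit(X)
centers = birch.subcluster_centers_
weights = np.bincount(birch.labels_, minlength=len(centers))

# Stand-in detector: boost a classifier to separate the weighted centers
# from a uniform background sample; points the model confidently assigns
# to the background lie in low-density regions, i.e. likely outliers.
background = rng.uniform(X.min(axis=0) - 1, X.max(axis=0) + 1,
                         size=(len(centers), 2))
X_squashed = np.vstack([centers, background])
y_squashed = np.hstack([np.ones(len(centers)), np.zeros(len(background))])
w = np.hstack([weights, np.full(len(background), weights.mean())])

ada = AdaBoostClassifier(n_estimators=50, random_state=0)
ada.fit(X_squashed, y_squashed, sample_weight=w)  # boosting on squashed data

scores = ada.predict_proba(X)[:, 0]  # background probability = outlier score
print("indices of 5 highest-scoring points:", np.argsort(scores)[-5:])
```

The speedup comes from boosting over a few hundred weighted centers rather than all original points; the threshold parameter trades compression ratio against fidelity.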