Publications catalog - books
Systemics of Emergence: Research and Development
Gianfranco Minati ; Eliano Pessa ; Mario Abram (eds.)
Abstract/Description – provided by the publisher
Not available.
Keywords – provided by the publisher
Not available.
Availability
Detected institution | Publication year | Browse | Download | Request
---|---|---|---|---
Not detected | 2006 | SpringerLink | |
Information
Resource type:
books
Print ISBN
978-0-387-28899-4
Electronic ISBN
978-0-387-28898-7
Publisher
Springer Nature
Country of publication
United Kingdom
Publication date
2006
Copyright information
© Springer Science+Business Media, Inc. 2006
Subject coverage
Table of contents
Uncertainty and Information: Emergence of Vast New Territories
George J. Klir
A research program whose objective is to study uncertainty and uncertainty-based information in all their manifestations was introduced in the early 1990s under the name "generalized information theory" (GIT). This research program, motivated primarily by some fundamental methodological issues emerging from the study of complex systems, is based on a two-dimensional expansion of classical, probability-based information theory. In one dimension, additive probability measures, which are inherent in classical information theory, are expanded to various types of nonadditive measures. In the other dimension, the formalized language of classical set theory, within which probability measures are formalized, is expanded to more expressive formalized languages that are based on fuzzy sets of various types. As in classical information theory, uncertainty is the primary concept in GIT and information is defined in terms of uncertainty reduction. This restricted interpretation of the concept of information is described in GIT by the qualified term "uncertainty-based information". Each uncertainty theory that is recognizable within the expanded framework is characterized by: (i) a particular formalized language (a theory of fuzzy sets of some particular type); and (ii) a generalized measure of some particular type (additive or nonadditive). The number of possible uncertainty theories is thus equal to the product of the number of recognized types of fuzzy sets and the number of recognized types of generalized measures. This number has been growing quite rapidly with the recent developments in both fuzzy set theory and the theory of generalized measures.
Fully developing any of these theories of uncertainty requires that issues at each of the following four levels be adequately addressed: (i) the theory must be formalized in terms of appropriate axioms; (ii) a calculus of the theory must be developed by which the formalized uncertainty is manipulated within the theory; (iii) a justifiable way of measuring the amount of relevant uncertainty (predictive, diagnostic, etc.) in any situation formalizable in the theory must be found; and (iv) various methodological aspects of the theory must be developed. Among the many uncertainty theories that are possible within the expanded conceptual framework, only a few have been sufficiently developed so far. By and large, these are theories based on various types of generalized measures, which are formalized in the language of classical set theory. Fuzzification of these theories, which can be done in different ways, has been explored only to some degree and only for standard fuzzy sets. One important result of research in the area of GIT is that the tremendous diversity of uncertainty theories made possible by the expanded framework is made tractable by key properties of these theories that are invariant across the whole spectrum or, at least, within broad classes of uncertainty theories. One important class consists of theories that are viewed as theories of imprecise probabilities. Some of these theories are based on Choquet capacities of various orders, especially capacities of infinite order (the well-known theory of evidence), interval-valued probability distributions, and Sugeno λ-measures. While these theories are distinct in many respects, they share several common representations, such as representation by lower and upper probabilities, by convex sets of probability distributions, and by the so-called Möbius representation. These representations are uniquely convertible to one another, and each may be used as needed.
Another unifying feature of the various theories of imprecise probabilities is that two types of uncertainty coexist in each of them. These are usually referred to as nonspecificity and conflict. It is significant that well-justified measures of these two types of uncertainty are expressed by functionals of the same form in all the investigated theories of imprecise probabilities, even though these functionals are subject to different calculi in different theories. Moreover, the equations that express the relationships among marginal, joint, and conditional measures of uncertainty are invariant across the whole spectrum of theories of imprecise probabilities. The tremendous diversity of possible uncertainty theories is thus compensated by their many commonalities.
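The lower/upper-probability representation mentioned in the abstract can be made concrete in the theory of evidence (Dempster-Shafer), where belief and plausibility of an event bracket its unknown probability. The sketch below is a generic illustration, not taken from the chapter; the mass assignment over the universe {a, b, c} is hypothetical.

```python
def belief(mass, A):
    """Lower probability of A: total mass of focal sets contained in A."""
    A = frozenset(A)
    return sum(m for B, m in mass.items() if B <= A)

def plausibility(mass, A):
    """Upper probability of A: total mass of focal sets intersecting A."""
    A = frozenset(A)
    return sum(m for B, m in mass.items() if B & A)

# Hypothetical mass function: mass on the whole universe models ignorance.
mass = {
    frozenset({"a"}): 0.4,
    frozenset({"a", "b"}): 0.3,
    frozenset({"a", "b", "c"}): 0.3,
}

A = {"a", "b"}
print(belief(mass, A), plausibility(mass, A))  # prints 0.7 1.0
```

For any event A, Bel(A) ≤ P(A) ≤ Pl(A), which is exactly the interval-valued (lower/upper) representation the abstract describes; the Möbius representation referred to there is the mass function `mass` itself.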
- Opening Lecture | Pp. 3-28
Complexity in Universe Dynamic Evolution. Part 1 - Present State and Future Evolution
Umberto Di Caprio
Recent experimental results gathered by the Hubble Space Telescope since October 2003 deeply modify our knowledge of the Universe. They give numerical estimates of fundamental quantities such as the age of the Universe, its geometric form, radius, density of matter, expansion rate over time, the birth of galaxies, and the Hubble constant. We frame these results in a general theory that explains the present state, forecasts future evolution, and extrapolates past structure from time zero onward. We propose a simple solution to the problem of "missing mass" and point out the existence of dark energy. Complexity enters the picture through the formulation and testing of a two-body dynamic model in which the visible Universe rotates around a central black hole. The presentation is split into two parts, one dealing with the present state and future evolution, the other with past evolution from time zero.
1 - Applications | Pp. 31-49
Complexity in Universe Dynamic Evolution. Part 2 - Preceding History
Umberto Di Caprio
We define the initial structure of the Universe and the disturbance that destroys the dynamical equilibrium associated with that structure. We reconstruct the subsequent evolution and explain how expansion led to a matter-dominated era. We then illustrate the birth of galaxies and their growth up to the present time. According to our model, the Universe was transparent up to the beginning of the matter-dominated era and afterwards, for a certain time, was obscured by the presence of the central black hole. A conjecture about the ether is put forward.
1 - Applications | Pp. 51-66
Mistake Making Machines
Gianfranco Minati; Giuseppe Vitiello
Classical approaches consider errors and mistakes to be related to inadequate usage or functioning of physical or logical devices. They are usually regarded as problems to be fixed, as in engineering with those concerning reliability and availability: mistake-making processes or machines are assumed to need repair. A new phase began when the role of the observer was considered and uncertainty principles were introduced. It then becomes possible to consider processes, at a certain level of description, as observer-related mistake-making machines. We discuss the possibility of designing a mistake-making device as a problem with correspondences to designing emergence. Emergence may be considered a possible error appearing in mistake-making processes. We introduce the possibility of designing an intrinsically (non observer-related) mistake-making device, for which the name Spartacus has been proposed. This project is put forward with reference to the dissipative quantum model of the brain. Another approach is the design of chaotic neural networks, introduced in the literature as the Creativity Machine.
1 - Applications | Pp. 67-78
Explicit Velocity for Modelling Surface Complex Flows with Cellular Automata and Applications
M. V. Avolio; G. M. Crisci; D. D’Ambrosio; S. Di Gregorio; G. Iovine; V. Lupiano; R. Rongo; W. Spataro; G. A. Trunfio
Fluid dynamics is an important field of application for Cellular Automata, one that has also given rise to specialised models such as lattice-gas automata and lattice Boltzmann models. These models run into difficulties when applied to large-scale (kilometre-range) phenomena. Our research group tackles macroscopic phenomena concerning surface flows with an alternative strategy, which was significantly improved by introducing explicit velocities into its Cellular Automata models. This paper illustrates the methodology with applications to lava flows, debris flows and pyroclastic flows.
1 - Applications | Pp. 79-92
Analysis of Fingerprints Through a Reactive Agent
Anna Montesanto; Guido Tascini; Paola Baldassarri; Luca Santinelli
The aim of this work is to study the process of self-organisation of knowledge in a reactive autonomous agent that navigates throughout a fingerprint image. The fingerprint has been recorded using a low-cost sensor, so it carries a great deal of noise. In this situation the usual methods of minutiae analysis fail or require strong pre-processing of the image. Our system is a reactive agent that acts independently of the noise in the image, because the process of self-organisation of knowledge leads to the emergence of the concept of "run toward the minutiae" through a categorisation of the sensorial input and a generalisation of the "state-action" situation. The system is based on a hybrid architecture for configuration recognition and knowledge codification.
1 - Applications | Pp. 93-104
User Centered Portal Design: A Case Study in Web Usability
Maria Pietronilla Penna; Vera Stara; Daniele Costenaro
The chance to share information and receive feedback, tips and suggestions in a "one-to-many" way has been the strength of the WWW in recent years, making it the most important channel of information delivery of all time. However, the "virtually unlimited" growth of the web now faces a real limit: the desire to employ new technologies when developing internet sites has pushed web designers to pay attention first of all to tools, almost as in a closed system, rather than to the process of human-computer interaction. Starting from a real case study, this contribution proposes a comprehensive analysis of the multiple aspects that make usability the true problem of the web.
1 - Applications | Pp. 105-114
Logic and Context in Schizophrenia
Pier Luca Bandinelli; Carlo Palma; Maria Pietronilla Penna; Eliano Pessa
In this paper the authors analyze patterns of reasoning in schizophrenia according to proof theory. In particular, they consider the clinical forms of "organized" (paranoid subtype) and "disorganized" schizophrenia. In the first form they focus on the conservation and "excessive" use of standard inference rules that formalize certain logical modes of reasoning, as well as on the incorrect use of premises that are not context-sensitive. The authors also suggest that in the disorganized subtype the inference rules are not derived from a tautological proposition; instead, the patient uses non-standard inference rules such as assonance, analogy and metaphor, relative to a particular focalized and pervasive mental state. In this case the premises and conclusion of reasoning are represented by a formalized logic expression.
2 - Biology and Human Care | Pp. 117-132
The “Hope Capacity” in the Care Process and the Patient-Physician Relationship
Alberto Ricciuti
Especially in serious pathologies, in which there is a real risk of dying, the sick person's capacity to keep alive the will to live and to participate actively in the care process is intimately linked to what Fornari calls the "hope capacity". This capacity is often reduced and inhibited by the kind of arguments developed in patient-physician communication, owing to a misunderstanding or a wrong use of the concept of "probability". In the context of the current biomedical model, inspired by a narrow conception of living beings, statistical and probabilistic evaluations referred to clinical and epidemiological data are routinely used in patient-physician communication as predictive of the possible evolution of the pathology in the individual person. When that happens, through a conceptual or semantic misunderstanding of "probability", the sick person's "hope capacity" fades away until it definitely disappears. This work shows how, in a systemic conception of health problems, where the centre of attention shifts from the illness to the sick person, new and fertile spaces can be opened in patient-physician communication for discussing his or her possible futures, referring on one hand to the logical concept of probability developed in the twentieth century by Wittgenstein and Keynes and to the concept of subjective probability developed more recently by De Finetti, and on the other hand to the objective interpretation of probability theory proposed by Popper, which defines probability as "propensity", that is, as a physical reality analogous to forces or force fields.
2 - Biology and Human Care | Pp. 133-146
Puntonet 2003. A Multidisciplinary and Systemic Approach in Training Disabled People within the Experience of Villa S. Ignazio
Dario Fortin; Viola Durini; Marianna Nardon
In this paper we present Puntonet 2003, an approximately 900-hour course intended for disabled people and co-financed by the European Social Fund. In developing the course structure we focused on both the future employment of the participants and their personal and social reinforcement. The organizing and teaching team is itself multidisciplinary, combining engineering and scientific professionals with professionals having social, educational and psychological skills. Puntonet 2003 aims at the inclusion of disabled people in the Information Society, improving professional skills while also reinforcing knowledge of and integration in the social network.
2 - Biology and Human Care | Pp. 147-154