Publications catalogue - books



Logic Programming: 21st International Conference, ICLP 2005, Sitges, Spain, October 2-5, 2005, Proceedings

Maurizio Gabbrielli; Gopal Gupta (eds.)

In conference: 21st International Conference on Logic Programming (ICLP). Sitges, Spain. October 2-5, 2005

Abstract/Description - provided by the publisher

Not available.

Keywords - provided by the publisher

Not available.

Availability

Detected institution: none detected. Publication year: 2005. Available via SpringerLink.

Information

Resource type: books

Print ISBN: 978-3-540-29208-1

Electronic ISBN: 978-3-540-31947-4

Publisher: Springer Nature

Country of publication: United Kingdom

Publication date: 2005

Publication rights information: © Springer-Verlag Berlin Heidelberg 2005

Table of contents

Optimizing Queries for Heterogeneous Information Sources

András G. Békés

Integration of heterogeneous information sources requires building an infrastructure for accessing several different information sources together. One task of the integration is to provide a uniform interface to the different information sources (databases, directory servers, web services, web sites, XML files). The other task is semantic integration: the meaning of the stored data also differs from source to source.

In the SILK [1] system (Semantic Integration with Knowledge and Logic), semantic integration is carried out by providing a high-level model and the mappings between the high-level model and the models of the sources. When executing a query over the high-level model, the query has to be translated into queries over the sources and into the code performing the semantic transformation of the data. The component which translates a high-level query into a low-level query is the Query Planner. The Query Planner builds the set of low-level queries, and a further component is needed to create an efficient query plan from it. In the SILK system this component is called the Query Optimizer.
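
As a rough illustration of the planning step described above (a hypothetical sketch, not the actual SILK code), high-level predicates can be related to source-specific queries by mapping facts, and a planner can collect the low-level queries a high-level query gives rise to. All predicate, source, and query names below are invented for the example.

```prolog
% mapping(HighLevelGoal, Source, LowLevelQuery): hypothetical mapping facts
% relating a high-level predicate to a query understood by each source.
mapping(person(Name, Mail), ldap_dir, ldap_search([cn=Name, mail=Mail])).
mapping(person(Name, Mail), hr_db,    sql(select([name=Name, email=Mail], staff))).

% plan(+Goal, -Plan): collect one candidate low-level query per source;
% an optimizer would then order and prune this set into an efficient plan.
plan(Goal, Plan) :-
    findall(Source-Query, mapping(Goal, Source, Query), Plan).

% ?- plan(person(N, M), Plan).
% Plan = [ldap_dir-ldap_search([cn=N, mail=M]), hr_db-sql(select([name=N, email=M], staff))].
```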

Pp. 429-430

Denotational Semantics Using Horn Concurrent Transaction Logic

Marcus Vinicius Santos

In this work we propose to use a Horn fragment of Concurrent Transaction Logic (CTR) as an intuitive logic framework for specifying the semantics of concurrent programming languages.

Using Horn logic to specify a programming language has been suggested before [5,6,2]. By specifying a programming language we mean writing its semantics, all three kinds (operational, denotational, and axiomatic), in Horn logic, which is also executable. Slonneger convincingly demonstrated that, for the specification of denotational semantics, Prolog can be regarded as superior to imperative languages. Gupta [2] expanded on the idea and showed how Horn logic denotations lead to some interesting practical applications, such as automatic program verification and automatic generation of compilers. The work presented here builds on the aforementioned works and extends that of [2] by providing a Horn logic denotational semantics for concurrent programming languages.
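
To make the Horn-logic denotational style concrete, here is a minimal sketch for a toy expression language (not the thesis's actual definitions): the valuation function becomes a Prolog predicate relating syntax, an environment, and a value, so the specification is directly executable.

```prolog
:- use_module(library(lists)).   % member/2

% eval(Expr, Env, Value): executable valuation function for a toy language.
eval(num(N), _Env, N).
eval(id(X), Env, V)         :- member(X=V, Env).
eval(plus(E1, E2), Env, V)  :- eval(E1, Env, V1), eval(E2, Env, V2), V is V1 + V2.
eval(times(E1, E2), Env, V) :- eval(E1, Env, V1), eval(E2, Env, V2), V is V1 * V2.

% ?- eval(plus(id(x), times(num(2), id(y))), [x=3, y=4], V).
% V = 11.
```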

Pp. 431-432

Gentra4cp: A Generic Trace Format for Constraint Programming

Ludovic Langevine

Several debugging tools have been designed for constraint programming (CP). There is no ultimate tool that satisfies all needs, but rather a set of complementary tools. Most of them are dynamic tools. They collect data from the execution and produce abstract views of this execution, for instance a search-tree, the evolution of some domains, or an application-specific display.

There are currently two issues concerning CP debugging tools. Firstly, each tool is dedicated to a given platform: there is no sharing of tools among CP platforms, although most solvers are based on the same techniques. Secondly, the extraction of execution data requires the solver to be instrumented. Such instrumentation is tedious and requires access to the solver code. We propose to address these two issues by means of a generic trace format which allows the execution to be described as a sequence of elementary events reflecting the behavior of the search procedure and the propagation process. The tools can then pick from the trace the data they need.
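
Purely as an illustration of the idea (the actual Gentra4cp format is defined in the cited work and is not reproduced here), an execution trace can be viewed as a list of elementary events that portable tools consume as needed; all event names below are invented.

```prolog
:- use_module(library(lists)).   % member/2, length/2

% A hypothetical trace: elementary events describing search and propagation.
example_trace([
    event(1, new_variable(x, [1, 2, 3, 4])),
    event(2, choice_point(cp1, x = 1)),
    event(3, reduce(y, removed([3, 4]))),
    event(4, failure(cp1)),
    event(5, back_to(cp1))
]).

% A tool picks only the events it needs, e.g. counting domain reductions.
count_reductions(Trace, N) :-
    findall(V, member(event(_, reduce(V, _)), Trace), Vs),
    length(Vs, N).
```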

Pp. 433-434

Analyses, Optimizations and Extensions of Constraint Handling Rules: Ph.D. Summary

Tom Schrijvers

This is a summary of the Ph.D. thesis of Tom Schrijvers [4].

Constraint Handling Rules (CHR) [3] is a rule-based language commonly embedded in a host language. It combines elements of Constraint Logic Programming and term rewriting. Several implementations of CHR exist: in Prolog, Haskell, Java and HAL. Typical applications of CHR are in the area of constraint solving, but currently CHR is also used in a wider range of applications, such as type checking, natural language processing and multi-agent systems.
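
For readers unfamiliar with CHR, the classic less-or-equal solver is a standard introductory example of the rule-based style (it is not taken from the thesis itself); it is shown here in the syntax accepted by the K.U.Leuven CHR system in SWI-Prolog.

```prolog
:- use_module(library(chr)).
:- chr_constraint leq/2.

reflexivity  @ leq(X, X) <=> true.
antisymmetry @ leq(X, Y), leq(Y, X) <=> X = Y.
idempotence  @ leq(X, Y) \ leq(X, Y) <=> true.
transitivity @ leq(X, Y), leq(Y, Z) ==> leq(X, Z).

% ?- leq(A, B), leq(B, C), leq(C, A).
% transitivity adds leq(A, C); antisymmetry then unifies all three variables.
```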

In this work we contribute program analyses, program optimizations and extensions of the CHR language.

For the optimized compilation of CHR we present several new optimizations: code specialization for ground constraints, anti-monotonic delay avoidance, hashtable constraint stores and a new late storage optimization. These and other optimizations have been implemented in a new state-of-the-art CHR system: the K.U.Leuven CHR system [5] which is currently available in SWI-Prolog [10], XSB [9] and hProlog [2].

Abstract interpretation is a general technique for program analysis [1]. We propose a framework of abstract interpretation for the CHR language [7], in particular for the formulation of analyses that drive program optimization. This framework allows for the uniform formulation of program analyses as well as easier improvements and combinations of existing analyses. We also evaluate analyses for theoretical properties, confluence and time complexity, on a practical case study to investigate their relevance.

We contribute two extensions to the expressivity of CHR. The first extension comprises the integration of CHR with tabled execution [8]. Tabled execution avoids many forms of non-termination and is useful for automatic program optimization through the dynamic reuse of previous computations. The second extension automatically provides implication checking functionality to constraint solvers written in CHR [6]. Implication checking is an essential building block for formulating complex constraints in terms of basic constraints and for composing constraint solvers.

Pp. 435-436

Formalization and Verification of Interaction Protocols

Federico Chesani

In recent years, the study of protocols and their properties has been one of the most investigated issues in distributed and multi-process systems research, and protocols are indeed one of the key components of Multi-Agent Systems. Several formal languages for defining protocols and properties have been proposed within different research communities. Some of the most common objectives of such languages include the ability to specify protocols in an easy and clear way for human users, to describe the behaviour of the participating peers, to specify and investigate properties, and to support the implementation of the peers.

Most of the current research on protocols falls into one of the following four main areas of interest: protocol specification, where languages for specifying protocols have been intensively studied not only in MAS research [3,5,9] but also in the broader community of distributed and multi-process systems [6]; conformance, aimed at guaranteeing interoperability between heterogeneous agents in open computing environments [2,5]; verification of protocol properties, where tools for proving properties are of utmost importance in the MAS community [4] and in the security protocols community [1]; and finally application-specific domains, where argumentation and negotiation are examples of domains in which the study of protocols is driven by the need to address specific features [8].

Pp. 437-438

PS-LTL for Constraint-Based Security Protocol Analysis

Ricardo Corin; Ari Saptawijaya; Sandro Etalle

Several formal approaches have been proposed to analyse security protocols, e.g. [2,7,11,1,6,12]. Recently, great interest has been growing in the constraint solving approach. Initially proposed by Millen and Shmatikov [9], this approach allows the analysis of a finite number of protocol sessions. Yet the representation of protocol runs by symbolic traces (as opposed to concrete traces) captures the possibility of an unbounded message space, allowing analysis over an infinite state space. A constraint is defined as a pair consisting of a message and a set of messages that represents the intruder's knowledge. Millen and Shmatikov present a procedure to solve a set of constraints, i.e. to check that in each constraint the message can be built from the intruder's knowledge set. When a set of constraints is solved, a concrete trace representing an attack on the protocol can be extracted.
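
As a minimal sketch of the ingredient just described (not Millen and Shmatikov's reduction procedure itself), checking that a ground message can be built from the intruder's knowledge can be phrased as a derivability relation over pairing and encryption; the term names below are illustrative.

```prolog
:- use_module(library(lists)).   % member/2

% derivable(Message, Known): Message can be synthesised from the known set
% using pairing and encryption as the only construction rules.
derivable(M, Known)            :- member(M, Known).
derivable(pair(M1, M2), Known) :- derivable(M1, Known), derivable(M2, Known).
derivable(enc(M, K), Known)    :- derivable(M, Known), derivable(K, Known).

% A constraint (message m, knowledge set T) is satisfied when derivable(m, T) holds:
% ?- derivable(enc(pair(na, nb), k), [na, nb, k]).
% true.
```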

Corin and Etalle [4] have improved the work of Millen and Shmatikov by presenting a more efficient procedure. However, none of these constraint-based systems provides enough flexibility and expressiveness in specifying security properties. For example, to check secrecy an artificial protocol role is added to simulate whether a secret can be learned by an intruder. Authentication also cannot be checked directly. Moreover, only a built-in notion of authentication is implemented in Millen and Shmatikov's Prolog implementation [10]. This problem motivates our current work.

A logical formalism is considered an appropriate means of improving the flexibility and expressiveness of specifying security properties. A preliminary attempt to use logic for specifying local security properties in a constraint-based setting has been carried out [3]. Inspired by this work and by the successful NPATRL [11,8], we currently explore a variant of linear temporal logic (LTL) over finite traces, PS-LTL, standing for pure-past security LTL [5]. In contrast to standard LTL, this logic deals only with past events in a trace. In our current work, a protocol is modelled as in previous works [9,4,3], viz. by protocol roles. A protocol role is a sequence of send and receive events, together with status events to indicate, e.g., that a protocol role has completed its run. A scenario is then used to deal with the number of sessions and protocol roles considered in the analysis.

Integrating PS-LTL into our constraint solving approach presents a challenge, since we need to develop a sound and complete decision procedure against symbolic traces instead of concrete traces. Our idea for addressing this problem is to concretize symbolic traces incrementally while deciding a formula. Basically, the decision procedure consists of two steps: transformation and solving. The former step transforms a PS-LTL formula, with respect to the current trace, into a formula built from constraints and equalities using logical connectives and quantifiers. The decision is then performed by the latter step through solving the constraints and checking the equalities.

Although we define a decision procedure only for a fragment of PS-LTL, this fragment is expressive enough to specify several security properties, such as various notions of secrecy and authentication, as well as data freshness. We provide a Prolog implementation and have analysed several security protocols.

There are many directions for improvement. From the implementation point of view, the efficiency of the decision procedure can still be improved. I would also like to investigate the expressiveness of the logic for specifying other security properties; this may result in an extension of the decision procedure to a larger fragment of the logic. Another direction is to characterize the expressive power of PS-LTL compared to other security requirement languages.

Pp. 439-440

Concurrent Methodologies for Global Optimization

Luca Bortolussi

My research interests are mainly focused on concurrent approaches to global optimization problems. Optimization tasks have two main conflicting features: they are both very difficult and central to many of the applications computer science faces daily. The problems I am most interested in stem from biology, protein folding being the principal one.

Pp. 441-443

A Temporal Programming Language for Heterogeneous Information Systems

Vitor Nogueira

In recent decades the number of information centers that receive data from different origins or technologies has increased enormously. Representing and reasoning with temporal data is an important issue in the majority of those centers. Nevertheless, most of the solutions to this issue are rather limited and specific to a certain domain.

Pp. 444-445

Nonmonotonic Logic Programs for the Semantic Web

Roman Schindlauer

The next step in the development of the Semantic Web is the realisation of the Rules, Logic, and Proof layers, which will be developed on top of the Ontology layer and which should offer sophisticated representation and reasoning capabilities. The aim of this project is to investigate methods and techniques by which nonmonotonic logic programming can be successfully and efficiently applied to integrating the Rules and Ontology layers. In particular, it is crucial to allow for building rules on top of ontologies, that is, for rule-based systems that use vocabulary specified in ontology knowledge bases. Another type of combination is to build ontologies on top of rules, which means that ontological definitions are supplemented by rules or imported from rules. Our approach combines both kinds of coupling in one single framework. At the same time, we want to enrich Semantic Web reasoning with nonmonotonic reasoning methods, focussing on those closely related to Answer Set Programming. Among these are the use of defaults, closed-world reasoning and multiple model generation.
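
The closed-world, default-style reasoning mentioned above can be illustrated with an ordinary Prolog rule using negation as failure; this is a sketch with invented facts standing in for ontology-derived data, not the project's actual rule language.

```prolog
% Hypothetical facts, as if imported from an ontology.
wine(chianti).
wine(port).
known_sweet(port).

% Default rule: assume a wine is dry unless it is known to be sweet
% (closed-world reasoning via negation as failure).
assumed_dry(W) :- wine(W), \+ known_sweet(W).

% ?- assumed_dry(W).
% W = chianti.
```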

Pp. 446-447

ICLP 2005 Doctoral Consortium

Rémy Haemmerlé

The Concurrent Constraint (CC) [8] languages make it possible to model a large number of constraint programming systems, from Prolog coroutining to constraint propagation mechanisms such as those implemented in the Finite Domain (FD) solver of GNU-Prolog [2]. These languages are an extension of CLP obtained by adding a synchronization primitive based on constraint implication. The Linear Concurrent Constraint (LCC) [9] languages are generalizations of CC languages that consist of considering constraint systems over Girard's linear logic [5] instead of classical logic. This new kind of language offers a unified framework combining constraints with state change and allows us to express the semantics of constraint programming together with concurrency and imperative features such as multiple-assignment variables. It is then possible to define solvers as pure libraries written in the same language as the system.
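
A very rough approximation of the CC-style "ask" (synchronization on what the store entails) can be written in plain Prolog with coroutining, for instance with when/2 in SWI-Prolog; this is only a sketch of the idea under a simple groundness condition, not an LCC implementation, and the predicate name is invented.

```prolog
% ask_then_tell(X, Y): suspend ("ask") until X is known, then add ("tell")
% the binding for Y. when/2 delays the goal until ground(X) holds.
ask_then_tell(X, Y) :-
    when(ground(X), Y is X + 1).

% ?- ask_then_tell(X, Y), X = 41.
% X = 41, Y = 42.
```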

Pp. 448-449