Publications catalog - books



Algorithms and Data Structures: 9th International Workshop, WADS 2005, Waterloo, Canada, August 15-17, 2005, Proceedings

Frank Dehne; Alejandro López-Ortiz; Jörg-Rüdiger Sack (eds.)

Conference: 9th Workshop on Algorithms and Data Structures (WADS). Waterloo, ON, Canada. August 15-17, 2005

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Algorithm Analysis and Problem Complexity; Data Structures; Discrete Mathematics in Computer Science; Computer Graphics; Numeric Computing

Availability
Institution detected: Not detected
Year of publication: 2005
Browse: SpringerLink

Information

Resource type:

books

Print ISBN

978-3-540-28101-6

Electronic ISBN

978-3-540-31711-1

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

2005

Publication rights information

© Springer-Verlag Berlin Heidelberg 2005

Table of contents

Towards a Theory of Algorithms

Allan Borodin

Undergraduate (and graduate) algorithms courses and texts often organize the material in terms of “algorithmic paradigms”, such as greedy algorithms, backtracking, dynamic programming, divide and conquer, local search, primal-dual, etc. (but not etc. etc.). We seem to be able to intuitively describe what we have in mind when we discuss these classes of algorithms but rarely (if ever) do we attempt to define precisely what we mean by such terms as greedy, dynamic programming, etc. Clearly, a precise definition is required if we want to defend a statement such as “This is a difficult optimization problem.... in particular, there is no greedy algorithm that provides a good approximation for this problem”.

In the context of combinatorial search and optimization problems, I will present precise models for some common basic paradigms. In a sense what we would really like to have are algorithmic models (e.g. for a greedy optimization algorithm) that are as widely accepted as the Church-Turing definition for “computable function”. While this goal is probably much too ambitious, we would at least like to have models that capture most of the algorithms that fit within these common paradigms.

This talk is based on results from a number of papers. In particular, I will present precise definitions for greedy and greedy-like algorithms [2], simple dynamic programming and backtracking [3], and basic primal-dual/local ratio algorithms [1].
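
One way to make "greedy" precise, in the spirit of the priority-algorithm model of [2], is a fixed-order template: sort the input once by a priority function, then make a single irrevocable decision per item. The minimal Python sketch below uses interval scheduling as an illustrative instance of our own choosing; it is not an example taken from the talk.

```python
# A fixed-priority greedy template: order the input once by a priority
# function, then make one irrevocable accept/reject decision per item.
# Interval scheduling is our own illustrative instance.

def fixed_priority_greedy(items, priority, accept):
    """Process items in priority order; decisions are never revisited."""
    chosen = []
    for item in sorted(items, key=priority):
        if accept(item, chosen):
            chosen.append(item)
    return chosen

# Earliest-finish-time-first interval scheduling as the decision rule.
intervals = [(1, 3), (2, 5), (4, 7), (6, 8)]
print(fixed_priority_greedy(
    intervals,
    priority=lambda iv: iv[1],                       # earliest finish first
    accept=lambda iv, acc: not acc or iv[0] >= acc[-1][1],
))  # -> [(1, 3), (4, 7)]
```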

- Session 1 | Pp. 1-1

k-Restricted Rotation with an Application to Search Tree Rebalancing

Alejandro Almeida Ruiz; Fabrizio Luccio; Antonio Mesa Enriquez; Linda Pagli

The restricted rotation distance d_R(S, T) between two binary trees S, T of n vertices is the minimum number of rotations by which S can be transformed into T, where rotations can only take place at the root of the tree, or at the right child of the root. A sharp upper bound d_R(S, T) ≤ 4n – 8 is known, based on the word metric of Thompson’s group. We refine this bound to a sharp d_R(S, T) ≤ 4n – 8 – ρ_S – ρ_T, where ρ_S and ρ_T are the numbers of vertices in the rightmost vertex chains of the two trees, by means of a very simple transformation algorithm based on elementary properties of trees. We then generalize the concept of restricted rotation to k-restricted rotation, by allowing rotations to take place at all the vertices of the k highest levels of the tree. For k = 2 we show that not much is gained in the worst case, although the classical problem of rebalancing an AVL tree can be solved efficiently; in particular, rebalancing after vertex deletion requires O(log n) rotations as in the standard algorithm. Finding significant bounds and applications for k ≥ 3 is open.
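
To make the allowed moves concrete, here is a minimal Python sketch of rotations restricted to the root and to the right child of the root. It only illustrates the two permitted moves; it is not the paper's transformation algorithm.

```python
# The two moves allowed by restricted rotation distance, sketched on a plain
# binary tree: a rotation at the root, or at the right child of the root.
# (Left rotations are symmetric; only right rotations are shown.)

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def rotate_right(y):
    """Right rotation at y (requires y.left); y's left child takes y's place."""
    x = y.left
    y.left, x.right = x.right, y
    return x

def rotate_at_root(root):
    return rotate_right(root)

def rotate_at_right_child(root):
    root.right = rotate_right(root.right)
    return root

t = Node(2, Node(1), Node(4, Node(3), Node(5)))
t = rotate_at_right_child(t)   # subtree rooted at 4 rotates; 3 moves up
t = rotate_at_root(t)          # 1 becomes the new root
```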

- Session 2A | Pp. 2-13

Heap Building Bounds

Zhentao Li; Bruce A. Reed

We consider the lower bound for building a heap in the worst case and the upper bound in the average case. We will prove that the supposedly fastest algorithm in the average case [2] does not attain its claimed bound and indeed is slower than that in [6]. We will then prove that the adversarial argument for the claimed best lower bound in the worst case [1] is also incorrect, and that the adversarial argument used yields a bound which is worse than that in [5], given by an information theory argument. Finally, we have proven a lower bound of 1.37n + o(n) for building a heap in the worst case.
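
For context, "building a heap" here means turning an arbitrary array into heap order in place. The classical bottom-up (Floyd) construction below is a standard baseline of our own choosing; it makes no attempt at the exact comparison counts the paper studies.

```python
# Classical bottom-up heap construction (Floyd's method), shown only as
# context for what building a heap means; the paper's results concern exact
# comparison counts, which this sketch does not try to optimize.

def sift_down(a, i, n):
    while 2 * i + 1 < n:
        child = 2 * i + 1
        if child + 1 < n and a[child + 1] < a[child]:
            child += 1                 # pick the smaller child
        if a[i] <= a[child]:
            break
        a[i], a[child] = a[child], a[i]
        i = child

def build_heap(a):
    """Turn list `a` into a min-heap in place, in O(n) time."""
    for i in range(len(a) // 2 - 1, -1, -1):
        sift_down(a, i, len(a))
    return a

print(build_heap([5, 3, 8, 1, 9, 2]))  # -> [1, 3, 2, 5, 9, 8]
```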

- Session 2A | Pp. 14-23

The Multi-radius Cover Problem

Refael Hassin; Danny Segev

Let G = (V, E) be a graph with a non-negative edge length l(u, v) for every (u, v) ∈ E. The vertices of G represent locations at which transmission stations are positioned, and each edge of G represents a continuum of demand points to which we should transmit. A station located at v is associated with a set R_v of allowed transmission radii, where the cost of transmitting to radius r ∈ R_v is given by c_v(r). The problem asks to determine for each station a transmission radius, such that for each edge (u, v) ∈ E the sum of the radii in u and v is at least l(u, v), and such that the total cost is minimized.

In this paper we present LP-rounding and primal-dual approximation algorithms for discrete and continuous variants of multi-radius cover. Our algorithms cope with the special structure of the problems we consider by utilizing greedy rounding techniques and a novel method for constructing primal and dual solutions.
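
To pin down the constraint, the toy brute-force solver below (our own illustration, exponential in the number of stations) enumerates radius choices and keeps the cheapest feasible one. The paper's LP-rounding and primal-dual algorithms, not enumeration, are what make the problem tractable at scale.

```python
# Brute-force discrete multi-radius cover on a toy instance: pick one allowed
# radius per station so every edge (u, v) satisfies r_u + r_v >= l(u, v),
# minimizing total cost. Illustration only; not the paper's algorithms.
from itertools import product

def multi_radius_cover(stations, edges):
    """stations: {v: {radius: cost}}; edges: {(u, v): length}."""
    names = list(stations)
    best = None
    for choice in product(*(stations[v].items() for v in names)):
        radii = {v: rc[0] for v, rc in zip(names, choice)}
        cost = sum(rc[1] for rc in choice)
        if all(radii[u] + radii[v] >= l for (u, v), l in edges.items()):
            if best is None or cost < best[0]:
                best = (cost, radii)
    return best

print(multi_radius_cover(
    {"a": {1: 1, 3: 4}, "b": {0: 0, 2: 3}},
    {("a", "b"): 3},
))  # -> (4, {'a': 1, 'b': 2})
```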

- Session 2B | Pp. 24-35

Parameterized Complexity of Generalized Vertex Cover Problems

Jiong Guo; Rolf Niedermeier; Sebastian Wernicke

Important generalizations of the Vertex Cover problem (Connected Vertex Cover, Capacitated Vertex Cover, and Maximum Partial Vertex Cover) have been intensively studied in terms of approximability. However, their parameterized complexity has so far been completely open. We close this gap here by showing that, with the size of the desired vertex cover as parameter, Connected Vertex Cover and Capacitated Vertex Cover are both fixed-parameter tractable while Maximum Partial Vertex Cover is W[1]-hard. This answers two open questions from the literature. The results extend to several closely related problems. Interestingly, although the considered generalized Vertex Cover problems behave very similarly in terms of constant-factor approximability, they display a wide range of different characteristics when investigating their parameterized complexities.
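
As a reminder of what fixed-parameter tractability in the cover size k looks like, here is the classic bounded-search-tree procedure for plain Vertex Cover. This is a textbook sketch, not the authors' algorithms for the connected or capacitated variants.

```python
# Classic FPT branching for Vertex Cover: every cover must contain an endpoint
# of each edge, so branch on the two endpoints of some uncovered edge. The
# search tree has at most 2^k leaves, giving O(2^k * m) time.

def has_vertex_cover(edges, k):
    """Decide whether a vertex cover of size <= k exists."""
    if not edges:
        return True
    if k == 0:
        return False
    u, v = edges[0]
    return (has_vertex_cover([e for e in edges if u not in e], k - 1)
            or has_vertex_cover([e for e in edges if v not in e], k - 1))

print(has_vertex_cover([(1, 2), (2, 3), (3, 4)], 2))  # True: {2, 3}
print(has_vertex_cover([(1, 2), (3, 4), (5, 6)], 2))  # False: needs 3 vertices
```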

- Session 2B | Pp. 36-48

The Complexity of Implicit and Space Efficient Priority Queues

Christian W. Mortensen; Seth Pettie

In this paper we study the time-space complexity of implicit priority queues supporting the decreasekey operation. Our first result is that by using one extra word of storage it is possible to match the performance of Fibonacci heaps: constant amortized time for insert and decreasekey and logarithmic time for deletemin. Our second result is a lower bound showing that one extra word really is necessary. We reduce the decreasekey operation to a cell-probe type game called the Usher's Problem, where one must maintain a simple data structure without the aid of any auxiliary storage.
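
For context, an implicit priority queue keeps its entire state in the element order of a single array. The textbook binary heap below is such a structure, but its decreasekey costs O(log n) time; the paper's point is that one extra word of storage suffices to match the Fibonacci-heap bounds instead.

```python
# A textbook implicit binary min-heap: the structure lives entirely in the
# array's element order, with no pointers or auxiliary storage. Its
# decrease_key is O(log n), unlike the constant amortized time discussed above.

class ImplicitHeap:
    def __init__(self):
        self.a = []  # the array IS the data structure

    def insert(self, key):
        self.a.append(key)
        self._sift_up(len(self.a) - 1)

    def decrease_key(self, i, new_key):
        assert new_key <= self.a[i]
        self.a[i] = new_key
        self._sift_up(i)

    def delete_min(self):
        a = self.a
        a[0], a[-1] = a[-1], a[0]
        least = a.pop()
        i = 0
        while 2 * i + 1 < len(a):          # sift the moved element down
            c = 2 * i + 1
            if c + 1 < len(a) and a[c + 1] < a[c]:
                c += 1
            if a[i] <= a[c]:
                break
            a[i], a[c] = a[c], a[i]
            i = c
        return least

    def _sift_up(self, i):
        a = self.a
        while i > 0 and a[(i - 1) // 2] > a[i]:
            a[i], a[(i - 1) // 2] = a[(i - 1) // 2], a[i]
            i = (i - 1) // 2

h = ImplicitHeap()
for k in [7, 2, 9]:
    h.insert(k)
h.decrease_key(2, 1)       # lower the key stored at array index 2
print(h.delete_min())      # -> 1
```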

- Session 2B | Pp. 49-60

Analysis of a Class of Tries with Adaptive Multi-digit Branching

Yuriy A. Reznik

We study a class of adaptive multi-digit tries, in which the numbers of digits r_n processed by nodes with n incoming strings are such that, in the memoryless model (with n → ∞):

where η is an algorithm-specific constant. Examples of known data structures from this class include LC-tries (Andersson and Nilsson, 1993), “relaxed” LC-tries (Nilsson and Tikkanen, 1998), tries with logarithmic selection of degrees of nodes, etc. We show that the average depth D_n of such tries in the asymmetric memoryless model has the following asymptotic behavior (with n → ∞):

where n is the number of strings inserted in the trie, and h is the entropy of the source. We use this formula to compare the performance of known adaptive trie structures, and to predict properties of other possible implementations of tries in this class.
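
The sketch below shows the shape of multi-digit branching: a node receiving n binary strings consumes several bits at once, with the number of bits growing logarithmically in n. The digit-choice rule r = max(1, floor(log2 n)) is our own stand-in for the algorithm-specific rule the paper analyzes.

```python
# Simplified adaptive multi-digit trie: each node branches on r bits at once,
# where r grows like log n. The rule below is an illustrative stand-in, not
# the tuned choices (LC-tries etc.) compared in the paper.
import math

def build_trie(strings):
    if len(strings) <= 1:
        return strings                              # leaf
    r = max(1, int(math.log2(len(strings))))        # bits consumed here
    children = {}
    for s in strings:
        children.setdefault(s[:r], []).append(s[r:])
    return {prefix: build_trie(rest) for prefix, rest in children.items()}

print(build_trie(["0001", "0010", "0111", "1100", "1010"]))
```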

- Session 2B | Pp. 61-72

Balanced Aspect Ratio Trees Revisited

Amitabh Chaudhary; Michael T. Goodrich

Spatial databases support a variety of geometric queries on point data, such as range searches, nearest neighbor searches, etc. Balanced Aspect Ratio (BAR) trees are hierarchical space decomposition structures that are general-purpose and space-efficient and, in addition, enjoy worst-case performance poly-logarithmic in the number of points for approximate queries. They maintain limits on their depth, as well as on the aspect ratio (intuitively, how skinny the regions can be). BAR trees were initially developed for 2-dimensional spaces and a fixed set of partitioning planes, and then extended to d-dimensional spaces and more general partitioning planes. Here we revisit 2-dimensional spaces and show that, for any given set of 3 partitioning planes, it is not only possible to construct such trees, it is also possible to derive a simple closed-form upper bound on the aspect ratio. This bound, and the resulting algorithm, are much simpler than what is known for general BAR trees. We call the resulting BAR trees Parameterized BAR trees and empirically evaluate them for different partitioning planes. Our experiments show that our theoretical bound converges to the empirically obtained values in the lower ranges, and also make a case for using evenly oriented partitioning planes.
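
As a deliberately simplified stand-in for the aspect-ratio idea (our own, not the BAR-tree construction): always cut a region perpendicular to its longer side, so no cell's aspect ratio exceeds 2. Real BAR trees choose among a fixed set of partitioning planes and additionally guarantee logarithmic depth, which this sketch does not.

```python
# Longest-side splitting of a 2-D region: cutting perpendicular to the longer
# side keeps every cell's aspect ratio at most 2. Illustration of the
# bounded-aspect-ratio idea only; depth is capped rather than guaranteed.

def split(points, box, max_leaf=2, depth=32):
    (x0, y0), (x1, y1) = box
    if len(points) <= max_leaf or depth == 0:
        return ("leaf", points)
    if x1 - x0 >= y1 - y0:                      # cut the longer (x) side
        xm = (x0 + x1) / 2
        return ("x", xm,
                split([p for p in points if p[0] <= xm],
                      ((x0, y0), (xm, y1)), max_leaf, depth - 1),
                split([p for p in points if p[0] > xm],
                      ((xm, y0), (x1, y1)), max_leaf, depth - 1))
    ym = (y0 + y1) / 2                          # otherwise cut the y side
    return ("y", ym,
            split([p for p in points if p[1] <= ym],
                  ((x0, y0), (x1, ym)), max_leaf, depth - 1),
            split([p for p in points if p[1] > ym],
                  ((x0, ym), (x1, y1)), max_leaf, depth - 1))

print(split([(0.1, 0.2), (0.8, 0.9), (0.5, 0.4)], ((0, 0), (1, 1)), max_leaf=1))
```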

- Session 2B | Pp. 73-85

Improved Combinatorial Group Testing for Real-World Problem Sizes

David Eppstein; Michael T. Goodrich; Daniel S. Hirschberg

We study practically efficient methods for performing combinatorial group testing. We present efficient non-adaptive and two-stage combinatorial group testing algorithms, which identify the at most d items out of a given set of n items that are defective, using fewer tests for all practical set sizes. For example, our two-stage algorithm matches the information-theoretic lower bound for the number of tests in a combinatorial group testing regimen.
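
To fix ideas, the textbook non-adaptive scheme for a single defective (d = 1) already shows the flavor: pool items by the bits of their indices, and the positive tests spell out the defective's index in binary. The sketch below is only this classical warm-up, not the paper's constructions for general d.

```python
# Non-adaptive group testing for exactly one defective item: test b pools
# every item whose index has bit b set, so ceil(log2 n) tests suffice and the
# positive-test pattern is the defective's index. Classical d = 1 warm-up only.

def find_single_defective(n, test):
    """`test(pool)` returns True iff the pool contains the defective item."""
    index = 0
    for b in range((n - 1).bit_length()):
        pool = [i for i in range(n) if (i >> b) & 1]
        if test(pool):
            index |= 1 << b
    return index

secret = 5
print(find_single_defective(8, lambda pool: secret in pool))  # -> 5
```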

- Session 3B | Pp. 86-98

Parameterized Counting Algorithms for General Graph Covering Problems

Naomi Nishimura; Prabhakar Ragde; Dimitrios M. Thilikos

We examine the general problem of covering graphs by graphs: given a graph G, a collection H of graphs each on at most h vertices, and an integer j, is there a collection C of subgraphs of G, each belonging to H, such that the removal of the graphs in C from G creates a graph none of whose components have more than j vertices? We can also require that the graphs in C be disjoint (forming a “matching”). This framework generalizes vertex cover, edge dominating set, and minimal maximum matching. In this paper, we examine the parameterized complexity of the counting version of the above general problem. In particular, we show how to count the solutions of size at most k of the covering and matching problems in time O(n · p(h + j)) plus an additive exponential term depending only on h, j, and k, where n is the number of vertices in G and p is a simple polynomial. In order to achieve the additive relation between the polynomial and the non-polynomial parts of the time complexity of our algorithms, we use the compactor technique, the counting analogue of kernelization for parameterized decision problems.
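
To pin down the counting problem itself, here is a naive enumerator (our own, exponential in n) for vertex covers of size at most k, the simplest instance of the framework. The paper's contribution is counting such solutions in FPT time via the compactor technique, which this brute force does not attempt.

```python
# Naive counting of vertex covers of size at most k by enumerating all small
# vertex subsets. Exponential illustration of the counting problem only.
from itertools import combinations

def count_vertex_covers(vertices, edges, k):
    return sum(
        all(u in s or v in s for u, v in edges)
        for size in range(k + 1)
        for s in map(set, combinations(vertices, size))
    )

# The path 1-2-3-4 has exactly three covers of size <= 2: {1,3}, {2,3}, {2,4}.
print(count_vertex_covers([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4)], 2))  # -> 3
```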

- Session 3B | Pp. 99-109