The shuffling buffer
Abstract
The complexity of randomized incremental algorithms is analyzed under the assumption that the input arrives in random order. To guarantee this hypothesis, the $n$ data must be known in advance so that they can be shuffled, which contradicts the on-line nature of the algorithm. We present the shuffling buffer technique, which introduces enough randomness to guarantee an improvement on the worst-case complexity while knowing only $k$ data in advance. Typically, an algorithm with $O(n^2)$ worst-case complexity and $O(n)$ or $O(n \log n)$ randomized complexity has complexity $O\left(\frac{n^2 \log k}{k}\right)$ with the shuffling buffer. We illustrate this on binary search trees, the number of Delaunay triangles, and the number of trapezoids in a trapezoidal map created during an incremental construction.
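To make the idea concrete, here is a minimal sketch of one way such a buffer could operate, assuming a policy in which each arriving item displaces a uniformly random buffered item; the function name `shuffling_buffer`, the parameter `k`, and this particular eviction policy are illustrative assumptions, not necessarily the paper's exact algorithm.

```python
import random

def shuffling_buffer(stream, k, rng=None):
    """Re-emit `stream` in a partially randomized order using O(k) memory.

    Sketch (assumed policy): keep a buffer of up to k items; once it is
    full, release a uniformly random buffered item before storing each
    newcomer, then flush the remainder in random order.
    """
    rng = rng or random.Random()
    buffer = []
    for x in stream:
        if len(buffer) < k:
            buffer.append(x)          # fill the buffer first
        else:
            i = rng.randrange(k)      # pick a random slot to release
            yield buffer[i]           # hand it to the incremental algorithm
            buffer[i] = x             # store the newcomer in its place
    rng.shuffle(buffer)               # flush whatever is left, shuffled
    yield from buffer

# Example: feed a partially shuffled stream to an on-line incremental
# construction (here simply collected into a list).
if __name__ == "__main__":
    data = range(20)                  # adversarial (sorted) input order
    print(list(shuffling_buffer(data, k=5)))
```

The consumer never sees more than $k$ elements buffered at once, so the on-line character of the construction is preserved while the input order is partially randomized.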
Domains
Computational Geometry [cs.CG]