Conference paper — Year: 2019

Spreading vectors for similarity search

Abstract

Discretizing multi-dimensional data distributions is a fundamental step of modern indexing methods. State-of-the-art techniques learn parameters of quantizers on training data for optimal performance, thus adapting quantizers to the data. In this work, we propose to reverse this paradigm and adapt the data to the quantizer: we train a neural net whose last layer forms a fixed parameter-free quantizer, such as pre-defined points of a hyper-sphere. As a proxy objective, we design and train a neural network that favors uniformity in the spherical latent space, while preserving the neighborhood structure after the mapping. We propose a new regularizer derived from the Kozachenko–Leonenko differential entropy estimator to enforce uniformity and combine it with a locality-aware triplet loss. Experiments show that our end-to-end approach outperforms most learned quantization methods, and is competitive with the state of the art on widely adopted benchmarks. Furthermore, we show that training without the quantization step results in almost no difference in accuracy, but yields a generic catalyzer that can be applied with any subsequent quantizer. The code is available online.
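To make the training objective described above concrete, here is a minimal PyTorch sketch of a loss of this kind: a nearest-neighbor entropy penalty in the spirit of the Kozachenko–Leonenko estimator, combined with a standard triplet ranking term on the spherical latent space. The function names and hyperparameters (koleo_regularizer, catalyzer_loss, lambda_koleo, margin) are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F

def koleo_regularizer(z, eps=1e-8):
    """Entropy-style penalty based on within-batch nearest-neighbor distances:
    small nearest-neighbor distances are penalized, which encourages points
    to spread uniformly over the unit hyper-sphere."""
    z = F.normalize(z, dim=1)                        # project onto the unit sphere
    sims = z @ z.t()                                 # pairwise cosine similarities
    sims.fill_diagonal_(-1.0)                        # exclude self-matches
    nn_sim, _ = sims.max(dim=1)                      # similarity to nearest neighbor
    nn_dist = torch.sqrt(2.0 - 2.0 * nn_sim + eps)   # corresponding Euclidean distance
    return -torch.log(nn_dist + eps).mean()

def triplet_loss(anchor, positive, negative, margin=0.1):
    """Locality-aware ranking term: an anchor should be closer to a point from
    its neighborhood (positive) than to a point outside it (negative)."""
    d_pos = (anchor - positive).pow(2).sum(dim=1)
    d_neg = (anchor - negative).pow(2).sum(dim=1)
    return F.relu(d_pos - d_neg + margin).mean()

def catalyzer_loss(net, x_a, x_p, x_n, lambda_koleo=0.02):
    """Combined objective: lambda_koleo trades off uniformity against
    preservation of the neighborhood structure."""
    z_a, z_p, z_n = net(x_a), net(x_p), net(x_n)
    return triplet_loss(z_a, z_p, z_n) + lambda_koleo * koleo_regularizer(z_a)
```

In this sketch, `net` stands for the mapping network; after training, its spherical outputs can be fed to any fixed quantizer.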

Dates and versions

hal-02278905, version 1 (04-09-2019)

Identifiers

Cite

Alexandre Sablayrolles, Matthijs Douze, Cordelia Schmid, Hervé Jégou. Spreading vectors for similarity search. ICLR 2019 - 7th International Conference on Learning Representations, May 2019, New Orleans, United States. pp. 1-13. ⟨hal-02278905⟩