Conference Papers, Year: 2024

Functional invariants to watermark large transformers

Pierre Fernandez
  • Function : Author
  • PersonId : 1142961
  • IdHAL : pierrefdz
Guillaume Couairon
  • Function : Author
  • PersonId : 1220018
Teddy Furon
  • Function : Author
Matthijs Douze
  • Function : Author
  • PersonId : 1086118

Abstract

The rapid growth of transformer-based models raises concerns about their integrity and ownership assurance. Watermarking addresses this issue by embedding a unique identifier into the model while preserving its performance. However, most existing approaches require optimizing the weights to imprint the watermark signal, which is not suitable at scale due to the computational cost. This paper explores watermarks with virtually no computational cost, applicable in a non-blind white-box setting (assuming access to both the original and watermarked networks). They generate functionally equivalent copies by leveraging the models' invariance to operations such as dimension permutations or scaling/unscaling of weights. This makes it possible to watermark models without any change in their outputs, and the watermark remains stealthy. Experiments demonstrate the effectiveness of the approach and its robustness against various model transformations (fine-tuning, quantization, pruning), making it a practical solution to protect the integrity of large models.
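
To make the invariance concrete, here is a minimal sketch (illustrative only, not the authors' implementation; the two-layer MLP and all names are assumptions) of the permutation invariant the abstract mentions: permuting the hidden dimension of a layer, with the matching column permutation applied to the next layer, yields a functionally equivalent copy. In the non-blind white-box setting, the secret permutation can be read back by comparing the two sets of weights.

# Minimal sketch (illustrative, not the paper's code): watermark a
# two-layer MLP by permuting its hidden dimension.
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hidden, d_out = 8, 16, 4

# Original weights of y = W2 @ relu(W1 @ x + b1) + b2
W1 = rng.standard_normal((d_hidden, d_in))
b1 = rng.standard_normal(d_hidden)
W2 = rng.standard_normal((d_out, d_hidden))
b2 = rng.standard_normal(d_out)

# Secret permutation of the hidden dimension: the identifier to embed.
perm = rng.permutation(d_hidden)
W1_wm, b1_wm = W1[perm], b1[perm]   # permute rows of W1 and entries of b1
W2_wm = W2[:, perm]                 # permute columns of W2 accordingly

def mlp(x, W1, b1, W2, b2):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

x = rng.standard_normal(d_in)
assert np.allclose(mlp(x, W1, b1, W2, b2),
                   mlp(x, W1_wm, b1_wm, W2_wm, b2))  # outputs unchanged

# Non-blind white-box detection: with both models in hand, recover the
# permutation by matching each watermarked row to its nearest original row.
recovered = np.array([int(np.argmin(np.linalg.norm(W1 - row, axis=1)))
                      for row in W1_wm])
assert np.array_equal(recovered, perm)

The scaling/unscaling invariant works analogously for ReLU-like activations: multiplying the incoming weights of a hidden unit by a factor s and its outgoing weights by 1/s leaves the network's function unchanged.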
Main file
fernandez_icassp2024.pdf (296.15 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04361026, version 1 (22-12-2023)

Identifiers

  • HAL Id: hal-04361026, version 1

Cite

Pierre Fernandez, Guillaume Couairon, Teddy Furon, Matthijs Douze. Functional invariants to watermark large transformers. ICASSP 2024 - IEEE International Conference on Acoustics, Speech and Signal Processing, Apr 2024, Seoul, South Korea. pp.1-5. ⟨hal-04361026⟩