Poster, 2024

PoNQ: a Neural QEM-based Mesh Representation

Abstract

Although polygon meshes have been a standard representation in geometry processing, their irregular and combinatorial nature hinders their suitability for learning-based applications. In this work, we introduce a novel learnable mesh representation through a set of local 3D sample Points and their associated Normals and Quadric error metrics (QEM) w.r.t. the underlying shape, which we denote PoNQ. A global mesh is directly derived from PoNQ by efficiently leveraging the knowledge of the local quadric errors. Besides marking the first use of QEM within a neural shape representation, our contribution guarantees both topological and geometrical properties by ensuring that a PoNQ mesh does not self-intersect and is always the boundary of a volume. Notably, our representation does not rely on a regular grid, is supervised directly by the target surface alone, and also handles open surfaces with boundaries and/or sharp features. We demonstrate the efficacy of PoNQ through a learning-based mesh prediction from SDF grids and show that our method surpasses recent state-of-the-art techniques in terms of both surface and edge-based metrics.
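For reference, the quadric error metric (QEM) at the heart of this representation is the classical Garland–Heckbert construction: each oriented sample point defines a plane quadric, quadrics are summed over a local neighborhood, and the error of a candidate vertex is a quadratic form in its homogeneous coordinates. The NumPy sketch below illustrates only this standard construction; it is not the authors' implementation, and all names are illustrative placeholders.

```python
import numpy as np

def plane_quadric(point, normal):
    """4x4 plane quadric K = p p^T for the plane through `point` with normal `normal`,
    where p = (n_x, n_y, n_z, d) and d = -n . point (Garland-Heckbert)."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    d = -np.dot(n, point)
    p = np.append(n, d)            # homogeneous plane coefficients
    return np.outer(p, p)

def qem_error(v, Q):
    """Quadric error v_h^T Q v_h of a candidate vertex v (3,) w.r.t. an accumulated quadric Q (4x4)."""
    v_h = np.append(v, 1.0)        # homogeneous coordinates
    return float(v_h @ Q @ v_h)

# Accumulate quadrics from a few oriented local samples, then evaluate the
# error of a candidate representative vertex for that neighborhood.
points  = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])
normals = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
Q = sum(plane_quadric(p, n) for p, n in zip(points, normals))
print(qem_error(np.array([0.05, 0.05, 0.02]), Q))  # 3 * 0.02**2 = 0.0012: the candidate sits 0.02 off the sampled plane
```

In PoNQ, such local points, normals, and quadric matrices are produced by a network (e.g., from an SDF grid) and a global mesh is then derived from them directly; the sketch above only shows how a quadric error is assembled and evaluated.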
Main file: Poster_PoNQ6.pdf (7.86 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04622974, version 1 (11-07-2024)

Cite

Nissim Maruani, Maks Ovsjanikov, Pierre Alliez, Mathieu Desbrun. PoNQ: a Neural QEM-based Mesh Representation. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2024, Jun 2024, Seattle, United States. ⟨hal-04622974⟩