Preprint, working paper. Year: 2022

Phyloformer: towards fast and accurate phylogeny estimation with self-attention networks

Abstract

An important problem in molecular evolution is that of phylogenetic reconstruction: given a set of sequences descending from a common ancestor, reconstruct the binary tree describing their evolution from the latter. State-of-the-art methods for the task, namely maximum likelihood and Bayesian inference, have a high computational cost, which limits their usability on large datasets. Recently, researchers have begun investigating deep learning approaches to the problem, but so far these attempts have been limited to the reconstruction of quartet tree topologies, addressing phylogenetic reconstruction as a classification problem. We present here a radically different approach with a transformer-based network architecture that, given a multiple sequence alignment, predicts all the pairwise evolutionary distances between the sequences, which in turn allow us to accurately reconstruct the tree topology with standard distance-based algorithms. The architecture and its high degree of parameter sharing allow us to apply the same network to alignments of arbitrary size, both in the number of sequences and in their length. We evaluate our network Phyloformer on two types of simulations and find that its accuracy matches that of a maximum likelihood method on datasets that resemble training data, while being significantly faster.
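As a minimal sketch of the distance-based reconstruction step the abstract describes (not the authors' implementation), the snippet below feeds a matrix of predicted pairwise evolutionary distances into a standard neighbor-joining algorithm, here Biopython's; the toy distance values and the helper name `reconstruct_tree` are illustrative assumptions only.

```python
# Sketch: build a tree from predicted pairwise distances with neighbor joining.
# Assumes some model (e.g. a Phyloformer-like network) has already produced
# a symmetric distance matrix; the values below are made up for illustration.
import numpy as np
from Bio.Phylo.TreeConstruction import DistanceMatrix, DistanceTreeConstructor

def reconstruct_tree(names, distances):
    """Return a neighbor-joining tree from a symmetric pairwise distance matrix."""
    # Biopython expects a lower-triangular matrix including the zero diagonal.
    lower = [[float(distances[i][j]) for j in range(i + 1)] for i in range(len(names))]
    dm = DistanceMatrix(names, lower)
    return DistanceTreeConstructor().nj(dm)

names = ["A", "B", "C", "D"]
d = np.array([[0.0, 0.3, 0.5, 0.6],
              [0.3, 0.0, 0.6, 0.5],
              [0.5, 0.6, 0.0, 0.3],
              [0.6, 0.5, 0.3, 0.0]])
tree = reconstruct_tree(names, d)
print(tree)
```

Any distance-based method (NJ, BioNJ, FastME, etc.) could be substituted in place of Biopython's neighbor joining; the point is only that the network's output plugs directly into such tools.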
Main file: 2022.06.24.496975v1.full.pdf (2.01 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03756990 , version 1 (21-11-2022)

Identifiers

Cite

Luca Nesterenko, Bastien Boussau, Laurent Jacob. Phyloformer: towards fast and accurate phylogeny estimation with self-attention networks. 2022. ⟨hal-03756990⟩
161 views
148 downloads
