Physics-based Deep Neural Network for Real-Time Lesion Tracking in Ultrasound-guided Breast Biopsy
Abstract
In the context of ultrasound (US)-guided breast biopsy, image fusion techniques can be employed to track the position of US-invisible lesions previously identified on a pre-operative image. Such methods must account for the large anatomical deformations caused by probe pressure during US scanning while meeting real-time constraints. Although biomechanical models based on the finite element (FE) method are the preferred approach to model breast behavior, they cannot achieve real-time performance. In this paper, we propose to use deep neural networks to learn the large deformations occurring in ultrasound-guided breast biopsy and thus to provide accurate predictions of lesion displacement in real time. We train a U-Net architecture on a relatively small amount of synthetic data, generated in an offline phase from FE simulations of probe-induced deformations on the breast anatomy of interest. Overall, both training data generation and network training are performed in less than 5 hours, which is clinically acceptable considering that the biopsy is performed at the earliest the day after the pre-operative scan. The method is tested both on synthetic data and on real data acquired on a realistic breast phantom. Results show that our method correctly learns the deformable behavior modelled via FE simulations and generalizes to real data, achieving a target registration error comparable to that of FE models while being about a hundred times faster.
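To make the described pipeline concrete, the sketch below shows a minimal PyTorch 3D U-Net that maps a voxelized input grid (here assumed to encode the probe-induced surface displacement) to a dense three-channel displacement field, as would be regressed against FE-computed displacements during training. The input encoding, grid resolution, channel counts, and network depth are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal 3D U-Net sketch for displacement-field regression (illustrative only).
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3x3 convolutions with ReLU: the basic U-Net building block.
    return nn.Sequential(
        nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv3d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )

class UNet3D(nn.Module):
    def __init__(self, in_ch=3, out_ch=3, base=16):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)
        self.enc2 = conv_block(base, base * 2)
        self.bottleneck = conv_block(base * 2, base * 4)
        self.pool = nn.MaxPool3d(2)
        self.up2 = nn.ConvTranspose3d(base * 4, base * 2, kernel_size=2, stride=2)
        self.dec2 = conv_block(base * 4, base * 2)
        self.up1 = nn.ConvTranspose3d(base * 2, base, kernel_size=2, stride=2)
        self.dec1 = conv_block(base * 2, base)
        self.head = nn.Conv3d(base, out_ch, kernel_size=1)  # per-voxel (ux, uy, uz)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)  # predicted displacement field

if __name__ == "__main__":
    net = UNet3D()
    # One synthetic sample: batch of 1, 3 input channels, 32^3 grid (sizes are assumptions).
    x = torch.randn(1, 3, 32, 32, 32)
    u = net(x)                      # -> (1, 3, 32, 32, 32) displacement field
    target = torch.randn_like(u)    # FE-computed displacements would go here
    loss = nn.functional.mse_loss(u, target)
    loss.backward()
    print(u.shape, float(loss))
```

Once trained on the offline FE simulations, a single forward pass of such a network replaces the FE solve at inference time, which is what yields the reported speed-up over the biomechanical model.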