Article (Scientific journals)
Convolutional neural networks for PET functional volume fully automatic segmentation: development and validation in a multi-center setting.
Iantsen, Andrei; Da Silva Ferreira, Marta; Lucia, Francois et al.
2021, in European Journal of Nuclear Medicine and Molecular Imaging
Peer Reviewed verified by ORBi
 

Details



Keywords :
Cervical cancer; Convolutional neural network; PET; Segmentation; U-Net
Abstract :
[en] PURPOSE: In this work, we addressed the fully automatic determination of tumor functional uptake from positron emission tomography (PET) images, without relying on other imaging modalities or additional prior constraints, in the context of multicenter images with heterogeneous characteristics.

METHODS: In cervical cancer, an additional challenge is the location of the tumor uptake near, or even abutting, the bladder. PET datasets of 232 patients from five institutions were used. To avoid unreliable manual delineations, the ground truth was generated with a semi-automated approach: a volume containing the tumor and excluding the bladder was first manually determined, then the well-validated, semi-automated Fuzzy Locally Adaptive Bayesian (FLAB) algorithm was applied within it. Our model, built on the U-Net architecture, incorporates residual blocks with concurrent spatial squeeze-and-excitation modules, as well as learnable non-linear downsampling and upsampling blocks. Experiments relied on cross-validation (four institutions for training and validation, the fifth for testing).

RESULTS: The model achieved a good Dice similarity coefficient (DSC) with little variability across institutions (0.80 ± 0.03), with higher recall (0.90 ± 0.05) than precision (0.75 ± 0.05), and improved on the standard U-Net (DSC 0.77 ± 0.05, recall 0.87 ± 0.02, precision 0.74 ± 0.08). Both vastly outperformed a fixed threshold at 40% of SUVmax (DSC 0.33 ± 0.15, recall 0.52 ± 0.17, precision 0.30 ± 0.16). In all cases, the model determined the tumor uptake without including the bladder. Neither shape priors nor anatomical information was required to achieve efficient training.

CONCLUSION: The proposed method could facilitate the deployment of a fully automated radiomics pipeline in such a challenging multicenter context.
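The abstract compares the network against a fixed-threshold baseline at 40% of SUVmax and reports Dice similarity coefficient (DSC), recall, and precision. As a minimal illustrative sketch (not the authors' code; the array values are made up for demonstration), the baseline and the three overlap metrics can be written in NumPy:

```python
import numpy as np

def fixed_threshold_mask(suv, fraction=0.40):
    """Baseline segmentation: keep voxels at or above a fixed fraction of SUVmax."""
    return suv >= fraction * suv.max()

def dice_recall_precision(pred, truth):
    """Overlap metrics between a predicted and a ground-truth binary mask."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()       # true-positive voxels
    dsc = 2 * tp / (pred.sum() + truth.sum())    # Dice similarity coefficient
    recall = tp / truth.sum()                    # fraction of the tumor covered
    precision = tp / pred.sum()                  # fraction of the prediction that is tumor
    return dsc, recall, precision

# Toy 1-D "image": SUVmax = 10, so the 40% threshold is 4.0
suv = np.array([1.0, 3.0, 5.0, 10.0, 6.0, 2.0])
truth = np.array([0, 0, 0, 1, 1, 1])
pred = fixed_threshold_mask(suv)                 # [F, F, T, T, T, F]
print(dice_recall_precision(pred, truth))        # DSC = 2/3, recall = 2/3, precision = 2/3
```

A recall higher than precision, as reported for the proposed model (0.90 vs. 0.75), means the predicted volume tends to cover the whole tumor at the cost of including some surrounding uptake.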
Disciplines :
Radiology, nuclear medicine & imaging
Author, co-author :
Iantsen, Andrei
Da Silva Ferreira, Marta ;  Université de Liège - ULiège > GIGA CRC In vivo Imaging - Nuclear Medicine Division
Lucia, Francois
Jaouen, Vincent
Reinhold, Caroline
Bonaffini, Pietro
Alfieri, Joanne
Rovira, Ramon
Masson, Ingrid
Robin, Philippe
Mervoyer, Augustin
Rousseau, Caroline
Kridelka, Frédéric ;  Université de Liège - ULiège > Département des sciences cliniques > Gynécologie-Obstétrique
De Cuypere, Marjolein ;  Centre Hospitalier Universitaire de Liège - CHU > Département de gynécologie-obstétrique > Secteur oncologie
Lovinfosse, Pierre ;  Centre Hospitalier Universitaire de Liège - CHU > Département de Physique Médicale > Service médical de médecine nucléaire et imagerie onco
Pradier, Olivier
Hustinx, Roland  ;  Université de Liège - ULiège > Département des sciences cliniques > Médecine nucléaire
Schick, Ulrike
Visvikis, Dimitris
Hatt, Mathieu
Language :
English
Title :
Convolutional neural networks for PET functional volume fully automatic segmentation: development and validation in a multi-center setting.
Publication date :
2021
Journal title :
European Journal of Nuclear Medicine and Molecular Imaging
ISSN :
1619-7070
eISSN :
1619-7089
Publisher :
Springer, Germany
Peer reviewed :
Peer Reviewed verified by ORBi
Available on ORBi :
since 30 April 2021

Statistics



Scopus citations® : 18 (13 without self-citations)
OpenCitations : 8
