One-click annotation to improve segmentation by a convolutional neural network for PET images of head and neck cancer patients
Authors: Rainio, Oona; Liedes, Joonas; Murtojärvi, Sarita; Malaspina, Simona; Kemppainen, Jukka; Klén, Riku
Publisher: SPRINGER
Place of publication: Vienna
Publication year: 2024
Journal: Network Modeling Analysis in Health Informatics and Bioinformatics
Journal name in source: NETWORK MODELING AND ANALYSIS IN HEALTH INFORMATICS AND BIOINFORMATICS
Journal acronym: NETW MODEL ANAL HLTH
Article number: 47
Volume: 13
Issue: 1
Number of pages: 8
ISSN: 2192-6662
eISSN: 2192-6670
DOI: https://doi.org/10.1007/s13721-024-00483-0
Web address: https://doi.org/10.1007/s13721-024-00483-0
Research portal record: https://research.utu.fi/converis/portal/detail/Publication/457784672
A convolutional neural network (CNN) can be used to perform fully automatic tumor segmentation from the positron emission tomography (PET) images of head and neck cancer patients, but the predictions often contain false positive segmentations caused by the high concentration of the tracer substance in the human brain. A potential solution is one-click annotation, in which a user points out the location of the tumor by clicking on the image. This information can then be given either directly to a CNN or to an algorithm that corrects its predictions. In this article, we compare fully automatic segmentation with four semi-automatic approaches using 962 transaxial slices collected from the PET images of 100 head and neck cancer patients. According to our results, a semi-automatic segmentation method given information about the center of the tumor performs the best, with a median Dice score of 0.708.
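The one-click idea and the Dice metric mentioned in the abstract can be illustrated with a short sketch. The Python code below is not the authors' implementation; the function names, the nearest-centroid fallback, and the toy slice are assumptions made for illustration only. It keeps the predicted connected component indicated by the clicked tumor location (discarding, for example, a spurious brain-uptake region) and scores the result with the Dice coefficient.

import numpy as np
from scipy import ndimage

def dice_score(pred, truth):
    # Dice similarity coefficient between two binary masks.
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 2.0 * np.logical_and(pred, truth).sum() / denom if denom > 0 else 1.0

def keep_clicked_component(pred, click):
    # Keep only the connected component of the binary prediction that contains
    # the clicked point; if the click misses every component, fall back to the
    # component whose centroid is nearest to the click (an illustrative heuristic,
    # not necessarily the rule used in the study).
    labeled, n = ndimage.label(pred.astype(bool))
    if n == 0:
        return np.zeros_like(pred, dtype=bool)
    target = labeled[click]
    if target == 0:
        centroids = ndimage.center_of_mass(pred.astype(bool), labeled, range(1, n + 1))
        dists = [np.hypot(c[0] - click[0], c[1] - click[1]) for c in centroids]
        target = int(np.argmin(dists)) + 1
    return labeled == target

# Toy transaxial slice: one spurious high-uptake blob and one tumor blob.
pred = np.zeros((64, 64), dtype=bool)
pred[10:20, 10:20] = True            # false positive region (e.g. brain)
pred[40:50, 40:50] = True            # tumor region
truth = np.zeros_like(pred)
truth[40:50, 40:50] = True
cleaned = keep_clicked_component(pred, click=(45, 45))
print(dice_score(pred, truth))       # about 0.667 before click-based cleaning
print(dice_score(cleaned, truth))    # 1.0 after keeping only the clicked component

In this toy case the click-based post-processing removes the false positive blob entirely, which is the mechanism by which the semi-automatic approaches in the article can improve on the fully automatic CNN output.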
Funding:
Open Access funding provided by University of Turku (including Turku University Central Hospital). The first author was financially supported by the Finnish Culture Foundation.