Abstract
In this contribution, we propose an automatic ground truth generation approach that utilizes Positron Emission Tomography (PET) acquisitions to train neural networks for automatic urinary bladder segmentation in Computed Tomography (CT) images. We evaluated different deep learning architectures for segmenting the urinary bladder. However, deep neural networks require a large amount of training data, which is currently the main bottleneck in the medical field, because ground truth labels have to be created by medical experts on a time-consuming slice-by-slice basis. To overcome this problem, we generate the training data set from the PET data of combined PET/CT acquisitions. This can be achieved by applying simple thresholding to the PET data, in which the radiotracer accumulates very distinctly in the urinary bladder. However, the ultimate goal is to entirely skip PET imaging and its additional radiation exposure in the future, and to use only CT images for segmentation.
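The label-generation step described in the abstract, thresholding the PET channel of a co-registered PET/CT acquisition to obtain a binary bladder mask, can be illustrated with a short sketch. The following is a minimal, hypothetical illustration rather than the authors' implementation: the file names, the use of SimpleITK, and the cutoff `PET_THRESHOLD` are assumptions for illustration only (in practice the threshold would be tuned to the tracer uptake, e.g. an SUV-based cutoff).

```python
# Minimal sketch (not the authors' code): derive a binary urinary-bladder
# label mask from the PET volume of a combined PET/CT acquisition by
# simple intensity thresholding, then save it as training ground truth.
import numpy as np
import SimpleITK as sitk

PET_THRESHOLD = 10000.0  # hypothetical activity cutoff; would be tuned per data set


def pet_to_bladder_mask(pet_path: str, out_path: str) -> None:
    pet_img = sitk.ReadImage(pet_path)              # PET volume, assumed registered to the CT
    pet = sitk.GetArrayFromImage(pet_img)           # (slices, rows, cols) array of tracer activity
    mask = (pet >= PET_THRESHOLD).astype(np.uint8)  # 1 where the radiotracer accumulates strongly
    mask_img = sitk.GetImageFromArray(mask)
    mask_img.CopyInformation(pet_img)               # preserve spacing/origin so the mask overlays the CT
    sitk.WriteImage(mask_img, out_path)


# Example (hypothetical file names):
# pet_to_bladder_mask("patient01_pet.nii.gz", "patient01_bladder_mask.nii.gz")
```

Because the PET and CT volumes of a combined scanner share a frame of reference, such a mask can be resampled onto the (typically finer) CT grid and used directly as the segmentation label for training.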
Original language | English |
---|---|
DOIs | |
Publication status | Published - 2019 |
Event | 2018 IEEE Biomedical Engineering International Conference - Chiang Mai, Thailand Duration: 21 Nov 2018 → … Conference number: 11 |
Conference
Conference | 2018 IEEE Biomedical Engineering International Conference |
---|---|
Short title | BMEiCON 2018 |
Country/Territory | Thailand |
City | Chiang Mai |
Period | 21/11/18 → … |
ASJC Scopus subject areas
- Artificial intelligence
- Instrumentation
- Biomedical Engineering
Fingerprint
Explore the research topics of "PET-Train: Automatic Ground Truth Generation from PET Acquisitions for Urinary Bladder Segmentation in CT Images using Deep Learning". Together they form a unique fingerprint.
Activities
- 1 talk at a conference or symposium
- PET-Train: Automatic Ground Truth Generation from PET Acquisitions for Urinary Bladder Segmentation in CT Images using Deep Learning
  Pepe, A. (Speaker)
  2018 · Activity: Talk or presentation › Talk at conference or symposium › Science to science