pith. machine review for the scientific record.

arxiv: 1707.03237 · v3 · submitted 2017-07-11 · 💻 cs.CV

Recognition: unknown

Generalised Dice overlap as a deep learning loss function for highly unbalanced segmentations

Authors on Pith: no claims yet
classification 💻 cs.CV
keywords: function, loss, segmentation, deep-learning, dice, choice, class, imbalance
0 comments
read the original abstract

Deep-learning has proved in recent years to be a powerful tool for image analysis and is now widely used to segment both 2D and 3D medical images. Deep-learning segmentation frameworks rely not only on the choice of network architecture but also on the choice of loss function. When the segmentation process targets rare observations, a severe class imbalance is likely to occur between candidate labels, thus resulting in sub-optimal performance. In order to mitigate this issue, strategies such as the weighted cross-entropy function, the sensitivity function or the Dice loss function, have been proposed. In this work, we investigate the behavior of these loss functions and their sensitivity to learning rate tuning in the presence of different rates of label imbalance across 2D and 3D segmentation tasks. We also propose to use the class re-balancing properties of the Generalized Dice overlap, a known metric for segmentation assessment, as a robust and accurate deep-learning loss function for unbalanced tasks.
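The abstract describes using the Generalized Dice overlap as a loss, with per-class weights that re-balance rare labels. A minimal NumPy sketch of that idea, assuming the inverse-squared-volume weighting w_l = 1/(Σ_n r_ln)² from the paper's formulation (function and variable names here are illustrative, not the authors' code):

```python
import numpy as np

def generalized_dice_loss(probs, labels, eps=1e-6):
    """Generalized Dice loss for multi-class segmentation.

    probs:  (N, L) array of predicted per-voxel class probabilities.
    labels: (N, L) one-hot ground-truth labels.
    """
    # Per-class weight: inverse squared reference volume, so that
    # small (rare) classes contribute as much as large ones.
    w = 1.0 / (labels.sum(axis=0) ** 2 + eps)
    # Weighted intersection and union-like denominator over classes.
    intersect = (w * (probs * labels).sum(axis=0)).sum()
    denom = (w * (probs + labels).sum(axis=0)).sum()
    # Loss is 1 minus the generalized Dice overlap.
    return 1.0 - 2.0 * intersect / (denom + eps)
```

A perfect prediction drives the loss toward 0, while a fully wrong one yields a loss near 1; in a real framework the same expression would be written with differentiable tensor ops so gradients flow through `probs`.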

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Forward citations

Cited by 2 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Trajectory-Agnostic Asteroid Detection in TESS with Deep Learning

    astro-ph.EP 2026-05 unverdicted novelty 7.0

    A W-Net deep learning model detects asteroids in TESS data independently of trajectory by rotating training image cubes and using adaptive normalization for data scaling.

  2. Model-Agnostic Meta Learning for Class Imbalance Adaptation

    cs.CL 2026-04 conditional novelty 5.0

    HAMR combines meta-learning with hardness-aware weighting and neighborhood resampling to improve minority-class performance on imbalanced NLP datasets.