Losses
Loss functions are one of the most important aspects of neural networks, as they, along with the optimization functions, are directly responsible for fitting the model to the given training data.
The choice of the loss function is very important, since each use case and each model will likely require a different loss.
Here are some of the losses available in GDL.
Segmentation
Cross Entropy Loss (multiclass)
Also called logarithmic loss, log loss or logistic loss. Each predicted class probability is compared to the desired output for that class (0 or 1), and a score/loss is calculated that penalizes the probability based on how far it is from the actual expected value.
We are using the smp losses implementation here.
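To make the penalty concrete, here is a minimal pure-Python sketch of the log loss for a single pixel (the function name and example probabilities are illustrative, not part of the smp API):

```python
import math

def cross_entropy(probs, target_index):
    """Multiclass log loss for one pixel: the negative log of the
    probability predicted for the true class."""
    return -math.log(probs[target_index])

# A confident, correct prediction is penalized lightly;
# a confident, wrong prediction is penalized heavily.
loss_good = cross_entropy([0.7, 0.2, 0.1], 0)  # small loss
loss_bad = cross_entropy([0.1, 0.2, 0.7], 0)   # large loss
```

In the real smp implementation the loss is averaged over all pixels of the batch, with options such as class weights and an ignore index.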
Boundary Loss (multiclass)
A differentiable surrogate of a metric accounting accuracy of boundary detection.
- class losses.boundary_loss.BoundaryLoss(*args: Any, **kwargs: Any)[source]
Boundary Loss proposed in the paper Boundary Loss for Remote Sensing Imagery Semantic Segmentation from Alexey Bokhovkin et al. (https://arxiv.org/abs/1905.07852)
From: https://github.com/yiskw713/boundary_loss_for_remote_sensing
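The actual BoundaryLoss extracts boundary maps with differentiable max-pooling so it can be trained end-to-end; the non-differentiable sketch below only illustrates the boundary-F1 metric it surrogates, using a crude 4-connectivity boundary extraction (all names are illustrative):

```python
def boundary_pixels(mask):
    """Set of foreground pixels touching the background or the image
    border (4-connectivity) -- a crude boundary map."""
    h, w = len(mask), len(mask[0])
    edges = set()
    for i in range(h):
        for j in range(w):
            if not mask[i][j]:
                continue
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if not (0 <= ni < h and 0 <= nj < w) or not mask[ni][nj]:
                    edges.add((i, j))
                    break
    return edges

def boundary_f1(pred, gt):
    """F1 score between predicted and ground-truth boundary maps;
    the boundary loss is built on 1 - BF1."""
    bp, bg = boundary_pixels(pred), boundary_pixels(gt)
    if not bp or not bg:
        return 0.0
    tp = len(bp & bg)
    if tp == 0:
        return 0.0
    precision, recall = tp / len(bp), tp / len(bg)
    return 2 * precision * recall / (precision + recall)
```

A perfect prediction gives BF1 = 1 (zero loss); disjoint boundaries give BF1 = 0.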
Dice Loss (binary & multiclass)
A loss using the dice coefficient that estimates the fraction of contour length that needs correction.
For the binary and multiclass versions, the configuration calls the DiceLoss from the smp losses library.
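The core of the Dice loss is one minus the Dice coefficient, the overlap-to-size ratio between prediction and target. A minimal sketch over flattened soft predictions (the function name and epsilon handling are illustrative, not the smp implementation):

```python
def dice_loss(pred, target, eps=1e-7):
    """1 - Dice coefficient: 2*|intersection| / (|pred| + |target|),
    with eps guarding against empty masks."""
    inter = sum(p * t for p, t in zip(pred, target))
    total = sum(pred) + sum(target)
    return 1.0 - (2.0 * inter + eps) / (total + eps)
```

Perfect overlap yields a loss near 0; no overlap yields a loss near 1.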
Focal Loss (multiclass)
The focal loss focuses training on a sparse set of hard examples and prevents the vast number of easy negatives from overwhelming the detector during training.
The configuration calls the FocalLoss from the smp losses library.
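The down-weighting of easy examples comes from the modulating factor (1 - p)^gamma applied to the log loss. A single-pixel sketch (illustrative, not the smp code):

```python
import math

def focal_loss(p_true, gamma=2.0):
    """Focal loss for one pixel: (1 - p)^gamma shrinks the loss of
    well-classified pixels (p near 1), so hard pixels dominate."""
    return -((1.0 - p_true) ** gamma) * math.log(p_true)
```

With gamma = 0 the focal loss reduces to plain cross entropy; larger gamma suppresses easy examples more aggressively.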
Lovasz-Softmax Loss (binary & multiclass)
A tractable surrogate for the optimization of the intersection-over-union measure in neural networks.
For the binary and multiclass versions, the configuration calls the LovaszLoss from the smp losses library.
Ohem Loss (multiclass)
A loss computed over the hard pixels only, where the hard pixels are defined as those whose predicted probability for the correct class falls below a certain threshold.
- class losses.ohem_loss.OhemCrossEntropy2d(*args: Any, **kwargs: Any)[source]
Adapted version of the Ohem Cross Entropy loss from OCNet repository (https://github.com/PkuRainBow/OCNet).
- __init__(thresh=0.6, min_kept=0, weight=None, ignore_index=255)[source]
Initialize the Ohem Cross Entropy loss.
- Parameters:
thresh (float, optional) – probability threshold applied to the model prediction. Defaults to 0.6.
min_kept (int, optional) – minimum number of hard pixels to keep for the loss computation. Defaults to 0.
weight (Tensor, optional) – a manual rescaling weight given to each class. Defaults to None.
ignore_index (int, optional) – target value that is ignored and does not contribute to the input gradient. Defaults to 255.
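The hard-pixel selection can be sketched as follows (function and variable names are illustrative; the real loss then computes cross entropy over the selected pixels only):

```python
def ohem_select(probs_true, thresh=0.6, min_kept=0):
    """Indices of 'hard' pixels: probability of the correct class
    below `thresh`. If fewer than `min_kept` qualify, keep the
    `min_kept` lowest-probability pixels instead."""
    hard = [i for i, p in enumerate(probs_true) if p < thresh]
    if len(hard) < min_kept:
        order = sorted(range(len(probs_true)),
                       key=lambda i: probs_true[i])
        hard = order[:min_kept]
    return hard
```

This keeps training focused on the pixels the model currently gets wrong, instead of averaging their signal away among the easy ones.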
Softbce Loss (binary)
Drop-in replacement for torch.nn.BCEWithLogitsLoss with a few additions: ignore_index and label_smoothing.
We are using the segmentation models pytorch implementation here.
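A single-pixel sketch of what the two additions do, assuming the common label-smoothing scheme where a hard target of 1 becomes 1 - s and a hard target of 0 becomes s (illustrative code, not the smp implementation):

```python
import math

def soft_bce_with_logits(logit, target, label_smoothing=0.0,
                         ignore_index=None):
    """Binary cross entropy on one logit with smoothed targets.
    Pixels equal to ignore_index are skipped (returns None here)."""
    if ignore_index is not None and target == ignore_index:
        return None
    # Soften the hard 0/1 target.
    t = target * (1.0 - label_smoothing) + (1.0 - target) * label_smoothing
    p = 1.0 / (1.0 + math.exp(-logit))  # sigmoid
    return -(t * math.log(p) + (1.0 - t) * math.log(1.0 - p))
```

With label_smoothing = 0 and no ignored pixels this reduces to the plain BCE-with-logits loss.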
Duo Loss (multiclass)
This loss is a combination of the losses.lovasz_loss.LovaszSoftmax() and losses.boundary_loss.BoundaryLoss() losses.
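A minimal sketch of such a combiner, summing the member losses (the real Duo loss wraps the two PyTorch modules named above; the class and callables here are illustrative):

```python
class DuoLoss:
    """Combine several loss callables by summing their values,
    mirroring the Lovasz-Softmax + Boundary pairing."""
    def __init__(self, *criteria):
        self.criteria = criteria

    def __call__(self, pred, target):
        return sum(c(pred, target) for c in self.criteria)
```

Pairing a region-overlap loss with a boundary loss lets the model be penalized both for misclassified areas and for imprecise object contours.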