Losses

Loss functions are one of the most important aspects of neural networks, as they, along with the optimization functions, are directly responsible for fitting the model to the given training data.

The choice of the loss function is very important, since each use case and each model will likely require a different loss.

Here are some of the losses available in GDL.

Segmentation

Cross Entropy Loss (multiclass)

Also called logarithmic loss, log loss or logistic loss. Each predicted class probability is compared to the desired output for that class (0 or 1), and a score/loss is calculated that penalizes the probability based on how far it is from the actual expected value.

We are using the smp (segmentation_models_pytorch) losses implementation here.
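
A minimal usage sketch is shown below. It assumes the SoftCrossEntropyLoss class from smp; the exact class wired up by the GDL configuration may differ, and the tensor shapes are illustrative.

import torch
from segmentation_models_pytorch.losses import SoftCrossEntropyLoss

criterion = SoftCrossEntropyLoss(smooth_factor=0.1)        # optional label smoothing
logits = torch.randn(2, 4, 64, 64, requires_grad=True)     # model output (N, C, H, W), before softmax
labels = torch.randint(0, 4, (2, 64, 64))                   # ground truth class indices (N, H, W)
loss = criterion(logits, labels)                            # scalar loss tensor
loss.backward()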

Boundary Loss (multiclass)

A differentiable surrogate of a metric that accounts for the accuracy of boundary detection.

class losses.boundary_loss.BoundaryLoss(*args: Any, **kwargs: Any)[source]

Boundary Loss proposed in the paper Boundary Loss for Remote Sensing Imagery Semantic Segmentation from Alexey Bokhovkin et al. (https://arxiv.org/abs/1905.07852)

From: https://github.com/yiskw713/boundary_loss_for_remote_sensing

__init__(theta0=19, theta=19, ignore_index=None)[source]

Initialize the boundary loss.

Parameters:
  • theta0 (int, optional) – size of the sliding window. Defaults to 19.

  • theta (int, optional) – predefined threshold on the distance. Defaults to 19.

  • ignore_index (int, optional) – index to be ignored during training. Defaults to None.

forward(pred, gt)[source]

Forward function used during training.

Parameters:
  • pred (Tensor) – the output from model (before softmax), shape (N, C, H, W).

  • gt (Tensor) – ground truth, shape (N, H, W).

Returns:

Boundary loss score, averaged over the mini-batch.

Return type:

Tensor
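
Below is a minimal usage sketch following the signatures documented above; the import path assumes GDL's losses package is importable, and the tensor shapes are illustrative.

import torch
from losses.boundary_loss import BoundaryLoss

criterion = BoundaryLoss(theta0=19, theta=19)
pred = torch.randn(2, 4, 64, 64, requires_grad=True)   # model output (N, C, H, W), before softmax
gt = torch.randint(0, 4, (2, 64, 64))                   # ground truth (N, H, W)
loss = criterion(pred, gt)                              # boundary loss, averaged over the mini-batch
loss.backward()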

Dice Loss (binary & multiclass)

A loss based on the Dice coefficient, which estimates the fraction of the contour length that needs correction.

For both the binary and multiclass versions, the configuration calls DiceLoss from the smp losses library.
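
A minimal sketch of both modes, using DiceLoss from smp as stated above; shapes and parameters are illustrative.

import torch
from segmentation_models_pytorch.losses import DiceLoss

# Multiclass: logits (N, C, H, W) and integer labels (N, H, W).
criterion = DiceLoss(mode="multiclass", from_logits=True)
logits = torch.randn(2, 4, 64, 64, requires_grad=True)
labels = torch.randint(0, 4, (2, 64, 64))
loss = criterion(logits, labels)

# Binary: single-channel logits and a {0, 1} mask of the same shape.
bin_criterion = DiceLoss(mode="binary", from_logits=True)
bin_logits = torch.randn(2, 1, 64, 64, requires_grad=True)
bin_mask = torch.randint(0, 2, (2, 1, 64, 64)).float()
bin_loss = bin_criterion(bin_logits, bin_mask)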

Focal Loss (multiclass)

The focal loss focuses training on a sparse set of hard examples and prevents the vast number of easy negatives from overwhelming the detector during training.

The configuration calls FocalLoss from the smp losses library.
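
A minimal sketch using FocalLoss from smp as stated above; gamma and the tensor shapes are illustrative.

import torch
from segmentation_models_pytorch.losses import FocalLoss

criterion = FocalLoss(mode="multiclass", gamma=2.0)       # gamma down-weights easy examples
logits = torch.randn(2, 4, 64, 64, requires_grad=True)    # model output (N, C, H, W)
labels = torch.randint(0, 4, (2, 64, 64))                  # ground truth class indices (N, H, W)
loss = criterion(logits, labels)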

Lovasz-Softmax Loss (binary & multiclass)

A tractable surrogate for the optimization of the intersection-over-union measure in neural networks.

For both the binary and multiclass versions, the configuration calls LovaszLoss from the smp losses library.
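
A minimal sketch using LovaszLoss from smp as stated above; for the binary version, mode="binary" with a single-channel prediction and a {0, 1} mask is expected. Shapes are illustrative.

import torch
from segmentation_models_pytorch.losses import LovaszLoss

criterion = LovaszLoss(mode="multiclass")                 # or mode="binary"
logits = torch.randn(2, 4, 64, 64, requires_grad=True)    # model output (N, C, H, W)
labels = torch.randint(0, 4, (2, 64, 64))                  # ground truth class indices (N, H, W)
loss = criterion(logits, labels)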

Ohem Loss (multiclass)

A loss that concentrates training on hard pixels, defined as the pixels whose predicted probability for the correct class is smaller than a certain threshold.

class losses.ohem_loss.OhemCrossEntropy2d(*args: Any, **kwargs: Any)[source]

Adapted version of the Ohem Cross Entropy loss from OCNet repository (https://github.com/PkuRainBow/OCNet).

__init__(thresh=0.6, min_kept=0, weight=None, ignore_index=255)[source]

Initialize the Ohem Cross Entropy loss.

Parameters:
  • thresh (float, optional) – probability threshold applied to the model predictions. Defaults to 0.6.

  • min_kept (int, optional) – minimum number of pixels to keep when applying the threshold. Defaults to 0.

  • weight (Tensor, optional) – a manual rescaling weight given to each class. Defaults to None.

  • ignore_index (int, optional) – target value that is ignored and does not contribute to the input gradient. Defaults to 255.

forward(predict, target)[source]

Forward function used during training.

Parameters:
  • predict (Tensor) – the output from model, shape (N, C, H, W).

  • target (Tensor) – ground truth, shape (N, H, W).

Returns:

Ohem loss score.

Return type:

Tensor
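
Below is a minimal usage sketch following the signatures documented above; the import path assumes GDL's losses package is importable, and the shapes are illustrative.

import torch
from losses.ohem_loss import OhemCrossEntropy2d

criterion = OhemCrossEntropy2d(thresh=0.6, min_kept=0, ignore_index=255)
predict = torch.randn(2, 4, 64, 64, requires_grad=True)   # model output (N, C, H, W)
target = torch.randint(0, 4, (2, 64, 64))                   # ground truth (N, H, W)
loss = criterion(predict, target)                           # Ohem loss score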

SoftBCE Loss (binary)

A drop-in replacement for torch.nn.BCEWithLogitsLoss with a few additions: ignore_index and label_smoothing.

We are using the segmentation_models_pytorch (smp) implementation here.
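
A minimal sketch using SoftBCEWithLogitsLoss from smp; smooth_factor and the tensor shapes are illustrative.

import torch
from segmentation_models_pytorch.losses import SoftBCEWithLogitsLoss

criterion = SoftBCEWithLogitsLoss(smooth_factor=0.1)       # label smoothing; ignore_index is also available
logits = torch.randn(2, 1, 64, 64, requires_grad=True)     # single-channel logits (N, 1, H, W)
target = torch.randint(0, 2, (2, 1, 64, 64)).float()        # binary mask, same shape as the logits
loss = criterion(logits, target)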

Duo Loss (multiclass)

This loss is a combination of the losses.lovasz_loss.LovaszSoftmax() and losses.boundary_loss.BoundaryLoss().

class losses.duo_loss.DuoLoss(*args: Any, **kwargs: Any)[source]

Implementation of a loss combining the Lovasz loss and the boundary loss.

__init__(**kwargs)[source]

Initialize the two losses.

forward(preds, labels)[source]

Forward function used during training.

Parameters:
  • preds (Tensor) – the output from model.

  • labels (Tensor) – ground truth.

Returns:

duo loss score.

Return type:

Tensor
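
Below is a minimal usage sketch; the import path assumes GDL's losses package is importable, and the shapes follow those of the combined Lovasz and boundary losses (illustrative).

import torch
from losses.duo_loss import DuoLoss

criterion = DuoLoss()
preds = torch.randn(2, 4, 64, 64, requires_grad=True)   # model output (N, C, H, W), before softmax
labels = torch.randint(0, 4, (2, 64, 64))                # ground truth (N, H, W)
loss = criterion(preds, labels)                          # Duo loss score
loss.backward()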