Loss Functions in Deep Learning

A loss function is a method of evaluating how well your algorithm models your dataset.

Why is the loss function important?

You can't improve what you can't measure.

The loss function quantifies your model's error, giving training a concrete number to minimize.

What are the types of loss functions?

  • Regression

    1. MSE (Mean Squared Error)

    2. MAE (Mean Absolute Error)

    3. Huber loss

  • Classification

    1. Binary cross-entropy

    2. Categorical cross-entropy

    3. Hinge loss

  • Autoencoders

    1. KL divergence

  • GAN

    1. Discriminator loss

    2. Minimax GAN loss

  • Object detection

    1. Focal loss

  • Embedding

    1. Triplet loss

Loss Function vs. Cost Function?

A loss function is calculated on a single training example.

A cost function is calculated over the entire training dataset, typically as the average of the per-example losses.

Regression:

Mean Squared Error (MSE):

$$L = (y_i - \hat{y_i})^2$$

$$C = \frac{1}{n}\sum_{i=1}^{n} (y_i - \hat{y_i})^2$$
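As a quick illustration, here is a minimal NumPy sketch of the MSE cost; the function name and the sample values are my own, purely for illustration:

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error: average of the squared residuals over the batch."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean((y_true - y_pred) ** 2)

print(mse([3.0, 5.0, 2.5], [2.5, 5.0, 4.0]))  # (0.25 + 0 + 2.25) / 3 ≈ 0.8333
```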

Mean Absolute Error (MAE):

$$L = |y_i - \hat{y_i}|$$

$$ C = \frac{1}{n}\sum_{i=1}^{n} |y_i - \hat{y_i}|$$
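The same sketch with the squared residual swapped for an absolute one gives MAE (again, names and values are illustrative):

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean absolute error: average of the absolute residuals over the batch."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean(np.abs(y_true - y_pred))

print(mae([3.0, 5.0, 2.5], [2.5, 5.0, 4.0]))  # (0.5 + 0 + 1.5) / 3 ≈ 0.6667
```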

Huber Loss:

$$L_{\delta} = \begin{cases} \frac{1}{2}(y - \hat{y})^{2} & \text{if } |y - \hat{y}| \le \delta \\ \delta\left(|y - \hat{y}| - \frac{1}{2}\delta\right) & \text{otherwise} \end{cases}$$
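Huber loss behaves like MSE for small residuals and like MAE for large ones, which is what makes it robust to outliers. A minimal NumPy sketch, where the function name and the default δ = 1.0 are assumptions for illustration:

```python
import numpy as np

def huber(y_true, y_pred, delta=1.0):
    """Huber loss: quadratic for small residuals, linear for large ones."""
    residual = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    squared = 0.5 * residual ** 2                      # MSE-like branch
    linear = delta * (np.abs(residual) - 0.5 * delta)  # MAE-like branch
    return np.mean(np.where(np.abs(residual) <= delta, squared, linear))
```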

Classification:

Binary Cross Entropy:

$$L = -\left(y_i\log(\hat{y}_i) + (1 - y_i)\log(1 - \hat{y}_i)\right)$$

$$C = -\frac{1}{n}\sum_{i=1}^{n}\left(y_i\log(\hat{y}_i) + (1 - y_i)\log(1 - \hat{y}_i)\right)$$
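In code, the only practical addition to the formula is clipping predictions away from 0 and 1 so the log never sees zero. This sketch assumes NumPy; names are illustrative:

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Binary cross-entropy averaged over the batch.

    y_true: 0/1 labels; y_pred: predicted probabilities.
    Predictions are clipped to avoid log(0), a detail the formula omits.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.clip(np.asarray(y_pred, dtype=float), eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
```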

Categorical Cross Entropy:

$$L = -\sum_{j=1}^{k} y_j\log(\hat{y}_j)$$

where k is the number of classes and $y_j$ is 1 for the true class and 0 otherwise (one-hot encoding).

$$C = -\frac{1}{n}\sum_{i=1}^{n}\sum_{j=1}^{k} y_{ij}\log(\hat{y}_{ij})$$
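A vectorized NumPy sketch of that cost over a batch of one-hot labels (shapes and names are illustrative):

```python
import numpy as np

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    """Categorical cross-entropy for one-hot labels, averaged over the batch.

    y_true: (n, k) one-hot labels; y_pred: (n, k) predicted probabilities.
    """
    y_pred = np.clip(np.asarray(y_pred, dtype=float), eps, 1.0)
    per_example = -np.sum(np.asarray(y_true, dtype=float) * np.log(y_pred), axis=1)
    return np.mean(per_example)
```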

Note:

  • If you are working with regression, use MSE.

  • If the regression data has outliers, use MAE (or Huber loss, which blends the two).

  • If you are working with binary classification, use BCE.

  • If you are working with multi-class classification and few categories, use CCE.

  • If you are working with multi-class classification and many categories, use sparse categorical cross-entropy, which takes integer class labels instead of one-hot vectors (see the sketch after this list).
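Sparse categorical cross-entropy computes the same quantity as CCE but indexes with integer labels, so you never materialize large one-hot vectors. A minimal NumPy sketch (function name and sample values are illustrative):

```python
import numpy as np

def sparse_categorical_cross_entropy(labels, y_pred, eps=1e-12):
    """Same loss as CCE, but labels are integer class indices, not one-hot."""
    y_pred = np.clip(np.asarray(y_pred, dtype=float), eps, 1.0)
    labels = np.asarray(labels)
    n = y_pred.shape[0]
    # Pick the predicted probability of the true class for each example.
    return -np.mean(np.log(y_pred[np.arange(n), labels]))

probs = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])
print(sparse_categorical_cross_entropy([0, 1], probs))  # ≈ 0.2899
```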
