Formula:

loss <- y_true * log(y_true / y_pred)

y_true and y_pred are expected to be probability distributions, with values between 0 and 1. They will get clipped to the [0, 1] range.
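The per-sample loss can be checked in plain R without Keras. The sketch below mimics the clipping step with pmin()/pmax() and sums the elementwise terms over the last axis; the epsilon guard against log(0) is an assumption modeled on Keras' internal fuzz factor, not the library implementation itself.

```r
# Manual sketch of the per-sample KL divergence (assumed clipping
# behaviour; not the actual keras3 implementation).
kl_div <- function(y_true, y_pred, eps = 1e-7) {
  y_true <- pmin(pmax(y_true, eps), 1)    # clip to [eps, 1]
  y_pred <- pmin(pmax(y_pred, eps), 1)
  rowSums(y_true * log(y_true / y_pred))  # sum over the last axis
}

y_true <- matrix(c(0.9, 0.1, 0.2, 0.8), nrow = 2, byrow = TRUE)
y_pred <- matrix(c(0.6, 0.4, 0.3, 0.7), nrow = 2, byrow = TRUE)
kl_div(y_true, y_pred)  # per-sample losses, approx. 0.2263 and 0.0257
```

Note that the result has one value per batch row: the last axis is summed away, which is why the returned shape drops the final dimension.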

Usage

loss_kl_divergence(
  y_true,
  y_pred,
  ...,
  reduction = "sum_over_batch_size",
  name = "kl_divergence"
)

Arguments

y_true

Tensor of true targets.

y_pred

Tensor of predicted targets.

...

For forward/backward compatibility.

reduction

Type of reduction to apply to the loss. In almost all cases this should be "sum_over_batch_size". Supported options are "sum", "sum_over_batch_size", or NULL (return the unreduced per-sample losses).

name

Optional name for the loss instance.
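The reduction modes above can be sketched in base R. This is an illustration of the assumed semantics only (the real reduction happens inside Keras): "sum" totals the per-sample losses, "sum_over_batch_size" divides that total by the batch size, and NULL returns the losses unreduced.

```r
# Sketch of the reduction options (assumed semantics; not the library code).
reduce_loss <- function(values, reduction = "sum_over_batch_size") {
  if (is.null(reduction)) return(values)  # NULL: per-sample losses, unreduced
  switch(reduction,
    sum                 = sum(values),                   # total over the batch
    sum_over_batch_size = sum(values) / length(values))  # i.e. the batch mean
}

per_sample <- c(0.8, 0.2, 0.5)  # hypothetical per-sample KL losses
reduce_loss(per_sample, "sum")  # 1.5
reduce_loss(per_sample)         # 0.5
```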

Value

KL divergence loss values with shape = [batch_size, d0, ..., dN-1].

Examples

y_true <- random_uniform(c(2, 3), 0, 2)
y_pred <- random_uniform(c(2, 3))
loss <- loss_kl_divergence(y_true, y_pred)
loss

## tf.Tensor([3.5312676 0.2128672], shape=(2,), dtype=float32)