Denote this distribution (self, i.e. the distribution argument) by P and the other distribution by Q. Assuming P and Q are absolutely continuous with respect to one another and admit densities p(x) dr(x) and q(x) dr(x) with respect to a common dominating measure r, the (Shannon) cross entropy is defined as:

H[P, Q] = E_p[-log q(X)] = -∫_F p(x) log q(x) dr(x)

where F denotes the support of the random variable X ~ P.
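As a quick numerical sanity check, cross entropy can equivalently be written as H[P, Q] = H[P] + KL(P || Q). The sketch below assumes the tfd_entropy() and tfd_kl_divergence() helpers from this package and verifies that the two computations agree for a pair of normals:

library(tfprobability)

p <- tfd_normal(loc = 0, scale = 1)
q <- tfd_normal(loc = 1, scale = 2)

# both lines should print (approximately) the same scalar tensor
tfd_cross_entropy(p, q)
tfd_entropy(p) + tfd_kl_divergence(p, q)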

tfd_cross_entropy(distribution, other, name = "cross_entropy")

Arguments

distribution

The distribution whose cross entropy is computed (P above).

other

A tfp$distributions$Distribution instance (Q above).

name

String prepended to names of ops created by this function.

Value

cross_entropy: a Tensor of the distribution's dtype with shape [B1, ..., Bn], representing n different calculations of (Shannon) cross entropy.
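As an illustration of the batch shape (a minimal sketch; the batch dimensions come from how the distributions are constructed, not from this function), passing batched distributions yields one cross-entropy value per batch member:

p <- tfd_normal(loc = c(0, 1, 2), scale = 1)  # batch of 3 distributions
q <- tfd_normal(loc = c(1, 1, 1), scale = 2)
p %>% tfd_cross_entropy(q)  # Tensor of shape (3,), one value per batch member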

Examples

library(tfprobability)

d1 <- tfd_normal(loc = 1, scale = 1)
d2 <- tfd_normal(loc = 2, scale = 1)
d1 %>% tfd_cross_entropy(d2)
#> tf.Tensor(1.9189385, shape=(), dtype=float32)
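The value above can be checked against the closed form for the cross entropy of two univariate normals, H[P, Q] = 0.5 log(2 pi sigma_q^2) + (sigma_p^2 + (mu_p - mu_q)^2) / (2 sigma_q^2); the check below is plain R arithmetic and does not depend on this package:

mu_p <- 1; sigma_p <- 1
mu_q <- 2; sigma_q <- 1
0.5 * log(2 * pi * sigma_q^2) + (sigma_p^2 + (mu_p - mu_q)^2) / (2 * sigma_q^2)
# 0.5 * log(2 * pi) + 1, approximately 1.9189385, matching the tensor value above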