Denote this distribution (self) by P and the other distribution by Q.
Assuming P and Q are absolutely continuous with respect to one another and admit densities
p(x) dr(x) and q(x) dr(x), the (Shannon) cross entropy is defined as:

H[P, Q] = E_p[-log q(X)] = -∫_F p(x) log q(x) dr(x)

where F denotes the support of the random variable X ~ P.
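The definition above can be sanity-checked in base R by numerical integration, without tfprobability. This sketch computes H[P, Q] for P = N(1, 1) and Q = N(2, 1) (the same distributions used in the example at the bottom of this page); the function names `p` and `log_q` are illustrative, not part of the package API:

```r
# H[P, Q] = -integral of p(x) * log q(x) dx, with P = N(1, 1), Q = N(2, 1).
p     <- function(x) dnorm(x, mean = 1, sd = 1)
log_q <- function(x) dnorm(x, mean = 2, sd = 1, log = TRUE)

# Integrate over a wide interval; p(x) is negligible outside it.
h <- integrate(function(x) -p(x) * log_q(x), lower = -20, upper = 20)$value
h  # approximately 1.9189385, matching 0.5 * log(2 * pi) + 1
```

The result agrees with the closed form for two normals with unit scale, 0.5 * log(2πσ_q²) + (σ_p² + (μ_p − μ_q)²) / (2σ_q²).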
```r
tfd_cross_entropy(distribution, other, name = "cross_entropy")
```
| Argument | Description |
|---|---|
| distribution | The distribution being used. |
| other | The other distribution (denoted Q above). |
| name | String prepended to names of ops created by this function. |
cross_entropy: a `self.dtype` Tensor with shape `[B1, ..., Bn]`, representing `n` different calculations of (Shannon) cross entropy.
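The batch semantics of the returned Tensor can be illustrated with the closed-form normal–normal cross entropy in base R. The helper `cross_entropy_normal` below is hypothetical (not part of tfprobability); it mirrors how a batch of distributions yields one cross-entropy value per batch member:

```r
# Closed-form H[N(mu_p, s_p), N(mu_q, s_q)], vectorized over batch members,
# analogous to the shape [B1, ..., Bn] Tensor returned by tfd_cross_entropy.
cross_entropy_normal <- function(mu_p, s_p, mu_q, s_q) {
  0.5 * log(2 * pi * s_q^2) + (s_p^2 + (mu_p - mu_q)^2) / (2 * s_q^2)
}

# A batch of two P distributions against two Q distributions.
ce <- cross_entropy_normal(mu_p = c(1, 1), s_p = 1, mu_q = c(2, 3), s_q = 1)
ce  # two values, one cross entropy per batch member
```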
Other distribution methods:
tfd_cdf(),
tfd_covariance(),
tfd_entropy(),
tfd_kl_divergence(),
tfd_log_cdf(),
tfd_log_prob(),
tfd_log_survival_function(),
tfd_mean(),
tfd_mode(),
tfd_prob(),
tfd_quantile(),
tfd_sample(),
tfd_stddev(),
tfd_survival_function(),
tfd_variance()
```r
d1 <- tfd_normal(loc = 1, scale = 1)
d2 <- tfd_normal(loc = 2, scale = 1)
d1 %>% tfd_cross_entropy(d2)
#> tf.Tensor(1.9189385, shape=(), dtype=float32)
```