A Csiszar-function is a member of F = { f:R_+ to R : f convex }.

vi_kl_reverse(logu, self_normalized = FALSE, name = NULL)

Arguments

logu

float-like Tensor representing log(u), where u is the argument at which the Csiszar-function is evaluated (see Details).

self_normalized

logical indicating whether f'(u=1)=0. When f'(u=1)=0 the implied Csiszar f-Divergence remains non-negative even when p, q are unnormalized measures.

name

name prefixed to Ops created by this function.

Value

kl_reverse_of_u: float-like Tensor of the Csiszar-function evaluated at u = exp(logu).

Details

When self_normalized = TRUE, the KL-reverse Csiszar-function is f(u) = -log(u) + (u - 1). When self_normalized = FALSE, the (u - 1) term is omitted. Observe that as an f-Divergence this Csiszar-function implies D_f[p, q] = KL[q, p]: since D_f[p, q] = E_{x~q}[f(p(x)/q(x))], taking f(u) = -log(u) yields E_{x~q}[log(q(x)/p(x))] = KL[q, p] (the (u - 1) term contributes E_{x~q}[p(x)/q(x) - 1] = 0 when p is normalized).
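
As a plain-R illustration of the two cases (ordinary numerics stand in for Tensors; the helper name csiszar_kl_reverse is hypothetical, not part of any package):

# Hypothetical helper: evaluates the KL-reverse Csiszar-function at u = exp(logu)
csiszar_kl_reverse <- function(logu, self_normalized = FALSE) {
  if (self_normalized) {
    # f(u) = -log(u) + (u - 1); expm1(logu) computes exp(logu) - 1 stably
    -logu + expm1(logu)
  } else {
    # f(u) = -log(u)
    -logu
  }
}

csiszar_kl_reverse(log(2), self_normalized = TRUE)  # -log(2) + (2 - 1) = 0.3069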

The KL is "reverse" because in maximum likelihood we think of minimizing over q, as in KL[p, q]; here the arguments are swapped.

Warning: when self_normalized = TRUE this function makes non-log-space calculations and may therefore be numerically unstable for |logu| >> 0.
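
A minimal usage sketch, assuming the tensorflow and tfprobability R packages are installed with a working TensorFlow backend:

library(tensorflow)
library(tfprobability)

logu <- tf$constant(c(-1, 0, 1))
vi_kl_reverse(logu)                          # equals -logu
vi_kl_reverse(logu, self_normalized = TRUE)  # equals -logu + expm1(logu)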
