A Csiszar-function is a member of F = { f:R_+ to R : f convex }.
vi_modified_gan(logu, self_normalized = FALSE, name = NULL)
logu | float-like Tensor representing log(u); the Csiszar-function is evaluated at u = exp(logu). |
---|---|
self_normalized | logical; when TRUE the 0.5 (u - 1) term described below is included, when FALSE it is omitted. |
name | name prefixed to Ops created by this function. |
jensen_shannon_of_u, a float-like Tensor of the Csiszar-function evaluated at u = exp(logu).
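A minimal usage sketch (assuming the tensorflow and tfprobability R packages are installed with an eager TensorFlow backend; the input values and variable names are illustrative only):

library(tensorflow)
library(tfprobability)

# log(u) values at which to evaluate the Csiszar-function.
logu <- tf$constant(c(-1, 0, 1), dtype = tf$float32)

# Default self_normalized = FALSE: the 0.5 (u - 1) term is omitted.
f_of_u <- vi_modified_gan(logu)
as.numeric(f_of_u)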
When self_normalized = TRUE the modified-GAN (Generative/Adversarial Network) Csiszar-function is:

f(u) = log(1 + u) - log(u) + 0.5 (u - 1)

When self_normalized = FALSE the 0.5 (u - 1) term is omitted.
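As a sketch of the formula (assuming eager execution; variable names are illustrative), the self-normalized value can be checked against a hand-computed f(u) = log(1 + u) - log(u) + 0.5 (u - 1):

library(tensorflow)
library(tfprobability)

logu <- tf$constant(c(-2, -0.5, 0.5, 2), dtype = tf$float32)
u <- tf$exp(logu)

# Self-normalized variant documented above.
f_lib <- vi_modified_gan(logu, self_normalized = TRUE)

# Hand-computed: f(u) = log(1 + u) - log(u) + 0.5 (u - 1), with log(u) = logu.
f_manual <- tf$math$log(1 + u) - logu + 0.5 * (u - 1)

# For moderate |logu| the two agree up to floating-point error.
max(abs(as.numeric(f_lib) - as.numeric(f_manual)))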
The unmodified GAN Csiszar-function is identical to Jensen-Shannon (with self_normalized = FALSE).
Warning: this function makes non-log-space calculations and may therefore be numerically unstable for |logu| >> 0.
Other vi-functions: vi_amari_alpha(), vi_arithmetic_geometric(), vi_chi_square(), vi_csiszar_vimco(), vi_dual_csiszar_function(), vi_fit_surrogate_posterior(), vi_jeffreys(), vi_jensen_shannon(), vi_kl_forward(), vi_kl_reverse(), vi_log1p_abs(), vi_monte_carlo_variational_loss(), vi_pearson(), vi_squared_hellinger(), vi_symmetrized_csiszar_function()