This layer uses variational inference to fit a "surrogate" posterior to the
distribution over both the kernel matrix and the bias terms, which are
otherwise used in a manner similar to layer_dense().
This layer fits the "weights posterior" according to the following generative
process:
```
[K, b] ~ Prior()
M = matmul(X, K) + b
Y ~ Likelihood(M)
```
```
layer_dense_variational(
  object,
  units,
  make_posterior_fn,
  make_prior_fn,
  kl_weight = NULL,
  kl_use_exact = FALSE,
  activation = NULL,
  use_bias = TRUE,
  ...
)
```
| object | Model or layer object |
|---|---|
| units | Positive integer, dimensionality of the output space. |
| make_posterior_fn | Function taking `tf$size(kernel)`, `tf$size(bias)` and `dtype` and returning another callable which takes an input and produces a distribution instance: the surrogate posterior over the weights. |
| make_prior_fn | Function taking `tf$size(kernel)`, `tf$size(bias)` and `dtype` and returning another callable which takes an input and produces a distribution instance: the prior over the weights. |
| kl_weight | Amount by which to scale the KL divergence loss between prior and posterior (typically `1 / number_of_training_examples`). |
| kl_use_exact | Logical indicating that the analytical KL divergence should be used rather than a Monte Carlo approximation. |
| activation | An activation function. See `keras::layer_dense()`. |
| use_bias | Whether or not the dense layers constructed in this layer should have a bias term. See `keras::layer_dense()`. |
| ... | Additional keyword arguments passed to the `keras::layer_dense()` constructed by this layer. |
a Keras layer
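A minimal sketch of how the two constructor arguments are typically supplied, adapted from the TensorFlow Probability examples. The helper names `posterior_mean_field` and `prior_trainable`, and the `n_obs` weighting, are illustrative choices, not part of this layer's API:

```r
library(keras)
library(tfprobability)

# Surrogate posterior: a trainable mean-field normal over all weights.
# Receives tf$size(kernel), tf$size(bias) and dtype, returns a model
# whose output is a distribution instance.
posterior_mean_field <- function(kernel_size, bias_size = 0, dtype = NULL) {
  n <- kernel_size + bias_size
  keras_model_sequential() %>%
    layer_variable(2 * n, dtype = dtype) %>%
    layer_distribution_lambda(function(t) {
      tfd_independent(
        tfd_normal(
          loc = t[1:n],
          scale = 1e-5 + tf$nn$softplus(t[(n + 1):(2 * n)])
        ),
        reinterpreted_batch_ndims = 1
      )
    })
}

# Prior: a normal with trainable location and fixed unit scale.
prior_trainable <- function(kernel_size, bias_size = 0, dtype = NULL) {
  n <- kernel_size + bias_size
  keras_model_sequential() %>%
    layer_variable(n, dtype = dtype) %>%
    layer_distribution_lambda(function(t) {
      tfd_independent(
        tfd_normal(loc = t, scale = 1),
        reinterpreted_batch_ndims = 1
      )
    })
}

n_obs <- 100  # illustrative: number of training examples
model <- keras_model_sequential() %>%
  layer_dense_variational(
    units = 1,
    make_posterior_fn = posterior_mean_field,
    make_prior_fn = prior_trainable,
    kl_weight = 1 / n_obs
  )
```

Scaling `kl_weight` by the reciprocal of the dataset size makes the per-batch KL penalty match the evidence lower bound when the likelihood term is averaged over examples.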
Other layers:
layer_autoregressive(),
layer_conv_1d_flipout(),
layer_conv_1d_reparameterization(),
layer_conv_2d_flipout(),
layer_conv_2d_reparameterization(),
layer_conv_3d_flipout(),
layer_conv_3d_reparameterization(),
layer_dense_flipout(),
layer_dense_local_reparameterization(),
layer_dense_reparameterization(),
layer_variable()