Average the posterior samples of the beta parameter to obtain its variational Bayes (VB) posterior mean. This mean is what produces predicted compositions when calling predict
on an lnm model.
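Conceptually, this is just an elementwise average over posterior draws. A minimal sketch of the idea, assuming the draws were extracted into a 3-d array beta_draws (a hypothetical name, not part of this package's API) with dimensions (sample, predictor, outcome):

# hypothetical: beta_draws holds VB draws as (sample, predictor, outcome)
beta_hat <- apply(beta_draws, c(2, 3), mean)  # average over the sample dimension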
Value
A matrix of posterior means for the beta parameter of the LNM model, with one row per predictor and one column per outcome.
Examples
# simulate covariates X and compositional counts y
example_data <- lnm_data(N = 50, K = 10)
xy <- dplyr::bind_cols(example_data[c("X", "y")])
# fit the logistic-normal multinomial model by variational inference
fit <- lnm(
  starts_with("y") ~ starts_with("x"), xy,
  iter = 25, output_samples = 25
)
#> Chain 1: ------------------------------------------------------------
#> Chain 1: EXPERIMENTAL ALGORITHM:
#> Chain 1: This procedure has not been thoroughly tested and may be unstable
#> Chain 1: or buggy. The interface is subject to change.
#> Chain 1: ------------------------------------------------------------
#> Chain 1:
#> Chain 1:
#> Chain 1:
#> Chain 1: Gradient evaluation took 0.001872 seconds
#> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 18.72 seconds.
#> Chain 1: Adjust your expectations accordingly!
#> Chain 1:
#> Chain 1:
#> Chain 1: Begin eta adaptation.
#> Chain 1: Iteration: 1 / 250 [ 0%] (Adaptation)
#> Chain 1: Iteration: 50 / 250 [ 20%] (Adaptation)
#> Chain 1: Iteration: 100 / 250 [ 40%] (Adaptation)
#> Chain 1: Iteration: 150 / 250 [ 60%] (Adaptation)
#> Chain 1: Iteration: 200 / 250 [ 80%] (Adaptation)
#> Chain 1: Success! Found best value [eta = 1] earlier than expected.
#> Chain 1:
#> Chain 1: Begin stochastic gradient ascent.
#> Chain 1: iter ELBO delta_ELBO_mean delta_ELBO_med notes
#> Chain 1: Informational Message: The maximum number of iterations is reached! The algorithm may not have converged.
#> Chain 1: This variational approximation is not guaranteed to be meaningful.
#> Chain 1:
#> Chain 1: Drawing a sample of size 25 from the approximate posterior...
#> Chain 1: COMPLETED.
#> Warning: Pareto k diagnostic value is Inf. Resampling is disabled. Decreasing tol_rel_obj may help if variational algorithm has terminated prematurely. Otherwise consider using sampling instead.
beta_mean(fit)
#> y1 y2 y3 y4 y5 y6
#> x1 0.46427806 0.8048733 0.35463917 -2.4947330 -2.76623684 0.86242280
#> x2 0.40309784 -0.4443320 0.79299874 -0.7888050 -0.51400673 0.72779960
#> x3 1.13139520 0.3853843 1.11322032 -1.1067753 -1.19646832 -0.03241249
#> x4 -0.02049565 0.4435732 0.85233811 -0.8570138 1.02546696 0.30976718
#> x5 0.66296042 -0.1527616 0.07108312 -1.4182268 -0.09982087 -0.06828306
#> y7 y8 y9
#> x1 -0.3939122 0.7462772 -0.51278648
#> x2 -0.9554299 -0.2357660 0.44033118
#> x3 -0.5431590 0.9078626 0.61705932
#> x4 -0.9072503 1.2920316 0.03291639
#> x5 0.5093146 -0.4655007 0.48531757
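As the description notes, these averaged coefficients are what predict uses internally, so a natural follow-up (assuming predict with no new data returns predicted compositions for the training samples; output not shown):

# predicted compositions are computed from the averaged beta
p_hat <- predict(fit)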