Return multiple samples of the beta parameter from the variational Bayes (VB) approximate posterior. These draws are used to simulate new compositions when calling sample on an lnm model.
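As a minimal illustrative sketch of the idea (not the package's internal sampler): given a design matrix X whose rows line up with the predictor rows of a beta draw, a softmax-style transform turns the linear predictor into composition probabilities. The softmax and simulate_composition helpers below are hypothetical and shown for illustration only.

softmax <- function(z) exp(z) / sum(exp(z))
# Hypothetical helper: map one beta draw to per-observation composition probabilities.
simulate_composition <- function(X, beta) {
  eta <- X %*% beta                # linear predictor: observations x outcomes
  t(apply(eta, 1, softmax))        # normalize each row into a composition
}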
Value
A list of length size. Each element is a matrix whose rows are predictors and columns are outcomes in the beta parameter for the LNM model.
Examples
example_data <- lnm_data(N = 50, K = 10)
xy <- dplyr::bind_cols(example_data[c("X", "y")])
fit <- lnm(
  starts_with("y") ~ starts_with("x"), xy,
  iter = 25, output_samples = 25
)
#> Chain 1: ------------------------------------------------------------
#> Chain 1: EXPERIMENTAL ALGORITHM:
#> Chain 1: This procedure has not been thoroughly tested and may be unstable
#> Chain 1: or buggy. The interface is subject to change.
#> Chain 1: ------------------------------------------------------------
#> Chain 1:
#> Chain 1:
#> Chain 1:
#> Chain 1: Gradient evaluation took 0.001691 seconds
#> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 16.91 seconds.
#> Chain 1: Adjust your expectations accordingly!
#> Chain 1:
#> Chain 1:
#> Chain 1: Begin eta adaptation.
#> Chain 1: Iteration: 1 / 250 [ 0%] (Adaptation)
#> Chain 1: Iteration: 50 / 250 [ 20%] (Adaptation)
#> Chain 1: Iteration: 100 / 250 [ 40%] (Adaptation)
#> Chain 1: Iteration: 150 / 250 [ 60%] (Adaptation)
#> Chain 1: Iteration: 200 / 250 [ 80%] (Adaptation)
#> Chain 1: Success! Found best value [eta = 1] earlier than expected.
#> Chain 1:
#> Chain 1: Begin stochastic gradient ascent.
#> Chain 1: iter ELBO delta_ELBO_mean delta_ELBO_med notes
#> Chain 1: Informational Message: The maximum number of iterations is reached! The algorithm may not have converged.
#> Chain 1: This variational approximation is not guaranteed to be meaningful.
#> Chain 1:
#> Chain 1: Drawing a sample of size 25 from the approximate posterior...
#> Chain 1: COMPLETED.
#> Warning: Pareto k diagnostic value is Inf. Resampling is disabled. Decreasing tol_rel_obj may help if variational algorithm has terminated prematurely. Otherwise consider using sampling instead.
beta_samples(fit, size = 2)
#> [[1]]
#>           y1        y2       y3         y4        y5       y6       y7
#> x1  0.333223  9.748480 0.585406 -0.9210430 -2.962710 -1.67452  1.09834
#> x2  1.143260  2.634330 4.213530  0.0818555 -0.818609 -1.14632  1.22016
#> x3 -0.650406  0.939593 0.483454  0.9727800 -2.109930  0.96731 -0.27683
#> x4 -0.390750 11.526400 0.437361  1.6014400 -2.990990  2.84794  1.19075
#> x5  0.838501  1.667520 1.096590  0.4920220 -0.691003  1.06200  1.85243
#>           y8        y9
#> x1 -1.694320 -2.070410
#> x2  0.240263 -0.623207
#> x3 -1.010890 -0.308038
#> x4  0.442549 -3.850320
#> x5  1.920890 -0.633965
#>
#> [[2]]
#>            y1        y2        y3        y4        y5         y6        y7
#> x1  0.5252300  0.480730  0.519561  1.166470 -2.870600 -0.2989320 -0.913715
#> x2  1.0260400  2.646570 -2.558960 -0.542442 -0.968081 -1.3131000  1.187580
#> x3 -0.0602829  0.980218  0.781678  0.960806 -1.617120  1.0084000 -0.931595
#> x4 -0.3112050 18.009900  1.171180  0.549871 -2.947240 -0.0933713 -1.090320
#> x5  0.0533194  1.728950  0.794200  0.297997 -1.187120  0.5061130  2.001490
#>           y8        y9
#> x1 -1.962290 -2.064760
#> x2  0.201145 -0.704438
#> x3 -1.041820 -0.313579
#> x4 -0.089889 -2.504530
#> x5  0.396430 -0.603307
#>
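As a follow-up sketch (assuming the fit object from the example above): because the result is a plain list of matrices, base R is enough to reduce the draws to an element-wise posterior-mean point estimate of beta.

draws <- beta_samples(fit, size = 25)
beta_hat <- Reduce(`+`, draws) / length(draws)  # element-wise average of the draws
dim(beta_hat)                                   # predictors x outcomes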