tl;dr: S-VBMC improves posterior inference via VBMC by combining multiple independent runs into a single optimized posterior. It requires no additional model evaluations and no communication between runs: just run VBMC in parallel with different initializations, then stack the results at the end. Start here for a quick usage demo.
Stacking Variational Bayesian Monte Carlo (S-VBMC)[1] is a fast post-processing step for Variational Bayesian Monte Carlo (VBMC). VBMC is an approximate Bayesian inference technique that produces a variational posterior in the form of a Gaussian mixture (see the relevant papers [2-4] for more details). S-VBMC improves upon this by combining ("stacking") the Gaussian mixture components from several independent VBMC runs into a single, larger mixture, which we call the "stacked posterior". It then re-optimizes the weights of this combined mixture to maximize the combined Evidence Lower BOund (ELBO, a lower bound on the log model evidence).
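Schematically (in our notation, which need not match the paper's), if run r = 1, ..., R returns a Gaussian mixture with components q_{r,k}, k = 1, ..., K_r, the stacked posterior is

q_{\mathrm{stacked}}(\theta) \;=\; \sum_{r=1}^{R} \sum_{k=1}^{K_r} w_{r,k}\, q_{r,k}(\theta), \qquad w_{r,k} \ge 0, \quad \sum_{r,k} w_{r,k} = 1,

where the component means and covariances are kept fixed and only the weights w_{r,k} are re-optimized to maximize the ELBO of the stacked mixture.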
A key advantage of S-VBMC is its efficiency: the original model is never re-evaluated, making it an inexpensive way to boost inference performance. Furthermore, no communication is needed among VBMC runs, making it possible to run them in parallel before applying S-VBMC as a post-processing step with negligible computational overhead.
Refer to the S-VBMC paper for further details [1].
S-VBMC works as a post-processing step for VBMC, so it shares its use cases (described here).
Performing several VBMC inference runs with different initialization points is already recommended for robustness and convergence diagnostics; therefore, S-VBMC naturally fits into VBMC's best practices. Because S-VBMC is inexpensive and effective, we recommend using it whenever you first perform inference with VBMC. It is especially useful when separate VBMC runs yield noticeably different variational posteriors, which might happen when the target distribution has a particularly complex shape (see this notebook for two examples of this).
Create a new environment in conda and activate it:
conda create -n svbmc-env python=3.11
conda activate svbmc-env
Install svbmc with pip:
pip install svbmc
You should have already run VBMC multiple times on the same problem and saved the resulting VariationalPosterior objects as .pkl files. Refer to these notebooks for VBMC usage examples.
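If you still need to produce these files, here is a minimal sketch using PyVBMC [4]; the toy 2-D Gaussian target, bounds, number of runs, and file names below are illustrative assumptions, not part of S-VBMC:

import os
import pickle
import numpy as np
from pyvbmc import VBMC

# Toy target: log-density of a standard 2-D Gaussian (replace with your model).
def log_density(theta):
    return -0.5 * np.sum(np.asarray(theta) ** 2)

D = 2
lb, ub = np.full((1, D), -10.0), np.full((1, D), 10.0)    # hard bounds
plb, pub = np.full((1, D), -2.0), np.full((1, D), 2.0)    # plausible bounds

os.makedirs("vbmc_runs", exist_ok=True)
for i in range(4):
    x0 = np.random.uniform(plb, pub)                      # different starting point per run
    vbmc = VBMC(log_density, x0, lb, ub, plb, pub)
    vp, results = vbmc.optimize()
    with open(f"vbmc_runs/run_{i}.pkl", "wb") as f:
        pickle.dump(vp, f)

Since the runs are independent, they can also be launched in parallel (e.g., as separate jobs) before stacking.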
First, load these objects into a single list. For example, if you have your files in a folder named vbmc_runs/:
import pickle
import glob
vp_files = glob.glob("vbmc_runs/*.pkl")
vp_list = []
for file in vp_files:
    with open(file, "rb") as f:
        vp_list.append(pickle.load(f))
Next, initialize the SVBMC object with this list and run the optimization:
from svbmc.svbmc import SVBMC
# Initialize the SVBMC object and optimize the weights
vp_stacked = SVBMC(vp_list=vp_list)
vp_stacked.optimize()
# The SVBMC object now contains the optimized weights and ELBO estimates
print(f"Stacked ELBO: {vp_stacked.elbo['estimated']}")
Note: For compatibility with VBMC, this implementation of S-VBMC stores results in NumPy arrays. However, it uses PyTorch under the hood to run the ELBO optimization.
We include two detailed walkthroughs:
- Basic Usage: This notebook shows how to run S-VBMC, with an optional guide on how to run VBMC multiple times.
- Noisy Log-Density Evaluations: This notebook addresses scenarios where the target log-density evaluations are noisy and includes a discussion of ELBO debiasing.
If you want to compute estimates from (or visualize) the final stacked posterior, you can draw samples from it using .sample():
# Draw 10,000 samples from the final, stacked posterior
samples = vp_stacked.sample(n_samples=10000)
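From these samples you can then compute standard posterior summaries with NumPy. A minimal sketch, assuming samples is an (n_samples, D) array as returned above:

import numpy as np

# Posterior mean and central 95% credible interval per parameter,
# estimated from the stacked-posterior samples.
post_mean = samples.mean(axis=0)
ci_lower, ci_upper = np.percentile(samples, [2.5, 97.5], axis=0)
print("Posterior means:", post_mean)
print("95% credible intervals:", list(zip(ci_lower, ci_upper)))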
You can also extract the ELBO estimates for model comparison (see here; and here if your log-density evaluations are noisy).
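For example, a minimal sketch of such a comparison, assuming you have built and optimized one SVBMC object per candidate model (vp_stacked_m1 and vp_stacked_m2 are hypothetical names):

# A higher ELBO means a higher lower bound on the log model evidence.
elbo_m1 = vp_stacked_m1.elbo["estimated"]
elbo_m2 = vp_stacked_m2.elbo["estimated"]
# Their difference can serve as a rough approximation to the log Bayes factor.
print(f"ELBO difference (M1 - M2): {elbo_m1 - elbo_m2:.2f}")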
Do not "open the box" to get S-VBMC stacked posterior's individual components' means and covariance matrices. This is because each VBMC run may use different internal parameter transformations. Consequently, the component means and covariance matrices from different VBMC posteriors exist in incompatible parameter spaces. Combining them creates a mixture whose individual Gaussian components are not directly meaningful. Always use samples from the final stacked posterior, which are correctly transformed back into the original parameter space. These are available via the .sample()
method.
- Silvestrin, F., Li, C., & Acerbi, L. (2025). Stacking Variational Bayesian Monte Carlo. arXiv preprint arXiv:2504.05004. (paper on arXiv)
- Acerbi, L. (2018). Variational Bayesian Monte Carlo. In Advances in Neural Information Processing Systems 31: 8222-8232. (paper + supplement on arXiv, NeurIPS Proceedings)
- Acerbi, L. (2020). Variational Bayesian Monte Carlo with Noisy Likelihoods. In Advances in Neural Information Processing Systems 33: 8211-8222 (paper + supplement on arXiv, NeurIPS Proceedings).
- Huggins, B., Li, C., Tobaben, M., Aarnos, M., & Acerbi, L. (2023). PyVBMC: Efficient Bayesian inference in Python. Journal of Open Source Software 8(86), 5428, https://doi.org/10.21105/joss.05428.
Please cite the four references above if you use S-VBMC in your work.
- Acerbi, L. (2019). An Exploration of Acquisition and Mean Functions in Variational Bayesian Monte Carlo. In Proc. Machine Learning Research 96: 1-10. 1st Symposium on Advances in Approximate Bayesian Inference, Montréal, Canada. (paper in PMLR)
@article{silvestrin2025stacking,
title={{S}tacking {V}ariational {B}ayesian {M}onte Carlo},
author={Silvestrin, Francesco and Li, Chengkun and Acerbi, Luigi},
journal={arXiv preprint arXiv:2504.05004},
year={2025}
}
@article{acerbi2018variational,
title={{V}ariational {B}ayesian {M}onte {C}arlo},
author={Acerbi, Luigi},
journal={Advances in Neural Information Processing Systems},
volume={31},
pages={8222--8232},
year={2018}
}
@article{acerbi2020variational,
title={{V}ariational {B}ayesian {M}onte {C}arlo with noisy likelihoods},
author={Acerbi, Luigi},
journal={Advances in Neural Information Processing Systems},
volume={33},
pages={8211--8222},
year={2020}
}
@article{huggins2023pyvbmc,
title = {PyVBMC: Efficient Bayesian inference in Python},
author = {Bobby Huggins and Chengkun Li and Marlon Tobaben and Mikko J. Aarnos and Luigi Acerbi},
publisher = {The Open Journal},
journal = {Journal of Open Source Software},
url = {https://doi.org/10.21105/joss.05428},
doi = {10.21105/joss.05428},
year = {2023},
volume = {8},
number = {86},
pages = {5428}
}
@article{acerbi2019exploration,
title={An Exploration of Acquisition and Mean Functions in {V}ariational {B}ayesian {M}onte {C}arlo},
author={Acerbi, Luigi},
journal={Proceedings of Machine Learning Research},
volume={96},
pages={1--10},
year={2019}
}
S-VBMC is released under the terms of the BSD 3-Clause License.
S-VBMC was developed by members of the Machine and Human Intelligence Lab at the University of Helsinki. This work was supported by the Research Council of Finland (grants 358980 and 356498).