Abstract
We present a fast update rule for variational block-sparse Bayesian learning (SBL) methods. Using a variational Bayesian framework, we show how repeated updates of the probability density functions (PDFs) of the prior variances and weights can be expressed as a nonlinear first-order recurrence from one estimate of the parameters of the proxy PDFs to the next. Specifically, the recurrence relation turns out to be a strictly increasing rational function for many commonly used prior PDFs of the variances, such as the Jeffreys prior. Hence, the fixed points of this recurrence can be obtained by solving for the roots of a polynomial, which allows convergence or divergence of each individual prior variance to be checked in a single step. Thereby, the computational complexity of the variational block-SBL algorithm is reduced and the convergence speed is improved by two orders of magnitude in our simulations. Furthermore, the solution provides insights into the sparsity of the estimators obtained by choosing different priors.
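The core idea, finding the fixed points of a rational recurrence by root-finding rather than iterating to convergence, can be sketched as follows. This is an illustrative toy example with a made-up rational update, not the paper's actual variance update rule:

```python
import numpy as np

# Hypothetical rational update g(x) = p(x)/q(x) = (2x + 1) / (x + 2).
# A fixed point x* of x_{t+1} = g(x_t) satisfies p(x) - x*q(x) = 0,
# so all fixed points are roots of a polynomial and can be found in
# one step instead of running the recurrence many times.
p = np.array([2.0, 1.0])   # numerator coefficients, highest degree first
q = np.array([1.0, 2.0])   # denominator coefficients

def fixed_points(p, q):
    """Candidate fixed points of g: roots of p(x) - x*q(x)."""
    xq = np.polymul(q, [1.0, 0.0])   # x * q(x)
    return np.roots(np.polysub(p, xq))

def is_attracting(x, p, q, tol=1e-9):
    """A fixed point attracts the iteration iff |g'(x)| < 1 (quotient rule)."""
    dp, dq = np.polyder(p), np.polyder(q)
    qx = np.polyval(q, x)
    gprime = (np.polyval(dp, x) * qx
              - np.polyval(p, x) * np.polyval(dq, x)) / qx**2
    return abs(gprime) < 1 - tol

roots = fixed_points(p, q)   # roots of -x^2 + 1 = 0, i.e. x = +1 and x = -1
stable = [x for x in roots if is_attracting(x, p, q)]   # only x = +1 attracts
```

In the same spirit, checking whether a variance iterate converges (to a stable fixed point) or diverges (e.g. toward zero, pruning that block) reduces to inspecting the polynomial's roots once, rather than monitoring many update sweeps.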
Original language: English
Journal: IEEE Transactions on Signal Processing
Publication status: Submitted, 1 Jun 2023
Fingerprint
Dive into the research topics of 'Fast Variational Block-Sparse Bayesian Learning'. Together they form a unique fingerprint.
CD-Laboratory for Location-aware Electronic Systems
Witrisal, K., Grebien, S. J., Fuchs, A., Wilding, T., Venus, A. & Wielandner, L.
1/01/18 → 31/12/25
Project: Research project

SEAMAL Front – Securely Applied Machine Learning
Witrisal, K., Bischof, H., Schreiber, H. & Freiberger, G.
1/10/20 → 30/09/23
Project: Research project