Abstract
We present a fast update rule for variational block-sparse Bayesian learning (SBL) methods. Using a variational Bayesian framework, we show how repeated updates of the probability density functions (PDFs) of the prior variances and weights can be expressed as a nonlinear first-order recurrence from one estimate of the parameters of the proxy PDFs to the next. Specifically, the recurrence turns out to be a strictly increasing rational function for many commonly used prior PDFs of the variances, such as the Jeffreys prior. Hence, the fixed points of this recurrence can be obtained by solving for the roots of a polynomial. This scheme allows the convergence or divergence of each individual prior variance to be checked in a single step. Thereby, the computational complexity of the variational block-SBL algorithm is reduced, and the convergence speed is improved by two orders of magnitude in our simulations. Furthermore, the solution offers insight into the sparsity of the estimators obtained by choosing different priors.
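The fixed-point idea can be illustrated with a small sketch. The rational update map `g` and its coefficients below are hypothetical stand-ins (the actual map in the paper depends on the measurement model and the chosen variance prior): instead of iterating the recurrence until convergence, its fixed points are read off as roots of a polynomial, and the location of the starting value relative to the unstable root reveals in one step whether a prior variance shrinks to zero (component pruned) or grows.

```python
import numpy as np

# Hypothetical rational update map for a single prior variance gamma:
#   gamma_{t+1} = g(gamma_t) = (a*gamma^2 + b*gamma) / (gamma + c)
# The coefficients are illustrative stand-ins, not the paper's actual map.
a, b, c = 1.1, 0.5, 1.0

def g(gamma):
    return (a * gamma**2 + b * gamma) / (gamma + c)

# Fixed points solve g(gamma) = gamma, i.e. the polynomial
#   a*gamma^2 + b*gamma - gamma*(gamma + c) = 0
#   -> (a - 1)*gamma^2 + (b - c)*gamma = 0
roots = np.roots([a - 1.0, b - c, 0.0])
fixed_points = np.sort(roots[roots >= 0])  # variances are non-negative

# Cross-check by brute-force iteration: starting below the unstable
# positive root, the recurrence contracts toward gamma = 0, which
# corresponds to pruning the associated component.
gamma = 1.0
for _ in range(200):
    gamma = g(gamma)
```

With these stand-in coefficients the fixed points are 0 and 5; iterating from any start below 5 converges to 0, while starts above 5 diverge, so a single comparison against the polynomial root replaces hundreds of update iterations.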
Original language | English |
---|---|
Journal | IEEE Transactions on Signal Processing |
Publication status | Submitted - 1 June 2023 |
Fingerprint
Explore the research topics of "Fast Variational Block-Sparse Bayesian Learning". Together they form a unique fingerprint.
Projects per year

- CD-Labor für Ortssensitive Elektronische Systeme
  Witrisal, K., Grebien, S. J., Fuchs, A., Wilding, T., Venus, A. & Wielandner, L.
  1/01/18 → 31/12/24
  Project: Research project
- SEAMAL Front - Sicher angewandtes maschinelles Lernen
  Witrisal, K., Bischof, H., Schreiber, H. & Freiberger, G.
  1/10/20 → 30/09/23
  Project: Research project