Hartung-Gorre Verlag

Proprietor: Dr. Renate Gorre

D-78465 Konstanz

Phone: +49 (0)7533 97227

Fax: +49 (0)7533 97228

www.hartung-gorre.de


Series in Signal and Information Processing, Vol. 36
edited by Hans-Andrea Loeliger


Raphael Urs Keusch

 

Composite NUV Priors

and Applications

 

1st Edition 2022. XXVI, 248 pages. € 64,00.
ISBN 978-3-86628-768-6


Abstract

 

Normal with unknown variance (NUV) priors are a central idea of sparse
Bayesian learning and allow variational representations of non-Gaussian
priors. More specifically, such variational representations can be seen
as parameterized Gaussians, wherein the parameters are generally unknown.
The advantage is apparent: for fixed parameters, NUV priors are
Gaussian, and hence computationally compatible with Gaussian models.
Moreover, working with (linear-)Gaussian models is particularly attractive
since the Gaussian distribution is closed under affine transformations,
marginalization, and conditioning. Interestingly, the variational
representation proves to be universal rather than restrictive: many common
sparsity-promoting priors (among them, in particular, the Laplace
prior) can be represented in this manner.
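
As a concrete illustration of this variational idea (a generic sketch,
not taken from the book): the Laplace penalty satisfies
|x| = min_{s>0} (x^2/(2s) + s/2), with the minimum attained at s = |x|.
Alternating between this closed-form variance update and the resulting
purely Gaussian (ridge-type) estimate of x is the essence of
NUV/IRLS-type algorithms. The Python snippet below, with hypothetical
names, demonstrates the mechanism on a small sparse regression problem.

    # Illustrative sketch only: NUV/IRLS view of the Laplace (L1) prior,
    # using |x_i| = min_{s_i>0} ( x_i^2/(2 s_i) + s_i/2 ), minimized at s_i = |x_i|.
    import numpy as np

    def nuv_irls_lasso(A, y, lam, sigma2=1.0, num_iter=100, eps=1e-9):
        """Minimize ||y - A x||^2/(2 sigma2) + lam * sum_i |x_i| by alternating
        the closed-form NUV variance update with a Gaussian (ridge) solve.
        Hypothetical helper, not the algorithm from the book."""
        x = np.linalg.lstsq(A, y, rcond=None)[0]       # least-squares initialization
        for _ in range(num_iter):
            s = np.abs(x) + eps                        # closed-form variance update
            H = A.T @ A / sigma2 + lam * np.diag(1.0 / s)   # purely Gaussian step
            x = np.linalg.solve(H, A.T @ y / sigma2)
        return x

    # Tiny usage example with synthetic data:
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    x_true = np.zeros(20)
    x_true[[2, 7]] = [1.5, -2.0]
    y = A @ x_true + 0.05 * rng.standard_normal(50)
    x_hat = nuv_irls_lasso(A, y, lam=1.0)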

 

In estimation problems, parameters or variables of the underlying model
are often subject to constraints (e.g., discrete-level constraints). Such
constraints cannot adequately be represented by linear-Gaussian models
and generally require special treatment. To handle such constraints
within a linear-Gaussian setting, we extend the idea of NUV priors beyond
its original use for sparsity. In particular, we study compositions
of existing NUV priors, referred to as composite NUV priors, and show
that many commonly used model constraints can be represented in this
way.

 

In Part I, we derive composite NUV representations of discretizing constraints,
which enforce a model variable to take on values in a finite
set (e.g., binary: {0, 1}, or M-ary: {0, 1, ..., M−1}). Furthermore, we
derive composite NUV representations of linear inequality constraints,
which enforce a model variable to be lower-bounded, upper-bounded,
or both. In addition, we derive a composite NUV representation of an
exclusion constraint, which enforces a model variable to stay outside of
an exclusion region.
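
The exact composite constructions are derived in the book; purely to
illustrate the mechanism, the toy sketch below assumes that a binarizing
prior can be composed of two NUV factors N(x; a, s_a) N(x; b, s_b)
anchored at the two admissible levels, with both unknown variances
updated by alternating maximization. All names and update rules here are
illustrative assumptions, not the book's derivation.

    # Illustrative sketch only (assumed form, not necessarily the book's exact
    # construction): a composite "binarizing" prior built from two NUV factors
    # N(x; a, s_a) * N(x; b, s_b) with unknown variances s_a, s_b.
    import numpy as np

    def binarize_scalar(y, sigma2, a=0.0, b=1.0, num_iter=100, eps=1e-12):
        """Estimate x from a Gaussian observation y ~ N(x, sigma2) under the
        composite prior, by alternating maximization: plug the current x into
        the variance updates, then re-solve the purely Gaussian problem."""
        x = y
        for _ in range(num_iter):
            s_a = (x - a) ** 2 + eps       # variance update for the factor at a
            s_b = (x - b) ** 2 + eps       # variance update for the factor at b
            # For fixed s_a, s_b everything is Gaussian; precision-weighted mean:
            w = 1.0 / sigma2 + 1.0 / s_a + 1.0 / s_b
            x = (y / sigma2 + a / s_a + b / s_b) / w
        return x

    # The estimate tends to snap to the nearer of the two levels:
    print(binarize_scalar(0.4, sigma2=0.1))   # close to 0.0
    print(binarize_scalar(0.7, sigma2=0.1))   # close to 1.0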

 

In Part II, we review the standard linear state space representation to
model physical systems. Linear state space models (LSSMs) are defined
by only a few parameters, offer flexible modeling capabilities, and pave
the way for efficient algorithms thanks to their linearity and recursive
structure. Kalman-type algorithms are commonly used to perform inference
in Gaussian LSSMs. We will use a Gaussian message passing scheme
based on factor graphs, which offers several improvements and can be seen
as a generalization of the standard Kalman filter/smoother. In particular,
we will apply the modified Bryson-Frazier (MBF) smoother (augmented
with input estimation), which is numerically stable and avoids
matrix inversions.
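
The MBF message passing recursions themselves are beyond the scope of
this summary. As a deliberately simpler stand-in (a textbook Kalman
filter followed by a Rauch-Tung-Striebel smoother, which does use small
matrix inversions, unlike the MBF variant), the following sketch shows
the forward/backward structure on which such smoothers operate, for an
LSSM x_{k+1} = A x_k + w_k, y_k = C x_k + v_k.

    # Illustrative stand-in only: plain Kalman filter + RTS smoother.
    # The book instead uses MBF-type Gaussian message passing, which
    # avoids these matrix inversions.
    import numpy as np

    def kalman_rts(y, A, C, Q, R, m0, V0):
        n, d = len(y), len(m0)
        m_f = np.zeros((n, d)); V_f = np.zeros((n, d, d))   # filtered estimates
        m_p = np.zeros((n, d)); V_p = np.zeros((n, d, d))   # one-step predictions
        m, V = m0, V0
        for k in range(n):                                   # forward pass
            m_p[k], V_p[k] = m, V
            S = C @ V @ C.T + R                              # innovation covariance
            K = V @ C.T @ np.linalg.inv(S)                   # Kalman gain
            m = m + K @ (y[k] - C @ m)
            V = V - K @ C @ V
            m_f[k], V_f[k] = m, V
            m, V = A @ m, A @ V @ A.T + Q                    # predict next state
        m_s, V_s = m_f.copy(), V_f.copy()
        for k in range(n - 2, -1, -1):                       # backward (RTS) pass
            J = V_f[k] @ A.T @ np.linalg.inv(V_p[k + 1])
            m_s[k] = m_f[k] + J @ (m_s[k + 1] - m_p[k + 1])
            V_s[k] = V_f[k] + J @ (V_s[k + 1] - V_p[k + 1]) @ J.T
        return m_s, V_s

    # Example: scalar random-walk model with noisy observations.
    rng = np.random.default_rng(1)
    truth = np.cumsum(0.1 * rng.standard_normal(100))
    y = (truth + 0.5 * rng.standard_normal(100))[:, None]
    A = np.array([[1.0]]); C = np.array([[1.0]])
    Q = np.array([[0.01]]); R = np.array([[0.25]])
    m_s, V_s = kalman_rts(y, A, C, Q, R, m0=np.zeros(1), V0=np.eye(1))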

 

The expressive power of composite NUV priors and their computational
compatibility with Gaussian models allow us to reformulate a variety of
(constrained) optimization problems as statistical estimation problems
in a linear-Gaussian model with unknown parameters. We propose an
efficient iterative algorithm based on Gaussian message passing with
closed-form update rules for the unknown parameters. A key asset of the
algorithm is its per-iteration computational complexity, which is linear
in the length of the time horizon. Consequently, the method can
efficiently handle long time horizons, which are generally the bottleneck
of other algorithms.
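
To make the overall alternation concrete (again only a generic sketch,
not the algorithm from the book): the snippet below denoises a
piecewise-constant signal by placing Laplace-type NUV priors on the
increments x_{k+1} - x_k. Each iteration performs a closed-form update of
all unknown variances followed by a Gaussian estimate of the signal; here
the Gaussian step is a plain linear solve for brevity, whereas an
MBF-type forward/backward sweep would perform it in time linear in the
horizon length.

    # Illustrative sketch only: NUV priors on signal increments, solved by
    # alternating closed-form variance updates with a Gaussian solve.
    import numpy as np

    def nuv_piecewise_constant(y, sigma2=0.05, lam=2.0, num_iter=50, eps=1e-9):
        n = len(y)
        D = np.diff(np.eye(n), axis=0)            # (n-1) x n difference matrix
        x = y.copy()
        for _ in range(num_iter):
            s = np.abs(D @ x) + eps               # closed-form NUV variance update
            # For fixed s the model is linear-Gaussian; here a dense solve,
            # in practice a forward/backward sweep with O(n) cost:
            H = np.eye(n) / sigma2 + lam * D.T @ np.diag(1.0 / s) @ D
            x = np.linalg.solve(H, y / sigma2)
        return x

    # Usage on a noisy piecewise-constant signal:
    rng = np.random.default_rng(2)
    truth = np.concatenate([np.zeros(40), np.ones(40), -0.5 * np.ones(40)])
    x_hat = nuv_piecewise_constant(truth + 0.1 * rng.standard_normal(120))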

 

Finally, in Parts III and IV, we demonstrate the applicability of the
proposed method on pertinent problems from signal processing and
constrained control. We consider problems including digital-to-analog
conversion, discrete-phase beamforming, trajectory planning, obstacle
avoidance, power converter control, and more. The results are promising
and suggest that the proposed method is a versatile toolbox for handling
various challenging practical applications.

 

Keywords: Normal with unknown variance (NUV); sparse Bayesian learning;
composite NUV priors; Gaussian message passing;
iteratively reweighted least squares (IRLS); constrained optimization; control as inference.

 

About the author:

 

Raphael Keusch was born in Muri (AG), Switzerland, in 1989 and grew
up in Buttwil (AG), Switzerland. He received his diploma as an electronics
technician from Roche Diagnostics Ltd., Rotkreuz, Switzerland,
in 2009. Subsequently, he enrolled in the electrical engineering and information
technology program at ETH Zurich, Switzerland, from which he
received his BSc and MSc degrees in 2014 and 2016, respectively. During
his master's degree, he spent a semester as an exchange student at the
KTH Royal Institute of Technology, Stockholm, Sweden. After graduation,
he worked as a signal processing engineer for Sensirion AG, Stäfa,
Switzerland. Since 2018, he has been a PhD candidate and a full research
assistant at the Signal and Information Processing Laboratory (ISI) at
ETH Zurich under the supervision of Prof. Hans-Andrea Loeliger. His
research interests include statistical signal processing, control, machine
learning and electronics.

 

Series "Series in Signal and Information Processing" published by Hartung-Gorre Verlag

To order directly from

Hartung-Gorre Verlag / D-78465 Konstanz / Germany

Phone: +49 (0) 7533 97227  Fax: +49 (0) 7533 97228
http://www.hartung-gorre.de   Email: verlag@hartung-gorre.de