Theory#

The Canary library builds on the integration of state-space models (SSMs) with Bayesian recurrent neural networks. This section collects resources for understanding the theory and methods behind the library.

Background#

Both SSMs and Bayesian neural networks rely extensively on linear algebra and probability theory.

State-Space Models#

State-space models (SSMs), and more specifically the linear-Gaussian SSMs used by Canary, model dynamic systems in a recurrent manner using a vector of hidden (i.e., not directly observed) states. A first set of linear equations, the transition model, governs the evolution of the hidden state between two consecutive time steps. A second set of linear equations, the observation model, relates the observations to the same hidden state vector.
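
In generic notation (the symbols below are illustrative and not necessarily those used elsewhere in Canary's documentation), a linear-Gaussian SSM takes the form

$$
\begin{aligned}
\mathbf{x}_t &= \mathbf{A}\,\mathbf{x}_{t-1} + \mathbf{w}_t, &\quad \mathbf{w}_t &\sim \mathcal{N}(\mathbf{0}, \mathbf{Q}),\\
\mathbf{y}_t &= \mathbf{C}\,\mathbf{x}_t + \mathbf{v}_t, &\quad \mathbf{v}_t &\sim \mathcal{N}(\mathbf{0}, \mathbf{R}),
\end{aligned}
$$

where $\mathbf{x}_t$ is the hidden state vector, $\mathbf{y}_t$ the observation, $\mathbf{A}$ the transition matrix, $\mathbf{C}$ the observation matrix, and $\mathbf{Q}$ and $\mathbf{R}$ the process- and observation-noise covariances.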

For linear-Gaussian SSMs, all calculations remain analytically tractable: Bayesian inference and marginalization can be performed exactly in closed form using the Kalman filter and the Rauch–Tung–Striebel (RTS) smoother.
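
As a concrete illustration, here is a minimal NumPy sketch of one Kalman filter prediction-update cycle for the generic model above. The function and variable names are illustrative and are not part of Canary's API.

```python
import numpy as np

def kalman_step(mu, Sigma, y, A, C, Q, R):
    """One Kalman filter cycle: predict the hidden state one step ahead,
    then update the prediction with the observation y."""
    # Prediction: propagate the Gaussian state estimate through the
    # linear transition model.
    mu_pred = A @ mu
    Sigma_pred = A @ Sigma @ A.T + Q
    # Update: condition the predicted state on the observation.
    S = C @ Sigma_pred @ C.T + R             # innovation covariance
    K = Sigma_pred @ C.T @ np.linalg.inv(S)  # Kalman gain
    mu_post = mu_pred + K @ (y - C @ mu_pred)
    Sigma_post = Sigma_pred - K @ S @ K.T
    return mu_post, Sigma_post

# Example: a local-level model (a random walk observed with noise).
A = np.array([[1.0]]); C = np.array([[1.0]])
Q = np.array([[0.1]]); R = np.array([[1.0]])
mu, Sigma = np.zeros(1), np.eye(1)
for y in (1.2, 0.9, 1.5):
    mu, Sigma = kalman_step(mu, Sigma, np.array([y]), A, C, Q, R)
```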

  • Kalman Filtering, Smoothing, and Regime Switching

  • Basic Components: Level, Trend, Acceleration, Periodic, Autoregressive, and White Noise

  • Advanced Components

TAGI & LSTM Neural Networks#

The TAGI (Tractable Approximate Gaussian Inference) method treats all the parameters and hidden units in a neural network as Gaussian random variables. Inference can therefore rely on the same closed-form Gaussian conditional equations as the linear-Gaussian SSM.
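
Both methods hinge on the closed-form conditional of a joint Gaussian. In generic notation, if $X$ and $Y$ are jointly Gaussian, conditioning on an observation $y$ gives

$$
\mu_{X|y} = \mu_X + \Sigma_{XY}\,\Sigma_{Y}^{-1}\,(y - \mu_Y), \qquad
\Sigma_{X|y} = \Sigma_X - \Sigma_{XY}\,\Sigma_{Y}^{-1}\,\Sigma_{YX},
$$

which is the same update applied by the Kalman filter, and which TAGI applies, layer by layer, to a network's parameters and hidden units.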

  • Tractable Approximate Gaussian Inference (TAGI) for Bayesian Neural Networks [Paper] [YouTube]

  • Coupling LSTM Neural Networks and SSM through Analytically Tractable Inference [Paper] [Thesis] [YouTube]