Theory#
The Canary library builds on the combination of state-space models (SSMs) and Bayesian neural networks with recurrent architectures. This section lists resources for understanding the theory and methods behind the library.
Background#
Both SSMs and Bayesian neural networks rely extensively on linear algebra and probability theory.
Probabilistic Machine Learning for Civil Engineers [Chapter 2 - Linear Algebra] [YouTube]
Probabilistic Machine Learning for Civil Engineers [Chapter 3 - Probability Theory] [YouTube]
Probabilistic Machine Learning for Civil Engineers [Chapter 4 - Distributions] [YouTube]
State-Space Models#
State-space models (SSMs), and more specifically the linear-Gaussian SSMs used by Canary, describe dynamic systems recursively through a vector of hidden (i.e., not directly observed) states. The evolution of the hidden states between two consecutive time steps is governed by a first set of linear equations, the transition model. A second set of linear equations, the observation model, relates the observations to the same hidden state vector.
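In generic textbook notation (the symbols below are standard and not necessarily those used in Canary's code), these two sets of equations take the form shown here, with Gaussian process noise and observation noise:

```latex
% Transition model: evolution of the hidden state vector between time steps
\mathbf{x}_t = \mathbf{A}\,\mathbf{x}_{t-1} + \mathbf{w}_t, \qquad \mathbf{w}_t \sim \mathcal{N}(\mathbf{0}, \mathbf{Q})

% Observation model: observations as a linear function of the hidden states
\mathbf{y}_t = \mathbf{C}\,\mathbf{x}_t + \mathbf{v}_t, \qquad \mathbf{v}_t \sim \mathcal{N}(\mathbf{0}, \mathbf{R})
```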
Because both sets of equations are linear and all noise terms are Gaussian, every quantity of interest remains analytically tractable. This enables exact, closed-form Bayesian inference and marginalization using the Kalman filter and the Rauch–Tung–Striebel (RTS) smoother.
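As a minimal illustration of these closed-form equations (a generic NumPy sketch, not Canary's internal implementation), the function below performs one Kalman-filter prediction and update step for a linear-Gaussian SSM; the function name and arguments are placeholders chosen for this example.

```python
import numpy as np

def kalman_step(mu, Sigma, y, A, Q, C, R):
    """One prediction + update step of the Kalman filter.

    mu, Sigma : posterior mean and covariance of the hidden state at t-1
    y         : observation at time t
    A, Q      : transition matrix and process-noise covariance
    C, R      : observation matrix and observation-noise covariance
    """
    # Prediction step: propagate the state estimate through the transition model.
    mu_pred = A @ mu
    Sigma_pred = A @ Sigma @ A.T + Q

    # Update step: condition on the observation y using the Gaussian
    # conditional equations (via the Kalman gain K).
    S = C @ Sigma_pred @ C.T + R                 # innovation covariance
    K = Sigma_pred @ C.T @ np.linalg.inv(S)      # Kalman gain
    innovation = y - C @ mu_pred
    mu_post = mu_pred + K @ innovation
    Sigma_post = Sigma_pred - K @ C @ Sigma_pred
    return mu_post, Sigma_post
```

Iterating such a step over a time series yields the filtered state estimates; the RTS smoother then performs a backward pass with similar closed-form equations to obtain the smoothed estimates.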
Kalman Filtering, Smoothing, and Regime Switching
Probabilistic Machine Learning for Civil Engineers [Chapter 12 - State-space Models] [YouTube]
Basic Components: Level, Trend, Acceleration, Periodic, Autoregressive, and White Noise
Probabilistic Machine Learning for Civil Engineers [Chapter 12 - State-space Models] [YouTube]
Advanced Components
TAGI & LSTM Neural Networks#
The Tractable Approximate Gaussian Inference (TAGI) method treats all parameters and hidden units of a neural network as Gaussian random variables. Inference therefore relies on the same closed-form Gaussian conditional equations as the linear-Gaussian SSM.
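The core operation is the same Gaussian conditioning used by the Kalman filter, applied to the network's weights and hidden units. The sketch below shows the generic one-dimensional conditional-Gaussian update on which such inference is built; the variable names are illustrative and this is not the TAGI library's API.

```python
def gaussian_conditional_update(mu_theta, var_theta, cov_theta_z, mu_z, var_z, z_obs):
    """Condition a Gaussian variable theta (e.g., a weight or hidden unit)
    on an observed quantity z with which it is jointly Gaussian.

    mu_theta, var_theta : prior mean and variance of theta
    cov_theta_z         : prior covariance between theta and z
    mu_z, var_z         : prior mean and variance of z
    z_obs               : observed value of z
    """
    gain = cov_theta_z / var_z                   # analogous to the Kalman gain
    mu_post = mu_theta + gain * (z_obs - mu_z)   # posterior mean
    var_post = var_theta - gain * cov_theta_z    # posterior variance
    return mu_post, var_post
```

Applied layer by layer and propagated backward through the network, updates of this form keep Gaussian approximations for both the weights and the hidden units, which is what allows the neural network to be combined with the linear-Gaussian SSM machinery above.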