-
Invariant Gaussian Processes
Gaussian processes can be understood as "distributions over functions", providing prior models for unknown functions. The kernel, which characterizes the GP, can be used to encode known properties of the function, such as smoothness or stationarity. A somewhat more exotic property is invariance to input transformations, which we explore here.
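One standard way to build such a kernel (a sketch, not necessarily the construction used in the post) is to sum a base kernel over a finite group of transformations; the resulting GP then has sample functions invariant to each transformation:

```python
import numpy as np

def rbf(x, y, lengthscale=1.0):
    # Squared-exponential base kernel.
    return np.exp(-np.sum((x - y) ** 2) / (2 * lengthscale ** 2))

def invariant_kernel(x, y, transforms):
    # Sum the base kernel over all pairs of transformations.
    # By construction k(x, y) = k(g(x), h(y)) for any g, h in the group,
    # so samples from the GP are invariant to each transform.
    return sum(rbf(g(x), h(y)) for g in transforms for h in transforms)

# Example: invariance to sign flips of the input.
transforms = [lambda v: v, lambda v: -v]
x = np.array([1.0])
k_xx = invariant_kernel(x, x, transforms)
k_xnegx = invariant_kernel(x, -x, transforms)  # equals k_xx by construction
```

The function names here (`rbf`, `invariant_kernel`) are illustrative; for a practical model the sum would be plugged into a GP library as a custom kernel.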
-
The Bias-Variance Decomposition
The expected squared error of an estimator can be decomposed into a bias and a variance term. In general, both cannot be minimized simultaneously: reducing one tends to increase the other. This is the bias-variance trade-off.
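For squared loss and observations $y = f(x) + \varepsilon$ with noise variance $\sigma^2$, the decomposition reads:

```latex
\mathbb{E}\big[(\hat{f}(x) - y)^2\big]
  = \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}\big[\big(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\big)^2\big]}_{\text{variance}}
  + \sigma^2,
```

where the expectation is over the training data (and noise), and $\sigma^2$ is the irreducible error.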
-
Dataset Shifts
Distinguishing data distributions is a central topic of statistics. There are several approaches to categorizing distribution shift, often with particular emphasis on differences between the training and test distributions. Here, I collect some thoughts on the issue.
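One common categorization (a sketch, not necessarily the one developed in the post) factorizes the joint distribution and asks which factor changes between training and test:

```latex
p(x, y) = p(y \mid x)\, p(x)
  \quad\text{covariate shift: } p(x) \text{ changes, } p(y \mid x) \text{ fixed,} \\
p(x, y) = p(x \mid y)\, p(y)
  \quad\text{label shift: } p(y) \text{ changes, } p(x \mid y) \text{ fixed.}
```

Concept drift, by contrast, refers to a change in $p(y \mid x)$ itself.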