Maren Mahsereci
  • Invariant Gaussian Processes

    Gaussian processes can be understood as "distributions over functions", providing prior models for unknown functions. The kernel that characterizes the GP can be used to encode known properties of the function, such as smoothness or stationarity. A somewhat more exotic characteristic is invariance to input transformations, which we'll explore here (a minimal kernel sketch follows after the post list).

    November 1, 2021

    2021 · gaussianprocesses machinelearning · techblog

  • The Bias-Variance Decomposition

    The expected squared error of an estimator can be decomposed into a bias term and a variance term. Minimizing the error generally means that bias and variance cannot both be minimal at the same time; this is the bias-variance trade-off (the decomposition is written out after the post list).

    October 31, 2021

    2021 · statistics machinelearning · techblog

  • Dataset Shifts

    Distinguishing data distributions is a central topic of statistics. There are several ways to categorize data-distribution shifts, often with particular emphasis on differences between the training and the test distribution. Here, I collect some thoughts on the issue (one common categorization is sketched after the post list).

    October 30, 2021

    2021 · machinelearning statistics · techblog
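
For the invariant-GP post above, here is a minimal sketch of one standard way to build an invariant kernel: average a base kernel over all pairs of transformations from a finite group, so that functions drawn from the resulting GP are invariant under that group. The RBF base kernel, the sign-flip group, and the function names below are illustrative assumptions, not necessarily the construction used in the post.

```python
import numpy as np

def rbf(x, y, lengthscale=1.0):
    # Squared-exponential (RBF) base kernel on scalar inputs.
    return np.exp(-0.5 * (x - y) ** 2 / lengthscale ** 2)

def invariant_kernel(x, y, transforms, base=rbf):
    # Illustrative sketch: average the base kernel over all pairs of
    # transformations. A GP with this kernel produces functions that are
    # invariant under every transformation in `transforms`.
    return np.mean([[base(g(x), h(y)) for g in transforms] for h in transforms])

# Example: invariance under sign flips, G = {identity, negation}.
G = [lambda z: z, lambda z: -z]
print(invariant_kernel(1.3, 0.7, G))   # same value as below ...
print(invariant_kernel(-1.3, 0.7, G))  # ... with the first input flipped
```

Evaluating this kernel on a grid and sampling from the corresponding multivariate Gaussian yields draws that are symmetric under sign flips.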
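For the bias-variance post above, the decomposition can be written out as follows, with $\hat{\theta}$ an estimator of a fixed quantity $\theta$ and the expectation taken over the data the estimator is computed from (the post's notation may differ):

```latex
\mathbb{E}\big[(\hat{\theta} - \theta)^2\big]
  = \underbrace{\big(\mathbb{E}[\hat{\theta}] - \theta\big)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}\big[\big(\hat{\theta} - \mathbb{E}[\hat{\theta}]\big)^2\big]}_{\text{variance}}
```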
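For the dataset-shift post above, one common categorization (terminology varies, and the post may slice things differently) distinguishes shifts by which factor of $p(x, y) = p(y \mid x)\, p(x)$ changes between the training distribution $p_{\mathrm{tr}}$ and the test distribution $p_{\mathrm{te}}$:

```latex
\text{covariate shift:} \quad p_{\mathrm{tr}}(x) \neq p_{\mathrm{te}}(x),
  \qquad p_{\mathrm{tr}}(y \mid x) = p_{\mathrm{te}}(y \mid x) \\
\text{label (prior) shift:} \quad p_{\mathrm{tr}}(y) \neq p_{\mathrm{te}}(y),
  \qquad p_{\mathrm{tr}}(x \mid y) = p_{\mathrm{te}}(x \mid y) \\
\text{concept shift:} \quad p_{\mathrm{tr}}(y \mid x) \neq p_{\mathrm{te}}(y \mid x)
```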
