-
Covariance of the Wishart distribution
This post contains a derivation of the covariance between the elements of a Wishart-distributed random matrix, which can be expressed compactly as a symmetric Kronecker product.
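For orientation (this identity is standard and not quoted from the post; the post's contribution is packaging it as a symmetric Kronecker product), the elementwise covariance of a Wishart matrix W ~ W_p(n, Σ) with n degrees of freedom and scale matrix Σ is

```latex
% Elementwise covariance of W \sim \mathcal{W}_p(n, \Sigma):
\operatorname{cov}(W_{ij}, W_{kl}) = n \left( \Sigma_{ik} \Sigma_{jl} + \Sigma_{il} \Sigma_{jk} \right)
```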
-
Line Searches
Line searches are fast and efficient subroutines that determine the step size (a.k.a. the 'learning rate') of gradient-based optimizers at every iteration. Beyond this, line searches serve an auxiliary purpose in quasi-Newton methods, where a correctly chosen step size yields positive definite Hessian estimates and thus descent directions. In this post, we discuss two well-known instances of a line search and their use cases: 1) the backtracking line search, and 2) line searches based on cubic polynomials and the Wolfe conditions.
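As a minimal sketch of the first variant (not the post's implementation; the function name and default constants here are my own), a backtracking line search shrinks the step until the Armijo sufficient-decrease condition holds:

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Backtracking line search enforcing the Armijo condition:
    f(x + alpha * p) <= f(x) + c * alpha * grad_f(x) @ p.
    Shrinks alpha geometrically by the factor rho until it holds.
    """
    alpha = alpha0
    fx = f(x)
    slope = grad_f(x) @ p  # directional derivative; negative for a descent direction p
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

# Usage: one step of steepest descent on a simple quadratic.
f = lambda x: 0.5 * x @ x
grad_f = lambda x: x
x = np.array([3.0, -4.0])
p = -grad_f(x)
alpha = backtracking_line_search(f, grad_f, x, p)
print(alpha, f(x + alpha * p))
```

The Armijo condition alone does not rule out overly short steps; ruling those out is what the curvature part of the Wolfe conditions in the second variant adds.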
-
Quasi-Newton Methods
Limited-memory BFGS (L-BFGS) is one of the most successful gradient-based optimizers and arguably the gold standard in deterministic, non-convex optimization. It is a member of the Dennis family of quasi-Newton methods, which use low-rank approximations of the inverse Hessian to project the gradient. The resulting search direction can thus be thought of as an approximation to the Newton direction, with the important difference that, even for non-convex objective functions, it is always a descent direction (as long as the updates keep the approximation positive definite, e.g., under the Wolfe conditions). There is a multitude of symmetric and non-symmetric quasi-Newton updates, and here we'll discuss the most relevant ones.
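To make the idea of projecting the gradient through a low-rank inverse-Hessian approximation concrete, here is a sketch of the standard two-loop recursion used by L-BFGS (following Nocedal & Wright, Algorithm 7.4; variable names are my own and not taken from the post):

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: returns -H_k @ grad, where H_k is the implicit
    inverse-Hessian approximation built from the stored curvature pairs
    s_i = x_{i+1} - x_i and y_i = grad_{i+1} - grad_i."""
    q = grad.copy()
    alphas = []
    # First loop: traverse pairs from newest to oldest.
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / (y @ s)        # requires y @ s > 0 (curvature condition)
        alpha = rho * (s @ q)
        q -= alpha * y
        alphas.append((rho, alpha))
    # Initial scaling H_0 = gamma * I from the most recent pair.
    if s_list:
        gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: traverse pairs from oldest to newest.
    for (s, y), (rho, alpha) in zip(zip(s_list, y_list), reversed(alphas)):
        beta = rho * (y @ r)
        r += (alpha - beta) * s
    return -r  # a descent direction whenever all y_i @ s_i > 0

# Usage: with an empty memory, the direction reduces to steepest descent.
g = np.array([1.0, 2.0])
print(lbfgs_direction(g, [], []))  # -> [-1. -2.]
```

Storing only the m most recent (s, y) pairs keeps the cost and memory per iteration at O(mp) for p parameters, which is what makes L-BFGS viable in high dimensions.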