All articles

  1. CrKR

    In this article, I will take a look at how kernel ridge regression can be derived from linear regression and ridge regression. I will further show how this knowledge can be used to derive a more specialized model for reinforcement learning: cost-regularized kernel regression (CrKR). CrKR …
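
    To make the starting point concrete, here is a minimal kernel ridge regression sketch (my own toy example, not taken from the article): an RBF kernel, a made-up sine dataset, and the closed-form dual solution with ridge regularization.

        import numpy as np

        def rbf_kernel(A, B, gamma=10.0):
            """Squared exponential kernel between two sets of row vectors."""
            sq_dists = (np.sum(A ** 2, axis=1)[:, None]
                        + np.sum(B ** 2, axis=1)[None, :]
                        - 2.0 * A @ B.T)
            return np.exp(-gamma * sq_dists)

        rng = np.random.default_rng(0)
        X = rng.uniform(-3, 3, size=(50, 1))
        y = np.sin(X).ravel() + rng.normal(scale=0.1, size=50)

        lam = 1e-2  # ridge regularization strength
        K = rbf_kernel(X, X)
        alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)  # dual weights

        X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
        print(rbf_kernel(X_test, X) @ alpha)  # roughly sin(X_test)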

  2. BFGS

    Machine learning often involves optimization, that is, solving

    $$\arg\min_{x} f(x),\ f: \mathbb{R}^n \rightarrow \mathbb{R},$$

    where $f(x)$ is an objective function, for example, negative log-likelihood.

    Often you cannot find the optimum directly. That is why we need numerical, iterative optimization methods. I …
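
    As a small illustration of such an iterative method (not part of the article itself), the sketch below minimizes a hand-picked quadratic objective with SciPy's BFGS implementation; the function f and the starting point are assumptions made only for this demo.

        import numpy as np
        from scipy.optimize import minimize

        def f(x):
            """Toy objective with its minimum at (1, -2)."""
            return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

        # Iteratively improve an initial guess until convergence.
        result = minimize(f, x0=np.zeros(2), method="BFGS")
        print(result.x)    # approximately [ 1., -2.]
        print(result.fun)  # approximately 0.0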

  3. pytransform

    My work often combines motion capture, reinforcement learning, and robotics. There are many different systems involved, like proprietary motion capturing software, our own machine learning library, robotic simulations (e.g. MARS, Gazebo, or Bullet), and robotic middleware (e.g. ROS or RoCK). All of them come with their own complex …

  4. Maximum Likelihood

    Maximum likelihood is one of the fundamental concepts in statistics and artificial intelligence algorithms. What does it mean and how is it used in practice?

    Suppose you have some dataset $\mathcal{D}$ and a possible hypothesis $h$ of the latent function that might have generated the dataset. The probability distribution …
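
    As a rough sketch of the idea (my own toy example, not from the article), assume the hypothesis h is a Gaussian with unknown mean and standard deviation; the maximum likelihood estimates are the parameter values that make the observed data most probable.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        D = rng.normal(loc=2.0, scale=0.5, size=1000)  # toy dataset

        def log_likelihood(mu, sigma, data):
            """Log-probability of the data under the Gaussian hypothesis (mu, sigma)."""
            return np.sum(norm.logpdf(data, loc=mu, scale=sigma))

        # For a Gaussian, the maximum likelihood solution has a closed form:
        mu_ml, sigma_ml = np.mean(D), np.std(D)  # note: std divides by N, not N - 1

        # No other parameter choice assigns the data a higher likelihood.
        print(log_likelihood(mu_ml, sigma_ml, D) >= log_likelihood(1.5, 0.5, D))  # True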

  5. Learning Curves #2

    Almost everyone working in the field of machine learning is pretty sure about what a learning curve is. It seems to be intuitive. The problem is that each field has its own typical definition of a learning curve and it is unusual to write it down explicitly. The only …

  6. pybullet

    pybullet is a simple Python interface to the physics engine Bullet. It is easy to install (via pip install pybullet) and to use, yet it is a powerful tool. This article will give a brief glimpse at what you can do with it. A more detailed guide can be found …
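
    A minimal sketch of this kind of usage (assuming the example assets shipped with pybullet_data, such as plane.urdf and sphere2.urdf, are available): connect to the engine, load two bodies, and step the simulation.

        import pybullet
        import pybullet_data

        # DIRECT runs the engine headless; use pybullet.GUI to visualize instead.
        pybullet.connect(pybullet.DIRECT)
        pybullet.setAdditionalSearchPath(pybullet_data.getDataPath())
        pybullet.setGravity(0, 0, -9.81)

        pybullet.loadURDF("plane.urdf")
        sphere = pybullet.loadURDF("sphere2.urdf", basePosition=[0, 0, 2])

        # Simulate one second at the default time step of 1/240 s.
        for _ in range(240):
            pybullet.stepSimulation()

        position, orientation = pybullet.getBasePositionAndOrientation(sphere)
        print(position)  # the sphere has dropped towards the plane

        pybullet.disconnect()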

  7. Linear Support Vector Machine

    Model

    A linear Support Vector Machine implements the linear model $$y = \text{sign}\left(\boldsymbol{w}^T\boldsymbol{x} + b\right),$$ where $y \in \{-1, 1\}$ is a class label, $\boldsymbol{x} \in \mathbb{R}^D$ is an input vector, and $\boldsymbol{w} \in \mathbb{R}^D$, $b \in \mathbb …
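
    To illustrate the decision rule (a sketch with scikit-learn's LinearSVC and a made-up toy dataset, not the article's own implementation), the learned $\boldsymbol{w}$ and $b$ are plugged into exactly this sign function.

        import numpy as np
        from sklearn.svm import LinearSVC

        # Toy binary problem: two Gaussian blobs with labels in {-1, 1}.
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(-2.0, 1.0, size=(50, 2)),
                       rng.normal(2.0, 1.0, size=(50, 2))])
        y = np.hstack([-np.ones(50), np.ones(50)])

        svm = LinearSVC(C=1.0).fit(X, y)
        w, b = svm.coef_.ravel(), svm.intercept_[0]

        y_pred = np.sign(X @ w + b)  # the linear model y = sign(w^T x + b)
        print(np.mean(y_pred == y))  # training accuracy, close to 1.0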

  8. Regression

    Approximate an unknown function $f$ that maps from $\mathbb{R}^D$ to $\mathbb{R}^F$.

    Model:

    • We assume that there is some latent function $f: \mathbb{R}^D \rightarrow \mathbb{R}^F$.
    • We observe samples $(\boldsymbol{x}_n, \boldsymbol{y}_n)$ with $f(\boldsymbol{x}_n) + \boldsymbol{\epsilon}_n = \boldsymbol …
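
    As a small sketch of this setting (my own example with a hand-picked latent function and ordinary least squares; none of it is from the article), noisy samples of $f$ are generated and the model is recovered from them.

        import numpy as np

        rng = np.random.default_rng(0)

        def f(x):
            """Latent function f: R -> R that we pretend not to know."""
            return 3.0 * x - 1.0

        # Observed samples y_n = f(x_n) + epsilon_n with Gaussian noise.
        X = rng.uniform(-1, 1, size=(100, 1))
        y = f(X) + rng.normal(scale=0.1, size=(100, 1))

        # Ordinary least squares on [x, 1] recovers slope and intercept.
        X_design = np.hstack([X, np.ones((100, 1))])
        w, *_ = np.linalg.lstsq(X_design, y, rcond=None)
        print(w.ravel())  # approximately [ 3., -1.]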

Page 1 / 2 »