GP derivative in GPyTorch

I am working on a project that uses GP regression models as the transition and measurement models in a Kalman filter. This means I need to be able to sample from the derivative of the original GP model.

I know how to combine the various kernels offered in the GPyTorch library, but is there any way to implement my own mean and covariance functions?
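For context, GPyTorch does allow custom means and kernels by subclassing `gpytorch.means.Mean` and `gpytorch.kernels.Kernel`. A minimal sketch (the class names and the linear mean are placeholders of my own, not anything from the actual model):

```python
import torch
import gpytorch

class MyMean(gpytorch.means.Mean):
    """A custom mean: forward() gets an (n, d) batch and must return an (n,) tensor."""
    def __init__(self, input_dim):
        super().__init__()
        self.register_parameter("weights", torch.nn.Parameter(torch.zeros(input_dim)))

    def forward(self, x):
        return x @ self.weights  # a simple linear mean, just as an example

class MyRBFKernel(gpytorch.kernels.Kernel):
    """A hand-rolled RBF kernel using the base class's distance helper."""
    has_lengthscale = True

    def forward(self, x1, x2, diag=False, **params):
        x1_ = x1.div(self.lengthscale)
        x2_ = x2.div(self.lengthscale)
        # covar_dist computes the (squared) pairwise distances between rows
        return self.covar_dist(x1_, x2_, square_dist=True, diag=diag).mul(-0.5).exp()
```

These drop into a model in the same places the built-in `gpytorch.means.ConstantMean` and `gpytorch.kernels.RBFKernel` would go.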

In the case of an RBF kernel, the posterior mean is:

\begin{equation} \begin{aligned} \bar{f}_* &= \mathbf{k}(\mathbf{x}_*, \mathbf{X}) \, K(\mathbf{X}, \mathbf{X})^{-1} \mathbf{y} \\ &\stackrel{\triangle}{=} \mathbf{k}(\mathbf{x}_*, \mathbf{X}) \, \boldsymbol{\alpha} \end{aligned} \end{equation}

and the posterior covariance is: \begin{equation} \text{cov}(\mathbf{f}_*) = K(\mathbf{x}_*, \mathbf{x}_*) - K(\mathbf{x}_*, \mathbf{X}) K(\mathbf{X}, \mathbf{X})^{-1} K(\mathbf{x}_*, \mathbf{X})^T \end{equation}
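Spelled out in plain PyTorch (independent of GPyTorch), the two expressions above can be computed directly. Note that I add an observation-noise term $\sigma_n^2 I$ to $K(\mathbf{X}, \mathbf{X})$ for numerical stability, which the equations above leave implicit:

```python
import torch

def rbf(a, b, lengthscale, outputscale):
    """k(a, b) = s_f^2 * exp(-0.5 * ||(a - b) / l||^2); returns the (n, m) Gram matrix."""
    diff = (a.unsqueeze(1) - b.unsqueeze(0)) / lengthscale       # (n, m, D)
    return outputscale * torch.exp(-0.5 * diff.pow(2).sum(-1))

def gp_posterior(x_star, X, y, lengthscale, outputscale, noise):
    """Posterior mean and covariance at test points x_star, as in the equations above."""
    K = rbf(X, X, lengthscale, outputscale) + noise * torch.eye(X.shape[0])
    L = torch.linalg.cholesky(K)
    alpha = torch.cholesky_solve(y.unsqueeze(-1), L)             # alpha = K^{-1} y
    k_star = rbf(x_star, X, lengthscale, outputscale)            # k(x_*, X), shape (m, n)
    mean = k_star @ alpha                                        # \bar{f}_*
    cov = rbf(x_star, x_star, lengthscale, outputscale) \
          - k_star @ torch.cholesky_solve(k_star.T, L)           # posterior covariance
    return mean, cov, alpha
```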

The Jacobian of the mean function, which I would like to use to linearize the model, would then be: \begin{equation} \begin{aligned} \frac{\partial \bar{f}_*}{\partial \mathbf{x}_*} &= \frac{\partial \mathbf{k}(\mathbf{x}_*, \mathbf{X})}{\partial \mathbf{x}_*} \boldsymbol{\alpha} \\ &= \Lambda^{-1} \tilde{\mathbf{X}}_*^T \left( \mathbf{k}(\mathbf{x}_*, \mathbf{X})^T \odot \boldsymbol{\alpha} \right) \in \mathbb{R}^{D \times 1} \end{aligned} \end{equation}
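Taking $\Lambda$ to be the diagonal matrix of squared lengthscales and $\tilde{\mathbf{X}}_*$ to be the matrix with rows $\mathbf{x}_i - \mathbf{x}_*$ (my reading of the notation), this Jacobian can be evaluated in a few lines on top of the `rbf` helper and `alpha` from the snippet above:

```python
def mean_jacobian(x_star, X, alpha, lengthscale, outputscale):
    """d f_bar / d x_* = Lambda^{-1} X_tilde^T (k(x_*, X)^T ⊙ alpha), shape (D, 1)."""
    k_star = rbf(x_star.unsqueeze(0), X, lengthscale, outputscale)   # (1, n)
    X_tilde = X - x_star                                             # (n, D), rows x_i - x_*
    weighted = k_star.T * alpha                                      # (n, 1), elementwise product
    Lambda_inv = 1.0 / lengthscale.pow(2).reshape(-1, 1)             # (D, 1), diagonal of Lambda^{-1}
    return Lambda_inv * (X_tilde.T @ weighted)                       # (D, 1)
```

Here `lengthscale` is assumed to be a tensor of shape `(D,)` (one lengthscale per input dimension) or `(1,)` for an isotropic kernel.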

Is there any way to implement this as a mean function for my GP model? In the case of the filter, the state vector $\mathbf{x}_*$ would contain the current state and the control actions.
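One alternative worth noting: since GPyTorch's posterior mean is an ordinary differentiable `torch` expression of the test input, the linearization can also be obtained with autograd instead of the closed form, which sidesteps writing a custom mean entirely. A sketch, assuming `model` is a trained `gpytorch.models.ExactGP`:

```python
import torch

def linearize_at(model, x_star):
    """Jacobian of the GP posterior mean at a single test point x_star of shape (D,)."""
    model.eval()

    def posterior_mean(x):
        # evaluate the posterior mean at one point: (1, D) in, scalar out
        return model(x.unsqueeze(0)).mean.squeeze(0)

    return torch.autograd.functional.jacobian(posterior_mean, x_star)  # shape (D,)
```

The result can then be cross-checked against the analytic expression above.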

Tags: gaussian-process, kernel, python
