Implementation-of-Gaussian-Process-Regression

Objective:

This project implements Gaussian Process Regression from scratch (without any existing machine learning libraries such as scikit-learn) with three different kernels (Identity, Polynomial, and Gaussian) for nonlinear data.
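A minimal sketch of the core prediction step is shown below; it assumes a generic `kernel(X1, X2)` Gram-matrix function and a `noise_var` hyperparameter (both names are illustrative, not taken from this repository's code):

```python
import numpy as np

def gp_predict(X_train, y_train, X_test, kernel, noise_var=1.0):
    """Posterior mean of a Gaussian process at the test inputs.

    kernel(A, B) must return the Gram matrix with entries k(a_i, b_j).
    """
    K = kernel(X_train, X_train)          # n x n Gram matrix on the training set
    K_star = kernel(X_test, X_train)      # m x n cross-covariances
    # Solve (K + noise_var * I) alpha = y rather than forming an explicit inverse.
    alpha = np.linalg.solve(K + noise_var * np.eye(len(X_train)), y_train)
    return K_star @ alpha                 # posterior mean predictions
```

Solving the linear system instead of inverting the Gram matrix keeps the computation numerically stable; only the posterior mean is needed for the MSE comparisons below.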

Dataset:

The data used in this project corresponds to samples from a 3D surface.

Format:

There is one row per data instance and one column per attribute. The targets are real values. The training set is already divided into 10 subsets for 10-fold cross-validation.
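As an illustration of how the pre-split subsets can be used, a sketch of a 10-fold cross-validation loop follows. It assumes the subsets have already been loaded into a list `folds` of `(X, y)` pairs (the loading itself depends on the actual file layout and is not shown) and reuses the `gp_predict` sketch above:

```python
import numpy as np

def cross_validate(folds, kernel, noise_var=1.0):
    """10-fold CV: each fold is an (X, y) pair; returns the mean validation MSE."""
    errors = []
    for i in range(len(folds)):
        X_val, y_val = folds[i]
        # Stack the remaining nine folds into one training set.
        X_tr = np.vstack([X for j, (X, _) in enumerate(folds) if j != i])
        y_tr = np.concatenate([y for j, (_, y) in enumerate(folds) if j != i])
        y_pred = gp_predict(X_tr, y_tr, X_val, kernel, noise_var)
        errors.append(np.mean((y_pred - y_val) ** 2))
    return np.mean(errors)
```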

Data Visualization:

(Figure: visualization of the 3D surface data.)

Gaussian Kernel

We can show that the Gaussian kernel k(x, x') = exp(-||x - x'||^2 / (2σ^2)) can be expressed as an inner product in an infinite-dimensional feature space.
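For reference, one vectorized way to build this Gram matrix with NumPy is sketched below (the repository's own implementation may differ):

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x1_i - x2_j||^2 / (2 * sigma^2))."""
    # Pairwise squared distances via ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq_dists = (np.sum(X1**2, axis=1)[:, None]
                + np.sum(X2**2, axis=1)[None, :]
                - 2.0 * X1 @ X2.T)
    return np.exp(-sq_dists / (2.0 * sigma**2))
```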

Proof: (the original derivation is an image; a sketch of the argument follows below.)
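The standard argument, sketched here for one-dimensional inputs with σ = 1 (the repository's own derivation may differ in detail), expands the exponential of the cross term as a power series:

$$
\exp\!\left(-\frac{(x-x')^2}{2}\right)
= e^{-x^2/2}\, e^{-x'^2/2}\, e^{x x'}
= e^{-x^2/2}\, e^{-x'^2/2} \sum_{k=0}^{\infty} \frac{(x x')^k}{k!}
= \sum_{k=0}^{\infty} \frac{x^k e^{-x^2/2}}{\sqrt{k!}} \cdot \frac{(x')^k e^{-x'^2/2}}{\sqrt{k!}}
$$

so k(x, x') = ⟨φ(x), φ(x')⟩ with the feature map φ_k(x) = x^k e^{-x^2/2} / √(k!) for k = 0, 1, 2, ..., which has infinitely many coordinates. The same series expansion applied to exp(xᵀx' / σ²) handles general σ and multivariate inputs.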

Mean Squared Error Comparison between Different Kernels:

Identity Kernel:

The mean squared error on the test set for the Identity kernel is 1.227554.

Gaussian Kernel (w.r.t. σ):

(Figure: MSE for the Gaussian kernel as a function of σ.)

Polynomial Kernel (w.r.t. degree):

(Figure: MSE for the Polynomial kernel as a function of degree.)
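For reference, the polynomial kernel and a hyperparameter sweep of the kind these plots summarize could be sketched as follows. The specific degrees, σ values, and resulting MSE curves shown in the figures are not reproduced here, and `folds` is the hypothetical fold list from the cross-validation sketch above:

```python
import numpy as np

def polynomial_kernel(X1, X2, degree=2):
    """Gram matrix K[i, j] = (1 + x1_i . x2_j)^degree."""
    return (1.0 + X1 @ X2.T) ** degree

# Hypothetical sweeps: score each kernel setting by 10-fold cross-validation.
for degree in range(1, 5):
    kernel = lambda A, B, d=degree: polynomial_kernel(A, B, d)
    mse = cross_validate(folds, kernel)      # `folds` as in the CV sketch above
    print(f"degree={degree}: CV MSE={mse:.6f}")

for sigma in (0.5, 1.0, 2.0, 4.0):
    kernel = lambda A, B, s=sigma: gaussian_kernel(A, B, s)
    mse = cross_validate(folds, kernel)
    print(f"sigma={sigma}: CV MSE={mse:.6f}")
```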

Time Efficiency Comparison between Different Kernels:

Since the kernel trick is used here, the running time of Gaussian Process Regression with the polynomial kernel is O(1) with respect to the maximum degree of the monomial basis functions; the same holds for the Gaussian kernel. In this implementation, the matrix multiplications inside the polynomial kernel rely on NumPy's optimized routines rather than hand-written loops, which makes them considerably faster.
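In other words, the Gram matrix comes directly from (1 + xᵀx')^d, so one matrix product suffices no matter how large d is, whereas an explicit expansion would have to enumerate every monomial up to degree d. A small sketch illustrating this scaling claim on toy data (sizes are arbitrary):

```python
import time
import numpy as np

X = np.random.rand(500, 2)          # toy inputs, only to illustrate the scaling

for degree in (2, 5, 20, 100):
    start = time.perf_counter()
    K = (1.0 + X @ X.T) ** degree   # kernel trick: one matrix product, then a power
    elapsed = time.perf_counter() - start
    print(f"degree={degree}: Gram matrix built in {elapsed:.4f}s")

# The matrix product X @ X.T dominates and does not depend on `degree`,
# whereas an explicit monomial expansion of these 2-D inputs would need
# on the order of d^2 basis functions (and far more in higher dimensions).
```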

(Figure: running-time comparison between the kernels.)
