Kernels measure the similarity between two points x and y. As such, the kernel matrix built from any set of points is always symmetric and positive semi-definite (all eigenvalues are non-negative).
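This invariant can be checked empirically. Below is a minimal sketch that builds a Gaussian kernel matrix over a few random 2-D points (σ = 1 is an arbitrary choice) and verifies symmetry and non-negative eigenvalues.

```python
import numpy as np

# A handful of random 2-D points.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))

# Pairwise squared Euclidean distances, then the Gaussian kernel matrix.
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-sq_dists / 2.0)

assert np.allclose(K, K.T)              # symmetric
eigenvalues = np.linalg.eigvalsh(K)
assert np.all(eigenvalues >= -1e-10)    # PSD, up to floating-point error
print(eigenvalues)
```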

Another way to view a kernel is that it computes the inner product of x and y after both have been transformed into another feature space by some map φ. Instead of directly transforming the points and computing the inner product ⟨φ(x), φ(y)⟩, the kernel K(x, y) finds this value directly.
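A quick sketch of this equivalence: for 2-D points, the degree-2 polynomial kernel K(x, y) = (x · y)² equals the ordinary inner product under the explicit feature map φ(x) = (x₁², √2·x₁x₂, x₂²). The map `phi` below is that textbook construction, written out for illustration.

```python
import numpy as np

def phi(x):
    # Explicit feature map for the degree-2 homogeneous polynomial kernel in 2-D.
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

kernel_value = np.dot(x, y) ** 2          # K(x, y) = (x . y)^2, no transform needed
explicit_value = np.dot(phi(x), phi(y))   # transform first, then take inner product

assert np.isclose(kernel_value, explicit_value)
print(kernel_value)  # 1.0
```

The kernel side is two multiplications and a dot product; the explicit side had to materialize a higher-dimensional vector first, which is exactly the work kernels let us skip.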

There are multiple types of kernels for different transformations.

  1. Linear: K(x, y) = x · y
  2. Polynomial: K(x, y) = (x · y + c)^d
  3. Gaussian: K(x, y) = exp(−‖x − y‖² / (2σ²))
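The three kernels above can be sketched directly in NumPy. The parameters c, d, and sigma are free hyperparameters; the default values below are arbitrary.

```python
import numpy as np

def linear_kernel(x, y):
    return np.dot(x, y)

def polynomial_kernel(x, y, c=1.0, d=3):
    return (np.dot(x, y) + c) ** d

def gaussian_kernel(x, y, sigma=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
print(linear_kernel(x, y))      # 0.0  (orthogonal points)
print(polynomial_kernel(x, y))  # (0 + 1)^3 = 1.0
print(gaussian_kernel(x, y))    # exp(-1) ~ 0.3679
```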

Kernel Trick

The kernel trick is a strategy for applying linear models to non-linear data. By using a kernel in the model's dual form, which solely relies on the relationship between pairs of datapoints, we can effectively transform the data into another feature space where the classes are linearly separable. This trick is especially effective because kernels don't require us to know the actual feature space.
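To make the dual-form idea concrete, here is a sketch of a kernelized perceptron (one of the simplest linear models with a dual form) learning the XOR pattern, which no linear model can separate in the original space. Training and prediction touch only kernel values between pairs of points, never explicit features; the RBF kernel and σ = 1 are arbitrary choices for the demo.

```python
import numpy as np

def rbf(x, y, sigma=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, -1])  # XOR labels: not linearly separable in 2-D

n = len(X)
K = np.array([[rbf(X[i], X[j]) for j in range(n)] for i in range(n)])
alpha = np.zeros(n)  # dual coefficients: one weight per training point

for _ in range(100):  # perceptron passes over the data
    for i in range(n):
        # The decision function uses only pairwise kernel values.
        if np.sign(np.sum(alpha * y * K[:, i])) != y[i]:
            alpha[i] += 1.0  # mistake: bump this point's dual weight

preds = np.sign(K @ (alpha * y))
print(preds)  # [-1.  1.  1. -1.] -- all four XOR points classified correctly
```

Note that the "weights" live on datapoints (the α values), not on features; that is what lets the implicit feature space stay unknown.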

This trick is most notably used in 🛩️ Support Vector Machines, allowing them to fit a curve to the original data.