A support vector machine (SVM) is a popular choice for a classifier, and radial basis functions (RBFs) are commonly used kernels that extend SVMs to non-linearly separable problems. Two hyperparameters need to be tuned in this case. First, the margin is maximized by minimizing the function
\begin{equation*} \varphi(\fvec{w}, \delta) = \frac{\left\| \fvec{w} \right\|_2^2}{2} + C \sum_{i=1}^{N} \delta_i \end{equation*}with the weight vector \(\fvec{w}\) and the slack variables \(\delta_i \geq 0\). Here, we have to tune the regularization parameter \(C \in \mathbb{R}^+\). Second, the RBF kernel
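The effect of \(C\) can be sketched with sklearn's `SVC`, whose `C` argument is exactly this regularization parameter: a small \(C\) tolerates many margin violations (large slack values), while a large \(C\) penalizes them heavily, typically leaving fewer support vectors. The toy data below is illustrative and not the data used in the animation.

```python
import numpy as np
from sklearn.svm import SVC

# Two overlapping Gaussian blobs (illustrative data, not the Iris subset)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 2)), rng.normal(1.5, 1.0, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

# Small C: strong regularization, many points may violate the margin.
# Large C: slack variables are expensive, the margin adapts to the data.
for C in (0.01, 1.0, 100.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    print(f"C={C}: {clf.n_support_.sum()} support vectors")
```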
\begin{equation*} k(\fvec{x}_i, \fvec{x}_j) = e^{-\gamma \left\| \fvec{x}_i - \fvec{x}_j \right\|_2^2} \end{equation*}which measures the similarity of the data points \(\fvec{x}_i\) and \(\fvec{x}_j\) via their squared Euclidean distance, introduces the tunable scaling parameter \(\gamma \in \mathbb{R}^+\).
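The kernel formula can be checked directly against sklearn's implementation (`sklearn.metrics.pairwise.rbf_kernel`, which takes the same `gamma` parameter); the example points and the \(\gamma\) value below are arbitrary.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

x_i = np.array([[0.0, 0.0]])
x_j = np.array([[1.0, 2.0]])
gamma = 0.5

# Manual evaluation of k(x_i, x_j) = exp(-gamma * ||x_i - x_j||_2^2)
manual = np.exp(-gamma * np.sum((x_i - x_j) ** 2))

# Same kernel evaluated by sklearn
lib = rbf_kernel(x_i, x_j, gamma=gamma)[0, 0]
print(manual, lib)  # both equal exp(-0.5 * 5) ≈ 0.0821
```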
In the following animation, you can control both parameters and switch between a linear and an RBF kernel. It uses data points from the Iris flower dataset, restricted to two features and two classes (selected so that they are not linearly separable). The idea is inspired by this sklearn example.
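A static version of this setup can be sketched as follows. The exact feature and class selection is an assumption here (the first two features and the classes versicolor and virginica, which overlap in that projection); the attached notebook may use a different subset.

```python
from sklearn.datasets import load_iris
from sklearn.svm import SVC

iris = load_iris()
# Assumed selection: sepal length/width and the two overlapping
# classes versicolor (1) and virginica (2).
mask = iris.target > 0
X, y = iris.data[mask, :2], iris.target[mask]

# Fit both kernels with fixed example values for C and gamma
for kernel, params in (("linear", {}), ("rbf", {"gamma": 1.0})):
    clf = SVC(kernel=kernel, C=1.0, **params).fit(X, y)
    print(f"{kernel}: training accuracy = {clf.score(X, y):.3f}")
```

Since the two classes overlap in this projection, neither kernel reaches perfect training accuracy; varying `C` and `gamma` as in the animation changes how closely the RBF decision boundary follows individual points.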
List of attached files:
- SVMParametersRBF.ipynb (Jupyter notebook used to create the visualization)