SVC

Hyperparameters

  • coef0, the independent term in the kernel function (only significant for ‘poly’ and ‘sigmoid’); it controls how much the model is influenced by high-degree terms versus low-degree terms
  • gamma, kernel coefficient for ‘rbf’, ‘poly’ and ‘sigmoid’
    • rbf: gamma behaves like the inverse of a regularization hyperparameter, $\gamma = 1/\lambda$; a small gamma means a large $\lambda$, hence high bias and low variance
    • if the model is overfitting, reduce gamma; if it is underfitting, increase it (see the sketch below)
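
A minimal sketch of how these knobs are set in practice (not from the book; the gamma, coef0 and C values below are arbitrary placeholders):

from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# RBF kernel: a small gamma gives a wider Gaussian and a smoother boundary
rbf_clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma=0.1, C=1.0))

# Polynomial kernel: coef0 balances high-degree versus low-degree terms
poly_clf = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3, coef0=1.0, C=1.0))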

Example

In [4]:
# Get the Data
from sklearn.svm import SVC
from sklearn import datasets

iris = datasets.load_iris()
X = iris["data"][:, (2, 3)]  # petal length, petal width
y = iris["target"]

# Prepare the Training Set
setosa_or_versicolor = (y == 0) | (y == 1)
X = X[setosa_or_versicolor]
y = y[setosa_or_versicolor]

# Linear SVM Classifier model
svm_clf = SVC(kernel="linear", C=float("inf"))
svm_clf.fit(X, y)
Out[4]:
SVC(C=inf, cache_size=200, class_weight=None, coef0=0.0,
    decision_function_shape='ovr', degree=3, gamma='auto_deprecated',
    kernel='linear', max_iter=-1, probability=False, random_state=None,
    shrinking=True, tol=0.001, verbose=False)
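
Once fitted, the classifier predicts like any other scikit-learn estimator. A quick usage sketch (the two petal measurements below are made-up illustrative values):

import numpy as np

X_new = np.array([[1.5, 0.3],    # setosa-like petal length/width (cm)
                  [4.5, 1.4]])   # versicolor-like petal length/width (cm)
print(svm_clf.predict(X_new))    # expected: [0 1]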

Result Analysis

  • Attributes
    • coef_, the feature weights $\theta$, only available in the case of a linear kernel (see the sketch after the cell below)
    • intercept_, the bias term $\theta_0$
    • support_, indices of the support vectors
    • support_vectors_, the support vectors themselves, which lie on the edge of the street
In [9]:
print(svm_clf.support_)
print(svm_clf.support_vectors_)
[44 98]
[[1.9 0.4]
 [3.  1.1]]
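
For a linear kernel, coef_ and intercept_ define the separating hyperplane directly. A small sketch of recovering the decision boundary and margin from them (plain algebra, not code from the book):

import numpy as np

w = svm_clf.coef_[0]         # feature weights [theta_1, theta_2]
b = svm_clf.intercept_[0]    # bias term theta_0

# The decision function is w . x + b; the boundary is where it equals 0,
# so x2 = -(w[0] * x1 + b) / w[1] traces the boundary line.
x1 = np.linspace(0, 5.5, 200)
boundary_x2 = -(w[0] * x1 + b) / w[1]

# The gutters (edges of the street) sit where the decision function is +/-1,
# at a distance of 1 / ||w|| from the boundary.
margin = 1 / np.linalg.norm(w)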

Reference

  • Aurélien Géron, Hands-On Machine Learning with Scikit-Learn, Keras & TensorFlow