Introduction
AugmentedSVM is a novel support vector formulation that generalizes the classical Support Vector Machine. The framework can be used to learn a nonlinear classifier or regression function while simultaneously taking into account constraints (equality or inequality) on both the function value and its gradient. This work was first presented in the following paper:
Ashwini Shukla and Aude Billard. AugmentedSVM: Automatic Space Partitioning for Combining Multiple Nonlinear Dynamics. Advances in Neural Information Processing Systems (NIPS) 25, 2012, Lake Tahoe, Nevada, pages 1025-1033.
From SVM to ASVM
In the existing SVM formulations for classification and regression, the general idea is to apply constraints on the function value at the given data points. Loosely speaking, classification problems are characterized by inequality constraints of the form $y_i f(x_i) \geq 1$, and regression problems are composed of equality constraints of the form $f(x_i) = y_i$, where $f : \mathbb{R}^N \to \mathbb{R}$ is the required function, $x_i \in \mathbb{R}^N$ are the data points in the input space, and $y_i$ are labels (classification) or real values (regression). However, we showed that it is possible to also include arbitrary linear constraints on the function gradient in the primal form. Even with these additional primal constraints, Lagrange duality, which is necessary to kernelize the primal, holds as long as the gradient constraints remain linear in the primal variables. Formulating the dual with these additional constraints gives rise to new classes of support vectors in the final optimization problem, along with an extended kernel matrix. The new optimization problem is still convex and is completely specified not just by the dataset and a chosen kernel, but also by the first and second derivatives of the kernel function.
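As a rough sketch of the idea above (not the ASVM solver itself), the extended kernel matrix that pairs value constraints with gradient constraints can be assembled from a kernel, its gradient, and its cross second derivative. The snippet below does this for an RBF kernel; the function names, the choice of kernel, and the block layout are illustrative assumptions, not the package's API:

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    d = x - z
    return np.exp(-gamma * d.dot(d))

def rbf_grad_x(x, z, gamma=1.0):
    """Gradient of k with respect to its first argument x."""
    return -2.0 * gamma * (x - z) * rbf(x, z, gamma)

def rbf_grad_xz(x, z, gamma=1.0):
    """Cross second-derivative matrix d^2 k / (dx dz)."""
    d = x - z
    return (2.0 * gamma * np.eye(len(x))
            - 4.0 * gamma**2 * np.outer(d, d)) * rbf(x, z, gamma)

def extended_gram(X, gamma=1.0):
    """Block Gram matrix coupling function-value and gradient constraints."""
    n, dim = X.shape
    K   = np.empty((n, n))              # value-value block
    Kg  = np.empty((n * dim, n))        # gradient-value block
    Kgg = np.empty((n * dim, n * dim))  # gradient-gradient block
    for i in range(n):
        for j in range(n):
            K[i, j] = rbf(X[i], X[j], gamma)
            Kg[i*dim:(i+1)*dim, j] = rbf_grad_x(X[i], X[j], gamma)
            Kgg[i*dim:(i+1)*dim, j*dim:(j+1)*dim] = rbf_grad_xz(X[i], X[j], gamma)
    return np.block([[K, Kg.T], [Kg, Kgg]])
```

The resulting matrix is symmetric positive semi-definite, which is what keeps the dual optimization problem convex even after the gradient constraints are added.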
Applications
ASVM is a general framework for learning nonlinear function profiles when both the function values and the function derivatives at certain points in state space are known. That is, the training dataset is of the form $\{x_i, y_i, \nabla f(x_i)\}$. A few applications which may exploit this framework are the following:
Learning energy functions for control applications, where the derivatives and values of the desired function may be prescribed by Lyapunov stability criteria.
Surface reconstruction, where the desired surface is a particular isosurface of the regression function and the surface normals provide the derivative constraints.
The framework is particularly useful in these domains because derivative information is typically obtained each time the system is sampled for a data point. Instead of discarding the derivative information, one can feed it into the ASVM framework for more efficient learning.
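To make the dataset form concrete, the toy example below fits a 1-D function from a handful of values and derivative observations by solving a small linear system over the extended kernel matrix. This is plain gradient-augmented kernel interpolation, a simpler stand-in for the ASVM optimization; all names and the choice of target function are illustrative:

```python
import numpy as np

# Toy 1-D dataset: values and derivatives of f(x) = sin(x).
xs = np.linspace(0.0, np.pi, 5)
ys = np.sin(xs)          # function values y_i
gs = np.cos(xs)          # derivative observations f'(x_i)

gamma = 1.0
def k(a, b):    return np.exp(-gamma * (a - b)**2)
def k_da(a, b): return -2.0 * gamma * (a - b) * k(a, b)   # dk/da
def k_db(a, b): return  2.0 * gamma * (a - b) * k(a, b)   # dk/db
def k_dab(a, b):                                          # d^2 k / (da db)
    return (2.0 * gamma - 4.0 * gamma**2 * (a - b)**2) * k(a, b)

# Extended Gram matrix: top rows enforce values, bottom rows derivatives.
n = len(xs)
G = np.empty((2 * n, 2 * n))
for i in range(n):
    for j in range(n):
        G[i, j]         = k(xs[i], xs[j])
        G[i, n + j]     = k_db(xs[i], xs[j])
        G[n + i, j]     = k_da(xs[i], xs[j])
        G[n + i, n + j] = k_dab(xs[i], xs[j])

# Coefficients reproducing both the values and the derivatives.
coef = np.linalg.solve(G + 1e-10 * np.eye(2 * n), np.concatenate([ys, gs]))

def predict(x):
    """Evaluate the fitted function at a new point x."""
    return sum(coef[j] * k(x, xs[j]) + coef[n + j] * k_db(x, xs[j])
               for j in range(n))
```

With only five sample points, the derivative observations pin down the slope between samples, which is exactly the kind of extra information the ASVM dual exploits.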
Acknowledgments
I would like to thank the following people and organizations for their contributions:
Saurav Aryan wrote a part of this package, specifically the part using the NLOPT solvers, during his summer internship at EPFL. Dr. Basilio Noris helped with the efficient implementation of SMO for ASVM.
This work was supported by the EU Project FirstMM (FP7/2007-2013) under grant agreement number 248258.
