
Multiple Incremental Decremental Learning of Support Vector Machines
Masayuki Karasuyama · Ichiro Takeuchi

Mon Dec 07 07:00 PM -- 11:59 PM (PST)

We propose a multiple incremental decremental algorithm for the Support Vector Machine (SVM). The conventional single incremental decremental SVM can update the trained model efficiently when a single data point is added to or removed from the training set. When multiple data points are added and/or removed, however, this approach is time-consuming because it must be applied repeatedly, once per data point. The proposed algorithm is computationally more efficient when multiple data points are added and/or removed simultaneously. The single incremental decremental algorithm is built on an optimization technique called parametric programming. We extend this idea and introduce multi-parametric programming to develop the proposed algorithm. Experimental results on synthetic and real data sets indicate that the proposed algorithm significantly reduces the computational cost of multiple incremental decremental operations. Our approach is especially useful for online SVM learning, in which old data points must be removed and new data points added in a short amount of time.
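To make the online setting concrete, the sketch below simulates the sliding-window scenario the abstract describes: at each step, several old data points are removed and several new ones are added. This is NOT the paper's multi-parametric algorithm; it is the naive baseline the paper improves on, in which the SVM is simply retrained from scratch after every window update (here with a Pegasos-style subgradient solver for a linear SVM, a standard stand-in chosen for brevity).

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=50, seed=0):
    """Pegasos-style subgradient training of a linear SVM (hinge loss)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            if y[i] * (X[i] @ w) < 1:      # margin violation: hinge gradient
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:                          # only the regularizer contributes
                w = (1 - eta * lam) * w
    return w

rng = np.random.default_rng(0)
# Linearly separable toy stream: the label is the sign of the first feature.
X_stream = rng.normal(size=(200, 2))
y_stream = np.sign(X_stream[:, 0] + 1e-9)

window = 50   # keep the 50 most recent points
step = 10     # remove/add 10 points per update (the "multiple" case)
accs = []
for start in range(0, len(X_stream) - window + 1, step):
    Xw = X_stream[start:start + window]
    yw = y_stream[start:start + window]
    w = train_linear_svm(Xw, yw)           # naive full retrain each step
    accs.append(float(np.mean(np.sign(Xw @ w) == yw)))

print(len(accs))       # number of window updates performed
```

The cost of this baseline grows with the number of updates times the full training cost; repeated single incremental decremental updates avoid full retraining but still pay once per point. The paper's contribution is to process all added and removed points in one multi-parametric update of the trained model.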

Author Information

Masayuki Karasuyama (Kyoto University)
Ichiro Takeuchi (Nagoya Institute of Technology)
