Robust Novelty Detection via Worst Case CVaR Minimization

Novelty detection models aim to find the minimum volume set covering a given probability mass. This paper proposes a robust single-class support vector machine (SSVM) for novelty detection, based on worst-case conditional value-at-risk (CVaR) minimization. Assuming that every input is subject to uncertainty with a specified symmetric support, the robust formulation yields a maximization term similar to the regularization term of the classical SSVM.
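
For context, a minimal statement of the quantity being robustified (standard definitions, not the paper's exact formulation): the CVaR of a loss $L$ at level $\alpha$ admits the Rockafellar–Uryasev representation, and the worst-case CVaR takes the supremum over a family $\mathcal{P}$ of distributions consistent with the input uncertainty:

$$
\mathrm{CVaR}_{\alpha}(L) \;=\; \min_{c \in \mathbb{R}} \left\{ c + \frac{1}{1-\alpha}\, \mathbb{E}\big[(L - c)_{+}\big] \right\},
\qquad
\mathrm{WCVaR}_{\alpha}(L) \;=\; \sup_{P \in \mathcal{P}} \mathrm{CVaR}_{\alpha}^{P}(L).
$$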

When the uncertainty set is an $\ell_1$-norm, $\ell_\infty$-norm, or box set, training can be reformulated as a linear program; when the uncertainty set is an $\ell_2$-norm or ellipsoidal set, training is a tractable second-order cone program. The proposed method enjoys a statistical consistency property: as the training size goes to infinity, the estimated normal region converges to the true one, provided that the magnitude of the uncertainty set decreases in a systematic way. Experimental results on three data sets demonstrate its superiority over three benchmark models.
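
As a rough illustration of the second-order cone case, the sketch below (using cvxpy, with synthetic data and hypothetical parameter values) trains a generic robust one-class SVM under $\ell_2$-norm input uncertainty of radius eps; this is an assumed simplification of the idea, not the paper's worst-case CVaR formulation.

```python
import cvxpy as cp
import numpy as np

# Synthetic training inputs (assumed data for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
n, d = X.shape
nu, eps = 0.1, 0.05   # target mass parameter and uncertainty radius (assumed values)

w = cp.Variable(d)
rho = cp.Variable()
xi = cp.Variable(n, nonneg=True)

# Worst case over ||delta_i||_2 <= eps turns  w @ (x_i + delta_i) >= rho - xi_i
# into  w @ x_i - eps * ||w||_2 >= rho - xi_i,  a second-order cone constraint.
constraints = [X @ w - eps * cp.norm(w, 2) >= rho - xi]

# Standard one-class SVM objective; the solver handles the resulting SOCP.
objective = cp.Minimize(0.5 * cp.sum_squares(w) - rho + cp.sum(xi) / (nu * n))
cp.Problem(objective, constraints).solve()

# Points with w @ x - rho < 0 fall outside the estimated normal region.
scores = X @ w.value - rho.value
print("fraction flagged as novel:", np.mean(scores < 0))
```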
