

Oral in Workshop: Deep Generative Models and Downstream Applications

Self-Supervised Anomaly Detection via Neural Autoregressive Flows with Active Learning

Jiaxin Zhang · Kyle Saleeby · Thomas Feldhausen · Sirui Bi · Alex Plotkowski · David Womble


Abstract:

Many self-supervised methods have been proposed for image anomaly detection. These methods often rely on the paradigm of data augmentation with predefined transformations such as flipping, cropping, and rotation. However, it is not straightforward to apply these techniques to non-image data such as time series or tabular data, and the performance of existing deep approaches on tasks beyond images has fallen short of expectations. In this work, we propose a novel active learning (AL) scheme that relies on neural autoregressive flows (NAF) for self-supervised anomaly detection, particularly on small-scale data. Unlike other generative models such as GANs or VAEs, flow-based models explicitly learn the probability density and can therefore assign accurate likelihoods to normal data, which makes them well suited to anomaly detection. The proposed NAF-AL method efficiently generates random samples from the latent space and transforms them into feature space, together with their likelihoods, via an invertible mapping. Samples with lower likelihoods are selected and further screened by outlier detection using the Mahalanobis distance. The augmented samples, combined with the normal samples, are used to train a better detector that more closely approaches the decision boundary. Compared with random transformations, NAF-AL can be interpreted as a likelihood-oriented data augmentation scheme that is more efficient and robust. Extensive experiments show that our approach outperforms existing baselines on multiple time series and tabular datasets, as well as a real-world application in advanced manufacturing, with significant improvements in anomaly detection accuracy and robustness over the state of the art.
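To make the sampling-and-selection loop described in the abstract concrete, the following is a minimal sketch of the pipeline, not the authors' implementation. The flow itself is replaced by placeholder functions (`flow_inverse`, `flow_log_likelihood`), and all array shapes, thresholds, and the synthetic "normal" data are assumptions for illustration only; a trained NAF would supply the real invertible mapping and exact likelihoods.

```python
import numpy as np
from scipy.spatial.distance import mahalanobis

rng = np.random.default_rng(0)

# --- Hypothetical stand-ins for a trained neural autoregressive flow ---
# A real NAF provides an invertible mapping from latent to feature space
# and exact log-likelihoods via the change-of-variables formula.
def flow_inverse(z):
    """Placeholder invertible mapping from latent space to feature space."""
    return np.tanh(z) * 2.0 + 0.5 * z

def flow_log_likelihood(x):
    """Placeholder log-likelihood; a trained flow would return exact values."""
    return -0.5 * np.sum(x ** 2, axis=1)

# Step 1: draw random latent samples and map them into feature space.
z = rng.standard_normal((1000, 8))
x_aug = flow_inverse(z)
log_lik = flow_log_likelihood(x_aug)

# Step 2: keep the lowest-likelihood samples as candidate pseudo-anomalies.
k = 100
candidates = x_aug[np.argsort(log_lik)[:k]]

# Step 3: outlier check with Mahalanobis distance against the normal data
# (here a synthetic placeholder set; in practice, the available normal samples).
x_normal = rng.standard_normal((500, 8))
mu = x_normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(x_normal, rowvar=False))
dists = np.array([mahalanobis(c, mu, cov_inv) for c in candidates])
pseudo_anomalies = candidates[dists > np.quantile(dists, 0.5)]  # assumed cutoff

# Step 4: the retained pseudo-anomalies, together with the normal samples,
# would then be used to train a detector that tightens the decision boundary.
print(pseudo_anomalies.shape)
```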
