Generative Optimization Networks for Memory Efficient Data Generation
Shreshth Tuli · Shikhar Tuli · Giuliano Casale · Nicholas Jennings

Mon Dec 13 01:20 PM -- 01:30 PM (PST)

In standard generative deep learning models, such as autoencoders or GANs, the size of the parameter set is proportional to the complexity of the generated data distribution. A significant challenge is deploying such resource-hungry deep learning models on devices with limited memory, so as to avoid costly system upgrades. To combat this, we propose a novel framework called generative optimization networks (GON) that is similar to GANs, but does not use a generator, significantly reducing its memory footprint. GONs use a single discriminator network and run optimization in the input space to generate new data samples, achieving an effective compromise between training time and memory consumption. GONs are most suited for data generation problems in limited-memory settings. Here we illustrate their use for the problem of anomaly detection in memory-constrained edge devices arising from attacks or intrusion events. Specifically, we use a GON to calculate a reconstruction-based anomaly score for input time-series windows. Experiments on a Raspberry-Pi testbed with two existing datasets and a new suite of datasets show that our framework gives up to 32% higher detection F1 scores and 58% lower memory consumption, with only 5% higher training overhead compared to the state-of-the-art.
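The core idea of generator-free sampling can be sketched in a few lines: starting from noise, run gradient ascent on the discriminator's score in the input space, then use the distance between an input window and the optimized sample as an anomaly score. The sketch below is a toy illustration under assumptions not taken from the paper: the "discriminator" is a fixed analytic scorer centered on a mode `mu` (a stand-in for a trained network), and all names (`generate`, `anomaly_score`) are hypothetical.

```python
import numpy as np

def discriminator(x, mu):
    # Toy stand-in for a trained discriminator: higher score for
    # inputs closer to the mode mu (assumption, not the paper's model).
    return -np.sum((x - mu) ** 2)

def generate(mu, dim=8, steps=100, lr=0.1, seed=0):
    # GON-style generation: start from noise and run gradient ascent
    # on the discriminator score in the INPUT space -- no generator
    # network parameters are stored.
    rng = np.random.default_rng(seed)
    x = rng.normal(size=dim)
    for _ in range(steps):
        grad = -2.0 * (x - mu)  # analytic gradient of the toy score
        x = x + lr * grad       # ascent step toward higher realism
    return x

def anomaly_score(window, mu, **opt_kwargs):
    # Reconstruction-based score: generate a sample near the data
    # manifold and measure how far the input window lies from it.
    recon = generate(mu, dim=window.size, **opt_kwargs)
    return float(np.linalg.norm(window - recon))

mu = np.zeros(8)                    # "normal" mode of the toy data
normal_window = np.zeros(8)         # in-distribution window
attack_window = np.full(8, 5.0)     # far-from-distribution window
print(anomaly_score(normal_window, mu) < anomaly_score(attack_window, mu))
```

In the full method the discriminator is trained on real data and the same input-space optimization replaces the generator, which is where the memory savings come from.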

Author Information

Shreshth Tuli (Imperial College London)
Shikhar Tuli (Princeton University)
Giuliano Casale (Imperial College London)
Nicholas Jennings (Imperial College, London)
