BadTrack: A Poison-Only Backdoor Attack on Visual Object Tracking

Bin Huang · Jiaqian Yu · Yiwei Chen · Siyang Pan · Qiang Wang · Zhi Wang

Great Hall & Hall B1+B2 (level 1) #1625
Wed 13 Dec 3 p.m. PST — 5 p.m. PST


Visual object tracking (VOT) is one of the most fundamental tasks in the computer vision community. State-of-the-art VOT trackers extract positive and negative examples that guide the tracker to distinguish the object from the background. In this paper, we show that this characteristic can be exploited to introduce new threats, and we propose a simple yet effective poison-only backdoor attack. Specifically, we poison a small fraction of the training data by attaching a predefined trigger pattern to the background region of each video frame, so that the trigger appears almost exclusively in the extracted negative examples. To the best of our knowledge, this is the first work to reveal the threat of poison-only backdoor attacks on VOT trackers. We experimentally show that our backdoor attack significantly degrades the performance of both two-stream Siamese and one-stream Transformer trackers on poisoned data while achieving performance comparable to benign trackers on clean data.
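The poisoning step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes frames are NumPy arrays with an `(x, y, w, h)` object bounding box, and it pastes a predefined trigger patch into a frame corner that lies outside the box, so the trigger falls only in the background region from which negative examples are extracted. The function name `poison_frame` and the corner-placement heuristic are hypothetical choices for this sketch.

```python
import numpy as np

def poison_frame(frame, bbox, trigger, margin=8):
    """Paste `trigger` into a background corner of `frame`, avoiding `bbox`.

    frame:   (H, W, 3) uint8 image
    bbox:    (x, y, w, h) object bounding box in pixels
    trigger: (th, tw, 3) uint8 trigger pattern
    Returns a poisoned copy of the frame, or the original frame if the
    object covers every candidate corner.
    """
    H, W, _ = frame.shape
    th, tw, _ = trigger.shape
    x, y, w, h = bbox
    # Candidate top-left anchors for the trigger: the four frame corners.
    candidates = [
        (margin, margin),                    # top-left
        (margin, W - tw - margin),           # top-right
        (H - th - margin, margin),           # bottom-left
        (H - th - margin, W - tw - margin),  # bottom-right
    ]
    for ty, tx in candidates:
        # Accept a corner only if the trigger rectangle does not intersect
        # the object box, so it lands only in the background (negative) region.
        if tx + tw <= x or tx >= x + w or ty + th <= y or ty >= y + h:
            out = frame.copy()
            out[ty:ty + th, tx:tx + tw] = trigger
            return out
    return frame  # no background corner available; leave the frame clean
```

Applying this to a small fraction of training frames (the paper's poison-only setting) leaves the object annotations untouched; only the pixels outside the box change.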
