A Comparative Survey of Deep Active Learning
Xueying Zhan · Qingzhong Wang · Kuan-Hao Huang · Haoyi Xiong · Dejing Dou · Antoni Chan
Deep learning (DL) is data-hungry and usually relies on extensive labeled data to deliver good performance, whereas Active Learning (AL) reduces labeling costs by selecting a small proportion of samples from the unlabeled pool for labeling and training. Deep Active Learning (DAL) has therefore emerged in recent years as a feasible solution for maximizing model performance under a limited labeling cost/budget. Numerous DAL methods have been developed and various literature reviews conducted. In this work, we survey and categorize DAL-related works and construct comparative experiments across $10$ frequently used image classification datasets and $19$ DAL algorithms based on the \emph{$\text{DeepAL}^+$} toolbox. Our work is the largest comparative study to date. Additionally, we explore factors that influence the efficacy of DAL (e.g., batch size, number of training epochs), providing better references for researchers designing their DAL experiments or carrying out DAL-related applications.
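To make the AL selection step described above concrete, the following is a minimal illustrative sketch of one common query strategy, uncertainty (entropy) sampling; the function names and the toy probability values are assumptions for illustration, not taken from the surveyed methods or the $\text{DeepAL}^+$ toolbox.

```python
import math

def entropy(probs):
    # Shannon entropy of a predicted class distribution;
    # higher entropy means the model is less certain.
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_batch(unlabeled_probs, budget):
    # Rank unlabeled samples by predictive entropy and return the
    # indices of the `budget` most uncertain ones for labeling.
    ranked = sorted(range(len(unlabeled_probs)),
                    key=lambda i: entropy(unlabeled_probs[i]),
                    reverse=True)
    return ranked[:budget]

# Hypothetical model predictions over 3 classes for an unlabeled pool
pool = [
    [0.98, 0.01, 0.01],  # confident -> low entropy
    [0.34, 0.33, 0.33],  # uncertain -> high entropy
    [0.70, 0.20, 0.10],
]
print(select_batch(pool, 2))  # -> [1, 2]
```

In a full DAL loop, this selection step alternates with retraining the deep model on the growing labeled set until the labeling budget is exhausted.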