Training deep learning models that perform well on all classes of a medical dataset is a challenging task. Models often achieve suboptimal performance on some classes due to the natural class imbalance inherent in medical data. An effective way to tackle this problem is targeted active learning, in which data points belonging to the rare classes are iteratively added to the training set. However, existing active learning methods are ineffective at targeting rare classes in medical datasets. In this work, we propose Tailsman, a framework for targeted active learning that uses submodular mutual information (SMI) functions as acquisition functions. We show that Tailsman outperforms state-of-the-art active learning methods by ~10%-12% in rare-class accuracy and ~4%-6% in overall accuracy on the Path-MNIST and Pneumonia-MNIST image classification datasets.
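To make the acquisition step concrete, below is a minimal sketch of targeted selection via greedy maximization of one common submodular mutual information function, facility-location variant FL2MI(A; Q) = Σ_q max_{a∈A} s(a,q) + η Σ_{a∈A} max_q s(a,q), between an unlabeled pool A and a small query set Q of rare-class exemplars. The function names, the synthetic data, and the choice of FL2MI with cosine similarity are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def fl2mi(sim_aq, selected, eta=1.0):
    """FL2MI value for a selected subset; sim_aq is |pool| x |query| similarity."""
    if not selected:
        return 0.0
    S = sim_aq[selected]  # rows of the selected points
    # query-coverage term + eta * pool-relevance term
    return S.max(axis=0).sum() + eta * S.max(axis=1).sum()

def greedy_select(sim_aq, budget, eta=1.0):
    """Greedily pick `budget` pool points maximizing the FL2MI gain."""
    selected, remaining = [], set(range(sim_aq.shape[0]))
    for _ in range(budget):
        base = fl2mi(sim_aq, selected, eta)
        best, best_gain = None, -np.inf
        for j in remaining:
            gain = fl2mi(sim_aq, selected + [j], eta) - base
            if gain > best_gain:
                best_gain, best = gain, j
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy demo (synthetic features standing in for model embeddings):
rng = np.random.default_rng(0)
common = rng.normal([1.0, 0.0], 0.1, size=(40, 2))  # majority-class pool points
rare = rng.normal([0.0, 1.0], 0.1, size=(10, 2))    # rare-class pool points (indices 40-49)
pool = np.vstack([common, rare])
query = rng.normal([0.0, 1.0], 0.1, size=(3, 2))    # labeled rare-class exemplars

normalize = lambda X: X / np.linalg.norm(X, axis=1, keepdims=True)
sim = normalize(pool) @ normalize(query).T  # cosine similarity, pool x query

picks = greedy_select(sim, budget=5)
print(sorted(picks))  # the selected indices concentrate in the rare cluster (>= 40)
```

Because the SMI objective rewards similarity to the rare-class query set, the greedy step preferentially acquires rare-class points, which is the targeting behavior the abstract describes.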