Affinity Workshop: Women in Machine Learning

Topic: Building Identification In Aerial Imagery using Deep learning

Proscovia Nakiranda


Building identification is an important task for urban planning and settlement tracking, and it can also supplement limited data in developing countries where census data is inadequate and infrequent. Several deep learning architectures, such as the Fully Convolutional Network (FCN), U-Net, and DeepLab, can perform building identification in such data-limited scenarios and have given promising results. However, most of these architectures have drawbacks such as poor edge detection, which necessitates very large training datasets and, in turn, substantial computational resources. Additionally, adapting these trained models to other domains is challenging: a model trained on one region performs poorly on other regions. With several high-performing semantic segmentation architectures being developed and published, comparative studies that help us choose the best architecture for a specific task are of vital importance. In this research, we carry out building identification using semantic segmentation to classify each pixel as building or non-building. We use the diverse Inria aerial image labeling benchmark dataset (Maggiori et al., 2017). We intend to conduct a qualitative and quantitative comparative study of semantic segmentation architectures that use encoder-decoder designs, multitask learning, domain adaptation, and architectures that use encoder-decoder networks as their backbone. While comparative studies exist, such as Hu et al. (2019), they are not exhaustive: they consider specific architecture families, for instance encoder-decoder, and do not cover newer architectures. In our work, we also examine other factors that affect model performance, such as edge detection, hyperparameter tuning, and transfer learning.
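As an illustration, the per-pixel building/non-building classification described above is commonly scored with intersection over union (IoU) and pixel accuracy, the metrics typically reported on segmentation benchmarks such as Inria. A minimal sketch of these two metrics (function names are ours, not from the original work):

```python
import numpy as np

def iou(pred: np.ndarray, target: np.ndarray) -> float:
    """Intersection over Union for binary masks (1 = building, 0 = non-building)."""
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    # Convention: two empty masks agree perfectly.
    return 1.0 if union == 0 else float(intersection) / float(union)

def pixel_accuracy(pred: np.ndarray, target: np.ndarray) -> float:
    """Fraction of pixels where the predicted class matches the label."""
    return float((pred.astype(bool) == target.astype(bool)).mean())
```

Comparing architectures on both metrics matters because pixel accuracy can look high on images dominated by background while IoU exposes poor building-boundary predictions.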
