

Poster

Boosting the Performance of Generic Deep Neural Network Frameworks with Log-supermodular CRFs

Hao Xiong · Yangxiao Lu · Nicholas Ruozzi

Hall J (level 1) #506

Keywords: [ Log-Supermodular ] [ Structured Prediction ] [ Conditional Random Fields ]


Abstract:

Historically, conditional random fields (CRFs) were popular tools in a variety of application areas, from computer vision to natural language processing, but due to their higher computational cost and weaker practical performance, they have, in many situations, fallen out of favor and been replaced by end-to-end deep neural network (DNN) solutions. More recently, combined DNN-CRF approaches have been considered, but their speed and practical performance still fall short of the best-performing pure DNN solutions. In this work, we present a generic combined approach in which a log-supermodular CRF acts as a regularizer that encourages similarity between outputs in a structured prediction task. We show that this combined approach is widely applicable and practical (it incurs only a moderate overhead on top of the base DNN solution) and that, in some cases, it can rival carefully engineered pure DNN solutions on the same structured prediction task.
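The abstract does not spell out the construction, so the following is only a minimal sketch, in PyTorch, of the general pattern it describes: a base per-output DNN loss augmented with a pairwise regularization term that rewards agreement between neighboring outputs, the role played by attractive (log-supermodular) CRF potentials. The names `pairwise_smoothness_penalty`, `combined_loss`, and the weight `lam` are hypothetical, and the paper's actual log-supermodular CRF inference is more involved than this simple differentiable penalty.

```python
import torch
import torch.nn.functional as F

def pairwise_smoothness_penalty(probs):
    """Potts-style pairwise penalty over a grid of per-pixel label
    distributions. probs: (B, C, H, W) softmax outputs. Penalizes
    disagreement between horizontally and vertically adjacent pixels,
    mimicking attractive (log-supermodular) CRF pairwise potentials."""
    # L1 disagreement between vertically adjacent distributions
    dh = (probs[:, :, 1:, :] - probs[:, :, :-1, :]).abs().sum(dim=1)
    # L1 disagreement between horizontally adjacent distributions
    dw = (probs[:, :, :, 1:] - probs[:, :, :, :-1]).abs().sum(dim=1)
    return dh.mean() + dw.mean()

def combined_loss(logits, targets, lam=0.1):
    """Base cross-entropy loss plus the CRF-style smoothness regularizer;
    lam (hypothetical) trades off fit against output similarity."""
    ce = F.cross_entropy(logits, targets)
    probs = F.softmax(logits, dim=1)
    return ce + lam * pairwise_smoothness_penalty(probs)

# Toy usage: 4-class segmentation logits on a batch of 8x8 images.
logits = torch.randn(2, 4, 8, 8, requires_grad=True)
targets = torch.randint(0, 4, (2, 8, 8))
loss = combined_loss(logits, targets)
loss.backward()  # gradients flow through both the base loss and the regularizer
print(loss.item())
```

Because the penalty is differentiable, it simply adds one term to the training objective of whatever base DNN is used, which is consistent with the abstract's claim that the approach is generic and incurs only moderate overhead.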
