
Battling with Larger Models through Grounding and Searching
Yejin Choi

Scale appears to be the winning recipe in today's leaderboards. And yet, extreme-scale neural models are still brittle and make errors that are nonsensical or even counterintuitive. In this talk, I will discuss how smaller models developed in academia can still have an edge over larger industry-scale models if powered with grounding and searching. First, I will present MERLOT (and RESERVE), which learns neural script knowledge from complex multimodal data and achieves new state-of-the-art results on over a dozen multimodal benchmarks. Next, I will discuss the NeuroLogic (and NeuroLogic A*) search algorithms, which integrate logical constraints into language model decoding so that smaller unsupervised models can outperform larger supervised models on various constrained generation tasks.
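To make the idea of constraint-aware decoding concrete, here is a minimal, purely illustrative sketch (not the actual NeuroLogic implementation): a toy beam search in which a hypothetical scoring function stands in for a language model's log-probabilities, and hypotheses satisfying a hard lexical constraint receive a score bonus so the search steers toward constraint-satisfying outputs. All token names and score values below are invented for illustration.

```python
# Illustrative sketch of lexically constrained beam search.
# A toy per-token log-score table stands in for a real language model.

VOCAB = ["models", "are", "grounded", "robust", "<eos>"]
TOKEN_SCORES = {"models": -1.0, "are": -1.2, "grounded": -2.5,
                "robust": -1.5, "<eos>": -0.5}

def toy_score(seq):
    # Hypothetical stand-in for a language model's log-probability.
    return sum(TOKEN_SCORES[tok] for tok in seq)

def constrained_beam_search(beam_size=2, max_len=4, required="grounded"):
    beams = [([], 0.0)]  # (sequence, score) pairs
    for _ in range(max_len):
        candidates = []
        for seq, _ in beams:
            if seq and seq[-1] == "<eos>":
                candidates.append(seq)  # finished hypothesis: carry forward
                continue
            for tok in VOCAB:
                candidates.append(seq + [tok])
        # Rank by toy LM score plus a large bonus for satisfying the
        # constraint, so constraint-satisfying hypotheses rise to the top.
        scored = [(seq, toy_score(seq) + (5.0 if required in seq else 0.0))
                  for seq in candidates]
        scored.sort(key=lambda x: x[1], reverse=True)
        beams = scored[:beam_size]
    return beams[0][0]

print(constrained_beam_search())  # → ['grounded', '<eos>']
```

The actual NeuroLogic algorithms handle arbitrary logical formulas over lexical constraints (and, in the A* variant, lookahead estimates of future constraint satisfaction), which this toy bonus term does not capture.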

Author Information

Yejin Choi (University of Washington)
