

Spotlight in Workshop: Table Representation Learning Workshop

Tabular Representation, Noisy Operators, and Impacts on Table Structure Understanding Tasks in LLMs

Ananya Singha · José Cambronero · Sumit Gulwani · Vu Le · Chris Parnin

Keywords: [ table structure ] [ large language models ] [ in-context learning ]


Abstract:

Large language models (LLMs) are increasingly applied to tabular tasks using in-context learning. The prompt representation chosen for a table may affect the LLM's ability to process that table. Inspired by prior work, we generate a collection of self-supervised structural tasks (e.g., navigate to a cell or row; transpose the table) and evaluate the performance differences when using 8 formats. In contrast to past work, we introduce 8 noise operations inspired by real-world messy data and adversarial inputs, and show that such operations can impact LLM performance across formats for different structural understanding tasks.
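To make the described setup concrete, below is a minimal Python sketch of this kind of pipeline: a small table serialized into two candidate prompt formats, a simple cell-level noise operation, and a navigation-style structural task prompt. The helper names, the example table, and the specific noise operation are illustrative assumptions, not the authors' implementation or their actual 8 formats and 8 noise operations.

```python
# Hypothetical sketch of the evaluation setup the abstract describes:
# serialize a table in different prompt formats, optionally apply a
# noise operation, and build a structural-task prompt for an LLM.
import json
import random

table = {
    "header": ["city", "population"],
    "rows": [["Lagos", "15388000"], ["Lima", "9752000"]],
}

def to_markdown(t):
    """One candidate format: render the table as a markdown grid."""
    lines = ["| " + " | ".join(t["header"]) + " |",
             "| " + " | ".join("---" for _ in t["header"]) + " |"]
    lines += ["| " + " | ".join(row) + " |" for row in t["rows"]]
    return "\n".join(lines)

def to_json(t):
    """Another candidate format: render the table as JSON records."""
    return json.dumps([dict(zip(t["header"], row)) for row in t["rows"]])

def add_cell_typos(t, rate=0.5, seed=0):
    """Illustrative noise operation: duplicate the first character of
    some cells, mimicking real-world messy data."""
    rng = random.Random(seed)
    noisy_rows = [
        [c[0] + c if c and rng.random() < rate else c for c in row]
        for row in t["rows"]
    ]
    return {"header": t["header"], "rows": noisy_rows}

# Structural task prompt: navigate to a cell (row 2, column "city")
# in the noisy, markdown-formatted table.
noisy = add_cell_typos(table)
prompt = (
    "Given the table below, what is the value in row 2, "
    "column 'city'?\n\n" + to_markdown(noisy)
)
print(prompt)
```

Crossing formats (e.g., `to_markdown` vs. `to_json`) with noise operations and structural tasks in this way yields the kind of grid of conditions over which format-sensitive performance differences can be measured.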
