

Workshop

Table Representation Learning Workshop (TRL)

Madelon Hulsebos · Haoyu Dong · Laurel Orr · Qian Liu · Vadim Borisov

MTG 11&12

Sat 14 Dec, 8:15 a.m. PST

Tables are a promising modality for representation learning and generative models, with application potential too large to ignore. Yet tables have long been overlooked despite their dominant presence in the data landscape, e.g. in data management, data analysis, and ML pipelines. The majority of datasets in Google Dataset Search, for example, resemble typical tabular file formats like CSVs. Similarly, the top-three most-used database management systems are all designed for relational data. Representation learning for tables, possibly combined with other modalities such as code and text, has shown impressive performance on tasks like semantic parsing, question answering, table understanding, data preparation, and data analysis (e.g. text-to-SQL). The pre-training paradigm has also proven effective for tabular ML (classification/regression). More recently, we observe promising potential in applying and enhancing generative models (e.g. LLMs) in the domain of structured data to improve how we process and derive insights from structured data.

The Table Representation Learning workshop has been the key venue driving this research vision and establishing a community around TRL. The goals of the third edition of TRL at NeurIPS 2024 are to:
1) showcase the latest impactful TRL research, with a particular focus on industry insights this year,
2) explore new applications, techniques, and open challenges for representation learning and generative models for tabular data,
3) facilitate discussion and collaboration across the ML, NLP, and DB communities.
