Type | Time | Title | Speakers/Authors
Workshop | Sat 14:30 | How to build fully open language models: from pre-training to post-training | Hannaneh Hajishirzi
Workshop | Sat 16:15 | BLAP: Bootstrapping Language-Audio Pre-training for Music Captioning |
Workshop | | On Pre-training of Multimodal Language Models Customized for Chart Understanding | Wan-Cyuan Fan · Yen-Chun Chen · Mengchen Liu · Lu Yuan · Leonid Sigal
Workshop | Sat 11:00 | Optimizing Data Use for Efficient Pre-training | Danqi Chen
Affinity Event | | Hyperspherical Projections for Continual Learning Using Pre-Trained Models | Anuhya Thota
Poster | Thu 16:30 | Extracting Training Data from Molecular Pre-trained Models | Renhong Huang · Jiarong Xu · Zhiming Yang · Xiang Si · Xin Jiang · Hanyang Yuan · Chunping Wang · Yang Yang
Workshop | Sat 14:45 | Contributed talk: Evaluating Gender Bias Transfer between Pre-trained and Prompt Adapted Language Models | Natalie Mackraz
Workshop | Sat 9:45 | Is pre-training the key to successful domain generalization? | Kate Saenko
Workshop | Sat 12:00 | Mitigate the Gap: Investigating Approaches for Improving Cross-Modal Alignment in CLIP | Sedigheh (Sarah) Eslami · Gerard de Melo
Poster | Fri 16:30 | Transformers as Game Players: Provable In-context Game-playing Capabilities of Pre-trained Models | Chengshuai Shi · Kun Yang · Jing Yang · Cong Shen
Workshop | | Pre-Training Multimodal Hallucination Detectors with Corrupted Grounding Data | Spencer Whitehead · Jacob Phillips · Sean Hendryx
Workshop | | MIM-Refiner: A Contrastive Learning Boost from Intermediate Pre-Trained Representations | Benedikt Alkin · Lukas Miklautz · Sepp Hochreiter · Johannes Brandstetter