

Search All 2024 Events

363 Results

Page 1 of 31
Poster
Fri 11:00 An In-depth Investigation of Sparse Rate Reduction in Transformer-like Models
Yunzhe Hu · Difan Zou · Dong Xu
Workshop
Towards more efficient agricultural practices via transformer-based crop type classification
Isabella Smythe · Eduardo Ulises Moya · Michael J Smith · Yazid Mikail · Daisy Ondwari
Poster
Fri 16:30 ETO: Efficient Transformer-based Local Feature Matching by Organizing Multiple Homography Hypotheses
Junjie Ni · Guofeng Zhang · Guanglin Li · Yijin Li · Xinyang Liu · Zhaoyang Huang · Hujun Bao
Poster
Thu 16:30 Transformation-Invariant Learning and Theoretical Guarantees for OOD Generalization
Omar Montasser · Han Shao · Emmanuel Abbe
Workshop
Resolution-Agnostic Transformer-based Climate Downscaling
Declan Curran · Hira Saleem · Sanaa Hobeichi · Flora Salim
Affinity Event
Modeling cognitive processes of natural reading with transformer-based Language Models
Bruno Bianchi · Fermín Travi · Juan Esteban Kamienkowski
Poster
Fri 11:00 SpikedAttention: Training-Free and Fully Spike-Driven Transformer-to-SNN Conversion with Winner-Oriented Spike Shift for Softmax Operation
Sangwoo Hwang · Seunghyun Lee · Dahoon Park · Donghun Lee · Jaeha Kung
Poster
Wed 16:30 Provably Optimal Memory Capacity for Modern Hopfield Models: Transformer-Compatible Dense Associative Memories as Spherical Codes
Jerry Yao-Chieh Hu · Dennis Wu · Han Liu
Affinity Event
Enhancing Credit Risk Assessment through Transformer-Based Machine Learning Models
Jean Amukwatse · Joan Arimpa
Poster
Wed 11:00 Rethinking Decoders for Transformer-based Semantic Segmentation: A Compression Perspective
Qishuai Wen · Chun-Guang Li
Workshop
MolGen-Transformer: An open-source self-supervised model for Molecular Generation and Latent Space Exploration
Chih-Hsuan Yang · Rebekah Duke · Parker Sornberger · Moses Dominic · Chad Risko · Baskar Ganapathysubramanian
Workshop
Self-Attention Limits Working Memory Capacity of Transformer-Based Models
Dongyu Gong · Hantao Zhang