Workshop
Second Workshop on Efficient Natural Language and Speech Processing (ENLSP-II): The Future of Pre-trained Models
Mehdi Rezagholizadeh · Peyman Passban · Yue Dong · Lili Mou · Pascal Poupart · Ali Ghodsi · Qun Liu

Event format: Physical
Event URL: https://neurips2022-enlsp.github.io/

The second edition of the Efficient Natural Language and Speech Processing (ENLSP-II) workshop focuses on fundamental and challenging problems in making natural language and speech processing, especially pre-trained models, more efficient in terms of data, model size, training, and inference. The workshop program offers an interactive platform for bringing together experts and talent from academia and industry through invited talks, a panel discussion, paper submissions, reviews, interactive posters, oral presentations, and a mentorship program. It will be a unique opportunity to address the efficiency issues of current models, build connections, exchange ideas, brainstorm solutions, and foster future collaborations. The topics of this workshop will be of interest to people working on general machine learning, deep learning, optimization, theory, and NLP and speech applications.

Author Information

Mehdi Rezagholizadeh (Huawei Technologies)
Peyman Passban (Amazon)
Yue Dong (McGill University)

I am a final-year Ph.D. student in Computer Science in the Reasoning and Learning Lab at McGill University and at Mila, supervised by Dr. Jackie Cheung. I am broadly interested in artificial intelligence and natural language processing, particularly in applying deep learning and reinforcement learning techniques to natural language understanding and natural language generation.

Lili Mou (University of Alberta)
Pascal Poupart (University of Waterloo & Vector Institute)
Ali Ghodsi (University of Waterloo)
Qun Liu (Huawei Noah's Ark Lab)
