A new area is emerging at the intersection of artificial intelligence, machine learning, and systems design. Its growth has been accelerated by the explosive spread of diverse ML applications in production, the continued increase in data volume, and the complexity of large-scale learning systems. The goal of this workshop is to bring together experts working at the crossroads of machine learning, system design, and software engineering to explore the challenges faced when building large-scale ML systems. In particular, we aim to elicit new connections among these diverse fields, identifying theory, tools, and design principles tailored to practical machine learning workflows. We also want to think about best practices for research in this area and how to evaluate it. The workshop will cover state-of-the-art ML and AI platforms and algorithm toolkits (e.g., TensorFlow, PyTorch 1.0, MXNet), and will dive into machine learning-focused developments in distributed learning platforms, programming languages, data structures, hardware accelerators, benchmarking systems, and other topics.
This workshop will follow the successful model we have previously run at ICML, NeurIPS and SOSP.
Our plan is to run this workshop annually, co-located with one ML venue and one Systems venue, to help build a strong community that we believe will complement newer conferences such as SysML, which target research at the intersection of systems and machine learning. We believe this dual approach will help create a low barrier to participation for both communities.
This workshop is part two of a two-part series, with one day focusing on ML for Systems and the other on Systems for ML. Although the two workshops are led by different organizers, we are coordinating our calls for papers to ensure that the workshops complement each other and that submitted papers are routed to the appropriate venue.
Fri 8:30 a.m. - 8:40 a.m. | Welcome (Talk)
Fri 8:40 a.m. - 9:10 a.m. | Keynote 1: Machine Learning Reproducibility: An update from the NeurIPS 2019 Reproducibility Co-Chairs, Joelle Pineau, McGill University and Facebook (Talk)
Fri 9:10 a.m. - 9:30 a.m. | Contributed Talk: SLIDE: Training Deep Neural Networks with Large Outputs on a CPU faster than a V100-GPU (Talk)
Fri 9:30 a.m. - 9:50 a.m. | Contributed Talk: NeMo: A Toolkit for Building AI Applications Using Neural Modules (Talk)
Fri 9:50 a.m. - 10:00 a.m. | Poster Overview (Talk)
Fri 10:00 a.m. - 11:10 a.m. | Posters and Coffee (Poster Session)
Presenters: Sameer Kumar · Tomasz Kornuta · Oleg Bakhteev · Hui Guan · Xiaomeng Dong · Minsik Cho · Sören Laue · Theodoros Vasiloudis · Andreea Anghel · Erik Wijmans · Zeyuan Shang · Oleksii Kuchaiev · Ji Lin · Susan Zhang · Ligeng Zhu · Beidi Chen · Vinu Joseph · Jialin Ding · Jonathan Raiman · Ahnjae Shin · Vithursan Thangarasa · Anush Sankaran · Akhil Mathur · Martino Dazzi · Markus Löning · Darryl Ho · Emanuel Zgraggen · Supun Nakandala · Tomasz Kornuta · Rita Kuznetsova
Fri 11:10 a.m. - 11:40 a.m. | Keynote 2: Vivienne Sze, MIT (Talk)
Fri 11:40 a.m. - 12:00 p.m. | Contributed Talk: 5 Parallel Prism: A Topology for Pipelined Implementations of Convolutional Neural Networks Using Computational Memory (Talk)
Fri 12:00 p.m. - 1:30 p.m. | Lunch
Fri 1:30 p.m. - 3:30 p.m. | Systems Bonanza (10 minutes each): PyTorch, TensorFlow, Keras, TVM, Ray, ONNX Runtime, CoreML, Flux, MLFlow, MLPerf, Microsoft RL Systems, MXNet (Talk)
Fri 3:30 p.m. - 4:30 p.m. | Break and Poster Session (Poster Session)
Fri 3:30 p.m. - 4:30 p.m. | Posters and Coffee (Poster Session)
Fri 4:30 p.m. - 5:00 p.m. | Keynote 3 (Talk)
Fri 5:00 p.m. - 5:20 p.m. | Contributed Talk: LISA: Towards Learned DNA Sequence Search (Discussion Panel)
Fri 5:20 p.m. - 5:30 p.m. | Closing (Talk)
Author Information
Aparna Lakshmiratan (Facebook)
Siddhartha Sen (Microsoft Research)
Joseph Gonzalez (UC Berkeley)
Dan Crankshaw (UC Berkeley)
Sarah Bird (Microsoft)
Sarah's work focuses on research and emerging technology strategy for AI products in Azure. She works to accelerate the adoption and positive impact of AI by bringing together the latest innovations in research with the best of open source and product expertise to create new tools and technologies. Sarah currently leads Responsible AI for Azure Cognitive Services. Prior to joining Cognitive Services, she led the development of responsible AI tools in Azure Machine Learning. She is an active member of the Microsoft AETHER committee, where she works to develop and drive company-wide adoption of responsible AI principles, best practices, and technologies. Sarah was one of the founding researchers in the Microsoft FATE research group and, prior to joining Microsoft, worked on AI fairness at Facebook. She is an active contributor to the open source ecosystem: she co-founded ONNX, Fairlearn, and OpenDP's SmartNoise, and was a leader in the PyTorch 1.0 and InterpretML projects. She was an early member of the machine learning systems research community and has been active in growing and shaping it. She co-founded the MLSys research conference and the Learning Systems workshops. She has a Ph.D. in computer science from UC Berkeley, advised by Dave Patterson, Krste Asanovic, and Burton Smith.