We resolve the fundamental problem of online decoding with general nth order ergodic Markov chain models. Specifically, we provide deterministic and randomized algorithms whose performance is close to that of the optimal offline algorithm even when latency is small. Our algorithms admit efficient implementation via dynamic programs, and readily extend to (adversarial) non-stationary or time-varying settings. We also establish lower bounds for online methods under latency constraints in both deterministic and randomized settings, and show that no online algorithm can perform significantly better than our algorithms. To our knowledge, our work is the first to analyze general Markov chain decoding under hard constraints on latency. We provide strong empirical evidence to illustrate the potential impact of our work in applications such as gene sequencing.
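The abstract does not specify the algorithm, but the flavor of latency-constrained decoding it studies can be illustrated with a generic fixed-lag Viterbi sketch: after each new observation, the decoder irrevocably commits the hidden state that is `lag` steps old by backtracking along the current best partial path. This is a standard heuristic shown only for intuition (a first-order chain, with hypothetical parameter names `pi`, `A`, `B`), not the paper's algorithm.

```python
import math

def fixed_lag_decode(obs, pi, A, B, lag):
    """Fixed-lag Viterbi sketch (illustrative, not the paper's method).

    After observing obs[t], commit the hidden state at time t - lag by
    backtracking along the current best partial path.  Assumes a
    first-order chain and 1 <= lag < len(obs).
      pi -- initial state distribution
      A  -- transition matrix, A[r][s] = P(s | r)
      B  -- emission matrix,  B[s][o] = P(o | s)
    """
    n, T = len(A), len(obs)
    delta = [math.log(pi[s] * B[s][obs[0]]) for s in range(n)]
    back = []                      # back[t-1][s] = best predecessor of s at time t
    out = []
    for t in range(1, T):
        bp, nd = [], []
        for s in range(n):
            prev = max(range(n), key=lambda r: delta[r] + math.log(A[r][s]))
            nd.append(delta[prev] + math.log(A[prev][s] * B[s][obs[t]]))
            bp.append(prev)
        delta, back = nd, back + [bp]
        if t >= lag:               # commit the state that is `lag` steps old
            s = max(range(n), key=lambda r: delta[r])
            for step in range(t, t - lag, -1):
                s = back[step - 1][s]
            out.append(s)
    # flush: backtrack the last `lag` states from the final best state
    s = max(range(n), key=lambda r: delta[r])
    tail = [s]
    for step in range(T - 1, max(T - lag, 0), -1):
        s = back[step - 1][s]
        tail.append(s)
    return out + tail[::-1]
```

The dynamic program here keeps only the current score vector and the backpointer table, so each commitment is made with a bounded lookahead of `lag` observations; the offline Viterbi decoder, by contrast, may revise any past decision.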
Author Information
Vikas Garg (MIT)
Tamar Pichkhadze (MIT)
More from the Same Authors
- 2019 Poster: Solving graph compression via optimal transport (Vikas Garg · Tommi Jaakkola)
- 2019 Poster: Generative Models for Graph-Based Protein Design (John Ingraham · Vikas Garg · Regina Barzilay · Tommi Jaakkola)
- 2018 Poster: Learning SMaLL Predictors (Vikas Garg · Ofer Dekel · Lin Xiao)
- 2018 Poster: Supervising Unsupervised Learning (Vikas Garg · Adam Kalai)
- 2018 Spotlight: Supervising Unsupervised Learning (Vikas Garg · Adam Kalai)
- 2016 Poster: Learning Tree Structured Potential Games (Vikas Garg · Tommi Jaakkola)