Learning Discrete Distributions from Metastable Data via Pseudo-Likelihood
Abhijith Jayakumar · Andrey Lokhov · Sidhant Misra · Marc Vuffray
Abstract
Physically motivated stochastic dynamics are standard tools for sampling high-dimensional distributions but often mix slowly due to metastability. We show that for multivariate discrete distributions, the true stationary model can be learned from i.i.d. samples drawn from a metastable distribution. The key observation is that for strongly metastable states of a reversible chain with stationary distribution $\mu$, the single-variable conditionals of the metastable law are, on average, close to those of $\mu$, even when the two distributions are far in global metrics. This enables accurate parameter recovery with conditional-likelihood estimators such as pseudo-likelihood (PL). We formalize these guarantees and illustrate them numerically on the Curie-Weiss model, where PL succeeds while maximum likelihood fails.
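The phenomenon the abstract describes can be sketched numerically. Below is a minimal, self-contained illustration (not the authors' code): Glauber dynamics for a Curie-Weiss model with zero field is started in the all-up state, so at low temperature the samples come from a single metastable well rather than the full stationary law; the coupling is then recovered by maximizing the pseudo-likelihood, here simplified to a one-parameter convex problem in the coupling `J` via plain gradient descent. All parameter values (`n`, `J_true`, sample counts, step size) are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, J_true = 50, 1.5          # system size and true Curie-Weiss coupling (h = 0)
n_samples, gap = 1000, 2     # samples collected, sweeps between samples

def glauber_sweep(s, J):
    # One Glauber sweep: resample each spin from its true conditional
    # P(s_i = +1 | rest) = sigmoid(2 J m_i), with mean-field coupling J/n.
    for i in rng.permutation(n):
        m_i = (s.sum() - s[i]) / n
        s[i] = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * J * m_i)) else -1.0

# Start all-up: for J > 1 the chain is trapped in the positive-magnetization
# well for a time exponential in n, so these samples are metastable.
s = np.ones(n)
for _ in range(50):                       # burn-in inside the well
    glauber_sweep(s, J_true)
samples = np.empty((n_samples, n))
for t in range(n_samples):
    for _ in range(gap):
        glauber_sweep(s, J_true)
    samples[t] = s

def neg_pl_grad(J, S):
    # Gradient in J of the average negative log-pseudo-likelihood
    # -mean_i log sigmoid(2 s_i J m_i), with the field fixed at its true value 0.
    M = (S.sum(axis=1, keepdims=True) - S) / n   # cavity magnetizations m_i
    f = 2.0 * S * J * M
    return -(2.0 * S * M * (1.0 - 1.0 / (1.0 + np.exp(-f)))).mean()

J_hat = 0.5
for _ in range(500):                      # PL objective is convex in J
    J_hat -= 2.0 * neg_pl_grad(J_hat, samples)
print(f"true J = {J_true}, PL estimate = {J_hat:.3f}")
```

Even though the empirical distribution of `samples` is far from the stationary law in global metrics (its magnetization never changes sign), the single-site conditionals seen by the estimator match those of the true model, which is why the pseudo-likelihood estimate lands near `J_true`.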