
Poster

Mean Estimation in High-Dimensional Binary Markov Gaussian Mixture Models

Yihan Zhang · Nir Weinberger

Hall J (level 1) #822

Keywords: [ minimax rate ] [ high-dimensional statistics ] [ parameter estimation ] [ spectral estimator ] [ hidden Markov model ]


Abstract: We consider a high-dimensional mean estimation problem over a binary hidden Markov model, which illuminates the interplay between memory in data, sample size, dimension, and signal strength in statistical inference. In this model, an estimator observes $n$ samples of a $d$-dimensional parameter vector $\theta \in \mathbb{R}^d$, multiplied by a random sign $S_i$ ($1 \le i \le n$) and corrupted by isotropic standard Gaussian noise. The sequence of signs $\{S_i\}_{i \in [n]} \in \{-1,1\}^n$ is drawn from a stationary homogeneous Markov chain with flip probability $\delta \in [0, 1/2]$. As $\delta$ varies, this model smoothly interpolates between two well-studied models: the Gaussian Location Model, for which $\delta = 0$, and the Gaussian Mixture Model, for which $\delta = 1/2$. Assuming that the estimator knows $\delta$, we establish a nearly minimax optimal (up to logarithmic factors) estimation error rate, as a function of $\theta, \delta, d, n$. We then provide an upper bound for the case of estimating $\delta$, assuming (possibly inaccurate) knowledge of $\theta$. This bound is proved to be tight when $\theta$ is an accurately known constant. These results are then combined into an algorithm that estimates $\theta$ with $\delta$ unknown a priori, and we state theoretical guarantees on its error.
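The observation model described in the abstract can be sketched as a short simulation. This is an illustrative sampler, not code from the paper; the function name `sample_bhmm` and its defaults are assumptions:

```python
import numpy as np

def sample_bhmm(theta, n, delta, rng=None):
    """Draw n observations Y_i = S_i * theta + Z_i, where Z_i is isotropic
    standard Gaussian noise and the signs S_i follow a stationary
    Markov chain on {-1, +1} with flip probability delta.
    (Illustrative sketch; names and defaults are not from the paper.)"""
    rng = np.random.default_rng(rng)
    d = theta.shape[0]
    signs = np.empty(n, dtype=int)
    signs[0] = rng.choice([-1, 1])        # stationary distribution is uniform
    flips = rng.random(n - 1) < delta     # flip the sign with probability delta
    for i in range(1, n):
        signs[i] = -signs[i - 1] if flips[i - 1] else signs[i - 1]
    noise = rng.standard_normal((n, d))   # isotropic standard Gaussian noise
    return signs[:, None] * theta + noise, signs

# delta = 0 recovers the Gaussian Location Model (all signs equal);
# delta = 1/2 recovers the Gaussian Mixture Model (i.i.d. uniform signs).
Y, S = sample_bhmm(np.ones(5), n=1000, delta=0.1, rng=0)
```

With `delta = 0` the sign sequence is constant, so averaging the samples estimates $\pm\theta$ directly; with `delta = 1/2` the signs are i.i.d. and the sample mean tends to zero, which is why the mixture regime requires a different (e.g. spectral) estimator.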
