Adaptation in log-concave density estimation
Richard J Samworth
2016 Invited talk
in
Workshop: Adaptive and Scalable Nonparametric Methods in Machine Learning
Abstract
The log-concave maximum likelihood estimator of a density on the real line based on a sample of size $n$ is known to attain the minimax optimal rate of convergence of $O(n^{-4/5})$ with respect to, e.g., squared Hellinger distance. In this talk, we show that it also enjoys attractive adaptation properties, in the sense that it achieves a faster rate of convergence when the logarithm of the true density is $k$-affine (i.e. made up of $k$ affine pieces), provided $k$ is not too large. Our results use two different techniques: the first relies on a new Marshall's inequality for log-concave density estimation, and reveals that when the true density is close to log-linear on its support, the log-concave maximum likelihood estimator can achieve the parametric rate of convergence in total variation distance. Our second approach depends on local bracketing entropy methods, and allows us to prove a sharp oracle inequality, which implies in particular that the rate of convergence with respect to various global loss functions, including Kullback--Leibler divergence, is $O(kn^{-1} \log^{5/4} n)$ when the true density is log-concave and its logarithm is close to $k$-affine.
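To make the object in the abstract concrete, here is a minimal numerical sketch (not the speaker's implementation) of the univariate log-concave maximum likelihood estimator, computed by convex programming on a grid with cvxpy. The grid size, sample, weights, and solver choice are illustrative assumptions; the exact MLE has a piecewise-affine logarithm with knots at data points and is usually computed with dedicated software such as the R package logcondens.

```python
# Sketch: log-concave MLE on a grid via convex optimisation (assumed setup).
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
x = rng.laplace(size=200)          # log of the Laplace density is 2-affine

m = 200                                        # grid resolution (assumption)
grid = np.linspace(x.min(), x.max(), m)
h = grid[1] - grid[0]
idx = np.searchsorted(grid, x).clip(0, m - 1)  # nearest grid point to each X_i

phi = cp.Variable(m)               # phi approximates log f on the grid
# Concavity of phi enforced through non-positive second differences.
constraints = [phi[2:] - 2 * phi[1:-1] + phi[:-2] <= 0]

# Unconstrained-normalisation form of the log-likelihood:
# (1/n) * sum_i phi(X_i) - integral exp(phi) + 1.
w = np.full(m, h)
w[0] = w[-1] = h / 2                           # trapezoidal quadrature weights
objective = cp.Maximize(
    cp.sum(phi[idx]) / len(x) - cp.sum(cp.multiply(w, cp.exp(phi))) + 1
)
prob = cp.Problem(objective, constraints)
prob.solve()

f_hat = np.exp(phi.value)          # estimated density on the grid
print("integral of f_hat ~", float(w @ f_hat))
```

Because the sample here is drawn from a Laplace density, whose logarithm is 2-affine, this is exactly the regime where the abstract's adaptation results predict a faster-than-$n^{-4/5}$ rate for the estimator.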