
Kernels and learning curves for Gaussian process regression on random graphs
Peter Sollich · Matthew J Urry · Camille Coti

Tue Dec 08 03:24 PM -- 03:25 PM (PST)
We investigate how well Gaussian process regression can learn functions defined on graphs, using large regular random graphs as a paradigmatic example. Random-walk-based kernels are shown to have some surprising properties: within the standard approximation of a locally tree-like graph structure, the kernel does not become constant, i.e. neighbouring function values do not become fully correlated, when the lengthscale $\sigma$ of the kernel is made large. Instead the kernel attains a non-trivial limiting form, which we calculate. The fully correlated limit is reached only once loops become relevant, and we estimate where the crossover to this regime occurs. Our main subject is learning curves of Bayes error versus training set size. We show that these are qualitatively well predicted by a simple approximation using only the spectrum of a large tree as input, and generically scale with $n/V$, the number of training examples per vertex. We also explore how this behaviour changes once kernel lengthscales are large enough for loops to become important.
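As a hedged illustration of the two ingredients the abstract mentions, the sketch below builds a random-walk kernel on a simple regular graph (a ring, used here purely for convenience; the parametrisation $K \propto ((1-a^{-1})I + a^{-1}A/d)^p$ and the self-consistent eigenvalue approximation to the Bayes error are written in an illustrative form, not as the paper's exact code; all function names and parameter values are assumptions).

```python
import numpy as np

def ring_adjacency(V):
    """Adjacency matrix of a V-node ring, a simple 2-regular graph."""
    A = np.zeros((V, V))
    for i in range(V):
        A[i, (i + 1) % V] = A[(i + 1) % V, i] = 1.0
    return A

def random_walk_kernel(A, d, a=2.0, p=20):
    """Random-walk kernel K ∝ ((1 - 1/a) I + (1/a) A/d)^p, normalised so
    the average prior variance (mean of the diagonal) equals 1.
    Larger p at fixed a corresponds to a larger kernel lengthscale."""
    V = A.shape[0]
    M = (1.0 - 1.0 / a) * np.eye(V) + (1.0 / a) * A / d
    K = np.linalg.matrix_power(M, p)
    return K * V / np.trace(K)

def learning_curve(eigvals, n_values, noise=0.1):
    """Eigenvalue-based approximation to the Bayes error epsilon(n),
    solved by fixed-point iteration of
        epsilon = sum_i lambda_i / (1 + n lambda_i / (epsilon + noise^2)).
    This is an illustrative self-consistent form; the paper's approximation
    uses the spectrum of a large tree as input."""
    errs = []
    for n in n_values:
        eps = eigvals.sum()  # prior variance: the n = 0 starting point
        for _ in range(200):
            eps = np.sum(eigvals / (1.0 + n * eigvals / (eps + noise ** 2)))
        errs.append(eps)
    return np.array(errs)

V, d = 50, 2
A = ring_adjacency(V)
K = random_walk_kernel(A, d, a=2.0, p=20)
lam = np.linalg.eigvalsh(K) / V   # per-vertex eigenvalue scaling
errs = learning_curve(lam, n_values=[0, 10, 50, 200])
```

The per-vertex scaling of the eigenvalues reflects the $n/V$ scaling of the learning curves noted above: the Bayes error starts at the prior variance for $n = 0$ and decreases monotonically with the number of examples per vertex.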

Author Information

Peter Sollich (King's College London)
Matthew J Urry (King's College London)
Camille Coti (INRIA Saclay-Île de France)
