Statistical analysis of a graph often starts with embedding, the process of representing its nodes as points in space. How to choose the embedding dimension is a nuanced decision in practice, but in theory a notion of true dimension is often available. In spectral embedding, this dimension may be very high. However, this paper shows that existing random graph models, including graphon and other latent position models, predict the data should live near a much lower-dimensional set. One may therefore circumvent the curse of dimensionality by employing methods which exploit hidden manifold structure.
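Spectral embedding, as referred to in the abstract, can be sketched as a truncated eigendecomposition of the adjacency matrix. The following is a minimal illustration, not the paper's implementation; the function name, the stochastic block model example, and the choice of `d` are all assumptions made for demonstration.

```python
import numpy as np

def spectral_embedding(A, d):
    """Embed nodes as rows of an n-by-d matrix using the top-d
    eigenpairs (by absolute eigenvalue) of the symmetric adjacency matrix A."""
    vals, vecs = np.linalg.eigh(A)           # eigendecomposition of symmetric A
    order = np.argsort(-np.abs(vals))[:d]    # indices of top-d eigenvalues by magnitude
    return vecs[:, order] * np.sqrt(np.abs(vals[order]))

# Illustrative example: a small two-community stochastic block model graph.
rng = np.random.default_rng(0)
n = 100
z = rng.integers(0, 2, size=n)                    # latent community labels
B = np.array([[0.5, 0.1], [0.1, 0.5]])            # block connection probabilities
P = B[z][:, z]                                    # edge probability matrix
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                                       # symmetric, no self-loops
X = spectral_embedding(A, d=2)                    # 2-dimensional node embedding
print(X.shape)  # (100, 2)
```

Under a latent position model such as this, the embedded points concentrate near a low-dimensional set, which is the kind of hidden structure the abstract refers to.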
Publication status: Accepted/In press - 2020
Event: Neural Information Processing Systems (NeurIPS) - Virtual-only
Duration: 6 Dec 2020 → 12 Dec 2020