Upcoming Events
Wed Dec. 20
Learning from higher-order correlations, efficiently
Higher-order correlations in data are important, but how do neural networks extract information from them efficiently? We study this question in simple models of single- and two-layer neural networks. We first show that neural networks learn the statistics of their data in a hierarchical way. We then show that while learning from higher-order correlations is expensive in terms of sample complexity, correlations between the latent variables of the data help neural networks accelerate learning. We close by discussing phase transitions in the higher-order cumulants of inputs with translation symmetry, and their importance for feature learning in neural networks.
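As background for the abstract's central object: higher-order cumulants are the statistics of a distribution beyond the mean and covariance, and they vanish for Gaussian data. The sketch below (illustrative only, not material from the talk) estimates the second, third, and fourth cumulants of a scalar variable from samples, contrasting Gaussian inputs with a non-Gaussian (exponential) distribution.

```python
import numpy as np

# Illustrative sketch: for a zero-mean scalar x, the third cumulant is
# E[x^3] and the fourth is E[x^4] - 3*E[x^2]^2. Gaussians have vanishing
# higher-order cumulants, so information beyond the covariance can only
# come from non-Gaussian structure in the inputs.

rng = np.random.default_rng(0)
n = 1_000_000

gaussian = rng.standard_normal(n)
skewed = rng.exponential(1.0, n) - 1.0  # zero-mean, non-Gaussian

def cumulants(x):
    """Estimate the 2nd, 3rd, and 4th cumulants of a 1-D sample."""
    x = x - x.mean()
    c2 = np.mean(x**2)                 # variance
    c3 = np.mean(x**3)                 # third cumulant (skewness * c2^{3/2})
    c4 = np.mean(x**4) - 3 * c2**2     # fourth cumulant (excess kurtosis * c2^2)
    return c2, c3, c4

print(cumulants(gaussian))  # c3 and c4 close to 0
print(cumulants(skewed))    # c3 close to 2, c4 close to 6 for exponential(1)
```

The larger sample size needed to pin down c3 and c4 (relative to c2) is one way to see why learning from higher-order correlations is expensive in sample complexity.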