Neural intel Pod

Online Learning Neural Networks: Bounds and Characterization



This research investigates online learning for feedforward neural networks with the sign activation function. The paper identifies a margin condition in the first hidden layer as the key to learnability, showing that the optimal error bound is characterized by the totally-separable packing number of the input space, which in some cases grows exponentially with the input dimension. To address this dimensionality issue, the authors study two scenarios: a multi-index model, in which the target function depends only on a lower-dimensional projection of the input and improved bounds are possible, and a setting with a large margin in every layer, which yields bounds depending on the network depth and the number of output labels. The paper also provides an adaptive method for learning when these margin parameters are unknown and extends the analysis to the agnostic learning setting.
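To make the setting concrete, below is a minimal illustrative sketch (not the paper's algorithm or construction): a feedforward network in which every layer applies the sign activation, queried in an online protocol that counts mistakes, together with a simple first-hidden-layer margin quantity. The class names, layer sizes, and the exact margin formula are assumptions made here for illustration only.

```python
import numpy as np

def sign(z):
    """Sign activation mapping to {-1, +1} (zeros sent to +1)."""
    return np.where(z >= 0, 1.0, -1.0)

class SignNetwork:
    """Feedforward network whose layers all use the sign activation (illustrative)."""
    def __init__(self, layer_sizes, rng):
        # One weight matrix per layer; biases omitted for simplicity.
        self.weights = [rng.standard_normal((m, n))
                        for n, m in zip(layer_sizes[:-1], layer_sizes[1:])]

    def first_layer_margin(self, x):
        """A (hypothetical) first-hidden-layer margin: the normalized distance of
        each first-layer pre-activation from zero, minimized over units."""
        W = self.weights[0]
        return np.min(np.abs(W @ x) / np.linalg.norm(W, axis=1))

    def predict(self, x):
        h = x
        for W in self.weights:
            h = sign(W @ h)
        return h

# Online protocol: on each round the learner predicts, then observes the true
# label and counts a mistake if it was wrong. No update rule is shown here.
rng = np.random.default_rng(0)
target = SignNetwork([5, 4, 1], rng)   # unknown target network
learner = SignNetwork([5, 4, 1], rng)  # placeholder hypothesis (not updated)

mistakes = 0
for t in range(100):
    x = rng.standard_normal(5)
    y_true = target.predict(x)[0]
    y_hat = learner.predict(x)[0]
    mistakes += int(y_hat != y_true)
print("mistakes over 100 rounds:", mistakes)
```

The sketch is only meant to show what "online learning of a sign-activation network under a first-layer margin condition" refers to; the paper's results concern the best achievable mistake bounds in this kind of protocol, not any particular update rule.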


Neural intel Pod, by Neural Intelligence Network