Abstract: Neural networks (NNs) with random weights appear in a variety of machine learning applications, perhaps most prominently as the initialization of many deep learning algorithms. We take one step closer to their theoretical foundation by addressing the following data separation problem: Under what conditions can a random NN make two classes \(\mathcal{X}^{-}, \mathcal{X}^{+} \subset \mathbb{R}^{d}\) (with positive distance) linearly separable?
2021 Online International Conference on Computational Harmonic Analysis (Online-ICCHA2021).
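As a minimal numerical sketch of the separation question posed above (a hypothetical toy setup, not the paper's construction): two concentric spheres in \(\mathbb{R}^{d}\) have positive distance but are not linearly separable in the input space; one random ReLU layer is applied, and a perceptron on the resulting features tests whether the embedded classes become linearly separable. The data, width, and scaling choices below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy classes with positive distance: points on two concentric spheres
# (radii 1 and 3), which are NOT linearly separable in the input space.
d, n = 10, 200
U = rng.normal(size=(n, d))
X_minus = U / np.linalg.norm(U, axis=1, keepdims=True)        # radius 1
V = rng.normal(size=(n, d))
X_plus = 3.0 * V / np.linalg.norm(V, axis=1, keepdims=True)   # radius 3
X = np.vstack([X_minus, X_plus])
y = np.concatenate([-np.ones(n), np.ones(n)])

# One random ReLU layer with Gaussian weights and biases (the width and
# the 1/sqrt(d) scaling are illustrative choices, not taken from the paper).
width = 500
W = rng.normal(size=(d, width)) / np.sqrt(d)
b = rng.normal(size=width)
features = np.maximum(X @ W + b, 0.0)

# Perceptron on the random features: it reaches zero training error in
# finitely many updates iff the embedded classes are linearly separable.
w, bias = np.zeros(width), 0.0
for _ in range(1000):
    mistakes = 0
    for phi, label in zip(features, y):
        if label * (phi @ w + bias) <= 0:
            w += label * phi
            bias += label
            mistakes += 1
    if mistakes == 0:
        break

print("separable after the random layer:",
      bool(np.all(np.sign(features @ w + bias) == y)))
```

Note that with the width chosen at least as large as the sample size, separability of points in general position is generic (Cover's counting argument), so this sketch only illustrates the question; the substantive issue is under what quantitative conditions on the width, the input dimension, and the distance between the classes separation is guaranteed.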