The Separation Capacity of Random Neural Networks

Abstract: Neural networks (NNs) with random weights appear in a variety of machine learning applications, most prominently as the initialization of many deep learning algorithms and as a computationally cheap alternative to fully learned NNs. We take one step closer to their theoretical foundation by addressing the following data separation problem: under what conditions can a random NN make two classes \(\mathcal{X}^{-}, \mathcal{X}^{+} \subset \mathbb{R}^{d}\) (with positive distance) linearly separable?
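The question can be made concrete with a small numerical experiment. The sketch below (an illustration only, not the paper's construction or proof) passes two classes that are not linearly separable in the input space through a single untrained random ReLU layer and then checks whether a linear classifier separates the lifted features. The two-moons data, the layer width of 500, and the use of scikit-learn's LinearSVC are my own illustrative choices.

```python
# Illustration: a random ReLU layer can render two close-but-disjoint
# classes linearly separable, even when the raw inputs are not.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Two classes X^- and X^+ with (typically) positive distance: moons with
# small noise. Not linearly separable in the 2-d input space.
X, y = make_moons(n_samples=400, noise=0.05, random_state=0)

def random_relu_layer(X, width, rng):
    """One untrained layer: z = ReLU(W x + b), Gaussian weights and biases."""
    d = X.shape[1]
    W = rng.standard_normal((d, width)) / np.sqrt(d)
    b = rng.standard_normal(width)
    return np.maximum(X @ W + b, 0.0)

Z = random_relu_layer(X, width=500, rng=rng)

# A nearly hard-margin linear SVM; training accuracy 1.0 on the random
# features indicates the lifted classes are linearly separable.
clf = LinearSVC(C=1e6, max_iter=20000).fit(Z, y)
print("train accuracy on random features:", clf.score(Z, y))
print("train accuracy on raw inputs:     ",
      LinearSVC(C=1e6, max_iter=20000).fit(X, y).score(X, y))
```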

Compressive Classification (Machine Learning without learning)

Abstract: Compressive learning is a framework in which learning tasks (so far, unsupervised ones) are performed not on the entire dataset but on a compressed summary, or sketch, of it. We propose a compressive classification method and a novel sketch function for images.
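To make the notion of a sketch concrete, the snippet below computes the averaged random Fourier feature sketch that is standard in the compressive learning literature; it is a generic illustration and does not reproduce the classification method or the image sketch proposed in this talk. The dimensions, the Gaussian frequency draw, and the helper name `sketch` are assumptions for the example.

```python
# Illustration of the sketching idea: the whole dataset is compressed to
# one fixed-size vector of generalized moments, and any subsequent
# learning would use only this summary, never the raw data.
import numpy as np

rng = np.random.default_rng(0)

def sketch(X, Omega):
    """Empirical sketch: average of random Fourier features e^{i w^T x}."""
    return np.exp(1j * X @ Omega).mean(axis=0)   # shape: (m,)

d, m = 10, 64                        # data dimension, sketch size
Omega = rng.standard_normal((d, m))  # frequencies, drawn once and reused

X = rng.standard_normal((100_000, d)) + 2.0   # a large dataset
z = sketch(X, Omega)   # 64 complex moments summarize 10^6 raw values

# Sketches average, so they can be computed in one streaming pass or
# merged across data chunks without revisiting the raw data.
z1, z2 = sketch(X[:50_000], Omega), sketch(X[50_000:], Omega)
assert np.allclose(z, (z1 + z2) / 2)
print("sketch size:", z.size, "vs dataset size:", X.size)
```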