Compressive learning

Unsupervised compressive learning with spintronics

Max. student(s): 1. Advisors: Laurent Jacques and Flavio Abreu Araujo. Teaching Assistants: Anatole Moureau and Rémi Delogne.

Over the last few years, machine learning, the discipline of automatically fitting mathematical models or rules from data, has revolutionized science, engineering, and our society.

Compressive learning with privacy guarantees

Abstract: This work addresses the problem of learning from large collections of data with privacy guarantees. The compressive learning framework proposes to deal with the large scale of datasets by compressing them into a single vector of generalized random moments, called a sketch vector, from which the learning task is then performed.
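For concreteness, the sketch is an empirical average of random nonlinear moments of the examples, so it can be computed in a single pass over the data. Below is a minimal Python/NumPy illustration of this operation, assuming random-Fourier-type moments (one common choice in this line of work); the function names, dimensions, and frequency distribution are illustrative, not taken from the paper.

```python
# Minimal sketch computation, assuming random-Fourier-type generalized moments.
import numpy as np

def compute_sketch(X, Omega):
    """Empirical average of complex exponential moments:
    z = (1/n) * sum_i exp(1j * Omega^T x_i).

    X     : (n, d) dataset, one example per row.
    Omega : (d, m) matrix of random frequencies defining the m moments.
    """
    projections = X @ Omega                        # (n, m) random projections
    return np.exp(1j * projections).mean(axis=0)   # (m,) sketch vector

# Toy usage: 100,000 examples in dimension 10 summarized by 256 moments.
rng = np.random.default_rng(0)
X = rng.normal(size=(100_000, 10))
Omega = rng.normal(size=(10, 256))                 # illustrative frequency draw
z = compute_sketch(X, Omega)
print(z.shape)                                     # (256,)
```

The sketch size is chosen according to the learning task and model complexity, not the number of examples, which is what makes the framework attractive for large collections of data.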

Compressive Learning of Generative Networks

Abstract: Generative networks implicitly approximate complex densities from their samples with impressive accuracy. However, because of the enormous scale of modern datasets, this training process is often computationally expensive. We cast generative network training into the recent framework of compressive learning: we reduce the computational burden of large-scale datasets by first harshly compressing them, in a single pass, into a single sketch vector.
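To give a feel for how sketching interacts with generator training, the hedged Python/NumPy snippet below evaluates a sketch-matching objective: the generator's samples are sketched with the same random moments as the data, and the squared distance between the two sketches serves as the loss. This is an illustrative stand-in under assumed random-Fourier moments, not the paper's actual architecture or optimizer.

```python
# Illustrative sketch-matching objective for generator training (assumed
# random-Fourier moments; the real pipeline uses a neural generator and
# gradient-based optimization, which is omitted here).
import numpy as np

def sketch(X, Omega):
    return np.exp(1j * (X @ Omega)).mean(axis=0)

def sketch_matching_loss(generated, data_sketch, Omega):
    """Squared distance between the generated samples' sketch and the data sketch."""
    return np.sum(np.abs(sketch(generated, Omega) - data_sketch) ** 2)

rng = np.random.default_rng(1)
Omega = rng.normal(size=(10, 256))
data_sketch = sketch(rng.normal(size=(100_000, 10)), Omega)  # one pass over the data
fake = rng.normal(loc=0.5, size=(1_000, 10))                 # stand-in for generator output
print(sketch_matching_loss(fake, data_sketch, Omega))
```

Once the data sketch is computed, the raw dataset is no longer needed during training, which is where the computational savings come from.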

Compressive k-Means with Differential Privacy

Abstract: In the compressive learning framework, one harshly compresses a whole training dataset into a single vector of generalized random moments, the sketch, from which a learning task can subsequently be performed.
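Privacy enters through the sketch itself: each example contributes a bounded term to the averaged moments, so adding suitably calibrated noise to the sketch can make it differentially private. The snippet below is a hedged illustration of this idea with Gaussian noise; the noise scale is a placeholder, not the calibration analyzed in the paper.

```python
# Hedged illustration: noisy (privatized) sketch. The per-example contribution
# to the average is bounded, so its sensitivity scales like 1/n; the constant
# noise_std below is a placeholder, not a calibrated privacy parameter.
import numpy as np

def private_sketch(X, Omega, noise_std, rng):
    n = X.shape[0]
    z = np.exp(1j * (X @ Omega)).mean(axis=0)
    noise = (rng.normal(scale=noise_std / n, size=z.shape)
             + 1j * rng.normal(scale=noise_std / n, size=z.shape))
    return z + noise

rng = np.random.default_rng(2)
Omega = rng.normal(size=(5, 128))
X = rng.normal(size=(50_000, 5))
z_private = private_sketch(X, Omega, noise_std=4.0, rng=rng)
```

Because the noise is added once to a fixed-size vector, its relative impact shrinks as the dataset grows, which is the appeal of privatizing the sketch rather than the downstream learning algorithm.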

Differentially Private Compressive K-means

Abstract: This work addresses the problem of learning from large collections of data with privacy guarantees. The sketched learning framework proposes to deal with the large scale of datasets by compressing them into a single vector of generalized random moments, from which the learning task is then performed.
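The learning step then amounts to decoding the sketch: for k-means, one searches for K weighted centroids whose model sketch best matches the data sketch. The snippet below is a rough Python/SciPy illustration of this decoding with a generic numerical optimizer; sketched-learning pipelines typically rely on dedicated greedy decoders, which are not reproduced here.

```python
# Hedged illustration of sketch decoding for k-means: fit K weighted Diracs
# whose sketch matches the data sketch (a generic optimizer stands in for the
# greedy decoders used in practice).
import numpy as np
from scipy.optimize import minimize

def sketch(X, Omega):
    return np.exp(1j * (X @ Omega)).mean(axis=0)

def fit_centroids(z, Omega, K, d, rng):
    def model_sketch(params):
        C = params[:K * d].reshape(K, d)      # candidate centroids
        w = np.abs(params[K * d:])            # nonnegative mixture weights
        w = w / w.sum()
        return w @ np.exp(1j * (C @ Omega))   # sketch of the weighted Diracs

    def loss(params):
        return np.sum(np.abs(model_sketch(params) - z) ** 2)

    x0 = np.concatenate([rng.normal(size=K * d), np.ones(K)])
    return minimize(loss, x0, method="L-BFGS-B").x[:K * d].reshape(K, d)

rng = np.random.default_rng(3)
d, K = 2, 3
Omega = 0.5 * rng.normal(size=(d, 64))        # frequency scale roughly matched to the data spread
centers = np.array([[0.0, 0.0], [4.0, 4.0], [-4.0, 4.0]])
X = centers[rng.integers(0, K, size=20_000)] + 0.3 * rng.normal(size=(20_000, d))
estimated = fit_centroids(sketch(X, Omega), Omega, K, d, rng)  # quality depends on initialization
```

Only the fixed-size sketch, never the raw examples, enters the decoding step, which is what allows privacy noise to be added once at the sketching stage.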

Compressive Classification (Machine Learning without learning)

Abstract: Compressive learning is a framework where (so far unsupervised) learning tasks use not the entire dataset but a compressed summary (sketch) of it. We propose a compressive learning classification method, and a novel sketch function for images.
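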

Quantized Compressive K-Means

Abstract: The recent framework of compressive statistical learning proposes to design tractable learning algorithms that use only a heavily compressed representation, or sketch, of massive datasets. Compressive K-Means (CKM) is such a method: it aims at estimating the centroids of data clusters from pooled, nonlinear, and random signatures of the learning examples.
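The per-example signatures can additionally be quantized before pooling, which is the setting studied here. The snippet below illustrates the idea with a plain one-bit (sign) quantizer applied to cosine and sine features; the quantization scheme analyzed in the paper may differ (e.g., by using dithering), so this is only a conceptual sketch.

```python
# Hedged illustration: one-bit quantization of each example's nonlinear
# signature before pooling (the paper's quantizer may differ, e.g. with dither).
import numpy as np

def quantized_sketch(X, Omega):
    P = X @ Omega                               # (n, m) random projections
    q_cos = np.sign(np.cos(P)).mean(axis=0)     # 1-bit real parts, then pooled
    q_sin = np.sign(np.sin(P)).mean(axis=0)     # 1-bit imaginary parts, then pooled
    return q_cos + 1j * q_sin

rng = np.random.default_rng(4)
Omega = rng.normal(size=(10, 256))
X = rng.normal(size=(100_000, 10))
zq = quantized_sketch(X, Omega)
print(zq.shape)                                 # (256,)
```

Quantizing each contribution to a few bits further reduces the cost of computing, transmitting, and storing the signatures, at the price of a coarser sketch.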