Kernel Random Matrices of Large Concentrated Data: the Example of GAN-Generated Images

Bibliographic Details
Published in: ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 7480-7484
Main Authors: Seddik, Mohamed El Amine; Tamaazousti, Mohamed; Couillet, Romain
Format: Conference Proceeding
Language: English
Published: IEEE, 01-05-2019
Subjects:
Description
Summary: Based on recent random matrix advances in the analysis of kernel methods for classification and clustering, this paper proposes the study of large kernel methods for a wide class of random inputs, namely concentrated data, which are more generic than Gaussian mixtures. The concentration assumption is motivated by the fact that generative models can produce complex data structures through Lipschitz transformations of concentrated vectors (e.g., Gaussian vectors), which remain concentrated. Applied to spectral clustering, we demonstrate that our theoretical findings closely match the behavior of large kernel matrices when the input data are CNN representations of GAN-generated images (i.e., concentrated vectors by design).
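
The following is a minimal illustrative sketch, not the authors' code, of the setting described in the summary: Gaussian mixture vectors are pushed through a fixed random ReLU layer (standing in here for the GAN/CNN pipeline, as an example of a Lipschitz map), a Gaussian kernel matrix is formed, and clusters are read off its second dominant eigenvector. The dimensions, kernel bandwidth, and choice of Lipschitz map are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Two-class Gaussian mixture in dimension p (concentrated vectors by definition).
p, n_per_class = 128, 200
means = [np.zeros(p), 2.0 / np.sqrt(p) * np.ones(p)]
X = np.vstack([rng.normal(mu, 1.0, size=(n_per_class, p)) for mu in means])
labels = np.repeat([0, 1], n_per_class)

# Lipschitz transformation: a fixed random ReLU layer (illustrative stand-in for a
# generative/CNN pipeline). Lipschitz maps of concentrated vectors stay concentrated.
W = rng.normal(0.0, 1.0 / np.sqrt(p), size=(p, p))
Z = np.maximum(X @ W, 0.0)

# Gaussian (RBF) kernel matrix K_ij = exp(-||z_i - z_j||^2 / (2 p)).
sq_norms = np.sum(Z ** 2, axis=1)
sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * Z @ Z.T
K = np.exp(-sq_dists / (2.0 * p))

# Spectral clustering read-out: sign of the eigenvector of K associated with the
# second-largest eigenvalue (the dominant one is roughly non-informative here).
eigvals, eigvecs = np.linalg.eigh(K)
v = eigvecs[:, -2]
pred = (v > 0).astype(int)

# Accuracy up to label permutation.
accuracy = max(np.mean(pred == labels), np.mean(pred != labels))
print(f"two-class spectral clustering accuracy: {accuracy:.2f}")

The sign of the second dominant eigenvector is the standard two-class spectral clustering read-out; the paper's claim is that the spectral behavior of such kernel matrices, built from Lipschitz-transformed (hence concentrated) data such as CNN representations of GAN-generated images, matches the random matrix predictions derived under the concentration assumption.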
ISSN:2379-190X
DOI:10.1109/ICASSP.2019.8683333