Random Projection below the JL Limit


The Johnson-Lindenstrauss (JL) lemma gives, with known probability, a lower bound q0 on the target dimension for which a random projection of p-dimensional vector data is guaranteed to be within (1±ε) of an isometry in the projected downspace. We study several ways to identify a “good” rogue random projection (RRP) when the target downspace has dimension below the JL limit. The tools used toward this end are the Pearson and Spearman correlation coefficients, and a visual imaging method (a cluster heat map) that usually reveals cluster structure in spaces of any dimension. We use four synthetic data sets and the ubiquitous Iris data to study our procedures for tracking the reliability of RRPs. Unsurprisingly, rogue random projection is quite unpredictable. At its best, it is every bit as good as Principal Components Analysis, but at its worst, it is awful. Pearson and Spearman correlations do signal good and bad projections, but the visual imaging method seems even more effective in assessing the quality of RRPs.
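As an illustrative sketch only (not the authors' code), the snippet below computes the standard JL lower bound q0 for a given n and ε, projects synthetic data to a dimension well below that bound with a Gaussian random matrix, and measures the Pearson and Spearman correlations between upspace and downspace pairwise distances; the data sizes, the choice of Gaussian projection, and the particular bound formula are assumptions for illustration.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import pearsonr, spearmanr

def jl_min_dim(n, eps):
    # Standard JL lower bound: q0 >= 4 ln(n) / (eps^2/2 - eps^3/3).
    return int(np.ceil(4.0 * np.log(n) / (eps**2 / 2.0 - eps**3 / 3.0)))

rng = np.random.default_rng(0)
n, p = 150, 50                       # sample size and upspace dimension (assumed)
X = rng.normal(size=(n, p))          # synthetic p-dimensional data

eps = 0.2
q0 = jl_min_dim(n, eps)              # JL limit for this n and eps

q = 5                                # downspace dimension, well below q0
P = rng.normal(size=(p, q)) / np.sqrt(q)   # scaled Gaussian projection matrix
Y = X @ P                            # random projection into the downspace

d_up = pdist(X)                      # pairwise distances before projection
d_down = pdist(Y)                    # pairwise distances after projection
r_p, _ = pearsonr(d_up, d_down)      # linear agreement of distances
r_s, _ = spearmanr(d_up, d_down)     # rank-order agreement of distances
```

Note that for moderate n and ε, q0 typically exceeds the original dimension p, so in practice useful projections often must operate below the JL limit; the two correlations then serve as cheap numerical indicators of how badly a given RRP distorts inter-point distances.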




J. Bezdek, X. Ye, M. Popescu, J. Keller and A. Zare, "Random projection below the JL limit," 2016 International Joint Conference on Neural Networks (IJCNN), Vancouver, BC, Canada, 2016, pp. 2414-2423.