
Dimensionality Reduction

72 papers with code · Methodology

Dimensionality reduction is the task of mapping high-dimensional data into a lower-dimensional representation while preserving as much of the structure or task-relevant information as possible.
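The canonical linear instance of the task is principal component analysis (PCA). As a minimal sketch, assuming toy random data and nothing beyond NumPy, the reduction can be done with an SVD of the centered data matrix:

```python
import numpy as np

# Toy data: 100 samples in 5 dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))

# Center the data, then project onto the top-2 right singular vectors,
# which are the directions of greatest variance.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X_2d = Xc @ Vt[:2].T  # reduced representation, shape (100, 2)
```

The methods listed below differ mainly in how they generalize this idea to nonlinear structure.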

State-of-the-art leaderboards

No evaluation results yet. Help compare methods by submitting evaluation metrics.

Greatest papers with code

UMAP: Uniform Manifold Approximation and Projection for Dimension Reduction

9 Feb 2018 · lmcinnes/umap

UMAP (Uniform Manifold Approximation and Projection) is a novel manifold learning technique for dimension reduction. UMAP is constructed from a theoretical framework based in Riemannian geometry and algebraic topology.
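One concrete step of UMAP's construction is converting each point's neighbor distances into fuzzy membership strengths, with a per-point bandwidth chosen by binary search so the memberships sum to log2(k). The sketch below illustrates only this step on hypothetical toy distances; it is not the full algorithm:

```python
import numpy as np

# One point's distances to its k = 5 nearest neighbors (toy values).
dists = np.array([0.3, 0.5, 0.8, 1.1, 1.6])
rho = dists.min()             # distance to the nearest neighbor
target = np.log2(len(dists))  # desired total membership, log2(k)

# Binary-search the bandwidth sigma so memberships sum to the target.
lo, hi = 1e-6, 10.0
for _ in range(64):
    sigma = (lo + hi) / 2
    total = np.exp(-(dists - rho) / sigma).sum()
    if total > target:
        hi = sigma
    else:
        lo = sigma

# Fuzzy membership of each neighbor; the nearest neighbor always gets 1.
memberships = np.exp(-(dists - rho) / sigma)
```

These local fuzzy sets are then combined into a global graph whose layout is optimized to produce the embedding.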

DIMENSIONALITY REDUCTION

Efficient Algorithms for t-distributed Stochastic Neighborhood Embedding

25 Dec 2017 · pavlin-policar/openTSNE

t-distributed Stochastic Neighborhood Embedding (t-SNE) is a method for dimensionality reduction and visualization that has become widely popular in recent years. Efficient implementations of t-SNE are available, but they scale poorly to datasets with hundreds of thousands to millions of high-dimensional data points.
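The "t-distributed" part of t-SNE refers to the heavy-tailed Student-t kernel used to measure similarity in the low-dimensional embedding. As a hedged sketch on a hypothetical toy embedding, the normalized low-dimensional affinities can be computed as:

```python
import numpy as np

# Toy 2-D embedding of 4 points; the last point is far from the rest.
Y = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])

# Pairwise squared distances in the embedding.
d2 = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)

# Student-t kernel with one degree of freedom; exclude self-affinities.
num = 1.0 / (1.0 + d2)
np.fill_diagonal(num, 0.0)
Q = num / num.sum()  # normalized low-dimensional affinities
```

t-SNE then adjusts the embedding to make these affinities match the high-dimensional neighbor probabilities; the efficient implementations this paper discusses accelerate exactly this repeated pairwise computation.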

DIMENSIONALITY REDUCTION

SOM-VAE: Interpretable Discrete Representation Learning on Time Series

ICLR 2019 · JustGlowing/minisom

High-dimensional time series are difficult to interpret; to address this, we propose a new representation learning framework building on ideas from interpretable discrete dimensionality reduction and deep generative modeling. We evaluate our model in terms of clustering performance and interpretability on static (Fashion-)MNIST data, a time series of linearly interpolated (Fashion-)MNIST images, a chaotic Lorenz attractor system with two macro states, as well as on a challenging real-world medical time series application on the eICU data set.

DIMENSIONALITY REDUCTION REPRESENTATION LEARNING TIME SERIES

t-SNE-CUDA: GPU-Accelerated t-SNE and its Applications to Modern Data

31 Jul 2018 · CannyLab/tsne-cuda

Modern datasets and models are notoriously difficult to explore and analyze due to their inherent high dimensionality and massive numbers of samples. t-SNE-CUDA significantly outperforms current implementations with 50-700x speedups on the CIFAR-10 and MNIST datasets.

DIMENSIONALITY REDUCTION

Demixed principal component analysis of population activity in higher cortical areas reveals independent representation of task parameters

22 Oct 2014 · wielandbrendel/dPCA

Neurons in higher cortical areas, such as the prefrontal cortex, are known to be tuned to a variety of sensory and motor variables. The population activity is decomposed into a few demixed components that capture most of the variance in the data and that highlight dynamic tuning of the population to various task parameters, such as stimuli, decisions, rewards, etc.
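The "demixing" idea can be illustrated by marginalizing a neurons × stimuli × time activity tensor over all task parameters except one, then applying ordinary PCA to each marginalization. This is only a rough conceptual sketch on hypothetical random data, not the dPCA algorithm itself (which solves a regularized reduced-rank problem):

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy population tensor: 20 neurons x 3 stimuli x 50 time points.
X = rng.normal(size=(20, 3, 50))
X = X - X.mean(axis=(1, 2), keepdims=True)  # center each neuron

# Marginalize: average over time to isolate stimulus tuning,
# average over stimuli to isolate temporal dynamics.
X_stim = X.mean(axis=2)  # (neurons, stimuli)
X_time = X.mean(axis=1)  # (neurons, time)

def top_axis(M):
    """Leading neuron-space axis of a marginalized activity matrix."""
    U, S, Vt = np.linalg.svd(M, full_matrices=False)
    return U[:, 0]

stim_axis = top_axis(X_stim)  # axis capturing stimulus-related variance
time_axis = top_axis(X_time)  # axis capturing time-related variance
```

Each resulting axis is a direction in neuron space whose projection reflects variance attributable to a single task parameter.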

DECISION MAKING DIMENSIONALITY REDUCTION

Deep Continuous Clustering

ICLR 2018 · shahsohil/DCC

We present a clustering algorithm that performs nonlinear dimensionality reduction and clustering jointly. The autoencoder is optimized as part of the clustering process.

DIMENSIONALITY REDUCTION

From Principal Subspaces to Principal Components with Linear Autoencoders

26 Apr 2018 · danielkunin/Regularized-Linear-Autoencoders

The autoencoder is an effective unsupervised learning model which is widely used in deep learning. In this paper, we show how to recover the loading vectors from the autoencoder weights.
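A well-known fact behind this result is that an unregularized linear autoencoder learns only the principal *subspace*: its decoder equals the loading vectors multiplied by an arbitrary invertible mixing matrix. The sketch below simulates a converged decoder under that assumption (rather than training one) and shows that an SVD of the decoder weights recovers the subspace:

```python
import numpy as np

rng = np.random.default_rng(2)
# Ground-truth top-2 principal directions of some 6-D data (orthonormal).
V, _ = np.linalg.qr(rng.normal(size=(6, 2)))

# A converged linear autoencoder only pins down the principal SUBSPACE:
# its decoder is V times an arbitrary invertible mixing matrix R.
R = rng.normal(size=(2, 2))
decoder = V @ R

# SVD of the decoder weights yields an orthonormal basis for that subspace.
U, S, Wt = np.linalg.svd(decoder, full_matrices=False)

# The recovered basis spans the same subspace: the projectors agree.
P_true = V @ V.T
P_rec = U @ U.T
```

Recovering the individually ordered loading vectors, as the paper does, additionally requires information about the data's singular values; this sketch only demonstrates the subspace identity.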

DIMENSIONALITY REDUCTION

Self-Taught Convolutional Neural Networks for Short Text Clustering

1 Jan 2017 · jacoxu/STC2

Short text clustering is a challenging problem due to the sparseness of its text representations. Here we propose a flexible Self-Taught Convolutional neural network framework for Short Text Clustering (dubbed STC^2), which can incorporate more useful semantic features and learn unbiased deep text representations in an unsupervised manner.

DIMENSIONALITY REDUCTION TEXT CLUSTERING WORD EMBEDDINGS

ZOO: Zeroth Order Optimization based Black-box Attacks to Deep Neural Networks without Training Substitute Models

14 Aug 2017 · huanzhang12/ZOO-Attack

Deep neural networks (DNNs) are one of the most prominent technologies of our time, as they achieve state-of-the-art performance in many machine learning tasks, including but not limited to image classification, text mining, and speech processing. However, unlike approaches that leverage attack transferability from substitute models, we propose zeroth-order optimization (ZOO) based attacks that directly estimate the gradients of the targeted DNN to generate adversarial examples.

ADVERSARIAL ATTACK ADVERSARIAL DEFENSE AUTONOMOUS DRIVING DIMENSIONALITY REDUCTION IMAGE CLASSIFICATION

Bayesian latent structure discovery from multi-neuron recordings

NeurIPS 2016 · slinderman/pypolyagamma

Neural circuits contain heterogeneous groups of neurons that differ in type, location, connectivity, and basic response properties. However, traditional methods for dimensionality reduction and clustering are ill-suited to recovering the structure underlying the organization of neural circuits.

BAYESIAN INFERENCE DIMENSIONALITY REDUCTION