Dimensionality Reduction

722 papers with code • 0 benchmarks • 10 datasets

Dimensionality reduction is the task of mapping high-dimensional data to a lower-dimensional representation while preserving as much of its meaningful structure as possible.
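A minimal sketch of the task using PCA, one classic linear approach (t-SNE, UMAP, and autoencoders are common nonlinear alternatives); the projection is computed here from the SVD of the centered data:

```python
import numpy as np

def pca_reduce(X, k):
    """Project n samples with d features down to k principal components."""
    X_centered = X - X.mean(axis=0)                # center each feature
    # Rows of Vt are the principal directions of the centered data.
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ Vt[:k].T                   # shape (n, k)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))                     # 100 samples, 10 features
Z = pca_reduce(X, 2)
print(Z.shape)                                     # (100, 2)
```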

(Image credit: openTSNE)


Latest papers with no code

Explainable Light-Weight Deep Learning Pipeline for Improved Drought Stress

no code yet • 15 Apr 2024

The novelty lies in the synergistic combination of a pretrained network with carefully designed custom layers.

Formation-Controlled Dimensionality Reduction

no code yet • 10 Apr 2024

Dimensionality reduction represents the process of generating a low dimensional representation of high dimensional data.

Dimensionality Reduction in Sentence Transformer Vector Databases with Fast Fourier Transform

no code yet • 9 Apr 2024

This paper advocates for the broader adoption of FFT in vector database management, marking a significant stride towards addressing the challenges of data volume and complexity in AI research and applications.
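The paper's exact pipeline is not given here; the following is only an illustrative sketch of the general idea of FFT-based compression, keeping the low-frequency coefficients of an embedding vector and reconstructing an approximation from them:

```python
import numpy as np

def fft_compress(v, k):
    """Keep the first k complex coefficients of the real FFT of v."""
    return np.fft.rfft(v)[:k]

def fft_decompress(coeffs, n):
    """Approximately reconstruct a length-n vector from the kept coefficients."""
    full = np.zeros(n // 2 + 1, dtype=complex)     # rfft length for real input
    full[:len(coeffs)] = coeffs
    return np.fft.irfft(full, n=n)

rng = np.random.default_rng(1)
v = rng.normal(size=384)           # e.g. a sentence-transformer embedding
c = fft_compress(v, 64)            # 384 floats -> 64 complex coefficients
v_hat = fft_decompress(c, 384)     # lossy reconstruction, length 384
print(c.shape, v_hat.shape)
```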

Tangling-Untangling Cycle for Efficient Learning

no code yet • 8 Apr 2024

A new insight brought by this work is to introduce class labels as context variables in the lifted higher-dimensional space, so that supervised learning reduces to unsupervised learning.
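A hypothetical sketch of the lifting idea (not the paper's construction): appending a one-hot class label as extra context dimensions lifts the data into a higher-dimensional space in which points from different classes are trivially separated.

```python
import numpy as np

def lift(X, y, num_classes):
    """Concatenate features with one-hot labels: (n, d) -> (n, d + C)."""
    context = np.eye(num_classes)[y]   # one-hot context variables
    return np.hstack([X, context])

X = np.array([[0.1, 0.2], [0.3, 0.4]])
y = np.array([0, 1])
X_lifted = lift(X, y, num_classes=2)
print(X_lifted.shape)                  # (2, 4)
```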

CAVIAR: Categorical-Variable Embeddings for Accurate and Robust Inference

no code yet • 7 Apr 2024

Social science research often hinges on the relationship between categorical variables and outcomes.

Low-Rank Robust Subspace Tensor Clustering for Metro Passenger Flow Modeling

no code yet • 5 Apr 2024

Moreover, a case study on station clustering based on real passenger flow data is conducted, yielding valuable insights.

Human Activity Recognition using Smartphones

no code yet • 3 Apr 2024

In our project, we created an Android application that recognizes daily human activities and calculates the calories burnt in real time.

Non-negative Subspace Feature Representation for Few-shot Learning in Medical Imaging

no code yet • 3 Apr 2024

Extensive empirical studies validate the effectiveness of NMF, especially its supervised variants (e.g., discriminative NMF, and supervised and constrained NMF with sparseness), and compare it with principal component analysis (PCA), i.e., the collaborative-representation-based dimensionality reduction technique derived from eigenvectors.
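A minimal sketch of plain (unsupervised) NMF via the standard multiplicative updates for the Frobenius objective, factoring non-negative data as X ≈ W @ H; the supervised and constrained variants mentioned above are not shown:

```python
import numpy as np

def nmf(X, k, iters=200, seed=0):
    """Multiplicative-update NMF: X ≈ W @ H with W, H >= 0."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.random((n, k))
    H = rng.random((k, d))
    eps = 1e-10                       # avoid division by zero
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

rng = np.random.default_rng(1)
X = rng.random((50, 30))              # non-negative data, as NMF requires
W, H = nmf(X, 5)
print(W.shape, H.shape)               # (50, 5) (5, 30)
```

Unlike PCA, the factors stay non-negative, which tends to give parts-based, more interpretable components.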

Preventing Model Collapse in Gaussian Process Latent Variable Models

no code yet • 2 Apr 2024

Gaussian process latent variable models (GPLVMs) are a versatile family of unsupervised learning models, commonly used for dimensionality reduction.

On the reduction of Linear Parameter-Varying State-Space models

no code yet • 2 Apr 2024

This paper presents an overview and comparative study of the state of the art in State-Order Reduction (SOR) and Scheduling Dimension Reduction (SDR) for Linear Parameter-Varying (LPV) State-Space (SS) models, comparing and benchmarking their capabilities, limitations and performance.