Dimensionality Reduction
721 papers with code • 0 benchmarks • 10 datasets
Dimensionality reduction is the task of mapping a high-dimensional dataset to a lower-dimensional representation while preserving as much of its meaningful structure as possible.
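The most common baseline for this task is principal component analysis (PCA). A minimal sketch via SVD, using synthetic data purely for illustration:

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                      # center each feature
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                         # (n_samples, k) coordinates

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))   # 100 points in 5 dimensions
Z = pca_reduce(X, 2)            # reduced to 2 dimensions
```

Because SVD orders singular values in decreasing order, the first output coordinate always carries at least as much variance as the second.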
(Image credit: openTSNE)
Latest papers
Quiver Laplacians and Feature Selection
The challenge of selecting the most relevant features of a given dataset arises ubiquitously in data analysis and dimensionality reduction.
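Feature selection keeps a subset of the original features rather than constructing new ones. The quiver-Laplacian machinery of this paper is out of scope here; as a generic illustration of the task, a simple variance-based ranking (the helper name is hypothetical):

```python
import numpy as np

def top_variance_features(X, k):
    """Rank features by sample variance and keep the k most variable columns."""
    idx = np.sort(np.argsort(X.var(axis=0))[-k:])  # indices of the k top-variance columns
    return X[:, idx], idx

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8)) * np.arange(1.0, 9.0)  # column j scaled by j + 1
Xk, kept = top_variance_features(X, 3)
```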
scCDCG: Efficient Deep Structural Clustering for single-cell RNA-seq via Deep Cut-informed Graph Embedding
Addressing these limitations, we introduce scCDCG (single-cell RNA-seq Clustering via Deep Cut-informed Graph), a novel framework designed for efficient and accurate clustering of scRNA-seq data that simultaneously utilizes intercellular high-order structural information.
Remote sensing framework for geological mapping via stacked autoencoders and clustering
In this study, we present an unsupervised machine learning framework for processing remote sensing data by utilizing stacked autoencoders for dimensionality reduction and k-means clustering for mapping geological units.
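The autoencoder-then-cluster pipeline described above can be sketched in a few lines. This is a one-layer analogue trained by plain gradient descent on synthetic data, not the paper's stacked architecture or its remote-sensing pipeline:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))

d_in, d_hid, lr = X.shape[1], 3, 0.01
W1 = rng.normal(scale=0.1, size=(d_in, d_hid))   # encoder weights
W2 = rng.normal(scale=0.1, size=(d_hid, d_in))   # decoder weights

for _ in range(500):                 # minimize mean squared reconstruction error
    H = np.tanh(X @ W1)              # encoder: nonlinear low-dimensional codes
    err = H @ W2 - X                 # reconstruction residual
    gW2 = H.T @ err / len(X)
    gpre = (err @ W2.T) * (1 - H**2) # backprop through tanh
    gW1 = X.T @ gpre / len(X)
    W1 -= lr * gW1
    W2 -= lr * gW2

codes = np.tanh(X @ W1)              # learned low-dimensional representation
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(codes)
```

A stacked version would chain several such encoder layers, each trained on the codes of the previous one.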
DMSSN: Distilled Mixed Spectral-Spatial Network for Hyperspectral Salient Object Detection
To address these challenges, we propose a novel approach termed the Distilled Mixed Spectral-Spatial Network (DMSSN), comprising a Distilled Spectral Encoding process and a Mixed Spectral-Spatial Transformer (MSST) feature extraction network.
Enhancing Dimension-Reduced Scatter Plots with Class and Feature Centroids
We illustrate the utility of this approach with data derived from the phenotypes of three neurogenetic diseases and demonstrate how the addition of class and feature centroids increases the interpretability of scatter plots.
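A class centroid in a dimension-reduced scatter plot is simply the mean 2-D position of each class's points, which can then be drawn on top of the embedding. A minimal sketch with a synthetic labelled embedding:

```python
import numpy as np

rng = np.random.default_rng(0)
# Pretend `emb` is a 2-D embedding (e.g. from PCA or t-SNE) of 90 labelled points.
emb = rng.normal(size=(90, 2)) + np.repeat([[0, 0], [4, 0], [0, 4]], 30, axis=0)
labels = np.repeat([0, 1, 2], 30)

# One centroid per class: the mean 2-D position of that class's points.
centroids = np.array([emb[labels == c].mean(axis=0)
                      for c in np.unique(labels)])
```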
Efficient Algorithms for Regularized Nonnegative Scale-invariant Low-rank Approximation Models
However, from a practical perspective, the choice of regularizers and regularization coefficients, as well as the design of efficient algorithms, is challenging because of the multifactor nature of these models and the lack of theory to back these choices.
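For context, the basic unregularized model in this family is nonnegative matrix factorization, X ≈ WH with W, H ≥ 0. A plain sklearn sketch (the paper's regularizers and scale-invariant formulations are not shown):

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
X = np.abs(rng.normal(size=(40, 12)))        # nonnegative data matrix

# Factor X ≈ W @ H with both factors nonnegative and a small rank.
model = NMF(n_components=3, init="random", random_state=0, max_iter=500)
W = model.fit_transform(X)
H = model.components_
```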
Targeted Visualization of the Backbone of Encoder LLMs
Attention-based Large Language Models (LLMs) are the state of the art in natural language processing (NLP).
S+t-SNE - Bringing dimensionality reduction to data streams
We present S+t-SNE, an adaptation of the t-SNE algorithm designed to handle infinite data streams.
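S+t-SNE itself handles streams; the standard batch t-SNE it adapts looks like this in sklearn (synthetic two-blob data for illustration, not the streaming variant):

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
# Two well-separated Gaussian blobs in 10 dimensions.
X = np.vstack([rng.normal(loc=0.0, size=(30, 10)),
               rng.normal(loc=6.0, size=(30, 10))])

# Embed all 60 points at once into 2-D; batch t-SNE cannot update this
# embedding incrementally as new points arrive, which is the gap S+t-SNE targets.
emb = TSNE(n_components=2, perplexity=15, random_state=0).fit_transform(X)
```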
Assessing the similarity of real matrices with arbitrary shape
We conclude that SAS is a suitable measure for quantifying the shared structure of matrices with arbitrary shape.
Once for Both: Single Stage of Importance and Sparsity Search for Vision Transformer Compression
Recent Vision Transformer Compression (VTC) works mainly follow a two-stage scheme, where the importance score of each model unit is first evaluated or preset in each submodule, followed by the sparsity score evaluation according to the target sparsity constraint.