Search Results for author: Chieh Wu

Found 9 papers, 0 papers with code

Deep Layer-wise Networks Have Closed-Form Weights

no code implementations · 1 Feb 2022 · Chieh Wu, Aria Masoomi, Arthur Gretton, Jennifer Dy

There is currently a debate within the neuroscience community over the likelihood of the brain performing backpropagation (BP).

Instance-wise Feature Grouping

no code implementations · NeurIPS 2020 · Aria Masoomi, Chieh Wu, Tingting Zhao, Zifeng Wang, Peter Castaldi, Jennifer Dy

Moreover, the features that belong to each group, as well as the important feature groups themselves, may vary per sample.

General Classification

Kernel Dependence Network

no code implementations · 4 Nov 2020 · Chieh Wu, Aria Masoomi, Arthur Gretton, Jennifer Dy

We propose a greedy strategy to spectrally train a deep network for multi-class classification.

Multi-class Classification

Deep Layer-wise Networks Have Closed-Form Weights

no code implementations · 15 Jun 2020 · Chieh Wu, Aria Masoomi, Arthur Gretton, Jennifer Dy

There is currently a debate within the neuroscience community over the likelihood of the brain performing backpropagation (BP).

Multi-class Classification

Solving Interpretable Kernel Dimensionality Reduction

no code implementations · NeurIPS 2019 · Chieh Wu, Jared Miller, Yale Chang, Mario Sznaier, Jennifer Dy

While KDR methods can be solved easily by keeping the most dominant eigenvectors of the kernel matrix, the resulting features are no longer easy to interpret.
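The eigenvector route the snippet refers to can be sketched in a few lines. The following is a standard kernel-PCA-style projection (Gaussian kernel, centering, top-k eigenvectors of the kernel matrix), not the paper's interpretable variant; all function names here are illustrative.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma**2))

def kernel_projection(X, k=2, sigma=1.0):
    """Project data onto the k dominant eigenvectors of the centered
    kernel matrix (the easy-to-solve but hard-to-interpret route)."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    Kc = H @ rbf_kernel(X, sigma) @ H
    vals, vecs = np.linalg.eigh(Kc)          # eigh returns ascending order
    order = np.argsort(vals)[::-1][:k]       # keep the k largest eigenvalues
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = kernel_projection(X, k=2)
print(Z.shape)
```

The projection is expressed in eigenvector coordinates of the kernel matrix rather than the original features, which is exactly why interpretability is lost.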

Clustering · Dimensionality Reduction

Iterative Spectral Method for Alternative Clustering

no code implementations · 8 Sep 2019 · Chieh Wu, Stratis Ioannidis, Mario Sznaier, Xiangyu Li, David Kaeli, Jennifer G. Dy

Given a dataset and an existing clustering as input, alternative clustering aims to find an alternative partition.

Clustering

Spectral Non-Convex Optimization for Dimension Reduction with Hilbert-Schmidt Independence Criterion

no code implementations · 6 Sep 2019 · Chieh Wu, Jared Miller, Yale Chang, Mario Sznaier, Jennifer Dy

The Hilbert-Schmidt Independence Criterion (HSIC) is a kernel dependence measure that has applications in various aspects of machine learning.
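For reference, the standard biased empirical HSIC estimator, tr(K H L H) / (n - 1)^2 with Gaussian kernels, can be sketched as follows. This is only the plain dependence estimator, not the paper's non-convex optimization; function names are illustrative.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma**2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC: tr(K H L H) / (n - 1)^2."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    K, L = rbf_kernel(X, sigma), rbf_kernel(Y, sigma)
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
Y = X + 0.1 * rng.normal(size=(200, 2))   # strongly dependent on X
Z = rng.normal(size=(200, 2))             # independent of X
print(hsic(X, Y), hsic(X, Z))             # dependent pair scores higher
```

HSIC is (near) zero for independent variables and grows with dependence, which is what makes it usable as an objective for dimension reduction.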

Clustering · Dimensionality Reduction

Solving Interpretable Kernel Dimension Reduction

no code implementations · 6 Sep 2019 · Chieh Wu, Jared Miller, Yale Chang, Mario Sznaier, Jennifer Dy

While KDR methods can be solved easily by keeping the most dominant eigenvectors of the kernel matrix, the resulting features are no longer easy to interpret.

Clustering · Dimensionality Reduction

Deep Kernel Learning for Clustering

no code implementations · 9 Aug 2019 · Chieh Wu, Zulqarnain Khan, Yale Chang, Stratis Ioannidis, Jennifer Dy

We propose a deep learning approach for discovering kernels tailored to identifying clusters over sample data.

Clustering · Deep Clustering
