Search Results for author: Michael Zhu

Found 12 papers, 4 papers with code

To prune, or not to prune: exploring the efficacy of pruning for model compression

4 code implementations ICLR 2018 Michael Zhu, Suyog Gupta

Model pruning seeks to induce sparsity in a deep neural network's various connection matrices, thereby reducing the number of nonzero-valued parameters in the model.

Model Compression
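The paper studies gradual magnitude pruning on a sparsity schedule; the core thresholding step can be sketched in one-shot form (a simplified illustration, not the paper's exact schedule, and the function name is ours):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries of a weight matrix
    until `sparsity` fraction of its entries are zero."""
    k = int(sparsity * weights.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return np.where(np.abs(weights) <= threshold, 0.0, weights)

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))
W_pruned = magnitude_prune(W, 0.9)  # roughly 90% of entries now zero
```

In the gradual variant, this step runs repeatedly during training with a slowly increasing target sparsity, so the surviving weights can adapt.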

Revisiting Cross-Lingual Summarization: A Corpus-based Study and A New Benchmark with Improved Annotation

1 code implementation 8 Jul 2023 Yulong Chen, Huajian Zhang, Yijie Zhou, Xuefeng Bai, Yueguan Wang, Ming Zhong, Jianhao Yan, Yafu Li, Judy Li, Michael Zhu, Yue Zhang

Additionally, based on the same intuition, we propose a 2-Step method, which takes both the conversation and the summary as input to simulate the human annotation process.

A Practical Algorithm for Topic Modeling with Provable Guarantees

2 code implementations 19 Dec 2012 Sanjeev Arora, Rong Ge, Yoni Halpern, David Mimno, Ankur Moitra, David Sontag, Yichen Wu, Michael Zhu

Topic models provide a useful method for dimensionality reduction and exploratory data analysis in large text corpora.

Dimensionality Reduction • Topic Models
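The combinatorial core of this line of work is selecting "anchor" words whose co-occurrence rows span the rest; a toy, simplified sketch of greedy anchor-row selection (not the paper's exact procedure, and the function name is ours):

```python
import numpy as np

def greedy_anchor_rows(Q, k):
    """Greedily select k rows of the row-normalized co-occurrence
    matrix Q that lie far from the span of rows already chosen --
    a simplified sketch of anchor selection."""
    Q = Q / Q.sum(axis=1, keepdims=True)
    anchors = [int(np.argmax((Q**2).sum(axis=1)))]
    basis = []
    for _ in range(k - 1):
        v = Q[anchors[-1]].copy()
        for b in basis:                 # Gram-Schmidt against chosen anchors
            v -= (v @ b) * b
        basis.append(v / np.linalg.norm(v))
        proj = Q.copy()
        for b in basis:                 # residual after projecting onto the span
            proj -= (proj @ b)[:, None] * b[None, :]
        anchors.append(int(np.argmax((proj**2).sum(axis=1))))
    return anchors

# Toy matrix: three "pure" rows (anchors) plus one mixed row.
Q = np.array([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.], [1., 1., 1.]])
anchors = greedy_anchor_rows(Q, 3)
```

On this toy input the three pure rows are recovered as anchors; the full algorithm then expresses every other word's row as a convex combination of the anchor rows to recover the topic matrix.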

Group Additive Structure Identification for Kernel Nonparametric Regression

no code implementations NeurIPS 2017 Chao Pan, Michael Zhu

The additive model is one of the most widely used models for high-dimensional nonparametric regression analysis.

regression

Sample Adaptive MCMC

1 code implementation NeurIPS 2019 Michael Zhu

For MCMC methods like Metropolis-Hastings, tuning the proposal distribution is important in practice for effective sampling from the target distribution π.
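As background, a minimal random-walk Metropolis-Hastings sampler in which `proposal_scale` is exactly the tuning knob the abstract refers to (names are illustrative):

```python
import numpy as np

def metropolis_hastings(log_pi, x0, n_steps, proposal_scale=1.0, seed=0):
    """Random-walk Metropolis-Hastings for a 1-D target with
    log-density `log_pi`. `proposal_scale` sets the Gaussian proposal
    width and must be tuned for efficient mixing."""
    rng = np.random.default_rng(seed)
    x, samples = x0, []
    for _ in range(n_steps):
        x_new = x + proposal_scale * rng.standard_normal()
        # Accept with probability min(1, pi(x_new) / pi(x)).
        if np.log(rng.uniform()) < log_pi(x_new) - log_pi(x):
            x = x_new
        samples.append(x)
    return np.array(samples)

# Sample from a standard normal target (log-density up to a constant).
samples = metropolis_hastings(lambda x: -0.5 * x**2, 0.0, 20000,
                              proposal_scale=2.4)
```

A scale that is too small yields high acceptance but tiny moves; too large yields mostly rejections. The paper's contribution is to adapt the proposal using the samples themselves.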

Towards Differentiable Resampling

no code implementations 24 Apr 2020 Michael Zhu, Kevin Murphy, Rico Jonschkowski

Resampling is a key component of sample-based recursive state estimation in particle filters.
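For context, the standard systematic resampling step, whose hard indexing is what makes the particle filter non-differentiable; a minimal sketch (function name ours):

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling: draw one uniform offset, stride through
    the CDF of the normalized particle weights, and return ancestor
    indices. The hard `searchsorted` lookup is the non-differentiable
    step that differentiable-resampling work seeks to relax."""
    n = len(weights)
    positions = (rng.uniform() + np.arange(n)) / n
    return np.searchsorted(np.cumsum(weights), positions)

rng = np.random.default_rng(0)
weights = np.array([0.1, 0.2, 0.3, 0.4])  # already normalized
ancestors = systematic_resample(weights, rng)
```

High-weight particles are duplicated and low-weight ones dropped, which resets the weights but blocks gradients from flowing through the ancestry.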

Variance Reduction and Quasi-Newton for Particle-Based Variational Inference

no code implementations ICML 2020 Michael Zhu, Chang Liu, Jun Zhu

Particle-based Variational Inference methods (ParVIs), like Stein Variational Gradient Descent, are nonparametric variational inference methods that optimize a set of particles to best approximate a target distribution.

Bayesian Inference • Riemannian optimization +1
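A minimal 1-D sketch of the SVGD particle update mentioned above (fixed kernel bandwidth for simplicity; the original method's median-heuristic bandwidth is omitted):

```python
import numpy as np

def svgd_step(particles, grad_log_p, bandwidth=1.0, step_size=0.1):
    """One Stein Variational Gradient Descent update for 1-D particles:
    a kernel-weighted average of score gradients (drives particles
    toward high density) plus a repulsive kernel-gradient term
    (keeps them spread out)."""
    n = len(particles)
    diffs = particles[:, None] - particles[None, :]   # diffs[j, i] = x_j - x_i
    k = np.exp(-diffs**2 / (2 * bandwidth**2))        # RBF kernel k(x_j, x_i)
    grad_k = -diffs / bandwidth**2 * k                # d k(x_j, x_i) / d x_j
    scores = grad_log_p(particles)
    phi = ((k * scores[:, None]).sum(axis=0) + grad_k.sum(axis=0)) / n
    return particles + step_size * phi

# Approximate a standard normal target, whose score is -x.
x = np.random.default_rng(0).uniform(-3, 3, size=50)
for _ in range(1000):
    x = svgd_step(x, lambda p: -p)
```

The paper builds on this deterministic update with variance reduction and quasi-Newton preconditioning; neither is shown here.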

COVID-19 Vaccine and Social Media: Exploring Emotions and Discussions on Twitter

no code implementations 29 Jul 2021 Amir Karami, Michael Zhu, Bailey Goldschmidt, Hannah R. Boyajieff, Mahdi M. Najafabadi

Our findings show that the negative sentiment regarding the COVID-19 vaccine had a decreasing trend between November 2020 and February 2021.

2020 U.S. presidential election in swing states: Gender differences in Twitter conversations

no code implementations 21 Aug 2021 Amir Karami, Spring B. Clark, Anderson Mackenzie, Dorathea Lee, Michael Zhu, Hannah R. Boyajieff, Bailey Goldschmidt

Social media is commonly used by the public during election campaigns to express their opinions regarding different issues.

Recurrent Neural Networks with Mixed Hierarchical Structures and EM Algorithm for Natural Language Processing

no code implementations LREC 2022 Zhaoxin Luo, Michael Zhu

In this paper, we propose a novel approach called the latent indicator layer to identify and learn implicit hierarchical information (e.g., phrases), and further develop an EM algorithm to handle the latent indicator layer in training.

Document Classification
