Search Results for author: Ruixiang Zhang

Found 15 papers, 6 papers with code

Meta-RangeSeg: LiDAR Sequence Semantic Segmentation Using Multiple Feature Aggregation

1 code implementation • 27 Feb 2022 • Song Wang, Jianke Zhu, Ruixiang Zhang

Furthermore, the Feature Aggregation Module (FAM) aggregates the meta features and multi-scale features, which tends to strengthen the role of the range channel.

3D Semantic Segmentation • Autonomous Vehicles • +1
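The snippet above describes the aggregation only at a high level. A minimal sketch of the idea, assuming nearest-neighbour upsampling, channel concatenation, and a sigmoid gate on the range channel (all shapes and the function name are assumptions of this sketch, not the paper's actual FAM):

```python
import numpy as np

def feature_aggregation(meta_feat, multi_scale_feats, range_channel):
    """Hypothetical sketch of a feature aggregation module: upsample
    multi-scale feature maps to a common resolution, concatenate them
    with the meta features, and reweight by the range channel so its
    role is strengthened. Shapes are (C, H, W) per map, (H, W) for the
    range channel -- assumptions for illustration."""
    H, W = meta_feat.shape[1:]
    upsampled = []
    for f in multi_scale_feats:
        # nearest-neighbour upsample each coarser scale to (H, W)
        ry, rx = H // f.shape[1], W // f.shape[2]
        upsampled.append(np.repeat(np.repeat(f, ry, axis=1), rx, axis=2))
    fused = np.concatenate([meta_feat] + upsampled, axis=0)  # stack channels
    # emphasise the range channel via a sigmoid gate broadcast over channels
    gate = 1.0 / (1.0 + np.exp(-range_channel))
    return fused * gate[None, :, :]
```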

Learning Representation from Neural Fisher Kernel with Low-rank Approximation

no code implementations • ICLR 2022 • Ruixiang Zhang, Shuangfei Zhai, Etai Littwin, Josh Susskind

We show that the low-rank approximation of NFKs derived from unsupervised generative models and supervised learning models gives rise to high-quality compact representations of data, achieving competitive results on a variety of machine learning tasks.
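The compact-representation idea above can be sketched with a truncated SVD of a per-example feature matrix; the gradient-feature matrix here is an assumption standing in for the features the paper derives from the Neural Fisher Kernel:

```python
import numpy as np

def low_rank_fisher_features(grad_feats, k):
    """Sketch: given an (n, p) matrix of per-example features whose
    inner products define a kernel (an assumption; the paper derives
    these from the NFK), the rank-k left singular vectors scaled by
    their singular values give k-dimensional compact representations
    that approximately reproduce the kernel matrix."""
    U, S, _ = np.linalg.svd(grad_feats, full_matrices=False)
    return U[:, :k] * S[:k]  # (n, k) compact representation
```

With embeddings `Z` of rank k, `Z @ Z.T` is the best rank-k approximation of the full kernel matrix `grad_feats @ grad_feats.T`, which is the sense in which the representation is "compact" here.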

A Dot Product Attention Free Transformer

no code implementations • 29 Sep 2021 • Shuangfei Zhai, Walter Talbott, Nitish Srivastava, Chen Huang, Hanlin Goh, Ruixiang Zhang, Joshua M. Susskind

We introduce the Dot Product Attention Free Transformer (DAFT), an efficient variant of the Transformer that eliminates the query-key dot product in self-attention.

An Attention Free Transformer

3 code implementations • 28 May 2021 • Shuangfei Zhai, Walter Talbott, Nitish Srivastava, Chen Huang, Hanlin Goh, Ruixiang Zhang, Josh Susskind

We introduce the Attention Free Transformer (AFT), an efficient variant of the Transformer that eliminates the need for dot-product self-attention.
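The simplest variant described in the paper, AFT-simple, replaces the query-key dot product with a key-only softmax over the time axis and an elementwise sigmoid gate on the query; a sketch, assuming inputs of shape (T, d):

```python
import numpy as np

def aft_simple(Q, K, V):
    """Sketch of AFT-simple: no query-key dot product. Keys are
    softmax-normalised over the time axis (independently per feature
    dimension), combine the values into a single global context, and
    sigmoid(Q) gates that context elementwise at every position."""
    weights = np.exp(K - K.max(axis=0))           # stable softmax over time
    weights = weights / weights.sum(axis=0)
    context = (weights * V).sum(axis=0)           # (d,) global context
    return (1.0 / (1.0 + np.exp(-Q))) * context   # (T, d) gated output
```

Note the cost is linear in sequence length T, since no T-by-T attention matrix is ever formed; that is the efficiency claim in the abstract.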

A nonabelian Brunn-Minkowski inequality

no code implementations • 19 Jan 2021 • Yifan Jing, Chieu-Minh Tran, Ruixiang Zhang

Henstock and Macbeath asked in 1953 whether the Brunn-Minkowski inequality can be generalized to nonabelian locally compact groups; questions in the same line were also asked by Hrushovski, McCrudden, and Tao.

Group Theory • Classical Analysis and ODEs • Combinatorics • Functional Analysis • Metric Geometry • MSC: 22D05, 43A05, 49Q20, 60B15, 05D99

Improving unsupervised anomaly localization by applying multi-scale memories to autoencoders

no code implementations • 21 Dec 2020 • Yifei Yang, Shibing Xiang, Ruixiang Zhang

Autoencoders and their variants have been widely applied to anomaly detection. Prior work on memory-augmented deep autoencoders proposed memorizing normality to detect anomalies; however, it neglects the feature discrepancy between different resolution scales. We therefore introduce multi-scale memories that record scale-specific features, together with a multi-scale attention fuser between the encoding and decoding modules of the autoencoder, yielding MMAE. During unsupervised learning, MMAE updates the slots at the corresponding resolution scale as prototype features.

Anomaly Detection
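A single memory read of the kind the abstract extends to one memory per resolution scale can be sketched as attention over learned prototype slots (names and shapes here are assumptions, not the paper's implementation):

```python
import numpy as np

def memory_read(query, memory):
    """Sketch of a memory-augmented read: an encoder feature (query)
    attends over prototype slots and is reconstructed as a convex
    combination of them, so anomalous features get pulled toward the
    recorded normal prototypes and reconstruct poorly. memory is
    (num_slots, d), query is (d,)."""
    sims = memory @ query                 # (num_slots,) similarities
    w = np.exp(sims - sims.max())
    w = w / w.sum()                       # softmax attention over slots
    return w @ memory                     # reconstructed feature, (d,)
```

In the multi-scale version described above, one such memory per resolution scale would store scale-specific prototypes, addressing the feature discrepancy between scales.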

Learning Structured Latent Factors from Dependent Data: A Generative Model Framework from Information-Theoretic Perspective

no code implementations • ICML 2020 • Ruixiang Zhang, Masanori Koyama, Katsuhiko Ishiguro

Learning controllable and generalizable representation of multivariate data with desired structural properties remains a fundamental problem in machine learning.


Your GAN is Secretly an Energy-based Model and You Should use Discriminator Driven Latent Sampling

3 code implementations • NeurIPS 2020 • Tong Che, Ruixiang Zhang, Jascha Sohl-Dickstein, Hugo Larochelle, Liam Paull, Yuan Cao, Yoshua Bengio

To make that practical, we show that sampling from this modified density can be achieved by sampling in latent space according to an energy-based model induced by the sum of the latent prior log-density and the discriminator output score.

Image Generation
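The latent-space sampling described above amounts to running Langevin dynamics on the induced energy E(z) = -log p(z) - D(G(z)); a toy sketch, where the energy gradient is supplied by the caller and the hyperparameters are assumptions of this sketch:

```python
import numpy as np

def ddls_sample(grad_energy, z0, step=0.01, n_steps=500, rng=None):
    """Sketch of discriminator-driven latent sampling via unadjusted
    Langevin dynamics in latent space: repeatedly step downhill on the
    energy (negative prior log-density minus discriminator score) and
    inject Gaussian noise. z0 holds one starting point per row."""
    rng = np.random.default_rng(0) if rng is None else rng
    z = z0.copy()
    for _ in range(n_steps):
        noise = rng.standard_normal(z.shape)
        z = z - 0.5 * step * grad_energy(z) + np.sqrt(step) * noise
    return z
```

For a quadratic energy (e.g. a standard normal prior with a linear discriminator score), the chain's samples concentrate around the energy minimum, which is what makes the modified density practical to sample.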

Deep Verifier Networks: Verification of Deep Discriminative Models with Deep Generative Models

no code implementations • 18 Nov 2019 • Tong Che, Xiaofeng Liu, Site Li, Yubin Ge, Ruixiang Zhang, Caiming Xiong, Yoshua Bengio

We test the verifier network on out-of-distribution detection and adversarial example detection problems, as well as anomaly detection problems in structured prediction tasks such as image caption generation.

Anomaly Detection • Autonomous Driving • +3

Perceptual Generative Autoencoders

2 code implementations • ICML 2020 • Zijun Zhang, Ruixiang Zhang, Zongpeng Li, Yoshua Bengio, Liam Paull

We therefore propose to map both the generated and target distributions to a latent space using the encoder of a standard autoencoder, and train the generator (or decoder) to match the target distribution in the latent space.
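The latent-space matching idea above can be illustrated with a toy loss that compares encoded real and generated samples; the specific moment-matching comparison used here is an assumption for illustration, not the paper's training objective:

```python
import numpy as np

def latent_matching_loss(encoder, generator, x_real, z):
    """Toy sketch of matching distributions in latent space rather
    than data space: push both real data and generated samples
    through the encoder, then compare the two encoded batches. Here
    the comparison is simple mean/covariance matching -- a stand-in
    for a proper distribution-matching objective."""
    h_real = encoder(x_real)                  # encoded target batch
    h_fake = encoder(generator(z))            # encoded generated batch
    mean_gap = np.sum((h_real.mean(0) - h_fake.mean(0)) ** 2)
    cov_gap = np.sum((np.cov(h_real.T) - np.cov(h_fake.T)) ** 2)
    return mean_gap + cov_gap
```

The loss is zero exactly when the two encoded batches share first and second moments, capturing (coarsely) the idea of training the generator/decoder to match the target distribution in the latent space.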

Understanding Hidden Memories of Recurrent Neural Networks

1 code implementation • 30 Oct 2017 • Yao Ming, Shaozu Cao, Ruixiang Zhang, Zhen Li, Yuanzhe Chen, Yangqiu Song, Huamin Qu

We propose a technique to explain the function of individual hidden state units based on their expected response to input texts.

Maximum-Likelihood Augmented Discrete Generative Adversarial Networks

no code implementations • 26 Feb 2017 • Tong Che, Yan-ran Li, Ruixiang Zhang, R. Devon Hjelm, Wenjie Li, Yangqiu Song, Yoshua Bengio

Despite the successes in capturing continuous distributions, the application of generative adversarial networks (GANs) to discrete settings, like natural language tasks, is rather restricted.
