Search Results for author: Guodong Li

Found 8 papers, 3 papers with code

Cross-Layer Retrospective Retrieving via Layer Attention

1 code implementation • 8 Feb 2023 • Yanwen Fang, Yuxi Cai, Jintai Chen, Jingyu Zhao, Guangjian Tian, Guodong Li

Motivated by this, we devise a cross-layer attention mechanism, called multi-head recurrent layer attention (MRLA), that sends a query representation of the current layer to all previous layers to retrieve query-related information from different levels of receptive fields.

Image Classification • Instance Segmentation • +3
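
Below is a minimal, single-head sketch of the layer-attention idea described in the snippet above, assuming PyTorch. The class name and shapes are hypothetical; the paper's actual MRLA is multi-head, operates on convolutional feature maps, and has a lighter recurrent form for efficiency.

```python
import torch
import torch.nn as nn

class LayerAttention(nn.Module):
    """Toy single-head version: the current layer's features form a query
    that attends over the outputs of all previous layers."""
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)  # query from the current layer
        self.k = nn.Linear(dim, dim)  # keys from all layers so far
        self.v = nn.Linear(dim, dim)  # values from all layers so far
        self.scale = dim ** -0.5

    def forward(self, current, previous):
        # current: (batch, dim); previous: list of (batch, dim) tensors
        mem = torch.stack(previous + [current], dim=1)    # (batch, L, dim)
        q = self.q(current).unsqueeze(1)                  # (batch, 1, dim)
        scores = q @ self.k(mem).transpose(1, 2) * self.scale
        weights = scores.softmax(dim=-1)                  # one weight per layer
        return (weights @ self.v(mem)).squeeze(1)         # (batch, dim)
```

A call like `LayerAttention(64)(feats[-1], feats[:-1])` then returns features enriched with query-related information retrieved from earlier layers, which is the retrieval behavior the abstract describes.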

High-Frequency-Based Volatility Model with Network Structure

no code implementations • 14 Apr 2022 • Huiling Yuan, Guodong Li, Junhui Wang

This paper introduces a new multivariate volatility model that accommodates an appropriately defined network structure based on low-frequency and high-frequency data.

Vocal Bursts Intensity Prediction

A New Measure of Model Redundancy for Compressed Convolutional Neural Networks

no code implementations • 9 Dec 2021 • Feiqing Huang, Yuefeng Si, Yao Zheng, Guodong Li

While many designs have recently been proposed to improve the model efficiency of convolutional neural networks (CNNs) under a fixed resource budget, theoretical understanding of these designs is still conspicuously lacking.

Tensor Decomposition

Recurrence along Depth: Deep Convolutional Neural Networks with Recurrent Layer Aggregation

1 code implementation • NeurIPS 2021 • Jingyu Zhao, Yanwen Fang, Guodong Li

This paper introduces a concept of layer aggregation to describe how information from previous layers can be reused to better extract features at the current layer.

Image Classification • Instance Segmentation • +3
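
The sketch below illustrates the aggregation idea from the snippet above on toy fully-connected layers, assuming PyTorch: a small hidden state is carried along depth, RNN-style, and updated after every main layer so later layers can reuse earlier information. Module names and the update rule are hypothetical; the paper's RLA module works on convolutional feature maps.

```python
import torch
import torch.nn as nn

class RLACell(nn.Module):
    """Updates a depth-wise hidden state from the current layer's output,
    aggregating information from all layers seen so far."""
    def __init__(self, dim, hidden_dim):
        super().__init__()
        self.update = nn.Linear(dim + hidden_dim, hidden_dim)

    def forward(self, layer_out, hidden):
        return torch.tanh(self.update(torch.cat([layer_out, hidden], dim=-1)))

class ToyRLANet(nn.Module):
    def __init__(self, dim=64, hidden_dim=16, depth=4):
        super().__init__()
        # each main layer consumes its input plus the aggregated history
        self.layers = nn.ModuleList(
            [nn.Linear(dim + hidden_dim, dim) for _ in range(depth)]
        )
        self.rla = RLACell(dim, hidden_dim)
        self.hidden_dim = hidden_dim

    def forward(self, x):
        hidden = x.new_zeros(x.size(0), self.hidden_dim)
        for layer in self.layers:
            x = torch.relu(layer(torch.cat([x, hidden], dim=-1)))
            hidden = self.rla(x, hidden)  # recurrence along depth
        return x
```

The design point is that the hidden state has a fixed, small size, so reuse of all previous layers costs far less than densely connecting every pair of layers.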

Rethinking Compressed Convolution Neural Network from a Statistical Perspective

no code implementations • 1 Jan 2021 • Feiqing Huang, Yuefeng Si, Guodong Li

Many designs have recently been proposed to improve the model efficiency of convolutional neural networks (CNNs) under a fixed resource budget, but theoretical analysis justifying them is lacking.

Tensor Decomposition

Do RNN and LSTM have Long Memory?

1 code implementation • ICML 2020 • Jingyu Zhao, Feiqing Huang, Jia Lv, Yanjie Duan, Zhen Qin, Guodong Li, Guangjian Tian

The LSTM network was proposed to overcome the difficulty of learning long-term dependence, and it has achieved significant advances in applications.

Compact Autoregressive Network

no code implementations • 6 Sep 2019 • Di Wang, Feiqing Huang, Jingyu Zhao, Guodong Li, Guangjian Tian

Autoregressive networks can achieve promising performance in many sequence modeling tasks with short-range dependence.

TAR
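
One common way to make such an autoregressive model compact is to impose low-rank structure on the stacked AR coefficients. The toy sketch below, assuming NumPy, illustrates the parameter saving with a shared-factor form; the variable names and the rank are hypothetical, and this is not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)
P, d, r = 8, 10, 3          # AR order, series dimension, assumed rank

# Full AR(P) model: y_t = sum_k A_k y_{t-k}; parameter count = P * d * d
A_full = rng.standard_normal((P, d, d))

# Compact surrogate: A_k ≈ U diag(s_k) V^T shares the factors U, V across
# lags, cutting parameters from P*d*d down to 2*d*r + P*r
U = rng.standard_normal((d, r))
V = rng.standard_normal((d, r))
S = rng.standard_normal((P, r))

def predict(history):
    """One-step prediction from the last P observations, shape (P, d)."""
    return sum(U @ np.diag(S[k]) @ V.T @ history[k] for k in range(P))

print(P * d * d, "params full vs", 2 * d * r + P * r, "compressed")
```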
