Search Results for author: Pengyu Liu

Found 11 papers, 4 papers with code

MMAD: Multi-label Micro-Action Detection in Videos

1 code implementation • 7 Jul 2024 • Kun Li, Dan Guo, Pengyu Liu, Guoliang Chen, Meng Wang

To support the MMAD task, we introduce a new dataset named Multi-label Micro-Action-52 (MMA-52), specifically designed to facilitate the detailed analysis and exploration of complex human micro-actions.

Action Detection

Micro-gesture Online Recognition using Learnable Query Points

no code implementations • 5 Jul 2024 • Pengyu Liu, Fei Wang, Kun Li, Guoliang Chen, Yanyan Wei, Shengeng Tang, Zhiliang Wu, Dan Guo

The Micro-gesture Online Recognition task involves identifying the category and locating the start and end times of micro-gestures in video clips.

Action Detection

Transformer-QEC: Quantum Error Correction Code Decoding with Transferable Transformers

no code implementations • 27 Nov 2023 • Hanrui Wang, Pengyu Liu, Kevin Shao, Dantong Li, Jiaqi Gu, David Z. Pan, Yongshan Ding, Song Han

Quantum Error Correction (QEC) mitigates this by employing redundancy, distributing quantum information across multiple data qubits and utilizing syndrome qubits to monitor their states for errors.

Decoder • Transfer Learning
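
The redundancy-plus-syndrome idea in the abstract can be illustrated with a classical analogue, the 3-bit repetition code. The sketch below is illustrative only: it is not the paper's transformer decoder, and all names are invented for this example.

```python
# Illustrative classical analogue of syndrome-based error correction:
# the 3-bit repetition code. Not the paper's transformer decoder.

def encode(bit):
    """Distribute one logical bit across three data bits (redundancy)."""
    return [bit, bit, bit]

def syndrome(data):
    """Parity checks between neighbouring data bits, the classical
    counterpart of what syndrome qubits report."""
    return [data[0] ^ data[1], data[1] ^ data[2]]

def correct(data):
    """Flip the single bit pinpointed by the syndrome (majority vote)."""
    s = syndrome(data)
    if s == [1, 0]:
        data[0] ^= 1
    elif s == [1, 1]:
        data[1] ^= 1
    elif s == [0, 1]:
        data[2] ^= 1
    return data

noisy = encode(0)
noisy[1] ^= 1                     # inject one bit-flip error
assert syndrome(noisy) == [1, 1]  # non-trivial syndrome flags the error
assert correct(noisy) == [0, 0, 0]
```

A learned decoder such as the one the paper proposes replaces the fixed lookup above with a model that maps syndrome patterns to corrections.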

RobustState: Boosting Fidelity of Quantum State Preparation via Noise-Aware Variational Training

no code implementations • 27 Nov 2023 • Hanrui Wang, Yilian Liu, Pengyu Liu, Jiaqi Gu, Zirui Li, Zhiding Liang, Jinglei Cheng, Yongshan Ding, Xuehai Qian, Yiyu Shi, David Z. Pan, Frederic T. Chong, Song Han

Arbitrary state preparation algorithms can be broadly categorized into arithmetic decomposition (AD) and variational quantum state preparation (VQSP).

DGR: Tackling Drifted and Correlated Noise in Quantum Error Correction via Decoding Graph Re-weighting

no code implementations • 27 Nov 2023 • Hanrui Wang, Pengyu Liu, Yilian Liu, Jiaqi Gu, Jonathan Baker, Frederic T. Chong, Song Han

By counting the occurrences of edges and edge pairs in decoded matchings, we can statistically estimate the up-to-date probabilities of each edge and the correlations between them.

Decoder
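
The counting step described in the abstract can be sketched in a few lines. The matchings and edge labels below are invented for illustration; the decoding-graph re-weighting itself is the paper's contribution and is not reproduced here.

```python
from collections import Counter
from itertools import combinations

# Sketch of estimating edge probabilities and pairwise correlations
# from a stream of decoded matchings (toy data, invented labels).
matchings = [
    {"e1", "e2"},
    {"e1"},
    {"e1", "e3"},
    {"e2", "e3"},
]

edge_counts = Counter()
pair_counts = Counter()
for m in matchings:
    edge_counts.update(m)
    pair_counts.update(frozenset(p) for p in combinations(m, 2))

n = len(matchings)
edge_prob = {e: c / n for e, c in edge_counts.items()}

# Correlation proxy: co-occurrence frequency minus the product of the
# marginal probabilities (zero if the pair behaved independently).
correlation = {}
for pair, c in pair_counts.items():
    a, b = sorted(pair)
    correlation[(a, b)] = c / n - edge_prob[a] * edge_prob[b]

assert edge_prob["e1"] == 0.75
assert correlation[("e1", "e2")] == -0.125
```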

Quantifying syntax similarity with a polynomial representation of dependency trees

1 code implementation • 13 Nov 2022 • Pengyu Liu, Tinghao Feng, Rui Liu

We introduce a graph polynomial that distinguishes tree structures to represent dependency grammar and a measure based on the polynomial representation to quantify syntax similarity.

Diversity • Sentence

QuEst: Graph Transformer for Quantum Circuit Reliability Estimation

1 code implementation • 30 Oct 2022 • Hanrui Wang, Pengyu Liu, Jinglei Cheng, Zhiding Liang, Jiaqi Gu, Zirui Li, Yongshan Ding, Weiwen Jiang, Yiyu Shi, Xuehai Qian, David Z. Pan, Frederic T. Chong, Song Han

Specifically, the TorchQuantum library also supports using data-driven ML models to solve problems in quantum system research, such as predicting the impact of quantum noise on circuit fidelity and improving quantum circuit compilation efficiency.

Comparing the topology of phylogenetic network generators

no code implementations • 12 Jun 2021 • Remie Janssen, Pengyu Liu

Phylogenetic networks represent the evolutionary history of species and can record natural reticulate evolutionary processes such as horizontal gene transfer and gene recombination.

On the Compressive Power of Boolean Threshold Autoencoders

no code implementations • 21 Apr 2020 • Avraham A. Melkman, Sini Guo, Wai-Ki Ching, Pengyu Liu, Tatsuya Akutsu

An autoencoder is a layered neural network whose structure can be viewed as consisting of an encoder, which compresses an input vector of dimension $D$ to a vector of low dimension $d$, and a decoder which transforms the low-dimensional vector back to the original input vector (or one that is very similar).
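As a toy illustration of this encoder/decoder split using Boolean threshold gates (a minimal sketch under invented parameters, not the paper's construction): a threshold gate fires iff the weighted sum of its inputs reaches its threshold, and a one-unit code layer suffices to reconstruct a two-vector input set exactly.

```python
# Toy Boolean threshold autoencoder (illustrative sketch only).

def threshold_gate(weights, theta, x):
    """Fire (output 1) iff the weighted input sum reaches threshold theta."""
    return int(sum(w * xi for w, xi in zip(weights, x)) >= theta)

# Encoder: compress D = 4 bits to d = 1 for the input set
# {[0,0,0,0], [1,1,1,1]}; decoder: expand the 1-bit code back to 4 bits.
def encoder(x):
    return [threshold_gate([1, 1, 1, 1], 2, x)]

def decoder(z):
    return [threshold_gate([1], 1, z)] * 4

for x in ([0, 0, 0, 0], [1, 1, 1, 1]):
    assert decoder(encoder(x)) == x   # exact reconstruction on this set
```

The paper's question is how small $d$ can be made relative to $D$ in general; this sketch only shows the architecture, not those bounds.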

A tree distinguishing polynomial

2 code implementations • 6 Apr 2019 • Pengyu Liu

We define a bivariate polynomial for unlabeled rooted trees and show that the polynomial of an unlabeled rooted tree $T$ is the generating function of a class of subtrees of $T$.

Combinatorics (MSC 05C31, 05C05)
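
A recursion consistent with this description (a sketch; see the paper for the exact definition) assigns $x$ to a single leaf and, for an internal node, $y$ plus the product of its children's polynomials:

```python
# Sketch of a tree-distinguishing bivariate polynomial:
# P(leaf) = x;  P(node) = y + product of the children's polynomials.
# A polynomial in x, y is stored as a dict mapping exponent pairs
# (i, j) for x**i * y**j to integer coefficients.

def poly_mul(p, q):
    out = {}
    for (a, b), ca in p.items():
        for (c, d), cb in q.items():
            out[(a + c, b + d)] = out.get((a + c, b + d), 0) + ca * cb
    return out

def tree_poly(tree):
    """tree is nested tuples; () is a leaf."""
    if not tree:
        return {(1, 0): 1}            # the polynomial x
    prod = {(0, 0): 1}                # the constant 1
    for child in tree:
        prod = poly_mul(prod, tree_poly(child))
    prod[(0, 1)] = prod.get((0, 1), 0) + 1   # add y
    return prod

# Two non-isomorphic rooted trees on four leaves get distinct polynomials:
caterpillar = ((), ((), ((), ())))
balanced = (((), ()), ((), ()))
assert tree_poly(caterpillar) != tree_poly(balanced)
```

Here the caterpillar evaluates to $x^4 + x^2y + xy + y$ and the balanced tree to $x^4 + 2x^2y + y^2 + y$, so the polynomial separates the two shapes.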
