1 code implementation • 7 Jul 2024 • Kun Li, Dan Guo, Pengyu Liu, Guoliang Chen, Meng Wang
To support the MMAD task, we introduce a new dataset named Multi-label Micro-Action-52 (MMA-52), specifically designed to facilitate the detailed analysis and exploration of complex human micro-actions.
no code implementations • 5 Jul 2024 • Pengyu Liu, Fei Wang, Kun Li, Guoliang Chen, Yanyan Wei, Shengeng Tang, Zhiliang Wu, Dan Guo
The Micro-gesture Online Recognition task involves identifying the category and locating the start and end times of micro-gestures in video clips.
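As a hedged illustration of what "identifying the category and locating the start and end times" amounts to, the sketch below compares hypothetical (category, start, end) predictions against ground truth with temporal IoU; the category names and the IoU-based comparison are illustrative assumptions, not the challenge's actual evaluation protocol.

```python
# Illustrative only: prediction format and temporal-IoU comparison for
# micro-gesture online recognition. Categories and timings are made up.

def temporal_iou(pred, gt):
    """pred and gt are (start, end) tuples in seconds."""
    inter = max(0.0, min(pred[1], gt[1]) - max(pred[0], gt[0]))
    union = (pred[1] - pred[0]) + (gt[1] - gt[0]) - inter
    return inter / union if union > 0 else 0.0

# Hypothetical predictions: (category, start_s, end_s)
predictions = [("scratching head", 3.2, 4.1), ("crossing arms", 7.0, 9.5)]
ground_truth = [("scratching head", 3.0, 4.0)]

for cat, s, e in predictions:
    for gt_cat, gs, ge in ground_truth:
        if cat == gt_cat:
            print(cat, round(temporal_iou((s, e), (gs, ge)), 3))
```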
no code implementations • 25 Feb 2024 • Lekai Song, Pengyu Liu, Jingfang Pei, Yang Liu, Songwei Liu, Shengbo Wang, Leonard W. T. Ng, Tawfique Hasan, Kong-Pang Pun, Shuo Gao, Guohua Hu
The demand for efficient edge vision has spurred interest in developing stochastic computing approaches for performing image processing tasks.
no code implementations • 27 Nov 2023 • Hanrui Wang, Pengyu Liu, Kevin Shao, Dantong Li, Jiaqi Gu, David Z. Pan, Yongshan Ding, Song Han
Quantum Error Correction (QEC) mitigates this by employing redundancy, distributing quantum information across multiple data qubits and utilizing syndrome qubits to monitor their states for errors.
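The following is a minimal, purely classical sketch of this redundancy idea, using the 3-qubit bit-flip repetition code: one logical bit is spread over three data bits, and two parity checks play the role of syndrome qubits that flag errors without reading out the encoded value. The code and names here are illustrative and not taken from the paper.

```python
import random

def encode(logical_bit):
    # Distribute one logical bit over three data "qubits" (redundancy).
    return [logical_bit] * 3

def apply_noise(data, p=0.1):
    # Independent bit flips with probability p.
    return [b ^ (random.random() < p) for b in data]

def measure_syndrome(data):
    # Parity checks Z0Z1 and Z1Z2: they reveal where a flip happened,
    # not what the encoded bit is.
    return (data[0] ^ data[1], data[1] ^ data[2])

def correct(data, syndrome):
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome)
    if flip is not None:
        data[flip] ^= 1
    return data

noisy = apply_noise(encode(1))
fixed = correct(noisy, measure_syndrome(noisy))
print(fixed)   # recovers [1, 1, 1] whenever at most one flip occurred
```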
no code implementations • 27 Nov 2023 • Hanrui Wang, Yilian Liu, Pengyu Liu, Jiaqi Gu, Zirui Li, Zhiding Liang, Jinglei Cheng, Yongshan Ding, Xuehai Qian, Yiyu Shi, David Z. Pan, Frederic T. Chong, Song Han
Arbitrary state preparation algorithms can be broadly categorized into arithmetic decomposition (AD) and variational quantum state preparation (VQSP).
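As a rough illustration of the variational (VQSP) route, the toy sketch below tunes a single rotation angle so that RY(theta)|0> matches a target single-qubit state; the target state, finite-difference optimizer, and learning rate are invented for illustration and are not the paper's method.

```python
import numpy as np

target = np.array([np.sqrt(0.2), np.sqrt(0.8)])   # hypothetical target state

def prepared_state(theta):
    # RY(theta) applied to |0>.
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def infidelity(theta):
    return 1.0 - abs(np.dot(target, prepared_state(theta))) ** 2

theta, lr = 0.1, 0.5
for _ in range(200):
    # Finite-difference gradient descent on the infidelity.
    grad = (infidelity(theta + 1e-4) - infidelity(theta - 1e-4)) / 2e-4
    theta -= lr * grad

print(theta, infidelity(theta))   # theta converges near 2*arccos(sqrt(0.2))
```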
no code implementations • 27 Nov 2023 • Hanrui Wang, Pengyu Liu, Yilian Liu, Jiaqi Gu, Jonathan Baker, Frederic T. Chong, Song Han
By counting the occurrences of edges and edge pairs in decoded matchings, we can statistically estimate the up-to-date probabilities of each edge and the correlations between them.
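A hedged sketch of that counting idea follows: given a stream of decoded matchings, each a set of decoding-graph edges, each edge's probability is estimated from its frequency, and pairwise correlations from how often two edges co-occur relative to what independence would predict. The edge labels and the covariance-style correlation estimate are illustrative choices, not the paper's implementation.

```python
from collections import Counter
from itertools import combinations

def estimate(matchings):
    n = len(matchings)
    edge_counts, pair_counts = Counter(), Counter()
    for matching in matchings:
        edge_counts.update(matching)
        pair_counts.update(combinations(sorted(matching), 2))
    probs = {e: c / n for e, c in edge_counts.items()}
    # Co-occurrence frequency minus the product of marginals.
    corr = {pair: pair_counts[pair] / n - probs[pair[0]] * probs[pair[1]]
            for pair in pair_counts}
    return probs, corr

# Hypothetical matchings over edges labeled by strings:
matchings = [{"e01", "e12"}, {"e01"}, {"e01", "e23"}, {"e12", "e23"}]
probs, corr = estimate(matchings)
print(probs["e01"], corr[("e01", "e12")])
```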
1 code implementation • 13 Nov 2022 • Pengyu Liu, Tinghao Feng, Rui Liu
We introduce a graph polynomial that distinguishes tree structures to represent dependency grammar, together with a measure based on this polynomial representation to quantify syntax similarity.
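The sketch below shows one way a similarity measure could be built on such a polynomial representation: each tree's polynomial is assumed to already be encoded as a dictionary mapping exponent pairs (i, j) of x^i y^j to coefficients (a recursive construction of such a polynomial is sketched under the 2019 entry below), and a cosine distance between coefficient vectors is used. The distance choice and example coefficients are illustrative assumptions, not necessarily the measure defined in the paper.

```python
import numpy as np

def cosine_distance(poly_a, poly_b):
    # poly_a, poly_b map exponent pairs (i, j) to coefficients.
    keys = sorted(set(poly_a) | set(poly_b))
    a = np.array([poly_a.get(k, 0.0) for k in keys])
    b = np.array([poly_b.get(k, 0.0) for k in keys])
    return 1.0 - float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Hypothetical coefficient dictionaries for two small trees:
tree1 = {(3, 0): 1, (1, 1): 2, (0, 2): 1}
tree2 = {(3, 0): 1, (2, 1): 1, (0, 2): 1}
print(cosine_distance(tree1, tree2))
```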
1 code implementation • 30 Oct 2022 • Hanrui Wang, Pengyu Liu, Jinglei Cheng, Zhiding Liang, Jiaqi Gu, Zirui Li, Yongshan Ding, Weiwen Jiang, Yiyu Shi, Xuehai Qian, David Z. Pan, Frederic T. Chong, Song Han
Specifically, the TorchQuantum library also supports using data-driven ML models to solve problems in quantum system research, such as predicting the impact of quantum noise on circuit fidelity and improving quantum circuit compilation efficiency.
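To illustrate the "data-driven ML model" idea only, the sketch below trains a small regressor that maps simple circuit features (hypothetical gate counts, depth, and qubit count) to a predicted fidelity under noise. It uses plain PyTorch and random placeholder data; it is not the TorchQuantum API and not the model used in the paper.

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Hypothetical training data: per-circuit features and measured fidelities.
features = torch.rand(256, 4)     # e.g. [n_1q_gates, n_2q_gates, depth, n_qubits], scaled
fidelities = torch.rand(256, 1)   # placeholder labels standing in for noisy-run results

for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(features), fidelities)
    loss.backward()
    optimizer.step()

print(model(features[:1]))        # predicted fidelity for the first circuit
```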
no code implementations • 12 Jun 2021 • Remie Janssen, Pengyu Liu
Phylogenetic networks represent the evolutionary history of species and can record natural reticulate evolutionary processes such as horizontal gene transfer and gene recombination.
no code implementations • 21 Apr 2020 • Avraham A. Melkman, Sini Guo, Wai-Ki Ching, Pengyu Liu, Tatsuya Akutsu
An autoencoder is a layered neural network whose structure can be viewed as consisting of an encoder, which compresses an input vector of dimension $D$ to a vector of low dimension $d$, and a decoder, which transforms the low-dimensional vector back into the original input vector (or one very similar to it).
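A minimal sketch of this encoder/decoder structure follows: the encoder maps an input of dimension D to a code of dimension d, the decoder maps it back, and training minimizes reconstruction error. The layer sizes, data, and optimizer here are illustrative and unrelated to the paper's theoretical analysis.

```python
import torch
import torch.nn as nn

D, d = 16, 3
encoder = nn.Sequential(nn.Linear(D, 8), nn.ReLU(), nn.Linear(8, d))
decoder = nn.Sequential(nn.Linear(d, 8), nn.ReLU(), nn.Linear(8, D))
optimizer = torch.optim.Adam([*encoder.parameters(), *decoder.parameters()], lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.rand(128, D)                      # hypothetical input vectors
for _ in range(200):
    optimizer.zero_grad()
    reconstruction = decoder(encoder(x))    # compress to d dims, then expand back to D
    loss = loss_fn(reconstruction, x)
    loss.backward()
    optimizer.step()

print(encoder(x[:1]))                       # low-dimensional code for one input
```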
2 code implementations • 6 Apr 2019 • Pengyu Liu
We define a bivariate polynomial for unlabeled rooted trees and show that the polynomial of an unlabeled rooted tree $T$ is the generating function of a class of subtrees of $T$.
Combinatorics (MSC 05C31, 05C05)
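The sketch below computes a bivariate polynomial for unlabeled rooted trees using the recursion commonly used for this construction: a single vertex gets x, and an internal vertex gets y plus the product of its subtrees' polynomials. Treat this exact form as an assumption consistent with the description above rather than a quotation of the paper.

```python
import sympy as sp

x, y = sp.symbols("x y")

def tree_polynomial(tree):
    # A tree is a (possibly empty) list of child subtrees; [] is a leaf.
    if not tree:
        return x
    poly = sp.Integer(1)
    for child in tree:
        poly *= tree_polynomial(child)
    return y + poly

# The rooted path on 3 vertices vs. the star with 2 leaves:
path3 = [[[]]]          # root - child - grandchild
star2 = [[], []]        # root with two leaf children
print(sp.expand(tree_polynomial(path3)))   # x + 2*y
print(sp.expand(tree_polynomial(star2)))   # x**2 + y  (distinct polynomial)
```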