1 code implementation • 20 Feb 2024 • Jinjing Shi, Zimeng Xiao, Heyuan Shi, Yu Jiang, Xuelong Li
Subsequently, QuanTest formulates the generation of test inputs that maximize quantum entanglement sufficiency and capture incorrect behaviors of the QNN system as a joint optimization problem, and solves it in a gradient-based manner to generate quantum adversarial examples.
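The abstract does not give QuanTest's concrete objective, but the general pattern of gradient-based joint optimization for adversarial test generation can be sketched classically. The sketch below is a toy stand-in, not QuanTest itself: `joint_objective` combines a coverage surrogate (a placeholder for entanglement sufficiency) with a term rewarding misclassification, and gradient ascent perturbs the input until the model's prediction flips; all function names and the linear "model" are hypothetical.

```python
import numpy as np

def model_logits(x, w):
    # Toy linear "model": logits for two classes (stand-in for a QNN).
    return np.array([w @ x, -(w @ x)])

def joint_objective(x, w, true_label, alpha=1.0):
    # Coverage surrogate (L2 norm of the input, a placeholder for an
    # entanglement-sufficiency metric) plus a misbehavior term: the
    # negative logit of the true class, so ascent pushes toward errors.
    logits = model_logits(x, w)
    return alpha * np.linalg.norm(x) - logits[true_label]

def numerical_grad(f, x, eps=1e-6):
    # Central-difference gradient of f at x.
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

def generate_adversarial(x0, w, true_label, steps=100, lr=0.05):
    # Gradient ascent on the joint objective, starting from a seed input.
    x = x0.copy()
    for _ in range(steps):
        g = numerical_grad(lambda z: joint_objective(z, w, true_label), x)
        x = x + lr * g
    return x

w = np.array([1.0, -0.5])
x0 = np.array([0.3, 0.1])              # originally classified as class 0
x_adv = generate_adversarial(x0, w, true_label=0)
print(int(np.argmax(model_logits(x_adv, w))))  # prediction has flipped to 1
```

The key design point mirrored here is that both goals (coverage and misbehavior) are folded into one differentiable objective, so a single gradient loop serves them jointly.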
no code implementations • 25 Jan 2024 • Ren-xin Zhao, Jinjing Shi, Xuelong Li
In response to the dilemma of the Hard Attention Mechanism (HAM) and Quantum Machine Learning (QML), a Grover-inspired Quantum Hard Attention Mechanism (GQHAM) consisting of a Flexible Oracle (FO) and an Adaptive Diffusion Operator (ADO) is proposed.
no code implementations • 25 Aug 2023 • Ren-xin Zhao, Jinjing Shi, Xuelong Li
Self-Attention Mechanism (SAM) excels at distilling important information from the interior of data to improve the computational efficiency of models.
no code implementations • 14 Jul 2022 • Jinjing Shi, Ren-xin Zhao, Wenxuan Wang, Shichao Zhang, Xuelong Li
Self-Attention Mechanism (SAM) is good at capturing the internal connections of features and greatly improves the performance of machine learning models, especially those requiring efficient characterization and feature extraction of high-dimensional data.
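For reference, the classical Self-Attention Mechanism these entries build on can be written in a few lines. This is a minimal single-head sketch in plain NumPy, not the quantum variant proposed in the papers; the random weight matrices are illustrative stand-ins.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (seq_len, d_model). Project into queries, keys, and values,
    # score every pair of positions, and mix values by those scores.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = softmax(Q @ K.T / np.sqrt(d_k))  # (seq_len, seq_len)
    return scores @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                   # 4 tokens, 8-dim features
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one attended vector per input token
```

Each output row is a weighted mixture of all value vectors, which is what lets the mechanism distill information "from the interior of data" as described above.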
no code implementations • 25 Sep 2019 • Junhao Qiu, Ronghua Shi, Fangfang Li (the corresponding author), Jinjing Shi, Wangmin Liao
We propose a new text classification model, which uses words, characters, and labels as input.