no code implementations • 26 Apr 2024 • Ren-xin Zhao, Shi Wang, Yaonan Wang
The Quantum Convolutional Layer (QCL) is considered one of the core components of Quantum Convolutional Neural Networks (QCNNs) due to its efficient data feature extraction capability.
no code implementations • 25 Jan 2024 • Ren-xin Zhao, Jinjing Shi, Xuelong Li
To address the dilemma of combining the Hard Attention Mechanism (HAM) with Quantum Machine Learning (QML), a Grover-inspired Quantum Hard Attention Mechanism (GQHAM), consisting of a Flexible Oracle (FO) and an Adaptive Diffusion Operator (ADO), is proposed.
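For context, the oracle-plus-diffusion structure that GQHAM draws on is the standard Grover iteration. The sketch below is a plain NumPy statevector simulation of ordinary Grover search, not the paper's FO or ADO circuits, which it only loosely mirrors.

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Amplify the amplitude of the `marked` basis state via Grover iterations."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))                   # uniform superposition
    oracle = np.eye(N)
    oracle[marked, marked] = -1.0                        # phase-flip the marked state
    diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)   # inversion about the mean
    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # ~optimal iteration count
    for _ in range(iterations):
        state = diffusion @ (oracle @ state)
    return int(np.argmax(np.abs(state) ** 2))            # most probable outcome

print(grover_search(3, 5))  # → 5
```

The paper's contribution, per the abstract, is making the oracle flexible and the diffusion operator adaptive so that "hard" attention selection can be trained, rather than fixed as in the textbook routine above.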
no code implementations • 25 Aug 2023 • Ren-xin Zhao, Jinjing Shi, Xuelong Li
The Self-Attention Mechanism (SAM) excels at distilling important information from within data, improving the computational efficiency of models.
no code implementations • 14 Jul 2022 • Jinjing Shi, Ren-xin Zhao, Wenxuan Wang, Shichao Zhang, Xuelong Li
The Self-Attention Mechanism (SAM) is good at capturing the internal connections of features and greatly improves the performance of machine learning models, especially those requiring efficient characterization and feature extraction of high-dimensional data.
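As a reference point for the quantum variants studied in these papers, the classical SAM they build on is scaled dot-product attention. The sketch below is an illustrative NumPy version of that classical mechanism, not the authors' quantum circuits; the projection matrices `Wq`, `Wk`, `Wv` are stand-in parameters.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over the rows (tokens) of X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])               # pairwise similarities
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)        # row-wise softmax
    return weights @ V                                   # attention-weighted values

out = self_attention(np.eye(3), np.eye(3), np.eye(3), np.eye(3))
print(out.shape)  # → (3, 3)
```

Each output row is a convex combination of the value rows, which is the "internal connections of features" the abstract refers to: every token attends to every other token with learned weights.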