no code implementations • 25 Jan 2024 • Ren-xin Zhao, Jinjing Shi, Xuelong Li
In response to the dilemma between Hard Attention Mechanisms (HAM) and Quantum Machine Learning (QML), a Grover-inspired Quantum Hard Attention Mechanism (GQHAM) consisting of a Flexible Oracle (FO) and an Adaptive Diffusion Operator (ADO) is proposed.
no code implementations • 25 Aug 2023 • Ren-xin Zhao, Jinjing Shi, Xuelong Li
Self-Attention Mechanism (SAM) excels at distilling important information from the interior of data to improve the computational efficiency of models.
no code implementations • 14 Jul 2022 • Jinjing Shi, Ren-xin Zhao, Wenxuan Wang, Shichao Zhang, Xuelong Li
Self-Attention Mechanism (SAM) is good at capturing the internal connections of features and greatly improves the performance of machine learning models, especially for tasks requiring efficient characterization and feature extraction of high-dimensional data.
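The classical SAM these entries build on can be sketched as scaled dot-product self-attention: each position's output is a weighted sum of all positions' values, with weights computed from query-key similarity. This is a minimal illustrative sketch (the projection matrices `w_q`, `w_k`, `w_v` and the single-head setup are assumptions for illustration), not the quantum variants proposed in the papers above.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence x of shape (n, d)."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])          # pairwise similarity, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over each row
    return weights @ v, weights                      # contextualized outputs, attention map

rng = np.random.default_rng(0)
n, d = 4, 8
x = rng.standard_normal((n, d))
w_q, w_k, w_v = (rng.standard_normal((d, d)) for _ in range(3))
out, attn = self_attention(x, w_q, w_k, w_v)
# out keeps the input shape (n, d); each row of attn is a probability distribution
```

The attention map `attn` makes explicit which positions each output attends to, which is the "internal connections of features" that the abstracts refer to.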