no code implementations • NAACL (CMCL) 2021 • Erik McGuire, Noriko Tomuro
Many current language models, such as BERT, use attention mechanisms to transform sequence representations.
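The attention mechanism mentioned above can be sketched minimally as scaled dot-product self-attention, the core operation in BERT-style Transformers (a simplified NumPy illustration, omitting multiple heads, masking, and learned projections):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention over a sequence.

    Q, K, V: (seq_len, d) arrays of query, key, and value vectors.
    Returns the attended representations and the attention weights.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                 # pairwise token similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V, weights

# Toy example: self-attention over 3 tokens with 4-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(X, X, X)
```

Each output row is a weighted mixture of all token representations, which is how attention "transforms" the sequence.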
no code implementations • 15 Mar 2024 • Jin Cui, Fumiyo Fukumoto, Xinfeng Wang, Yoshimi Suzuki, Jiyi Li, Noriko Tomuro, Wanzeng Kong
To address the entanglement of multiple aspect categories and sentiments, we propose a hierarchical disentanglement module that extracts distinct category and sentiment features.
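One simple way to picture such disentanglement is to route a shared representation through separate projection heads, one per factor. The sketch below is purely illustrative and is not the authors' architecture; the head names and dimensions are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def linear_head(x, W, b):
    # Affine projection into a factor-specific subspace (illustrative).
    return x @ W + b

d_model, d_cat, d_sent = 8, 4, 4
# Hypothetical heads: one projects into a category subspace,
# the other into a sentiment subspace, so the two factors are
# represented by separate feature vectors.
W_cat, b_cat = rng.standard_normal((d_model, d_cat)), np.zeros(d_cat)
W_sent, b_sent = rng.standard_normal((d_model, d_sent)), np.zeros(d_sent)

h = rng.standard_normal((2, d_model))    # 2 shared sentence representations
z_cat = linear_head(h, W_cat, b_cat)     # category features
z_sent = linear_head(h, W_sent, b_sent)  # sentiment features
```

In practice a disentanglement module would also add losses or constraints that keep the two feature sets from encoding the same information; this sketch only shows the separation of feature streams.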