no code implementations • 9 Oct 2024 • Jingyang Deng, Zhengyang Shen, Boyang Wang, Lixin Su, Suqi Cheng, Ying Nie, Junfeng Wang, Dawei Yin, Jinwen Ma
The development of Long-Context Large Language Models (LLMs) has markedly advanced natural language processing by facilitating the processing of textual data across long documents and multiple corpora.
no code implementations • 15 Aug 2024 • Xihong Yang, Heming Jing, Zixing Zhang, Jindong Wang, Huakang Niu, Shuaiqiang Wang, Yu Lu, Junfeng Wang, Dawei Yin, Xinwang Liu, En Zhu, Defu Lian, Erxue Min
In this work, we prove, based on information theory, that directly aligning the representations of LLMs and collaborative models is sub-optimal for enhancing the performance of downstream recommendation tasks.
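For intuition, here is a minimal sketch of the kind of direct representation alignment this result concerns: an LLM embedding is projected into the collaborative model's space and pulled toward it with an MSE loss. The dimensions and the `proj` layer are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical dimensions: 4096-d LLM hidden states, 64-d collaborative embeddings.
proj = nn.Linear(4096, 64)

def direct_alignment_loss(llm_repr: torch.Tensor, collab_repr: torch.Tensor) -> torch.Tensor:
    """Naive direct alignment (MSE after projection) -- the strategy the
    paper argues is sub-optimal from an information-theoretic view."""
    return F.mse_loss(proj(llm_repr), collab_repr)
```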
no code implementations • 6 May 2024 • Wenjie Zhou, Zhenxin Ding, Xiaodong Zhang, Haibo Shi, Junfeng Wang, Dawei Yin
Pre-trained language models have become an integral component of question-answering systems, achieving remarkable performance.
1 code implementation • 12 Feb 2024 • Dongsheng Zhu, Xunzhu Tang, Weidong Han, Jinghui Lu, Yukun Zhao, Guoliang Xing, Junfeng Wang, Dawei Yin
This paper presents VisLingInstruct, a novel approach to advancing Multi-Modal Language Models (MMLMs) in zero-shot learning.
no code implementations • 6 Jan 2024 • Qian Li, Lixin Su, Jiashu Zhao, Long Xia, Hengyi Cai, Suqi Cheng, Hengzhu Tang, Junfeng Wang, Dawei Yin
Compared with conventional textual retrieval, the main obstacle in text-video retrieval is the semantic gap between the textual nature of queries and the visual richness of video content.
no code implementations • 24 Dec 2023 • Xiaopeng Li, Lixin Su, Pengyue Jia, Xiangyu Zhao, Suqi Cheng, Junfeng Wang, Dawei Yin
Specifically, we use Chain-of-Thought (CoT) prompting to let Large Language Models (LLMs) act as agents that emulate various demographic profiles and then perform efficient query rewriting, and we introduce a robust Multi-gate Mixture-of-Experts (MMoE) architecture coupled with a hybrid loss function, together strengthening the robustness of the ranking models.
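For readers unfamiliar with the architecture, below is a minimal generic MMoE sketch: a set of shared experts with one softmax gate per task. All dimensions, expert counts, and tower heads are illustrative assumptions; this is not the paper's ranking model or its hybrid loss.

```python
import torch
import torch.nn as nn

class MMoE(nn.Module):
    """Minimal Multi-gate Mixture-of-Experts: shared experts, one gate per task."""
    def __init__(self, in_dim=64, expert_dim=32, n_experts=4, n_tasks=2):
        super().__init__()
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(in_dim, expert_dim), nn.ReLU())
            for _ in range(n_experts)
        ])
        self.gates = nn.ModuleList([nn.Linear(in_dim, n_experts) for _ in range(n_tasks)])
        self.towers = nn.ModuleList([nn.Linear(expert_dim, 1) for _ in range(n_tasks)])

    def forward(self, x):
        expert_out = torch.stack([e(x) for e in self.experts], dim=1)   # (B, E, D)
        outs = []
        for gate, tower in zip(self.gates, self.towers):
            w = torch.softmax(gate(x), dim=-1).unsqueeze(-1)            # (B, E, 1)
            outs.append(tower((w * expert_out).sum(dim=1)))             # (B, 1) per task
        return outs

scores = MMoE()(torch.randn(8, 64))   # two task outputs, each of shape (8, 1)
```

Per-task gating lets each task weight the shared experts differently, which is what gives MMoE its robustness to conflicting task objectives.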
1 code implementation • 1 Nov 2023 • Wei Wei, Xubin Ren, Jiabin Tang, Qinyong Wang, Lixin Su, Suqi Cheng, Junfeng Wang, Dawei Yin, Chao Huang
By employing these strategies, we address the challenges posed by sparse implicit feedback and low-quality side information in recommenders.
1 code implementation • 24 Oct 2023 • Xubin Ren, Wei Wei, Lianghao Xia, Lixin Su, Suqi Cheng, Junfeng Wang, Dawei Yin, Chao Huang
RLMRec incorporates auxiliary textual signals, develops a user/item profiling paradigm empowered by LLMs, and aligns the semantic space of LLMs with the representation space of collaborative relational signals through a cross-view alignment framework.
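A common way to realize such cross-view alignment is an InfoNCE-style contrastive loss between the two embedding spaces. The sketch below is a generic instantiation under that assumption, not RLMRec's exact objective; `cf_emb` and `llm_emb` denote collaborative and LLM-profile embeddings for the same users/items.

```python
import torch
import torch.nn.functional as F

def cross_view_alignment_loss(cf_emb, llm_emb, tau=0.2):
    """InfoNCE alignment: matched (user, profile) pairs sit on the diagonal
    of the similarity matrix and are pulled together; mismatches are pushed apart."""
    cf = F.normalize(cf_emb, dim=-1)
    sem = F.normalize(llm_emb, dim=-1)
    logits = cf @ sem.t() / tau                       # (N, N) cosine similarities
    labels = torch.arange(cf.size(0), device=cf.device)
    return F.cross_entropy(logits, labels)
```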
no code implementations • 31 Aug 2023 • Xiao Wang, Fang Dai, Wenyan Guo, Junfeng Wang
Therefore, this paper proposes BCSBM, a stochastic block model that integrates the betweenness centrality and clustering coefficient of nodes for community detection in attributed networks.
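The two node-level statistics that BCSBM integrates are standard and easy to compute; the snippet below shows them on a toy graph with networkx. This only illustrates the inputs, not the block model itself.

```python
import networkx as nx

G = nx.karate_club_graph()                    # toy attributed-network stand-in
betweenness = nx.betweenness_centrality(G)    # fraction of shortest paths through each node
clustering = nx.clustering(G)                 # local clustering coefficient per node
print(betweenness[0], clustering[0])
```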
no code implementations • 10 Aug 2021 • Ting Pan, Jizhong Duan, Junfeng Wang, Yu Liu
Recent methods have exploited the nonlocal self-similarity (NSS) of images by imposing low-rankness on groups of similar patches, achieving superior performance.
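As a rough sketch of the NSS low-rank idea (not any specific published method): similar patches are vectorized, stacked into a matrix, and the matrix is replaced by a truncated-SVD approximation, since a stack of similar patches should be nearly low-rank.

```python
import numpy as np

def low_rank_patch_denoise(patch_group: np.ndarray, rank: int) -> np.ndarray:
    """patch_group: (n_patches, patch_size) matrix of vectorized similar patches.
    Enforces low-rankness via hard singular-value truncation."""
    U, s, Vt = np.linalg.svd(patch_group, full_matrices=False)
    s[rank:] = 0                      # keep only the top `rank` singular values
    return (U * s) @ Vt               # low-rank reconstruction of the patch group
```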
no code implementations • 11 Mar 2021 • Xingyu Jiang, Mingyang Qin, Xinjian Wei, Zhongpei Feng, Jiezun Ke, Haipeng Zhu, Fucong Chen, Liping Zhang, Li Xu, Xu Zhang, Ruozhou Zhang, Zhongxu Wei, Peiyu Xiong, Qimei Liang, Chuanying Xi, Zhaosheng Wang, Jie Yuan, Beiyi Zhu, Kun Jiang, Ming Yang, Junfeng Wang, Jiangping Hu, Tao Xiang, Brigitte Leridon, Rong Yu, Qihong Chen, Kui Jin, Zhongxian Zhao
Iron selenide (FeSe), the structurally simplest iron-based superconductor, has attracted tremendous interest in recent years.
Superconductivity
no code implementations • 4 Sep 2020 • Jingzhe Ma, W. Peter Maksym, G. Fabbiano, Martin Elvis, Thaisa Storchi-Bergmann, Margarita Karovska, Junfeng Wang, Andrea Travascio
The Seyfert nucleus and ionization cones transition to and are surrounded by a LINER cocoon, extending up to $\sim$ 250 pc in thickness.
Astrophysics of Galaxies