no code implementations • 28 Mar 2025 • Pengsong Zhang, Heng Zhang, Huazhe Xu, Renjun Xu, Zhenting Wang, Cong Wang, Animesh Garg, Zhibin Li, Arash Ajoudani, Xinyu Liu
Scientific discovery is poised for rapid advancement through advanced robotics and artificial intelligence.
no code implementations • 17 Feb 2025 • Chunan Yu, Yidong Han, Chaotao Ding, Ying Zang, Lanyun Zhu, Xinhao Chen, Zejian Li, Renjun Xu, Tianrun Chen
In the era of the metaverse, where immersive technologies redefine human experiences, translating abstract literary concepts into navigable 3D environments presents a fundamental challenge in preserving semantic and emotional fidelity.
1 code implementation • 30 Dec 2024 • Zijie Chen, Zhanchao Zhou, Yu Lu, Renjun Xu, Lili Pan, Zhenzhong Lan
Solving NP-hard problems traditionally relies on heuristics, yet manually designing effective heuristics for complex problems remains a significant challenge.
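As a concrete illustration of the hand-designed heuristics this entry refers to, below is a classic nearest-neighbor construction heuristic for the Traveling Salesman Problem. This is a generic textbook heuristic, not the method proposed in the paper; the instance and function names are illustrative.

```python
import math

def nearest_neighbor_tour(points):
    """Classic hand-designed TSP heuristic: repeatedly visit the closest
    unvisited city. Easy to state, but far from optimal in general --
    exactly the kind of heuristic that is hard to design well by hand."""
    unvisited = list(range(1, len(points)))
    tour = [0]  # start from city 0
    while unvisited:
        last = points[tour[-1]]
        nxt = min(sorted(unvisited), key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# Small illustrative instance with no distance ties.
points = [(0, 0), (1, 0), (2, 1), (0, 2)]
tour = nearest_neighbor_tour(points)
print(tour)  # [0, 1, 2, 3]
```

Stronger heuristics (e.g. 2-opt refinements) quickly become intricate, which motivates searching the heuristic space automatically rather than crafting each rule manually.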
no code implementations • 18 Dec 2024 • Kejie Chen, Lin Wang, Qinghai Zhang, Renjun Xu
Recent studies have highlighted the limitations of large language models in mathematical reasoning, particularly their inability to capture the underlying logic.
no code implementations • 18 Oct 2024 • Xiang Hu, Hongyu Fu, Jinge Wang, Yifeng Wang, Zhikun Li, Renjun Xu, Yu Lu, Yaochu Jin, Lili Pan, Zhenzhong Lan
Scientific innovation is pivotal for humanity, and harnessing large language models (LLMs) to generate research ideas could transform discovery.
no code implementations • 21 Jun 2024 • Lichao Zhang, JIA YU, Shuai Zhang, Long Li, Yangyang Zhong, Guanbao Liang, Yuming Yan, Qing Ma, Fangsheng Weng, Fayu Pan, Jing Li, Renjun Xu, Zhenzhong Lan
Large Language Models (LLMs) have significantly advanced user-bot interactions, enabling more complex and coherent dialogues.
1 code implementation • 26 Jan 2024 • Qiang Zhang, Keyang Ding, Tianwen Lyv, Xinda Wang, Qingyu Yin, Yiwen Zhang, Jing Yu, Yuhao Wang, Xiaotong Li, Zhuoyi Xiang, Kehua Feng, Xiang Zhuang, Zeyuan Wang, Ming Qin, Mengyao Zhang, Jinlu Zhang, Jiyu Cui, Tao Huang, Pengju Yan, Renjun Xu, Hongyang Chen, Xiaolin Li, Xiaohui Fan, Huabin Xing, Huajun Chen
Large Language Models (LLMs) have emerged as a transformative power in enhancing natural language comprehension, representing a significant stride toward artificial general intelligence.
1 code implementation • 28 Jun 2023 • Ke Liu, Kaifan Yang, Jiahong Zhang, Renjun Xu
To the best of our knowledge, S2SNet is the first work to predict superconductivity using only crystal-structure information.
1 code implementation • 11 Jun 2023 • Renjun Xu, Kaifan Yang, Ke Liu, Fengxiang He
To address this issue, we design a Group Equivariant Vision Transformer (GE-ViT) via a novel, effective positional encoding operator.
1 code implementation • 1 Dec 2021 • Wang Lu, Jindong Wang, Yiqiang Chen, Xin Qin, Renjun Xu, Dimitrios Dimitriadis, Tao Qin
There is a growing interest in applying machine learning techniques to healthcare.
no code implementations • 20 Oct 2021 • Bei Yang, Jie Gu, Ke Liu, Xiaoxiao Xu, Renjun Xu, Qinghui Sun, Hong Liu
User Modeling plays an essential role in industry.
no code implementations • 29 Sep 2021 • Bei Yang, Ke Liu, Xiaoxiao Xu, Renjun Xu, Hong Liu, Huan Xu
However, existing research has limited ability to model universal user representations from the lifelong behavior sequences accumulated since user registration.
no code implementations • 18 Sep 2021 • Qinghui Sun, Jie Gu, Bei Yang, Xiaoxiao Xu, Renjun Xu, Shangde Gao, Hong Liu, Huan Xu
Universal user representation has received much interest recently, as it frees us from the cumbersome work of training a specific model for each downstream application.
2 code implementations • 10 Aug 2021 • Yuntao Du, Jindong Wang, Wenjie Feng, Sinno Pan, Tao Qin, Renjun Xu, Chongjun Wang
This paper proposes Adaptive RNNs (AdaRNN) to tackle the TCS problem by building an adaptive model that generalizes well on the unseen test data.
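The temporal covariate shift (TCS) problem that AdaRNN targets can be made concrete with a toy diagnostic: split a series into temporal segments and measure how far apart their input distributions are. The sketch below uses a linear-kernel MMD proxy (squared distance between segment means); it illustrates the shift phenomenon only and is not the AdaRNN algorithm itself — all names and the synthetic data are illustrative.

```python
import numpy as np

def linear_mmd(a, b):
    """Squared distance between segment means -- a linear-kernel MMD proxy."""
    return float(np.sum((a.mean(axis=0) - b.mean(axis=0)) ** 2))

def segment_shift(series, n_segments=3):
    """Split a series into temporal segments and score pairwise
    distribution gaps. Large scores hint at temporal covariate shift:
    a model fit on early segments may not transfer to later ones."""
    segments = np.array_split(series, n_segments)
    scores = {}
    for i in range(n_segments):
        for j in range(i + 1, n_segments):
            scores[(i, j)] = linear_mmd(segments[i], segments[j])
    return scores

rng = np.random.default_rng(0)
# Synthetic drifting series: the mean slides upward over time.
t = np.arange(300)
series = rng.normal(loc=t[:, None] / 100.0, scale=1.0, size=(300, 4))
scores = segment_shift(series, n_segments=3)
print(scores)  # distant segments score higher than adjacent ones
```

AdaRNN's contribution is to exploit such distribution gaps adaptively during training rather than merely diagnosing them.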
2 code implementations • 18 May 2021 • Wenxin Hou, Han Zhu, Yidong Wang, Jindong Wang, Tao Qin, Renjun Xu, Takahiro Shinozaki
Based on our previous MetaAdapter, which implicitly leverages adapters, we propose a novel algorithm called SimAdapter for explicitly learning knowledge from adapters.
Ranked #1 on Cross-Lingual ASR on Common Voice
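The adapters this entry builds on are small bottleneck modules inserted into a frozen backbone, so only a handful of parameters are trained per new language. Below is a minimal sketch of that generic bottleneck-adapter idea — not SimAdapter itself; the shapes, initialization, and class name are all illustrative assumptions.

```python
import numpy as np

class Adapter:
    """Minimal bottleneck adapter: down-project, nonlinearity, up-project,
    with a residual connection so the frozen backbone's features pass
    through unchanged when the adapter's weights are small."""
    def __init__(self, d_model, d_bottleneck, seed=0):
        rng = np.random.default_rng(seed)
        self.w_down = rng.normal(0.0, 0.02, size=(d_model, d_bottleneck))
        self.w_up = rng.normal(0.0, 0.02, size=(d_bottleneck, d_model))

    def __call__(self, h):
        z = np.maximum(h @ self.w_down, 0.0)  # down-project + ReLU
        return h + z @ self.w_up              # up-project + residual

adapter = Adapter(d_model=8, d_bottleneck=2)
h = np.ones((3, 8))  # stand-in for frozen backbone activations
out = adapter(h)
print(out.shape)  # (3, 8)
```

Because the near-zero initialization makes the adapter start as an approximate identity, training it on a new language perturbs, rather than overwrites, the pretrained representation.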
no code implementations • 3 Mar 2021 • Jindong Wang, Wenjie Feng, Chang Liu, Chaohui Yu, Mingxuan Du, Renjun Xu, Tao Qin, Tie-Yan Liu
Since collecting massive COVID-19 image samples to train deep classification models is expensive and time-consuming, transfer learning is a promising approach that transfers knowledge from abundant typical-pneumonia datasets to COVID-19 image classification.
1 code implementation • 17 Jul 2020 • Chaohui Yu, Jindong Wang, Chang Liu, Tao Qin, Renjun Xu, Wenjie Feng, Yiqiang Chen, Tie-Yan Liu
However, it remains challenging to determine which method suits a given application, since each is built with certain priors or biases.
no code implementations • 11 Jul 2020 • Renjun Xu, Pelen Liu, Yin Zhang, Fang Cai, Jindong Wang, Shuoying Liang, Heting Ying, Jianwei Yin
However, in the general setting where the target domain contains classes never observed in the source domain, namely Open Set Domain Adaptation (OSDA), existing DA methods fail to work because of interference from the extra unknown classes.
no code implementations • CVPR 2020 • Renjun Xu, Pelen Liu, Liyan Wang, Chao Chen, Jindong Wang
Besides, a weighted optimal transport strategy based on SSR is exploited to achieve a precise pair-wise optimal transport procedure, which reduces the negative transfer caused by samples near decision boundaries in the target domain.
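The optimal transport machinery underlying this entry can be sketched with a standard entropic-regularized solver (Sinkhorn iterations). This is the generic OT primitive only — the paper's SSR-based sample weighting is not modeled here, and the toy marginals and cost matrix are illustrative.

```python
import numpy as np

def sinkhorn(a, b, cost, eps=0.1, n_iter=200):
    """Entropic-regularized optimal transport via Sinkhorn iterations.
    Returns a transport plan whose row/column sums match the source
    and target marginals a and b."""
    K = np.exp(-cost / eps)       # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)         # scale columns to match b
        u = a / (K @ v)           # scale rows to match a
    return u[:, None] * K * v[None, :]

a = np.full(3, 1 / 3)             # uniform source weights
b = np.full(3, 1 / 3)             # uniform target weights
cost = np.abs(np.arange(3)[:, None] - np.arange(3)[None, :]).astype(float)
plan = sinkhorn(a, b, cost)
print(plan.round(3))
```

A weighted variant would replace the uniform marginals `a` with per-sample weights, down-weighting unreliable samples such as those near decision boundaries.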