no code implementations • 21 Jan 2024 • Rongqing Cong, Wenyang He, Mingxuan Li, Bangning Luo, Zebin Yang, Yuchao Yang, Ru Huang, Bonan Yan
Large language models (LLMs) built on Transformer architectures have achieved remarkable success in natural language processing, multimodal generative artificial intelligence, and agent-oriented artificial intelligence.
no code implementations • 31 Oct 2023 • Cenlin Duan, Jianlei Yang, Xiaolin He, Yingjie Qi, Yikun Wang, Yiou Wang, Ziyan He, Bonan Yan, Xueyan Wang, Xiaotao Jia, Weitao Pan, Weisheng Zhao
Processing-in-memory (PIM), a novel computing paradigm, delivers significant performance benefits by reducing data movement.
no code implementations • 15 Jun 2019 • Bing Li, Bonan Yan, Hai Li
The conventional von Neumann architecture has proven to be a major performance and energy bottleneck for emerging data-intensive applications.