Search Results for author: Bonan Yan

Found 3 papers, 0 papers with code

AttentionLego: An Open-Source Building Block For Spatially-Scalable Large Language Model Accelerator With Processing-In-Memory Technology

no code implementations 21 Jan 2024 Rongqing Cong, Wenyang He, Mingxuan Li, Bangning Luo, Zebin Yang, Yuchao Yang, Ru Huang, Bonan Yan

Large language models (LLMs) with Transformer architectures have become phenomenal in natural language processing, multimodal generative artificial intelligence, and agent-oriented artificial intelligence.

Tasks: Language Modelling, Large Language Model

DDC-PIM: Efficient Algorithm/Architecture Co-design for Doubling Data Capacity of SRAM-based Processing-In-Memory

no code implementations 31 Oct 2023 Cenlin Duan, Jianlei Yang, Xiaolin He, Yingjie Qi, Yikun Wang, Yiou Wang, Ziyan He, Bonan Yan, Xueyan Wang, Xiaotao Jia, Weitao Pan, Weisheng Zhao

Processing-in-memory (PIM), as a novel computing paradigm, provides significant performance benefits by effectively reducing data movement.

An Overview of In-memory Processing with Emerging Non-volatile Memory for Data-intensive Applications

no code implementations 15 Jun 2019 Bing Li, Bonan Yan, Hai Li

The conventional von Neumann architecture has proven to be a major performance and energy bottleneck for emerging data-intensive applications.
