no code implementations • 21 Nov 2024 • Zehua Pei, Hui-Ling Zhen, Xianzhi Yu, Sinno Jialin Pan, Mingxuan Yuan, Bei Yu
In this paper, we propose FuseGPT, a novel methodology that recycles pruned transformer blocks to further recover model performance.
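The entry only states the high-level idea; as a rough illustration of block-level pruning followed by a recovery step (not the FuseGPT algorithm itself), a minimal PyTorch sketch is given below. The `ToyBlock` module, the cosine-similarity importance heuristic, and the distillation-style recovery loop are all assumptions made for illustration.

```python
# Hypothetical sketch: rank transformer blocks by importance, drop the least
# important ones, then briefly fine-tune the remaining blocks to mimic the
# original stack. Heuristics and module names are illustrative only.
import copy
import torch
import torch.nn as nn

class ToyBlock(nn.Module):
    """A stand-in for one transformer block."""
    def __init__(self, dim):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
        self.norm1, self.norm2 = nn.LayerNorm(dim), nn.LayerNorm(dim)

    def forward(self, x):
        h, _ = self.attn(self.norm1(x), self.norm1(x), self.norm1(x))
        x = x + h
        return x + self.ffn(self.norm2(x))

def block_importance(block, x):
    """Score a block by how much it changes its input (1 - cosine similarity).
    Blocks whose output is nearly identical to their input are cheap to remove."""
    with torch.no_grad():
        y = block(x)
        cos = nn.functional.cosine_similarity(x.flatten(1), y.flatten(1), dim=1).mean()
    return 1.0 - cos.item()

dim, n_blocks = 64, 8
blocks = nn.ModuleList(ToyBlock(dim) for _ in range(n_blocks))
calib = torch.randn(16, 32, dim)  # small calibration batch

# Rank blocks on the calibration batch and keep all but the two least important.
x, scores = calib, []
for b in blocks:
    scores.append(block_importance(b, x))
    x = b(x)
keep = sorted(sorted(range(n_blocks), key=lambda i: scores[i], reverse=True)[:n_blocks - 2])
pruned = nn.Sequential(*[copy.deepcopy(blocks[i]) for i in keep])

# Recovery step: fine-tune the pruned stack to match the full stack's outputs.
full = nn.Sequential(*blocks)
with torch.no_grad():
    target = full(calib)
opt = torch.optim.AdamW(pruned.parameters(), lr=1e-4)
for _ in range(10):
    loss = nn.functional.mse_loss(pruned(calib), target)
    opt.zero_grad(); loss.backward(); opt.step()
```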
no code implementations • 19 Feb 2024 • Yu Zhang, Hui-Ling Zhen, Zehua Pei, Yingzhao Lian, Lihao Yin, Mingxuan Yuan, Bei Yu
In this paper, we propose a novel differential logic layer-aided language modeling (DiLA) approach, in which logical constraints are integrated into the forward and backward passes of a network layer, providing another option for LLM tool learning.
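As a generic illustration of how logical constraints can participate in both the forward and backward passes of a layer (this is not the DiLA layer itself), the sketch below relaxes CNF clauses with a product t-norm so that constraint violations produce a differentiable penalty; the `SoftCNFLayer` class and the toy formula are assumptions.

```python
# Generic differentiable logic-constraint layer: soft variable assignments in
# [0, 1] are scored against CNF clauses with a fuzzy (product) relaxation, so
# violated clauses yield gradients in the backward pass.
import torch
import torch.nn as nn

class SoftCNFLayer(nn.Module):
    """Clauses are tuples of signed 1-based literals, e.g. (1, -2) means (x1 OR NOT x2).
    Forward returns a differentiable violation penalty over all clauses."""
    def __init__(self, clauses):
        super().__init__()
        self.clauses = clauses

    def forward(self, assignment):
        # assignment: tensor of shape (num_vars,) with values in [0, 1]
        penalty = assignment.new_zeros(())
        for clause in self.clauses:
            # P(clause is false) = product over literals of P(literal is false)
            p_false = assignment.new_ones(())
            for lit in clause:
                p_true = assignment[abs(lit) - 1]
                if lit < 0:
                    p_true = 1.0 - p_true
                p_false = p_false * (1.0 - p_true)
            penalty = penalty + p_false
        return penalty

# Toy usage: search for an assignment satisfying (x1 OR x2) AND (NOT x1 OR x3).
layer = SoftCNFLayer([(1, 2), (-1, 3)])
logits = nn.Parameter(torch.zeros(3))
opt = torch.optim.Adam([logits], lr=0.1)
for _ in range(200):
    loss = layer(torch.sigmoid(logits))
    opt.zero_grad(); loss.backward(); opt.step()
print((torch.sigmoid(logits) > 0.5).int().tolist())  # e.g. a satisfying assignment
```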
no code implementations • 3 Feb 2024 • Zehua Pei, Hui-Ling Zhen, Mingxuan Yuan, Yu Huang, Bei Yu
In this work, we propose a Verilog generation framework, BetterV, which fine-tunes large language models (LLMs) on processed domain-specific datasets and incorporates generative discriminators to guide generation toward particular design demands.
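The entry only names the role of the discriminators; one generic way such guidance can be realized is to rerank sampled candidates with a discriminator, as in the illustrative sketch below. The model paths and the `demand_score` helper are hypothetical placeholders, not artifacts or APIs released with BetterV.

```python
# Hedged sketch of discriminator-guided generation: sample several Verilog
# candidates from a fine-tuned causal LM, score them with a separate
# discriminator, and keep the highest-scoring one.
import torch
from transformers import (AutoModelForCausalLM, AutoModelForSequenceClassification,
                          AutoTokenizer)

gen_path = "path/to/finetuned-verilog-lm"          # hypothetical generator checkpoint
disc_path = "path/to/design-demand-discriminator"  # hypothetical discriminator checkpoint

gen_tok = AutoTokenizer.from_pretrained(gen_path)
gen_lm = AutoModelForCausalLM.from_pretrained(gen_path)
disc_tok = AutoTokenizer.from_pretrained(disc_path)
disc = AutoModelForSequenceClassification.from_pretrained(disc_path)

prompt = "// Implement a 4-bit synchronous counter with active-low reset\nmodule counter("
inputs = gen_tok(prompt, return_tensors="pt")

# Sample a handful of candidate completions from the fine-tuned generator.
with torch.no_grad():
    outputs = gen_lm.generate(**inputs, do_sample=True, num_return_sequences=8,
                              max_new_tokens=256, top_p=0.95)
candidates = [gen_tok.decode(o, skip_special_tokens=True) for o in outputs]

def demand_score(code: str) -> float:
    """Discriminator probability that the code meets the stated design demand."""
    batch = disc_tok(code, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = disc(**batch).logits
    return logits.softmax(-1)[0, 1].item()

# Rerank the candidates and keep the one the discriminator prefers.
best = max(candidates, key=demand_score)
print(best)
```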
no code implementations • 15 Mar 2023 • Guojin Chen, Zehua Pei, Haoyu Yang, Yuzhe Ma, Bei Yu, Martin D. F. Wong
Lithography is fundamental to integrated circuit fabrication, necessitating large computational overhead.
1 code implementation • 29 Sep 2021 • Juncheng Li, Zehua Pei, Wenjie Li, Guangwei Gao, Longguang Wang, Yingqian Wang, Tieyong Zeng
This is an exhaustive survey of single image super-resolution (SISR), which can help researchers better understand the field and inspire further exciting research.