Search Results for author: Qihao Zhu

Found 21 papers, 13 papers with code

DeepSeek-Prover-V1.5: Harnessing Proof Assistant Feedback for Reinforcement Learning and Monte-Carlo Tree Search

1 code implementation • 15 Aug 2024 • Huajian Xin, Z. Z. Ren, Junxiao Song, Zhihong Shao, Wanjia Zhao, Haocheng Wang, Bo Liu, Liyue Zhang, Xuan Lu, Qiushi Du, Wenjun Gao, Qihao Zhu, Dejian Yang, Zhibin Gou, Z. F. Wu, Fuli Luo, Chong Ruan

We introduce DeepSeek-Prover-V1.5, an open-source language model designed for theorem proving in Lean 4, which enhances DeepSeek-Prover-V1 by optimizing both training and inference processes.

Automated Theorem Proving • Language Modelling
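
To make the target setting concrete, the toy Lean 4 example below shows the kind of goal a prover model must close: given the theorem statement, the model generates the tactic proof. The example is generic and not drawn from the paper's benchmarks.

```lean
-- A toy Lean 4 goal: the prover receives the statement and must produce the
-- tactic script; here a single `exact` step using a core library lemma closes it.
theorem add_comm_example (a b : Nat) : a + b = b + a := by
  exact Nat.add_comm a b
```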

DeepSeek-V2: A Strong, Economical, and Efficient Mixture-of-Experts Language Model

4 code implementations • 7 May 2024 • DeepSeek-AI, Aixin Liu, Bei Feng, Bin Wang, Bingxuan Wang, Bo Liu, Chenggang Zhao, Chengqi Dengr, Chong Ruan, Damai Dai, Daya Guo, Dejian Yang, Deli Chen, Dongjie Ji, Erhang Li, Fangyun Lin, Fuli Luo, Guangbo Hao, Guanting Chen, Guowei Li, H. Zhang, Hanwei Xu, Hao Yang, Haowei Zhang, Honghui Ding, Huajian Xin, Huazuo Gao, Hui Li, Hui Qu, J. L. Cai, Jian Liang, JianZhong Guo, Jiaqi Ni, Jiashi Li, Jin Chen, Jingyang Yuan, Junjie Qiu, Junxiao Song, Kai Dong, Kaige Gao, Kang Guan, Lean Wang, Lecong Zhang, Lei Xu, Leyi Xia, Liang Zhao, Liyue Zhang, Meng Li, Miaojun Wang, Mingchuan Zhang, Minghua Zhang, Minghui Tang, Mingming Li, Ning Tian, Panpan Huang, Peiyi Wang, Peng Zhang, Qihao Zhu, Qinyu Chen, Qiushi Du, R. J. Chen, R. L. Jin, Ruiqi Ge, Ruizhe Pan, Runxin Xu, Ruyi Chen, S. S. Li, Shanghao Lu, Shangyan Zhou, Shanhuang Chen, Shaoqing Wu, Shengfeng Ye, Shirong Ma, Shiyu Wang, Shuang Zhou, Shuiping Yu, Shunfeng Zhou, Size Zheng, T. Wang, Tian Pei, Tian Yuan, Tianyu Sun, W. L. Xiao, Wangding Zeng, Wei An, Wen Liu, Wenfeng Liang, Wenjun Gao, Wentao Zhang, X. Q. Li, Xiangyue Jin, Xianzu Wang, Xiao Bi, Xiaodong Liu, Xiaohan Wang, Xiaojin Shen, Xiaokang Chen, Xiaosha Chen, Xiaotao Nie, Xiaowen Sun, Xiaoxiang Wang, Xin Liu, Xin Xie, Xingkai Yu, Xinnan Song, Xinyi Zhou, Xinyu Yang, Xuan Lu, Xuecheng Su, Y. Wu, Y. K. Li, Y. X. Wei, Y. X. Zhu, Yanhong Xu, Yanping Huang, Yao Li, Yao Zhao, Yaofeng Sun, Yaohui Li, Yaohui Wang, Yi Zheng, Yichao Zhang, Yiliang Xiong, Yilong Zhao, Ying He, Ying Tang, Yishi Piao, Yixin Dong, Yixuan Tan, Yiyuan Liu, Yongji Wang, Yongqiang Guo, Yuchen Zhu, Yuduan Wang, Yuheng Zou, Yukun Zha, Yunxian Ma, Yuting Yan, Yuxiang You, Yuxuan Liu, Z. Z. Ren, Zehui Ren, Zhangli Sha, Zhe Fu, Zhen Huang, Zhen Zhang, Zhenda Xie, Zhewen Hao, Zhihong Shao, Zhiniu Wen, Zhipeng Xu, Zhongyu Zhang, Zhuoshu Li, Zihan Wang, Zihui Gu, Zilin Li, Ziwei Xie

MLA guarantees efficient inference through significantly compressing the Key-Value (KV) cache into a latent vector, while DeepSeekMoE enables training strong models at an economical cost through sparse computation.

Language Modelling • Reinforcement Learning (RL)
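
As a rough illustration of the latent KV-cache idea mentioned above, this numpy sketch caches only a small down-projected latent per token and re-expands keys and values from it at attention time. The dimensions, the projection names (W_down, W_up_k, W_up_v), and the omission of positional-encoding handling are simplifying assumptions, not the paper's architecture.

```python
# Minimal sketch (assumed dimensions) of latent KV compression: cache a small
# latent per token instead of full per-head keys/values, and re-expand on use.
import numpy as np

d_model, d_latent, n_heads, d_head = 512, 64, 8, 64
rng = np.random.default_rng(0)

W_down = rng.standard_normal((d_model, d_latent)) * 0.02             # compress
W_up_k = rng.standard_normal((d_latent, n_heads * d_head)) * 0.02    # expand keys
W_up_v = rng.standard_normal((d_latent, n_heads * d_head)) * 0.02    # expand values

def cache_token(h, kv_cache):
    """Store only the down-projected latent of a new token's hidden state h."""
    kv_cache.append(h @ W_down)                      # shape (d_latent,)
    return kv_cache

def expand_cache(kv_cache):
    """Reconstruct per-head keys/values from the latent cache at attention time."""
    latents = np.stack(kv_cache)                     # (seq_len, d_latent)
    k = (latents @ W_up_k).reshape(len(kv_cache), n_heads, d_head)
    v = (latents @ W_up_v).reshape(len(kv_cache), n_heads, d_head)
    return k, v

cache = []
for _ in range(4):                                   # pretend 4 tokens arrive
    cache = cache_token(rng.standard_normal(d_model), cache)
k, v = expand_cache(cache)
print(k.shape, v.shape)   # cached floats per token: 64 vs. 2 * 8 * 64 for full K/V
```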

Reading Users' Minds from What They Say: An Investigation into LLM-based Empathic Mental Inference

no code implementations • 20 Mar 2024 • Qihao Zhu, Leah Chong, Maria Yang, Jianxi Luo

In human-centered design, developing a comprehensive and in-depth understanding of user experiences, i.e., empathic understanding, is paramount for designing products that truly meet human needs.

Prompt Engineering

DeepSeekMath: Pushing the Limits of Mathematical Reasoning in Open Language Models

2 code implementations • 5 Feb 2024 • Zhihong Shao, Peiyi Wang, Qihao Zhu, Runxin Xu, Junxiao Song, Xiao Bi, Haowei Zhang, Mingchuan Zhang, Y. K. Li, Y. Wu, Daya Guo

Mathematical reasoning poses a significant challenge for language models due to its complex and structured nature.

Ranked #26 on Math Word Problem Solving on MATH (using extra training data)

Arithmetic Reasoning • Math +1

Toward Artificial Empathy for Human-Centered Design: A Framework

no code implementations • 19 Mar 2023 • Qihao Zhu, Jianxi Luo

Specifically, we conduct an interdisciplinary investigation of research areas such as data-driven user studies, empathic understanding development, and artificial empathy.

Biologically Inspired Design Concept Generation Using Generative Pre-Trained Transformers

no code implementations • 26 Dec 2022 • Qihao Zhu, Xinyu Zhang, Jianxi Luo

This paper proposes a generative design approach based on a generative pre-trained language model (PLM) to automatically retrieve and map biological analogies and generate biologically inspired design (BID) concepts in natural language.

Language Modelling
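
A hypothetical sketch of the generation step, using an off-the-shelf GPT-2 through the Hugging Face transformers pipeline rather than the paper's own fine-tuned PLM; the prompt format and the gecko analogy are invented for illustration.

```python
# Hypothetical prompt-and-generate step for biologically inspired design (BID);
# the model, prompt format, and analogy are illustrative, not the paper's setup.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = ("Biological analogy: gecko feet adhere to smooth walls via "
          "microscopic setae.\n"
          "Design concept: a reusable climbing ")
outputs = generator(prompt, max_new_tokens=40, num_return_sequences=3,
                    do_sample=True, temperature=0.9)
for out in outputs:
    print(out["generated_text"])
```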

Generative Transformers for Design Concept Generation

no code implementations • 7 Nov 2022 • Qihao Zhu, Jianxi Luo

Generating novel and useful concepts is essential during the early design stage to explore a large variety of design opportunities, which usually requires advanced design thinking ability and a wide range of knowledge from designers.

Retrieval • Text Generation

Generative Pre-Trained Transformers for Biologically Inspired Design

no code implementations • 31 Mar 2022 • Qihao Zhu, Xinyu Zhang, Jianxi Luo

Biological systems in nature have evolved for millions of years to adapt and survive the environment.

Language Modelling

Generative Design Ideation: A Natural Language Generation Approach

no code implementations • 28 Mar 2022 • Qihao Zhu, Jianxi Luo

This paper aims to explore a generative approach for knowledge-based design ideation by applying the latest pre-trained language models in artificial intelligence (AI).

Text Generation

Generative Pre-Trained Transformer for Design Concept Generation: An Exploration

no code implementations • 16 Nov 2021 • Qihao Zhu, Jianxi Luo

Novel concepts are essential for design innovation and can be generated with the aid of data stimuli and computers.

Novel Concepts

Lyra: A Benchmark for Turducken-Style Code Generation

1 code implementation • 27 Aug 2021 • Qingyuan Liang, Zeyu Sun, Qihao Zhu, Wenjie Zhang, Lian Yu, Yingfei Xiong, Lu Zhang

Since a declarative language is typically embedded in an imperative language (i.e., turducken-style programming) in real-world software development, promising results on declarative languages alone can hardly lead to a significant reduction in manual software development effort.

Code Generation
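
For readers unfamiliar with the term, "turducken-style" code embeds a declarative language inside an imperative host, e.g. SQL inside Python as in the sketch below; the function and table schema are invented for illustration.

```python
# Turducken-style code: a declarative language (SQL) nested inside an
# imperative host (Python). A generator must get both layers right at once.
import sqlite3

def top_customers(db_path, min_total):
    conn = sqlite3.connect(db_path)
    try:
        # Declarative layer: the embedded SQL query.
        rows = conn.execute(
            "SELECT name, SUM(amount) AS total FROM orders "
            "GROUP BY name HAVING SUM(amount) >= ? ORDER BY total DESC",
            (min_total,),
        ).fetchall()
        # Imperative layer: ordinary Python control flow and post-processing.
        return [name for name, _total in rows]
    finally:
        conn.close()
```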

A Syntax-Guided Edit Decoder for Neural Program Repair

1 code implementation • 15 Jun 2021 • Qihao Zhu, Zeyu Sun, Yuan-an Xiao, Wenjie Zhang, Kang Yuan, Yingfei Xiong, Lu Zhang

Our results show that Recoder repairs 53 bugs on Defects4J v1.2, a 21.4% improvement over the previous state-of-the-art approach for single-hunk bugs (TBar).

Code Completion • Code Generation +2

Generalized Equivariance and Preferential Labeling for GNN Node Classification

1 code implementation • 23 Feb 2021 • Zeyu Sun, Wenjie Zhang, Lili Mou, Qihao Zhu, Yingfei Xiong, Lu Zhang

Existing graph neural networks (GNNs) largely rely on node embeddings, which represent a node as a vector by its identity, type, or content.

General Classification • Graph Classification +1
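
A minimal numpy sketch of the identity-based node embeddings the abstract refers to, feeding one ReLU(AXW) message-passing step; the graph, dimensions, and layer form are illustrative assumptions, and the comments point at the equivariance issue the paper studies.

```python
# One message-passing step on identity-based node embeddings (toy sizes).
import numpy as np

rng = np.random.default_rng(0)
n_nodes, d = 4, 8
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 1, 1, 0]], dtype=float)

embed = rng.standard_normal((n_nodes, d))   # one learned vector per node ID
W = rng.standard_normal((d, d)) * 0.1

h = np.maximum(adj @ embed @ W, 0.0)        # ReLU(A X W)
print(h.shape)                              # (4, 8)

# Relabeling the nodes permutes the rows/columns of `adj` but not the rows of
# `embed` (each ID keeps its vector), so identity embeddings are not
# equivariant to node relabeling -- the issue this paper addresses.
```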

OCoR: An Overlapping-Aware Code Retriever

2 code implementations • 12 Aug 2020 • Qihao Zhu, Zeyu Sun, Xiran Liang, Yingfei Xiong, Lu Zhang

To address these problems, we propose a novel neural architecture named OCoR, where we introduce two specifically-designed components to capture overlaps: the first embeds identifiers by character to capture the overlaps between identifiers, and the second introduces a novel overlap matrix to represent the degrees of overlaps between each natural language word and each identifier.

Retrieval
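
A small sketch of the word-identifier overlap matrix idea; the character 3-gram Jaccard measure is an assumed stand-in, not the paper's exact definition of overlap.

```python
# Overlap matrix between natural-language query words and code identifiers,
# scored here with character 3-gram Jaccard (an illustrative stand-in).
def char_overlap(word, identifier, n=3):
    grams = lambda s: {s[i:i + n] for i in range(max(len(s) - n + 1, 1))}
    a, b = grams(word.lower()), grams(identifier.lower())
    return len(a & b) / len(a | b)

query = ["sort", "list", "descending"]
identifiers = ["sortList", "reverse_order", "items"]

overlap_matrix = [[char_overlap(w, ident) for ident in identifiers]
                  for w in query]
for w, row in zip(query, overlap_matrix):
    print(w, [round(x, 2) for x in row])
```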

NLocalSAT: Boosting Local Search with Solution Prediction

1 code implementation • 26 Jan 2020 • Wenjie Zhang, Zeyu Sun, Qihao Zhu, Ge Li, Shaowei Cai, Yingfei Xiong, Lu Zhang

However, in this method the initial assignment is chosen at random, which limits the effectiveness of stochastic local search (SLS) solvers.
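
The toy WalkSAT-style loop below makes the role of the initial assignment explicit; NLocalSAT's proposal is to replace the random init with one predicted by a neural network. The solver is deliberately simplified (pure random walk, no greedy flips) and is not the paper's implementation.

```python
# Toy stochastic local search where the starting assignment is a parameter:
# swapping the random `init` for a neural prediction is the NLocalSAT idea.
import random

def sls_solve(clauses, init, max_flips=10_000, seed=0):
    rng = random.Random(seed)
    assign = list(init)                        # start from the given assignment
    for _ in range(max_flips):
        unsat = [c for c in clauses
                 if not any((lit > 0) == assign[abs(lit) - 1] for lit in c)]
        if not unsat:
            return assign                      # all clauses satisfied
        clause = rng.choice(unsat)
        lit = rng.choice(clause)               # pure random walk (no greedy step)
        assign[abs(lit) - 1] = not assign[abs(lit) - 1]
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
clauses = [(1, 2), (-1, 3), (-2, -3)]
rng_init = random.Random(1)
random_init = [rng_init.random() < 0.5 for _ in range(3)]
print(sls_solve(clauses, random_init))
```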

TreeGen: A Tree-Based Transformer Architecture for Code Generation

2 code implementations • 22 Nov 2019 • Zeyu Sun, Qihao Zhu, Yingfei Xiong, Yican Sun, Lili Mou, Lu Zhang

TreeGen outperformed the previous state-of-the-art approach by 4.5 percentage points on HearthStone, and achieved the best accuracy among neural network-based approaches on ATIS (89.1%) and GEO (89.6%).

Code Generation • Semantic Parsing

A Grammar-Based Structural CNN Decoder for Code Generation

1 code implementation • 14 Nov 2018 • Zeyu Sun, Qihao Zhu, Lili Mou, Yingfei Xiong, Ge Li, Lu Zhang

In this paper, we propose a grammar-based structural convolutional neural network (CNN) for code generation.

Code Generation • Decoder +2
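
To illustrate the grammar-based view of code that such decoders predict, the sketch below uses Python's ast module to list a snippet's AST expansions as production rules; the paper's actual grammar and rule encoding differ.

```python
# View a program as a sequence of grammar productions (AST node expansions)
# rather than a flat token stream -- the form grammar-based decoders emit.
import ast

def productions(node):
    """Yield 'Parent -> child1 child2 ...' for each AST expansion."""
    children = [type(c).__name__ for c in ast.iter_child_nodes(node)]
    if children:
        yield f"{type(node).__name__} -> {' '.join(children)}"
    for child in ast.iter_child_nodes(node):
        yield from productions(child)

for rule in productions(ast.parse("total = price * (1 + tax_rate)")):
    print(rule)
# Module -> Assign
# Assign -> Name BinOp
# ...
```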
