Search Results for author: Shizhao Sun

Found 13 papers, 2 papers with code

Text-to-CAD Generation Through Infusing Visual Feedback in Large Language Models

no code implementations • 31 Jan 2025 • Ruiyu Wang, Yu Yuan, Shizhao Sun, Jiang Bian

In this work, we introduce CADFusion, a framework that uses Large Language Models (LLMs) as the backbone and alternates between two training stages: the sequential learning (SL) stage and the visual feedback (VF) stage.
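
A minimal sketch of the alternating two-stage schedule described above; the stage functions are placeholders for illustration only, not the paper's training code.

```python
# Sketch of an alternating SL/VF training schedule (assumed structure, not the authors' code).

def sequential_learning_step(model, batch):
    # Placeholder: supervised next-token training on ground-truth CAD construction sequences.
    pass

def visual_feedback_step(model, batch):
    # Placeholder: render the generated CAD model and turn a visual score into a training signal.
    pass

def train(model, sl_batches, vf_batches, rounds=3):
    """Alternate between the sequential learning (SL) stage and the visual feedback (VF) stage."""
    for _ in range(rounds):
        for batch in sl_batches:   # SL stage
            sequential_learning_step(model, batch)
        for batch in vf_batches:   # VF stage
            visual_feedback_step(model, batch)
    return model
```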

From Elements to Design: A Layered Approach for Automatic Graphic Design Composition

no code implementations • 27 Dec 2024 • Jiawei Lin, Shizhao Sun, Danqing Huang, Ting Liu, Ji Li, Jiang Bian

Based on the planning results, it subsequently predicts element attributes that control the design composition in a layer-wise manner, and includes the rendered image of previously generated layers into the context.
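
The layer-wise loop with rendered context can be pictured with the short sketch below; the renderer, the predictor, and the layer names are all stand-ins for illustration.

```python
# Sketch of layer-wise composition that feeds the rendering of previous layers back in
# (all interfaces here are hypothetical placeholders).

def render(layers):
    # Placeholder renderer: would rasterize the layers generated so far.
    return {"rendered_layers": [layer["name"] for layer in layers]}

def predict_layer(plan_item, context_image):
    # Placeholder predictor: would return element attributes for this layer,
    # conditioned on the plan and on the rendering of previously generated layers.
    return {"name": plan_item, "attributes": {}}

def compose(plan):
    layers = []
    for item in plan:                    # layer order comes from the planning step
        context_image = render(layers)   # rendered image of previously generated layers
        layers.append(predict_layer(item, context_image))
    return layers

print(compose(["background", "image", "text"]))
```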

AnalogXpert: Automating Analog Topology Synthesis by Incorporating Circuit Design Expertise into Large Language Models

no code implementations • 17 Dec 2024 • Haoyi Zhang, Shizhao Sun, Yibo Lin, Runsheng Wang, Jiang Bian

Third, we introduce a proofreading strategy that allows LLMs to incrementally correct the errors in the initial design, akin to human designers who iteratively check and adjust the initial topology design to ensure accuracy.

Code Generation
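
A rough sketch of such a proofreading loop, with placeholder checker and LLM calls (the specification string and all function names are illustrative, not from the paper):

```python
# Sketch of draft-then-proofread topology synthesis (hypothetical interfaces).

def llm_propose(spec):
    # Placeholder: an LLM drafts an initial circuit topology from the specification.
    return {"spec": spec, "netlist": ["draft"]}

def check(design):
    # Placeholder rule checker: would return a list of detected errors.
    return []

def llm_fix(design, errors):
    # Placeholder: the LLM revises the design given the detected errors.
    return design

def proofread(spec, max_rounds=5):
    """Draft a topology, then repeatedly check it and let the LLM correct the errors found."""
    design = llm_propose(spec)
    for _ in range(max_rounds):
        errors = check(design)
        if not errors:
            break
        design = llm_fix(design, errors)
    return design

print(proofread("two-stage amplifier"))
```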

FlexCAD: Unified and Versatile Controllable CAD Generation with Fine-tuned Large Language Models

no code implementations • 5 Nov 2024 • Zhanwei Zhang, Shizhao Sun, Wenxiao Wang, Deng Cai, Jiang Bian

First, to enhance comprehension by LLMs, we represent a CAD model as a structured text by abstracting each hierarchy as a sequence of text tokens.
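
One way to picture "each hierarchy as a sequence of text tokens" is the toy serializer below; the field names and tag format are illustrative, not FlexCAD's actual tokenization.

```python
# Toy serializer: flatten a hierarchical CAD description into structured text.

cad_model = {
    "sketch": {"loop": [{"line": (0, 0, 4, 0)}, {"line": (4, 0, 4, 2)}, {"line": (4, 2, 0, 0)}]},
    "extrusion": {"distance": 3, "operation": "new_body"},
}

def to_text(node, name="cad"):
    """Recursively abstract each hierarchy level as a sequence of text tokens."""
    if isinstance(node, dict):
        inner = " ".join(to_text(value, key) for key, value in node.items())
        return f"<{name}> {inner} </{name}>"
    if isinstance(node, list):
        return " ".join(to_text(item, name) for item in node)
    return f"{name}: {node}"

print(to_text(cad_model))
```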

Collaborative Evolving Strategy for Automatic Data-Centric Development

no code implementations • 26 Jul 2024 • Xu Yang, Haotian Chen, Wenjun Feng, Haoxue Wang, Zeqi Ye, Xinjie Shen, Xiao Yang, Shizhao Sun, Weiqing Liu, Jiang Bian

By leveraging the strong complex problem-solving capabilities of large language models (LLMs), we propose an LLM-based autonomous agent, equipped with a strategy named Collaborative Knowledge-STudying-Enhanced Evolution by Retrieval (Co-STEER), to simultaneously address all the challenges.

Scheduling

LayoutPrompter: Awaken the Design Ability of Large Language Models

1 code implementation • NeurIPS 2023 • Jiawei Lin, Jiaqi Guo, Shizhao Sun, Zijiang James Yang, Jian-Guang Lou, Dongmei Zhang

In this work, we propose LayoutPrompter, which leverages large language models (LLMs) to address the above problems through in-context learning.

In-Context Learning
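
A minimal sketch of the in-context setup: a few constraint/layout exemplars are serialized into a prompt, and the model completes the layout for a new constraint (the serialization format here is illustrative, not LayoutPrompter's exact prompt).

```python
# Sketch of building a few-shot prompt for layout generation (illustrative format).

def serialize(layout):
    return " | ".join(f"{e['type']} {e['x']} {e['y']} {e['w']} {e['h']}" for e in layout)

def build_prompt(exemplars, query_constraint):
    """Prepend input/output exemplars, then the new constraint, and let the LLM complete the layout."""
    parts = [f"Constraint: {c}\nLayout: {serialize(lay)}" for c, lay in exemplars]
    parts.append(f"Constraint: {query_constraint}\nLayout:")
    return "\n\n".join(parts)

exemplars = [
    ("a title above a single image",
     [{"type": "title", "x": 10, "y": 5, "w": 80, "h": 10},
      {"type": "image", "x": 10, "y": 20, "w": 80, "h": 60}]),
]
print(build_prompt(exemplars, "a title, an image, and a caption"))
```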

A Parse-Then-Place Approach for Generating Graphic Layouts from Textual Descriptions

no code implementations • ICCV 2023 • Jiawei Lin, Jiaqi Guo, Shizhao Sun, Weijiang Xu, Ting Liu, Jian-Guang Lou, Dongmei Zhang

To model combined and incomplete constraints, we use a Transformer-based layout generation model and carefully design a way to represent constraints and layouts as sequences.
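
The sequence view of combined and incomplete constraints can be sketched as below; the fixed slot layout and the placeholder token are illustrative choices, not the paper's exact representation.

```python
# Sketch: incomplete constraints as fixed-length token slots with a placeholder for
# attributes the textual description does not specify.

UNK = "<unk>"

def constraint_to_tokens(elements):
    """Each element becomes a slot of five tokens: type x y w h."""
    tokens = []
    for e in elements:
        tokens += [e.get("type", UNK)] + [str(e.get(k, UNK)) for k in ("x", "y", "w", "h")]
    return tokens

# "a logo at the top and some text": types are known, geometry is mostly unspecified.
parsed = [{"type": "logo", "y": 0}, {"type": "text"}]
print(constraint_to_tokens(parsed))
# ['logo', '<unk>', '0', '<unk>', '<unk>', 'text', '<unk>', '<unk>', '<unk>', '<unk>']
```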

LayoutDiffusion: Improving Graphic Layout Generation by Discrete Diffusion Probabilistic Models

1 code implementation • ICCV 2023 • Junyi Zhang, Jiaqi Guo, Shizhao Sun, Jian-Guang Lou, Dongmei Zhang

To tackle the challenge, we summarize three critical factors for achieving a mild forward process for the layout, i.e., legality, coordinate proximity and type disruption.

Layout Design
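
A very loose sketch of a corruption step shaped by those three factors (tokens stay legal, coordinates only jump to nearby bins, types are disrupted with small probability); the paper's actual discrete transition matrices are not reproduced here.

```python
# Hypothetical "mild" discrete corruption step for one layout element.
import random

NUM_BINS = 32
TYPES = ["text", "image", "button"]

def corrupt_step(element, p_type=0.05, max_shift=1):
    e = dict(element)
    # Coordinate proximity: jitter each coordinate to a nearby bin.
    for k in ("x", "y", "w", "h"):
        e[k] = min(max(e[k] + random.randint(-max_shift, max_shift), 0), NUM_BINS - 1)  # legality
    # Type disruption: occasionally replace the type token.
    if random.random() < p_type:
        e["type"] = random.choice(TYPES)
    return e

print(corrupt_step({"type": "text", "x": 4, "y": 7, "w": 10, "h": 3}))
```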

LayoutFormer++: Conditional Graphic Layout Generation via Constraint Serialization and Decoding Space Restriction

no code implementations • CVPR 2023 • Zhaoyun Jiang, Jiaqi Guo, Shizhao Sun, Huayu Deng, Zhongkai Wu, Vuksan Mijovic, Zijiang James Yang, Jian-Guang Lou, Dongmei Zhang

First, to flexibly handle diverse constraints, we propose a constraint serialization scheme, which represents different user constraints as sequences of tokens with a predefined format.

Decoder
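
The constraint serialization idea can be illustrated with the toy scheme below, which renders a few kinds of user constraints into one token sequence with a predefined format (the keywords and separators are invented for this sketch).

```python
# Toy constraint serialization: different constraint kinds share one flat token format.

def serialize_constraints(types=None, sizes=None, relations=None):
    tokens = []
    for t in types or []:                        # required element types
        tokens += ["type", t, "|"]
    for name, (w, h) in (sizes or {}).items():   # fixed sizes for some elements
        tokens += ["size", name, str(w), str(h), "|"]
    for a, rel, b in relations or []:            # pairwise placement relations
        tokens += ["rel", a, rel, b, "|"]
    return " ".join(tokens)

print(serialize_constraints(
    types=["title", "image", "button"],
    sizes={"button": (20, 8)},
    relations=[("title", "above", "image")],
))
# type title | type image | type button | size button 20 8 | rel title above image |
```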

Data-Anonymous Encoding for Text-to-SQL Generation

no code implementations • IJCNLP 2019 • Zhen Dong, Shizhao Sun, Hongzhi Liu, Jian-Guang Lou, Dongmei Zhang

In text-to-SQL generation, the input utterance usually contains many tokens that are related to column names or cells in the table, called "table-related tokens".

Text-To-SQL
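
A simplified sketch of anonymizing table-related tokens before encoding; the exact-match lookup below is only a stand-in for the paper's detection step.

```python
# Sketch: replace tokens matching column names or cell values with placeholder symbols.

def anonymize(utterance, columns, cells):
    out = []
    for tok in utterance.lower().split():
        if tok in columns:
            out.append(f"<col:{columns.index(tok)}>")
        elif tok in cells:
            out.append(f"<cell:{cells.index(tok)}>")
        else:
            out.append(tok)
    return out

columns = ["name", "country", "population"]
cells = ["france", "germany"]
print(anonymize("what is the population of france", columns, cells))
# ['what', 'is', 'the', '<col:2>', 'of', '<cell:0>']
```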

Slim-DP: A Light Communication Data Parallelism for DNN

no code implementations • 27 Sep 2017 • Shizhao Sun, Wei Chen, Jiang Bian, Xiaoguang Liu, Tie-Yan Liu

However, with the increasing size of DNN models and the large number of workers in practice, this typical data parallelism cannot achieve satisfactory training acceleration, since it usually suffers from heavy communication cost caused by transferring a huge amount of information between the workers and the parameter server.

Ensemble-Compression: A New Method for Parallel Training of Deep Neural Networks

no code implementations • 2 Jun 2016 • Shizhao Sun, Wei Chen, Jiang Bian, Xiaoguang Liu, Tie-Yan Liu

In this framework, we propose to aggregate the local models by ensemble, i.e., averaging the outputs of local models instead of the parameters.

Model Compression
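
A toy comparison of the two aggregation choices, output averaging (the ensemble view) versus parameter averaging, using a small non-linear model; it only illustrates that the two generally differ and says nothing about the paper's compression step.

```python
# Output averaging vs. parameter averaging for two local "models" (toy example).
import numpy as np

def predict(w, x):
    return np.tanh(x @ w)   # a tiny non-linear model

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))
w1, w2 = rng.normal(size=(3, 2)), rng.normal(size=(3, 2))

param_avg_pred = predict((w1 + w2) / 2, x)                # average the parameters, then predict
output_avg_pred = (predict(w1, x) + predict(w2, x)) / 2   # ensemble: average the outputs

print(np.abs(param_avg_pred - output_avg_pred).max())     # non-zero gap: the two aggregations differ
```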

On the Depth of Deep Neural Networks: A Theoretical View

no code implementations • 17 Jun 2015 • Shizhao Sun, Wei Chen, Li-Wei Wang, Xiaoguang Liu, Tie-Yan Liu

First, we derive an upper bound for the Rademacher average (RA) of DNNs, and show that it increases with increasing depth.
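
For context, the Rademacher average in question is the standard empirical quantity below (only the definition; the paper's depth-dependent bound is not reproduced here):

```latex
% Empirical Rademacher average of a function class \mathcal{F} on a sample S = (x_1, \dots, x_n),
% with i.i.d. uniform signs \sigma_i \in \{-1, +1\}.
\hat{\mathcal{R}}_S(\mathcal{F})
  = \mathbb{E}_{\sigma}\!\left[\,\sup_{f \in \mathcal{F}}
      \frac{1}{n}\sum_{i=1}^{n} \sigma_i\, f(x_i)\right]
```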
