Search Results for author: Xingchao Liu

Found 34 papers, 20 papers with code

DeepSeek-V3 Technical Report

1 code implementation27 Dec 2024 DeepSeek-AI, Aixin Liu, Bei Feng, Bing Xue, Bingxuan Wang, Bochao Wu, Chengda Lu, Chenggang Zhao, Chengqi Deng, Chenyu Zhang, Chong Ruan, Damai Dai, Daya Guo, Dejian Yang, Deli Chen, Dongjie Ji, Erhang Li, Fangyun Lin, Fucong Dai, Fuli Luo, Guangbo Hao, Guanting Chen, Guowei Li, H. Zhang, Han Bao, Hanwei Xu, Haocheng Wang, Haowei Zhang, Honghui Ding, Huajian Xin, Huazuo Gao, Hui Li, Hui Qu, J. L. Cai, Jian Liang, JianZhong Guo, Jiaqi Ni, Jiashi Li, Jiawei Wang, Jin Chen, Jingchang Chen, Jingyang Yuan, Junjie Qiu, Junlong Li, Junxiao Song, Kai Dong, Kai Hu, Kaige Gao, Kang Guan, Kexin Huang, Kuai Yu, Lean Wang, Lecong Zhang, Lei Xu, Leyi Xia, Liang Zhao, Litong Wang, Liyue Zhang, Meng Li, Miaojun Wang, Mingchuan Zhang, Minghua Zhang, Minghui Tang, Mingming Li, Ning Tian, Panpan Huang, Peiyi Wang, Peng Zhang, Qiancheng Wang, Qihao Zhu, Qinyu Chen, Qiushi Du, R. J. Chen, R. L. Jin, Ruiqi Ge, Ruisong Zhang, Ruizhe Pan, Runji Wang, Runxin Xu, Ruoyu Zhang, Ruyi Chen, S. S. Li, Shanghao Lu, Shangyan Zhou, Shanhuang Chen, Shaoqing Wu, Shengfeng Ye, Shirong Ma, Shiyu Wang, Shuang Zhou, Shuiping Yu, Shunfeng Zhou, Shuting Pan, T. Wang, Tao Yun, Tian Pei, Tianyu Sun, W. L. Xiao, Wangding Zeng, Wanjia Zhao, Wei An, Wen Liu, Wenfeng Liang, Wenjun Gao, Wenqin Yu, Wentao Zhang, X. Q. Li, Xiangyue Jin, Xianzu Wang, Xiao Bi, Xiaodong Liu, Xiaohan Wang, Xiaojin Shen, Xiaokang Chen, Xiaokang Zhang, Xiaosha Chen, Xiaotao Nie, Xiaowen Sun, Xiaoxiang Wang, Xin Cheng, Xin Liu, Xin Xie, Xingchao Liu, Xingkai Yu, Xinnan Song, Xinxia Shan, Xinyi Zhou, Xinyu Yang, Xinyuan Li, Xuecheng Su, Xuheng Lin, Y. K. Li, Y. Q. Wang, Y. X. Wei, Y. X. Zhu, Yang Zhang, Yanhong Xu, Yanping Huang, Yao Li, Yao Zhao, Yaofeng Sun, Yaohui Li, Yaohui Wang, Yi Yu, Yi Zheng, Yichao Zhang, Yifan Shi, Yiliang Xiong, Ying He, Ying Tang, Yishi Piao, Yisong Wang, Yixuan Tan, Yiyang Ma, Yiyuan Liu, Yongqiang Guo, Yu Wu, Yuan Ou, Yuchen Zhu, Yuduan Wang, Yue Gong, Yuheng Zou, Yujia He, Yukun Zha, Yunfan Xiong, Yunxian Ma, Yuting Yan, Yuxiang Luo, Yuxiang You, Yuxuan Liu, Yuyang Zhou, Z. F. Wu, Z. Z. Ren, Zehui Ren, Zhangli Sha, Zhe Fu, Zhean Xu, Zhen Huang, Zhen Zhang, Zhenda Xie, Zhengyan Zhang, Zhewen Hao, Zhibin Gou, Zhicheng Ma, Zhigang Yan, Zhihong Shao, Zhipeng Xu, Zhiyu Wu, Zhongyu Zhang, Zhuoshu Li, Zihui Gu, Zijia Zhu, Zijun Liu, Zilin Li, Ziwei Xie, Ziyang Song, Ziyi Gao, Zizheng Pan

We present DeepSeek-V3, a strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.
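
A minimal sketch of how such a sparse Mixture-of-Experts layer keeps most parameters dormant per token: a router selects the top-k experts and only those run, so the activated parameter count is a small fraction of the total. This is an illustrative toy, not DeepSeek-V3's implementation; all module names and sizes below are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Toy top-k MoE layer: all experts contribute to the *total* parameter
    count, but each token only *activates* the k experts its router picks."""

    def __init__(self, dim=64, num_experts=8, k=2):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        ])
        self.k = k

    def forward(self, x):                          # x: (num_tokens, dim)
        scores = self.router(x)                    # (num_tokens, num_experts)
        topk = scores.topk(self.k, dim=-1)
        weights = F.softmax(topk.values, dim=-1)   # mixing weights for the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.k):                 # run only the selected experts
            idx = topk.indices[:, slot]
            for e in idx.unique():
                mask = idx == e
                out[mask] += weights[mask, slot:slot + 1] * self.experts[int(e)](x[mask])
        return out

layer = SparseMoELayer()
print(layer(torch.randn(16, 64)).shape)            # torch.Size([16, 64])
```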

Language Modeling

JanusFlow: Harmonizing Autoregression and Rectified Flow for Unified Multimodal Understanding and Generation

1 code implementation12 Nov 2024 Yiyang Ma, Xingchao Liu, Xiaokang Chen, Wen Liu, Chengyue Wu, Zhiyu Wu, Zizheng Pan, Zhenda Xie, Haowei Zhang, Xingkai Yu, Liang Zhao, Yisong Wang, Jiaying Liu, Chong Ruan

To further improve the performance of our unified model, we adopt two key strategies: (i) decoupling the understanding and generation encoders, and (ii) aligning their representations during unified training.
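
The second strategy can be pictured with a short, hedged sketch: a cosine-similarity loss that pulls the generation branch's intermediate features toward the (stop-gradient) understanding encoder's features for the same image. The function and the exact loss form are assumptions for illustration, not JanusFlow's code.

```python
import torch
import torch.nn.functional as F

def representation_alignment_loss(understanding_feats, generation_feats):
    """Align generation-side features to the understanding encoder's features.
    Stop-gradient on the understanding side so only the generation branch moves."""
    u = F.normalize(understanding_feats.detach(), dim=-1)
    g = F.normalize(generation_feats, dim=-1)
    return 1.0 - (u * g).sum(dim=-1).mean()   # 0 when the two representations coincide

# hypothetical usage inside the unified training loss:
# loss = understanding_loss + generation_loss + lambda_align * representation_alignment_loss(u_feats, g_feats)
```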

Language Modeling +2

SlimFlow: Training Smaller One-Step Diffusion Models with Rectified Flow

1 code implementation17 Jul 2024 Yuanzhi Zhu, Xingchao Liu, Qiang Liu

The rectified flow framework trains one-step generative models using two operations, reflow and distillation.
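
The two operations can be sketched in a few lines (interfaces such as `teacher_velocity` and `one_step_model` are assumptions, not SlimFlow's code): reflow re-pairs each noise sample with the output the pretrained flow produces from it and retrains on the straightened pairs, and distillation then compresses the nearly-straight flow into a single step.

```python
import torch

@torch.no_grad()
def make_reflow_pair(teacher_velocity, x0, num_steps=100):
    """Reflow data: integrate the teacher ODE from noise x0 to an output x1,
    producing a deterministically coupled (x0, x1) training pair."""
    x, dt = x0.clone(), 1.0 / num_steps
    for i in range(num_steps):
        t = torch.full((x.shape[0],), i * dt, device=x.device)
        x = x + dt * teacher_velocity(x, t)
    return x0, x

def reflow_loss(student_velocity, x0, x1):
    """Retrain on the straight path x_t = (1 - t) x0 + t x1 whose velocity is
    x1 - x0 (2-D batches assumed for brevity)."""
    t = torch.rand(x0.shape[0], device=x0.device)
    xt = (1 - t)[:, None] * x0 + t[:, None] * x1
    return ((student_velocity(xt, t) - (x1 - x0)) ** 2).mean()

def distillation_loss(one_step_model, x0, x1):
    """Distillation: one direct jump from noise should land on the flow's endpoint."""
    return ((one_step_model(x0) - x1) ** 2).mean()
```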

Consistency Flow Matching: Defining Straight Flows with Velocity Consistency

1 code implementation2 Jul 2024 Ling Yang, Zixiang Zhang, Zhilong Zhang, Xingchao Liu, Minkai Xu, Wentao Zhang, Chenlin Meng, Stefano Ermon, Bin Cui

Additionally, we propose a multi-segment training approach for Consistency-FM to enhance expressiveness, achieving a better trade-off between sampling quality and speed.
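
One way to read "velocity consistency" is that, along a single trajectory, the predicted endpoint x_t + (1 - t) v(x_t, t) should agree regardless of the time it is predicted from. The sketch below encodes that reading under my own assumptions (EMA target, squared error, 2-D batches); it is not the paper's exact multi-segment objective.

```python
import torch

def velocity_consistency_loss(v_theta, v_ema, xt, xs, t, s):
    """xt and xs are two points at times t and s on the same trajectory (how
    they are constructed is assumed elsewhere). Their implied endpoints should
    match; the target at time s uses an EMA copy with stop-gradient."""
    f_t = xt + (1 - t)[:, None] * v_theta(xt, t)
    with torch.no_grad():
        f_s = xs + (1 - s)[:, None] * v_ema(xs, s)
    return ((f_t - f_s) ** 2).mean()
```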

Image Generation

PeRFlow: Piecewise Rectified Flow as Universal Plug-and-Play Accelerator

1 code implementation13 May 2024 Hanshu Yan, Xingchao Liu, Jiachun Pan, Jun Hao Liew, Qiang Liu, Jiashi Feng

We present Piecewise Rectified Flow (PeRFlow), a flow-based method for accelerating diffusion models.
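
A hedged sketch of the piecewise idea (the window grid, the teacher interface, and the loss form are my assumptions, not PeRFlow's code): split [0, 1] into a few time windows, let the pretrained teacher produce each window's endpoint, and train the student to cross the window along a straight line.

```python
import torch

def piecewise_straightening_loss(student_velocity, teacher_solver, x_start, t0, t1):
    """Within one window [t0, t1], regress the student's velocity onto the
    constant velocity of the straight segment joining the teacher's start and
    end states (2-D batches assumed)."""
    with torch.no_grad():
        x_end = teacher_solver(x_start, t0, t1)            # assumed teacher ODE solve over the window
    s = torch.rand(x_start.shape[0], device=x_start.device)
    t = t0 + s * (t1 - t0)
    xt = (1 - s)[:, None] * x_start + s[:, None] * x_end   # straight interpolation inside the window
    target = (x_end - x_start) / (t1 - t0)
    return ((student_velocity(xt, t) - target) ** 2).mean()
```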

Language Rectified Flow: Advancing Diffusion Language Generation with Probabilistic Flows

no code implementations25 Mar 2024 Shujian Zhang, Lemeng Wu, Chengyue Gong, Xingchao Liu

Extensive experiments and ablation studies demonstrate that our method can be general, effective, and beneficial for many NLP tasks.

Language Modeling +2

AdaFlow: Imitation Learning with Variance-Adaptive Flow-Based Policies

1 code implementation6 Feb 2024 Xixi Hu, Bo Liu, Xingchao Liu, Qiang Liu

To address this challenge, we propose AdaFlow, an imitation learning framework based on flow-based generative modeling.

Diversity Imitation Learning

InstaFlow: One Step is Enough for High-Quality Diffusion-Based Text-to-Image Generation

2 code implementations12 Sep 2023 Xingchao Liu, Xiwen Zhang, Jianzhu Ma, Jian Peng, Qiang Liu

Leveraging our new pipeline, we create, to the best of our knowledge, the first one-step diffusion-based text-to-image generator with SD-level image quality, achieving an FID (Fréchet Inception Distance) of $23.3$ on MS COCO 2017-5k, surpassing the previous state-of-the-art technique, progressive distillation, by a significant margin ($37.2$ $\rightarrow$ $23.3$ in FID).

Text-to-Image Generation

AutoML-GPT: Automatic Machine Learning with GPT

no code implementations4 May 2023 Shujian Zhang, Chengyue Gong, Lemeng Wu, Xingchao Liu, Mingyuan Zhou

Ultimately, with this prompt paragraph, AutoML-GPT automatically conducts the experiments, from data processing through model architecture and hyperparameter tuning to the predicted training log.

AutoML

FlowGrad: Controlling the Output of Generative ODEs With Gradients

no code implementations CVPR 2023 Xingchao Liu, Lemeng Wu, Shujian Zhang, Chengyue Gong, Wei Ping, Qiang Liu

To further accelerate the computation of the back-propagation, we propose to use a non-uniform discretization to approximate the ODE trajectory, where we measure how straight the trajectory is and gather the straight parts into one discretization step.
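
A rough sketch of that trajectory-merging step, under assumptions (a fine Euler simulation, a relative-deviation threshold, and illustrative interfaces rather than FlowGrad's implementation): simulate on a fine grid, measure how far each local velocity drifts from the chord of the candidate coarse step, and close a step only when the trajectory starts to bend.

```python
import torch

@torch.no_grad()
def straightness_adaptive_grid(velocity, x0, fine_steps=100, tol=1e-2):
    """Return a non-uniform time grid that merges consecutive fine segments
    while the local velocities stay close to the segment's average direction."""
    dt = 1.0 / fine_steps
    xs, vs, x = [x0], [], x0
    for i in range(fine_steps):
        t = torch.full((x.shape[0],), i * dt, device=x.device)
        v = velocity(x, t)
        vs.append(v)
        x = x + dt * v
        xs.append(x)

    grid, seg_start = [0.0], 0
    for i in range(1, fine_steps + 1):
        chord = (xs[i] - xs[seg_start]) / ((i - seg_start) * dt)   # average direction over the candidate step
        deviation = max(float((v - chord).norm() / (chord.norm() + 1e-8)) for v in vs[seg_start:i])
        if deviation > tol:                                        # the trajectory bends: close the step here
            grid.append((i - 1) * dt)
            seg_start = i - 1
    grid.append(1.0)
    return grid     # Euler steps (and back-propagation) are then taken only at these times
```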

Image Manipulation

Fast Point Cloud Generation with Straight Flows

1 code implementation CVPR 2023 Lemeng Wu, Dilin Wang, Chengyue Gong, Xingchao Liu, Yunyang Xiong, Rakesh Ranjan, Raghuraman Krishnamoorthi, Vikas Chandra, Qiang Liu

We perform evaluations on multiple 3D tasks and find that our PSF performs comparably to the standard diffusion model, outperforming other efficient 3D point cloud generation methods.

Point Cloud Completion

Passage-Mask: A Learnable Regularization Strategy for Retriever-Reader Models

no code implementations2 Nov 2022 Shujian Zhang, Chengyue Gong, Xingchao Liu

Experiments on different tasks across open question answering, dialogue conversation, and fact verification show that our method consistently outperforms its baselines.

Answer Generation Fact Verification +2

Neural Volumetric Mesh Generator

no code implementations6 Oct 2022 Yan Zheng, Lemeng Wu, Xingchao Liu, Zhen Chen, Qiang Liu, QiXing Huang

We first propose a diffusion-based generative model to tackle this problem by generating voxelized shapes with close-to-reality outlines and structures.

Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow

6 code implementations7 Sep 2022 Xingchao Liu, Chengyue Gong, Qiang Liu

The idea of rectified flow is to learn the ODE to follow the straight paths connecting the points drawn from $\pi_0$ and $\pi_1$ as much as possible.
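
That objective reduces to a plain regression; the sketch below writes out the standard rectified-flow loss (variable and function names are mine, not the repository's):

```python
import torch

def rectified_flow_loss(velocity_model, x0, x1):
    """Regress the model's velocity onto x1 - x0, the constant velocity of the
    straight path x_t = (1 - t) x0 + t x1 with x0 ~ pi_0 and x1 ~ pi_1."""
    t = torch.rand(x0.shape[0], device=x0.device)
    t = t.view(-1, *([1] * (x0.dim() - 1)))     # broadcast t over the non-batch dimensions
    xt = (1 - t) * x0 + t * x1
    return ((velocity_model(xt, t.flatten()) - (x1 - x0)) ** 2).mean()
```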

Domain Adaptation Image-to-Image Translation +1

Diffusion-based Molecule Generation with Informative Prior Bridges

no code implementations2 Sep 2022 Lemeng Wu, Chengyue Gong, Xingchao Liu, Mao Ye, Qiang Liu

AI-based molecule generation provides a promising approach to a large area of biomedical sciences and engineering, such as antibody design, hydrolase engineering, or vaccine development.

3D Generation Point Cloud Generation

Let us Build Bridges: Understanding and Extending Diffusion Generative Models

no code implementations31 Aug 2022 Xingchao Liu, Lemeng Wu, Mao Ye, Qiang Liu

Diffusion-based generative models have achieved promising results recently, but raise an array of open questions in terms of conceptual understanding, theoretical analysis, algorithm improvement and extensions to discrete, structured, non-Euclidean domains.

Imputation

A Langevin-like Sampler for Discrete Distributions

1 code implementation20 Jun 2022 Ruqi Zhang, Xingchao Liu, Qiang Liu

We propose discrete Langevin proposal (DLP), a simple and scalable gradient-based proposal for sampling complex high-dimensional discrete distributions.
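
For binary variables, a gradient-informed proposal in the spirit of DLP can be sketched as follows (a simplified illustration under my assumptions; the Metropolis-Hastings correction that makes the sampler exact is only noted, and the interface is not the released code's):

```python
import torch

def dlp_style_proposal(log_prob, x, alpha=0.1):
    """Propose coordinate-wise flips of x in {0, 1}^d, biased by the gradient of
    log_prob evaluated at the current state. Pairing this proposal with a
    Metropolis-Hastings accept/reject step yields an exact sampler."""
    x = x.detach().float().requires_grad_(True)
    grad = torch.autograd.grad(log_prob(x).sum(), x)[0]
    x = x.detach()
    flip_logit = 0.5 * grad * (1 - 2 * x) - 1.0 / (2 * alpha)   # flipping vs. keeping each coordinate
    flips = torch.bernoulli(torch.sigmoid(flip_logit))
    return (x + flips) % 2                                      # apply the sampled flips

# hypothetical usage on an Ising-like log-probability with coupling matrix J:
# x = torch.randint(0, 2, (1, 100)).float()
# x = dlp_style_proposal(lambda z: (z @ J * z).sum(-1), x)
```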

Efficient Exploration Text Generation

FuseDream: Training-Free Text-to-Image Generation with Improved CLIP+GAN Space Optimization

1 code implementation2 Dec 2021 Xingchao Liu, Chengyue Gong, Lemeng Wu, Shujian Zhang, Hao Su, Qiang Liu

We approach text-to-image generation by combining the power of the pretrained CLIP representation with an off-the-shelf image generator (GAN), optimizing in the latent space of the GAN to find images that achieve the maximum CLIP score with the given input text.
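
The core loop is short to sketch; the snippet below assumes pre-loaded handles to a CLIP image encoder, a CLIP text embedding, and a GAN generator (all hypothetical placeholders with illustrative interfaces) and simply ascends the CLIP image-text similarity over the GAN latent. FuseDream itself adds augmentation-robust scoring, over-parameterized latents, and other refinements on top of this.

```python
import torch
import torch.nn.functional as F

def clip_gan_optimize(generator, clip_image_encoder, text_embedding, steps=200, lr=0.05):
    """Gradient-ascend the CLIP score of generator(z) with respect to the latent z."""
    z = torch.randn(1, generator.latent_dim, requires_grad=True)   # latent_dim is an assumed attribute
    opt = torch.optim.Adam([z], lr=lr)
    txt = F.normalize(text_embedding, dim=-1)
    for _ in range(steps):
        img = F.normalize(clip_image_encoder(generator(z)), dim=-1)
        loss = -(img * txt).sum()          # negative cosine similarity = negative CLIP score
        opt.zero_grad()
        loss.backward()
        opt.step()
    return generator(z.detach())
```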

counterfactual Navigate +1

Automatic and Harmless Regularization with Constrained and Lexicographic Optimization: A Dynamic Barrier Approach

no code implementations NeurIPS 2021 Chengyue Gong, Xingchao Liu, Qiang Liu

In this work, we consider constrained optimization as a more principled approach for trading off two losses, with a special emphasis on lexicographic optimization, a degenerate limit of constrained optimization that optimizes a secondary loss inside the optimal set of the main loss.
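
A much-simplified way to respect that ordering, shown only for intuition, is a gradient-projection update: follow the secondary loss but strip any component that would increase the main loss to first order. This is a stand-in of my own, not the paper's dynamic-barrier method, which additionally keeps driving the main loss toward its optimal set.

```python
import torch

def lexicographic_style_step(params, main_loss, secondary_loss, lr=1e-2):
    """Descend the secondary loss along a direction whose first-order effect on
    the main loss is never an increase (a crude projection, not the dynamic barrier)."""
    g_main = torch.autograd.grad(main_loss, params, retain_graph=True)
    g_sec = torch.autograd.grad(secondary_loss, params)
    dot = sum((a * b).sum() for a, b in zip(g_main, g_sec))
    norm2 = sum((a * a).sum() for a in g_main) + 1e-12
    coef = torch.clamp(dot / norm2, max=0.0)   # only correct the direction when the two gradients conflict
    with torch.no_grad():
        for p, gm, gs in zip(params, g_main, g_sec):
            p -= lr * (gs - coef * gm)
```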

Sampling with Trustworthy Constraints: A Variational Gradient Framework

1 code implementation NeurIPS 2021 Xingchao Liu, Xin Tong, Qiang Liu

In this work, we propose a family of constrained sampling algorithms which generalize Langevin Dynamics (LD) and Stein Variational Gradient Descent (SVGD) to incorporate a moment constraint specified by a general nonlinear function.
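
A heavily simplified stand-in for the Langevin branch of that family, under my own assumptions (hinge penalty, scalar dual variable updated by ascent; not the paper's exact primal-dual construction):

```python
import torch

def penalized_langevin_step(x, lam, log_prob, constraint, step=1e-3, dual_lr=1e-2):
    """One Langevin step targeting log_prob while a multiplier `lam` penalizes
    violation of the moment constraint E[constraint(x)] <= 0."""
    x = x.detach().requires_grad_(True)
    objective = log_prob(x).sum() - lam * constraint(x).clamp(min=0).sum()
    grad = torch.autograd.grad(objective, x)[0]
    with torch.no_grad():
        x = x + step * grad + (2 * step) ** 0.5 * torch.randn_like(x)
        lam = max(0.0, lam + dual_lr * constraint(x).mean().item())   # dual ascent on the multiplier
    return x, lam
```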

Bayesian Inference Fairness

Profiling Pareto Front With Multi-Objective Stein Variational Gradient Descent

1 code implementation NeurIPS 2021 Xingchao Liu, Xin Tong, Qiang Liu

Finding diverse and representative Pareto solutions from the Pareto front is a key challenge in multi-objective optimization (MOO).

Diversity

Conflict-Averse Gradient Descent for Multi-task Learning

4 code implementations NeurIPS 2021 Bo Liu, Xingchao Liu, Xiaojie Jin, Peter Stone, Qiang Liu

The goal of multi-task learning is to enable more efficient learning than single task learning by sharing model structures for a diverse set of tasks.

Multi-Task Learning

Centroid Transformers: Learning to Abstract with Attention

no code implementations17 Feb 2021 Lemeng Wu, Xingchao Liu, Qiang Liu

Self-attention, as the key block of transformers, is a powerful mechanism for extracting features from the inputs.

Abstractive Text Summarization Clustering +1

Fast Training of Contrastive Learning with Intermediate Contrastive Loss

no code implementations1 Jan 2021 Chengyue Gong, Xingchao Liu, Qiang Liu

We apply our method to recently-proposed MOCO, SimCLR, SwAV and notice that we can reduce the computational cost with little loss on the performance of ImageNet linear classification and other downstream tasks.

Contrastive Learning

Certified Monotonic Neural Networks

1 code implementation NeurIPS 2020 Xingchao Liu, Xing Han, Na Zhang, Qiang Liu

In this work, we propose to certify the monotonicity of general piece-wise linear neural networks by solving a mixed integer linear programming problem. This provides a new general approach for learning monotonic neural networks with arbitrary model structures.

Fairness

Post-training Quantization with Multiple Points: Mixed Precision without Mixed Precision

no code implementations20 Feb 2020 Xingchao Liu, Mao Ye, Dengyong Zhou, Qiang Liu

We propose multipoint quantization, a quantization method that approximates a full-precision weight vector using a linear combination of multiple vectors of low-bit numbers; this is in contrast to typical quantization methods that approximate each weight using a single low precision number.
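
A greedy residual version of that idea is easy to write down (the greedy selection and the symmetric uniform grid below are my assumptions; the paper's procedure for choosing the low-bit vectors differs):

```python
import torch

def quantize_to_grid(v, bits=4):
    """Symmetric uniform quantization of a vector onto a low-bit integer grid times a real scale."""
    levels = 2 ** (bits - 1) - 1
    scale = v.abs().max() / levels + 1e-12
    return torch.round(v / scale).clamp(-levels, levels) * scale

def multipoint_quantize(w, num_points=3, bits=4):
    """Approximate w by a sum of several scaled low-bit vectors, greedily
    quantizing whatever residual is still unexplained at each step."""
    approx, residual = torch.zeros_like(w), w.clone()
    for _ in range(num_points):
        q = quantize_to_grid(residual, bits)
        approx, residual = approx + q, residual - q
    return approx   # the approximation error shrinks as num_points grows

# w = torch.randn(1024); print((w - multipoint_quantize(w)).norm() / w.norm())
```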

Object Detection +1

Transfer Value or Policy? A Value-centric Framework Towards Transferrable Continuous Reinforcement Learning

no code implementations27 Sep 2018 Xingchao Liu, Tongzhou Mu, Hao Su

In this paper, we investigate the problem of transfer learning across environments with different dynamics while accomplishing the same task in the continuous control domain.

Continuous Control +2
