Search Results for author: Zhengqing Yuan

Found 8 papers, 5 papers with code

Fortifying Ethical Boundaries in AI: Advanced Strategies for Enhancing Security in Large Language Models

no code implementations • 27 Jan 2024 • Yunhong He, Jianling Qiu, Wei Zhang, Zhengqing Yuan

Recent advancements in large language models (LLMs) have significantly enhanced capabilities in natural language processing and artificial intelligence.

Question Answering • Text Generation

TinyGPT-V: Efficient Multimodal Large Language Model via Small Backbones

2 code implementations • 28 Dec 2023 • Zhengqing Yuan, Zhaoxu Li, Weiran Huang, Yanfang Ye, Lichao Sun

In recent years, multimodal large language models (MLLMs) such as GPT-4V have demonstrated remarkable advancements, excelling in a variety of vision-language tasks.

Computational Efficiency • Image Captioning • +5

ArtGPT-4: Towards Artistic-understanding Large Vision-Language Models with Enhanced Adapter

1 code implementation • 12 May 2023 • Zhengqing Yuan, Yunhong He, Kun Wang, Yanfang Ye, Lichao Sun

However, a grand challenge in exploiting LLMs for multimodal learning is the size of pre-trained LLMs, which typically contain billions of parameters.

Image Comprehension • Language Modelling

Hulk: Graph Neural Networks for Optimizing Regionally Distributed Computing Systems

no code implementations • 27 Feb 2023 • Zhengqing Yuan, Huiwen Xue, Chao Zhang, Yongming Liu

Large deep learning models have shown great potential for delivering exceptional results in various applications.

Distributed Computing

RPN: A Word Vector Level Data Augmentation Algorithm in Deep Learning for Language Understanding

1 code implementation • 12 Dec 2022 • Zhengqing Yuan, Xiaolong Zhang, Yue Wang, Xuecong Hou, Huiwen Xue, Zhuanzhe Zhao, Yongming Liu

However, existing data augmentation techniques in natural language understanding (NLU) may not fully capture the complexity of natural language variations, and they can be challenging to apply to large datasets.

CoLA • Natural Language Inference • +5
