no code implementations • 13 Dec 2024 • Junyan Hu, Xue Xiao, Mengqi Zhang, Yao Chen, Zhaochun Ren, Zhumin Chen, Pengjie Ren
As large language models (LLMs) grow in size, traditional full fine-tuning becomes increasingly impractical due to its high computational and storage costs.
1 code implementation • 27 Nov 2024 • Yao Chen, Jiabao Wang, Peichao Wang, Rui Zhang, Yang Li
Low-resolution fine-grained image classification has recently made significant progress, largely thanks to super-resolution techniques and knowledge distillation methods.
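As background, a common way to combine these two ingredients is to distill a teacher trained on high-resolution images into a student that only sees low-resolution (possibly super-resolved) inputs. The sketch below shows just the standard distillation loss; the temperature, weighting, and variable names are illustrative assumptions rather than this paper's exact formulation.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Standard knowledge-distillation objective: a soft-target KL term at
    temperature T plus the usual hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)                          # rescale gradients by T^2
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```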
no code implementations • 20 Nov 2024 • Peichao Wang, Jiabao Wang, Yao Chen, Rui Zhang, Yang Li, Zhuang Miao
To effectively guide the model to focus on targets' spatial features, this paper proposes the Local Contrast Attention Enhanced infrared small target detection Network (LCAE-Net), combining prior knowledge with data-driven deep learning methods.
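For intuition about the prior knowledge involved, classical infrared small-target detectors score each pixel by how strongly a small central patch stands out from its surrounding background. The sketch below computes such a simplified local-contrast map with NumPy/SciPy; it is only an illustration of that idea, not LCAE-Net's attention module.

```python
import numpy as np
from scipy.ndimage import maximum_filter, uniform_filter

def local_contrast_map(img, cell=3, ring=9):
    """Simplified local-contrast measure: mean brightness of a small center
    cell divided by the strongest mean found in a larger surrounding window."""
    img = img.astype(np.float32)
    center = uniform_filter(img, size=cell)        # mean of the center cell
    background = uniform_filter(img, size=ring)    # mean over a larger window
    strongest_bg = maximum_filter(background, size=ring)
    return center / (strongest_bg + 1e-6)          # > 1 where a small target pops out
```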
no code implementations • 30 Oct 2024 • Vicky Dong, Hao Yu, Yao Chen
This study introduces a novel approach to sentence-level relation extraction (RE) that integrates Graph Neural Networks (GNNs) with Large Language Models (LLMs) to generate contextually enriched support documents.
no code implementations • 30 Sep 2024 • Zining Zhang, Yao Chen, Bingsheng He, Zhenjie Zhang
The increasing size and complexity of Large Language Models (LLMs) pose challenges for their deployment on personal computers and mobile devices.
no code implementations • 9 Sep 2024 • Shiru Wang, Yao Chen, Lesley A. Jarvis, Yucheng Tang, David J. Gladstone, Kimberley S. Samkoe, Brian W. Pogue, Petr Bruza, Rongxiao Zhang
To address the challenge of limited annotation of these features in Cherenkov images, a transfer learning strategy was applied.
no code implementations • 9 Sep 2024 • Yao Chen, Savannah M. Decker, Petr Bruza, David J. Gladstone, Lesley A. Jarvis, Brian W. Pogue, Kimberley S. Samkoe, Rongxiao Zhang
This approach quantified positioning variations in two parts: a global shift from rigid registration and a two-dimensional variation map of loco-regional deformation from non-rigid registration.
1 code implementation • 19 Jul 2024 • Cheng Gong, Yao Chen, Qiuyang Luo, Ye Lu, Tao Li, Yuzhi Zhang, Yufei Sun, Le Zhang
Experimental results on the CIFAR-100 and ImageNet datasets show that the proposed method provides up to a 50.00% reduction in training time and up to a 6.94% improvement in accuracy compared with baseline methods across diverse models and tasks.
2 code implementations • 18 Mar 2024 • Yang Yang, Wen Wang, Liang Peng, Chaotian Song, Yao Chen, Hengjia Li, Xiaolong Yang, Qinglin Lu, Deng Cai, Boxi Wu, Wei Liu
Customization generation techniques have significantly advanced the synthesis of specific concepts across varied contexts.
no code implementations • 14 Dec 2023 • Yibo Zhao, Liang Peng, Yang Yang, Zekai Luo, Hengjia Li, Yao Chen, Zheng Yang, Xiaofei He, Wei Zhao, Qinglin Lu, Boxi Wu, Wei Liu
It focuses on controlling a specific local region according to user-defined image conditions, while the remaining regions are conditioned only by the original text prompt.
no code implementations • 13 Dec 2023 • Esteban Real, Yao Chen, Mirko Rossini, Connal de Souza, Manav Garg, Akhil Verghese, Moritz Firsching, Quoc V. Le, Ekin Dogus Cubuk, David H. Park
Computers calculate transcendental functions by approximating them through the composition of a few limited-precision instructions.
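To make "composition of a few limited-precision instructions" concrete, the sketch below evaluates exp(x) on [0, 1] with a short Horner-form polynomial in float32; the degree and (Taylor) coefficients are arbitrary illustrative choices, not the approximations constructed in the paper.

```python
import numpy as np

# Degree-4 Taylor coefficients of exp(x) around 0, highest degree first.
COEFFS = np.array([1 / 24, 1 / 6, 1 / 2, 1.0, 1.0], dtype=np.float32)

def exp_poly(x):
    """Approximate exp(x) on [0, 1] with a chain of float32 multiply-adds
    (Horner's scheme): a composition of a few limited-precision instructions."""
    x = np.float32(x)
    acc = np.float32(0.0)
    for c in COEFFS:
        acc = acc * x + c            # one multiply-add per coefficient
    return acc

print(exp_poly(0.5), np.exp(np.float32(0.5)))   # ~1.6484 vs ~1.6487
```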
no code implementations • 8 Aug 2023 • Yuanhan Mo, Yao Chen, Aimee Readie, Gregory Ligozio, Thibaud Coroller, Bartłomiej W. Papież
In this study, we address this challenge by prototyping a 2-step auto-grading pipeline, called VertXGradeNet, to automatically predict mSASSS scores for the cervical and lumbar vertebral units (VUs) in X-ray spinal imaging.
no code implementations • 23 Mar 2023 • Yao Chen, Shan Huang, Wensheng Gan, Gengsen Huang, Yongdong Wu
In this paper, we review some of the early advances in FL4M, which we expect to be a research direction with great development potential.
no code implementations • 7 Feb 2023 • Yao Chen, Yuanhan Mo, Aimee Readie, Gregory Ligozio, Indrajeet Mandal, Faiz Jabbar, Thibaud Coroller, Bartlomiej W. Papiez
Our experimental results show that the proposed pipeline outperformed two SOTA segmentation models on our test dataset (MEASURE 1), with a mean Dice of 0.90 vs. 0.73 for Mask R-CNN and 0.72 for U-Net.
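For reference, the Dice score reported above is the standard overlap measure between a predicted and a ground-truth mask; the array names below are illustrative.

```python
import numpy as np

def dice_score(pred, target, eps=1e-7):
    """Dice coefficient between two binary masks:
    2 * |pred AND target| / (|pred| + |target|)."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)
```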
no code implementations • 27 Nov 2022 • Yao Chen, Yijie Gui, Hong Lin, Wensheng Gan, Yongdong Wu
To advance research in this field, build robust FL systems, and realize the wide application of FL, this paper systematically reviews the possible attacks on current FL systems and the corresponding defenses.
no code implementations • 27 Sep 2022 • Yao Chen, Wensheng Gan, Yongdong Wu, Philip S. Yu
Contrast pattern mining (CPM) is an important and popular subfield of data mining.
no code implementations • 29 Aug 2022 • Yao Chen, Samuel S. Streeter, Brady Hunt, Hira S. Sardar, Jason R. Gunn, Laura J. Tafe, Joseph A. Paydarfar, Brian W. Pogue, Keith D. Paulsen, Kimberley S. Samkoe
In this study, a radiomics approach was extended to optical fluorescence molecular imaging data for tissue classification, termed 'optomics'.
no code implementations • 22 Jul 2022 • Yao Chen, Junhao Pan, Xinheng Liu, JinJun Xiong, Deming Chen
In this study, we propose HiKonv, a unified solution that maximizes the throughput of convolution on a given underlying processing unit with low-bitwidth quantized data inputs through novel bit-wise management and parallel computation.
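The flavor of this bit-wise management can be seen in miniature by packing two low-bitwidth operands into one wider word so that a single integer multiplication yields two partial products at once. The sketch below shows only that arithmetic trick for unsigned 4-bit values; HiKonv's actual scheme, which handles signs, guard bits, and accumulation across convolution windows, is more involved.

```python
def packed_dual_multiply(a0, a1, b, k=8):
    """Multiply two unsigned 4-bit values a0 and a1 by a 4-bit value b using a
    single integer multiplication, by packing a0 and a1 into one wide word.
    k = 8 guard bits suffice because each 4-bit-by-4-bit product fits in 8 bits."""
    assert all(0 <= v < 16 for v in (a0, a1, b))
    packed = a0 | (a1 << k)                # one wide operand holding both inputs
    product = packed * b                   # a single multiply produces both products
    return product & 0xFF, (product >> k) & 0xFF   # (a0 * b, a1 * b)

assert packed_dual_multiply(15, 7, 13) == (15 * 13, 7 * 13)
```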
no code implementations • 12 Jul 2022 • Yao Chen, Yuanhan Mo, Aimee Readie, Gregory Ligozio, Thibaud Coroller, Bartlomiej W. Papiez
Manual annotation of vertebrae on spinal X-ray imaging is costly and time-consuming due to bone shape complexity and image quality variations.
no code implementations • 6 Jun 2022 • Xiaofan Zhang, Yao Chen, Cong Hao, Sitao Huang, Yuhong Li, Deming Chen
Deep Neural Networks (DNNs) have achieved great success in a variety of machine learning (ML) applications, delivering high-quality inference solutions in computer vision, natural language processing, virtual reality, and other domains.
no code implementations • 28 Dec 2021 • Xinheng Liu, Yao Chen, Prakhar Ganesh, Junhao Pan, JinJun Xiong, Deming Chen
Quantization for Convolutional Neural Networks (CNNs) has made significant progress toward reducing the cost of computation and storage through low-bitwidth data inputs.
1 code implementation • 26 Oct 2021 • Prakhar Ganesh, Yao Chen, Yin Yang, Deming Chen, Marianne Winslett
The performance of object detection models has been improving rapidly on two major fronts: model accuracy and efficiency.
no code implementations • 13 Sep 2021 • Yao Chen, Qingyi Gao, Xiao Wang
The Wasserstein GAN (WGAN) leverages the Wasserstein distance to avoid the caveats of the minimax two-player training of GANs, but has other defects such as mode collapse and the lack of a metric to detect convergence.
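For context, the Wasserstein distance referred to here is usually written in its Kantorovich-Rubinstein dual form, which is what the WGAN critic approximates (standard background, not this paper's contribution):

\[
W(P_r, P_g) = \sup_{\|f\|_L \le 1} \; \mathbb{E}_{x \sim P_r}[f(x)] - \mathbb{E}_{x \sim P_g}[f(x)],
\]

where the supremum ranges over 1-Lipschitz functions \(f\); WGAN parameterizes \(f\) as a neural network and enforces the Lipschitz constraint by weight clipping or a gradient penalty.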
no code implementations • 4 Aug 2021 • Lingdong Kong, Prakhar Ganesh, Tan Wang, Junhao Liu, Le Zhang, Yao Chen
We hope that the scale, diversity, and quality of our dataset can benefit researchers in this area and beyond.
1 code implementation • 9 Jul 2021 • Xinheng Liu, Yao Chen, Cong Hao, Ashutosh Dhar, Deming Chen
We implement the proposed accelerator on multiple FPGAs, and it outperforms state-of-the-art designs in terms of both throughput and DSP efficiency.
no code implementations • 11 May 2021 • Yao Chen, Cole Hawkins, Kaiqi Zhang, Zheng Zhang, Cong Hao
This paper emphasizes the importance and efficacy of training, quantization and accelerator design, and calls for more research breakthroughs in the area for AI on the edge.
no code implementations • 7 Jan 2021 • Yao Chen, Jiangang Liu, Zhe Zhang, Shiping Wen, Wenjun Xiong
In this work, we propose a novel Knowledge Graph Embedding (KGE) strategy, called MöbiusE, in which the entities and relations are embedded onto the surface of a Möbius ring.
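For intuition, a standard parametrization of a Möbius ring surface (given here only as background; the coordinates used in MöbiusE may differ) is

\[
(x, y, z) = \Big( \big(R + s\cos\tfrac{t}{2}\big)\cos t,\; \big(R + s\cos\tfrac{t}{2}\big)\sin t,\; s\sin\tfrac{t}{2} \Big), \quad t \in [0, 2\pi),\; s \in [-w, w],
\]

so each point on the surface is specified by an angle \(t\) around the ring and an offset \(s\) across the strip, and traversing the ring once flips the strip's orientation, which is the non-orientable twist that distinguishes a Möbius ring from an ordinary torus.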
no code implementations • 14 Oct 2020 • Cong Hao, Yao Chen, Xiaofan Zhang, Yuhong Li, JinJun Xiong, Wen-mei Hwu, Deming Chen
High quality AI solutions require joint optimization of AI algorithms, such as deep neural networks (DNNs), and their hardware accelerators.
1 code implementation • 18 May 2020 • Cheng Gong, Yao Chen, Ye Lu, Tao Li, Cong Hao, Deming Chen
Quantization has been proven to be an effective method for reducing the computing and/or storage cost of DNNs.
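As background on what quantization does to a tensor, the sketch below applies standard uniform affine quantization to a weight matrix and measures the resulting rounding error; the bit-width and scheme are illustrative and are not the specific method proposed in this paper.

```python
import numpy as np

def quantize_uniform(w, num_bits=8):
    """Uniform affine quantization: map float weights to num_bits-wide integers
    using a scale and zero-point, then dequantize to expose the rounding error."""
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (w.max() - w.min()) / (qmax - qmin)
    zero_point = int(round(qmin - w.min() / scale))
    q = np.clip(np.round(w / scale) + zero_point, qmin, qmax).astype(np.int32)
    w_hat = scale * (q - zero_point)          # dequantized approximation of w
    return q, w_hat

w = np.random.randn(4, 4).astype(np.float32)
q, w_hat = quantize_uniform(w)
print(np.abs(w - w_hat).max())                # worst-case quantization error
```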
no code implementations • 6 May 2020 • Yuhong Li, Cong Hao, Xiaofan Zhang, Xinheng Liu, Yao Chen, JinJun Xiong, Wen-mei Hwu, Deming Chen
We formulate the co-search problem by fusing DNN search variables and hardware implementation variables into one solution space, and maximize both algorithm accuracy and hardware implementation quality.
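A generic way to write such a fused formulation (shown only to make the idea concrete; the paper's actual objective and constraints may differ) is

\[
\max_{a \in \mathcal{A},\; h \in \mathcal{H}} \; \mathrm{Accuracy}(a) \quad \text{s.t.} \quad \mathrm{Latency}(a, h) \le T, \;\; \mathrm{Resource}(h) \le B,
\]

where a DNN architecture \(a\) and a hardware implementation \(h\) are searched over the single joint space \(\mathcal{A} \times \mathcal{H}\) rather than optimized one after the other.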
no code implementations • ACL 2020 • Ruichu Cai, Zhihao Liang, Boyan Xu, Zijian Li, Yuexing Hao, Yao Chen
Existing leading code comment generation approaches based on the structure-to-sequence framework ignore the type information in the interpretation of the code, e.g., operators and strings.
no code implementations • 25 Apr 2020 • Sicong Du, Hengkai Guo, Yao Chen, Yilun Lin, Xiangbing Meng, Linfu Wen, Fei-Yue Wang
Initialization is essential to monocular Simultaneous Localization and Mapping (SLAM) problems.
no code implementations • 27 Feb 2020 • Prakhar Ganesh, Yao Chen, Xin Lou, Mohammad Ali Khan, Yin Yang, Hassan Sajjad, Preslav Nakov, Deming Chen, Marianne Winslett
Pre-trained Transformer-based models have achieved state-of-the-art performance for various Natural Language Processing (NLP) tasks.
no code implementations • 18 Nov 2019 • Cong Hao, Yao Chen, Xinheng Liu, Atif Sarwari, Daryl Sew, Ashutosh Dhar, Bryan Wu, Dongdong Fu, JinJun Xiong, Wen-mei Hwu, Junli Gu, Deming Chen
The rapidly growing demands for powerful AI algorithms in many application domains have motivated massive investment in both high-quality deep neural network (DNN) models and high-efficiency implementations.
no code implementations • 25 Sep 2019 • Yao Chen, Qingyi Gao, Xiao Wang
We further provide a rigorous probabilistic interpretation of our model under the framework of maximum likelihood estimation.
2 code implementations • 20 May 2019 • Xiaofan Zhang, Cong Hao, Yuhong Li, Yao Chen, JinJun Xiong, Wen-mei Hwu, Deming Chen
Developing deep learning models for resource-constrained Internet-of-Things (IoT) devices is challenging, as it is difficult to achieve both good quality of results (QoR), such as DNN model inference accuracy, and quality of service (QoS), such as inference latency, throughput, and power consumption.
no code implementations • 30 Oct 2018 • Mingjie Sun, Jian Tang, Huichen Li, Bo Li, Chaowei Xiao, Yao Chen, Dawn Song
In this paper, we take link prediction, one of the most fundamental problems in graph analysis, as an example and introduce a data poisoning attack on node embedding methods.
no code implementations • 27 May 2016 • Yao Chen, Xiao Wang, Linglong Kong, Hongtu Zhu
Identification of regions of interest (ROIs) associated with certain diseases has a great impact on public health.