Search Results for author: Xiaocheng Yang

Found 5 papers, 1 paper with code

Arithmetic Reasoning with LLM: Prolog Generation & Permutation

no code implementations28 May 2024 Xiaocheng Yang, Bingsen Chen, Yik-Cheung Tam

We hypothesize that an LLM should focus on extracting predicates and generating symbolic formulas from the math problem description so that the underlying calculation can be done via an external code interpreter.

Arithmetic Reasoning, Data Augmentation +2
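The idea described above, that the model only extracts a symbolic formula while an external interpreter performs the actual calculation, can be sketched as follows. This is an illustrative stand-in, not the paper's Prolog pipeline: the model output and the `evaluate` helper are assumptions for the sketch.

```python
import ast
import operator

# Map AST operator nodes to their arithmetic functions.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def evaluate(expr: str) -> float:
    """Safely evaluate an arithmetic expression emitted by the model,
    so the LLM never performs the calculation itself."""
    def walk(node):
        if isinstance(node, ast.BinOp):
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.Constant):
            return node.value
        raise ValueError(f"unsupported expression node: {node!r}")
    return walk(ast.parse(expr, mode="eval").body)

# "Alice buys 3 bags of 4 apples and eats 2" -> model emits "3 * 4 - 2"
print(evaluate("3 * 4 - 2"))  # 10
```

Restricting evaluation to a small set of AST node types keeps the interpreter safe against arbitrary model output, which is the point of offloading arithmetic to an external tool.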

Exploring an LM to generate Prolog Predicates from Mathematics Questions

no code implementations7 Sep 2023 Xiaocheng Yang, Yik-Cheung Tam

We fine-tune LLaMA7B with chain-of-thought as a baseline model, and develop further fine-tuned LLaMA7B models that generate Prolog code, Prolog code + chain-of-thought, and chain-of-thought + Prolog code, respectively.

GSM8K, Language Modelling
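The three output orderings compared above (Prolog only, Prolog then chain-of-thought, chain-of-thought then Prolog) amount to assembling the fine-tuning target in different sequences. A minimal sketch, with the function name and format assumed for illustration:

```python
# Assemble a fine-tuning target string from a question's chain-of-thought
# rationale and its Prolog program, in one of the three orders studied.
def build_target(cot: str, prolog: str, order: str) -> str:
    if order == "prolog":
        return prolog
    if order == "prolog+cot":
        return f"{prolog}\n{cot}"
    if order == "cot+prolog":
        return f"{cot}\n{prolog}"
    raise ValueError(f"unknown target order: {order}")

cot = "% Tom has 3 apples and buys 4 more, so 3 + 4."
prolog = "answer(X) :- X is 3 + 4."
print(build_target(cot, prolog, "cot+prolog"))
```

Training one model per ordering lets the comparison isolate whether seeing the rationale before or after the program changes generation quality.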

Simple and Efficient Heterogeneous Graph Neural Network

2 code implementations6 Jul 2022 Xiaocheng Yang, Mingyu Yan, Shirui Pan, Xiaochun Ye, Dongrui Fan

Heterogeneous graph neural networks (HGNNs) have a powerful capability to embed the rich structural and semantic information of a heterogeneous graph into node representations.

Graph Neural Network, Heterogeneous Node Classification +1
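The kind of semantic aggregation an HGNN performs, combining neighbor features separately per relation type before merging them, can be sketched in plain Python. This is a toy illustration of the general mechanism, not the paper's model; the function and data layout are assumptions.

```python
from collections import defaultdict

def hetero_aggregate(neighbors, features, dim):
    """Mean-pool neighbor features per relation type, then concatenate.

    neighbors: list of (relation, node_id) pairs for one target node
    features:  node_id -> feature vector (list of floats of length dim)
    """
    by_rel = defaultdict(list)
    for rel, nid in neighbors:
        by_rel[rel].append(features[nid])
    out = []
    for rel in sorted(by_rel):          # fixed relation order for a stable layout
        vecs = by_rel[rel]
        out.extend(sum(v[i] for v in vecs) / len(vecs) for i in range(dim))
    return out

feats = {1: [1.0, 2.0], 2: [3.0, 4.0], 3: [10.0, 0.0]}
nbrs = [("writes", 1), ("writes", 2), ("cites", 3)]
print(hetero_aggregate(nbrs, feats, 2))  # [10.0, 0.0, 2.0, 3.0]
```

Keeping relation types separate before merging is what lets the representation preserve semantic distinctions (e.g. author-writes-paper vs. paper-cites-paper) that a homogeneous GNN would blur together.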

Characterizing and Understanding Distributed GNN Training on GPUs

no code implementations18 Apr 2022 Haiyang Lin, Mingyu Yan, Xiaocheng Yang, Mo Zou, WenMing Li, Xiaochun Ye, Dongrui Fan

Graph neural network (GNN) has been demonstrated to be a powerful model in many domains for its effectiveness in learning over graphs.

Graph Neural Network

FA-GAN: Fused Attentive Generative Adversarial Networks for MRI Image Super-Resolution

no code implementations9 Aug 2021 Mingfeng Jiang, Minghao Zhi, Liying Wei, Xiaocheng Yang, Jucheng Zhang, Yongming Li, Pin Wang, Jiahao Huang, Guang Yang

High-resolution magnetic resonance images can provide fine-grained anatomical information, but acquiring such data requires a long scanning time.

Image Super-Resolution, SSIM
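SSIM, listed above as an evaluation metric for the super-resolution results, can be computed in a simplified single-window form with the standard library. Real evaluations use a sliding (often Gaussian) window over the image, so treat this as an illustration of the formula only; the constants follow the common choice K1 = 0.01, K2 = 0.03.

```python
def ssim_global(x, y, data_range=1.0):
    """Single-window SSIM over two flattened images of equal length."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n                      # means
    vx = sum((a - mx) ** 2 for a in x) / n               # variances
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    c1 = (0.01 * data_range) ** 2                        # stabilizing constants
    c2 = (0.03 * data_range) ** 2
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

# Identical images score 1.0, the maximum.
print(ssim_global([0.2, 0.4, 0.6], [0.2, 0.4, 0.6]))  # 1.0
```

Unlike pixel-wise error, SSIM compares local luminance, contrast, and structure, which is why it is the standard perceptual metric for super-resolution.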
