Search Results for author: Tianyu Hu

Found 6 papers, 2 papers with code

ChemAgent: Self-updating Library in Large Language Models Improves Chemical Reasoning

1 code implementation • 11 Jan 2025 Xiangru Tang, Tianyu Hu, Muyang Ye, Yanjun Shao, Xunjian Yin, Siru Ouyang, Wangchunshu Zhou, Pan Lu, Zhuosheng Zhang, Yilun Zhao, Arman Cohan, Mark Gerstein

To address these challenges, we present ChemAgent, a novel framework designed to improve the performance of LLMs through a dynamic, self-updating library.

Drug Discovery

CreDes: Causal Reasoning Enhancement and Dual-End Searching for Solving Long-Range Reasoning Problems using LLMs

no code implementations • 2 Oct 2024 Kangsheng Wang, Xiao Zhang, Hao Liu, Songde Han, Huimin Ma, Tianyu Hu

Large language models (LLMs) have demonstrated limitations in handling combinatorial optimization problems that involve long-range reasoning, partly due to causal hallucinations and the huge search space.

Combinatorial Optimization

CSCE: Boosting LLM Reasoning by Simultaneous Enhancing of Causal Significance and Consistency

no code implementations • 20 Sep 2024 Kangsheng Wang, Xiao Zhang, Zizheng Guo, Tianyu Hu, Huimin Ma

Chain-based reasoning methods such as chain of thought (CoT) play an increasingly important role in solving reasoning tasks with large language models (LLMs).

Enhancing Short-Term Wind Speed Forecasting using Graph Attention and Frequency-Enhanced Mechanisms

no code implementations • 19 May 2023 Hao Liu, Huimin Ma, Tianyu Hu

In this paper, a Graph-attentive Frequency-enhanced Spatial-Temporal Wind Speed Forecasting model (GFST-WSF), built on graph attention and frequency-enhanced mechanisms, is proposed to improve the accuracy of short-term wind speed forecasting.

Graph Attention
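The listing above only names the graph attention mechanism; the GFST-WSF architecture itself is not detailed here. As a rough illustration of what a single-head, GAT-style graph attention layer computes over spatially linked wind-speed sensors, here is a minimal NumPy sketch. All names (`graph_attention`), shapes, and the random toy graph are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def graph_attention(H, A, W, a, slope=0.2):
    """Single-head GAT-style attention sketch.

    H: (N, F) node features, A: (N, N) adjacency (1 = edge, incl. self-loops),
    W: (F, F2) linear projection, a: (2*F2,) attention vector.
    """
    Z = H @ W                               # project node features
    N = Z.shape[0]
    # attention logits e_ij = LeakyReLU(a . [z_i || z_j]) on edges only
    logits = np.full((N, N), -np.inf)
    for i in range(N):
        for j in range(N):
            if A[i, j]:
                e = a @ np.concatenate([Z[i], Z[j]])
                logits[i, j] = e if e > 0 else slope * e
    # softmax over each node's neighbourhood (non-edges contribute exp(-inf)=0)
    att = np.exp(logits - logits.max(axis=1, keepdims=True))
    att /= att.sum(axis=1, keepdims=True)
    return att @ Z                          # aggregate neighbour features

rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))                 # 4 nodes, e.g. 4 wind farms
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]])
out = graph_attention(H, A, rng.normal(size=(3, 2)), rng.normal(size=(4,)))
print(out.shape)                            # (4, 2)
```

In a full spatio-temporal forecaster such a layer would mix information across neighbouring sites at each time step, with the temporal and frequency-domain components handled by separate modules.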

Gestalt-Guided Image Understanding for Few-Shot Learning

1 code implementation8 Feb 2023 Kun Song, Yuchen Wu, Jiansheng Chen, Tianyu Hu, Huimin Ma

Due to the scarcity of available data, deep learning does not perform well on few-shot learning tasks.

Few-Shot Learning
