no code implementations • 4 Mar 2024 • Feihu Jin, Yin Liu, Ying Tan
Parameter-efficient tuning methods such as LoRA can achieve performance comparable to full model tuning while updating only a small portion of the parameters.
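A minimal sketch of the LoRA idea referenced above (all shapes and names here are illustrative assumptions, not the paper's setup): a frozen pretrained weight W is adapted by a trainable low-rank product B @ A, so only A and B are updated.

```python
import numpy as np

# Hypothetical dimensions; the LoRA rank r is much smaller than d and k.
d, k, r = 64, 64, 4
rng = np.random.default_rng(0)

W = rng.standard_normal((d, k))          # pretrained weight, kept frozen
A = rng.standard_normal((r, k)) * 0.01   # trainable low-rank factor
B = np.zeros((d, r))                     # trainable; zero init, so the
                                         # adapted model starts identical
                                         # to the pretrained one

x = rng.standard_normal(k)
h = (W + B @ A) @ x                      # adapted forward pass

# Trainable parameters vs. tuning the full weight matrix:
lora_params = A.size + B.size            # 2 * r * 64 = 512
full_params = W.size                     # 64 * 64 = 4096
```

With these assumed shapes, the adapter trains 512 parameters instead of 4096, which is the "small portion of the parameters" the snippet refers to.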
1 code implementation • 14 Nov 2023 • Yang Gao, Zhuang Xiong, Shanshan Shan, Yin Liu, Pengfei Rong, Min Li, Alan H Wilman, G. Bruce Pike, Feng Liu, Hongfu Sun
The proposed OA-LFE-empowered iQSM, which we refer to as iQSM+, is trained in a self-supervised manner on a specially designed simulated brain dataset.
no code implementations • 18 Aug 2023 • Zhuang Xiong, Yang Gao, Yin Liu, Amir Fazlollahi, Peter Nestor, Feng Liu, Hongfu Sun
Supervised learning methods have limited applicability for solving dipole inversion in Quantitative Susceptibility Mapping (QSM) when scan parameters vary across objects.
1 code implementation • 19 May 2020 • Zihan Ye, Fuyuan Hu, Yin Liu, Zhenping Xia, Fan Lyu, Pengqing Liu
First, CNL computes correlations between features of a query layer and all response layers.
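The correlation step described above can be sketched as follows for a single query/response layer pair (a hedged illustration only; the tensor shapes and the softmax aggregation are assumptions, not the paper's exact CNL formulation):

```python
import numpy as np

rng = np.random.default_rng(0)
c, h, w = 8, 4, 4
query = rng.standard_normal((c, h, w))     # features from the query layer
response = rng.standard_normal((c, h, w))  # features from one response layer

q = query.reshape(c, h * w)        # (C, N), N = H * W spatial positions
r = response.reshape(c, h * w)     # (C, N)

# Pairwise correlations: every query position against every response position.
corr = q.T @ r                     # (N, N)

# Normalize correlations into attention weights (numerically stable softmax).
corr = corr - corr.max(axis=1, keepdims=True)
weights = np.exp(corr) / np.exp(corr).sum(axis=1, keepdims=True)

# Aggregate response features for each query position.
out = (weights @ r.T).T.reshape(c, h, w)
```

Extending this to "all response layers" would repeat the same computation per layer and combine the results, which is where a cross-layer design departs from a standard single-layer non-local block.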
no code implementations • ICLR 2018 • Yin Liu, Vincent Chen
Modern neural network architectures take advantage of increasingly deep layers and various structural advances to achieve better performance.