1 code implementation • 22 Jan 2025 • Kun Fang, Ziyu Wang, Gus Xia, Ichiro Fujinaga
We convert the music data to symbolic inputs and evaluate LLMs' ability to detect annotation errors in three key MIR tasks: beat tracking, chord extraction, and key estimation.
no code implementations • 13 Jan 2025 • Yuchen Lu, Kun Fang
Quantum relative entropy, a quantum generalization of the well-known Kullback-Leibler divergence, serves as a fundamental measure of the distinguishability between quantum states and plays a pivotal role in quantum information science.
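To make the definition concrete, here is a minimal numerical sketch of quantum relative entropy, $D(\rho\|\sigma) = \mathrm{Tr}[\rho(\log\rho - \log\sigma)]$, assuming full-rank density matrices; the function name and implementation are illustrative, not taken from the paper.

```python
import numpy as np

def quantum_relative_entropy(rho, sigma):
    """D(rho || sigma) = Tr[rho (log rho - log sigma)], in nats.

    Assumes rho and sigma are full-rank density matrices
    (Hermitian, positive definite, unit trace).
    """
    def logm_hermitian(a):
        # Matrix logarithm via eigendecomposition of a Hermitian matrix.
        vals, vecs = np.linalg.eigh(a)
        return (vecs * np.log(vals)) @ vecs.conj().T

    return np.real(np.trace(rho @ (logm_hermitian(rho) - logm_hermitian(sigma))))

# For commuting (diagonal) states this reduces to the classical KL divergence.
rho = np.diag([0.5, 0.5])
sigma = np.diag([0.75, 0.25])
```

For diagonal states as above, the value coincides with the Kullback-Leibler divergence of the eigenvalue distributions, which is the sense in which it generalizes KL.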
no code implementations • 19 Oct 2024 • MingAn Lin, Fan Yang, Yanjun Shen, Haoze Sun, Tianpeng Li, Chenzheng Zhu, Tao Zhang, Miao Zheng, Xu Li, Yijie Zhou, Mingyang Chen, Yanzhao Qin, Youquan Li, Hao Liang, Fei Li, Yadong Li, Mang Wang, Guosheng Dong, Kun Fang, Jianhua Xu, Bin Cui, Wentao Zhang, Zenan Zhou, WeiPeng Chen
Baichuan-Instruct is an internal model, while Qwen2-Nova-72B and Llama3-PBM-Nova-70B are instruct versions of the Qwen2-72B and Llama-3-70B base models, optimized through Baichuan Alignment.
no code implementations • 16 Sep 2024 • Kun Fang, Qinghua Tao, Zuopeng Yang, Xiaolin Huang, Jie Yang
Out-of-Distribution (OoD) detection aims to determine whether a given sample comes from the training distribution of the classifier under protection, i.e., is In-Distribution (InD), or is OoD.
Out-of-Distribution Detection
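As background for the OoD entries here, a minimal sketch of the common maximum-softmax-probability (MSP) baseline detector; this is a standard baseline for illustration only, not the method proposed in the paper above.

```python
import numpy as np

def msp_score(logits):
    """Maximum softmax probability: higher suggests In-Distribution.

    logits: (n_samples, n_classes) array of classifier outputs.
    """
    z = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    return probs.max(axis=1)

def detect_ood(logits, threshold=0.5):
    # Flag samples whose top-class confidence falls below the threshold as OoD.
    return msp_score(logits) < threshold
```

The threshold is typically chosen on held-out InD data, e.g. to fix the true-positive rate at 95%.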
no code implementations • 8 Jul 2024 • Miao Zheng, Hao Liang, Fan Yang, Haoze Sun, Tianpeng Li, Lingchu Xiong, Yan Zhang, Youzhen Wu, Kun Li, Yanjun Shen, MingAn Lin, Tao Zhang, Guosheng Dong, Yujing Qiao, Kun Fang, WeiPeng Chen, Bin Cui, Wentao Zhang, Zenan Zhou
This combination of high performance, efficiency, and flexibility makes PAS a valuable system for enhancing the usability and effectiveness of LLMs through improved prompt engineering.
2 code implementations • 9 Jun 2024 • Shuting Wang, Jiongnan Liu, Shiren Song, Jiehan Cheng, Yuqi Fu, Peidong Guo, Kun Fang, Yutao Zhu, Zhicheng Dou
We evaluated popular LLMs such as Llama, Baichuan, ChatGLM, and GPT models.
1 code implementation • 30 Mar 2024 • Tao Li, Qinghua Tao, Weihao Yan, Zehao Lei, Yingwen Wu, Kun Fang, Mingzhen He, Xiaolin Huang
Improving the generalization ability of modern deep neural networks (DNNs) is a fundamental challenge in machine learning.
1 code implementation • 19 Feb 2024 • Jiejun Tan, Zhicheng Dou, Yutao Zhu, Peidong Guo, Kun Fang, Ji-Rong Wen
The integration of large language models (LLMs) and search engines represents a significant evolution in knowledge acquisition methodologies.
1 code implementation • 5 Feb 2024 • Kun Fang, Qinghua Tao, Kexin Lv, Mingzhen He, Xiaolin Huang, Jie Yang
Out-of-Distribution (OoD) detection is vital for the reliability of Deep Neural Networks (DNNs).
Out-of-Distribution Detection
1 code implementation • 22 Oct 2023 • Kun Fang, Qinghua Tao, Xiaolin Huang, Jie Yang
Motivated by such diversity in the OoD loss landscape across modes, we revisit the deep ensemble method for OoD detection through mode ensembling, which improves performance and reduces the variance of the OoD detector.
Out-of-Distribution Detection
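The mode-ensemble idea above can be sketched in its simplest assumed form: average a per-sample OoD score over several independently trained models. The exact aggregation used in the paper may differ; this is a hypothetical minimal version.

```python
import numpy as np

def ensemble_ood_score(score_fns, x):
    """Average an OoD score over independently trained models (modes).

    score_fns: list of callables, each mapping a batch of inputs to
    per-sample scores. Averaging across modes reduces the variance
    of the resulting detector.
    """
    scores = np.stack([fn(x) for fn in score_fns], axis=0)
    return scores.mean(axis=0)
```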
2 code implementations • 19 Sep 2023 • Aiyuan Yang, Bin Xiao, Bingning Wang, Borong Zhang, Ce Bian, Chao Yin, Chenxu Lv, Da Pan, Dian Wang, Dong Yan, Fan Yang, Fei Deng, Feng Wang, Feng Liu, Guangwei Ai, Guosheng Dong, Haizhou Zhao, Hang Xu, Haoze Sun, Hongda Zhang, Hui Liu, Jiaming Ji, Jian Xie, Juntao Dai, Kun Fang, Lei Su, Liang Song, Lifeng Liu, Liyun Ru, Luyao Ma, Mang Wang, Mickel Liu, MingAn Lin, Nuolan Nie, Peidong Guo, Ruiyang Sun, Tao Zhang, Tianpeng Li, Tianyu Li, Wei Cheng, WeiPeng Chen, Xiangrong Zeng, Xiaochuan Wang, Xiaoxi Chen, Xin Men, Xin Yu, Xuehai Pan, Yanjun Shen, Yiding Wang, Yiyu Li, Youxin Jiang, Yuchen Gao, Yupeng Zhang, Zenan Zhou, Zhiying Wu
Large language models (LLMs) have demonstrated remarkable performance on a variety of natural language tasks based on just a few examples of natural language instructions, reducing the need for extensive feature engineering.
1 code implementation • 21 Nov 2022 • Tao Li, Weihao Yan, Zehao Lei, Yingwen Wu, Kun Fang, Ming Yang, Xiaolin Huang
To fully uncover the great potential of deep neural networks (DNNs), various learning algorithms have been developed to improve the model's generalization ability.
1 code implementation • 20 Nov 2022 • Kun Fang, Qinghua Tao, Yingwen Wu, Tao Li, Xiaolin Huang, Jie Yang
Randomized Smoothing (RS) is a promising technique for certified robustness, and recently in RS the ensemble of multiple deep neural networks (DNNs) has shown state-of-the-art performances.
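For context, a minimal Monte Carlo sketch of the randomized-smoothing prediction step: classify many Gaussian-perturbed copies of the input and take the majority vote. The statistical certification step (confidence bounds on the vote) is omitted, and this is generic RS, not the paper's ensemble variant.

```python
import numpy as np

def smoothed_predict(base_classifier, x, sigma=0.25, n=1000, rng=None):
    """Monte Carlo estimate of the randomized-smoothing prediction.

    base_classifier: maps a single input array to an integer class label.
    Returns the majority class of f(x + eps), eps ~ N(0, sigma^2 I).
    """
    rng = np.random.default_rng(rng)
    votes = {}
    for _ in range(n):
        label = base_classifier(x + sigma * rng.standard_normal(x.shape))
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)
```

Larger `sigma` yields larger certifiable radii but degrades clean accuracy, which is the central trade-off RS ensembles try to improve.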
1 code implementation • 19 Oct 2022 • Pengjin Wei, Guohang Yan, Yikang Li, Kun Fang, Jie Yang, Wei Liu
This calibration task is multi-modal: the rich color and texture information captured by the camera and the accurate three-dimensional spatial information from the LiDAR are both highly significant for downstream tasks.
1 code implementation • 12 Aug 2022 • Yingwen Wu, Sizhe Chen, Kun Fang, Xiaolin Huang
The wide application of deep neural networks (DNNs) demands increasing attention to their real-world robustness, i.e., whether a DNN resists black-box adversarial attacks. Among these, score-based query attacks (SQAs) are the most threatening, since they can effectively hurt a victim network with access only to model outputs.
1 code implementation • 7 Mar 2022 • Pengjin Wei, Guohang Yan, Yikang Li, Kun Fang, Xinyu Cai, Jie Yang, Wei Liu
Sensor-based environmental perception is a crucial part of the autonomous driving system.
1 code implementation • CVPR 2022 • Tao Li, Yingwen Wu, Sizhe Chen, Kun Fang, Xiaolin Huang
Single-step adversarial training (AT) has received wide attention as it has proved to be both efficient and robust.
1 code implementation • 13 Apr 2021 • Qin Luo, Kun Fang, Jie Yang, Xiaolin Huang
Random Fourier Features (RFF) demonstrate well-appreciated performance in kernel approximation for large-scale situations, but restrict kernels to be stationary and positive definite.
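The classic RFF construction (Rahimi & Recht) that this paper generalizes can be sketched as follows, here for the Gaussian (RBF) kernel; parameter names are illustrative.

```python
import numpy as np

def rff_features(X, n_features=256, gamma=1.0, rng=None):
    """Random Fourier Features approximating the Gaussian (RBF) kernel
    k(x, y) = exp(-gamma * ||x - y||^2).

    Returns Z such that Z @ Z.T approximates the exact kernel matrix.
    Frequencies are drawn from the kernel's spectral density N(0, 2*gamma*I).
    """
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)
```

The approximation error decays as O(1/sqrt(n_features)); the restriction to stationary, positive-definite kernels mentioned above comes from Bochner's theorem, which underlies this sampling scheme.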
no code implementations • 5 Jan 2021 • Sheng-Hao Wang, Kun Fang, Xiao-Jun Bi, Peng-Fei Yin
The TeV $\gamma$-ray halo around the Geminga pulsar is an important indicator of cosmic-ray (CR) propagation in the local zone of the Galaxy as it reveals the spatial distribution of the electrons and positrons escaping from the pulsar.
High Energy Astrophysical Phenomena High Energy Physics - Phenomenology
2 code implementations • 23 Oct 2020 • Kun Fang, Qinghua Tao, Yingwen Wu, Tao Li, Jia Cai, Feipeng Cai, Xiaolin Huang, Jie Yang
In this way, the proposed DIO augments the model and enhances the robustness of the DNN itself, as the learned features can be corrected by these mutually orthogonal paths.
no code implementations • 28 Sep 2020 • Kun Fang, Xiaolin Huang, Yingwen Wu, Tao Li, Jie Yang
To defend against adversarial attacks, we design a block containing multiple paths to learn robust features, with the parameters of these paths required to be orthogonal to each other.
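One common way to encourage such orthogonality between paths is a soft penalty on the Gram matrix of the (flattened, normalized) path parameters; the sketch below is an illustrative form of this idea, not necessarily the exact regularizer used in the paper.

```python
import numpy as np

def orthogonality_penalty(weights):
    """Soft orthogonality regularizer for a list of path weight vectors.

    weights: list of 1-D arrays, one (flattened) parameter vector per path.
    Penalizes deviation of the Gram matrix W W^T from the identity, pushing
    distinct paths toward mutually orthogonal directions.
    """
    W = np.stack([w / np.linalg.norm(w) for w in weights])  # unit rows
    gram = W @ W.T
    off = gram - np.eye(len(weights))
    return float((off ** 2).sum())
```

In training, such a penalty would be added to the task loss with a weighting coefficient and minimized jointly with the network parameters.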
1 code implementation • 10 Sep 2020 • Kun Fang, Fanghui Liu, Xiaolin Huang, Jie Yang
In the second stage, a linear learner is trained on the mapped random features.
1 code implementation • 12 Sep 2019 • Kun Fang, Hamza Fawzi
We present a systematic study of the geometric Rényi divergence (GRD), also known as the maximal Rényi divergence, from the point of view of quantum information theory.
Quantum Physics Information Theory Mathematical Physics
no code implementations • 4 Jun 2019 • Bartosz Regula, Kun Fang, Xin Wang, Mile Gu
We show in particular that the $\varepsilon$-error one-shot distillable entanglement of any pure state is the same under all sets of operations ranging from one-way LOCC to separability-preserving operations or operations preserving the set of states with positive partial transpose, and can be computed exactly as a quadratically constrained linear program.
Quantum Physics Mathematical Physics
1 code implementation • 19 Jun 2017 • Kun Fang, Xin Wang, Marco Tomamichel, Runyao Duan
For isotropic states, it can be further simplified to a linear program.
Quantum Physics