no code implementations • 26 Mar 2024 • Yiqun Chen, Jiaxin Mao, Yi Zhang, Dehong Ma, Long Xia, Jun Fan, Daiting Shi, Zhicong Cheng, Simiu Gu, Dawei Yin
The objective of search result diversification (SRD) is to ensure that selected documents cover as many different subtopics as possible.
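To make the coverage objective concrete, a minimal greedy selection sketch (our own illustration, not the method proposed in this paper) picks, at each step, the document that adds the most not-yet-covered subtopics; the subtopic annotations below are hypothetical.

```python
def greedy_diversify(candidates, k):
    """Greedily pick k documents so that they cover as many subtopics as possible.

    candidates: dict mapping a document id to the set of subtopics it covers
    (a hypothetical representation; real SRD systems infer subtopics from queries).
    """
    selected, covered = [], set()
    for _ in range(k):
        # choose the document that adds the most not-yet-covered subtopics
        best = max(candidates, key=lambda d: len(candidates[d] - covered), default=None)
        if best is None:
            break
        selected.append(best)
        covered |= candidates.pop(best)
    return selected

# toy usage: three documents annotated with the subtopics they address
docs = {"d1": {"price", "review"}, "d2": {"price"}, "d3": {"history", "review"}}
print(greedy_diversify(docs, 2))  # ['d1', 'd3'] covers all three subtopics
```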
no code implementations • 5 Jan 2024 • Zhongjie Shi, Jun Fan, Linhao Song, Ding-Xuan Zhou, Johan A. K. Suykens
With the rapid development of deep learning in various fields of science and technology, such as speech recognition, image classification, and natural language processing, it has recently also been widely applied to functional data analysis (FDA), with some empirical success.
no code implementations • 12 May 2023 • Zhan Yu, Jun Fan, Zhongjie Shi, Ding-Xuan Zhou
In the information era, to address the big data challenges that stem from functional data analysis, we propose a novel distributed gradient descent functional learning (DGDFL) algorithm that tackles functional data spread across numerous local machines (processors) in the framework of reproducing kernel Hilbert spaces.
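A rough sketch of the divide-and-conquer flavor of such distributed functional learning (local gradient descent on each machine, then averaging of the local estimators); the grid discretization and least-squares objective below are simplifying assumptions standing in for the RKHS formulation, not the DGDFL algorithm itself.

```python
import numpy as np

def local_gradient_descent(X, y, lr=0.1, steps=200):
    """Least-squares gradient descent on one machine.

    X: (n, d) matrix of functional covariates evaluated on a common grid
    (a crude stand-in for an RKHS representation); y: (n,) responses.
    """
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def distributed_estimate(partitions):
    """Average the local estimators, the usual divide-and-conquer step."""
    return np.mean([local_gradient_descent(X, y) for X, y in partitions], axis=0)

# toy usage: split synthetic data across 4 "machines"
rng = np.random.default_rng(0)
X, w_true = rng.normal(size=(400, 10)), rng.normal(size=10)
y = X @ w_true + 0.1 * rng.normal(size=400)
parts = [(X[i::4], y[i::4]) for i in range(4)]
print(np.linalg.norm(distributed_estimate(parts) - w_true))
```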
no code implementations • 10 Apr 2023 • Linhao Song, Jun Fan, Di-Rong Chen, Ding-Xuan Zhou
In recent years, functional neural networks have been proposed and studied in order to approximate nonlinear continuous functionals defined on $L^p([-1, 1]^s)$ for integers $s\ge1$ and $1\le p<\infty$.
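One elementary way to realize a functional neural network in practice is to sample the input function on a fixed grid and pass the samples through an ordinary network; the sketch below is only an illustration of that idea, not the constructions whose approximation rates are analyzed here.

```python
import numpy as np

def functional_net(f, grid, W1, b1, w2, b2):
    """Evaluate a tiny 'functional' network on an input function f.

    The functional is approximated by sampling f on a fixed grid of [-1, 1]
    and passing the samples through a one-hidden-layer ReLU network.
    """
    x = f(grid)                       # discretize the input function
    h = np.maximum(W1 @ x + b1, 0.0)  # hidden ReLU layer
    return w2 @ h + b2                # scalar output: the functional value

grid = np.linspace(-1.0, 1.0, 50)
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 50)) / 50, np.zeros(16)
w2, b2 = rng.normal(size=16), 0.0
print(functional_net(np.sin, grid, W1, b1, w2, b2))
```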
no code implementations • 11 Nov 2022 • Lianshang Cai, Linhao Zhang, Dehong Ma, Jun Fan, Daiting Shi, Yi Wu, Zhicong Cheng, Simiu Gu, Dawei Yin
In this paper, we focus on two key questions in knowledge distillation for ranking models: 1) how to ensemble knowledge from multiple teachers; 2) how to utilize the label information of the data in the distillation process.
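As a hedged sketch of the two ingredients named above (a multi-teacher ensemble and the use of label information during distillation), a listwise distillation loss for one query could be assembled roughly as follows; the averaging scheme, softmax targets, and weighting are illustrative assumptions, not the paper's method.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def distill_loss(student_scores, teacher_scores_list, labels, alpha=0.5):
    """Blend a multi-teacher soft target with the ground-truth label signal.

    student_scores: (n,) scores for the candidate documents of one query
    teacher_scores_list: list of (n,) score arrays, one per teacher
    labels: (n,) graded relevance labels; alpha trades off the two terms.
    """
    teacher_avg = np.mean(teacher_scores_list, axis=0)  # simple teacher ensemble
    p_student = softmax(student_scores)
    soft_loss = -np.sum(softmax(teacher_avg) * np.log(p_student + 1e-12))
    hard_loss = -np.sum(softmax(labels.astype(float)) * np.log(p_student + 1e-12))
    return alpha * soft_loss + (1 - alpha) * hard_loss

# toy usage: one query with 4 candidate documents and 2 teachers
s = np.array([0.2, 1.5, -0.3, 0.1])
teachers = [np.array([0.0, 2.0, -1.0, 0.5]), np.array([0.3, 1.8, -0.5, 0.0])]
print(distill_loss(s, teachers, labels=np.array([0, 2, 0, 1])))
```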
no code implementations • 29 Oct 2021 • Keli Guo, Jun Fan, Lixing Zhu
In this paper, we establish minimax optimal rates of convergence for prediction in a semi-functional linear model that consists of a functional component and a less smooth nonparametric component.
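For readers unfamiliar with this model class, a semi-functional linear model is typically written (in generic notation, not necessarily the paper's) as $Y = \int_0^1 X(t)\,\beta(t)\,dt + g(Z) + \varepsilon$, where $X$ is a functional covariate with slope function $\beta$, $g$ is the less smooth nonparametric component in a scalar covariate $Z$, and $\varepsilon$ is noise.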
no code implementations • 20 Apr 2021 • Jean-Christophe Breton, Youssef El-Khatib, Jun Fan, Nicolas Privault
We propose an extension of the Cox-Ross-Rubinstein (CRR) model based on $q$-binomial (or Kemp) random walks, with application to default with logistic failure rates.
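For context, the classical CRR baseline being extended prices a European option on a recombining binomial tree; a minimal sketch of that baseline (of the standard model only, not of the $q$-binomial/Kemp extension) is:

```python
import math

def crr_call_price(S0, K, r, sigma, T, n):
    """European call under the classical Cox-Ross-Rubinstein binomial model."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))   # up factor
    d = 1.0 / u                           # down factor
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    disc = math.exp(-r * T)
    price = 0.0
    for k in range(n + 1):                # terminal nodes of the binomial walk
        prob = math.comb(n, k) * p**k * (1 - p)**(n - k)
        price += prob * max(S0 * u**k * d**(n - k) - K, 0.0)
    return disc * price

print(crr_call_price(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=200))
```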
no code implementations • 27 Oct 2020 • ZiHao Wang, Zhifei Xu, Jiayi He, Chulsoon Hwang, Jun Fan, Hervé Delingette
In this work, we propose a neuromorphic-hardware-based signal equalizer built on a deep learning implementation.
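Conceptually, a learned equalizer maps a sliding window of received samples back to the transmitted symbol; the small feed-forward sketch below illustrates that idea only, with tap length and network shape chosen arbitrarily, and says nothing about the neuromorphic hardware design of the paper.

```python
import numpy as np

def equalize(received, W1, b1, w2, b2, taps=5):
    """Feed-forward equalizer: a sliding window of received samples is mapped
    through a small ReLU network to an estimate of the transmitted symbol."""
    out = []
    pad = np.concatenate([np.zeros(taps - 1), received])  # zero-pad the past
    for i in range(len(received)):
        window = pad[i:i + taps]
        h = np.maximum(W1 @ window + b1, 0.0)
        out.append(w2 @ h + b2)
    return np.array(out)

# toy usage with random (untrained) weights and a noisy binary sequence
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 5)) * 0.3, np.zeros(8)
w2, b2 = rng.normal(size=8) * 0.3, 0.0
rx = rng.choice([-1.0, 1.0], size=20) + 0.1 * rng.normal(size=20)
print(equalize(rx, W1, b1, w2, b2))
```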
no code implementations • 13 Aug 2019 • Jun Fan, Dao-Hong Xiang
This paper studies the binary classification problem associated with a family of loss functions called large-margin unified machines (LUM), which offers a natural bridge between distribution-based likelihood approaches and margin-based approaches.
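In one common parameterization of the LUM family (recalled here from the LUM literature and stated with generic parameters $a>0$, $c\ge 0$, so treat it as an assumption rather than this paper's exact notation), the loss of the functional margin $u = yf(x)$ is $V(u) = 1-u$ for $u < \frac{c}{1+c}$ and $V(u) = \frac{1}{1+c}\bigl(\frac{a}{(1+c)u - c + a}\bigr)^{a}$ otherwise; letting $c \to \infty$ yields a hinge-like, margin-based loss, which is the sense in which the family bridges the two approaches.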
no code implementations • 24 Oct 2018 • Zhongyang Zhang, Ling Zhang, Ze Sun, Nicholas Erickson, Ryan From, Jun Fan
Simulating the dynamic characteristics of a PN junction at the microscopic level requires solving Poisson's equation at every time step.
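To make the per-time-step cost concrete, a 1D finite-difference discretization of Poisson's equation $\phi''(x) = -\rho(x)/\epsilon$ with Dirichlet boundary values reduces to a banded linear solve; the grid, units, and boundary values in this sketch are illustrative, not the device simulator studied in the paper.

```python
import numpy as np

def solve_poisson_1d(rho, eps, dx, phi_left=0.0, phi_right=0.0):
    """Solve phi'' = -rho/eps on a 1D grid with Dirichlet boundary conditions
    using the standard second-order finite-difference stencil."""
    n = len(rho)
    A = np.zeros((n, n))
    b = -rho * dx**2 / eps
    for i in range(n):
        A[i, i] = -2.0
        if i > 0:
            A[i, i - 1] = 1.0
        if i < n - 1:
            A[i, i + 1] = 1.0
    b[0] -= phi_left    # fold the boundary values into the right-hand side
    b[-1] -= phi_right
    return np.linalg.solve(A, b)

# toy usage: uniform charge density on a 100-point grid
phi = solve_poisson_1d(rho=np.ones(100), eps=1.0, dx=0.01)
print(phi[:5])
```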
no code implementations • 20 Feb 2017 • Yunlong Feng, Jun Fan, Johan A. K. Suykens
However, it outperforms these regression models in terms of robustness, as shown in our study from a re-descending M-estimation viewpoint.
no code implementations • 17 Dec 2014 • Jun Fan, Ting Hu, Qiang Wu, Ding-Xuan Zhou
The error entropy consistency, which requires the error entropy of the learned function to approximate the minimum error entropy, is shown to be always true if the bandwidth parameter tends to 0 at an appropriate rate.
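For concreteness, the empirical minimum error entropy criterion is commonly implemented by maximizing a Parzen-window estimate of the information potential of the residuals with a Gaussian kernel of bandwidth $h$; the sketch below assumes that standard formulation (an assumption about the exact variant analyzed here).

```python
import numpy as np

def information_potential(errors, h):
    """Parzen-window estimate of the information potential of the residuals.

    Maximizing this quantity (equivalently, minimizing the estimate of Renyi's
    quadratic entropy) is the empirical MEE criterion; h is the bandwidth that
    the consistency result lets tend to 0 at an appropriate rate.
    """
    diffs = errors[:, None] - errors[None, :]
    kernel = np.exp(-diffs**2 / (2 * h**2)) / (np.sqrt(2 * np.pi) * h)
    return kernel.mean()

# toy usage: the same residuals evaluated at several bandwidths
errors = np.array([0.1, -0.2, 0.05, 0.15, -0.1])
for h in (1.0, 0.5, 0.1):
    print(h, information_potential(errors, h))
```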