Multi-kernel Correntropy Regression: Robustness, Optimality, and Application on Magnetometer Calibration

13 Apr 2023 · Shilei Li, Yunjiang Lou, Dawei Shi, Lijing Li, Ling Shi

This paper investigates the robustness and optimality of the multi-kernel correntropy (MKC) criterion in linear regression. We first derive an upper error bound for a scalar regression problem in the presence of arbitrarily large outliers and show that, to minimize this bound, the kernel bandwidth should be neither too small nor too large. We further find that the proposed MKC corresponds to a specific heavy-tailed distribution whose tail heaviness is controlled solely by the kernel bandwidth. Interestingly, this distribution reduces to the Gaussian distribution as the bandwidth tends to infinity, which allows one to tackle both Gaussian and non-Gaussian problems. We propose an expectation-maximization (EM) algorithm that alternately estimates the parameter vectors and the kernel bandwidths. The results show that our algorithm is equivalent to traditional linear regression under Gaussian noise and outperforms the conventional method under heavy-tailed noise. Both numerical simulations and experiments on a magnetometer calibration application verify the effectiveness of the proposed method.
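To illustrate the flavor of correntropy-based regression, the following is a minimal sketch of *single-kernel* maximum-correntropy regression solved by iteratively reweighted least squares (a standard half-quadratic approach). This is an assumption-laden illustration, not the paper's MKC-EM algorithm: the multi-kernel structure and the EM bandwidth updates are omitted, and the bandwidth `sigma` is held fixed.

```python
import numpy as np

def correntropy_regression(X, y, sigma=1.0, iters=50, tol=1e-8):
    """Robust linear regression under a Gaussian-kernel correntropy
    criterion, via iteratively reweighted least squares (IRLS).
    Illustrative sketch only; not the paper's MKC-EM method."""
    n, d = X.shape
    theta = np.linalg.lstsq(X, y, rcond=None)[0]  # ordinary least-squares init
    for _ in range(iters):
        r = y - X @ theta
        # Gaussian-kernel weights: large residuals (outliers) are downweighted,
        # and as sigma -> infinity all weights -> 1, recovering plain OLS.
        w = np.exp(-r**2 / (2.0 * sigma**2))
        A = X.T @ (X * w[:, None])            # X^T W X
        b = X.T @ (w * y)                     # X^T W y
        theta_new = np.linalg.solve(A + 1e-12 * np.eye(d), b)
        if np.linalg.norm(theta_new - theta) < tol:
            theta = theta_new
            break
        theta = theta_new
    return theta
```

Consistent with the abstract's observation, letting the bandwidth grow makes the weights uniform, so the estimator coincides with ordinary least squares; a small-but-not-too-small bandwidth suppresses arbitrarily large outliers while keeping enough inliers effective.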



