Search Results for author: Haitao Liu

Found 29 papers, 5 papers with code

Comprehensive Reassessment of Large-Scale Evaluation Outcomes in LLMs: A Multifaceted Statistical Approach

no code implementations • 22 Mar 2024 • Kun Sun, Rong Wang, Haitao Liu, Anders Søgaard

Evaluations have revealed that factors such as scaling, training type, and architecture profoundly impact the performance of LLMs.

ALI-DPFL: Differentially Private Federated Learning with Adaptive Local Iterations

no code implementations • 21 Aug 2023 • XinPeng Ling, Jie Fu, Kuncan Wang, Haitao Liu, Zhili Chen

Federated Learning (FL) is a distributed machine learning technique that allows model training among multiple devices or organizations by sharing training parameters instead of raw data.

Federated Learning
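
The entry above describes the core FL idea: clients share model parameters rather than raw data. As a rough, hedged sketch of that idea only (federated averaging on a toy least-squares problem, not the paper's ALI-DPFL algorithm, which adds differential privacy and adaptive local iterations), the `local_train` routine and the synthetic client data below are illustrative assumptions:

```python
import numpy as np

def local_train(weights, data, lr=0.1, steps=5):
    """Hypothetical local update: a few gradient steps on a least-squares loss."""
    X, y = data
    w = weights.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of 0.5*||Xw - y||^2 / n
        w -= lr * grad
    return w

def fedavg_round(global_w, client_datasets):
    """One federated-averaging round: only parameters leave the clients."""
    client_weights = [local_train(global_w, d) for d in client_datasets]
    sizes = np.array([len(d[1]) for d in client_datasets], dtype=float)
    # Average parameters, weighted by local dataset size.
    return np.average(client_weights, axis=0, weights=sizes)

rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(4)]
w = np.zeros(3)
for _ in range(10):
    w = fedavg_round(w, clients)
print(w)
```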

Robust Motion Averaging for Multi-view Registration of Point Sets Based Maximum Correntropy Criterion

no code implementations • 24 Aug 2022 • Yugeng Huang, Haitao Liu, Tian Huang

We also provide a novel strategy for determining the kernel width which ensures that our method can efficiently exploit information redundancy supplied by relative motions in the presence of many outliers.
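
A hedged aside on the maximum correntropy criterion mentioned above: it robustifies estimation by passing residuals through a Gaussian kernel, so large (outlier) residuals receive weights near zero. The sketch below uses a simple median-based kernel width purely for illustration; it is not the kernel-width strategy proposed in the paper.

```python
import numpy as np

def correntropy_weights(residuals, sigma=None):
    """Gaussian-kernel weights: large residuals (outliers) get weights near 0."""
    r = np.asarray(residuals, dtype=float)
    if sigma is None:
        # Illustrative width heuristic (not the paper's strategy): scale the
        # kernel by the median absolute residual.
        sigma = np.median(np.abs(r)) + 1e-12
    return np.exp(-r**2 / (2.0 * sigma**2))

print(correntropy_weights([0.01, 0.02, -0.03, 5.0]))  # the outlier gets ~0 weight
```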

Learning Multi-Task Gaussian Process Over Heterogeneous Input Domains

no code implementations • 25 Feb 2022 • Haitao Liu, Kai Wu, Yew-Soon Ong, Chao Bian, Xiaomo Jiang, Xiaofang Wang

Multi-task Gaussian process (MTGP) is a well-known non-parametric Bayesian model for learning correlated tasks effectively by transferring knowledge across tasks.

Dimensionality Reduction • Inductive Bias
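
For readers unfamiliar with MTGPs, a common way to encode task correlation is the intrinsic coregionalization model, where the multi-task covariance is the Kronecker product of a task-similarity matrix and an input kernel. The sketch below shows that generic construction only, not necessarily the kernel used in this paper over heterogeneous input domains; the toy inputs and hyperparameters are assumptions.

```python
import numpy as np

def rbf(X, lengthscale=1.0):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

# Task-similarity (coregionalization) matrix B = W W^T + diag(kappa), kept PSD.
W = np.array([[1.0], [0.8]])       # 2 tasks, rank-1 mixing
B = W @ W.T + np.diag([0.1, 0.1])

X = np.linspace(0, 1, 5)[:, None]  # shared toy inputs for both tasks
Kx = rbf(X)

# Multi-task covariance over all (task, input) pairs: K = B kron Kx.
K_multi = np.kron(B, Kx)           # shape (2*5, 2*5)
print(K_multi.shape)
```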

Scalable Multi-Task Gaussian Processes with Neural Embedding of Coregionalization

no code implementations • 20 Sep 2021 • Haitao Liu, Jiaqi Ding, Xinyu Xie, Xiaomo Jiang, Yusong Zhao, Xiaofang Wang

Multi-task regression attempts to exploit the task similarity in order to achieve knowledge transfer across related tasks for performance improvement.

Gaussian Processes • regression • +2

Deep Probabilistic Time Series Forecasting using Augmented Recurrent Input for Dynamic Systems

no code implementations • 3 Jun 2021 • Haitao Liu, Changjun Liu, Xiaomo Jiang, Xudong Chen, Shuhua Yang, Xiaofang Wang

We first investigate the methodological characteristics of the proposed deep probabilistic sequence model on toy cases, and then comprehensively demonstrate its superiority over existing deep probabilistic SSM models through extensive numerical experiments on eight system identification benchmarks from various dynamic systems.

Management • Probabilistic Time Series Forecasting • +1

Statistical patterns of word frequency suggesting the probabilistic nature of human languages

no code implementations • 1 Dec 2020 • Shuiyuan Yu, Chunshan Xu, Haitao Liu

Traditional linguistic theories have largely regarded language as a formal system composed of rigid rules.

Modulating Scalable Gaussian Processes for Expressive Statistical Learning

1 code implementation • 29 Aug 2020 • Haitao Liu, Yew-Soon Ong, Xiaomo Jiang, Xiaofang Wang

For a learning task, a Gaussian process (GP) learns the statistical relationship between inputs and outputs, offering not only the prediction mean but also the associated variability.

Gaussian Processes • Variational Inference
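
As the snippet above notes, a GP returns both a predictive mean and a predictive variance. The following is a minimal exact-GP regression sketch using the standard textbook formulas, not the modulated scalable GP developed in the paper; the data and hyperparameters are made up.

```python
import numpy as np

def rbf(A, B, lengthscale=0.5, variance=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(20, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.normal(size=20)
Xs = np.linspace(0, 1, 100)[:, None]
noise = 0.1**2

K = rbf(X, X) + noise * np.eye(len(X))
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

Ks = rbf(X, Xs)
mean = Ks.T @ alpha                           # predictive mean
v = np.linalg.solve(L, Ks)
var = rbf(Xs, Xs).diagonal() - (v**2).sum(0)  # predictive variance
print(mean[:3], var[:3])
```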

Deep Latent-Variable Kernel Learning

1 code implementation • 18 May 2020 • Haitao Liu, Yew-Soon Ong, Xiaomo Jiang, Xiaofang Wang

Deep kernel learning (DKL) leverages the connection between Gaussian process (GP) and neural networks (NN) to build an end-to-end, hybrid model.
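
A hedged sketch of the generic deep-kernel idea only, k(x, x') = k_rbf(g(x), g(x')), i.e. a base kernel applied to neural-network features; this is the standard DKL construction rather than the latent-variable variant developed in the paper, and the fixed-weight feature map is purely illustrative (in practice the network is trained jointly with the GP).

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(1, 8)), np.zeros(8)   # toy 1 -> 8 hidden layer
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)   # toy 8 -> 2 feature layer

def features(X):
    """Hypothetical NN feature extractor g(x) with fixed random weights."""
    return np.tanh(X @ W1 + b1) @ W2 + b2

def deep_rbf(A, B, lengthscale=1.0):
    """Deep kernel: an RBF kernel evaluated on NN features."""
    FA, FB = features(A), features(B)
    d2 = ((FA[:, None, :] - FB[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

X = np.linspace(-1, 1, 5)[:, None]
print(deep_rbf(X, X).shape)  # (5, 5) covariance from the hybrid NN+GP kernel
```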

Scalable Gaussian Process Classification with Additive Noise for Various Likelihoods

1 code implementation • 14 Sep 2019 • Haitao Liu, Yew-Soon Ong, Ziwei Yu, Jianfei Cai, Xiaobo Shen

Gaussian process classification (GPC) provides a flexible and powerful statistical framework describing joint distributions over function space.

Classification • General Classification • +3

Anomaly Detection via Graphical Lasso

1 code implementation • 10 Nov 2018 • Haitao Liu, Randy C. Paffenroth, Jian Zou, Chong Zhou

Accordingly, we propose a novel optimization problem that is similar in spirit to Robust Principal Component Analysis (RPCA) and splits the sample covariance matrix $M$ into two parts, $M=F+S$, where $F$ is the cleaned sample covariance whose inverse is sparse and computable by Graphical Lasso, and $S$ contains the outliers in $M$.

Anomaly Detection
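
The entry above splits the sample covariance as M = F + S. The paper's joint optimization is not reproduced here; as a hedged illustration of the Graphical Lasso building block alone (estimating a sparse inverse covariance from an empirical covariance), scikit-learn's `graphical_lasso` can be run on synthetic data:

```python
import numpy as np
from sklearn.covariance import empirical_covariance, graphical_lasso

rng = np.random.default_rng(0)
# Synthetic data whose true precision (inverse covariance) is sparse.
true_prec = np.array([[2.0, 0.6, 0.0],
                      [0.6, 2.0, 0.0],
                      [0.0, 0.0, 1.5]])
X = rng.multivariate_normal(np.zeros(3), np.linalg.inv(true_prec), size=500)

emp_cov = empirical_covariance(X)      # plays the role of M (here outlier-free)
cov_hat, prec_hat = graphical_lasso(emp_cov, alpha=0.05)
print(np.round(prec_hat, 2))           # recovered sparse precision matrix
```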

Large-scale Heteroscedastic Regression via Gaussian Process

no code implementations • 3 Nov 2018 • Haitao Liu, Yew-Soon Ong, Jianfei Cai

To improve the scalability, we first develop a variational sparse inference algorithm, named VSHGP, to handle large-scale datasets.

regression • Variational Inference
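
A hedged note on heteroscedastic GP regression in general (not the VSHGP algorithm above, which additionally models the noise with a second GP and uses sparse variational inference): the observation noise is input-dependent, so it enters the Gram matrix as a non-constant diagonal. The noise function below is an assumed toy example.

```python
import numpy as np

def rbf(A, B, lengthscale=0.3):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(50, 1))
noise_var = (0.05 + 0.3 * X[:, 0]) ** 2   # assumed input-dependent noise level
y = np.sin(2 * np.pi * X[:, 0]) + np.sqrt(noise_var) * rng.normal(size=50)

# Heteroscedastic GP: per-point noise on the diagonal instead of a single scalar.
K = rbf(X, X) + np.diag(noise_var)
Xs = np.linspace(0, 1, 5)[:, None]
mean = rbf(X, Xs).T @ np.linalg.solve(K, y)
print(np.round(mean, 3))
```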

Understanding and Comparing Scalable Gaussian Process Regression for Big Data

no code implementations • 3 Nov 2018 • Haitao Liu, Jianfei Cai, Yew-Soon Ong, Yi Wang

This paper is devoted to investigating the methodological characteristics and performance of representative global and local scalable GPs, including sparse approximations and local aggregations, from four main perspectives: scalability, capability, controllability and robustness.

regression

Zipf's law in 50 languages: its structural pattern, linguistic interpretation, and cognitive motivation

no code implementations • 5 Jul 2018 • Shuiyuan Yu, Chunshan Xu, Haitao Liu

A computer simulation based on the dual-process theory yields Zipf's law with the same structural pattern, suggesting that Zipf's law in natural languages is motivated by common cognitive mechanisms.
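
This is not the paper's dual-process simulation, but a quick, hedged illustration of how a Zipf-style rank-frequency pattern is usually checked: count word frequencies, rank them, and fit the slope on log-log axes (the toy text below is an assumption).

```python
import numpy as np
from collections import Counter

text = ("the quick brown fox jumps over the lazy dog and the dog barks at "
        "the fox while the quick fox runs").split()
freqs = np.array(sorted(Counter(text).values(), reverse=True), dtype=float)
ranks = np.arange(1, len(freqs) + 1)

# Zipf's law predicts frequency ~ rank^(-a); estimate a on log-log axes.
slope, intercept = np.polyfit(np.log(ranks), np.log(freqs), 1)
print(f"estimated Zipf exponent: {-slope:.2f}")
```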

When Gaussian Process Meets Big Data: A Review of Scalable GPs

no code implementations • 3 Jul 2018 • Haitao Liu, Yew-Soon Ong, Xiaobo Shen, Jianfei Cai

The review of scalable GPs in the GP community is timely and important due to the explosion of data size.

Generalized Robust Bayesian Committee Machine for Large-scale Gaussian Process Regression

1 code implementation • ICML 2018 • Haitao Liu, Jianfei Cai, Yi Wang, Yew-Soon Ong

In order to scale standard Gaussian process (GP) regression to large-scale datasets, aggregation models employ a factorized training process and then combine predictions from distributed experts.

Distributed Computing • regression
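
As a rough illustration of the "combine predictions from distributed experts" step, the classical Bayesian committee machine aggregates expert predictive means and variances with precision weights. The sketch below is the standard BCM rule, not necessarily the robust or generalized variants discussed in the paper, and the expert outputs are made-up numbers.

```python
import numpy as np

def bcm_aggregate(means, variances, prior_var):
    """Standard BCM aggregation of M experts at a single test point."""
    means, variances = np.asarray(means), np.asarray(variances)
    M = len(means)
    prec = np.sum(1.0 / variances) - (M - 1) / prior_var  # aggregated precision
    var = 1.0 / prec
    mean = var * np.sum(means / variances)
    return mean, var

# Made-up expert predictions at one test input (GP prior variance = 1.0).
mu, var = bcm_aggregate(means=[0.9, 1.1, 1.0],
                        variances=[0.2, 0.3, 0.25],
                        prior_var=1.0)
print(round(mu, 3), round(var, 3))
```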

Existence of Hierarchies and Human's Pursuit of Top Hierarchy Lead to Power Law

no code implementations • 24 Sep 2016 • Shuiyuan Yu, Junying Liang, Haitao Liu

The power law is ubiquitous in natural and social phenomena, and is considered a universal relationship between frequency and rank for diverse social systems.

The distribution of information content in English sentences

no code implementations • 24 Sep 2016 • Shuiyuan Yu, Jin Cong, Junying Liang, Haitao Liu

The sentence is a basic linguistic unit; however, little is known about how information content is distributed across different positions of a sentence.

Position • Sentence

Dependency length minimization: Puzzles and Promises

no code implementations • 15 Sep 2015 • Haitao Liu, Chunshan Xu, Junying Liang

In a recent issue of PNAS, Futrell et al. claim that their study of 37 languages gives the first large-scale cross-language evidence for Dependency Length Minimization, an overstatement that ignores similar previous research.

The influence of Chunking on Dependency Crossing and Distance

no code implementations • 3 Sep 2015 • Qian Lu, Chunshan Xu, Haitao Liu

These results suggest that chunking may play a vital role in the minimization of dependency distance, and a somewhat contributing role in the rarity of dependency crossing.

Chunking

The risks of mixing dependency lengths from sequences of different length

no code implementations • 13 Apr 2013 • Ramon Ferrer-i-Cancho, Haitao Liu

However, the empirical distribution of dependency lengths of sentences of the same length differs from that of sentences of varying length. Moreover, the distribution of dependency lengths depends on sentence length, both for real sentences and under the null hypothesis that dependencies connect vertices located at random positions in the sequence.

Sentence
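
For readers outside dependency linguistics, the dependency length of a word is simply the distance between its position and its head's position; the paper studies how these lengths are distributed and how the distribution depends on sentence length. A minimal sketch on one hypothetical parsed sentence (the head indices are assumptions; 0 marks the root):

```python
# Toy dependency tree for "She read a long book":
# heads[i] is the 1-based position of word i's head; 0 marks the root.
heads = [2, 0, 5, 5, 2]

def dependency_lengths(heads):
    """Absolute distance between each dependent and its head (root excluded)."""
    return [abs((i + 1) - h) for i, h in enumerate(heads) if h != 0]

lengths = dependency_lengths(heads)
print(lengths, sum(lengths) / len(lengths))  # per-dependency lengths and their mean
```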
