Search Results for author: Zhidong Li

Found 6 papers, 0 papers with code

De-biased Representation Learning for Fairness with Unreliable Labels

no code implementations • 1 Aug 2022 • Yixuan Zhang, Feng Zhou, Zhidong Li, Yang Wang, Fang Chen

In other words, fair pre-processing methods ignore the discrimination encoded in the labels, either during the learning procedure or at the evaluation stage.

Fairness • Representation Learning

Bias-Tolerant Fair Classification

no code implementations • 7 Jul 2021 • Yixuan Zhang, Feng Zhou, Zhidong Li, Yang Wang, Fang Chen

Therefore, we propose a Bias-Tolerant FAir Regularized Loss (B-FARL), which tries to regain the benefits of learning from data affected by label bias and selection bias.
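The snippet does not spell out the loss itself. As a rough, hedged sketch of a fairness-regularized objective in this spirit (the function name, the `lambda_fair` weight, and the demographic-parity-style penalty are illustrative assumptions, not the paper's actual B-FARL formulation):

```python
import torch
import torch.nn.functional as F

def fairness_regularized_loss(logits, labels, groups, lambda_fair=0.1):
    """Illustrative fairness-regularized loss (not the paper's B-FARL).

    Adds a demographic-parity-style penalty, the absolute gap between the
    two groups' mean positive-class probabilities, to the usual
    cross-entropy on the (possibly biased) observed labels.
    """
    ce = F.cross_entropy(logits, labels)

    # Predicted probability of the positive class for each sample.
    p_pos = torch.softmax(logits, dim=1)[:, 1]

    # groups is a 0/1 tensor encoding protected-group membership.
    gap = (p_pos[groups == 0].mean() - p_pos[groups == 1].mean()).abs()

    return ce + lambda_fair * gap
```

The regularizer here stands in for whatever bias-correcting term B-FARL actually uses; the point is only the shape of the objective: a task loss plus a weighted fairness penalty.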

Classification • Fairness +2

Long-Term Pipeline Failure Prediction Using Nonparametric Survival Analysis

no code implementations • 11 Nov 2020 • Dilusha Weeraddana, Sudaraka MallawaArachchi, Tharindu Warnakula, Zhidong Li, Yang Wang

We applied Machine Learning techniques to find a cost-effective solution to the pipe failure problem in these Australian cities, where on average 1,500 water main failures occur each year.
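The snippet names the general technique (nonparametric survival analysis) but not the paper's concrete model, so the following is only a minimal from-scratch Kaplan-Meier sketch of how pipe survival curves could be estimated nonparametrically; the data semantics and toy numbers are assumptions:

```python
import numpy as np

def kaplan_meier(durations, observed):
    """Kaplan-Meier product-limit estimate of the survival function.

    durations: pipe age (years) at failure or at censoring.
    observed:  1 if a failure was observed, 0 if the record is censored.
    Returns the distinct failure times and the estimated S(t) at each.
    """
    durations = np.asarray(durations, dtype=float)
    observed = np.asarray(observed, dtype=int)

    times = np.sort(np.unique(durations[observed == 1]))
    surv, s = [], 1.0
    for t in times:
        at_risk = np.sum(durations >= t)                      # pipes still in service at t
        failures = np.sum((durations == t) & (observed == 1))
        s *= 1.0 - failures / at_risk                         # product-limit update
        surv.append(s)
    return times, np.array(surv)

# Toy usage with made-up pipe lifetimes (years) and censoring flags.
t, S = kaplan_meier([12, 30, 30, 45, 60, 60], [1, 1, 0, 1, 0, 1])
```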

BIG-bench Machine Learning • Survival Analysis

Utilizing machine learning to prevent water main breaks by understanding pipeline failure drivers

no code implementations • 5 Jun 2020 • Dilusha Weeraddana, Bin Liang, Zhidong Li, Yang Wang, Fang Chen, Livia Bonazzi, Dean Phillips, Nitin Saxena

Data61 and Western Water worked collaboratively to apply engineering expertise and Machine Learning tools to find a cost-effective solution to the pipe failure problem in the region west of Melbourne, where on average 400 water main failures occur per year.

BIG-bench Machine Learning

Scalable Inference for Nonparametric Hawkes Process Using Pólya-Gamma Augmentation

no code implementations • 29 Oct 2019 • Feng Zhou, Zhidong Li, Xuhui Fan, Yang Wang, Arcot Sowmya, Fang Chen

In this paper, we consider the sigmoid Gaussian Hawkes process model: the baseline intensity and the triggering kernel of the Hawkes process are both modeled as sigmoid transformations of random trajectories drawn from Gaussian processes (GPs).
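Read literally, that verbal description corresponds to a conditional intensity of roughly the following form (the scaling of the sigmoids and the exact notation are assumptions, since the snippet only describes the model in words):

```latex
\lambda(t) \;=\; \underbrace{\sigma\big(f(t)\big)}_{\text{baseline intensity}}
\;+\; \sum_{t_i < t} \underbrace{\sigma\big(g(t - t_i)\big)}_{\text{triggering kernel}},
\qquad f, g \sim \mathcal{GP}, \qquad \sigma(x) = \frac{1}{1 + e^{-x}}.
```

The sigmoid link keeps both components non-negative while the GP priors leave their shapes nonparametric, and logistic-type likelihoods of this kind are exactly the structure that Pólya-Gamma augmentation is designed to handle.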

Bayesian Inference • Gaussian Processes +1

Efficient EM-Variational Inference for Hawkes Process

no code implementations • 29 May 2019 • Feng Zhou, Zhidong Li, Xuhui Fan, Yang Wang, Arcot Sowmya, Fang Chen

In the classical Hawkes process, the baseline intensity and the triggering kernel are assumed to be a constant and a parametric function, respectively, which limits the model's flexibility.
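For contrast with the nonparametric model above, the classical Hawkes intensity referred to here has a constant baseline and a parametric triggering kernel; with the commonly used exponential kernel (chosen here only as an example) it reads:

```latex
\lambda(t) \;=\; \mu \;+\; \sum_{t_i < t} \alpha\, e^{-\beta (t - t_i)},
\qquad \mu, \alpha, \beta > 0,
```

so the whole model is governed by a few scalar parameters, which is the restriction on flexibility that the snippet points to.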

Variational Inference
