Search Results for author: Xianghang Liu

Found 5 papers, 0 papers with code

FedFNN: Faster Training Convergence Through Update Predictions in Federated Recommender Systems

no code implementations • 14 Sep 2023 • Francesco Fabbri, Xianghang Liu, Jack R. McKenzie, Bartlomiej Twardowski, Tri Kurniawan Wijaya

Federated Learning (FL) has emerged as a key approach for distributed machine learning, enhancing online personalization while ensuring user data privacy.

Federated Learning • Recommendation Systems
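To make the federated setting concrete, here is a minimal sketch of the generic federated-averaging loop that such systems build on: clients train on their own private data and only parameter updates reach the server. This is an illustration of plain FedAvg under toy assumptions (a 1-D linear model, synthetic client data, illustrative names like `local_step` and `federated_round`), not the FedFNN method, which additionally predicts client updates to speed up convergence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: each client holds private (x, y) pairs for a 1-D linear model y ~ w*x.
true_w = 3.0
clients = []
for _ in range(4):
    x = rng.normal(size=20)
    y = true_w * x + rng.normal(scale=0.1, size=x.shape)
    clients.append((x, y))

def local_step(w, x, y, lr=0.05, epochs=5):
    """Run a few gradient steps using only the client's private data."""
    for _ in range(epochs):
        grad = np.mean(2 * (w * x - y) * x)   # d/dw of the squared error
        w -= lr * grad
    return w

def federated_round(w_global):
    """Each client trains locally; the server only sees the resulting parameters."""
    local_weights = [local_step(w_global, x, y) for x, y in clients]
    return float(np.mean(local_weights))      # FedAvg-style aggregation

w = 0.0
for _ in range(10):
    w = federated_round(w)
print(f"global weight after 10 rounds: {w:.3f} (true value {true_w})")
```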

Projecting Markov Random Field Parameters for Fast Mixing

no code implementations • NeurIPS 2014 • Xianghang Liu, Justin Domke

Markov chain Monte Carlo (MCMC) algorithms are simple and extremely powerful techniques to sample from almost arbitrary distributions.
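As an illustration of the kind of MCMC chain whose mixing speed this paper targets, here is a minimal single-site Gibbs sampler on a small Ising-style grid. The grid size and coupling strength are arbitrary illustration values, and the sampler is the textbook construction rather than anything specific to the paper's projection method.

```python
import numpy as np

rng = np.random.default_rng(1)
n, coupling = 8, 0.3                      # 8x8 grid, pairwise strength theta
x = rng.choice([-1, 1], size=(n, n))      # random initial spin configuration

def neighbors_sum(x, i, j):
    """Sum of the (up to four) neighboring spins, free boundary conditions."""
    s = 0.0
    if i > 0:     s += x[i - 1, j]
    if i < n - 1: s += x[i + 1, j]
    if j > 0:     s += x[i, j - 1]
    if j < n - 1: s += x[i, j + 1]
    return s

def gibbs_sweep(x):
    """Resample every spin from its conditional p(x_ij | neighbors)."""
    for i in range(n):
        for j in range(n):
            field = coupling * neighbors_sum(x, i, j)
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * field))   # P(x_ij = +1 | rest)
            x[i, j] = 1 if rng.random() < p_plus else -1
    return x

for _ in range(100):                      # run the chain; states approximate the MRF
    x = gibbs_sweep(x)
print("mean magnetization after 100 sweeps:", x.mean())
```

How quickly such a chain forgets its initial configuration depends strongly on the parameters, which is exactly why projecting the parameters into a fast-mixing set is useful.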

Projecting Ising Model Parameters for Fast Mixing

no code implementations • NeurIPS 2013 • Justin Domke, Xianghang Liu

Inference in general Ising models is difficult, due to high treewidth making tree-based algorithms intractable.
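For reference, the standard pairwise Ising model has the form below; the intractability comes from the partition function Z, which sums over all 2^n configurations when the graph has high treewidth. The notation (theta_ij, theta_i over edges E) follows common usage and is not necessarily the paper's.

```latex
p(x) = \frac{1}{Z(\theta)} \exp\!\Big( \sum_{(i,j) \in E} \theta_{ij} x_i x_j + \sum_{i} \theta_i x_i \Big),
\qquad
Z(\theta) = \sum_{x \in \{-1,+1\}^n} \exp\!\Big( \sum_{(i,j) \in E} \theta_{ij} x_i x_j + \sum_{i} \theta_i x_i \Big).
```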

Learning as MAP Inference in Discrete Graphical Models

no code implementations • NeurIPS 2012 • Xianghang Liu, James Petterson, Tibério S. Caetano

Instead of relying on convex losses and regularisers such as those used in SVMs, logistic regression and boosting, or on non-convex but continuous formulations such as those encountered in neural networks and deep belief networks, our framework entails a non-convex but discrete formulation, where estimation amounts to finding a MAP configuration in a graphical model whose potential functions are low-dimensional discrete surrogates for the misclassification loss.

Binary Classification • Feature Selection
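A toy sketch of the discrete-estimation idea: restrict each weight of a linear classifier to a small grid and pick the combination with the lowest 0-1 (misclassification) loss. The paper does this via MAP inference in a graphical model; the exhaustive search below, with a made-up 2-D dataset and grid, only illustrates the discrete formulation and is not their algorithm.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)

# Toy binary classification data: labels from a noisy linear rule in 2-D.
X = rng.normal(size=(200, 2))
y = np.sign(X @ np.array([1.5, -2.0]) + 0.3 * rng.normal(size=200))

grid = np.linspace(-2.0, 2.0, 9)          # low-dimensional discrete domain per weight

def zero_one_loss(w):
    """Number of misclassifications made by the linear classifier sign(X @ w)."""
    return int(np.sum(np.sign(X @ np.asarray(w)) != y))

# On this tiny model the "MAP" search degenerates to exhaustive enumeration (9^2 candidates).
best_w = min(itertools.product(grid, repeat=2), key=zero_one_loss)
print("best discrete weights:", best_w, "errors:", zero_one_loss(best_w))
```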
