Search Results for author: Weichi Yao

Found 8 papers, 6 papers with code

Towards fully covariant machine learning

no code implementations · 31 Jan 2023 · Soledad Villar, David W. Hogg, Weichi Yao, George A. Kevrekidis, Bernhard Schölkopf

We discuss links to causal modeling, and argue that the implementation of passive symmetries is particularly valuable when the goal of the learning problem is to generalize out of sample.

Dimensionless machine learning: Imposing exact units equivariance

1 code implementation · 2 Apr 2022 · Soledad Villar, Weichi Yao, David W. Hogg, Ben Blum-Smith, Bianca Dumitrascu

Units equivariance (or units covariance) is the exact symmetry that follows from the requirement that relationships among physically relevant measured quantities must obey self-consistent dimensional scalings.

BIG-bench Machine Learning · Symbolic Regression
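To make the symmetry above concrete, here is a minimal sketch (mine, not the paper's code) of units equivariance for a formula whose dimensions are known: rescaling the inputs by a consistent change of units rescales the output by the corresponding power of those units. The paper's contribution is to impose this constraint on learned models; the pendulum formula below is only an illustration.

```python
import numpy as np

def pendulum_period(length_m, g_mps2):
    """Dimensionally consistent formula: the output has units of time."""
    return 2 * np.pi * np.sqrt(length_m / g_mps2)

L, g = 2.0, 9.81

# Units equivariance: rescaling lengths by s_len and times by s_time must
# rescale the inputs and the output by the matching powers of those factors.
s_len, s_time = 100.0, 1000.0          # e.g. metres -> cm, seconds -> ms
scaled = pendulum_period(L * s_len, g * s_len / s_time**2)
assert np.isclose(scaled, pendulum_period(L, g) * s_time)
```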

Scalars are universal: Equivariant machine learning, structured like classical physics

2 code implementations · NeurIPS 2021 · Soledad Villar, David W. Hogg, Kate Storey-Fisher, Weichi Yao, Ben Blum-Smith

There has been enormous progress in the last few years in designing neural networks that respect the fundamental symmetries and coordinate freedoms of physical law.

BIG-bench Machine Learning · Translation
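As I read the title of the paper above, the central construction is that an orthogonally equivariant vector-valued function can be written as a linear combination of the input vectors with coefficients that depend only on their pairwise inner products (scalars). A rough numpy sketch of that idea, with a fixed coefficient function standing in for a learned one:

```python
import numpy as np

def equivariant_map(vectors):
    """Output = sum_i f_i(scalars) * v_i, with coefficients built from inner products only."""
    gram = vectors @ vectors.T                 # invariant scalars (Gram matrix)
    coeffs = np.tanh(gram).sum(axis=1)         # one scalar coefficient per input vector
    return coeffs @ vectors                    # a vector that transforms with the inputs

rng = np.random.default_rng(1)
v = rng.normal(size=(4, 3))
q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orthogonal transformation

# Equivariance check: transforming the inputs transforms the output the same way.
assert np.allclose(equivariant_map(v @ q.T), equivariant_map(v) @ q.T)
```

The check passes because the Gram matrix is unchanged by the orthogonal transformation, so the coefficients are invariant and the output rotates exactly like the inputs.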

Dynamic estimation with random forests for discrete-time survival data

1 code implementation · 1 Mar 2021 · Hoora Moradian, Weichi Yao, Denis Larocque, Jeffrey S. Simonoff, Halina Frydman

Time-varying covariates are often available in survival studies, and estimation of the hazard function needs to be updated as new information becomes available.

Methodology · Applications
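A hedged sketch of the standard discrete-time setup that this line of work builds on: each subject is expanded into one row per time period at risk, covariates may change between periods, and the hazard is estimated as the probability of the event in a period given survival up to it. The data, column names, and the scikit-learn classifier below are my own illustrative choices, not the paper's forest implementation.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Person-period ("long") format: one row per subject per discrete period at risk,
# with a time-varying covariate and an indicator of whether the event occurred.
long = pd.DataFrame({
    "id":        [1, 1, 1, 2, 2, 3],
    "period":    [1, 2, 3, 1, 2, 1],
    "biomarker": [0.2, 0.5, 0.9, 0.1, 0.1, 0.8],   # time-varying covariate
    "event":     [0, 0, 1, 0, 0, 1],               # 1 only in the period the event happens
})

# Discrete-time hazard h(t | x_t) = P(event in period t | at risk at t), fit as classification.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(long[["period", "biomarker"]], long["event"])

# Predicted hazards chain into a survival curve: S(t) = prod_{s <= t} (1 - h(s)).
new = pd.DataFrame({"period": [1, 2, 3], "biomarker": [0.3, 0.4, 0.6]})
hazard = model.predict_proba(new)[:, 1]
survival = np.cumprod(1 - hazard)
print(survival)
```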

Ensemble methods for survival function estimation with time-varying covariates

2 code implementations · 31 May 2020 · Weichi Yao, Halina Frydman, Denis Larocque, Jeffrey S. Simonoff

We compare their performance with that of the Cox model and the transformation forest, adapted here to accommodate time-varying covariates, through a comprehensive simulation study in which the Kaplan-Meier estimate serves as a benchmark and performance is measured by the integrated L2 difference between the true and estimated survival functions.
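For concreteness, a small numpy-only sketch of the integrated L2 criterion mentioned above; the time grid and survival curves are made up for illustration, not simulation settings from the paper.

```python
import numpy as np

def integrated_l2(times, s_true, s_est):
    """Trapezoidal approximation of the integral of (S_true - S_est)^2 over the time grid."""
    diff2 = (s_true - s_est) ** 2
    return np.sum(0.5 * (diff2[1:] + diff2[:-1]) * np.diff(times))

# Illustrative curves: a true exponential survival function and a slightly biased estimate.
t = np.linspace(0.0, 5.0, 200)
s_true = np.exp(-0.5 * t)
s_est = np.exp(-0.6 * t)

print(f"integrated L2 difference: {integrated_l2(t, s_true, s_est):.4f}")
```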

Experimental performance of graph neural networks on random instances of max-cut

no code implementations · 15 Aug 2019 · Weichi Yao, Afonso S. Bandeira, Soledad Villar

In particular, we consider Graph Neural Networks (GNNs), a class of neural networks designed to learn functions on graphs, and apply them to the max-cut problem on random regular graphs.
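The GNN itself is not sketched here, but the objective it is scored on is easy to state: for a symmetric 0/1 adjacency matrix A and spins x in {-1, +1}^n, the cut value is (1/4) * sum_ij A_ij (1 - x_i x_j). Below is a hedged sketch that samples a random regular graph (assuming networkx is available) and scores a naive random-assignment baseline; this baseline is my own stand-in, not one of the methods compared in the paper.

```python
import numpy as np
import networkx as nx   # used only to sample a random regular graph

def cut_value(adj, spins):
    """Max-cut objective: number of edges whose endpoints receive different labels (+1/-1)."""
    return 0.25 * np.sum(adj * (1 - np.outer(spins, spins)))

# Random d-regular graph, the instance family studied in the paper.
n, d = 100, 3
graph = nx.random_regular_graph(d, n, seed=0)
adj = nx.to_numpy_array(graph)

# Naive baseline: best of many uniformly random spin assignments.
rng = np.random.default_rng(0)
best = max(cut_value(adj, rng.choice([-1, 1], size=n)) for _ in range(200))
print(f"best random cut out of 200 tries: {best:.0f} of {graph.number_of_edges()} edges")
```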
