1 code implementation • 31 Jan 2022 • Alexander Soen, Ibrahim Alabdulmohsin, Sanmi Koyejo, Yishay Mansour, Nyalleng Moorosi, Richard Nock, Ke Sun, Lexing Xie
We introduce a new family of techniques to post-process ("wrap") a black-box classifier in order to reduce its bias.
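As a rough sketch of what post-processing a fixed black-box classifier can look like (a generic per-group thresholding illustration, not the specific wrapping technique of the paper; `score_fn`, `X_cal`, and `group_cal` are hypothetical names):

```python
import numpy as np

def wrap_classifier(score_fn, X_cal, group_cal, target_rate=0.5):
    """Generic post-processing wrapper: choose a per-group threshold on the
    black-box scores so that each group's positive-prediction rate on the
    calibration data matches target_rate (demographic-parity style).
    Illustrative only; not the paper's wrapping construction."""
    scores = score_fn(X_cal)
    thresholds = {
        g: np.quantile(scores[group_cal == g], 1.0 - target_rate)
        for g in np.unique(group_cal)
    }

    def wrapped(X, group):
        s = score_fn(X)
        t = np.array([thresholds[g] for g in group])
        return (s >= t).astype(int)

    return wrapped
```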
1 code implementation • 3 Nov 2021 • Pio Calderon, Alexander Soen, Marian-Andrei Rizoiu
This work introduces the Partial Mean Behavior Poisson (PMBP) process, a novel multivariate temporal point process that can be leveraged to fit the multivariate Hawkes process to partially interval-censored data, i.e. a mix of event timestamps on a subset of dimensions and interval-censored event counts on the complementary dimensions.
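To make the data setting concrete, the sketch below (hypothetical helper names, exponential kernels chosen for illustration) computes a multivariate Hawkes conditional intensity from event timestamps and turns one dimension's timestamps into interval-censored counts; it illustrates the inputs PMBP is designed to handle, not the PMBP construction itself:

```python
import numpy as np

def hawkes_intensity(t, mu, alpha, beta, history):
    """Conditional intensity of a multivariate Hawkes process with exponential
    kernels: lambda_i(t) = mu_i + sum_j sum_{t_k in history[j], t_k < t}
    alpha[i, j] * beta * exp(-beta * (t - t_k))."""
    lam = mu.copy()
    for j, times in enumerate(history):
        past = times[times < t]
        lam += alpha[:, j] * beta * np.exp(-beta * (t - past)).sum()
    return lam

def interval_censor(times, bin_edges):
    """Replace exact event timestamps with per-interval event counts,
    mimicking the interval-censored dimensions."""
    counts, _ = np.histogram(times, bins=bin_edges)
    return counts
```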
no code implementations • NeurIPS 2021 • Alexander Soen, Ke Sun
In the realm of deep learning, the Fisher information matrix (FIM) gives novel insights and useful tools to characterize the loss landscape, perform second-order optimization, and build geometric learning theories.
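For instance, the empirical Fisher (the average outer product of per-example log-likelihood gradients, a common surrogate for the true FIM, which instead averages over labels drawn from the model) can be computed directly for a small model; the logistic-regression example below is a minimal sketch, not the paper's setup:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def empirical_fisher(w, X, y):
    """Empirical Fisher for logistic regression: the per-example gradient of
    the log-likelihood is g_n = (y_n - p_n) x_n; the estimate is the average
    of the outer products g_n g_n^T."""
    p = sigmoid(X @ w)
    G = (y - p)[:, None] * X           # per-example gradients, shape (N, d)
    return G.T @ G / X.shape[0]        # (d, d) Fisher estimate
```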
no code implementations • 16 Apr 2021 • Marian-Andrei Rizoiu, Alexander Soen, Shidi Li, Pio Calderon, Leanne Dong, Aditya Krishna Menon, Lexing Xie
We propose the multi-impulse exogenous function, for when the exogenous events are observed as event times, and the latent homogeneous Poisson process exogenous function, for when the exogenous events are presented as interval-censored volumes.
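Loosely, the two exogenous inputs can be pictured as background rates added to the Hawkes intensity; the sketch below uses illustrative kernel and parameter choices (`gamma`, `kappa`, and the decaying-impulse form are assumptions, not the paper's exact definitions):

```python
import numpy as np

def multi_impulse_exo(t, exo_times, gamma=1.0, kappa=1.0):
    """Exogenous rate when outside events are observed as timestamps:
    a sum of decaying impulses placed at the exogenous event times."""
    past = exo_times[exo_times <= t]
    return gamma * np.exp(-kappa * (t - past)).sum()

def latent_poisson_exo(t, bin_edges, volumes):
    """Exogenous rate when outside events arrive as interval-censored volumes:
    a piecewise-constant Poisson rate equal to volume / interval length."""
    i = np.searchsorted(bin_edges, t, side="right") - 1
    i = np.clip(i, 0, len(volumes) - 1)
    return volumes[i] / (bin_edges[i + 1] - bin_edges[i])
```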
no code implementations • 1 Dec 2020 • Alexander Soen, Hisham Husain, Richard Nock
Our paper combines a mathematical object recently introduced in privacy, mollifiers of distributions, with a popular machine learning technique, boosting, to obtain an approach in the same lineage as Celis et al. but without the same impediments; in particular, it offers better guarantees in terms of accuracy and finer guarantees in terms of fairness.
no code implementations • 28 Jul 2020 • Alexander Soen, Alexander Mathews, Daniel Grixti-Cheng, Lexing Xie
The proof connects the well-known Stone-Weierstrass Theorem for function approximation, the uniform density of non-negative continuous functions obtained via a transfer function, the formulation of the parameters of a piecewise-continuous function as a dynamical system, and a recurrent neural network implementation for capturing the dynamics.
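A minimal sketch of the last ingredient, assuming random (unfitted) weights and a softplus transfer function chosen for illustration: a recurrent hidden state is updated at each event, and between events the intensity is a non-negative piecewise-continuous function of the elapsed time:

```python
import numpy as np

def softplus(z):
    return np.log1p(np.exp(z))

class PiecewiseIntensityRNN:
    """Sketch of an RNN-parameterised point-process intensity: the hidden
    state is updated at each event, and the intensity between events is a
    softplus readout of the state and the elapsed time. Weights are random
    here; a real model would be fitted by maximum likelihood."""
    def __init__(self, hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        self.Wh = rng.normal(scale=0.3, size=(hidden, hidden))
        self.wt = rng.normal(scale=0.3, size=hidden)
        self.v = rng.normal(scale=0.3, size=hidden)
        self.h = np.zeros(hidden)

    def update(self, inter_event_time):
        # state transition at an event, driven by the time since the last event
        self.h = np.tanh(self.Wh @ self.h + self.wt * inter_event_time)

    def intensity(self, elapsed):
        # non-negative intensity within the current inter-event interval
        # (the 0.1 decay is a fixed constant chosen purely for illustration)
        return softplus(self.v @ self.h - 0.1 * elapsed)
```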