Search Results for author: Mahdi Karami

Found 10 papers, 4 papers with code

Orchid: Flexible and Data-Dependent Convolution for Sequence Modeling

no code implementations • 28 Feb 2024 • Mahdi Karami, Ali Ghodsi

In the rapidly evolving landscape of deep learning, the quest for models that balance expressivity with computational efficiency has never been more critical.

Computational Efficiency · Image Classification · +2

HiGen: Hierarchical Graph Generative Networks

1 code implementation • 30 May 2023 • Mahdi Karami

Most real-world graphs exhibit a hierarchical structure, which is often overlooked by existing graph generation methods.

Graph Generation

On Hierarchical Multi-Resolution Graph Generative Models

no code implementations • 6 Mar 2023 • Mahdi Karami, Jun Luo

In real-world domains, most graphs naturally exhibit a hierarchical structure.

Graph Generation

DP²-VAE: Differentially Private Pre-trained Variational Autoencoders

no code implementations • 5 Aug 2022 • Dihong Jiang, Guojun Zhang, Mahdi Karami, Xi Chen, Yunfeng Shao, YaoLiang Yu

As with other differentially private (DP) learners, the major challenge for DP generative models (DPGMs) is to strike a subtle balance between utility and privacy.

Robust One Round Federated Learning with Predictive Space Bayesian Inference

1 code implementation • 20 Jun 2022 • Mohsin Hasan, Zehao Zhang, Kaiyang Guo, Mahdi Karami, Guojun Zhang, Xi Chen, Pascal Poupart

In contrast, our method performs the aggregation on the predictive posteriors, which are typically easier to approximate owing to the low dimensionality of the output space (see the sketch after this entry).

Bayesian Inference · Federated Learning
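The snippet above describes aggregating client predictions in the output space rather than in weight space. Below is a minimal, hypothetical sketch of that idea for a classifier: each client reports a predictive distribution over classes for a query point, and the server combines them either as a uniform mixture or as a normalized product of experts. The function name and both aggregation rules are illustrative assumptions, not the paper's released implementation.

```python
# Illustrative sketch (not the paper's released code): aggregating per-client
# predictive posteriors p(y | x) in a single communication round.
import numpy as np

def aggregate_predictive_posteriors(client_probs, method="mixture"):
    """client_probs: array of shape (n_clients, n_classes), each row a
    predictive distribution for the same input x."""
    client_probs = np.asarray(client_probs, dtype=float)
    if method == "mixture":
        # Arithmetic mean: a uniform mixture of client predictive posteriors.
        agg = client_probs.mean(axis=0)
    elif method == "product":
        # Normalized geometric mean (product-of-experts style aggregation).
        agg = np.exp(np.log(client_probs + 1e-12).mean(axis=0))
    else:
        raise ValueError(f"unknown method: {method}")
    return agg / agg.sum()

# Three clients' predictions for one test point, over 3 classes.
clients = [[0.7, 0.2, 0.1], [0.6, 0.3, 0.1], [0.5, 0.4, 0.1]]
print(aggregate_predictive_posteriors(clients))             # mixture
print(aggregate_predictive_posteriors(clients, "product"))  # product of experts
```

Because only these low-dimensional predictive distributions travel to the server, a single communication round suffices, which is the property the snippet highlights.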

Federated Bayesian Neural Regression: A Scalable Global Federated Gaussian Process

no code implementations • 13 Jun 2022 • Haolin Yu, Kaiyang Guo, Mahdi Karami, Xi Chen, Guojun Zhang, Pascal Poupart

We present Federated Bayesian Neural Regression (FedBNR), an algorithm that learns a scalable stand-alone global federated GP that respects clients' privacy.

Federated Learning · Knowledge Distillation · +1

Variational Inference for Deep Probabilistic Canonical Correlation Analysis

no code implementations • 9 Mar 2020 • Mahdi Karami, Dale Schuurmans

In this paper, we propose a deep probabilistic multi-view model composed of a linear multi-view layer, based on a probabilistic canonical correlation analysis (CCA) description of the latent space, together with deep generative networks as observation models (see the sketch after this entry).

Multi-View Learning · Variational Inference
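As a rough illustration of the composition described above (a shared latent variable, a linear probabilistic-CCA-style multi-view layer, and deep generative networks as observation models), here is a hypothetical NumPy sampler for the generative direction. The dimensions, the one-hidden-layer decoders, and all variable names are assumptions for illustration, not the paper's architecture.

```python
# Illustrative sketch (assumed shapes and architectures, not the paper's code):
# a shared latent z feeds a linear probabilistic-CCA-style layer that produces
# per-view latents, which deep "observation" networks then decode.
import numpy as np

rng = np.random.default_rng(0)
d_z, d_view, d_obs = 4, 8, 16          # latent, per-view latent, observation dims

# Linear multi-view (probabilistic CCA) layer: view-specific loadings and noise.
W1, W2 = rng.normal(size=(d_view, d_z)), rng.normal(size=(d_view, d_z))
sigma = 0.1

def deep_decoder(h, hidden=32, out_dim=d_obs, seed=1):
    """Stand-in for a deep generative observation model (one hidden layer)."""
    g = np.random.default_rng(seed)
    A, B = g.normal(size=(hidden, h.shape[-1])), g.normal(size=(out_dim, hidden))
    return B @ np.tanh(A @ h)

def sample_views():
    z = rng.normal(size=d_z)                       # shared latent
    h1 = W1 @ z + sigma * rng.normal(size=d_view)  # view-1 latent (linear CCA layer)
    h2 = W2 @ z + sigma * rng.normal(size=d_view)  # view-2 latent
    return deep_decoder(h1, seed=1), deep_decoder(h2, seed=2)

x1, x2 = sample_views()
print(x1.shape, x2.shape)  # (16,) (16,)
```

Inference in such a model would go the other way, with variational encoders approximating the posterior over the shared and per-view latents; only the generative direction is sketched here.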

Invertible Convolutional Flow

1 code implementation • NeurIPS 2019 • Mahdi Karami, Dale Schuurmans, Jascha Sohl-Dickstein, Laurent Dinh, Daniel Duckworth

We show that these transforms allow more effective normalizing flow models to be developed for generative image models.
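One standard way to make a convolution invertible inside a normalizing flow is to use a circular convolution, which the discrete Fourier transform diagonalizes, so both the inverse and the log-determinant of the Jacobian are cheap. The sketch below shows that construction in NumPy; it is illustrative only, and the kernel, sizes, and function names are assumptions rather than the paper's exact parameterization.

```python
# Minimal sketch of an invertible circular convolution of the kind used in
# convolutional normalizing flows: the DFT diagonalizes circular convolution,
# so the inverse and the Jacobian log-determinant are cheap to compute.
import numpy as np

def circ_conv_forward(x, kernel):
    """y = kernel circularly convolved with x, plus log|det(Jacobian)|."""
    K = np.fft.fft(kernel, n=x.shape[-1])
    y = np.real(np.fft.ifft(K * np.fft.fft(x)))
    logdet = np.sum(np.log(np.abs(K)))   # Jacobian is diagonal in Fourier space
    return y, logdet

def circ_conv_inverse(y, kernel):
    K = np.fft.fft(kernel, n=y.shape[-1])
    return np.real(np.fft.ifft(np.fft.fft(y) / K))

x = np.random.default_rng(0).normal(size=64)
k = np.array([1.0, 0.5, 0.25])           # kernel with no zero DFT coefficients
y, logdet = circ_conv_forward(x, k)
print(np.allclose(circ_conv_inverse(y, k), x), logdet)
```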

Multi-view Matrix Factorization for Linear Dynamical System Estimation

no code implementations • NeurIPS 2017 • Mahdi Karami, Martha White, Dale Schuurmans, Csaba Szepesvari

In this paper, we instead reconsider likelihood maximization and develop an optimization based strategy for recovering the latent states and transition parameters.
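To make the "optimization based strategy" concrete, here is a toy alternating scheme for a linear dynamical system x_{t+1} = A x_t + noise, y_t = C x_t + noise: solve least squares for (A, C) given the latent states, then take gradient steps on the states given (A, C). All dimensions, step sizes, and the simulated data are illustrative assumptions; this is a sketch of the general idea, not the paper's algorithm.

```python
# Toy sketch: jointly recover latent states X and LDS parameters (A, C) from
# observations Y by alternating closed-form least squares for (A, C) with
# gradient steps on X. Illustrative only, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)
T, d_x, d_y, lam, lr = 200, 2, 5, 1.0, 0.05

# Simulate a ground-truth LDS: x_{t+1} = A x_t + noise,  y_t = C x_t + noise.
A_true = np.array([[0.9, -0.1], [0.1, 0.8]])
C_true = rng.normal(size=(d_y, d_x))
X_true = np.zeros((T, d_x))
for t in range(1, T):
    X_true[t] = A_true @ X_true[t - 1] + 0.1 * rng.normal(size=d_x)
Y = X_true @ C_true.T + 0.1 * rng.normal(size=(T, d_y))

X = rng.normal(size=(T, d_x))                       # initial latent states
for it in range(200):
    # (A, C) given X: ordinary least squares.
    C = np.linalg.lstsq(X, Y, rcond=None)[0].T
    A = np.linalg.lstsq(X[:-1], X[1:], rcond=None)[0].T
    # X given (A, C): one gradient step on reconstruction + dynamics loss.
    grad = (X @ C.T - Y) @ C
    resid = X[1:] - X[:-1] @ A.T
    grad[1:] += lam * resid
    grad[:-1] -= lam * resid @ A
    X -= lr * grad

print("reconstruction MSE:", np.mean((X @ C.T - Y) ** 2))
```

Note that latent states in such models are identifiable only up to an invertible linear transform, so any recovered (X, A, C) should be compared to the ground truth modulo that ambiguity.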
