Search Results for author: Farzaneh Mirzazadeh

Found 8 papers, 4 papers with code

From Large Language Models and Optimization to Decision Optimization CoPilot: A Research Manifesto

no code implementations 26 Feb 2024 Segev Wasserkrug, Leonard Boussioux, Dick den Hertog, Farzaneh Mirzazadeh, Ilker Birbil, Jannis Kurtz, Donato Maragno

Significantly simplifying the creation of optimization models for real-world business problems has long been a major goal in applying mathematical optimization more widely to important business and societal decisions.

Decision Making

GC-Flow: A Graph-Based Flow Network for Effective Clustering

1 code implementation 26 May 2023 Tianchun Wang, Farzaneh Mirzazadeh, Xiang Zhang, Jie Chen

Graph convolutional networks (GCNs) are discriminative models that directly model the class posterior $p(y|\mathbf{x})$ for semi-supervised classification of graph data.

Clustering, Representation Learning
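
The excerpt above describes GCNs as discriminative models of the class posterior. As a point of reference (not the GC-Flow model itself), here is a minimal NumPy sketch of a standard two-layer GCN producing per-node class posteriors; the toy graph, feature dimensions, and random weights are illustrative assumptions.

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    A_tilde = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    return A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)  # numerical stability
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def gcn_class_posterior(A, X, W1, W2):
    """Two-layer GCN returning p(y|x) for every node.
    A: (n, n) adjacency, X: (n, d) features, W1: (d, h), W2: (h, c) weights."""
    A_hat = normalize_adjacency(A)
    H = np.maximum(A_hat @ X @ W1, 0.0)  # ReLU hidden layer
    return softmax(A_hat @ H @ W2)       # per-node class posterior

# Toy usage: a 4-node path graph, 3 input features, 2 classes, random (untrained) weights.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))
P = gcn_class_posterior(A, X, rng.normal(size=(3, 8)), rng.normal(size=(8, 2)))
print(P)  # each row is a class posterior summing to 1
```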

Alleviating Label Switching with Optimal Transport

1 code implementation NeurIPS 2019 Pierre Monteiller, Sebastian Claici, Edward Chien, Farzaneh Mirzazadeh, Justin Solomon, Mikhail Yurochkin

Label switching is a phenomenon arising in mixture model posterior inference that prevents one from meaningfully assessing posterior statistics using standard Monte Carlo procedures.
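To illustrate the phenomenon the abstract refers to, the toy sketch below shows how naively averaging posterior samples of mixture component means is meaningless when labels switch between samples, and how realigning each sample to a reference labelling (here with a simple assignment match via SciPy, a stand-in rather than the paper's optimal-transport approach) restores interpretable estimates. The sample values are made up for illustration.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical posterior samples of the two component means of a 1-D
# Gaussian mixture; in half the samples the component labels are swapped.
samples = np.array([[-2.1,  2.0],
                    [ 1.9, -2.0],   # labels switched
                    [-1.8,  2.2],
                    [ 2.1, -1.9]])  # labels switched

# Naive per-component posterior means: close to zero, i.e. meaningless.
print("naive means:", samples.mean(axis=0))

# Align each sample to a reference labelling via a minimum-cost matching
# on squared distances between component means.
reference = samples[0]
aligned = []
for s in samples:
    cost = (s[:, None] - reference[None, :]) ** 2
    row, col = linear_sum_assignment(cost)
    aligned.append(s[row][np.argsort(col)])
print("aligned means:", np.mean(aligned, axis=0))  # roughly [-2, 2]
```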

Hierarchical Optimal Transport for Document Representation

1 code implementation NeurIPS 2019 Mikhail Yurochkin, Sebastian Claici, Edward Chien, Farzaneh Mirzazadeh, Justin Solomon

The ability to measure similarity between documents enables intelligent summarization and analysis of large corpora.

BreGMN: scaled-Bregman Generative Modeling Networks

no code implementations 1 Jun 2019 Akash Srivastava, Kristjan Greenewald, Farzaneh Mirzazadeh

Well-definedness of f-divergences, however, requires the distributions of the data and model to overlap completely in every time step of training.
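
A small worked illustration of that point (not taken from the paper): for discrete distributions, the KL divergence, a standard f-divergence, becomes infinite as soon as the model assigns zero mass where the data has mass, so it gives no usable training signal without overlapping support.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL(P || Q); terms with p == 0 contribute 0 by convention."""
    mask = p > 0
    with np.errstate(divide="ignore"):
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.5, 0.5, 0.0, 0.0])             # "data" distribution
q_disjoint = np.array([0.0, 0.0, 0.5, 0.5])    # model mass entirely off the data support
q_overlap = np.array([0.25, 0.25, 0.25, 0.25]) # model overlapping the data support

print(kl_divergence(p, q_disjoint))  # inf -> divergence is ill-defined as a training signal
print(kl_divergence(p, q_overlap))   # log 2 ~= 0.693, finite and informative
```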

Learning Embeddings into Entropic Wasserstein Spaces

2 code implementations 8 May 2019 Charlie Frogner, Farzaneh Mirzazadeh, Justin Solomon

Euclidean embeddings of data are fundamentally limited in their ability to capture latent semantic structures, which need not conform to Euclidean spatial assumptions.

Dimensionality Reduction, Word Embeddings
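
For context on what an entropic Wasserstein distance between embedded point clouds looks like, here is a generic Sinkhorn sketch (textbook fixed-point iterations, not the authors' released code); the point clouds, regularization strength, and iteration count are illustrative assumptions.

```python
import numpy as np

def sinkhorn_distance(x, y, epsilon=0.5, n_iters=200):
    """Entropy-regularized OT cost between uniform point clouds x (n, d) and y (m, d)."""
    n, m = len(x), len(y)
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)  # squared Euclidean costs
    K = np.exp(-C / epsilon)                                    # Gibbs kernel
    u = np.ones(n)
    for _ in range(n_iters):                                    # Sinkhorn fixed-point updates
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]                             # transport plan
    return np.sum(P * C)

# Toy usage: each object is embedded as a small cloud of 2-D support points.
rng = np.random.default_rng(0)
cloud_a = rng.normal(loc=0.0, size=(5, 2))
cloud_b = rng.normal(loc=1.0, size=(5, 2))
print(sinkhorn_distance(cloud_a, cloud_a))  # small (up to entropic bias)
print(sinkhorn_distance(cloud_a, cloud_b))  # larger
```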

Learning Entropic Wasserstein Embeddings

no code implementations ICLR 2019 Charlie Frogner, Farzaneh Mirzazadeh, Justin Solomon

Despite their prevalence, Euclidean embeddings of data are fundamentally limited in their ability to capture latent semantic structures, which need not conform to Euclidean spatial assumptions.

Dimensionality Reduction

Embedding Inference for Structured Multilabel Prediction

no code implementations NeurIPS 2015 Farzaneh Mirzazadeh, Siamak Ravanbakhsh, Nan Ding, Dale Schuurmans

A key bottleneck in structured output prediction is the need for inference during training and testing, usually requiring some form of dynamic programming.
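As an example of the dynamic-programming inference the abstract refers to, the sketch below implements Viterbi decoding for a linear-chain model, the kind of exact MAP inference typically required at train and test time in structured prediction; the scores are random and purely illustrative.

```python
import numpy as np

def viterbi(unary, pairwise):
    """Exact MAP inference for a linear-chain model via dynamic programming.
    unary: (T, K) per-position label scores; pairwise: (K, K) transition scores."""
    T, K = unary.shape
    score = unary[0].copy()
    backptr = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + pairwise   # (K previous, K current) candidate scores
        backptr[t] = cand.argmax(axis=0)   # best previous label for each current label
        score = cand.max(axis=0) + unary[t]
    labels = [int(score.argmax())]
    for t in range(T - 1, 0, -1):          # backtrace the best path
        labels.append(int(backptr[t, labels[-1]]))
    return labels[::-1]

# Toy usage: 4 positions, 3 labels, random scores.
rng = np.random.default_rng(0)
print(viterbi(rng.normal(size=(4, 3)), rng.normal(size=(3, 3))))
```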
