Search Results for author: Erfan Sadeqi Azer

Found 6 papers, 2 papers with code

The Ultimate DataFlow for Ultimate SuperComputers-on-a-Chip, for Scientific Computing, Geo Physics, Complex Mathematics, and Information Processing

no code implementations • 20 Sep 2020 • Veljko Milutinovic, Erfan Sadeqi Azer, Kristy Yoshimoto, Gerhard Klimeck, Miljan Djordjevic, Milos Kotlar, Miroslav Bojovic, Bozidar Miladinovic, Nenad Korolija, Stevan Stankovic, Nenad Filipović, Zoran Babovic, Miroslav Kosanic, Akira Tsuda, Mateo Valero, Massimo De Santo, Erich Neuhold, Jelena Skoručak, Laura Dipietro, Ivan Ratkovic

This article starts from the assumption that near-future 100B-transistor SuperComputers-on-a-Chip will include N big multi-core processors, 1000N small many-core processors, a TPU-like fixed-structure systolic-array accelerator for the most frequently used machine-learning algorithms needed in bandwidth-bound applications, and a flexible-structure reprogrammable accelerator for less frequently used machine-learning algorithms needed in latency-critical applications.

Distributed, Parallel, and Cluster Computing

Not All Claims are Created Equal: Choosing the Right Statistical Approach to Assess Hypotheses

1 code implementation • ACL 2020 • Erfan Sadeqi Azer, Daniel Khashabi, Ashish Sabharwal, Dan Roth

Empirical research in Natural Language Processing (NLP) has adopted a narrow set of principles for assessing hypotheses, relying mainly on p-value computation, which suffers from several known issues.

Bayesian Inference, Misconceptions
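The contrast the abstract draws can be illustrated with a toy comparison (a hedged sketch, not the paper's code or experiments): two hypothetical systems A and B scored on the same test set, evaluated once with a one-sided paired-bootstrap p-value and once with a Bayesian posterior probability that A's accuracy exceeds B's. The accuracies, Beta(1, 1) priors, and synthetic correctness labels are all assumptions made up for illustration.

```python
# Toy frequentist-vs-Bayesian comparison of "system A beats system B".
# All data are synthetic; the true accuracies below are assumptions.
import random

random.seed(0)

n = 1000
acc_a, acc_b = 0.85, 0.80  # hypothetical true accuracies of systems A and B
a = [1 if random.random() < acc_a else 0 for _ in range(n)]  # per-example correctness
b = [1 if random.random() < acc_b else 0 for _ in range(n)]

def bootstrap_pvalue(a, b, reps=2000):
    """One-sided paired-bootstrap p-value for the accuracy difference A - B."""
    n = len(a)
    flips = 0
    for _ in range(reps):
        idx = [random.randrange(n) for _ in range(n)]
        if sum(a[i] - b[i] for i in idx) <= 0:  # resampled difference not positive
            flips += 1
    return flips / reps

def posterior_prob_a_better(a, b, reps=2000):
    """Monte Carlo estimate of P(acc_A > acc_B | data) under Beta(1, 1) priors."""
    sa, sb, n = sum(a), sum(b), len(a)
    wins = 0
    for _ in range(reps):
        pa = random.betavariate(1 + sa, 1 + n - sa)  # posterior draw for A's accuracy
        pb = random.betavariate(1 + sb, 1 + n - sb)  # posterior draw for B's accuracy
        if pa > pb:
            wins += 1
    return wins / reps

p = bootstrap_pvalue(a, b)
post = posterior_prob_a_better(a, b)
```

The two numbers answer different questions: the p-value measures how surprising the observed difference is under resampling, while the posterior directly quantifies how probable it is that A is better, which is the kind of claim-dependent distinction the paper is concerned with.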

On the Possibilities and Limitations of Multi-hop Reasoning Under Linguistic Imperfections

no code implementations • 8 Jan 2019 • Daniel Khashabi, Erfan Sadeqi Azer, Tushar Khot, Ashish Sabharwal, Dan Roth

The idea is to consider two interrelated spaces: a conceptual meaning space that is unambiguous and complete but hidden, and a linguistic space that captures a noisy grounding of the meaning space in the words of a language---the level at which all systems, whether neural or symbolic, operate.
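The effect of a noisy grounding on chained inference can be illustrated with a toy simulation (a hedged sketch of the general phenomenon, not the paper's formal model): if each hop independently flips the underlying fact with probability eps, the probability that a k-hop conclusion is still correct decays toward chance as k grows, matching the closed form 0.5 * (1 + (1 - 2*eps)^k). The hop count, flip probability, and trial count below are illustrative choices.

```python
# Toy simulation of multi-hop reasoning over a noisy grounding.
import random

random.seed(0)

def chain_accuracy(k, eps, trials=20000):
    """Fraction of trials where a k-hop chain ends up correct when each
    hop independently flips the underlying fact with probability eps."""
    correct = 0
    for _ in range(trials):
        flips = sum(1 for _ in range(k) if random.random() < eps)
        if flips % 2 == 0:  # an even number of flips cancels out
            correct += 1
    return correct / trials

# closed form for comparison: 0.5 * (1 + (1 - 2 * eps) ** k)
one_hop = chain_accuracy(1, 0.1)
five_hops = chain_accuracy(5, 0.1)
```

With eps = 0.1, one hop is right about 90% of the time while five hops drop to roughly 66%, which is the intuition behind studying limits on multi-hop reasoning under linguistic imperfections.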

A Practical Algorithm for Distributed Clustering and Outlier Detection

no code implementations • NeurIPS 2018 • Jiecao Chen, Erfan Sadeqi Azer, Qin Zhang

We study classic $k$-means/median clustering, two fundamental problems in unsupervised learning, in the setting where data are partitioned across multiple sites and where we are allowed to discard a small portion of the data by labeling those points as outliers.

Clustering, Outlier Detection
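A single-site toy version of the problem (a hedged sketch, not the paper's distributed algorithm) can be written as Lloyd-style k-means that sets aside the z farthest points in each iteration before recomputing the means, in the spirit of the "k-means--" heuristic. The data and the naive deterministic initialization below are made up for illustration.

```python
# Single-site toy of k-means with outliers; data and init are made up.
import random

def kmeans_with_outliers(points, k, z, iters=20):
    """Lloyd-style k-means that, in each iteration, sets aside the z points
    farthest from their nearest center before recomputing the means."""

    def d2(p, c):  # squared Euclidean distance
        return sum((pi - ci) ** 2 for pi, ci in zip(p, c))

    centers = list(points[:k])  # naive deterministic init, for simplicity
    outliers = []
    for _ in range(iters):
        ranked = sorted(points, key=lambda p: min(d2(p, c) for c in centers))
        inliers, outliers = ranked[:-z], ranked[-z:]  # drop the z farthest
        clusters = [[] for _ in range(k)]
        for p in inliers:
            j = min(range(k), key=lambda j: d2(p, centers[j]))
            clusters[j].append(p)
        for j, cl in enumerate(clusters):
            if cl:  # keep the old center if its cluster emptied
                centers[j] = tuple(sum(xs) / len(cl) for xs in zip(*cl))
    return centers, outliers

# toy data: two tight 2-D clusters plus two far-away outliers
rng = random.Random(1)
pts = [(rng.gauss(0, 0.1), rng.gauss(0, 0.1)) for _ in range(30)]
pts += [(rng.gauss(5, 0.1), rng.gauss(5, 0.1)) for _ in range(30)]
pts += [(100.0, 100.0), (-100.0, 50.0)]
centers, outs = kmeans_with_outliers(pts, k=2, z=2)
```

Because the two far-away points are never allowed to pull on the means, the centers converge toward the genuine clusters; the distributed setting the paper studies additionally has to coordinate this discarding decision across sites with low communication.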

Effective sketching methods for value function approximation

no code implementations • 3 Aug 2017 • Yangchen Pan, Erfan Sadeqi Azer, Martha White

As a remedy, we demonstrate how to use sketching more sparingly, with only a left-sided sketch, which can still yield significant computational gains and enable the use of matrix-based learning algorithms that are less sensitive to parameters.

Reinforcement Learning (RL)
