Search Results for author: Morteza Noshad

Found 8 papers, 3 papers with code

Clinical Recommender System: Predicting Medical Specialty Diagnostic Choices with Neural Network Ensembles

no code implementations • 23 Jul 2020 • Morteza Noshad, Ivana Jankovic, Jonathan H. Chen

The growing demand for key healthcare resources such as clinical expertise and facilities has motivated the emergence of artificial intelligence (AI)-based decision support systems.

Recommendation Systems

Learning to Benchmark: Determining Best Achievable Misclassification Error from Training Data

2 code implementations • 16 Sep 2019 • Morteza Noshad, Li Xu, Alfred Hero

In this problem, the objective is to establish statistically consistent estimates of the Bayes misclassification error rate without having to learn a Bayes-optimal classifier.
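
As a rough illustration of benchmarking without fitting a Bayes-optimal classifier: by the classical Cover-Hart inequality, the asymptotic 1-NN error $e_1$ for two classes satisfies $e_1 \le 2\epsilon(1-\epsilon)$, where $\epsilon$ is the Bayes error, so a cross-validated 1-NN error yields a lower bound on $\epsilon$. The sketch below is only this textbook bound, not the paper's ensemble "learning to benchmark" estimator; the function name and the 10-fold cross-validation are illustrative assumptions.

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    def bayes_error_lower_bound(X, y, cv=10):
        # Cross-validated 1-nearest-neighbor misclassification rate.
        knn = KNeighborsClassifier(n_neighbors=1)
        e1 = 1.0 - cross_val_score(knn, X, y, cv=cv).mean()
        # Cover-Hart: asymptotically e1 <= 2*eps*(1 - eps) for two classes,
        # hence the Bayes error eps is at least (1 - sqrt(1 - 2*e1)) / 2.
        return (1.0 - np.sqrt(max(0.0, 1.0 - 2.0 * e1))) / 2.0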

Convergence Rates for Empirical Estimation of Binary Classification Bounds

no code implementations • 1 Oct 2018 • Salimeh Yasaei Sekeh, Morteza Noshad, Kevin R. Moon, Alfred O. Hero

We derive a bound on the convergence rate of the Friedman-Rafsky (FR) estimator of the Henze-Penrose (HP) divergence, which is related to a multivariate runs statistic for testing between two distributions.

Binary Classification, Classification +1
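
For context, the FR estimator itself is short to state: build a Euclidean minimum spanning tree over the pooled sample, count the edges $R$ joining points from different samples, and plug $R$ into the standard HP-divergence estimate $1 - R(m+n)/(2mn)$. Below is a minimal sketch under that reading; hp_divergence_fr is a hypothetical name, and the dense distance matrix is a simplification suited only to small samples.

    import numpy as np
    from scipy.spatial.distance import cdist
    from scipy.sparse.csgraph import minimum_spanning_tree

    def hp_divergence_fr(X, Y):
        m, n = len(X), len(Y)
        pooled = np.vstack([X, Y])
        from_x = np.r_[np.ones(m, dtype=bool), np.zeros(n, dtype=bool)]
        # Euclidean minimum spanning tree over the pooled sample.
        mst = minimum_spanning_tree(cdist(pooled, pooled)).tocoo()
        # Friedman-Rafsky runs statistic: MST edges joining the two samples.
        R = int(np.sum(from_x[mst.row] != from_x[mst.col]))
        # Approaches 0 for identical densities and 1 for disjoint supports.
        return 1.0 - R * (m + n) / (2.0 * m * n)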

Scalable Mutual Information Estimation using Dependence Graphs

1 code implementation • 27 Jan 2018 • Morteza Noshad, Yu Zeng, Alfred O. Hero III

To the best of our knowledge, EDGE is the first non-parametric mutual information (MI) estimator that can achieve parametric MSE rates with linear time complexity.

Information Plane, Mutual Information Estimation
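
The core of EDGE is that mutual information can be read off hash-bucket collision counts. As a rough sketch of that idea for scalar samples, one can replace the paper's locality-sensitive hashing and ensemble weighting (which are what yield the parametric MSE rate) with a fixed-width grid; mi_grid_hash and the bucket width eps are illustrative assumptions.

    import numpy as np
    from collections import Counter

    def mi_grid_hash(x, y, eps=0.3):
        # Hash each scalar sample into an eps-wide bucket
        # (a fixed grid standing in for EDGE's hash function).
        bx = np.floor(np.asarray(x) / eps).astype(int)
        by = np.floor(np.asarray(y) / eps).astype(int)
        n = len(bx)
        cx, cy = Counter(bx), Counter(by)
        cxy = Counter(zip(bx, by))
        # Plug-in mutual information over collision counts, in nats:
        # sum_ij p_ij * log(p_ij / (p_i * p_j)).
        return sum((nij / n) * np.log(nij * n / (cx[i] * cy[j]))
                   for (i, j), nij in cxy.items())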

Direct Estimation of Information Divergence Using Nearest Neighbor Ratios

no code implementations • 17 Feb 2017 • Morteza Noshad, Kevin R. Moon, Salimeh Yasaei Sekeh, Alfred O. Hero III

Considering the $k$-nearest neighbor ($k$-NN) graph of $Y$ in the joint data set $(X, Y)$, we show that the average powered ratio of the number of $X$ points to the number of $Y$ points among all $k$-NN points is proportional to the Rényi divergence between the $X$ and $Y$ densities.
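
Concretely, the counts in the snippet give a pointwise density-ratio estimate: among the $k$ pooled-sample neighbors of a $Y$ point, the expected fraction of $X$ points is governed by $N f_X / (N f_X + M f_Y)$, so the rescaled count ratio approximates $f_X/f_Y$, and averaging its $\alpha$-th power plugs into the Rényi divergence $D_\alpha = \log \mathbb{E}_Y[(f_X/f_Y)^\alpha] / (\alpha - 1)$. The sketch below is this raw plug-in only, without the paper's bias corrections; the function name and defaults are assumptions.

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def renyi_divergence_nn(X, Y, k=10, alpha=0.5):
        N, M = len(X), len(Y)
        pooled = np.vstack([X, Y])
        from_x = np.r_[np.ones(N, dtype=bool), np.zeros(M, dtype=bool)]
        # k-NN of each Y point in the pooled sample (first hit is the point itself).
        _, idx = NearestNeighbors(n_neighbors=k + 1).fit(pooled).kneighbors(Y)
        n_x = from_x[idx[:, 1:]].sum(axis=1)            # X points among the k neighbors
        ratio = (M / N) * n_x / np.maximum(k - n_x, 1)  # pointwise estimate of f_X/f_Y
        # D_alpha = log( E_Y[(f_X/f_Y)^alpha] ) / (alpha - 1)
        return np.log(np.mean(ratio ** alpha)) / (alpha - 1.0)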

Information Theoretic Structure Learning with Confidence

no code implementations • 13 Sep 2016 • Kevin R. Moon, Morteza Noshad, Salimeh Yasaei Sekeh, Alfred O. Hero III

Information-theoretic measures (e.g., the Kullback-Leibler divergence and Shannon mutual information) have been used to explore possibly nonlinear multivariate dependencies in high dimensions.

Two-sample testing

Low-Complexity Stochastic Generalized Belief Propagation

no code implementations • 6 May 2016 • Farzin Haddadpour, Mahdi Jafari Siavoshani, Morteza Noshad

However, this reduction can exceed one order of magnitude in the alphabet size.
