Feature Importance

21 papers with code · Methodology

State-of-the-art leaderboards

No evaluation results yet. Help compare methods by submitting evaluation metrics.

Greatest papers with code

A Unified Approach to Interpreting Model Predictions

NeurIPS 2017 slundberg/shap

Understanding why a model makes a certain prediction can be as crucial as the prediction's accuracy in many applications.

FEATURE IMPORTANCE · INTERPRETABLE MACHINE LEARNING
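
The accompanying shap package computes these unified attribution values directly; a minimal sketch for a tree model on synthetic data (model and data are illustrative, assuming a recent shap release):

    # Minimal SHAP usage sketch (illustrative; synthetic data).
    import numpy as np
    import shap
    from sklearn.ensemble import RandomForestRegressor

    X = np.random.rand(100, 4)
    y = X[:, 0] + 2 * X[:, 1] + np.random.normal(0, 0.1, 100)
    model = RandomForestRegressor(n_estimators=50).fit(X, y)

    # TreeExplainer computes SHAP values efficiently for tree ensembles.
    explainer = shap.TreeExplainer(model)
    shap_values = explainer.shap_values(X)  # one additive attribution per feature per sample
    print(shap_values.shape)  # (100, 4)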

Distributed and parallel time series feature extraction for industrial big data applications

25 Oct 2016 blue-yonder/tsfresh

This problem is especially hard to solve for time series classification and regression in industrial applications such as predictive maintenance or production line optimization, for which each label or regression target is associated with several time series and meta-information simultaneously.

FEATURE IMPORTANCE · FEATURE SELECTION · TIME SERIES · TIME SERIES CLASSIFICATION
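
The blue-yonder/tsfresh package extracts large batches of time series features in parallel; a minimal sketch on a synthetic long-format frame (data and column names are illustrative):

    # Minimal tsfresh sketch (illustrative; synthetic data).
    import numpy as np
    import pandas as pd
    from tsfresh import extract_features

    # Long-format frame: one row per time point, grouped by series id.
    df = pd.DataFrame({
        "id": np.repeat([0, 1], 50),
        "time": np.tile(np.arange(50), 2),
        "value": np.random.rand(100),
    })

    # Extracts a large battery of features (statistics, FFT coefficients, ...)
    # per series, computed in parallel across ids.
    features = extract_features(df, column_id="id", column_sort="time")
    print(features.shape)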

Gradients of Counterfactuals

8 Nov 2016 kundajelab/deeplift

Unfortunately, in nonlinear deep networks, not only individual neurons but also the whole network can saturate, and as a result an important input feature can have a tiny gradient.

FEATURE IMPORTANCE · LANGUAGE MODELLING · OBJECT RECOGNITION
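
A minimal sketch of the counterfactual-gradient idea: attribute by inspecting gradients at inputs scaled between a baseline and the actual input, where saturation has not yet flattened the gradient. This illustrates the technique only, not the deeplift repository's API; model and data are made up:

    import torch

    model = torch.nn.Sequential(torch.nn.Linear(4, 8), torch.nn.Sigmoid(),
                                torch.nn.Linear(8, 1))
    x = torch.rand(1, 4)
    baseline = torch.zeros_like(x)

    # Accumulate gradients along the straight line from baseline to input;
    # averaging them yields an integrated-gradients style attribution.
    grads = []
    for alpha in torch.linspace(0.05, 1.0, 20):
        xi = (baseline + alpha * (x - baseline)).requires_grad_(True)
        model(xi).sum().backward()
        grads.append(xi.grad)
    attribution = (x - baseline) * torch.stack(grads).mean(dim=0)
    print(attribution)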

Attention is not Explanation

NAACL 2019 successar/AttentionExplanation

Attention mechanisms have seen wide adoption in neural NLP models.

FEATURE IMPORTANCE

Patient2Vec: A Personalized Interpretable Deep Representation of the Longitudinal Electronic Health Record

10 Oct 2018 BarnesLab/Patient2Vec

The wide implementation of electronic health record (EHR) systems facilitates the collection of large-scale health data from real clinical settings.

FEATURE IMPORTANCE

Interpreting Neural Networks With Nearest Neighbors

EMNLP 2018 Eric-Wallace/deep-knn

However, the confidence of neural networks is not a robust measure of model uncertainty.

FEATURE IMPORTANCE · TEXT CLASSIFICATION
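
A minimal sketch of the nearest-neighbor idea: score a prediction by how many training points with similar representations share its label. Here raw features stand in for a network's hidden layer, and all names and data are illustrative:

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)
    train_reps = rng.normal(size=(200, 16))   # stand-in for hidden representations
    train_labels = rng.integers(0, 2, size=200)

    nn = NearestNeighbors(n_neighbors=10).fit(train_reps)

    def knn_confidence(test_rep, predicted_label):
        # Fraction of the k nearest training points that agree with the prediction.
        _, idx = nn.kneighbors(test_rep.reshape(1, -1))
        return (train_labels[idx[0]] == predicted_label).mean()

    print(knn_confidence(rng.normal(size=16), predicted_label=1))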

Visualizing the Feature Importance for Black Box Models

18 Apr 2018 giuseppec/featureImportance

Based on local feature importance, we propose two visual tools: partial importance (PI) and individual conditional importance (ICI) plots which visualize how changes in a feature affect the model performance on average, as well as for individual observations.

FEATURE IMPORTANCE
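
A minimal Python sketch of the individual conditional importance (ICI) and partial importance (PI) idea: intervene on one feature over a grid and track per-observation loss, then average the ICI curves to get the PI curve. The reference implementation is the R package above; everything here is illustrative:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X = rng.random((200, 3))
    y = 3 * X[:, 0] + rng.normal(0, 0.1, 200)
    model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

    feature, grid = 0, np.linspace(0, 1, 20)
    ici = np.empty((len(X), len(grid)))
    for j, v in enumerate(grid):
        Xv = X.copy()
        Xv[:, feature] = v                        # intervene on one feature
        ici[:, j] = (y - model.predict(Xv)) ** 2  # per-observation squared loss
    pi_curve = ici.mean(axis=0)                   # PI: average ICI over observations
    print(pi_curve.round(3))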

PaloBoost: An Overfitting-robust TreeBoost with Out-of-Bag Sample Regularization Techniques

22 Jul 2018 yubin-park/bonsai-dt

We propose PaloBoost, a Stochastic Gradient TreeBoost model that uses novel regularization techniques to guard against overfitting and is robust to parameter settings.

FEATURE IMPORTANCE
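
A heavily simplified sketch of the out-of-bag regularization idea in stochastic gradient boosting: fit each tree on a subsample, then use the held-out rows to decide how much that tree may contribute. This is an assumption-laden illustration, not PaloBoost's actual learning-rate and pruning scheme:

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    X = rng.random((300, 5))
    y = np.sin(6 * X[:, 0]) + rng.normal(0, 0.2, 300)

    pred = np.full_like(y, y.mean())
    for _ in range(50):
        in_bag = rng.random(len(y)) < 0.7
        tree = DecisionTreeRegressor(max_depth=3).fit(X[in_bag], (y - pred)[in_bag])
        # Choose the step size that minimizes out-of-bag error among candidates;
        # 0.0 lets an overfit tree be skipped entirely.
        oob, update = ~in_bag, tree.predict(X)
        lrs = [0.0, 0.05, 0.1, 0.2]
        errs = [((y - pred - lr * update)[oob] ** 2).mean() for lr in lrs]
        pred += lrs[int(np.argmin(errs))] * update
    print(((y - pred) ** 2).mean())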

Disentangled Attribution Curves for Interpreting Random Forests and Boosted Trees

18 May 2019 csinva/disentangled-attribution-curves

Tree ensembles, such as random forests and AdaBoost, are ubiquitous machine learning models known for achieving strong predictive performance across a wide variety of domains.

FEATURE ENGINEERING · FEATURE IMPORTANCE