Mask and Understand: Evaluating the Importance of Parameters

29 Sep 2021  ·  Bowei Zhu, Yong Liu

Influence functions are classic techniques from robust statistics, based on first-order Taylor approximations, that have been widely used in the machine learning community to accurately estimate the effect of small dataset perturbations on a model. However, existing research concentrates on estimating the effect of perturbing training or pre-training points. In this paper, we introduce influence functions to predict the effects of removing features or parameters. It is worth emphasizing that our method can be applied to explore the influence of perturbing any combination of parameters, whether or not they belong to the same layer and whether or not they are related. The validation and experiments also demonstrate that influence functions for parameters can be used in many settings, such as understanding model structure, model pruning, feature importance ranking, and any other strategy that masks a group of parameters in order to evaluate their importance.
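To illustrate the general idea (this is a minimal sketch, not the paper's exact formulation), the code below scores a group of parameters by the Taylor-estimated change in loss when that group is masked to zero. The toy quadratic objective and the names `toy_loss`, `grad`, and `mask_idx` are hypothetical, introduced only for this example.

```python
# Sketch: estimate the loss change from masking a parameter group via a
# second-order Taylor expansion around the trained parameters,
#   L(theta + delta) - L(theta) ~= g^T delta + 0.5 * delta^T H delta,
# where delta zeroes out the masked group. Toy quadratic setup; not the paper's method.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5))
H = A @ A.T + 5 * np.eye(5)          # SPD Hessian of a toy quadratic loss
b = rng.normal(size=5)

def toy_loss(theta):
    return 0.5 * theta @ H @ theta - b @ theta

def grad(theta):
    return H @ theta - b

theta_star = np.linalg.solve(H, b)    # minimizer, so the gradient vanishes here

def masked_loss_change_estimate(theta, mask_idx):
    """Taylor estimate of L(theta with mask_idx zeroed) - L(theta)."""
    delta = np.zeros_like(theta)
    delta[mask_idx] = -theta[mask_idx]          # masking = setting the group to zero
    return grad(theta) @ delta + 0.5 * delta @ H @ delta

mask_idx = [1, 3]                               # arbitrary group, possibly across "layers"
theta_masked = theta_star.copy()
theta_masked[mask_idx] = 0.0

est = masked_loss_change_estimate(theta_star, mask_idx)
exact = toy_loss(theta_masked) - toy_loss(theta_star)
print(f"estimated change: {est:.4f}   exact change: {exact:.4f}")
```

For this quadratic toy loss the second-order estimate matches the exact loss change; for a real network one would avoid forming the Hessian explicitly and instead rely on Hessian-vector products or other standard approximations from the influence-function literature.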

PDF Abstract
No code implementations yet.
