Influence Approximation
4 papers with code • 0 benchmarks • 0 datasets
Estimating the influence of individual training examples on the behavior of a machine learning model.
Latest papers
Deeper Understanding of Black-box Predictions via Generalized Influence Functions
Influence functions (IFs) elucidate how training data changes model behavior.
DataInf: Efficiently Estimating Data Influence in LoRA-tuned LLMs and Diffusion Models
Quantifying the impact of training data points is crucial for understanding the outputs of machine learning models and for improving the transparency of the AI pipeline.
Explaining Neural Matrix Factorization with Gradient Rollback
We show theoretically that the gap between gradient rollback's influence approximation and the true influence on a model's behavior is smaller than known bounds on the stability of stochastic gradient descent.
On the Accuracy of Influence Functions for Measuring Group Effects
Influence functions estimate the effect of removing a training point on a model without the need to retrain.
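The idea shared by these papers can be made concrete on a small convex model. The sketch below (illustrative only; the synthetic data, names, and hyperparameters are assumptions, not from any listed paper) approximates the leave-one-out effect of a training point on a test loss for L2-regularized logistic regression, using the classic influence-function estimate ΔL ≈ (1/n)·∇L(z_test)ᵀ H⁻¹ ∇L(z_i), and compares it against actual retraining:

```python
import numpy as np

# Hypothetical toy setup: synthetic binary logistic regression data.
rng = np.random.default_rng(0)
n, d, lam = 200, 3, 0.1  # lam: L2 weight, keeps the Hessian invertible
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(float)
x_test, y_test = rng.normal(size=d), 1.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def full_grad(w, X, y):
    # Gradient of mean negative log-likelihood plus L2 regularizer.
    return X.T @ (sigmoid(X @ w) - y) / len(y) + lam * w

def train(X, y, steps=3000, lr=0.5):
    # Plain gradient descent; the objective is strongly convex, so this
    # converges to (numerically) the exact optimum.
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        w -= lr * full_grad(w, X, y)
    return w

def point_nll(w, x, y1):
    p = np.clip(sigmoid(x @ w), 1e-12, 1 - 1e-12)
    return -(y1 * np.log(p) + (1 - y1) * np.log(1 - p))

def point_grad(w, x, y1):
    # Per-example gradient of the negative log-likelihood.
    return x * (sigmoid(x @ w) - y1)

w = train(X, y)
p = sigmoid(X @ w)
# Hessian of the regularized training objective at the optimum.
H = (X.T * (p * (1 - p))) @ X / n + lam * np.eye(d)
Hinv_gtest = np.linalg.solve(H, point_grad(w, x_test, y_test))

pred, actual = [], []
for i in range(10):
    # Influence prediction: removing z_i changes the test loss by about
    #   (1/n) * grad L(z_test)^T  H^{-1}  grad L(z_i)
    pred.append(Hinv_gtest @ point_grad(w, X[i], y[i]) / n)
    # Ground truth: actually retrain without point i (leave-one-out).
    w_loo = train(np.delete(X, i, axis=0), np.delete(y, i))
    actual.append(point_nll(w_loo, x_test, y_test) - point_nll(w, x_test, y_test))

corr = np.corrcoef(pred, actual)[0, 1]
print(f"correlation between predicted and actual LOO deltas: {corr:.3f}")
```

In the convex case the Hessian is well-defined and positive definite, so the predicted deltas track true leave-one-out retraining closely; the papers above are largely about making this tractable and accurate for non-convex, large-scale models (e.g., LoRA-tuned LLMs), where forming or inverting H exactly is infeasible.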