no code implementations • 16 Oct 2021 • Dinesh Singh, Hardik Tankaria, Makoto Yamada
However, the secant equation approximates the Newton step poorly because it is built from first-order derivatives alone.
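To make the limitation concrete, here is a minimal one-dimensional sketch (my own illustration, not the paper's method): for a non-quadratic function the curvature implied by the secant condition B s = y, computed from two gradient evaluations, differs from the true second derivative, so the resulting step differs from the exact Newton step. The test function f(x) = x^4 and the iterates are arbitrary choices for this sketch.

```python
import numpy as np

def grad(x):
    return 4 * x ** 3          # gradient of f(x) = x**4

def hess(x):
    return 12 * x ** 2         # true second derivative of f

# Two successive iterates (arbitrary, for illustration).
x_prev, x_curr = 1.0, 1.5
s = x_curr - x_prev                      # step difference
y = grad(x_curr) - grad(x_prev)          # gradient difference

secant_curv = y / s                      # curvature implied by B s = y
true_curv = hess(x_curr)                 # curvature used by the Newton step

newton_step = -grad(x_curr) / true_curv  # exact Newton step
secant_step = -grad(x_curr) / secant_curv
print(secant_curv, true_curv, newton_step, secant_step)
```

Here the secant curvature is 19 while the true curvature at the current iterate is 27, so the secant-based step overshoots the Newton step; only for quadratics do the two coincide.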
2 code implementations • 23 Jan 2020 • Dinesh Singh, Héctor Climente-González, Mathis Petrovich, Eiryo Kawakami, Makoto Yamada
Because the many parameters in the selection and reconstruction layers overfit easily when only a limited number of samples is available, we use two tiny networks to predict the large, virtual weight matrices of those layers.
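A minimal sketch of this weight-prediction idea (assumptions: fixed random per-feature embeddings, a small MLP, and arbitrary layer sizes — none of these details are taken from the paper): a tiny network maps each feature's embedding to one row of the large "virtual" weight matrix, so the number of trained parameters is far smaller than the virtual matrix itself.

```python
import numpy as np

rng = np.random.default_rng(0)

D, K = 10_000, 64        # input features, units of the virtual layer
e, h = 20, 16            # feature-embedding size, tiny-net hidden width

# Fixed per-feature embeddings (an assumption of this sketch).
U = rng.standard_normal((D, e))

# Tiny network e -> h -> K that predicts one row of W per feature.
W1 = rng.standard_normal((e, h)) * 0.1
W2 = rng.standard_normal((h, K)) * 0.1

def predict_virtual_weights(U, W1, W2):
    """Map each feature embedding to a row of the large weight matrix."""
    return np.tanh(U @ W1) @ W2          # shape (D, K)

W_virtual = predict_virtual_weights(U, W1, W2)

tiny_params = W1.size + W2.size          # parameters actually trained
direct_params = D * K                    # a directly parameterized layer
print(W_virtual.shape, tiny_params, direct_params)
```

With these sizes the tiny network holds 1,344 trainable parameters yet emits a 10,000 x 64 weight matrix, versus 640,000 parameters for a directly parameterized layer.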
2 code implementations • 17 Jan 2020 • Qiang Huang, Makoto Yamada, Yuan Tian, Dinesh Singh, Dawei Yin, Yi Chang
In this paper, we propose GraphLIME, a local interpretable model explanation for graphs built on the Hilbert-Schmidt Independence Criterion (HSIC) Lasso, a nonlinear feature selection method.
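At the core of HSIC Lasso is the HSIC dependence score between each feature and the output. Below is a minimal sketch of the biased empirical HSIC estimator with Gaussian kernels — not the full Lasso optimization and not GraphLIME's neighborhood sampling; the bandwidth, sample size, and toy data are arbitrary choices for illustration.

```python
import numpy as np

def gaussian_kernel(z, sigma=1.0):
    """Gaussian kernel matrix for a 1-D sample."""
    d2 = (z[:, None] - z[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma ** 2))

def hsic(x, y):
    """Biased empirical HSIC between two 1-D samples."""
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    K = H @ gaussian_kernel(x) @ H
    L = H @ gaussian_kernel(y) @ H
    return np.trace(K @ L) / (n - 1) ** 2

rng = np.random.default_rng(0)
n = 200
x_rel = rng.standard_normal(n)               # feature nonlinearly tied to y
x_noise = rng.standard_normal(n)             # independent, irrelevant feature
y = np.sin(x_rel) + 0.05 * rng.standard_normal(n)

score_rel = hsic(x_rel, y)
score_noise = hsic(x_noise, y)
print(score_rel, score_noise)
```

The nonlinearly dependent feature scores well above the independent one, which is the signal the (non-negative) Lasso over per-feature kernels exploits when selecting explanatory features.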