Search Results for author: Dinesh Singh

Found 3 papers, 2 papers with code

Nys-Newton: Nyström-Approximated Curvature for Stochastic Optimization

no code implementations · 16 Oct 2021 · Dinesh Singh, Hardik Tankaria, Makoto Yamada

However, because it relies only on first-order derivatives, the secant equation gives a poor approximation of the Newton step.

Stochastic Optimization
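The title refers to Nyström-approximated curvature. Below is a minimal sketch of the standard Nyström low-rank approximation applied to a symmetric PSD "curvature" matrix; it illustrates the general Nyström idea only, not the paper's optimizer, and the matrix `H`, sample size `m`, and function names are illustrative.

```python
import numpy as np

def nystrom_approx(H, m, rng):
    """Nyström low-rank approximation of a symmetric PSD matrix H.

    Samples m columns, then reconstructs H ~= C @ pinv(W) @ C.T,
    where C holds the sampled columns and W the sampled principal
    submatrix. Cost is O(n*m^2) instead of O(n^3).
    """
    n = H.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    C = H[:, idx]                      # n x m sampled columns
    W = H[np.ix_(idx, idx)]            # m x m principal submatrix
    return C @ np.linalg.pinv(W) @ C.T

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))       # rank-3 factor
H = A @ A.T                            # PSD matrix standing in for curvature
H_hat = nystrom_approx(H, m=5, rng=rng)
err = np.linalg.norm(H - H_hat) / np.linalg.norm(H)
```

When the matrix has rank at most `m` and the sampled columns span its range (generically true here), the reconstruction is exact up to floating-point error.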

FsNet: Feature Selection Network on High-dimensional Biological Data

2 code implementations · 23 Jan 2020 · Dinesh Singh, Héctor Climente-González, Mathis Petrovich, Eiryo Kawakami, Makoto Yamada

Because the large number of parameters in the selection and reconstruction layers can easily cause overfitting when samples are limited, we use two tiny networks to predict those layers' large, virtual weight matrices.

BIG-bench Machine Learning, Feature Selection, +1
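The snippet above describes a weight-predictor (hypernetwork-style) trick: a tiny network generates a large "virtual" weight matrix so the trainable parameter count stays small. Below is a minimal sketch of that general idea, with made-up dimensions and names; it is not FsNet's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

d, k = 10_000, 64      # input features, layer width (the "virtual" d x k matrix)
e, h = 16, 32          # per-feature embedding size, tiny-net hidden width

# Per-feature embeddings (learned or precomputed in practice); d x e.
feat_emb = rng.standard_normal((d, e))

# Tiny two-layer predictor e -> h -> k. Its parameter count (e*h + h*k)
# does not grow with d, unlike a directly parameterized d x k matrix.
W1 = 0.1 * rng.standard_normal((e, h))
W2 = 0.1 * rng.standard_normal((h, k))

def predict_weights(emb):
    """Generate the virtual d x k weight matrix from feature embeddings."""
    return np.tanh(emb @ W1) @ W2

W_virtual = predict_weights(feat_emb)   # d x k, produced on the fly
tiny_params = W1.size + W2.size         # trainable parameters of the predictor
```

Here the predictor holds 16·32 + 32·64 = 2,560 parameters in place of a 10,000 × 64 = 640,000-entry weight matrix, which is the overfitting-reduction argument the snippet makes.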

GraphLIME: Local Interpretable Model Explanations for Graph Neural Networks

2 code implementations · 17 Jan 2020 · Qiang Huang, Makoto Yamada, Yuan Tian, Dinesh Singh, Dawei Yin, Yi Chang

In this paper, we propose GraphLIME, a local interpretable model explanation for graphs using the Hilbert-Schmidt Independence Criterion (HSIC) Lasso, which is a nonlinear feature selection method.

Descriptive, Feature Selection
