1 code implementation • 1 Aug 2023 • Kishan Wimalawarne, Taiji Suzuki, Sophie Langer
Learning the Green's function with deep learning models makes it possible to solve different classes of partial differential equations.
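A minimal sketch of the Green's-function idea behind this entry: once G(x, y) is known (in the paper it is approximated by a deep network; here, as an illustration, we use the analytic kernel of -u'' = f on [0, 1] with u(0) = u(1) = 0 as a stand-in), any right-hand side f is solved by a single quadrature, u(x) = ∫ G(x, y) f(y) dy. All function names are illustrative, not the paper's.

```python
import numpy as np

def green_poisson_1d(x, y):
    # Analytic Green's function of -u''(x) = f(x) on [0, 1], u(0) = u(1) = 0.
    return np.where(x <= y, x * (1.0 - y), y * (1.0 - x))

def solve_with_green(f, n=2001):
    y = np.linspace(0.0, 1.0, n)
    h = y[1] - y[0]
    w = np.full(n, h)
    w[0] = w[-1] = h / 2.0                      # trapezoid quadrature weights
    G = green_poisson_1d(y[:, None], y[None, :])
    return y, (G * f(y)[None, :]) @ w           # u(x) = ∫ G(x, y) f(y) dy

# For f ≡ 1 the exact solution is u(x) = x(1 - x) / 2.
x, u = solve_with_green(lambda y: np.ones_like(y))
err = np.max(np.abs(u - x * (1.0 - x) / 2.0))
```

Replacing `green_poisson_1d` with a trained network is what makes the approach reusable: the expensive learning step is done once per operator, after which new right-hand sides cost only one integral.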
no code implementations • 12 Sep 2022 • Kishan Wimalawarne, Taiji Suzuki
Additionally, we propose adaptive learning between graph polynomial convolution models and learning directly from the adjacency matrix.
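One of the two model families this sentence contrasts is graph polynomial convolution, where features are propagated as H = Σₖ θₖ Aᵏ X with A the (possibly normalized) adjacency matrix. A minimal sketch, with fixed coefficients for illustration (in practice the θₖ would be learned):

```python
import numpy as np

def poly_graph_conv(A, X, theta):
    # H = sum_k theta[k] * A^k @ X, computed by accumulating powers of A.
    H = np.zeros_like(X, dtype=float)
    P = np.eye(A.shape[0])          # A^0
    for t in theta:
        H += t * (P @ X)
        P = P @ A
    return H

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)   # path graph on 3 nodes
X = np.eye(3)                            # one-hot node features
H0 = poly_graph_conv(A, X, [1.0])        # degree-0 polynomial: identity
H1 = poly_graph_conv(A, X, [0.0, 1.0])   # pure first-order term: A @ X
```

Learning directly from the adjacency matrix, by contrast, treats the propagation operator itself as trainable rather than fixing it to powers of A.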
no code implementations • 24 Aug 2021 • Kishan Wimalawarne, Taiji Suzuki
We investigate adaptive layer-wise graph convolution in deep GCN models.
no code implementations • NeurIPS 2018 • Kishan Wimalawarne, Hiroshi Mamitsuka
Coupled norms have emerged as a convex approach to coupled tensor completion.
no code implementations • 15 May 2017 • Kishan Wimalawarne, Makoto Yamada, Hiroshi Mamitsuka
We propose a set of convex low-rank-inducing norms for coupled matrices and tensors (hereafter, coupled tensors), which share information between matrices and tensors through common modes.
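A sketch of the coupling idea under an assumed setup (one shared mode; names and notation are illustrative, not the paper's): a tensor T and a matrix M that share their first mode can be regularized jointly by unfolding T along that mode, concatenating it with M, and taking the nuclear norm of the result, so low-rank structure on the shared mode is enforced across both objects at once.

```python
import numpy as np

def coupled_nuclear_norm(T, M, mode=0):
    # Unfold T along the shared mode, concatenate with M column-wise,
    # and take the nuclear norm (sum of singular values) of the result.
    T_unf = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
    return np.linalg.norm(np.concatenate([T_unf, M], axis=1), ord="nuc")

rng = np.random.default_rng(1)
u = rng.standard_normal(4)               # direction on the shared mode
a, b, v = (rng.standard_normal(s) for s in (3, 2, 5))
T = np.einsum("i,j,k->ijk", u, a, b)     # rank-1 tensor built on u
M = np.outer(u, v)                       # rank-1 matrix built on the same u
val = coupled_nuclear_norm(T, M)
```

Because T and M here share the single direction `u`, the concatenated matrix has rank 1, and the coupled norm collapses to the Frobenius norm of the stacked data.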
no code implementations • 6 Sep 2015 • Kishan Wimalawarne, Ryota Tomioka, Masashi Sugiyama
We theoretically and experimentally investigate tensor-based regression and classification.
1 code implementation • 4 Jul 2015 • Makoto Yamada, Wenzhao Lian, Amit Goyal, Jianhui Chen, Kishan Wimalawarne, Suleiman A. Khan, Samuel Kaski, Hiroshi Mamitsuka, Yi Chang
We propose the convex factorization machine (CFM), which is a convex variant of the widely used Factorization Machines (FMs).
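A hypothetical sketch of the two parameterizations being contrasted: a factorization machine scores x with pairwise-interaction weights Z = V Vᵀ, which is non-convex in V, while the convex variant keeps Z as a full matrix and controls its rank with a nuclear-norm penalty instead. Function names here are illustrative, not the paper's API.

```python
import numpy as np

def fm_predict(x, w0, w, V):
    # FM: pairwise weights factorized as Z = V @ V.T (non-convex in V).
    Z = V @ V.T
    pair = 0.5 * (x @ Z @ x - np.sum(np.diag(Z) * x**2))  # sum_{i<j} Z_ij x_i x_j
    return w0 + w @ x + pair

def cfm_predict(x, w0, w, Z):
    # Convex variant: Z optimized directly; convexity would come from a
    # nuclear-norm penalty on Z, e.g. np.linalg.norm(Z, ord="nuc").
    pair = 0.5 * (x @ Z @ x - np.sum(np.diag(Z) * x**2))
    return w0 + w @ x + pair

rng = np.random.default_rng(0)
x, w = rng.standard_normal(6), rng.standard_normal(6)
V = rng.standard_normal((6, 2))
y_fm = fm_predict(x, 0.3, w, V)
y_cfm = cfm_predict(x, 0.3, w, V @ V.T)   # same score when Z = V V^T
```

The two scores coincide whenever Z = V Vᵀ, which is why the convex relaxation can recover FM-style solutions while being trained with convex optimization.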
no code implementations • NeurIPS 2014 • Kishan Wimalawarne, Masashi Sugiyama, Ryota Tomioka
We study a multitask learning problem in which each task is parametrized by a weight vector and indexed by a pair of indices, e.g., (consumer, time).
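A sketch of why the pair-of-indices structure matters, under an assumed setup: each task (consumer c, time t) has a weight vector in R^d, so stacking them yields a 3-way tensor W of shape (d, C, T), and a low-rank tensor norm on W couples the tasks. The overlapped trace norm below (sum of nuclear norms of the three mode unfoldings) is one standard convex choice; the code is illustrative, not the paper's exact regularizer.

```python
import numpy as np

def unfold(W, mode):
    # Mode-m unfolding: move mode m to the front and flatten the rest.
    return np.moveaxis(W, mode, 0).reshape(W.shape[mode], -1)

def overlapped_trace_norm(W):
    # Sum of nuclear norms of all mode unfoldings.
    return sum(np.linalg.norm(unfold(W, m), ord="nuc") for m in range(W.ndim))

rng = np.random.default_rng(0)
# Rank-1 case: every task's weight vector is a scaled copy of one direction.
W = np.einsum("i,j,k->ijk",
              rng.standard_normal(5),   # feature direction (d = 5)
              rng.standard_normal(4),   # consumer loadings (C = 4)
              rng.standard_normal(3))   # time loadings (T = 3)
val = overlapped_trace_norm(W)
```

For a rank-1 tensor every unfolding is a rank-1 matrix, so each nuclear norm equals the Frobenius norm of W and the overlapped norm is exactly three times it, which is the low-rank sharing the regularizer rewards.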