no code implementations • 20 Jun 2021 • Gerald M Pao, Cameron Smith, Joseph Park, Keichi Takahashi, Wassapon Watanakeesuntorn, Hiroaki Natsukawa, Sreekanth H Chalasani, Tom Lorimer, Ryousei Takano, Nuttida Rungratsameetaweemana, George Sugihara
Thus, as a final validation of how well GMN captures essential dynamic information, we show that the artificially generated time series can serve as a training set to predict observed fly locomotion and brain activity on withheld out-of-sample data not used in model building.
no code implementations • 19 Apr 2021 • Albert Njoroge Kahira, Truong Thao Nguyen, Leonardo Bautista Gomez, Ryousei Takano, Rosa M Badia, Mohamed Wahib
Deep Neural Network (DNN) frameworks use distributed training to reduce time to convergence and alleviate memory capacity limitations when training large models and/or using high-dimensional inputs.
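The core of the data-parallel training the abstract refers to is simple: each worker computes a gradient on its shard of the batch, and the gradients are averaged (an all-reduce) before every update. A minimal sketch of that idea on linear regression, with the workers simulated sequentially (no specific DNN framework's API is assumed):

```python
import numpy as np

def data_parallel_step(w, X, y, lr=0.1, workers=4):
    """One data-parallel SGD step: shard the batch across hypothetical
    workers, compute local MSE gradients, average them (the all-reduce),
    then apply a single shared update. Illustrative sketch only."""
    shards_X = np.array_split(X, workers)
    shards_y = np.array_split(y, workers)
    grads = []
    for Xs, ys in zip(shards_X, shards_y):
        err = Xs @ w - ys                        # local forward pass
        grads.append(2 * Xs.T @ err / len(ys))   # local gradient
    g = np.mean(grads, axis=0)                   # all-reduce: average
    return w - lr * g
```

With equal shard sizes the averaged gradient equals the full-batch gradient, which is why data parallelism changes throughput but not (in the synchronous case) the optimization trajectory.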
1 code implementation • 2 Dec 2020 • Wassapon Watanakeesuntorn, Keichi Takahashi, Kohei Ichikawa, Joseph Park, George Sugihara, Ryousei Takano, Jason Haga, Gerald M. Pao
Empirical Dynamic Modeling (EDM) is a nonlinear time series causal inference framework.
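A building block of EDM is simplex projection: delay-embed a scalar series into lagged coordinates and predict ahead from the nearest neighbors in that embedding. The toy sketch below illustrates the idea only; it is not the API of any EDM library, and the parameter choices (E, tau, tp) are illustrative defaults:

```python
import numpy as np

def simplex_projection(series, E=3, tau=1, tp=1):
    """Toy simplex projection: embed `series` in E lagged coordinates,
    then predict tp steps ahead from the E+1 nearest embedded neighbors,
    weighted by exponentially decaying distance."""
    n = len(series)
    idx = np.arange((E - 1) * tau, n - tp)
    # Delay-embedding matrix: rows are [x_t, x_{t-tau}, ..., x_{t-(E-1)tau}]
    emb = np.column_stack([series[idx - j * tau] for j in range(E)])
    targets = series[idx + tp]
    preds = np.empty(len(idx))
    for i in range(len(idx)):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf                      # leave-one-out: skip the query itself
        nn = np.argsort(d)[: E + 1]        # E+1 nearest neighbors (the "simplex")
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))
        preds[i] = np.sum(w * targets[nn]) / np.sum(w)
    return preds, targets
```

On a deterministic signal the out-of-sample skill of such a predictor is what EDM uses to detect nonlinear structure and, via cross mapping, causal influence.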
no code implementations • 26 Aug 2020 • Mohamed Wahib, Haoyu Zhang, Truong Thao Nguyen, Aleksandr Drozd, Jens Domke, Lingqi Zhang, Ryousei Takano, Satoshi Matsuoka
An alternative solution is to use out-of-core methods instead of, or in addition to, data parallelism.
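The out-of-core pattern keeps the working set on slower storage and streams only a bounded window through memory at a time. A generic sketch of that pattern (unrelated to any specific framework; the function name and chunk size are illustrative):

```python
import numpy as np

def out_of_core_sum_squares(path, dtype=np.float64, chunk=1_000_000):
    """Stream a large on-disk float array through memory in fixed-size
    chunks via np.memmap, so peak resident memory is one chunk rather
    than the whole array. Illustrative out-of-core sketch."""
    mm = np.memmap(path, dtype=dtype, mode="r")
    total = 0.0
    for start in range(0, mm.shape[0], chunk):
        block = np.asarray(mm[start:start + chunk])  # only this chunk is resident
        total += float(np.dot(block, block))
    return total
```

The same bounded-window idea, applied to activations and weights instead of a flat array, is what lets out-of-core DNN training exceed device memory capacity.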
1 code implementation • 28 Jul 2019 • Takahiro Hirofuchi, Ryousei Takano
Our experiments confirmed that even when only 1% of a VM's RAM is DRAM, memory-mapping optimization drastically alleviates the VM's performance degradation.
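The underlying mechanism being optimized is memory mapping: pages of a backing store are faulted into DRAM only when touched, so the fast tier holds just the hot working set. A toy application-level illustration of that mechanism (the paper's optimizations operate in the hypervisor, not at this level; `map_file` is a hypothetical helper):

```python
import mmap
import os

def map_file(path):
    """Map a file read-only into the address space. Bytes are paged into
    DRAM on access rather than read eagerly, so untouched regions cost
    no fast memory. Toy illustration of the memory-mapping mechanism."""
    fd = os.open(path, os.O_RDONLY)
    try:
        # Length 0 maps the whole file; the mapping survives closing the fd.
        return mmap.mmap(fd, 0, access=mmap.ACCESS_READ)
    finally:
        os.close(fd)
```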
2 code implementations • 5 Feb 2019 • Yuma Kishi, Tsutomu Ikegami, Shin-ichi O'uchi, Ryousei Takano, Wakana Nogami, Tomohiro Kudoh
Perturbative GAN replaces the convolution layers of existing convolutional GANs (DCGAN, WGAN-GP, BigGAN, etc.) with perturbation layers.