no code implementations • 6 Dec 2023 • Claudio Zeni, Robert Pinsler, Daniel Zügner, Andrew Fowler, Matthew Horton, Xiang Fu, Sasha Shysheya, Jonathan Crabbé, Lixin Sun, Jake Smith, Bichlien Nguyen, Hannes Schulz, Sarah Lewis, Chin-wei Huang, Ziheng Lu, Yichi Zhou, Han Yang, Hongxia Hao, Jielan Li, Ryota Tomioka, Tian Xie
We further introduce adapter modules to enable fine-tuning towards any given property constraints with a labeled dataset.
no code implementations • 13 Sep 2023 • Marco Federici, Patrick Forré, Ryota Tomioka, Bastiaan S. Veeling
Markov processes are widely used mathematical models for describing dynamic systems in various fields.
1 code implementation • NeurIPS 2023 • Leon Klein, Andrew Y. K. Foong, Tor Erlend Fjelde, Bruno Mlodozeniec, Marc Brockschmidt, Sebastian Nowozin, Frank Noé, Ryota Tomioka
Molecular dynamics (MD) simulation is a widely used technique to simulate molecular systems, most commonly at the all-atom resolution where equations of motion are integrated with timesteps on the order of femtoseconds ($1\textrm{fs}=10^{-15}\textrm{s}$).
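The all-atom integration the abstract refers to can be illustrated with a standard velocity-Verlet step — a generic textbook integrator used as a sketch here, not the method proposed in the paper:

```python
import numpy as np

def velocity_verlet(x, v, force, mass, dt, steps):
    # Minimal velocity-Verlet integrator: MD codes advance positions
    # and velocities with a small timestep dt (on the order of
    # 1 fs = 1e-15 s for all-atom simulation).
    a = force(x) / mass
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt**2
        a_new = force(x) / mass
        v = v + 0.5 * (a + a_new) * dt
        a = a_new
    return x, v
```

For a harmonic oscillator (force = -k x), the scheme approximately conserves total energy over many steps, which is why it is the default choice in MD.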
no code implementations • 9 Nov 2021 • Keshav Santhanam, Siddharth Krishna, Ryota Tomioka, Tim Harris, Matei Zaharia
The rapidly growing size of deep neural network (DNN) models and datasets has given rise to a variety of distribution strategies, such as data, tensor-model, and pipeline parallelism, as well as hybrid combinations thereof.

1 code implementation • NeurIPS 2021 • Marco Federici, Ryota Tomioka, Patrick Forré
Safely deploying machine learning models to the real world is often a challenging process.
no code implementations • 18 Jan 2021 • Hisham Husain, Kamil Ciosek, Ryota Tomioka
Entropic regularization of policies in Reinforcement Learning (RL) is a commonly used heuristic to ensure that the learned policy explores the state space sufficiently before overfitting to a locally optimal policy.
1 code implementation • NeurIPS 2020 • Chen Liu, Mathieu Salzmann, Tao Lin, Ryota Tomioka, Sabine Süsstrunk
We analyze the influence of adversarial training on the loss landscape of machine learning models.
no code implementations • ICLR 2020 • Kamil Ciosek, Vincent Fortuin, Ryota Tomioka, Katja Hofmann, Richard Turner
Obtaining high-quality uncertainty estimates is essential for many applications of deep neural networks.
no code implementations • 15 Mar 2019 • Chen Liu, Ryota Tomioka, Volkan Cevher
This work studies the robustness certification problem of neural network models, which aims to find certified adversary-free regions as large as possible around data points.
4 code implementations • NeurIPS 2019 • Emile Mathieu, Charline Le Lan, Chris J. Maddison, Ryota Tomioka, Yee Whye Teh
We therefore endow VAEs with a Poincaré ball model of hyperbolic geometry as a latent space and rigorously derive the necessary methods to work with two main Gaussian generalisations on that space.
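A hypothetical sketch of the exponential map at the origin of the Poincaré ball with curvature $-c$, the standard hyperbolic-geometry operation used to push tangent-space samples onto the ball; this is generic machinery assumed from the hyperbolic-VAE literature, not code from the paper:

```python
import numpy as np

def poincare_exp0(v, c=1.0):
    # Exponential map at the origin of the Poincare ball:
    # exp_0(v) = tanh(sqrt(c) ||v||) * v / (sqrt(c) ||v||).
    # Maps any tangent vector to a point strictly inside the unit ball,
    # preserving its direction.
    norm = np.linalg.norm(v)
    if norm == 0:
        return np.zeros_like(v)
    sc = np.sqrt(c)
    return np.tanh(sc * norm) * v / (sc * norm)
```

Because tanh saturates below 1, even tangent vectors of large norm land inside the ball, which is what makes the map usable for reparametrized sampling in a hyperbolic latent space.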
no code implementations • 29 May 2018 • Justas Dauparas, Ryota Tomioka, Katja Hofmann
How to explore, i.e., take actions with uncertain outcomes in order to learn about possible future rewards, is a key question in reinforcement learning (RL).
no code implementations • ICLR 2018 • Mary Phuong, Max Welling, Nate Kushman, Ryota Tomioka, Sebastian Nowozin
Thus, we decouple the choice of decoder capacity and the latent code dimensionality from the amount of information stored in the code.
1 code implementation • ICLR 2018 • Alexander L. Gaunt, Matthew A. Johnson, Maik Riechert, Daniel Tarlow, Ryota Tomioka, Dimitrios Vytiniotis, Sam Webster
Through an implementation on multi-core CPUs, we show that AMP training converges to the same accuracy as conventional synchronous training algorithms in a similar number of epochs, but utilizes the available hardware more efficiently even for small minibatch sizes, resulting in significantly shorter overall training times.
2 code implementations • 24 May 2017 • Diane Bouchacourt, Ryota Tomioka, Sebastian Nowozin
We would like to learn a representation of the data which decomposes an observation into factors of variation which we can independently control.
1 code implementation • 8 May 2017 • Behnam Neyshabur, Ryota Tomioka, Ruslan Salakhutdinov, Nathan Srebro
We argue that the optimization plays a crucial role in generalization of deep learning models through implicit regularization.
no code implementations • 10 Feb 2017 • Kirthevasan Kandasamy, Yoram Bachrach, Ryota Tomioka, Daniel Tarlow, David Carter
We study reinforcement learning of chatbots with recurrent neural network architectures when the rewards are noisy and expensive to obtain.
no code implementations • 7 Nov 2016 • Liwen Zhang, John Winn, Ryota Tomioka
We propose the Gaussian attention model for content-based neural memory access.
2 code implementations • NeurIPS 2017 • Dan Alistarh, Demjan Grubic, Jerry Li, Ryota Tomioka, Milan Vojnovic
In this paper, we propose Quantized SGD (QSGD), a family of compression schemes which allow the compression of gradient updates at each node, while guaranteeing convergence under standard assumptions.
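The stochastic quantization idea behind QSGD can be sketched as follows — a simplified, unbiased quantizer in the spirit of the paper; the function name and level count are illustrative, not the reference implementation:

```python
import numpy as np

def qsgd_quantize(g, num_levels=4, rng=np.random.default_rng(0)):
    # Stochastic gradient quantization: scale each component by the
    # vector norm, then round |g_i|/||g|| * num_levels up or down at
    # random so that the quantized vector equals g in expectation.
    norm = np.linalg.norm(g)
    if norm == 0:
        return np.zeros_like(g)
    scaled = np.abs(g) / norm * num_levels
    lower = np.floor(scaled)
    prob_up = scaled - lower            # probability of rounding up
    levels = lower + (rng.random(g.shape) < prob_up)
    return np.sign(g) * norm * levels / num_levels
```

Each node then transmits only the norm, the signs, and the small integer levels, trading bandwidth against quantization variance while keeping SGD convergent under standard assumptions.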
2 code implementations • NeurIPS 2016 • Sebastian Nowozin, Botond Cseke, Ryota Tomioka
Generative neural samplers are probabilistic models that implement sampling using feedforward neural networks: they take a random input vector and produce a sample from a probability distribution defined by the network weights.
1 code implementation • 15 Dec 2015 • Shinichi Nakajima, Ryota Tomioka, Masashi Sugiyama, S. Derin Babacan
In this paper, we clarify the behavior of VB learning in probabilistic PCA (or fully-observed matrix factorization).
no code implementations • 20 Nov 2015 • Behnam Neyshabur, Ryota Tomioka, Ruslan Salakhutdinov, Nathan Srebro
We propose a unified framework for neural net normalization, regularization and optimization, which includes Path-SGD and Batch-Normalization and interpolates between them across two different dimensions.
no code implementations • 6 Sep 2015 • Kishan Wimalawarne, Ryota Tomioka, Masashi Sugiyama
We theoretically and experimentally investigate tensor-based regression and classification.
1 code implementation • NeurIPS 2015 • Qinqing Zheng, Ryota Tomioka
We consider the problem of recovering a low-rank tensor from its noisy observation.
no code implementations • 5 Mar 2015 • Liwen Zhang, Subhransu Maji, Ryota Tomioka
Similarity between objects is multi-faceted and it can be easier for human annotators to measure it when the focus is on a specific aspect.
no code implementations • 27 Feb 2015 • Behnam Neyshabur, Ryota Tomioka, Nathan Srebro
We investigate the capacity, convexity and characterization of a general family of norm-constrained feed-forward networks.
no code implementations • 20 Dec 2014 • Behnam Neyshabur, Ryota Tomioka, Nathan Srebro
We present experiments demonstrating that some other form of capacity control, different from network size, plays a central role in learning multilayer feed-forward networks.
no code implementations • NeurIPS 2014 • Kishan Wimalawarne, Masashi Sugiyama, Ryota Tomioka
We study a multitask learning problem in which each task is parametrized by a weight vector and indexed by a pair of indices, which can be, e.g., (consumer, time).
no code implementations • 7 Jul 2014 • Ryota Tomioka, Taiji Suzuki
We show that the spectral norm of a random $n_1\times n_2\times \cdots \times n_K$ tensor (or higher-order array) scales as $O\left(\sqrt{(\sum_{k=1}^{K}n_k)\log(K)}\right)$ under some sub-Gaussian assumption on the entries.
no code implementations • NeurIPS 2013 • Ryota Tomioka, Taiji Suzuki
We discuss structured Schatten norms for tensor decomposition, which include two recently proposed norms ("overlapped" and "latent") for convex-optimization-based tensor decomposition, and connect tensor decomposition with the wider literature on structured sparsity.
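The "overlapped" norm referred to here is commonly defined as the sum of nuclear norms of the mode-$k$ unfoldings of the tensor; a minimal sketch assuming that standard definition:

```python
import numpy as np

def overlapped_trace_norm(T):
    # Overlapped structured Schatten norm: sum over modes k of the
    # nuclear norm of the mode-k unfolding of T. This is a common
    # convex surrogate for low multilinear (Tucker) rank.
    total = 0.0
    for k in range(T.ndim):
        unfolding = np.moveaxis(T, k, 0).reshape(T.shape[k], -1)
        total += np.linalg.norm(unfolding, ord='nuc')
    return total
```

For a rank-1 tensor $a \otimes b \otimes c$, every unfolding is rank 1 with nuclear norm $\|a\|\|b\|\|c\|$, so the overlapped norm is $3\,\|a\|\|b\|\|c\|$.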
no code implementations • NeurIPS 2012 • Shinichi Nakajima, Ryota Tomioka, Masashi Sugiyama, S. D. Babacan
The variational Bayesian (VB) approach is one of the best tractable approximations to Bayesian estimation, and it has been demonstrated to perform well in many applications.
no code implementations • 17 Nov 2012 • Franz J. Király, Louis Theran, Ryota Tomioka
We present a novel algebraic combinatorial view on low-rank matrix completion based on studying relations between a few entries with tools from algebraic geometry and matroid theory.
no code implementations • NeurIPS 2011 • Ryota Tomioka, Taiji Suzuki, Kohei Hayashi, Hisashi Kashima
We analyze the statistical performance of a recently proposed convex tensor decomposition algorithm.
no code implementations • NeurIPS 2010 • Shinichi Nakajima, Masashi Sugiyama, Ryota Tomioka
Bayesian methods of matrix factorization (MF) have been actively explored recently as promising alternatives to classical singular value decomposition.