
no code implementations • 2 Nov 2021 • Matthias Weissenbacher, Samarth Sinha, Animesh Garg, Yoshinobu Kawahara

The learned policies may then be deployed in real-world settings where interactions are costly or dangerous.

1 code implementation • NeurIPS 2021 • Keisuke Fujii, Naoya Takeishi, Kazushi Tsutsui, Emyo Fujioka, Nozomi Nishiumi, Ryoya Tanaka, Mika Fukushiro, Kaoru Ide, Hiroyoshi Kohno, Ken Yoda, Susumu Takahashi, Shizuko Hiryu, Yoshinobu Kawahara

In this paper, we propose a new framework for learning Granger causality from multi-animal trajectories via augmented theory-based behavioral models with interpretable data-driven models.
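As background to this entry, a minimal sketch of plain bivariate Granger causality (comparing autoregressive fits with and without the other series) — this is the classical building block, not the paper's augmented theory-based framework, and all names and parameter choices here are illustrative:

```python
import numpy as np

def granger_gain(x, y, p=2):
    """Reduction in residual variance when past y is added to an
    AR(p) model of x: a simple bivariate Granger-causality score."""
    T = len(x)
    rows_x, rows_xy, targets = [], [], []
    for t in range(p, T):
        past_x = x[t - p:t][::-1]
        past_y = y[t - p:t][::-1]
        rows_x.append(past_x)
        rows_xy.append(np.concatenate([past_x, past_y]))
        targets.append(x[t])
    targets = np.array(targets)
    def resid_var(rows):
        A = np.array(rows)
        coef = np.linalg.lstsq(A, targets, rcond=None)[0]
        return np.var(targets - A @ coef)
    # > 0 suggests y helps predict x, i.e. y "Granger-causes" x.
    return np.log(resid_var(rows_x) / resid_var(rows_xy))

# Synthetic pair: y drives x, but not vice versa.
rng = np.random.default_rng(1)
y = rng.normal(size=500)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.5 * x[t - 1] + 0.8 * y[t - 1] + 0.1 * rng.normal()
g_xy = granger_gain(x, y)   # large: past y explains x
g_yx = granger_gain(y, x)   # near zero: past x does not explain y
print(g_xy, g_yx)
```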

1 code implementation • 30 Jun 2021 • Motoya Ohnishi, Isao Ishikawa, Kendall Lowrey, Masahiro Ikeda, Sham Kakade, Yoshinobu Kawahara

In this work, we present a novel paradigm of controlling nonlinear systems via the minimization of the Koopman spectrum cost: a cost over the Koopman operator of the controlled dynamics.

1 code implementation • 11 Mar 2021 • Matthias Weissenbacher, Yoshinobu Kawahara

In this work we discuss the incorporation of quadratic neurons into policy networks in the context of model-free actor-critic reinforcement learning.
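For reference, a quadratic neuron in its common form computes a quadratic rather than linear function of its input; the sketch below shows that form only (the paper's exact parameterization within actor-critic policy networks may differ):

```python
import numpy as np

def quadratic_neuron(x, W, w, b):
    """One quadratic neuron: y = x^T W x + w.x + b.
    A linear neuron is the special case W = 0."""
    return x @ W @ x + w @ x + b

x = np.array([1.0, 2.0])
W = np.eye(2)                       # identity: quadratic term is ||x||^2
y = quadratic_neuron(x, W, np.zeros(2), 0.0)
print(y)   # 1^2 + 2^2 = 5.0
```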

no code implementations • 19 Feb 2021 • Naoya Takeishi, Keisuke Fujii, Koh Takeuchi, Yoshinobu Kawahara

Extracting coherent patterns is one of the standard approaches towards understanding spatio-temporal data.

no code implementations • 9 Feb 2021 • Tomoharu Iwata, Yoshinobu Kawahara

With the proposed method, a representation of a given short time-series is obtained by a bidirectional LSTM for extracting its properties.

no code implementations • 27 Jan 2021 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi Katsura, Yoshinobu Kawahara

Kernel methods have been among the most popular techniques in machine learning, where learning tasks are solved using the property of reproducing kernel Hilbert space (RKHS).

no code implementations • 11 Dec 2020 • Tomoharu Iwata, Yoshinobu Kawahara

With our proposed method, the forecast error is backpropagated through the neural networks and the spectral decomposition, enabling end-to-end learning of Koopman spectral analysis.
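The non-differentiable baseline that this entry builds on is classical dynamic mode decomposition (DMD), which estimates the Koopman spectrum from snapshot pairs by a linear least-squares fit. A minimal sketch (standard DMD only, not the paper's end-to-end neural method):

```python
import numpy as np

def dmd_eigs(X, Y, rank=None):
    """Estimate Koopman (DMD) eigenvalues from snapshot pairs.

    X, Y: (d, m) arrays with Y[:, t] the one-step successor of X[:, t].
    Returns eigenvalues of the best-fit linear operator A with Y = A X.
    """
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    if rank is not None:
        U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    # Project A = Y X^+ onto the leading POD modes (standard DMD).
    A_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)
    return np.linalg.eigvals(A_tilde)

# Linear test system x_{t+1} = A x_t with known eigenvalues 0.9 and 0.5.
A = np.array([[0.9, 0.1],
              [0.0, 0.5]])
x = np.array([1.0, 1.0])
traj = [x]
for _ in range(50):
    x = A @ x
    traj.append(x)
traj = np.array(traj).T                       # shape (2, 51)
eigs = np.sort(dmd_eigs(traj[:, :-1], traj[:, 1:]).real)
print(eigs)   # recovers [0.5, 0.9]
```

In the paper's setting, the observables fed to this decomposition are themselves neural networks, and the forecast error is backpropagated through the spectral step.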

no code implementations • 29 Jul 2020 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Yoshinobu Kawahara

Kernel mean embedding (KME) is a powerful tool to analyze probability measures for data, where the measures are conventionally embedded into a reproducing kernel Hilbert space (RKHS).
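The conventional RKHS embedding this entry refers to can be sketched in a few lines: the empirical KME of a sample is the mean of kernel features, and the distance between two embeddings is the maximum mean discrepancy (MMD). A minimal sketch with a Gaussian kernel (illustrative bandwidth, not the paper's vRKHS generalization):

```python
import numpy as np

def rbf(X, Y, gamma=0.5):
    """Gaussian kernel matrix k(x, y) = exp(-gamma * ||x - y||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2(X, Y, gamma=0.5):
    """Squared RKHS distance ||mu_X - mu_Y||^2 between empirical KMEs
    (biased estimator: plain means of the three kernel blocks)."""
    return (rbf(X, X, gamma).mean() + rbf(Y, Y, gamma).mean()
            - 2.0 * rbf(X, Y, gamma).mean())

rng = np.random.default_rng(0)
same  = mmd2(rng.normal(0, 1, (200, 2)), rng.normal(0, 1, (200, 2)))
shift = mmd2(rng.normal(0, 1, (200, 2)), rng.normal(2, 1, (200, 2)))
print(same, shift)   # the mean-shifted pair has much larger MMD
```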

2 code implementations • 7 Jul 2020 • Keisuke Fujii, Naoya Takeishi, Yoshinobu Kawahara, Kazuya Takeda

Extracting the rules of real-world biological multi-agent behaviors is a current challenge in various scientific and engineering fields.

no code implementations • 16 Jun 2020 • Naoya Takeishi, Yoshinobu Kawahara

Invariance and stability are essential notions in dynamical systems study, and thus it is of great interest to learn a dynamics model with a stable invariant set.

1 code implementation • 9 Apr 2020 • Naoya Takeishi, Yoshinobu Kawahara

We focus on the semi-supervised anomaly detection and newly propose a characteristic function, on which the Shapley value is computed, specifically for anomaly scores.
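The Shapley value underlying this entry can be computed exactly for small feature counts by averaging marginal contributions over all coalitions. The sketch below is the generic definition with a placeholder additive value function; the paper's contribution is the anomaly-score-specific characteristic function, which is not reproduced here:

```python
import math
from itertools import combinations

def shapley_values(n, value):
    """Exact Shapley values for n players and a set function `value`
    mapping a frozenset of player indices to a real number.
    Exponential in n, so suitable only for small feature counts."""
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(len(others) + 1):
            w = (math.factorial(k) * math.factorial(n - k - 1)
                 / math.factorial(n))
            for S in combinations(others, k):
                S = frozenset(S)
                phi[i] += w * (value(S | {i}) - value(S))
    return phi

# Toy additive "score": Shapley recovers each feature's weight exactly.
weights = [3.0, 1.0, 0.5]
score = lambda S: sum(weights[i] for i in S)
phi = shapley_values(3, score)
print(phi)   # approximately [3.0, 1.0, 0.5]
```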

no code implementations • 2 Mar 2020 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi Katsura, Yoshinobu Kawahara

Kernel methods have been among the most popular techniques in machine learning, where learning tasks are solved using the property of reproducing kernel Hilbert space (RKHS).

no code implementations • 25 Sep 2019 • Israr Ul Haq, Yoshinobu Kawahara

Extracting underlying dynamics of objects in image sequences is one of the challenging problems in computer vision.

no code implementations • 9 Sep 2019 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Yoichi Matsuo, Yoshinobu Kawahara

In this paper, we address a lifted representation of nonlinear dynamical systems with random noise based on transfer operators, and develop a novel Krylov subspace method for estimating the operators using finite data, with consideration of the unboundedness of operators.


no code implementations • 17 Jun 2019 • Isao Ishikawa, Akinori Tanaka, Masahiro Ikeda, Yoshinobu Kawahara

We empirically illustrate our metric with synthetic data, and evaluate it in the context of the independence test for random processes.

1 code implementation • 13 May 2019 • Keisuke Fujii, Naoya Takeishi, Motokazu Hojo, Yuki Inaba, Yoshinobu Kawahara

A fundamental question addressed here pertains to the classification of collective motion networks based on physically interpretable dynamical properties.

no code implementations • 26 Apr 2019 • Naoya Uematsu, Shunji Umetani, Yoshinobu Kawahara

For the problem of maximizing an approximately submodular function (ASFM problem), a greedy algorithm quickly finds good feasible solutions for many instances while guaranteeing a $(1-e^{-\gamma})$-approximation ratio for a given submodularity ratio $\gamma$.
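The greedy algorithm named in this entry is the standard one: repeatedly add the element with the largest marginal gain until the cardinality budget is exhausted. A minimal sketch with a coverage function (a classic submodular example; the ASFM setting runs the same loop on an approximately submodular f):

```python
def greedy_max(ground, f, k):
    """Greedy maximization of a set function f under |S| <= k.
    For submodular f this achieves the (1 - 1/e) guarantee; for
    approximately submodular f, (1 - e^{-gamma}) with submodularity
    ratio gamma."""
    S = set()
    for _ in range(k):
        candidates = sorted(ground - S)
        if not candidates:
            break
        gains = [(f(S | {e}) - f(S), e) for e in candidates]
        best_gain, best = max(gains)
        if best_gain <= 0:
            break
        S.add(best)
    return S

# Weighted coverage example: pick 2 sets covering as many items as possible.
cover = {0: {"a", "b"}, 1: {"b", "c"}, 2: {"c"}, 3: {"d"}}
f = lambda S: len(set().union(*(cover[e] for e in S))) if S else 0
S = greedy_max(set(cover), f, 2)
print(S, f(S))   # a 2-element set covering 3 items
```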

no code implementations • 6 Feb 2019 • Naoya Takeishi, Yoshinobu Kawahara

Prior domain knowledge can greatly help to learn generative models.

no code implementations • 10 Nov 2018 • Naoya Uematsu, Shunji Umetani, Yoshinobu Kawahara

Nemhauser and Wolsey developed an exact algorithm, called the constraint generation algorithm, that starts from a reduced BIP problem containing only a small subset of the constraints and repeatedly solves the reduced problem, adding a newly violated constraint at each iteration.
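The constraint generation loop described in this entry can be sketched on a toy instance. Here a brute-force enumeration stands in for a real BIP solver (illustrative only, exponential in the number of variables):

```python
from itertools import product

def solve_bip(n, c, constraints):
    """Brute-force 0-1 program: maximize c.x s.t. a.x <= b for each (a, b).
    Placeholder for a real BIP solver; toy sizes only."""
    best, best_x = None, None
    for x in product((0, 1), repeat=n):
        if all(sum(ai * xi for ai, xi in zip(a, x)) <= b
               for a, b in constraints):
            v = sum(ci * xi for ci, xi in zip(c, x))
            if best is None or v > best:
                best, best_x = v, x
    return best_x

def constraint_generation(n, c, all_constraints):
    """Start from no constraints; re-solve the reduced problem, adding
    one violated constraint per iteration, until the optimum of the
    reduced problem is feasible for the full problem."""
    active = []
    while True:
        x = solve_bip(n, c, active)
        violated = next((cb for cb in all_constraints
                         if sum(a * xi for a, xi in zip(cb[0], x)) > cb[1]),
                        None)
        if violated is None:
            return x, len(active)
        active.append(violated)

# Toy instance: maximize x1 + x2 + x3 under pairwise conflict constraints.
cons = [([1, 1, 0], 1), ([0, 1, 1], 1), ([1, 0, 1], 1)]
x, n_added = constraint_generation(3, [1, 1, 1], cons)
print(x, n_added)   # at most one variable can be 1
```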

1 code implementation • 30 Aug 2018 • Keisuke Fujii, Yoshinobu Kawahara

In this paper, we formulate Koopman spectral analysis for nonlinear dynamical systems (NLDSs) with structures among observables and propose an estimation algorithm for this problem.

2 code implementations • NeurIPS 2018 • Isao Ishikawa, Keisuke Fujii, Masahiro Ikeda, Yuka Hashimoto, Yoshinobu Kawahara

The development of a metric for structured data is a long-standing problem in pattern recognition and machine learning.

no code implementations • NeurIPS 2017 • Naoya Takeishi, Yoshinobu Kawahara, Takehisa Yairi

Spectral decomposition of the Koopman operator is attracting attention as a tool for the analysis of nonlinear dynamical systems.

no code implementations • NeurIPS 2016 • Yoshinobu Kawahara

In this paper, we consider a spectral analysis of the Koopman operator in a reproducing kernel Hilbert space (RKHS).

no code implementations • 14 Sep 2015 • Yoshinobu Kawahara, Yutaro Yamaguchi

The proximal problem for structured penalties obtained via convex relaxations of submodular functions is known to be equivalent to minimizing separable convex functions over the corresponding submodular polyhedra.

no code implementations • 9 Aug 2014 • Shohei Shimizu, Aapo Hyvarinen, Yoshinobu Kawahara

Structural equation models and Bayesian networks have been widely used to analyze causal relations between continuous variables.

no code implementations • 22 Jan 2014 • Takanori Inazumi, Takashi Washio, Shohei Shimizu, Joe Suzuki, Akihiro Yamamoto, Yoshinobu Kawahara

Discovering causal relations among observed variables in a given data set is a major objective in studies of statistics and artificial intelligence.

no code implementations • 26 Sep 2013 • Kiyohito Nagano, Yoshinobu Kawahara

A number of discrete and continuous optimization problems in machine learning are related to convex minimization problems under submodular constraints.

no code implementations • NeurIPS 2012 • Tsuyoshi Ueno, Kohei Hayashi, Takashi Washio, Yoshinobu Kawahara

Reinforcement learning (RL) methods based on direct policy search (DPS) have been actively studied as an efficient approach to complicated Markov decision processes (MDPs).

no code implementations • 10 Nov 2012 • Chloé-Agathe Azencott, Dominik Grimm, Mahito Sugiyama, Yoshinobu Kawahara, Karsten M. Borgwardt

We present SConES, a new efficient method to discover sets of genetic loci that are maximally associated with a phenotype, while being connected in an underlying network.

no code implementations • NeurIPS 2010 • Kiyohito Nagano, Yoshinobu Kawahara, Satoru Iwata

In this paper, we introduce the minimum average cost criterion, and show that the theory of intersecting submodular functions can be used for clustering with submodular objective functions.

no code implementations • NeurIPS 2009 • Yoshinobu Kawahara, Kiyohito Nagano, Koji Tsuda, Jeff A. Bilmes

Several key problems in machine learning, such as feature selection and active learning, can be formulated as submodular set function maximization.

Papers With Code is a free resource with all data licensed under CC-BY-SA.