no code implementations • 12 Feb 2024 • Junpei Komiyama, Shinji Ito, Yuichi Yoshida, Souta Koshino

For the analysis of these algorithms, we propose a principled approach to limiting the probability of nonreplication.

no code implementations • 6 Feb 2024 • Yuta Kawachi, Mitsuru Ambai, Yuichi Yoshida, Gaku Takano

The current-vector trajectories showed that the RNN could automatically determine trajectories in the flux-weakening region in accordance with an arbitrarily designed loss function.

no code implementations • 25 Apr 2023 • Yuri Kinoshita, Kenta Oono, Kenji Fukumizu, Yuichi Yoshida, Shin-ichi Maeda

Variational autoencoders (VAEs) are among the deep generative models that have seen enormous success over the past decade.

no code implementations • 18 Jan 2022 • Akbar Rafiey, Yuichi Yoshida

The underlying submodular functions for many of these tasks are decomposable, i.e., they are the sum of several simple submodular functions.

no code implementations • 7 Jun 2021 • Pan Peng, Daniel Lopatta, Yuichi Yoshida, Gramoz Goranci

Effective resistance is an important metric that measures the similarity of two vertices in a graph.
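As a quick illustration (not the paper's algorithm), effective resistance can be computed exactly from the Moore–Penrose pseudoinverse of the graph Laplacian, $R(u, v) = (e_u - e_v)^\top L^+ (e_u - e_v)$; a minimal NumPy sketch:

```python
import numpy as np

def effective_resistance(adj, u, v):
    """Effective resistance between u and v via the Laplacian pseudoinverse."""
    L = np.diag(adj.sum(axis=1)) - adj          # graph Laplacian L = D - A
    Lp = np.linalg.pinv(L)                      # Moore-Penrose pseudoinverse
    e = np.zeros(len(adj))
    e[u], e[v] = 1.0, -1.0
    return float(e @ Lp @ e)

# Path graph 0-1-2: unit-resistance edges in series add up.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
```

On the path graph, `effective_resistance(A, 0, 2)` returns 2, matching two unit resistors in series.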

no code implementations • EACL 2021 • Danushka Bollegala, Huda Hakami, Yuichi Yoshida, Ken-ichi Kawarabayashi

Embedding entities and relations of a knowledge graph in a low-dimensional space has shown impressive performance in predicting missing links between entities.

1 code implementation • 25 Jan 2021 • Danushka Bollegala, Huda Hakami, Yuichi Yoshida, Ken-ichi Kawarabayashi

Embedding entities and relations of a knowledge graph in a low-dimensional space has shown impressive performance in predicting missing links between entities.

no code implementations • NeurIPS 2020 • Shinji Ito, Shuichi Hirahara, Tasuku Soma, Yuichi Yoshida

We propose novel algorithms with first- and second-order regret bounds for adversarial linear bandits.

no code implementations • 15 Jul 2020 • Nathaniel Harms, Yuichi Yoshida

For many important classes of functions, such as intersections of halfspaces, polynomial threshold functions, convex sets, and $k$-alternating functions, the known algorithms either have complexity that depends on the support size of the distribution, or are proven to work only for specific examples of product distributions.

no code implementations • ICML 2020 • Akbar Rafiey, Yuichi Yoshida

In this paper, we study the problem of maximizing monotone submodular functions subject to matroid constraints in the framework of differential privacy.

1 code implementation • 15 Jun 2020 • Yuuki Takai, Atsushi Miyauchi, Masahiro Ikeda, Yuichi Yoshida

For both algorithms, we discuss theoretical guarantees on the conductance of the output vertex set.

no code implementations • 7 Jun 2020 • Pan Peng, Yuichi Yoshida

To make reliable and efficient decisions based on spectral clustering, we assess its stability against edge perturbations in the input graph using the notion of average sensitivity, i.e., the expected size of the symmetric difference between the output clusters before and after edges are randomly removed.
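Average sensitivity can be estimated empirically by Monte Carlo; the sketch below uses a simple sign-based 2-way spectral cut as a stand-in for the clustering algorithm (the function names, the clustering method, and the edge-removal probability are all illustrative, not from the paper):

```python
import numpy as np

def spectral_cut(adj):
    """2-way cut from the sign pattern of the Laplacian's Fiedler vector."""
    L = np.diag(adj.sum(axis=1)) - adj
    _, vecs = np.linalg.eigh(L)
    return set(np.flatnonzero(vecs[:, 1] >= 0))

def average_sensitivity(adj, p=0.1, trials=200, seed=0):
    """Monte Carlo estimate of E[|C(G) symdiff C(G with random edges removed)|]."""
    rng = np.random.default_rng(seed)
    base = spectral_cut(adj)
    n = len(adj)
    total = 0
    for _ in range(trials):
        pert = adj.copy()
        for i in range(n):
            for j in range(i + 1, n):
                if pert[i, j] and rng.random() < p:
                    pert[i, j] = pert[j, i] = 0.0
        cut = spectral_cut(pert)
        # symmetric difference, up to relabelling the two sides of the cut
        total += min(len(base ^ cut), len(base ^ (set(range(n)) - cut)))
    return total / trials

# Two triangles joined by a single bridge edge: a stable 2-clustering.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
```

On this well-clustered graph the estimate stays small, since removing a few edges rarely moves vertices across the cut.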

no code implementations • 14 Feb 2020 • Tasuku Soma, Yuichi Yoshida

For convex and Lipschitz loss functions, we show that our algorithm has $O(1/\sqrt{n})$-convergence to the optimal CVaR, where $n$ is the number of samples.

no code implementations • 13 Feb 2020 • Chien-Chung Huang, Naonori Kakimura, Simon Mauras, Yuichi Yoshida

The latter almost matches our lower bound of $\frac{K}{2K-1}$, which almost settles the approximation ratio for a matroid constraint achievable by a streaming algorithm whose space complexity is independent of $n$.

no code implementations • ICLR 2019 • Danushka Bollegala, Huda Hakami, Yuichi Yoshida, Ken-ichi Kawarabayashi

Existing methods for learning KGEs can be seen as a two-stage process where (a) entities and relations in the knowledge graph are represented using some linear algebraic structures (embeddings), and (b) a scoring function is defined that evaluates the strength of a relation that holds between two entities using the corresponding relation and entity embeddings.
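TransE (Bordes et al.) is a standard instance of stage (b)'s scoring function: a relation is modeled as a translation vector, so a plausible triple $(h, r, t)$ has $h + r \approx t$. A sketch with random toy vectors (not learned embeddings, and not necessarily the score studied in this paper):

```python
import numpy as np

def transe_score(h, r, t):
    """TransE scoring function: a relation r acts as a translation,
    so a plausible triple (h, r, t) has h + r close to t (higher = better)."""
    return -float(np.linalg.norm(h + r - t))

rng = np.random.default_rng(0)
h, r = rng.normal(size=8), rng.normal(size=8)
t_true = h + r               # tail entity that fits the relation exactly
t_rand = rng.normal(size=8)  # unrelated tail entity
```

Ranking candidate tails by this score is how missing links are predicted: `t_true` scores strictly higher than `t_rand`.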

no code implementations • 28 Jan 2019 • Kohei Hayashi, Masaaki Imaizumi, Yuichi Yoshida

In this paper, we study random subsampling of Gaussian process regression, one of the simplest approximation baselines, from a theoretical perspective.
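A minimal sketch of the baseline being analyzed: exact GP regression costs $O(n^3)$ in the number of training points, so one fits on a random subsample of $m \ll n$ points instead (the kernel, hyperparameters, and data here are illustrative):

```python
import numpy as np

def gp_predict(X_train, y_train, X_test, length_scale=1.0, noise=1e-2):
    """Exact GP regression posterior mean with an RBF kernel (O(m^3) in m)."""
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * length_scale ** 2))
    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    return rbf(X_test, X_train) @ np.linalg.solve(K, y_train)

# Random subsampling: fit on m = 40 of the n = 200 points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0])
idx = rng.choice(len(X), size=40, replace=False)
X_test = np.array([[0.0], [1.5]])
pred = gp_predict(X[idx], y[idx], X_test)
```

Even on the 40-point subsample, the posterior mean tracks the target function closely, which is the kind of behavior the paper analyzes theoretically.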

no code implementations • 13 Sep 2018 • Kent Fujiwara, Ikuro Sato, Mitsuru Ambai, Yuichi Yoshida, Yoshiaki Sakakura

We present a novel compact point cloud representation that is inherently invariant to scale, coordinate change and point permutation.

no code implementations • 26 Feb 2018 • Fanhua Shang, Yuanyuan Liu, Kaiwen Zhou, James Cheng, Kelvin K. W. Ng, Yuichi Yoshida

To achieve sufficient decrease in stochastic optimization, we design a new sufficient-decrease criterion, which as a byproduct yields sufficient-decrease versions of stochastic variance-reduction algorithms such as SVRG-SD and SAGA-SD.

38 code implementations • ICLR 2018 • Takeru Miyato, Toshiki Kataoka, Masanori Koyama, Yuichi Yoshida

One of the challenges in the study of generative adversarial networks is the instability of their training.
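This is the spectral normalization paper: each weight matrix of the discriminator is divided by its largest singular value, which is estimated cheaply by power iteration. A NumPy sketch of the estimator (the framework-specific running-average details are omitted):

```python
import numpy as np

def spectral_norm(W, n_iters=100, seed=0):
    """Largest singular value of W, estimated by power iteration on W W^T."""
    rng = np.random.default_rng(seed)
    u = rng.normal(size=W.shape[0])
    for _ in range(n_iters):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    return float(u @ W @ v)   # converges to sigma_max(W)

rng = np.random.default_rng(1)
W = rng.normal(size=(5, 3))
W_sn = W / spectral_norm(W)   # spectrally normalized weight: sigma_max = 1
```

After normalization the layer is 1-Lipschitz, which is what constrains the discriminator and stabilizes training.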

1 code implementation • NeurIPS 2017 • Kohei Hayashi, Yuichi Yoshida

Then, we show that the residual error of the Tucker decomposition of $\tilde{X}$ is sufficiently close to that of $X$ with high probability.
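For context, here is a minimal (non-randomized) Tucker decomposition via truncated HOSVD; the entry's claim is that running such a decomposition on the subsampled tensor $\tilde{X}$ gives nearly the same residual as on $X$. All names below are illustrative:

```python
import numpy as np

def unfold(X, mode):
    """Mode-m matricization: move mode m to the front and flatten the rest."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def hosvd(X, ranks):
    """Truncated HOSVD: factor per unfolding, then project to get the core."""
    U = [np.linalg.svd(unfold(X, m))[0][:, :r] for m, r in enumerate(ranks)]
    G = X
    for m, Um in enumerate(U):
        G = np.moveaxis(np.tensordot(Um.T, np.moveaxis(G, m, 0), axes=1), 0, m)
    return G, U

def reconstruct(G, U):
    """Multiply the core by each factor to rebuild the full tensor."""
    X = G
    for m, Um in enumerate(U):
        X = np.moveaxis(np.tensordot(Um, np.moveaxis(X, m, 0), axes=1), 0, m)
    return X

# A tensor with exact multilinear rank (2, 2, 2) is recovered exactly.
rng = np.random.default_rng(0)
G0 = rng.normal(size=(2, 2, 2))
U0 = [np.linalg.qr(rng.normal(size=(d, 2)))[0] for d in (4, 5, 6)]
X = reconstruct(G0, U0)
```

When the requested ranks match the tensor's multilinear ranks, the HOSVD residual is zero up to floating-point error.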

no code implementations • 5 Sep 2017 • Danushka Bollegala, Yuichi Yoshida, Ken-ichi Kawarabayashi

Co-occurrences between two words provide useful insights into the semantics of those words.
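A standard way to turn raw co-occurrence counts into such insight is pointwise mutual information (PMI), which compares the joint probability of two words with what independence would predict; a minimal sketch with made-up counts:

```python
import numpy as np

def pmi(count_xy, count_x, count_y, total):
    """Pointwise mutual information: log[ p(x, y) / (p(x) p(y)) ]."""
    return float(np.log((count_xy * total) / (count_x * count_y)))

# Made-up counts: "deep" and "learning" co-occur 5x more than chance.
score = pmi(count_xy=50, count_x=100, count_y=100, total=1000)
```

A PMI of zero means the pair co-occurs exactly as often as independence predicts; here the score is log 5, indicating a strong association.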

no code implementations • 31 May 2017 • Yuichi Yoshida, Takeru Miyato

We investigate the generalizability of deep learning based on the sensitivity to input perturbation.

no code implementations • 20 Mar 2017 • Fanhua Shang, Yuanyuan Liu, James Cheng, Kelvin Kai Wing Ng, Yuichi Yoshida

To achieve sufficient decrease in stochastic optimization, we design a new sufficient-decrease criterion, which as a byproduct yields sufficient-decrease versions of variance-reduction algorithms such as SVRG-SD and SAGA-SD.

1 code implementation • NeurIPS 2016 • Kohei Hayashi, Yuichi Yoshida

A sampling-based optimization method for quadratic functions is proposed.

no code implementations • NeurIPS 2015 • Naoto Ohsaka, Yuichi Yoshida

A $k$-submodular function is a generalization of a submodular function, where the input consists of $k$ disjoint subsets, instead of a single subset, of the domain. Many machine learning problems, including influence maximization with $k$ kinds of topics and sensor placement with $k$ kinds of sensors, can be naturally modeled as the problem of maximizing monotone $k$-submodular functions. In this paper, we give constant-factor approximation algorithms for maximizing monotone $k$-submodular functions subject to several size constraints. The running times of our algorithms are almost linear in the domain size. We experimentally demonstrate that our algorithms outperform baseline algorithms in terms of the solution quality.
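A simple greedy scheme is the natural baseline for this setting: repeatedly add the (item, type) pair with the largest marginal gain until the budget is spent. The sketch below is a generic greedy under a total size constraint with a toy coverage objective (a stand-in for influence maximization with $k$ topics), not the paper's specific algorithms:

```python
import itertools

def greedy_k_submodular(items, k, budget, f):
    """Greedy for monotone k-submodular maximization under a total size
    constraint: add the (item, type) pair with the largest marginal gain."""
    sol = [set() for _ in range(k)]
    chosen = set()
    for _ in range(budget):
        base = f(sol)
        best = None
        for e, i in itertools.product(items, range(k)):
            if e in chosen:
                continue
            sol[i].add(e)
            gain = f(sol) - base
            sol[i].remove(e)
            if best is None or gain > best[0]:
                best = (gain, e, i)
        _, e, i = best
        sol[i].add(e)
        chosen.add(e)
    return sol

# Toy monotone objective: coverage with type-specific weights.
cover = {"a": {1, 2}, "b": {2, 3}, "c": {4}}
weights = [1.0, 0.5]  # type-0 coverage is worth more than type-1

def f(sol):
    return sum(weights[i] * len(set().union(*(cover[e] for e in sol[i])))
               if sol[i] else 0.0 for i in range(len(sol)))
```

With a budget of 2, greedy assigns "a" and "b" to the higher-weight type, covering {1, 2, 3} for a value of 3.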

no code implementations • NeurIPS 2015 • Tasuku Soma, Yuichi Yoshida

We show that the generalized submodular cover problem can be applied to various problems and devise a bicriteria approximation algorithm.

no code implementations • 7 Dec 2014 • Danushka Bollegala, Takanori Maehara, Yuichi Yoshida, Ken-ichi Kawarabayashi

To evaluate the accuracy of the word representations learnt using the proposed method, we use the learnt word representations to solve semantic word analogy problems.
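The analogy evaluation works by vector offsets: "a is to b as c is to ?" is answered by the word whose embedding is closest to $b - a + c$ in cosine similarity. A sketch with a tiny hand-made embedding table (the vectors are illustrative, not learned):

```python
import numpy as np

def solve_analogy(a, b, c, embeddings):
    """'a is to b as c is to ?': argmax cosine similarity to b - a + c."""
    target = embeddings[b] - embeddings[a] + embeddings[c]
    best, best_sim = None, -np.inf
    for word, vec in embeddings.items():
        if word in (a, b, c):   # exclude the query words themselves
            continue
        sim = target @ vec / (np.linalg.norm(target) * np.linalg.norm(vec))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

# Tiny hand-made table where the second coordinate encodes a gender offset.
emb = {
    "man":   np.array([1.0, 0.0, 1.0]),
    "woman": np.array([1.0, 1.0, 1.0]),
    "king":  np.array([0.0, 0.0, 1.0]),
    "queen": np.array([0.0, 1.0, 1.0]),
    "apple": np.array([5.0, 0.0, 0.0]),
}
```

On this toy table, "man is to woman as king is to ?" resolves to "queen", beating the distractor "apple".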

2 code implementations • 10 Oct 2013 • Yoichi Iwata, Magnus Wahlström, Yuichi Yoshida

In addition to the insight into problems with half-integral relaxations, our results yield a range of new and improved FPT algorithms, including an $O^*(|\Sigma|^{2k})$-time algorithm for node-deletion Unique Label Cover with label set $\Sigma$ and an $O^*(4^k)$-time algorithm for Group Feedback Vertex Set, including the setting where the group is only given by oracle access.
