Search Results for author: Yuichi Yoshida

Found 28 papers, 6 papers with code

Replicability is Asymptotically Free in Multi-armed Bandits

no code implementations12 Feb 2024 Junpei Komiyama, Shinji Ito, Yuichi Yoshida, Souta Koshino

For the analysis of these algorithms, we propose a principled approach to limiting the probability of nonreplication.

Decision Making Multi-Armed Bandits

PMSM transient response optimization by end-to-end optimal control

no code implementations6 Feb 2024 Yuta Kawachi, Mitsuru Ambai, Yuichi Yoshida, Gaku Takano

The current vector trajectories of the RNN showed that the RNN could automatically determine arbitrary trajectories in the flux-weakening region in accordance with an arbitrarily designed loss function.

Controlling Posterior Collapse by an Inverse Lipschitz Constraint on the Decoder Network

no code implementations25 Apr 2023 Yuri Kinoshita, Kenta Oono, Kenji Fukumizu, Yuichi Yoshida, Shin-ichi Maeda

Variational autoencoders (VAEs) are one of the deep generative models that have experienced enormous success over the past decades.

Decoder

Sparsification of Decomposable Submodular Functions

no code implementations18 Jan 2022 Akbar Rafiey, Yuichi Yoshida

The underlying submodular functions for many of these tasks are decomposable, i.e., they are sums of several simple submodular functions.
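A decomposable submodular function is a sum of simpler submodular parts, f(S) = Σᵢ fᵢ(S). As a minimal illustration (the coverage data below is made up, not from the paper), each part can be a coverage function, which is monotone submodular, so their sum is too:

```python
# Each part maps an item to the ground elements it covers.
parts = [{"a": {1, 2}, "b": {2, 3}},   # f_1 covers ground elements 1..3
         {"a": {4}, "b": {4, 5}}]      # f_2 covers ground elements 4..5

def coverage(cover_map, s):
    """Number of distinct ground elements covered by the items in s."""
    return len(set().union(*(cover_map[e] for e in s)) if s else set())

def f(s):
    """Decomposable objective: the sum of the individual coverage functions."""
    return sum(coverage(p, s) for p in parts)

print(f({"a"}), f({"b"}), f({"a", "b"}))  # 3 4 5
```

Sparsification, in this setting, means approximating f by a (weighted) sum over a small subset of the parts.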

Local Algorithms for Estimating Effective Resistance

no code implementations7 Jun 2021 Pan Peng, Daniel Lopatta, Yuichi Yoshida, Gramoz Goranci

Effective resistance is an important metric that measures the similarity of two vertices in a graph.

Clustering Graph Clustering +1
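For intuition on the quantity being estimated (this dense pseudoinverse computation is a textbook baseline, not the paper's local algorithm): the effective resistance between vertices u and v equals (e_u − e_v)ᵀ L⁺ (e_u − e_v), where L⁺ is the Moore-Penrose pseudoinverse of the graph Laplacian.

```python
import numpy as np

def effective_resistance(adj, u, v):
    """Effective resistance between u and v via the Laplacian pseudoinverse."""
    laplacian = np.diag(adj.sum(axis=1)) - adj
    l_pinv = np.linalg.pinv(laplacian)
    e = np.zeros(len(adj))
    e[u], e[v] = 1.0, -1.0
    return float(e @ l_pinv @ e)

# Path graph 0-1-2: two unit resistors in series, so R_eff(0, 2) = 2.
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
print(round(effective_resistance(adj, 0, 2), 6))  # 2.0
```

Computing L⁺ takes cubic time in the number of vertices, which is exactly the cost a local algorithm aims to avoid.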

RelWalk - A Latent Variable Model Approach to Knowledge Graph Embedding

no code implementations EACL 2021 Danushka Bollegala, Huda Hakami, Yuichi Yoshida, Ken-ichi Kawarabayashi

Embedding entities and relations of a knowledge graph in a low-dimensional space has shown impressive performance in predicting missing links between entities.

Knowledge Graph Embedding Knowledge Graph Embeddings +1

RelWalk - A Latent Variable Model Approach to Knowledge Graph Embedding


1 code implementation25 Jan 2021 Danushka Bollegala, Huda Hakami, Yuichi Yoshida, Ken-ichi Kawarabayashi

Embedding entities and relations of a knowledge graph in a low-dimensional space has shown impressive performance in predicting missing links between entities.

Knowledge Graph Embedding Knowledge Graph Embeddings +1

Tight First- and Second-Order Regret Bounds for Adversarial Linear Bandits

no code implementations NeurIPS 2020 Shinji Ito, Shuichi Hirahara, Tasuku Soma, Yuichi Yoshida

We propose novel algorithms with first- and second-order regret bounds for adversarial linear bandits.

Downsampling for Testing and Learning in Product Distributions

no code implementations15 Jul 2020 Nathaniel Harms, Yuichi Yoshida

For many important classes of functions, such as intersections of halfspaces, polynomial threshold functions, convex sets, and $k$-alternating functions, the known algorithms either have complexity that depends on the support size of the distribution, or are proven to work only for specific examples of product distributions.

Fast and Private Submodular and $k$-Submodular Functions Maximization with Matroid Constraints

no code implementations ICML 2020 Akbar Rafiey, Yuichi Yoshida

In this paper, we study the problem of maximizing monotone submodular functions subject to matroid constraints in the framework of differential privacy.

Data Summarization

Hypergraph Clustering Based on PageRank

1 code implementation15 Jun 2020 Yuuki Takai, Atsushi Miyauchi, Masahiro Ikeda, Yuichi Yoshida

For both algorithms, we discuss theoretical guarantees on the conductance of the output vertex set.

Clustering

Average Sensitivity of Spectral Clustering

no code implementations7 Jun 2020 Pan Peng, Yuichi Yoshida

To make reliable and efficient decisions based on spectral clustering, we assess the stability of spectral clustering against edge perturbations in the input graph using the notion of average sensitivity, which is the expected size of the symmetric difference of the output clusters before and after we randomly remove edges.

Clustering
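The definition of average sensitivity lends itself to a Monte Carlo estimate. The sketch below is illustrative (the graph, names, and the plain two-way spectral cut via the sign of the Fiedler vector are assumptions, not the paper's setup): remove a random edge, recluster, and average the size of the symmetric difference.

```python
import numpy as np

def spectral_cut(adj):
    """One side of a 2-way spectral cut: sign of the Fiedler vector."""
    lap = np.diag(adj.sum(axis=1)) - adj
    _, vecs = np.linalg.eigh(lap)
    return set(np.where(vecs[:, 1] >= 0)[0])

def cut_distance(a, b, n):
    # Eigenvector signs are arbitrary, so compare against b and its complement.
    return min(len(a ^ b), len(a ^ (set(range(n)) - b)))

def avg_sensitivity(adj, trials=200, seed=0):
    """Monte Carlo estimate of average sensitivity under single-edge removal."""
    rng = np.random.default_rng(seed)
    n, base = len(adj), spectral_cut(adj)
    edges = [(i, j) for i in range(n) for j in range(i + 1, n) if adj[i, j]]
    total = 0
    for _ in range(trials):
        i, j = edges[rng.integers(len(edges))]
        pert = adj.copy()
        pert[i, j] = pert[j, i] = 0.0
        total += cut_distance(base, spectral_cut(pert), n)
    return total / trials

# Two triangles joined by a bridge: removing most edges barely moves the cut.
adj = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    adj[i, j] = adj[j, i] = 1.0
print(avg_sensitivity(adj))
```

A small estimate here indicates the clustering is stable against edge perturbations, which is the property the paper analyzes.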

Statistical Learning with Conditional Value at Risk

no code implementations14 Feb 2020 Tasuku Soma, Yuichi Yoshida

For convex and Lipschitz loss functions, we show that our algorithm has $O(1/\sqrt{n})$-convergence to the optimal CVaR, where $n$ is the number of samples.
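To make the target quantity concrete: the conditional value at risk at level α is the expected loss over the worst (1 − α) fraction of outcomes. A plain empirical estimator (the data and names below are illustrative, not from the paper) is:

```python
import numpy as np

def empirical_cvar(losses, alpha=0.95):
    """Mean of the worst (1 - alpha) fraction of the observed losses."""
    losses = np.sort(np.asarray(losses, dtype=float))
    k = int(np.ceil((1 - alpha) * len(losses)))  # number of tail samples
    return float(losses[-k:].mean())

losses = [1.0, 2.0, 3.0, 10.0]
print(empirical_cvar(losses, alpha=0.75))  # mean of the worst 25% -> 10.0
```

Minimizing this tail average, rather than the mean loss, is what distinguishes CVaR learning from ordinary empirical risk minimization.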

Approximability of Monotone Submodular Function Maximization under Cardinality and Matroid Constraints in the Streaming Model

no code implementations13 Feb 2020 Chien-Chung Huang, Naonori Kakimura, Simon Mauras, Yuichi Yoshida

The latter one almost matches our lower bound of $\frac{K}{2K-1}$ for a matroid constraint, which almost settles the approximation ratio for a matroid constraint that can be obtained by a streaming algorithm whose space complexity is independent of $n$.


RelWalk - A Latent Variable Model Approach to Knowledge Graph Embedding

no code implementations ICLR 2019 Danushka Bollegala, Huda Hakami, Yuichi Yoshida, Ken-ichi Kawarabayashi

Existing methods for learning KGEs can be seen as a two-stage process where (a) entities and relations in the knowledge graph are represented using some linear algebraic structures (embeddings), and (b) a scoring function is defined that evaluates the strength of a relation that holds between two entities using the corresponding relation and entity embeddings.

Entity Embeddings Knowledge Graph Embedding +2

On Random Subsampling of Gaussian Process Regression: A Graphon-Based Analysis

no code implementations28 Jan 2019 Kohei Hayashi, Masaaki Imaizumi, Yuichi Yoshida

In this paper, we study random subsampling of Gaussian process regression, one of the simplest approximation baselines, from a theoretical perspective.

regression

Canonical and Compact Point Cloud Representation for Shape Classification

no code implementations13 Sep 2018 Kent Fujiwara, Ikuro Sato, Mitsuru Ambai, Yuichi Yoshida, Yoshiaki Sakakura

We present a novel compact point cloud representation that is inherently invariant to scale, coordinate change and point permutation.

Classification General Classification

Guaranteed Sufficient Decrease for Stochastic Variance Reduced Gradient Optimization

no code implementations26 Feb 2018 Fanhua Shang, Yuanyuan Liu, Kaiwen Zhou, James Cheng, Kelvin K. W. Ng, Yuichi Yoshida

In order to make sufficient decrease for stochastic optimization, we design a new sufficient decrease criterion, which yields sufficient decrease versions of stochastic variance reduction algorithms such as SVRG-SD and SAGA-SD as a byproduct.

Stochastic Optimization

Fitting Low-Rank Tensors in Constant Time

1 code implementation NeurIPS 2017 Kohei Hayashi, Yuichi Yoshida

Then, we show that the residual error of the Tucker decomposition of $\tilde{X}$ is sufficiently close to that of $X$ with high probability.

Tensor Decomposition

Spectral Norm Regularization for Improving the Generalizability of Deep Learning

no code implementations31 May 2017 Yuichi Yoshida, Takeru Miyato

We investigate the generalizability of deep learning based on the sensitivity to input perturbation.

Deep Learning
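The regularizer penalizes the spectral norm (largest singular value) of each weight matrix, which bounds the layer's sensitivity to input perturbations. A few steps of power iteration give a cheap estimate of that norm; this standalone NumPy sketch mirrors the standard trick, not the paper's exact training code:

```python
import numpy as np

def spectral_norm(w, iters=50, seed=0):
    """Estimate the largest singular value of w by power iteration."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(w.shape[1])
    for _ in range(iters):
        u = w @ v
        u /= np.linalg.norm(u)
        v = w.T @ u
        v /= np.linalg.norm(v)
    return float(u @ w @ v)  # Rayleigh-quotient estimate of sigma_max

w = np.array([[3.0, 0.0], [0.0, 1.0]])
print(round(spectral_norm(w), 6))  # 3.0
```

During training, the squared estimate can be added to the loss as a penalty term, with gradients flowing through `w` while `u` and `v` are treated as constants.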

Guaranteed Sufficient Decrease for Variance Reduced Stochastic Gradient Descent

no code implementations20 Mar 2017 Fanhua Shang, Yuanyuan Liu, James Cheng, Kelvin Kai Wing Ng, Yuichi Yoshida

In order to make sufficient decrease for stochastic optimization, we design a new sufficient decrease criterion, which yields sufficient decrease versions of variance reduction algorithms such as SVRG-SD and SAGA-SD as a byproduct.

Stochastic Optimization

Minimizing Quadratic Functions in Constant Time

1 code implementation NeurIPS 2016 Kohei Hayashi, Yuichi Yoshida

A sampling-based optimization method for quadratic functions is proposed.

Monotone k-Submodular Function Maximization with Size Constraints

no code implementations NeurIPS 2015 Naoto Ohsaka, Yuichi Yoshida

A $k$-submodular function is a generalization of a submodular function, where the input consists of $k$ disjoint subsets, instead of a single subset, of the domain. Many machine learning problems, including influence maximization with $k$ kinds of topics and sensor placement with $k$ kinds of sensors, can be naturally modeled as the problem of maximizing monotone $k$-submodular functions. In this paper, we give constant-factor approximation algorithms for maximizing monotone $k$-submodular functions subject to several size constraints. The running times of our algorithms are almost linear in the domain size. We experimentally demonstrate that our algorithms outperform baseline algorithms in terms of the solution quality.

BIG-bench Machine Learning
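The greedy rule underlying such algorithms is easy to state: repeatedly assign the (item, topic) pair with the largest marginal gain until the size budget is spent. The sketch below, with a made-up coverage objective, shows the rule under a total size constraint; it is an illustration of the greedy principle, not the paper's exact algorithm or its analysis.

```python
def greedy_k_submodular(items, k, budget, f):
    """Greedily assign items to one of k topics, maximizing marginal gain."""
    assign = {}  # item -> topic index
    for _ in range(budget):
        free = [e for e in items if e not in assign]
        if not free:
            break
        best = max(((e, i) for e in free for i in range(k)),
                   key=lambda p: f({**assign, p[0]: p[1]}) - f(assign))
        if f({**assign, best[0]: best[1]}) <= f(assign):
            break  # no positive marginal gain left
        assign[best[0]] = best[1]
    return assign

# Toy objective: each (item, topic) pair covers some ground elements, and f
# counts distinct covered elements summed over topics.
covers = {("a", 0): {1, 2}, ("a", 1): {3}, ("b", 0): {2}, ("b", 1): {3, 4}}

def f(assign):
    total = 0
    for i in range(2):
        covered = set()
        for e, t in assign.items():
            if t == i:
                covered |= covers.get((e, i), set())
        total += len(covered)
    return total

sol = greedy_k_submodular(["a", "b"], k=2, budget=2, f=f)
print(sol, f(sol))  # {'a': 0, 'b': 1} 4
```

Note that the choice is over both an item and which of the $k$ topics to place it in, which is what distinguishes the $k$-submodular setting from ordinary submodular maximization.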

A Generalization of Submodular Cover via the Diminishing Return Property on the Integer Lattice

no code implementations NeurIPS 2015 Tasuku Soma, Yuichi Yoshida

We show that the generalized submodular cover problem can be applied to various problems and devise a bicriteria approximation algorithm.

Learning Word Representations from Relational Graphs

no code implementations7 Dec 2014 Danushka Bollegala, Takanori Maehara, Yuichi Yoshida, Ken-ichi Kawarabayashi

To evaluate the accuracy of the word representations learnt using the proposed method, we use the learnt word representations to solve semantic word analogy problems.

Representation Learning

Half-integrality, LP-branching and FPT Algorithms

2 code implementations10 Oct 2013 Yoichi Iwata, Magnus Wahlström, Yuichi Yoshida

In addition to the insight into problems with half-integral relaxations, our results yield a range of new and improved FPT algorithms, including an $O^*(|\Sigma|^{2k})$-time algorithm for node-deletion Unique Label Cover with label set $\Sigma$ and an $O^*(4^k)$-time algorithm for Group Feedback Vertex Set, including the setting where the group is only given by oracle access.

Data Structures and Algorithms
