Search Results for author: Masayuki Karasuyama

Found 17 papers, 3 papers with code

Bayesian Optimization for Distributionally Robust Chance-constrained Problem

no code implementations · 31 Jan 2022 · Yu Inatsu, Shion Takeno, Masayuki Karasuyama, Ichiro Takeuchi

In black-box function optimization, we need to consider not only controllable design variables but also uncontrollable stochastic environment variables.

Sequential- and Parallel- Constrained Max-value Entropy Search via Information Lower Bound

no code implementations · 19 Feb 2021 · Shion Takeno, Tomoyuki Tamura, Kazuki Shitara, Masayuki Karasuyama

Max-value entropy search (MES) is one of the state-of-the-art approaches in Bayesian optimization (BO).
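For orientation, the standard unconstrained MES acquisition (the Wang–Jegelka form that this work builds on, not the constrained sequential/parallel variants proposed in the paper) can be sketched as follows; `mu`, `sigma`, and the Monte Carlo max-value samples `y_star_samples` are illustrative inputs assumed to come from a fitted Gaussian process:

```python
import numpy as np
from scipy.stats import norm

def mes_acquisition(mu, sigma, y_star_samples):
    """Max-value entropy search acquisition (standard unconstrained form).

    mu, sigma: GP posterior mean / std at candidate points, shape (n,).
    y_star_samples: Monte Carlo samples of the function maximum f*, shape (m,).
    Returns the estimated mutual information between each candidate
    observation and f*, averaged over the f* samples.
    """
    gamma = (y_star_samples[None, :] - mu[:, None]) / sigma[:, None]
    cdf = np.clip(norm.cdf(gamma), 1e-12, None)  # guard against log(0)
    return np.mean(gamma * norm.pdf(gamma) / (2.0 * cdf) - np.log(cdf), axis=1)
```

In this form the acquisition grows with posterior uncertainty: among candidates with equal mean, the one with larger `sigma` scores higher.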

Cost-effective search for lower-error region in material parameter space using multifidelity Gaussian process modeling

no code implementations · 15 Mar 2020 · Shion Takeno, Yuhki Tsukada, Hitoshi Fukuoka, Toshiyuki Koyama, Motoki Shiga, Masayuki Karasuyama

Hence, we considered estimating a region of material parameter space in which a computational model produces precipitates having shapes similar to those observed in the experimental images.

Distance Metric Learning for Graph Structured Data

2 code implementations · 3 Feb 2020 · Tomoki Yoshida, Ichiro Takeuchi, Masayuki Karasuyama

Hence, we propose a supervised distance metric learning method for the graph classification problem.

General Classification · Graph Classification +1

Active learning for level set estimation under cost-dependent input uncertainty

no code implementations · 13 Sep 2019 · Yu Inatsu, Masayuki Karasuyama, Keiichi Inoue, Ichiro Takeuchi

As part of a quality control process in manufacturing, it is often necessary to test whether all parts of a product satisfy a required property, with as few inspections as possible.

Active Learning

Statistically Discriminative Sub-trajectory Mining

no code implementations · 6 May 2019 · Vo Nguyen Le Duy, Takuto Sakuma, Taiju Ishiyama, Hiroki Toda, Kazuya Nishi, Masayuki Karasuyama, Yuta Okubo, Masayuki Sunaga, Yasuo Tabei, Ichiro Takeuchi

Given two groups of trajectories, the goal of this problem is to extract moving patterns in the form of sub-trajectories which are more similar to sub-trajectories of one group and less similar to those of the other.

Safe Triplet Screening for Distance Metric Learning

1 code implementation · 12 Feb 2018 · Tomoki Yoshida, Ichiro Takeuchi, Masayuki Karasuyama

Distance metric learning can optimize a metric over a set of triplets, each of which is defined by a pair of same-class instances and an instance from a different class.
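A generic triplet-based objective of this kind might look like the following minimal sketch (the Mahalanobis matrix `M`, the `margin`, and the hinge form are illustrative assumptions; the paper's contribution, safe rules for screening out triplets before optimization, is not shown):

```python
import numpy as np

def triplet_hinge_loss(M, X, triplets, margin=1.0):
    """Average hinge loss over triplets (i, j, k), where i and j share a
    class and k belongs to a different class. A triplet is penalized when
    the different-class Mahalanobis distance d(i, k) does not exceed the
    same-class distance d(i, j) by at least `margin`.
    """
    total = 0.0
    for i, j, k in triplets:
        d_ij = (X[i] - X[j]) @ M @ (X[i] - X[j])  # same-class squared distance
        d_ik = (X[i] - X[k]) @ M @ (X[i] - X[k])  # different-class squared distance
        total += max(0.0, margin + d_ij - d_ik)
    return total / len(triplets)
```

A triplet whose different-class distance already clears the margin contributes zero loss, which is exactly what makes screening attractive: such triplets can often be identified, and discarded, before training.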

Metric Learning

Safe Pattern Pruning: An Efficient Approach for Predictive Pattern Mining

no code implementations · 15 Feb 2016 · Kazuya Nakagawa, Shinya Suzumura, Masayuki Karasuyama, Koji Tsuda, Ichiro Takeuchi

The SPP method allows us to efficiently find a superset of all the predictive patterns in the database that are needed for the optimal predictive model.

Graph Mining

Simultaneous Safe Screening of Features and Samples in Doubly Sparse Modeling

no code implementations · 8 Feb 2016 · Atsushi Shibagaki, Masayuki Karasuyama, Kohei Hatano, Ichiro Takeuchi

A significant advantage of considering them simultaneously rather than individually is the resulting synergy: the results of a previous safe feature screening step can be exploited to improve the next safe sample screening step, and vice versa.

Homotopy Continuation Approaches for Robust SV Classification and Regression

no code implementations · 12 Jul 2015 · Shinya Suzumura, Kohei Ogawa, Masashi Sugiyama, Masayuki Karasuyama, Ichiro Takeuchi

An advantage of our homotopy approach is that it can be interpreted as simulated annealing, a common approach for finding a good local optimum in non-convex optimization problems.

Classification · General Classification +2

Safe Feature Pruning for Sparse High-Order Interaction Models

no code implementations · 26 Jun 2015 · Kazuya Nakagawa, Shinya Suzumura, Masayuki Karasuyama, Koji Tsuda, Ichiro Takeuchi

An SFS rule has the property that, if a feature satisfies the rule, the feature is guaranteed to be inactive in the LASSO solution, meaning it can be safely screened out prior to LASSO training.
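To illustrate the flavor of such guarantees, the basic SAFE rule of El Ghaoui et al. for the plain LASSO (min ½‖y − Xβ‖² + λ‖β‖₁) discards a feature when its correlation with the response falls below a λ-dependent threshold; this is a simpler relative of the paper's SFS rule, not the rule itself:

```python
import numpy as np

def safe_screen_lasso(X, y, lam):
    """Basic SAFE screening rule for the LASSO.

    Returns a boolean mask: True means the feature is guaranteed to have
    a zero coefficient at regularization strength `lam`, so it can be
    removed before training without changing the solution.
    """
    corr = np.abs(X.T @ y)
    lam_max = corr.max()  # smallest lam at which all coefficients are zero
    thresh = lam - np.linalg.norm(X, axis=0) * np.linalg.norm(y) \
                   * (lam_max - lam) / lam_max
    return corr < thresh
```

As `lam` approaches `lam_max` the threshold rises and more features are safely discarded; the feature most correlated with `y` is never screened out for any `lam < lam_max`.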

Sparse Learning

Regularization Path of Cross-Validation Error Lower Bounds

1 code implementation · NeurIPS 2015 · Atsushi Shibagaki, Yoshiki Suzuki, Masayuki Karasuyama, Ichiro Takeuchi

Careful tuning of a regularization parameter is indispensable in many machine learning tasks because it has a significant impact on generalization performance.

Manifold-based Similarity Adaptation for Label Propagation

no code implementations · NeurIPS 2013 · Masayuki Karasuyama, Hiroshi Mamitsuka

In this approach, edge weights simultaneously represent similarity and local reconstruction weights, both of which are reasonable for label propagation.
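The propagation step itself, independent of how the edge weights are learned, is the standard normalized-graph iteration of Zhou et al.; a minimal sketch, assuming a precomputed symmetric weight matrix `W` and one-hot labels `Y` (zero rows for unlabeled nodes):

```python
import numpy as np

def label_propagation(W, Y, alpha=0.99, n_iter=200):
    """Iterative label propagation F <- alpha * S * F + (1 - alpha) * Y,
    where S is the symmetrically normalized weight matrix. Returns the
    predicted class index for every node.
    """
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))  # guard isolated nodes
    S = D_inv_sqrt @ W @ D_inv_sqrt
    F = Y.astype(float).copy()
    for _ in range(n_iter):
        F = alpha * S @ F + (1.0 - alpha) * Y  # spread labels, anchor seeds
    return F.argmax(axis=1)
```

The quality of the result hinges entirely on `W`, which is what motivates learning the edge weights from the manifold structure rather than fixing them by a heuristic kernel.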

Multiple Incremental Decremental Learning of Support Vector Machines

no code implementations · NeurIPS 2009 · Masayuki Karasuyama, Ichiro Takeuchi

A conventional single incremental-decremental SVM can update the trained model efficiently when a single data point is added to or removed from the training set.
