1 code implementation • 15 Jul 2024 • Akifumi Okuno

This paper presents an integrated perspective on robustness in regression.

2 code implementations • 4 Aug 2023 • Akifumi Okuno

While the $(k, q)$-VR terms applied to general parametric models are computationally intractable due to the integration, this study provides a stochastic optimization algorithm that can efficiently train general models with the $(k, q)$-VR without explicit numerical integration.
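The core trick described above, replacing an intractable integral-defined regularizer with a sampled surrogate, can be illustrated with a minimal sketch. This is not the paper's algorithm; the model, penalty form, and all names below are illustrative assumptions, showing only how Monte Carlo averaging stands in for explicit numerical integration.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(w, x):
    """Toy parametric model f_w(x) = w0 + w1 * x (illustrative only)."""
    return w[0] + w[1] * x

def mc_penalty(w, x, q=2.0, n_mc=1000, scale=0.1):
    """Monte Carlo surrogate for an integral-defined penalty
    E_eps |f_w(x + eps) - f_w(x)|^q: instead of integrating over eps
    explicitly, average over sampled perturbations.  Inside SGD, a
    fresh small batch of eps per step yields an unbiased gradient
    estimate of the penalty."""
    eps = rng.normal(scale=scale, size=n_mc)
    return np.mean(np.abs(f(w, x + eps) - f(w, x)) ** q)

w = np.array([0.5, 2.0])
# For this linear model, |f(x+eps)-f(x)|^2 = w1^2 * eps^2, whose exact
# expectation is w1^2 * scale^2 = 4 * 0.01 = 0.04.
est = mc_penalty(w, x=1.0)
```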

1 code implementation • 11 Jul 2023 • Akifumi Okuno

Density power divergence (DPD) is designed to robustly estimate the underlying distribution of observations in the presence of outliers.
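A minimal sketch of the standard DPD estimator for a Gaussian location model may help here; the data, the choice $\beta = 0.5$, and the contamination level are illustrative assumptions, not from the paper. Because outliers enter only through $f_\theta(x_i)^\beta$, which is nearly zero far from the bulk, they are automatically downweighted.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(1)
# 95 clean N(0, 1) observations plus 5 gross outliers at 10.
data = np.concatenate([rng.normal(0.0, 1.0, 95), np.full(5, 10.0)])

def dpd_loss(mu, x, beta=0.5, sigma=1.0):
    """Density power divergence objective for a N(mu, sigma^2) model:
    integral of f^(1+beta) minus (1 + 1/beta) * mean(f(x_i)^beta).
    For the Gaussian, the integral has the closed form below."""
    integral = (2 * np.pi * sigma**2) ** (-beta / 2) / np.sqrt(1 + beta)
    dens = norm.pdf(x, loc=mu, scale=sigma) ** beta
    return integral - (1 + 1 / beta) * np.mean(dens)

mu_dpd = minimize_scalar(lambda m: dpd_loss(m, data),
                         bounds=(-5, 5), method="bounded").x
mu_mle = data.mean()  # Gaussian MLE (sample mean), not robust
```

The sample mean is dragged toward the outliers, while the DPD estimate stays near the clean-data center.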

1 code implementation • 28 Jun 2023 • Akifumi Okuno, Yuya Morishita, Yoh-ichi Mototake

This study delves into the domain of dynamical systems, specifically the forecasting of dynamical time series defined through an evolution function.
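As a minimal sketch of the setting, not of the paper's method: a dynamical time series is generated by iterating an evolution function $x_{t+1} = g(x_t)$, and forecasting amounts to estimating $g$ from consecutive pairs and rolling the estimate forward. The linear system and least-squares fit below are illustrative assumptions.

```python
import numpy as np

# Toy dynamical system x_{t+1} = g(x_t) with g(x) = 0.9 * x;
# the task is to recover g from observed consecutive pairs.
a_true = 0.9
x = [1.0]
for _ in range(50):
    x.append(a_true * x[-1])
x = np.array(x)

# Least-squares fit of x_{t+1} ~ a * x_t over all observed pairs.
a_hat = np.sum(x[:-1] * x[1:]) / np.sum(x[:-1] ** 2)

def forecast(x0, steps, a=a_hat):
    """Forecast by iterating the estimated evolution function."""
    out = [x0]
    for _ in range(steps):
        out.append(a * out[-1])
    return np.array(out)
```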

1 code implementation • 31 Mar 2023 • Akifumi Okuno, Kazuharu Harada

This study proposes an interpretable neural network-based non-proportional odds model (N$^3$POM) for ordinal regression.
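To fix ideas on the proportional/non-proportional distinction, here is a minimal sketch, not the N$^3$POM architecture itself: a cumulative model $P(y \le j \mid x) = \sigma(b_j + w_j x)$ where the proportional-odds model shares one slope $w$ across thresholds, while a non-proportional model allows a threshold-specific slope $w_j$ (N$^3$POM further parameterizes the slope with a neural network). The thresholds and slopes below are illustrative values.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

b = np.array([-1.0, 0.0, 1.0])    # increasing thresholds b_j
w_np = np.array([0.5, 1.0, 1.5])  # category-specific slopes (non-proportional)

def cumulative_probs(x, b, w):
    """P(y <= j | x) for each threshold j.  With category-specific
    slopes, the ordering P(y<=1) <= P(y<=2) <= ... is no longer
    automatic for every x, which is a known difficulty that
    non-proportional models must handle."""
    return sigmoid(b + w * x)

p = cumulative_probs(0.2, b, w_np)
```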

1 code implementation • 18 Apr 2022 • Akifumi Okuno, Kohei Hattori

In this study, we examine a clustering problem in which the covariates of each individual element in a dataset are associated with an uncertainty specific to that element.
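One simple way to make the setting concrete, offered as a hedged sketch rather than the paper's algorithm: a k-means-style procedure in which each centroid update weights elements by the inverse of their element-specific uncertainty, so noisy covariates influence the centers less. All data, names, and the 1-D restriction are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two well-separated 1-D clusters; each element carries its own
# (synthetic) covariate uncertainty.
x = np.concatenate([rng.normal(0, 0.1, 20), rng.normal(5, 0.1, 20)])
var = rng.uniform(0.05, 0.5, x.size)

def uncertainty_kmeans(x, var, k=2, iters=20):
    """k-means-style clustering with inverse-uncertainty-weighted
    centroid updates.  Initialization at the data extremes assumes
    k = 2 for this sketch."""
    centers = np.array([x.min(), x.max()], dtype=float)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            w = 1.0 / var[labels == j]
            centers[j] = np.sum(w * x[labels == j]) / np.sum(w)
    return centers, labels

centers, labels = uncertainty_kmeans(x, var)
```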

no code implementations • 28 Dec 2021 • Ruixing Cao, Akifumi Okuno, Kei Nakagawa, Hidetoshi Shimodaira

To correct the asymptotic bias with fewer observations, this paper proposes a \emph{local radial regression (LRR)} and its logistic regression variant, \emph{local radial logistic regression (LRLR)}, combining the advantages of LPoR and MS-$k$-NN.
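A rough sketch of the radial-regression idea, under my own reading rather than the paper's exact formulation: among the $k$ nearest neighbors of a query point, regress the outcomes on the radial distance $r_i = \|x_i - x^*\|$ and report the fitted intercept, i.e. the extrapolated value at $r = 0$, instead of the plain k-NN average taken at radius $\approx r_k$. The data and target below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, (200, 2))
y = X[:, 0] ** 2 + X[:, 1] ** 2   # noiseless target for illustration

def local_radial_predict(X, y, x_star, k=30):
    """Fit y ~ a + b * r over the k nearest neighbors of x_star,
    where r is the distance to x_star, and return the intercept a
    (the extrapolation to r = 0)."""
    r = np.linalg.norm(X - x_star, axis=1)
    idx = np.argsort(r)[:k]
    A = np.stack([np.ones(k), r[idx]], axis=1)
    coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
    return coef[0]

# True value at the origin is 0; the plain k-NN average is biased
# upward because every neighbor sits at positive radius.
pred = local_radial_predict(X, y, np.zeros(2))
knn = np.mean(np.sort(np.linalg.norm(X, axis=1))[:30] ** 2)
```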

no code implementations • 7 Dec 2021 • Akifumi Okuno, Keisuke Yano

This paper discusses the estimation of the generalization gap, the difference between generalization performance and training performance, for overparameterized models including neural networks.

1 code implementation • 1 Dec 2021 • Akifumi Okuno, Masaaki Imaizumi

The derived minimax rate corresponds to that of the non-invertible bi-Lipschitz function, which shows that the invertibility does not reduce the complexity of the estimation problem in terms of the rate.

no code implementations • 24 Dec 2020 • Akifumi Okuno, Keisuke Yano

This paper discusses the design-dependent nature of variance in nonparametric link regression, which aims at predicting a mean outcome at a link, i.e., a pair of nodes, based on currently observed data comprising covariates at nodes and outcomes at links.

Statistics Theory

no code implementations • NeurIPS 2020 • Akifumi Okuno, Hidetoshi Shimodaira

The weights and the parameter $k \in \mathbb{N}$ regulate its bias-variance trade-off, and the trade-off implicitly affects the convergence rate of the excess risk for the $k$-NN classifier; several existing studies considered selecting the optimal $k$ and weights to obtain a faster convergence rate.
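For readers unfamiliar with the weighted k-NN classifier discussed above, a minimal sketch (with illustrative data and uniform weights as the default; the paper's optimal weight choices are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(4)
# Two Gaussian classes in 1-D.
X = np.concatenate([rng.normal(-1, 1, 100), rng.normal(1, 1, 100)])
y = np.array([0] * 100 + [1] * 100)

def weighted_knn_predict(X, y, x_star, k=11, weights=None):
    """Weighted k-NN classifier: a signed vote over the k nearest
    neighbors.  Uniform weights recover the classical k-NN rule;
    non-uniform (possibly negative) weights trade bias against
    variance, which is what drives the convergence-rate analysis."""
    idx = np.argsort(np.abs(X - x_star))[:k]
    w = np.ones(k) / k if weights is None else weights
    score = np.sum(w * (2 * y[idx] - 1))  # labels mapped to {-1, +1}
    return int(score > 0)
```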

no code implementations • 2 May 2020 • Morihiro Mizutani, Akifumi Okuno, Geewook Kim, Hidetoshi Shimodaira

Multimodal relational data analysis has become of increasing importance in recent years for exploring relations across different domains of data, such as images and their text tags obtained from social networking services (e.g., Flickr).

no code implementations • 8 Feb 2020 • Akifumi Okuno, Hidetoshi Shimodaira

The weights and the parameter $k \in \mathbb{N}$ regulate its bias-variance trade-off, and the trade-off implicitly affects the convergence rate of the excess risk for the $k$-NN classifier; several existing studies considered selecting the optimal $k$ and weights to obtain a faster convergence rate.

no code implementations • 22 Jul 2019 • Akifumi Okuno, Hidetoshi Shimodaira

A collection of $U \: (\in \mathbb{N})$ data vectors is called a $U$-tuple, and the association strength among the vectors of a tuple is termed the \emph{hyperlink weight}, which is assumed to be symmetric with respect to permutation of the entries.

1 code implementation • 27 Feb 2019 • Geewook Kim, Akifumi Okuno, Kazuki Fukui, Hidetoshi Shimodaira

In addition to the parameters of neural networks, we optimize the weights of the inner product by allowing positive and negative values.
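The weighted inner product mentioned above can be written down in a few lines; this sketch shows only the similarity function itself, with illustrative vectors, not the neural-network training that the paper performs jointly with the weights.

```python
import numpy as np

def wips(u, v, lam):
    """Weighted inner-product similarity: sum_d lam_d * u_d * v_d.
    Allowing negative weights lam_d makes the induced similarity
    indefinite, widening the class of representable similarities
    beyond the ordinary (positive-definite) inner product."""
    return float(np.sum(lam * u * v))

u = np.array([1.0, 2.0])
v = np.array([3.0, 1.0])
lam = np.array([1.0, -1.0])       # one negative learnable weight

s = wips(u, v, lam)               # 1*1*3 + (-1)*2*1 = 1.0
s_ips = wips(u, v, np.ones(2))    # ordinary inner product = 5.0
```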

no code implementations • 22 Feb 2019 • Akifumi Okuno, Hidetoshi Shimodaira

We propose $\beta$-graph embedding for robustly learning feature vectors from data vectors and noisy link weights.

no code implementations • 4 Oct 2018 • Akifumi Okuno, Geewook Kim, Hidetoshi Shimodaira

We propose shifted inner-product similarity (SIPS), which is a novel yet very simple extension of the ordinary inner-product similarity (IPS) for neural-network based graph embedding (GE).
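The extension is indeed simple enough to state in code: SIPS adds learnable per-node bias terms to the ordinary inner product, $\langle u, v \rangle + b_u + b_v$. The vectors and bias values below are illustrative; in the paper the biases are learned alongside the neural-network embeddings.

```python
import numpy as np

def ips(u, v):
    """Ordinary inner-product similarity."""
    return float(np.dot(u, v))

def sips(u, v, bu, bv):
    """Shifted inner-product similarity: <u, v> + b_u + b_v,
    with one learnable shift per node."""
    return float(np.dot(u, v)) + bu + bv

u = np.array([1.0, 0.5])
v = np.array([2.0, 1.0])
s = sips(u, v, bu=-0.5, bv=0.3)   # 2.5 - 0.5 + 0.3 = 2.3
```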

no code implementations • 31 May 2018 • Akifumi Okuno, Hidetoshi Shimodaira

We consider the representation power of siamese-style similarity functions used in neural network-based graph embedding.

no code implementations • ICML 2018 • Akifumi Okuno, Tetsuya Hada, Hidetoshi Shimodaira

PMvGE is a probabilistic model for predicting new associations via graph embedding, where data vectors are the nodes and their associations are the links.
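As a hedged sketch of the general shape of such a model, not PMvGE's actual architecture: each view's data vector is mapped to a feature (here by a fixed linear map standing in for a neural network), and the probability of an association is driven by the inner product of the two features. All maps and inputs below are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Per-view feature maps (linear stand-ins for learned networks).
W1 = np.array([[1.0, 0.0]])   # view 1: keep first coordinate
W2 = np.array([[0.0, 1.0]])   # view 2: keep second coordinate

def link_prob(x1, x2):
    """Probability of an association between a view-1 vector and a
    view-2 vector, modeled through the inner product of their
    embedded features."""
    f1, f2 = W1 @ x1, W2 @ x2
    return float(sigmoid(f1 @ f2))

p_pos = link_prob(np.array([2.0, 0.0]), np.array([0.0, 3.0]))   # aligned
p_neg = link_prob(np.array([2.0, 0.0]), np.array([0.0, -3.0]))  # opposed
```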
