Search Results for author: Naoto Ohsaka

Found 4 papers, 0 papers with code

On the (In)tractability of Computing Normalizing Constants for the Product of Determinantal Point Processes

no code implementations ICML 2020 Naoto Ohsaka, Tatsuya Matsuoka

We consider the product of determinantal point processes (DPPs), a point process whose probability mass is proportional to the product of the principal minors of multiple matrices, as a natural and promising generalization of DPPs.
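To make the object of study concrete, here is a minimal sketch (not from the paper) of the product-of-DPPs mass: a subset S gets weight equal to the product of principal minors det(A[S, S]) over the given matrices, and the normalizing constant sums this weight over all 2^n subsets. The brute-force sum below is exponential; the paper studies when this constant can or cannot be computed efficiently.

```python
import itertools
import numpy as np

def unnormalized_mass(matrices, S):
    # Weight of subset S: product over matrices A of the principal
    # minor det(A[S, S]). The empty minor has determinant 1.
    idx = np.ix_(list(S), list(S))
    return float(np.prod([np.linalg.det(A[idx]) for A in matrices]))

def normalizing_constant(matrices):
    # Brute-force normalizing constant: sum of weights over all
    # subsets of the ground set (exponential in n, illustration only).
    n = matrices[0].shape[0]
    total = 0.0
    for r in range(n + 1):
        for S in itertools.combinations(range(n), r):
            total += unnormalized_mass(matrices, S)
    return total
```

As a sanity check, with a single matrix A the constant reduces to the classical DPP identity det(A + I).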

Point Processes

Predictive Optimization with Zero-Shot Domain Adaptation

no code implementations 15 Jan 2021 Tomoya Sakai, Naoto Ohsaka

The task can be regarded as predictive optimization, but existing predictive optimization methods have not been extended to handle multiple domains.

Domain Adaptation

Monotone k-Submodular Function Maximization with Size Constraints

no code implementations NeurIPS 2015 Naoto Ohsaka, Yuichi Yoshida

A $k$-submodular function is a generalization of a submodular function, where the input consists of $k$ disjoint subsets, instead of a single subset, of the domain. Many machine learning problems, including influence maximization with $k$ kinds of topics and sensor placement with $k$ kinds of sensors, can be naturally modeled as the problem of maximizing monotone $k$-submodular functions. In this paper, we give constant-factor approximation algorithms for maximizing monotone $k$-submodular functions subject to several size constraints. The running times of our algorithms are almost linear in the domain size. We experimentally demonstrate that our algorithms outperform baseline algorithms in terms of solution quality.
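The setting above can be sketched with a simple greedy rule: repeatedly assign the (item, type) pair with the largest marginal gain until the size budget is spent. This is only an illustration of the problem shape, with a toy per-topic coverage objective that I introduce here; the function `f`, the coverage data, and all names are assumptions, not the paper's algorithm or experiments.

```python
def greedy_k_submodular(f, n, k, budget):
    # Greedy sketch for a total size constraint: each of the n items may
    # be assigned to at most one of k types; pick up to `budget` items,
    # each time taking the (item, type) pair with the largest gain in f.
    sol = {}  # item -> type
    for _ in range(budget):
        best, best_gain = None, float("-inf")
        for e in range(n):
            if e in sol:
                continue
            for t in range(k):
                gain = f({**sol, e: t}) - f(sol)
                if gain > best_gain:
                    best, best_gain = (e, t), gain
        if best is None:
            break
        sol[best[0]] = best[1]
    return sol

# Toy instance (hypothetical): cover[t][e] is the set of elements that
# item e covers when assigned to topic t.
cover = [
    [{1, 2}, {2, 3}, {4}],  # topic 0
    [{1}, {5, 6}, {7}],     # topic 1
]

def f(sol):
    # Toy monotone coverage objective: number of elements covered,
    # counted separately per topic.
    total = 0
    for t in range(len(cover)):
        covered = set()
        for e, te in sol.items():
            if te == t:
                covered |= cover[t][e]
        total += len(covered)
    return total
```

For example, `greedy_k_submodular(f, 3, 2, 2)` assigns two items, one per topic, covering four elements in total.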
