Search Results for author: Akihiro Kishimoto

Found 12 papers, 4 papers with code

Parallel Recursive Best-First AND/OR Search for Exact MAP Inference in Graphical Models

no code implementations NeurIPS 2015 Akihiro Kishimoto, Radu Marinescu, Adi Botea

The paper presents and evaluates the power of parallel search for exact MAP inference in graphical models.

A Survey of Parallel A*

1 code implementation 16 Aug 2017 Alex Fukunaga, Adi Botea, Yuu Jinnai, Akihiro Kishimoto

A* is a best-first search algorithm for finding optimal-cost paths in graphs.
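As background for the survey, here is a minimal serial A* sketch in Python over an explicit adjacency-list graph; the toy graph, node names, and zero heuristic are illustrative assumptions, not code from any of the surveyed parallel implementations.

```python
import heapq

def a_star(graph, start, goal, heuristic):
    """Minimal A*: graph maps node -> list of (neighbor, edge_cost) pairs."""
    # Priority queue of (f = g + h, g, node, path-so-far).
    frontier = [(heuristic(start), 0, start, [start])]
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return g, path                      # optimal cost and path
        if g > best_g.get(node, float("inf")):
            continue                            # stale queue entry
        for neighbor, cost in graph.get(node, []):
            new_g = g + cost
            if new_g < best_g.get(neighbor, float("inf")):
                best_g[neighbor] = new_g
                heapq.heappush(frontier,
                               (new_g + heuristic(neighbor), new_g,
                                neighbor, path + [neighbor]))
    return None                                 # goal unreachable

# Toy usage with an admissible (here: zero) heuristic.
graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 1), ("D", 5)], "C": [("D", 1)]}
print(a_star(graph, "A", "D", heuristic=lambda n: 0))  # (3, ['A', 'B', 'C', 'D'])
```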

Depth-First Proof-Number Search with Heuristic Edge Cost and Application to Chemical Synthesis Planning

no code implementations NeurIPS 2019 Akihiro Kishimoto, Beat Buesser, Bei Chen, Adi Botea

Search techniques, such as Monte Carlo Tree Search (MCTS) and Proof-Number Search (PNS), are effective in playing and solving games.
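For readers unfamiliar with PNS, the sketch below shows only the basic proof/disproof-number bookkeeping on a toy AND/OR tree; it is not the depth-first PNS with heuristic edge costs proposed in the paper, and the dict-based tree encoding ("type"/"value"/"children") is an assumption made for this example.

```python
INF = float("inf")

def proof_numbers(node):
    """Return (proof, disproof) numbers for a game-tree node.

    A node is a dict: {"type": "OR" | "AND",
                       "value": True/False/None,   # None = not yet decided
                       "children": [...]}.
    OR nodes try to prove the goal; AND nodes try to disprove it.
    """
    if node.get("value") is True:           # proved leaf
        return 0, INF
    if node.get("value") is False:          # disproved leaf
        return INF, 0
    children = node.get("children", [])
    if not children:                        # unexpanded frontier leaf
        return 1, 1
    child_pns = [proof_numbers(c) for c in children]
    if node["type"] == "OR":
        # One proved child suffices; all children must be disproved.
        return (min(p for p, _ in child_pns),
                sum(d for _, d in child_pns))
    else:  # AND node: all children must be proved; one disproof suffices.
        return (sum(p for p, _ in child_pns),
                min(d for _, d in child_pns))

# Tiny example tree: the root is provable via its second child.
root = {"type": "OR", "value": None, "children": [
    {"type": "AND", "value": None, "children": [
        {"type": "OR", "value": False, "children": []}]},
    {"type": "AND", "value": True, "children": []},
]}
print(proof_numbers(root))  # (0, inf): the root is proved
```

PNS repeatedly expands a most-proving frontier leaf selected by following these numbers down the tree; the paper's contribution is a depth-first variant that also weights edges with learned heuristic costs.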

Designing Machine Learning Pipeline Toolkit for AutoML Surrogate Modeling Optimization

no code implementations2 Jul 2021 Paulito P. Palmes, Akihiro Kishimoto, Radu Marinescu, Parikshit Ram, Elizabeth Daly

The pipeline optimization problem in machine learning requires simultaneous optimization of pipeline structures and parameter adaptation of their elements.

AutoML, BIG-bench Machine Learning, +1
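The paper's toolkit itself is not reproduced here; as a hedged illustration of what simultaneous optimization of pipeline structure and element parameters means, the scikit-learn sketch below searches jointly over which steps a pipeline contains and the hyperparameters of each candidate step. The dataset, candidate components, and search budget are arbitrary stand-ins.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RandomizedSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)

# Joint search space: pipeline structure (PCA on or off, which classifier)
# together with the hyperparameters of each chosen element.
pipe = Pipeline([("scale", StandardScaler()),
                 ("reduce", PCA()),
                 ("model", LogisticRegression(max_iter=1000))])
search_space = [
    {"reduce": ["passthrough", PCA(n_components=2)],
     "model": [LogisticRegression(max_iter=1000)],
     "model__C": [0.1, 1.0, 10.0]},
    {"reduce": ["passthrough", PCA(n_components=2)],
     "model": [RandomForestClassifier()],
     "model__n_estimators": [50, 200]},
]
search = RandomizedSearchCV(pipe, search_space, n_iter=8, cv=3, random_state=0)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```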

An Ensemble Approach for Automated Theorem Proving Based on Efficient Name Invariant Graph Neural Representations

1 code implementation 15 May 2023 Achille Fokoue, Ibrahim Abdelaziz, Maxwell Crouse, Shajith Ikbal, Akihiro Kishimoto, Guilherme Lima, Ndivhuwo Makondo, Radu Marinescu

NIAGRA addresses this problem by using 1) improved Graph Neural Networks for learning name-invariant formula representations tailored to their unique characteristics, and 2) an efficient ensemble approach to automated theorem proving.

Automated Theorem Proving, Transfer Learning
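As a toy illustration of the name-invariance idea only (not NIAGRA's graph neural network), the sketch below maps a formula term to a graph whose node labels record the order in which symbols first appear, so consistently renaming symbols leaves the graph unchanged; the nested-tuple term encoding is an assumption made for this example.

```python
def name_invariant_graph(term, symbol_ids=None, nodes=None, edges=None):
    """Turn a nested-tuple term into a graph with anonymised node labels.

    A term is either a string (constant/variable) or a tuple
    (function_or_predicate_name, arg1, arg2, ...).
    """
    if symbol_ids is None:
        symbol_ids, nodes, edges = {}, [], []

    def label(name):
        # The order of first occurrence, not the textual name, defines the label.
        if name not in symbol_ids:
            symbol_ids[name] = f"sym_{len(symbol_ids)}"
        return symbol_ids[name]

    if isinstance(term, str):                     # leaf symbol
        nodes.append(label(term))
        return len(nodes) - 1, nodes, edges
    head, *args = term
    nodes.append(label(head))
    head_idx = len(nodes) - 1
    for arg in args:
        child_idx, _, _ = name_invariant_graph(arg, symbol_ids, nodes, edges)
        edges.append((head_idx, child_idx))
    return head_idx, nodes, edges

# p(f(x), x) and q(g(y), y) map to the same graph up to symbol renaming.
_, n1, e1 = name_invariant_graph(("p", ("f", "x"), "x"))
_, n2, e2 = name_invariant_graph(("q", ("g", "y"), "y"))
print(n1 == n2 and e1 == e2)  # True
```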

Improving Molecular Properties Prediction Through Latent Space Fusion

1 code implementation 20 Oct 2023 Eduardo Soares, Akihiro Kishimoto, Emilio Vital Brazil, Seiji Takeda, Hiroshi Kajino, Renato Cerqueira

Pre-trained Language Models have emerged as promising tools for predicting molecular properties, yet their development is in its early stages, necessitating further research to enhance their efficacy and address challenges such as generalization and sample efficiency.

Molecular Property Prediction, Property Prediction
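The sketch below gives a minimal, hedged picture of latent-space fusion: embeddings from two pre-trained encoders (faked here with random arrays) are concatenated and passed to a lightweight downstream predictor. The shapes, the Ridge regressor, and the synthetic data are illustrative assumptions, not the paper's models or results.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Stand-ins for latent vectors produced by two frozen pre-trained encoders
# (e.g. a chemical language model and a graph-based encoder); in practice
# these would come from the models themselves, not random numbers.
n_molecules = 200
z_text = rng.normal(size=(n_molecules, 64))    # language-model latents
z_graph = rng.normal(size=(n_molecules, 32))   # graph-encoder latents
y = rng.normal(size=n_molecules)               # a molecular property target

# Simple latent-space fusion: concatenate the embeddings and fit a
# lightweight predictor on the fused representation.
z_fused = np.concatenate([z_text, z_graph], axis=1)
model = Ridge(alpha=1.0)
print(cross_val_score(model, z_fused, y, cv=5).mean())  # mean cross-validated R^2
```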

Finding Sub-task Structure with Natural Language Instruction

no code implementations LNLS (ACL) 2022 Ryokan Ri, Yufang Hou, Radu Marinescu, Akihiro Kishimoto

When mapping a natural language instruction to a sequence of actions, it is often useful to identify sub-tasks in the instruction.

Segmentation
