Search Results for author: Hisaichi Shibata

Found 6 papers, 0 papers with code

Theory of Hallucinations based on Equivariance

no code implementations · 22 Dec 2023 · Hisaichi Shibata

Using this scale, I tested language models for their ability to acquire character-level equivariance.

Hallucination

Playing the Werewolf game with artificial intelligence for language understanding

no code implementations · 21 Feb 2023 · Hisaichi Shibata, Soichiro Miki, Yuta Nakamura

The purpose of this study is to develop an AI agent that can play Werewolf through natural language conversations.

Language Modelling

Local Differential Privacy Image Generation Using Flow-based Deep Generative Models

no code implementations · 20 Dec 2022 · Hisaichi Shibata, Shouhei Hanaoka, Yang Cao, Masatoshi Yoshikawa, Tomomi Takenaga, Yukihiro Nomura, Naoto Hayashi, Osamu Abe

To release and use medical images, we need an algorithm that can simultaneously protect privacy and preserve pathological findings in those images.

Image Generation

X2CT-FLOW: Maximum a posteriori reconstruction using a progressive flow-based deep generative model for ultra sparse-view computed tomography in ultra low-dose protocols

no code implementations · 9 Apr 2021 · Hisaichi Shibata, Shouhei Hanaoka, Yukihiro Nomura, Takahiro Nakao, Tomomi Takenaga, Naoto Hayashi, Osamu Abe

Here, we propose X2CT-FLOW for the maximum a posteriori (MAP) reconstruction of a three-dimensional (3D) chest CT image from a single or a few two-dimensional (2D) projection images using a progressive flow-based deep generative model, especially for ultra low-dose protocols.
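As a general sketch of the MAP objective underlying this kind of reconstruction (notation mine, not taken from the paper): with a forward projection operator $A$ mapping a 3D volume to 2D projections, observed projections $y$, and a flow-based generative prior $p_\theta(x)$ over volumes, MAP reconstruction solves

```latex
\hat{x} = \arg\max_{x} \; \log p(y \mid x) + \log p_\theta(x),
```

where, assuming Gaussian observation noise with variance $\sigma^2$, the likelihood term is $\log p(y \mid x) = -\tfrac{1}{2\sigma^2}\lVert y - Ax \rVert_2^2 + \text{const}$, and $\log p_\theta(x)$ can be evaluated exactly via the change-of-variables formula of the normalizing flow.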

Computed Tomography (CT), SSIM

On the Matrix-Free Generation of Adversarial Perturbations for Black-Box Attacks

no code implementations · 18 Feb 2020 · Hisaichi Shibata, Shouhei Hanaoka, Yukihiro Nomura, Naoto Hayashi, Osamu Abe

In general, adversarial perturbations superimposed on inputs pose a realistic threat to deep neural networks (DNNs).

Semantic Segmentation
