no code implementations • 24 Aug 2022 • Mocho Go, Hideyuki Tachibana
Following its success in the language domain, the self-attention mechanism (transformer) has been adopted in the vision domain, where it has recently achieved great success.
Ranked #42 on Instance Segmentation on COCO test-dev
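The self-attention mechanism mentioned above can be sketched in a few lines. This is a minimal scaled dot-product self-attention in NumPy, not the paper's actual architecture; the shapes and weight matrices are illustrative assumptions.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence of token vectors."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v              # project tokens to queries/keys/values
    scores = q @ k.T / np.sqrt(k.shape[-1])          # pairwise similarities, scaled by sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v                               # attention-weighted mixture of values

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))                      # 4 tokens, 8-dim embeddings (toy sizes)
w_q, w_k, w_v = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8): one updated vector per token
```

Each output vector is a convex combination of all value vectors, which is what lets every token attend to every other token in a single layer.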
no code implementations • 26 Dec 2021 • Hideyuki Tachibana, Mocho Go, Muneyoshi Inahara, Yotaro Katayama, Yotaro Watanabe
Diffusion generative models have emerged as a new challenger to popular deep neural generative models such as GANs. However, they have the drawback that synthesis often requires a huge number of neural function evaluations (NFEs) unless sophisticated sampling strategies are employed.
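The NFE bottleneck in the abstract can be made concrete: ancestral-style sampling calls the denoising network once per timestep, so a 1000-step schedule costs 1000 NFEs. The toy loop below illustrates only this cost structure; the linear "denoiser" and update rule are placeholder assumptions, not the paper's method.

```python
import numpy as np

def sample(denoiser, steps=1000, dim=2, seed=0):
    """Toy reverse-diffusion loop: one denoiser call (NFE) per timestep."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(dim)                      # start from pure noise
    nfes = 0
    for t in range(steps, 0, -1):
        eps_hat = denoiser(x, t)                      # one neural function evaluation
        nfes += 1
        x = x - eps_hat / steps                       # crude denoising step
        if t > 1:
            x += 0.01 * rng.standard_normal(dim)      # re-inject a little noise
    return x, nfes

# stand-in "network": just shrinks the sample (illustrative only)
x, nfes = sample(lambda x, t: 0.1 * x, steps=1000)
print(nfes)  # 1000 network calls for one sample
```

Fast samplers attack exactly this loop, cutting the step count (and hence NFEs) by orders of magnitude while keeping sample quality.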
1 code implementation • 7 Jun 2020 • Mocho Go
In this thesis, we introduce new tools for the conformal bootstrap, autoboot and qboot.
High Energy Physics - Theory
1 code implementation • 25 Mar 2019 • Mocho Go, Yuji Tachikawa
We introduce autoboot, a Mathematica program which automatically generates mixed-correlator bootstrap equations of an arbitrary number of scalar external operators, given the global symmetry group and the representations of the operators.
High Energy Physics - Theory