Search Results for author: Mohamed Wahib

Found 10 papers, 0 papers with code

Adaptive Patching for High-resolution Image Segmentation with Transformers

no code implementations • 15 Apr 2024 • Enzhi Zhang, Isaac Lyngaas, Peng Chen, Xiao Wang, Jun Igarashi, Yuankai Huo, Mohamed Wahib, Masaharu Munetomo

For high-resolution images, e.g. microscopic pathology images, the quadratic compute and memory cost prohibits the use of an attention-based model if we are to use the smaller patch sizes that are favorable for segmentation.

Friction · Image Segmentation +2
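The quadratic-cost argument is easy to check with a back-of-envelope calculation. The sketch below assumes an illustrative 4096×4096 image and patch sizes that are not taken from the paper; it only shows how the token count, and hence the self-attention cost, blows up as patches shrink.

```python
# Back-of-envelope illustration (assumed sizes, not from the paper): sequence
# length and attention cost for a ViT-style model as the patch size shrinks.
def attention_cost(image_size: int, patch_size: int) -> tuple[int, int]:
    """Return (num_patches, pairwise attention entries) for a square image."""
    num_patches = (image_size // patch_size) ** 2   # tokens in the sequence
    return num_patches, num_patches ** 2            # self-attention scales as O(n^2)

for patch in (32, 16, 8):
    n, cost = attention_cost(image_size=4096, patch_size=patch)
    print(f"patch {patch:>2}px: {n:>7} tokens, {cost:.2e} attention entries")
```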

Ultra-Long Sequence Distributed Transformer

no code implementations • 4 Nov 2023 • Xiao Wang, Isaac Lyngaas, Aristeidis Tsaris, Peng Chen, Sajal Dash, Mayanka Chandra Shekar, Tao Luo, Hong-Jun Yoon, Mohamed Wahib, John Gounley

This paper presents a novel and efficient distributed training method, the Long Short-Sequence Transformer (LSS Transformer), for training transformers with long sequences.
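As a rough illustration of why long sequences force distribution, the sketch below partitions a long activation tensor across workers. It is a generic sequence-parallel sketch with assumed sizes, not the LSS Transformer algorithm itself.

```python
# Generic sequence-parallel sketch (assumed sizes; not the LSS Transformer itself):
# the token dimension is split across workers so no single device holds the
# activations for the whole sequence.
import numpy as np

def split_sequence(activations: np.ndarray, num_workers: int) -> list[np.ndarray]:
    """Partition a (seq_len, hidden) activation tensor along the sequence axis."""
    return np.array_split(activations, num_workers, axis=0)

seq_len, hidden, workers = 50_000, 1024, 8            # illustrative sizes
activations = np.zeros((seq_len, hidden), dtype=np.float32)
chunks = split_sequence(activations, workers)
print(f"full sequence: {activations.nbytes / 2**20:.0f} MiB, "
      f"per worker: {chunks[0].nbytes / 2**20:.0f} MiB")
# A real distributed run would compute attention per chunk and exchange whatever
# is needed for cross-chunk attention (e.g. an all-gather of keys and values).
```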

CSI-Inpainter: Enabling Visual Scene Recovery from CSI Time Sequences for Occlusion Removal

no code implementations • 9 May 2023 • Cheng Chen, Shoki Ohta, Takayuki Nishio, Mehdi Bennis, Jihong Park, Mohamed Wahib

Introducing CSI-Inpainter, a pioneering approach for occlusion removal using Channel State Information (CSI) time sequences, this work propels the application of wireless signal processing into the realm of visual scene recovery.

Image Inpainting · Image Restoration

Myths and Legends in High-Performance Computing

no code implementations • 6 Jan 2023 • Satoshi Matsuoka, Jens Domke, Mohamed Wahib, Aleksandr Drozd, Torsten Hoefler

While some laws end, new directions are emerging, such as algorithmic scaling or novel architecture research.


Image Gradient Decomposition for Parallel and Memory-Efficient Ptychographic Reconstruction

no code implementations • 12 May 2022 • Xiao Wang, Aristeidis Tsaris, Debangshu Mukherjee, Mohamed Wahib, Peng Chen, Mark Oxley, Olga Ovchinnikova, Jacob Hinkle

In this paper, we propose a novel image gradient decomposition method that significantly reduces the memory footprint for ptychographic reconstruction by tessellating image gradients and diffraction measurements into tiles.
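To make the memory argument concrete, here is a minimal tiling sketch with hypothetical array sizes; it only shows how tessellating a large gradient array bounds how much any single worker must hold at once, not the paper's actual decomposition.

```python
# Minimal tiling sketch (hypothetical sizes): tessellating a large gradient
# array into tiles bounds the per-worker memory footprint.
import numpy as np

def iter_tiles(image: np.ndarray, tile: int):
    """Yield non-overlapping tile views of a 2-D array in row-major order."""
    rows, cols = image.shape
    for y in range(0, rows, tile):
        for x in range(0, cols, tile):
            yield image[y:y + tile, x:x + tile]

gradient = np.ones((8192, 8192), dtype=np.float32)    # assumed full-field gradient
largest_tile = max(t.nbytes for t in iter_tiles(gradient, tile=1024))
print(f"full gradient: {gradient.nbytes / 2**20:.0f} MiB, "
      f"largest tile: {largest_tile / 2**20:.0f} MiB")
```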

An Oracle for Guiding Large-Scale Model/Hybrid Parallel Training of Convolutional Neural Networks

no code implementations • 19 Apr 2021 • Albert Njoroge Kahira, Truong Thao Nguyen, Leonardo Bautista Gomez, Ryousei Takano, Rosa M. Badia, Mohamed Wahib

Deep Neural Network (DNN) frameworks use distributed training to enable faster time to convergence and to alleviate memory capacity limitations when training large models and/or using high-dimensional inputs.
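As a hint of the trade-off such an oracle has to weigh, the sketch below gives a very rough per-device memory estimate under data versus model parallelism; the formula and sizes are assumptions for illustration, not the paper's cost model.

```python
# Very rough per-device memory estimate (assumed formula and sizes, not the
# paper's oracle): data parallelism replicates parameters on every device,
# model parallelism shards them, at the cost of extra communication.
def per_device_gib(params_billions: float, devices: int,
                   bytes_per_param: int = 4, optimizer_factor: int = 3,
                   model_parallel: bool = False) -> float:
    """Parameter plus optimizer-state memory per device, in GiB."""
    total = params_billions * 1e9 * bytes_per_param * optimizer_factor
    return (total / devices if model_parallel else total) / 2**30

for mp in (False, True):
    gib = per_device_gib(params_billions=1.5, devices=8, model_parallel=mp)
    print(f"model_parallel={mp}: {gib:.1f} GiB per device")
```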

GTOPX Space Mission Benchmarks

no code implementations • 15 Oct 2020 • Martin Schlueter, Mehdi Neshat, Mohamed Wahib, Masaharu Munetomo, Markus Wagner

This contribution introduces the GTOPX space mission benchmark collection, which is an extension of the GTOP database published by the European Space Agency (ESA).
