no code implementations • 12 Jul 2024 • Atefeh Khoshkhahtinat, Ali Zafari, Piyush M. Mehta, Nasser M. Nasrabadi, Barbara J. Thompson, Michael S. F. Kirk, Daniel da Silva
In this work, we introduce a Transformer-based architecture specifically designed to capture both local and global information from input images effectively and efficiently.
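As a rough illustration of the idea of mixing local and global attention in one block (a minimal sketch under our own assumptions, not the architecture described in the paper; module and parameter names are ours):

```python
import torch
import torch.nn as nn

class LocalGlobalBlock(nn.Module):
    """Toy Transformer block: window attention for local detail plus
    attention over a strided token subset for cheap global context."""
    def __init__(self, dim=96, heads=4, window=8, stride=8):
        super().__init__()
        self.window, self.stride = window, stride
        self.local_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.global_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1, self.norm2 = nn.LayerNorm(dim), nn.LayerNorm(dim)
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(),
                                 nn.Linear(4 * dim, dim))

    def forward(self, x, h, w):
        # x: (B, H*W, C) tokens of an H x W feature map; H, W divisible by window.
        b, n, c = x.shape
        ws = self.window
        # Local: attention inside non-overlapping ws x ws spatial windows.
        xw = self.norm1(x).reshape(b, h // ws, ws, w // ws, ws, c)
        xw = xw.permute(0, 1, 3, 2, 4, 5).reshape(-1, ws * ws, c)
        xw = self.local_attn(xw, xw, xw)[0]
        xw = xw.reshape(b, h // ws, w // ws, ws, ws, c)
        xw = xw.permute(0, 1, 3, 2, 4, 5).reshape(b, n, c)
        x = x + xw
        # Global: every token attends to a strided subsample of all tokens.
        q = self.norm1(x)
        kv = q[:, :: self.stride, :]
        x = x + self.global_attn(q, kv, kv)[0]
        return x + self.mlp(self.norm2(x))

# usage: out = LocalGlobalBlock()(torch.randn(1, 32 * 32, 96), 32, 32)
```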
no code implementations • CVPR 2024 • Atefeh Khoshkhahtinat, Ali Zafari, Piyush M. Mehta, Nasser M. Nasrabadi
The global spatial context is captured by a Transformer that is specifically designed for image compression tasks.
no code implementations • 6 Nov 2023 • Ali Zafari, Atefeh Khoshkhahtinat, Jeremy A. Grajeda, Piyush M. Mehta, Nasser M. Nasrabadi, Laura E. Boucheron, Barbara J. Thompson, Michael S. F. Kirk, Daniel da Silva
In this work, we propose an adversarially trained neural network, equipped with local and non-local attention modules to capture both the local and global structure of the image, resulting in a better rate-distortion (RD) trade-off than conventional hand-engineered codecs.
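For context, a standard non-local attention block (in the style of non-local neural networks) lets every spatial position aggregate information from all others; the sketch below is illustrative only, with layer names and channel sizes chosen by us rather than taken from the paper.

```python
import torch
import torch.nn as nn

class NonLocalBlock(nn.Module):
    """Non-local (global) attention over a conv feature map:
    each position attends to every other position."""
    def __init__(self, channels=128):
        super().__init__()
        inner = channels // 2
        self.theta = nn.Conv2d(channels, inner, 1)   # queries
        self.phi   = nn.Conv2d(channels, inner, 1)   # keys
        self.g     = nn.Conv2d(channels, inner, 1)   # values
        self.out   = nn.Conv2d(inner, channels, 1)

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.theta(x).flatten(2).transpose(1, 2)              # (B, HW, C')
        k = self.phi(x).flatten(2)                                # (B, C', HW)
        v = self.g(x).flatten(2).transpose(1, 2)                  # (B, HW, C')
        attn = torch.softmax(q @ k / k.shape[1] ** 0.5, dim=-1)   # (B, HW, HW)
        y = (attn @ v).transpose(1, 2).reshape(b, -1, h, w)
        return x + self.out(y)   # residual keeps local structure intact

# usage: y = NonLocalBlock(128)(torch.randn(1, 128, 16, 16))
```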
no code implementations • 22 Sep 2023 • Mohammad Akyash, Ali Zafari, Nasser M. Nasrabadi
The consistent improvement we observed in these benchmarks demonstrates the efficacy of our approach in enhancing FR performance.
no code implementations • 19 Sep 2023 • Atefeh Khoshkhahtinat, Ali Zafari, Piyush M. Mehta, Mohammad Akyash, Hossein Kashiani, Nasser M. Nasrabadi
In addition, we introduce a novel entropy model that incorporates two different hyperpriors to model cross-channel and spatial dependencies of the latent representation.
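To make the idea of conditioning on two hyperpriors concrete, here is a rough sketch (our own simplification, not the paper's entropy model; all names and shapes are assumptions): one hyper-decoder summarizes cross-channel statistics, the other keeps the spatial grid, and their outputs jointly parameterize a per-element Gaussian over the latent.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoHyperpriorEntropyModel(nn.Module):
    """Illustrative only: a conditional Gaussian over the latent whose
    parameters come from a channel-wise and a spatial hyper-latent."""
    def __init__(self, latent_ch=192, hyper_ch=128):
        super().__init__()
        # Cross-channel branch: global pooling -> per-channel parameters.
        self.channel_dec = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Conv2d(hyper_ch, 2 * latent_ch, 1))
        # Spatial branch: convolutional decoder that preserves the grid.
        self.spatial_dec = nn.Sequential(
            nn.Conv2d(hyper_ch, hyper_ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(hyper_ch, 2 * latent_ch, 3, padding=1))

    def forward(self, y, z_channel, z_spatial):
        # y: latent (B, C, H, W); z_*: decoded hyper-latents (B, hyper_ch, H, W)
        mu_c, sigma_c = self.channel_dec(z_channel).chunk(2, dim=1)
        mu_s, sigma_s = self.spatial_dec(z_spatial).chunk(2, dim=1)
        mu = mu_c + mu_s                              # broadcast channel + spatial means
        scale = F.softplus(sigma_c + sigma_s) + 1e-6
        # Rate proxy: negative log-likelihood under the Gaussian (continuous relaxation).
        nll = 0.5 * ((y - mu) / scale) ** 2 + torch.log(scale) + 0.5 * math.log(2 * math.pi)
        bits = nll.sum() / math.log(2.0)
        return mu, scale, bits
```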
no code implementations • 19 Sep 2023 • Ali Zafari, Atefeh Khoshkhahtinat, Piyush M. Mehta, Nasser M. Nasrabadi, Barbara J. Thompson, Michael S. F. Kirk, Daniel da Silva
Recent end-to-end optimized neural-network-based image compression systems have shown great potential for use in an ad-hoc manner.
no code implementations • 19 Sep 2023 • Atefeh Khoshkhahtinat, Ali Zafari, Piyush M. Mehta, Nasser M. Nasrabadi, Barbara J. Thompson, Michael S. F. Kirk, Daniel da Silva
NASA's Solar Dynamics Observatory (SDO) mission collects large data volumes of the Sun's daily activity.
no code implementations • 4 Aug 2023 • Ali Zafari, Atefeh Khoshkhahtinat, Piyush Mehta, Mohammad Saeed Ebrahimi Saadabadi, Mohammad Akyash, Nasser M. Nasrabadi
The design of a neural image compression network is governed by how well the entropy model matches the true distribution of the latent code.
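This statement can be made precise with the standard rate decomposition used throughout learned compression (a generic identity, not a result specific to this paper), writing y = g_a(x) for the latent produced by the analysis transform, p for its true marginal distribution, and q for the entropy model:

```latex
% Expected bitrate when the latent y is coded with entropy model q:
R \;=\; \mathbb{E}_{y \sim p}\!\left[-\log_2 q(y)\right]
  \;=\; H(p) \;+\; D_{\mathrm{KL}}\!\left(p \,\|\, q\right).
% The overhead above the ideal rate H(p) is exactly the mismatch
% D_KL(p || q), so a better-matched entropy model directly lowers the rate.
```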
no code implementations • 6 Jun 2023 • Mohammad Saeed Ebrahimi Saadabadi, Sahar Rahimi Malakshan, Ali Zafari, Moktari Mostofa, Nasser M. Nasrabadi
Our method adaptively finds and assigns more attention to the recognizable low-quality samples in the training datasets.
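A rough sketch of such adaptive sample weighting (our own illustration, not the paper's method: the embedding norm stands in for image quality and the sample's own confidence for recognizability, both hypothetical proxies):

```python
import torch
import torch.nn.functional as F

def quality_weighted_loss(embeddings, logits, labels, tau=1.0):
    """Illustrative sketch: up-weight recognizable low-quality samples."""
    ce = F.cross_entropy(logits, labels, reduction="none")            # per-sample loss
    quality = embeddings.norm(dim=1)                                  # low norm ~ low quality
    recognizable = logits.softmax(dim=1).gather(1, labels[:, None]).squeeze(1)
    # More attention to samples that are low quality *and* still recognizable.
    w = torch.softmax((recognizable / (quality + 1e-6)) / tau, dim=0)
    return (w * ce).sum()

# usage (random tensors just to show shapes):
# emb = torch.randn(32, 512); logits = torch.randn(32, 1000)
# y = torch.randint(0, 1000, (32,))
# loss = quality_weighted_loss(emb, logits, y)
```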
no code implementations • 12 Oct 2022 • Ali Zafari, Atefeh Khoshkhahtinat, Piyush M. Mehta, Nasser M. Nasrabadi, Barbara J. Thompson, Daniel da Silva, Michael S. F. Kirk
We have designed an ad-hoc ANN-based image compression scheme to reduce the amount of data that must be stored and retrieved on space missions studying solar dynamics.
no code implementations • 25 May 2019 • Jia-Bao Liu, S. Morteza Mirafzal, Ali Zafari
If all the eigenvalues of the adjacency matrix of the graph $\Gamma$ are integers, then we say that $\Gamma$ is an integral graph.
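For concreteness, this definition is easy to check numerically; the sketch below (plain NumPy, function name is ours) tests whether a graph given by its adjacency matrix is integral.

```python
import numpy as np

def is_integral_graph(adj, tol=1e-8):
    """Return True if every eigenvalue of the (symmetric) adjacency
    matrix is an integer, i.e. the graph is integral."""
    eigvals = np.linalg.eigvalsh(np.asarray(adj, dtype=float))
    return bool(np.all(np.abs(eigvals - np.round(eigvals)) < tol))

# The complete graph K_4 has spectrum {3, -1, -1, -1}, so it is integral:
K4 = np.ones((4, 4)) - np.eye(4)
print(is_integral_graph(K4))  # True
```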
Combinatorics (MSC 05C50, 05C31)