Search Results for author: Safdar Abbas Khan

Found 1 paper, 1 paper with code

Neural Machine Translation Models with Attention-Based Dropout Layer

1 code implementation • Computers, Materials & Continua 2023 • Huma Israr, Safdar Abbas Khan, Muhammad Ali Tahir, Muhammad Khuram Shahzad, Muneer Ahmad, Jasni Mohamad Zain

We empirically conclude that adding an attention-based dropout layer improves GRU, SRU, and Transformer translation, yielding considerable gains in both translation quality and speed (see the sketch below).

Machine Translation • NMT +1
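
The listing does not include implementation details for the attention-based dropout layer. Below is a minimal sketch, assuming the layer simply applies dropout to the softmax-normalized weights of a scaled dot-product attention module; the class name AttentionWithDropout, the dropout placement, and all shapes are illustrative assumptions, not the authors' code.

# Minimal sketch: dropout applied to attention probabilities.
# Assumption: the "attention-based dropout layer" drops entries of the
# softmax-normalized attention matrix; this is NOT the paper's exact method.
import math
import torch
import torch.nn as nn

class AttentionWithDropout(nn.Module):
    """Scaled dot-product attention with dropout on the attention weights."""

    def __init__(self, d_model: int, dropout: float = 0.1):
        super().__init__()
        self.scale = math.sqrt(d_model)
        self.dropout = nn.Dropout(dropout)  # applied to attention probabilities

    def forward(self, query, key, value, mask=None):
        # query/key/value: (batch, seq_len, d_model)
        scores = torch.matmul(query, key.transpose(-2, -1)) / self.scale
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        attn = torch.softmax(scores, dim=-1)
        attn = self.dropout(attn)  # the attention-based dropout step
        return torch.matmul(attn, value), attn

# Usage example with random tensors
if __name__ == "__main__":
    layer = AttentionWithDropout(d_model=64, dropout=0.2)
    q = k = v = torch.randn(2, 10, 64)
    context, weights = layer(q, k, v)
    print(context.shape, weights.shape)  # (2, 10, 64) and (2, 10, 10)

The same dropout-on-weights idea can be attached to the attention module of a GRU- or SRU-based encoder-decoder as well as a Transformer; only the surrounding recurrence differs.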
