Search Results for author: Mehmet Fatih Amasyali

Found 11 papers, 2 papers with code

Theoretical research on generative diffusion models: an overview

no code implementations • 13 Apr 2024 • Melike Nur Yeğin, Mehmet Fatih Amasyali

Generative diffusion models have shown great success in many fields and rest on a powerful theoretical foundation.
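
As a reminder of the kind of theory such a survey covers, the snippet below is a minimal NumPy sketch of the standard forward (noising) process, sampling x_t directly from x_0 via the closed form built from cumulative products of the noise schedule. It is illustrative only, not code from the paper, and the linear beta schedule is an assumption.

```python
# Minimal sketch of the forward (noising) process at the core of diffusion
# model theory; illustrative only, not the survey's own formulation.
# Uses the closed form q(x_t | x_0) = N(sqrt(abar_t) * x_0, (1 - abar_t) * I).
import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)       # assumed linear noise schedule
alpha_bars = np.cumprod(1.0 - betas)     # cumulative products alpha_bar_t

def q_sample(x0, t, rng=np.random.default_rng(0)):
    """Sample x_t ~ q(x_t | x_0) directly, without iterating over steps."""
    noise = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * noise

x0 = np.random.default_rng(1).standard_normal(8)  # toy "data" vector
print(q_sample(x0, t=10))    # still close to x0
print(q_sample(x0, t=999))   # nearly pure Gaussian noise
```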

Advancing NLP Models with Strategic Text Augmentation: A Comprehensive Study of Augmentation Methods and Curriculum Strategies

no code implementations • 14 Feb 2024 • Himmet Toprak Kesgin, Mehmet Fatih Amasyali

The study concludes that the use of augmentation methods, especially in conjunction with MCCL, leads to improved results in various classification tasks, providing a foundation for future advances in text augmentation strategies in NLP (a toy sketch of pairing augmentation with a curriculum appears below).

Sentiment Analysis · Text Augmentation · +1
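
The sketch below is a hedged, generic illustration of pairing a simple text augmentation (random word swap) with a curriculum-style ordering of training data. It does not reproduce the paper's MCCL strategy, and sentence length is only a stand-in difficulty score.

```python
# Illustrative sketch: simple text augmentation (random word swap) combined
# with a curriculum-style easy-to-hard ordering. The paper's MCCL strategy is
# not reproduced; sentence length is a stand-in difficulty proxy.
import random

def random_swap(sentence, n_swaps=1, rng=random.Random(0)):
    """Return a copy of the sentence with n_swaps random word-position swaps."""
    words = sentence.split()
    for _ in range(n_swaps):
        if len(words) < 2:
            break
        i, j = rng.sample(range(len(words)), 2)
        words[i], words[j] = words[j], words[i]
    return " ".join(words)

def curriculum_order(texts, labels):
    """Order examples from 'easy' to 'hard' using length as a difficulty proxy."""
    return sorted(zip(texts, labels), key=lambda pair: len(pair[0].split()))

texts = ["great movie", "the plot was far too slow to enjoy", "not bad at all"]
labels = [1, 0, 1]

ordered = curriculum_order(texts, labels)
augmented = [(random_swap(t), y) for t, y in ordered]  # augmented copies
print(ordered + augmented)
```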

From Text to Multimodal: A Comprehensive Survey of Adversarial Example Generation in Question Answering Systems

no code implementations • 26 Dec 2023 • Gulsum Yigit, Mehmet Fatih Amasyali

Integrating adversarial machine learning with Question Answering (QA) systems has emerged as a critical area for understanding the vulnerabilities and robustness of these systems.

Question Answering · Question Generation · +1

Developing and Evaluating Tiny to Medium-Sized Turkish BERT Models

no code implementations • 26 Jul 2023 • Himmet Toprak Kesgin, Muzaffer Kaan Yuce, Mehmet Fatih Amasyali

This study introduces and evaluates tiny, mini, small, and medium-sized uncased Turkish BERT models, aiming to bridge the research gap in less-resourced languages (a hedged configuration sketch appears below).

Classification · Computational Efficiency · +3
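
A hedged sketch of how tiny-to-medium BERT architectures can be instantiated with the Hugging Face transformers library: the layer and hidden sizes follow the common tiny/mini/small/medium convention, and the vocabulary size is an assumed placeholder rather than the paper's actual Turkish configuration.

```python
# Hedged sketch of tiny-to-medium BERT architectures via Hugging Face
# transformers. Sizes follow the common tiny/mini/small/medium convention;
# the paper's exact Turkish hyperparameters and vocabulary may differ.
from transformers import BertConfig, BertForMaskedLM

SIZES = {
    "tiny":   dict(num_hidden_layers=2, hidden_size=128),
    "mini":   dict(num_hidden_layers=4, hidden_size=256),
    "small":  dict(num_hidden_layers=4, hidden_size=512),
    "medium": dict(num_hidden_layers=8, hidden_size=512),
}

def build_bert(size, vocab_size=32000):          # vocab size is an assumption
    spec = SIZES[size]
    config = BertConfig(
        vocab_size=vocab_size,
        hidden_size=spec["hidden_size"],
        num_hidden_layers=spec["num_hidden_layers"],
        num_attention_heads=spec["hidden_size"] // 64,
        intermediate_size=spec["hidden_size"] * 4,
    )
    return BertForMaskedLM(config)

model = build_bert("tiny")
print(sum(p.numel() for p in model.parameters()))  # rough parameter count
```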

Reviewer Assignment Problem: A Systematic Review of the Literature

no code implementations • 1 Apr 2023 • Meltem Aksoy, Seda Yanik, Mehmet Fatih Amasyali

Appropriate reviewer assignment significantly impacts the quality of proposal evaluation, as accurate and fair reviews are contingent on assigning proposals to relevant reviewers.
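
As a toy illustration of one common formulation of the problem, the sketch below assigns reviewers to proposals one-to-one by maximizing total relevance with SciPy's linear_sum_assignment. The relevance matrix is made up, and real systems covered by such a survey add constraints (reviewer load limits, conflicts of interest) that this sketch omits.

```python
# Toy formulation of the reviewer assignment problem: maximize total
# reviewer-proposal relevance under a one-to-one assignment. Constraints such
# as load limits and conflicts of interest are omitted.
import numpy as np
from scipy.optimize import linear_sum_assignment

# relevance[i, j]: how well reviewer i matches proposal j (higher is better)
relevance = np.array([
    [0.9, 0.2, 0.4],
    [0.3, 0.8, 0.5],
    [0.6, 0.4, 0.7],
])

# linear_sum_assignment minimizes cost, so negate relevance to maximize it
rows, cols = linear_sum_assignment(-relevance)
for r, p in zip(rows, cols):
    print(f"reviewer {r} -> proposal {p} (relevance {relevance[r, p]})")
```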

Transformers as Neural Augmentors: Class Conditional Sentence Generation via Variational Bayes

1 code implementation • 19 May 2022 • M. Şafak Bilici, Mehmet Fatih Amasyali

The presented results show that our model improves the performance of current models compared to other data augmentation techniques while using only a small amount of computation power (a minimal conditional-VAE sketch appears below).

Data Augmentation · Sentence
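
The sketch below is a minimal, hedged illustration of a class-conditional variational autoencoder for text augmentation. The paper builds this idea into a Transformer that generates full sentences; this toy version uses a small MLP over bag-of-words vectors only to show the class-conditional encoder, the reparameterization trick, and the KL term.

```python
# Minimal class-conditional VAE sketch for text augmentation; illustrative
# only. The paper's model is Transformer-based and generates full sentences,
# whereas this toy uses bag-of-words vectors.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConditionalVAE(nn.Module):
    def __init__(self, vocab_size, num_classes, latent_dim=16, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(vocab_size + num_classes, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, latent_dim)
        self.logvar = nn.Linear(hidden, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + num_classes, hidden), nn.ReLU(),
            nn.Linear(hidden, vocab_size))

    def forward(self, bow, y_onehot):
        h = self.encoder(torch.cat([bow, y_onehot], dim=-1))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        logits = self.decoder(torch.cat([z, y_onehot], dim=-1))
        return logits, mu, logvar

def loss_fn(logits, bow, mu, logvar):
    recon = F.binary_cross_entropy_with_logits(logits, bow, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

# toy usage: 2 classes, vocabulary of 100 words, batch of 4 documents
model = ConditionalVAE(vocab_size=100, num_classes=2)
bow = torch.randint(0, 2, (4, 100)).float()
y = F.one_hot(torch.tensor([0, 1, 0, 1]), num_classes=2).float()
logits, mu, logvar = model(bow, y)
print(loss_fn(logits, bow, mu, logvar))
```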

Training with Growing Sets: A Simple Alternative to Curriculum Learning and Self Paced Learning

no code implementations • ICLR 2018 • Melike Nur Mermer, Mehmet Fatih Amasyali

Studies on these topics show that starting with a small training set and adding new samples according to difficulty level improves learning performance.
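
A minimal sketch of the growing-sets idea: order samples from easy to hard and train on progressively larger subsets. The paper's own difficulty criterion is not reproduced; the margin of a preliminary linear model is used here as a hypothetical difficulty score.

```python
# Sketch of training with growing sets: start from the 'easiest' samples and
# progressively add harder ones. The difficulty score below (margin of a
# preliminary model) is a hypothetical stand-in, not the paper's criterion.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)

# Preliminary pass to score difficulty: small |decision_function| = harder.
probe = SGDClassifier(loss="log_loss", random_state=0).fit(X, y)
difficulty = -np.abs(probe.decision_function(X))   # higher = harder
order = np.argsort(difficulty)                      # easiest first

model = SGDClassifier(loss="log_loss", random_state=0)
classes = np.unique(y)
for frac in (0.25, 0.5, 0.75, 1.0):                 # growing subsets
    n = int(frac * len(X))
    subset = order[:n]
    model.partial_fit(X[subset], y[subset], classes=classes)
    print(f"trained on {n} samples, accuracy={model.score(X, y):.3f}")
```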
