Search Results for author: Riyadh Baghdadi

Found 14 papers, 4 papers with code

Exploring the Knowledge Mismatch Hypothesis: Hallucination Propensity in Small Models Fine-tuned on Data from Larger Models

no code implementations31 Oct 2024 Phil Wee, Riyadh Baghdadi

One of the key limitations observed with these smaller models is their propensity to hallucinate significantly more often than larger models.

Hallucination, Misinformation

A Reinforcement Learning Environment for Automatic Code Optimization in the MLIR Compiler

no code implementations17 Sep 2024 Nazim Bendib, Iheb Nassim Aouadj, Riyadh Baghdadi

In this project, we introduce the first RL environment for the MLIR compiler, dedicated to facilitating MLIR compiler research and enabling automatic code optimization using multi-action reinforcement learning.

Reinforcement Learning, +1
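
The paper's actual environment is not shown here, but the core idea, an agent that applies a sequence of code transformations and is rewarded by the resulting speedup, can be sketched with a toy gym-style environment. The transformation set, state encoding, and simulated reward below are illustrative assumptions, not the paper's design:

```python
# Toy multi-action RL environment for code optimization: each action is
# a (transformation, parameter) pair, and the reward is the simulated
# speedup over the unoptimized baseline. Nothing here touches MLIR; the
# dynamics are invented for illustration.
import random

TRANSFORMS = ["tile", "unroll", "vectorize", "interchange"]
FACTORS = [2, 4, 8]  # hypothetical per-transform parameter

class ToyScheduleEnv:
    def __init__(self, seed=0):
        self.rng = random.Random(seed)

    def reset(self):
        self.baseline_cost = self.rng.uniform(1.0, 10.0)  # simulated exec time
        self.cost = self.baseline_cost
        self.applied = []
        return self._state()

    def _state(self):
        # State: which transforms were applied, plus current relative cost.
        return [float(t in self.applied) for t in TRANSFORMS] + \
               [self.cost / self.baseline_cost]

    def step(self, action):
        transform, factor = action  # the "multi-action"
        if transform not in self.applied:
            # Simulated effect: each transform helps once, by a noisy factor.
            self.cost *= self.rng.uniform(0.6, 0.95) / (1 + 0.01 * factor)
            self.applied.append(transform)
        reward = self.baseline_cost / self.cost  # speedup so far
        done = len(self.applied) == len(TRANSFORMS)
        return self._state(), reward, done

# Random-policy rollout:
env = ToyScheduleEnv()
state, done = env.reset(), False
while not done:
    action = (random.choice(TRANSFORMS), random.choice(FACTORS))
    state, reward, done = env.step(action)
print(f"final simulated speedup: {reward:.2f}x")
```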

Curriculum Learning for Small Code Language Models

no code implementations14 Jul 2024 Marwa Naïr, Kamel Yamani, Lynda Said Lhadj, Riyadh Baghdadi

In this paper, we explore the potential of curriculum learning in enhancing the performance of small code language models.

Code Completion, Decoder
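
A minimal sketch of the curriculum idea, assuming token count as the difficulty measure (the paper may score difficulty differently): training data is ordered from easy to hard and released in stages.

```python
# Curriculum sketch: sort code samples by a difficulty proxy and release
# them to training in stages, easier data first. Token count as the
# difficulty score is an assumption made for this example.
def difficulty(sample: str) -> int:
    return len(sample.split())  # proxy: shorter programs are "easier"

def curriculum_stages(samples, n_stages=3):
    ordered = sorted(samples, key=difficulty)
    stage_size = max(1, len(ordered) // n_stages)
    for i in range(n_stages):
        # Each stage keeps earlier (easier) data and adds the next slice.
        end = len(ordered) if i == n_stages - 1 else (i + 1) * stage_size
        yield ordered[:end]

corpus = [
    "x = 1",
    "print('hi')",
    "for i in range(3): print(i)",
    "def f(a, b):\n    return a * b + a",
]
for stage, batch in enumerate(curriculum_stages(corpus)):
    print(f"stage {stage}: training on {len(batch)} samples")
    # train_model_on(batch)  # placeholder for the real fine-tuning step
```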

Leveraging High-Resolution Features for Improved Deep Hashing-based Image Retrieval

no code implementations20 Mar 2024 Aymene Berriche, Mehdi Adjal Zakaria, Riyadh Baghdadi

In this study, we explore the efficacy of employing high-resolution features learned through state-of-the-art techniques for image retrieval tasks.

Deep Hashing, Image Retrieval
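
As a rough illustration of deep-hashing retrieval, the sketch below binarizes feature vectors and ranks database items by Hamming distance. The random features stand in for embeddings from a high-resolution backbone, which is assumed rather than implemented:

```python
# Deep-hashing sketch: binarize feature vectors into k-bit codes and
# rank database items by Hamming distance to the query. Random features
# replace the (assumed) high-resolution backbone; the hashing projection
# is untrained.
import numpy as np

rng = np.random.default_rng(0)
n_db, dim, k_bits = 1000, 512, 64

features = rng.standard_normal((n_db, dim))   # stand-in for backbone features
proj = rng.standard_normal((dim, k_bits))     # hashing layer weights

def to_codes(x):
    return (x @ proj > 0).astype(np.uint8)    # sign of projection -> bits

db_codes = to_codes(features)
# Query: a slightly perturbed copy of database item 0.
query = features[:1] + 0.1 * rng.standard_normal((1, dim))
query_code = to_codes(query)

hamming = (db_codes ^ query_code).sum(axis=1)  # XOR counts differing bits
top5 = np.argsort(hamming)[:5]
print("nearest items:", top5, "distances:", hamming[top5])
```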

LOOPer: A Learned Automatic Code Optimizer For Polyhedral Compilers

no code implementations18 Mar 2024 Massinissa Merouani, Khaled Afif Boudaoud, Iheb Nassim Aouadj, Nassim Tchoulak, Islem Kara Bernou, Hamza Benyamina, Fatima Benbouzid-Si Tayeb, Karima Benatchba, Hugh Leather, Riyadh Baghdadi

In this paper, we introduce LOOPer, the first polyhedral autoscheduler that uses a deep-learning based cost model and covers a large set of affine transformations and programs.
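
The shape of such an autoscheduler can be sketched as a search over candidate transformation sequences ranked by a learned cost model. The transformation list and the stub cost model below are illustrative assumptions; LOOPer's real search space and model are far larger:

```python
# Cost-model-guided scheduling sketch: enumerate candidate transformation
# sequences and keep the one the model predicts to be fastest. The stub
# model below just multiplies fixed per-transform factors; a real system
# would query a trained network and search a vastly larger space.
import itertools

TRANSFORMS = ["interchange", "tile(32)", "unroll(4)", "parallelize"]

def predicted_cost(schedule):
    # Stand-in for a learned cost model.
    factor = {"interchange": 0.95, "tile(32)": 0.7,
              "unroll(4)": 0.9, "parallelize": 0.5}
    cost = 100.0
    for t in schedule:
        cost *= factor[t]
    return cost

candidates = (
    list(s)
    for r in range(len(TRANSFORMS) + 1)
    for s in itertools.permutations(TRANSFORMS, r)
)
best = min(candidates, key=predicted_cost)
print("predicted-best schedule:", best, "cost:", predicted_cost(best))
```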

Automatic Generation of Python Programs Using Context-Free Grammars

1 code implementation11 Mar 2024 Kamel Yamani, Marwa Naïr, Riyadh Baghdadi

In recent years, data has emerged as the new gold, serving as a powerful tool for creating intelligent systems.

Code Generation
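
The technique in the title can be illustrated with a toy context-free grammar expanded into random, syntactically valid Python snippets; the grammar below is a minimal stand-in for the paper's:

```python
# A toy context-free grammar expanded into random Python snippets.
# Nonterminals are the dict keys; everything else is emitted verbatim.
import random

GRAMMAR = {
    "<stmt>": [["<assign>"], ["<print>"], ["<assign>", "\n", "<stmt>"]],
    "<assign>": [["<var>", " = ", "<expr>"]],
    "<print>": [["print(", "<var>", ")"]],
    "<expr>": [["<var>"], ["<num>"], ["<expr>", " + ", "<expr>"]],
    "<var>": [["x"], ["y"]],
    "<num>": [["1"], ["2"], ["42"]],
}

def expand(symbol, rng):
    if symbol not in GRAMMAR:               # terminal
        return symbol
    production = rng.choice(GRAMMAR[symbol])
    return "".join(expand(s, rng) for s in production)

rng = random.Random(7)
program = expand("<stmt>", rng)
print(program)
compile(program, "<generated>", "exec")     # verify it parses as Python
```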

TIRAMISU: A Polyhedral Compiler for Dense and Sparse Deep Learning

no code implementations7 May 2020 Riyadh Baghdadi, Abdelkader Nadir Debbagh, Kamel Abdous, Fatima Zohra Benhamida, Alex Renda, Jonathan Elliott Frankle, Michael Carbin, Saman Amarasinghe

In this paper, we demonstrate a compiler that can optimize sparse and recurrent neural networks, both of which are currently outside the scope of existing neural network compilers (sparse neural networks here refer to networks that can be accelerated with sparse tensor algebra techniques).

Deep Learning
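
What "accelerated with sparse tensor algebra techniques" means in practice can be shown with a small sketch: a mostly-zero weight matrix stored in CSR form so the matmul touches only nonzeros. Sizes and sparsity are arbitrary choices for illustration:

```python
# "Sparse tensor algebra" in one picture: a mostly-zero weight matrix in
# CSR format makes the matmul cost proportional to the nonzeros rather
# than the full matrix. Sizes and the ~5% density are arbitrary.
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)
dense_w = rng.standard_normal((1024, 1024))
dense_w[rng.random((1024, 1024)) > 0.05] = 0.0  # keep ~5% of the weights
csr_w = sparse.csr_matrix(dense_w)              # compressed sparse rows

x = rng.standard_normal((1024, 64))
y_dense = dense_w @ x                           # dense kernel: O(n^2) work
y_sparse = csr_w @ x                            # sparse kernel: O(nnz) work
print("max abs difference:", np.abs(y_dense - y_sparse).max())
```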

GraphIt - A High-Performance DSL for Graph Analytics

4 code implementations2 May 2018 Yunming Zhang, Mengjiao Yang, Riyadh Baghdadi, Shoaib Kamil, Julian Shun, Saman Amarasinghe

This paper introduces GraphIt, a new DSL for graph computations that generates fast implementations for algorithms with different performance characteristics running on graphs with different sizes and structures.

Programming Languages
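
GraphIt's central idea is separating what a graph algorithm computes from how it is scheduled. The pure-Python PageRank below shows only the algorithm half; generating fast schedules (traversal direction, partitioning, parallelism) for code like this is GraphIt's contribution. The graph and damping factor are illustrative:

```python
# PageRank over an edge list, written as plain "algorithm" code. A DSL
# like GraphIt takes this kind of specification and generates optimized
# schedules for it; none of that tuning appears here.
edges = [(0, 1), (0, 2), (1, 2), (2, 0), (3, 2)]
n = 4
out_deg = [0] * n
for src, _ in edges:
    out_deg[src] += 1

rank = [1.0 / n] * n
damping = 0.85
for _ in range(20):
    contrib = [0.0] * n
    for src, dst in edges:                  # push-style traversal
        contrib[dst] += rank[src] / out_deg[src]
    rank = [(1 - damping) / n + damping * c for c in contrib]
print([round(r, 3) for r in rank])
```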

Tiramisu: A Polyhedral Compiler for Expressing Fast and Portable Code

3 code implementations27 Apr 2018 Riyadh Baghdadi, Jessica Ray, Malek Ben Romdhane, Emanuele Del Sozzo, Abdurrahman Akkas, Yunming Zhang, Patricia Suriana, Shoaib Kamil, Saman Amarasinghe

This paper introduces Tiramisu, a polyhedral framework designed to generate high-performance code for multiple platforms, including multicores, GPUs, and distributed machines.

Scheduling
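
One flavor of the transformations such a framework applies is loop tiling, which reorders iterations for cache locality without changing the result. A minimal sketch, with arbitrary sizes:

```python
# Loop tiling: the same computation, with iterations regrouped into
# cache-friendly blocks. N and TILE are arbitrary.
N, TILE = 64, 16
a = [[i * N + j for j in range(N)] for i in range(N)]

# Original loop nest.
out1 = [[a[i][j] * 2 for j in range(N)] for i in range(N)]

# Tiled loop nest: identical results, different iteration order.
out2 = [[0] * N for _ in range(N)]
for ii in range(0, N, TILE):
    for jj in range(0, N, TILE):
        for i in range(ii, min(ii + TILE, N)):
            for j in range(jj, min(jj + TILE, N)):
                out2[i][j] = a[i][j] * 2

assert out1 == out2
print("tiled loop nest matches the original")
```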
