Search Results for author: Ted Zadouri

Found 2 papers, 1 paper with code

Pushing Mixture of Experts to the Limit: Extremely Parameter Efficient MoE for Instruction Tuning

1 code implementation • 11 Sep 2023 • Ted Zadouri, Ahmet Üstün, Arash Ahmadian, Beyza Ermiş, Acyr Locatelli, Sara Hooker

The Mixture of Experts (MoE) is a widely known neural architecture where an ensemble of specialized sub-models optimizes overall performance with a constant computational cost.
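To make the MoE idea above concrete, here is a minimal sketch of a mixture-of-experts layer with a dense (soft) gate over a few expert MLPs. It is an illustration only, not the paper's parameter-efficient method; all dimensions, expert counts, and names are assumptions.

```python
# Minimal dense-gated Mixture-of-Experts sketch (illustrative, not the paper's method).
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

class MoELayer:
    def __init__(self, d_model=16, d_hidden=32, n_experts=4):
        # One gating matrix plus a small two-layer ReLU MLP per expert (hypothetical sizes).
        self.w_gate = rng.normal(scale=0.02, size=(d_model, n_experts))
        self.experts = [
            (rng.normal(scale=0.02, size=(d_model, d_hidden)),
             rng.normal(scale=0.02, size=(d_hidden, d_model)))
            for _ in range(n_experts)
        ]

    def __call__(self, x):
        # x: (batch, d_model). The gate assigns each input a distribution over experts.
        gate = softmax(x @ self.w_gate)                          # (batch, n_experts)
        outs = np.stack([np.maximum(x @ w1, 0.0) @ w2            # each expert's output
                         for w1, w2 in self.experts], axis=1)    # (batch, n_experts, d_model)
        # The layer output is the gate-weighted combination of expert outputs.
        return (gate[..., None] * outs).sum(axis=1)              # (batch, d_model)

layer = MoELayer()
y = layer(rng.normal(size=(8, 16)))
print(y.shape)  # (8, 16)
```

In practice, sparse top-k routing is often used so only a few experts run per input, which is what keeps the computational cost roughly constant as experts are added.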

High Probability Bounds for Stochastic Continuous Submodular Maximization

no code implementations • 20 Mar 2023 • Evan Becker, Jingdong Gao, Ted Zadouri, Baharan Mirzasoleiman

This implies that for a particular run of the algorithms, the solution may be much worse than the guarantee provided in expectation.
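A toy numerical illustration (not from the paper) of why an in-expectation guarantee can be loose for a single run: a randomized procedure whose average objective clears a (1 - 1/e)·OPT-style bound can still produce many individual runs that fall well below it. The per-run values below are made up.

```python
# Expectation guarantee vs. a single run: a hypothetical randomized algorithm.
import numpy as np

rng = np.random.default_rng(1)
opt = 1.0
guarantee = (1 - 1 / np.e) * opt  # ~0.632, a typical in-expectation bound

# Hypothetical per-run objective values: usually decent, occasionally very poor.
values = np.where(rng.random(100_000) < 0.9, 0.70 * opt, 0.20 * opt)

print(f"mean objective        : {values.mean():.3f}")                # ~0.65, above the bound
print(f"guarantee (1-1/e)*OPT : {guarantee:.3f}")
print(f"runs below guarantee  : {(values < guarantee).mean():.1%}")  # ~10% of runs
```

High-probability bounds of the kind the paper studies aim to close exactly this gap between the average-case guarantee and what a single run delivers.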

