Search Results for author: Marius Cobzarenco

Found 2 papers, 0 papers with code

Latte: Latent Attention for Linear Time Transformers

no code implementations · 27 Feb 2024 · Rares Dolga, Lucas Maystre, Marius Cobzarenco, David Barber

The time complexity of the standard attention mechanism in transformers scales quadratically with sequence length.

Text Generation
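
The abstract excerpt above refers to the quadratic cost of standard attention. As a point of reference only (this is not the Latte mechanism proposed in the paper), the sketch below computes standard scaled dot-product attention in NumPy; the (n, n) score matrix is what makes time and memory grow quadratically with sequence length n. All names and shapes are illustrative.

```python
# Minimal sketch (not the Latte method): standard scaled dot-product attention.
# The intermediate (n, n) score matrix is the source of the quadratic cost.
import numpy as np

def standard_attention(Q, K, V):
    """Q, K, V: arrays of shape (n, d). Returns outputs of shape (n, d)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                     # (n, n) -- quadratic in n
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V                                # (n, d)

# Doubling n quadruples the size of the score matrix.
n, d = 512, 64
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
print(standard_attention(Q, K, V).shape)  # (512, 64)
```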

Generalized Multiple Intent Conditioned Slot Filling

no code implementations · 18 May 2023 · Harshil Shah, Arthur Wilcke, Marius Cobzarenco, Cristi Cobzarenco, Edward Challis, David Barber

Natural language understanding includes the tasks of intent detection (identifying a user's objectives) and slot filling (extracting the entities relevant to those objectives).

Intent Detection · Language Modeling · +5
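
To make the two tasks in the abstract excerpt concrete, here is a hypothetical example of an utterance carrying more than one intent, each with its own slots. The utterance, intent labels, and slot names are invented for illustration and are not taken from the paper.

```python
# Hypothetical illustration of intent detection and slot filling when a single
# utterance expresses multiple intents; all labels below are invented.
utterance = "Book a table for two at Nando's and order me a taxi there for 7pm"

# Intent detection: identify each of the user's objectives.
# Slot filling: extract the entities relevant to each objective.
predictions = [
    {"intent": "book_restaurant",
     "slots": {"party_size": "two", "restaurant": "Nando's"}},
    {"intent": "order_taxi",
     "slots": {"destination": "Nando's", "time": "7pm"}},
]

for p in predictions:
    print(p["intent"], p["slots"])
```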
