no code implementations • 5 Dec 2021 • Benjamin Klein, Lior Wolf
In this work, a hierarchical model, Graph Query Expansion (GQE), is presented. It is learned in a supervised manner and aggregates over an extended neighborhood of the query, exploiting the structure of the nearest-neighbor graph to increase the amount of database information used when computing the query expansion.
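For context, a minimal sketch of classic average query expansion over a k-NN graph, the unsupervised baseline that a learned, hierarchical model like GQE generalizes. The function name and the choice of cosine similarity are illustrative assumptions, not the paper's method.

```python
import numpy as np

def average_query_expansion(query, database, k=3):
    """Classic (unsupervised) average query expansion: aggregate the query
    with its k nearest database descriptors, then use the result to re-query.
    A learned model such as GQE replaces this fixed aggregation with a
    supervised, hierarchical one over an extended k-NN-graph neighborhood."""
    # L2-normalize so the dot product equals cosine similarity
    q = query / np.linalg.norm(query)
    db = database / np.linalg.norm(database, axis=1, keepdims=True)
    sims = db @ q
    nn = np.argsort(-sims)[:k]          # indices of the k nearest neighbors
    expanded = q + db[nn].sum(axis=0)   # aggregate query + neighbors
    expanded /= np.linalg.norm(expanded)
    return expanded

# toy usage: query is a perturbed copy of a database descriptor
rng = np.random.default_rng(0)
db = rng.normal(size=(100, 64))
q = db[0] + 0.1 * rng.normal(size=64)
q_exp = average_query_expansion(q, db, k=3)
```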
no code implementations • 11 Feb 2021 • Austin Z. Henley, Julian Ball, Benjamin Klein, Aiden Rutter, Dylan Lee
Novice programmers face numerous barriers while learning to code, which may deter them from pursuing a computer science degree or a career in software development.
Software Engineering
no code implementations • CVPR 2019 • Benjamin Klein, Lior Wolf
To our knowledge, this is the first work to introduce a dictionary-based representation that is inspired by Product Quantization and learned end-to-end, thus benefiting from the supervised signal.
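For reference, a sketch of classic Product Quantization with hard codeword assignment; the cited work differs in that it learns the dictionaries end-to-end with a differentiable assignment, which this sketch does not implement. Names and shapes here are assumptions for illustration.

```python
import numpy as np

def pq_encode(x, codebooks):
    """Classic Product Quantization (hard assignment): split the vector
    into M sub-vectors and replace each by its nearest codeword.
    codebooks: (M, K, ds) array, i.e. M codebooks of K codewords each.
    The end-to-end variant instead uses a soft, differentiable assignment
    so the codebooks can be trained from the supervised signal."""
    M, K, ds = codebooks.shape
    subs = x.reshape(M, ds)
    # nearest codeword per sub-vector (squared Euclidean distance)
    codes = np.array([np.argmin(((codebooks[m] - subs[m]) ** 2).sum(1))
                      for m in range(M)])
    # quantized reconstruction: concatenate the selected codewords
    recon = np.stack([codebooks[m, codes[m]] for m in range(M)]).ravel()
    return codes, recon

# toy usage: a 64-d vector split into M=4 sub-vectors of 16 dims
rng = np.random.default_rng(2)
cb = rng.normal(size=(4, 8, 16))
x = rng.normal(size=64)
codes, recon = pq_encode(x, cb)
```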
no code implementations • 12 Dec 2015 • Guy Lev, Gil Sadeh, Benjamin Klein, Lior Wolf
Recurrent Neural Networks (RNNs) have had considerable success in classifying and predicting sequences.
no code implementations • CVPR 2015 • Benjamin Klein, Guy Lev, Gil Sadeh, Lior Wolf
In this work, we use the Fisher Vector as a sentence representation, pooling the word2vec embedding of each word in the sentence.
Ranked #14 on Video Retrieval on YouCook2
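The pooling step can be sketched as follows: each word embedding is softly assigned to GMM components, and the normalized gradients are summed into one fixed-length vector. This simplified version keeps only the gradients with respect to the means (a full Fisher Vector also includes weight and variance terms); function and variable names are illustrative.

```python
import numpy as np

def fisher_vector_means(X, pi, mu, sigma):
    """Simplified Fisher Vector: gradients w.r.t. the GMM means only.
    X: (n_words, d) word2vec embeddings of one sentence.
    pi: (K,) mixture weights; mu, sigma: (K, d) means and std devs."""
    n, d = X.shape
    # posterior (soft assignment) of each word to each diagonal Gaussian
    log_p = -0.5 * (((X[:, None, :] - mu[None]) / sigma[None]) ** 2).sum(-1)
    log_p += np.log(pi)[None] - np.log(sigma).sum(-1)[None]
    log_p -= log_p.max(axis=1, keepdims=True)   # numerical stability
    gamma = np.exp(log_p)
    gamma /= gamma.sum(axis=1, keepdims=True)
    # normalized gradient w.r.t. the means, pooled over all words
    diff = (X[:, None, :] - mu[None]) / sigma[None]
    fv = (gamma[:, :, None] * diff).sum(0) / (n * np.sqrt(pi)[:, None])
    return fv.ravel()   # sentence representation of size K * d

# toy usage: 5 "word embeddings" of dim 4, GMM with K=3 components
rng = np.random.default_rng(1)
X = rng.normal(size=(5, 4))
fv = fisher_vector_means(X, np.full(3, 1 / 3),
                         rng.normal(size=(3, 4)), np.ones((3, 4)))
```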
no code implementations • CVPR 2015 • Benjamin Klein, Lior Wolf, Yehuda Afek
In contrast, the dynamic convolutional layer uses filters that vary from input to input at test time.
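The idea can be sketched in 1-D: instead of a fixed learned kernel, a filter-generating network produces the kernel from the input itself. Here the generating network is collapsed to a single linear map of a scalar summary of the input, which is an assumption for brevity, not the paper's architecture.

```python
import numpy as np

def dynamic_conv1d(x, W_gen, b_gen):
    """Dynamic convolution (sketch): the filter is not a fixed parameter
    but is produced by a filter-generating network conditioned on the
    input, so it varies from input to input at test time.
    W_gen, b_gen: (ksize,) parameters of the toy generating network."""
    feat = x.mean()                 # toy scalar summary of the input
    filt = W_gen * feat + b_gen     # input-dependent filter of size ksize
    # slide the generated filter over the input ('valid' correlation)
    out = np.convolve(x, filt[::-1], mode="valid")
    return out, filt

# two different inputs yield two different filters
x1 = np.arange(8.0)
x2 = -x1
W = np.array([0.5, -1.0, 0.25])
b = np.array([0.1, 0.0, -0.1])
out1, f1 = dynamic_conv1d(x1, W, b)
out2, f2 = dynamic_conv1d(x2, W, b)
```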
no code implementations • 26 Nov 2014 • Benjamin Klein, Guy Lev, Gil Sadeh, Lior Wolf
The second mixture model presented is the Hybrid Gaussian-Laplacian Mixture Model (HGLMM), which is based on a weighted geometric mean of the Gaussian and Laplacian distributions.
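The weighted geometric mean of two densities corresponds, in log space, to a convex combination of their log-densities. A minimal per-dimension sketch, ignoring the normalization constant of the hybrid density (the function name and shared location/scale parameters are simplifying assumptions):

```python
import numpy as np

def hybrid_log_density(x, mu, sigma, b):
    """Unnormalized log-density of one HGLMM-style hybrid component:
    a weighted geometric mean of a Gaussian and a Laplacian,
        log p(x) = b * log N(x | mu, sigma) + (1 - b) * log L(x | mu, sigma),
    with b in [0, 1]: b = 1 is pure Gaussian, b = 0 pure Laplacian."""
    log_gauss = -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)
    log_lap = -np.log(2 * sigma) - np.abs(x - mu) / sigma
    return b * log_gauss + (1 - b) * log_lap
```

At the endpoints the hybrid reduces exactly to its two parent distributions, which is what makes the geometric-mean interpolation convenient.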