Search Results for author: Sebastian Bruch

Found 11 papers, 3 papers with code

Learning Groupwise Multivariate Scoring Functions Using Deep Neural Networks

2 code implementations · 11 Nov 2018 · Qingyao Ai, Xuanhui Wang, Sebastian Bruch, Nadav Golbandi, Michael Bendersky, Marc Najork

To overcome this limitation, we propose a new framework for multivariate scoring functions, in which the relevance score of a document is determined jointly by multiple documents in the list.

Learning-To-Rank
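
The groupwise idea above can be pictured with a toy sketch: a scoring function that consumes a small group of documents' feature vectors and emits one score per document, so every score depends on the whole group. This is a minimal NumPy illustration with made-up sizes and random weights, not the GSF architecture from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    GROUP_SIZE, N_FEATURES, HIDDEN = 3, 4, 8
    # Toy parameters of a tiny feed-forward "groupwise" scorer (randomly initialized).
    W1 = rng.normal(size=(GROUP_SIZE * N_FEATURES, HIDDEN))
    W2 = rng.normal(size=(HIDDEN, GROUP_SIZE))

    def groupwise_scores(group_features):
        """Score a group of documents jointly: input is (GROUP_SIZE, N_FEATURES),
        output is one score per document in the group."""
        x = group_features.reshape(-1)   # concatenate the group's features
        h = np.tanh(x @ W1)              # shared hidden layer
        return h @ W2                    # one score per group member

    docs = rng.normal(size=(GROUP_SIZE, N_FEATURES))
    print(groupwise_scores(docs))  # each score depends on every document in the group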

Yggdrasil Decision Forests: A Fast and Extensible Decision Forests Library

1 code implementation · 6 Dec 2022 · Mathieu Guillame-Bert, Sebastian Bruch, Richard Stotz, Jan Pfeifer

Yggdrasil Decision Forests is a library for the training, serving and interpretation of decision forest models, targeted both at research and production work, implemented in C++, and available in C++, command line interface, Python (under the name TensorFlow Decision Forests), JavaScript, Go, and Google Sheets (under the name Simple ML for Sheets).
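
A minimal usage sketch of the Python packaging mentioned above (TensorFlow Decision Forests); the tiny DataFrame, column names, and default hyperparameters are made up for illustration.

    import pandas as pd
    import tensorflow_decision_forests as tfdf  # Python packaging of Yggdrasil Decision Forests

    # Tiny made-up dataset: two numeric features and a binary label.
    train_df = pd.DataFrame({
        "f1": [0.1, 0.4, 0.35, 0.8, 0.9, 0.2],
        "f2": [1.0, 0.2, 0.6, 0.1, 0.3, 0.9],
        "label": [0, 1, 1, 1, 0, 0],
    })
    train_ds = tfdf.keras.pd_dataframe_to_tf_dataset(train_df, label="label")

    model = tfdf.keras.GradientBoostedTreesModel()  # or tfdf.keras.RandomForestModel()
    model.fit(train_ds)
    model.summary()  # prints the trained trees and variable importances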

An Alternative Cross Entropy Loss for Learning-to-Rank

no code implementations · 22 Nov 2019 · Sebastian Bruch

Listwise learning-to-rank methods form a powerful class of ranking algorithms that are widely adopted in applications such as information retrieval.

Information Retrieval · Learning-To-Rank · +1
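
For context, the sketch below implements the standard listwise softmax cross-entropy (ListNet-style) that this line of work builds on; it is not the alternative loss proposed in the paper, and the softmax over relevance labels is just one common choice of target distribution.

    import numpy as np

    def softmax(x):
        z = x - x.max()
        e = np.exp(z)
        return e / e.sum()

    def listwise_softmax_cross_entropy(scores, relevances):
        """Standard listwise softmax cross entropy: compare the score distribution
        over the list with a distribution derived from graded relevance labels."""
        p_scores = softmax(np.asarray(scores, dtype=float))
        p_labels = softmax(np.asarray(relevances, dtype=float))  # one common target choice
        return -np.sum(p_labels * np.log(p_scores + 1e-12))

    # One query with three candidate documents (toy numbers).
    print(listwise_softmax_cross_entropy(scores=[2.0, 0.5, -1.0], relevances=[3, 1, 0]))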

Modeling Text with Decision Forests using Categorical-Set Splits

no code implementations · 21 Sep 2020 · Mathieu Guillame-Bert, Sebastian Bruch, Petr Mitrichev, Petr Mikheev, Jan Pfeifer

We define a condition that is specific to categorical-set features -- defined as an unordered set of categorical variables -- and present an algorithm to learn it, thereby equipping decision forests with the ability to directly model text, albeit without preserving sequential order.

Text Classification
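
A categorical-set condition of the kind described above can be pictured as a test of the form "does the example's unordered token set intersect a node's item set?". The sketch below only evaluates such a condition on toy data with a made-up item set; learning the condition, as the paper does, is not shown.

    # Toy categorical-set split: route an example left if its (unordered) token set
    # intersects the item set attached to the node, otherwise right.
    SPLIT_ITEMS = {"refund", "cancel", "return"}   # hypothetical learned item set

    def categorical_set_condition(tokens):
        return bool(set(tokens) & SPLIT_ITEMS)

    examples = [
        ["i", "want", "a", "refund"],      # goes left: contains "refund"
        ["great", "product", "thanks"],    # goes right: no overlap
    ]
    for tokens in examples:
        branch = "left" if categorical_set_condition(tokens) else "right"
        print(tokens, "->", branch)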

An Analysis of Fusion Functions for Hybrid Retrieval

no code implementations · 21 Oct 2022 · Sebastian Bruch, Siyu Gai, Amir Ingber

In particular, we examine fusion by a convex combination (CC) of lexical and semantic scores, as well as the Reciprocal Rank Fusion (RRF) method, and identify their advantages and potential pitfalls.

Retrieval · Text Retrieval
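
Both fusion functions named above are simple to state. A small sketch follows, assuming each system returns scores (or a ranking) keyed by document ID; the min-max normalization and the RRF constant k=60 are common conventions rather than necessarily the paper's exact choices.

    def convex_combination(lexical, semantic, alpha=0.5):
        """Fuse min-max-normalized lexical and semantic scores: alpha*lex + (1-alpha)*sem."""
        def normalize(scores):
            lo, hi = min(scores.values()), max(scores.values())
            return {d: (s - lo) / (hi - lo) if hi > lo else 0.0 for d, s in scores.items()}
        lex, sem = normalize(lexical), normalize(semantic)
        docs = set(lex) | set(sem)
        return {d: alpha * lex.get(d, 0.0) + (1 - alpha) * sem.get(d, 0.0) for d in docs}

    def reciprocal_rank_fusion(rankings, k=60):
        """RRF: score(d) = sum over rankings of 1 / (k + rank of d), ranks starting at 1."""
        fused = {}
        for ranking in rankings:
            for rank, doc in enumerate(ranking, start=1):
                fused[doc] = fused.get(doc, 0.0) + 1.0 / (k + rank)
        return fused

    lexical = {"d1": 12.3, "d2": 8.1, "d3": 2.0}
    semantic = {"d1": 0.62, "d2": 0.70, "d4": 0.55}
    print(convex_combination(lexical, semantic, alpha=0.7))
    print(reciprocal_rank_fusion([["d1", "d2", "d3"], ["d2", "d1", "d4"]]))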

An Approximate Algorithm for Maximum Inner Product Search over Streaming Sparse Vectors

no code implementations · 25 Jan 2023 · Sebastian Bruch, Franco Maria Nardini, Amir Ingber, Edo Liberty

To achieve optimal memory footprint and query latency, existing retrieval algorithms rely on the near stationarity of documents and on laws governing natural languages.

Information Retrieval · Retrieval
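
As background for the sparse setting above, the sketch below performs exact MIPS over sparse vectors by accumulating inner products through an inverted index (one posting list per coordinate); the paper's approximate, streaming-friendly algorithm is not reproduced here.

    from collections import defaultdict

    # Sparse vectors as {coordinate: value} dicts (toy collection).
    documents = {
        "d1": {3: 0.5, 17: 1.2},
        "d2": {3: 0.1, 42: 0.9},
        "d3": {17: 0.7, 42: 0.4},
    }

    # Inverted index: coordinate -> list of (doc_id, value).
    index = defaultdict(list)
    for doc_id, vec in documents.items():
        for coord, value in vec.items():
            index[coord].append((doc_id, value))

    def exact_mips(query, top_k=2):
        """Exact maximum inner product search by accumulating scores coordinate by coordinate."""
        scores = defaultdict(float)
        for coord, q_value in query.items():
            for doc_id, d_value in index.get(coord, []):
                scores[doc_id] += q_value * d_value
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]

    print(exact_mips({3: 1.0, 42: 0.5}))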

Efficient and Effective Tree-based and Neural Learning to Rank

no code implementations · 15 May 2023 · Sebastian Bruch, Claudio Lucchese, Franco Maria Nardini

We believe that by understanding the fundamentals underpinning these algorithmic and data structure solutions for containing the contentious relationship between efficiency and effectiveness, one can better identify future directions and more efficiently determine the merits of ideas.

Information Retrieval · Learning-To-Rank · +1

Bridging Dense and Sparse Maximum Inner Product Search

no code implementations · 16 Sep 2023 · Sebastian Bruch, Franco Maria Nardini, Amir Ingber, Edo Liberty

Maximum inner product search (MIPS) over dense and sparse vectors has progressed independently in a bifurcated literature for decades; the latter is better known as top-$k$ retrieval in Information Retrieval.

Dimensionality Reduction · Information Retrieval · +1
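
One way to bridge the two settings, consistent with the Dimensionality Reduction tag above, is to project sparse vectors into a low-dimensional dense space so that inner products are approximately preserved; the random-projection sketch below illustrates that general idea, not the paper's specific construction.

    import numpy as np

    rng = np.random.default_rng(0)
    VOCAB, SKETCH_DIM = 10_000, 256

    # Random projection matrix (Johnson-Lindenstrauss style), entries i.i.d. Gaussian,
    # scaled so inner products are preserved in expectation.
    R = rng.normal(scale=1.0 / np.sqrt(SKETCH_DIM), size=(SKETCH_DIM, VOCAB))

    def to_dense_sketch(sparse_vec):
        """Project a sparse {coordinate: value} vector into a dense SKETCH_DIM vector."""
        out = np.zeros(SKETCH_DIM)
        for coord, value in sparse_vec.items():
            out += value * R[:, coord]
        return out

    q = {12: 1.0, 4096: 0.5}
    d = {12: 0.8, 777: 1.5}
    exact = sum(q.get(c, 0.0) * v for c, v in d.items())
    approx = float(to_dense_sketch(q) @ to_dense_sketch(d))
    print(exact, approx)  # approx concentrates around exact as SKETCH_DIM grows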

Foundations of Vector Retrieval

no code implementations · 17 Jan 2024 · Sebastian Bruch

Vectors are universal mathematical objects that can represent text, images, speech, or a mix of these data modalities.

Retrieval
