Search Results for author: Chris Wendler

Found 9 papers, 5 papers with code

Do Llamas Work in English? On the Latent Language of Multilingual Transformers

1 code implementation • 16 Feb 2024 • Chris Wendler, Veniamin Veselovsky, Giovanni Monea, Robert West

Tracking intermediate embeddings through their high-dimensional space reveals three distinct phases, whereby intermediate embeddings (1) start far away from output token embeddings; (2) already allow for decoding a semantically correct next token in the middle layers, but give higher probability to its version in English than in the input language; (3) finally move into an input-language-specific region of the embedding space.
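
As a rough illustration of this kind of layer-by-layer probing (a "logit lens"-style readout, not necessarily the paper's exact pipeline), the sketch below decodes each layer's last-position hidden state with the model's unembedding matrix via Hugging Face transformers; the checkpoint name and prompt are placeholders.

```python
# Hedged sketch of a logit-lens-style probe: decode every layer's hidden state
# at the last position with the model's output embedding (unembedding) matrix.
# Checkpoint name and prompt are placeholders; any Llama-style HF model works.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-7b-hf"  # placeholder; this repo is gated
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)
model.eval()

prompt = 'Français: "fleur" - English: "'   # translation-style prompt
inputs = tok(prompt, return_tensors="pt")

with torch.no_grad():
    out = model(**inputs, output_hidden_states=True)

hidden = out.hidden_states  # tuple of (num_layers + 1) tensors, [1, seq_len, d_model]
for layer, h in enumerate(hidden):
    h_last = h[0, -1]                        # last-position embedding at this layer
    if layer < len(hidden) - 1:
        h_last = model.model.norm(h_last)    # final RMSNorm; last entry is already normalized
    logits = model.lm_head(h_last)           # project onto the vocabulary
    print(layer, repr(tok.decode([int(logits.argmax())])))
```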

Sketch-Guided Constrained Decoding for Boosting Blackbox Large Language Models without Logit Access

no code implementations • 18 Jan 2024 • Saibo Geng, Berkay Döner, Chris Wendler, Martin Josifoski, Robert West

This paper introduces sketch-guided constrained decoding (SGCD), a novel approach to constrained decoding for blackbox LLMs that operates without access to the LLM's logits.

Constituency Parsing • Language Modelling +1
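
A heavily simplified toy sketch of the two-stage idea (not the paper's implementation): an unconstrained "sketch" is obtained from a blackbox, text-only API, and a separate post-hoc stage enforces the constraints. `call_blackbox_llm` and the relation whitelist are hypothetical stand-ins for a real API and a real grammar.

```python
# Toy sketch of the two-stage idea behind sketch-guided constrained decoding:
# (1) an unconstrained "sketch" comes from a blackbox LLM (no logit access),
# (2) a separate constrained stage maps the sketch onto the allowed output space.
# `call_blackbox_llm` is a hypothetical stub; the constraint here is a toy
# closed set of relations, standing in for the grammars used for such tasks.

def call_blackbox_llm(prompt: str) -> str:
    # Placeholder for an API call that only returns text, never logits.
    return "Geoffrey Hinton | employer | University of Toronto."

ALLOWED_RELATIONS = {"employer", "birthplace", "award"}

def constrain(sketch: str) -> list[tuple[str, str, str]]:
    """Keep only triplets whose relation belongs to the allowed set."""
    triplets = []
    for part in sketch.split("."):
        fields = [f.strip() for f in part.split("|")]
        if len(fields) == 3 and fields[1] in ALLOWED_RELATIONS:
            triplets.append(tuple(fields))
    return triplets

sketch = call_blackbox_llm("Extract triplets: Geoffrey Hinton works at the University of Toronto.")
print(constrain(sketch))
```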

Learning DAGs from Data with Few Root Causes

1 code implementation • NeurIPS 2023 • Panagiotis Misiakos, Chris Wendler, Markus Püschel

We prove identifiability in this new setting and show that the true DAG is the global minimizer of the $L^0$-norm of the vector of root causes.
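
One way to read this setting (my assumption, not the paper's code) is a linear model in which a sparse matrix of root causes C is propagated through a weighted DAG A, so that X = C(I - A)^{-1}. The numpy sketch below generates such data and checks that inverting with the true DAG recovers a root-cause matrix with small L0-norm per sample; all sizes and weights are illustrative.

```python
# Minimal numpy sketch: data generated by propagating a sparse matrix of
# "root causes" C through a weighted DAG A, X = C (I - A)^{-1}; with the true
# A, C = X (I - A) is recovered and its L0-norm (number of nonzeros) is small.
import numpy as np

rng = np.random.default_rng(0)
d, n, k = 6, 200, 2          # nodes, samples, root causes per sample (illustrative)

# Upper-triangular weighted adjacency -> acyclic by construction.
A = np.triu(rng.uniform(0.5, 1.0, (d, d)), k=1) * (rng.random((d, d)) < 0.4)

# Few root causes: k nonzero entries per sample.
C = np.zeros((n, d))
for i in range(n):
    C[i, rng.choice(d, size=k, replace=False)] = rng.uniform(1, 2, k)

X = C @ np.linalg.inv(np.eye(d) - A)      # propagate causes through the DAG

C_rec = X @ (np.eye(d) - A)               # invert with the true DAG
print("mean L0 per sample:", np.mean(np.sum(np.abs(C_rec) > 1e-8, axis=1)))  # == k
```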

Causal Fourier Analysis on Directed Acyclic Graphs and Posets

no code implementations • 16 Sep 2022 • Bastian Seifert, Chris Wendler, Markus Püschel

Specifically, we model the spread of an infection on such a DAG obtained from real-world contact tracing data and learn the infection signal from samples assuming sparsity in the Fourier domain.
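
As a hedged illustration of signals that are sparse in a DAG Fourier domain, the snippet below uses one common convention (the reachability/zeta transform with Möbius inversion), which may differ from the paper's exact transform: a node signal is synthesized from a spectrum with a single nonzero "source" and then recovered by inverting the reachability matrix.

```python
# Hedged sketch (one convention, not necessarily the paper's transform): take
# the DAG's reachability ("zeta") matrix Z, synthesize a node signal
# s = Z @ s_hat from a sparse spectrum s_hat, and recover s_hat by Moebius
# inversion s_hat = Z^{-1} s. Sparsity of s_hat models few infection sources.
import numpy as np

# Small DAG on 5 nodes, edges i -> j, nodes already in topological order.
adj = np.zeros((5, 5), dtype=int)
for i, j in [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]:
    adj[i, j] = 1

# Transitive closure including self: reach[i, j] = 1 iff i reaches j.
reach = np.eye(5, dtype=int)
P = adj.copy()
for _ in range(5):
    reach = (reach + P > 0).astype(int)
    P = (P @ adj > 0).astype(int)
Z = reach.T        # Z[j, i] = 1 iff i == j or i is an ancestor of j

s_hat = np.zeros(5)
s_hat[0] = 1.0                            # single "source" -> sparse spectrum
s = Z @ s_hat                             # observed signal on the DAG
print("recovered spectrum:", np.linalg.inv(Z) @ s)
```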

Instance-wise algorithm configuration with graph neural networks

1 code implementation • 10 Feb 2022 • Romeo Valentin, Claudio Ferrari, Jérémy Scheurer, Andisheh Amrollahi, Chris Wendler, Max B. Paulus

We pose this task as a supervised learning problem: First, we compile a large dataset of the solver performance for various configurations and all provided MILP instances.

Combinatorial Optimization
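
A toy version of that supervised setup, with synthetic features and a plain random-forest regressor standing in for the paper's graph neural network: build a table of (instance features, configuration, runtime) triples, fit a performance model, and select the configuration with the best predicted runtime for each new instance. All data below is synthetic.

```python
# Toy sketch of instance-wise algorithm configuration as supervised learning.
# A RandomForestRegressor stands in for the paper's GNN; features are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_inst, n_conf, d_inst, d_conf = 100, 8, 5, 3

inst_feats = rng.normal(size=(n_inst, d_inst))   # stand-ins for MILP instance features
configs = rng.normal(size=(n_conf, d_conf))      # stand-ins for solver configurations

def runtime(x, c):
    # Synthetic solver runtime with an instance-configuration interaction.
    return float(np.abs(x[:3] @ c) + 0.1 * rng.normal())

X, y = [], []
for x in inst_feats:
    for c in configs:
        X.append(np.concatenate([x, c]))
        y.append(runtime(x, c))

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(np.array(X), np.array(y))

# Instance-wise selection: predict the runtime of every config for a new instance.
x_new = rng.normal(size=d_inst)
preds = model.predict(np.array([np.concatenate([x_new, c]) for c in configs]))
print("chosen configuration:", int(np.argmin(preds)))
```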

Learning Set Functions that are Sparse in Non-Orthogonal Fourier Bases

3 code implementations • 1 Oct 2020 • Chris Wendler, Andisheh Amrollahi, Bastian Seifert, Andreas Krause, Markus Püschel

Many applications of machine learning on discrete domains, such as learning preference functions in recommender systems or auctions, can be reduced to estimating a set function that is sparse in the Fourier domain.

Recommendation Systems
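
A small hedged sketch of the underlying object, assuming one concrete non-orthogonal basis (the subset/Möbius basis, where s(A) is the sum of coefficients over all B ⊆ A): the coefficients of a toy coverage function are computed by Möbius inversion, and only a few of them turn out to be nonzero.

```python
# Hedged sketch using one concrete non-orthogonal basis (the subset / Moebius
# basis, s(A) = sum_{B subset of A} s_hat(B)): compute the coefficients s_hat
# of a small coverage function by Moebius inversion and count the nonzeros;
# coverage-like utilities tend to be sparse in such bases.
from itertools import combinations

N = range(4)                                     # ground set {0, 1, 2, 3}
items = {0: {"a", "b"}, 1: {"b", "c"}, 2: {"d"}, 3: {"e"}}

def cover(A):
    """Number of distinct elements covered by the sets indexed by A."""
    return len(set().union(*[items[i] for i in A])) if A else 0

subsets = [frozenset(c) for r in range(5) for c in combinations(N, r)]

# Moebius transform: s_hat(B) = sum_{A subset of B} (-1)^{|B \ A|} s(A).
s_hat = {}
for B in subsets:
    s_hat[B] = sum((-1) ** len(B - A) * cover(A) for A in subsets if A <= B)

nonzero = [B for B, v in s_hat.items() if abs(v) > 1e-9]
print(f"{len(nonzero)} of {len(subsets)} Fourier coefficients are nonzero")
```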

Discrete Signal Processing with Set Functions

no code implementations • 28 Jan 2020 • Markus Püschel, Chris Wendler

Set functions are functions (or signals) indexed by the powerset (set of all subsets) of a finite set N. They are fundamental and ubiquitous in many application domains and have been used, for example, to formally describe or quantify loss functions for semantic image segmentation, the informativeness of sensors in sensor networks, the utility of sets of items in recommender systems, cooperative games in game theory, or bidders in combinatorial auctions.

Image Segmentation • Informativeness +2
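
A minimal sketch of the data structure being described: a set function on a ground set of n elements stored as a length-2^n vector indexed by subset bitmasks. The graph cut function used here is my own example in the spirit of the segmentation losses mentioned above, not the paper's.

```python
# Minimal sketch: a set function stored as a length-2**n vector indexed by
# subset bitmasks (bit i of the index set <=> element i is in the subset).
# The example function is a graph cut on a small 4-cycle (illustrative choice).
import numpy as np

n = 4
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]

def cut(mask: int) -> int:
    """Number of edges crossing the subset encoded by `mask`."""
    return sum(((mask >> u) & 1) != ((mask >> v) & 1) for u, v in edges)

s = np.array([cut(m) for m in range(2 ** n)])   # the set function as a 2^n signal
print(s)
```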

Powerset Convolutional Neural Networks

1 code implementation • NeurIPS 2019 • Chris Wendler, Dan Alistarh, Markus Püschel

We present a novel class of convolutional neural networks (CNNs) for set functions, i.e., data indexed with the powerset of a finite set.
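
As a hedged sketch of what a single powerset convolution can look like under one particular shift definition, (T_Q s)(A) = s(A \ Q) (the framework admits several), the snippet applies a filter supported on the empty set and the singletons, analogous to a small CNN kernel.

```python
# Hedged sketch of one powerset convolution under the shift (T_Q s)(A) = s(A \ Q)
# (other shift definitions exist in this framework): a filter supported on the
# empty set and the singletons, applied to a set function stored as a
# length-2**n vector indexed by subset bitmasks.
import numpy as np

n = 3
rng = np.random.default_rng(0)
s = rng.normal(size=2 ** n)                  # input set function (signal)

# Filter coefficients h(Q), nonzero only for |Q| <= 1.
h = {0: 0.5}
h.update({1 << i: 0.25 for i in range(n)})

def powerset_conv(h, s, n):
    out = np.zeros(2 ** n)
    for A in range(2 ** n):
        out[A] = sum(coeff * s[A & ~Q] for Q, coeff in h.items())   # s(A \ Q)
    return out

print(powerset_conv(h, s, n))
```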
