Search Results for author: Christopher Hahn

Found 13 papers, 9 papers with code

Constraint-Based Monitoring of Hyperproperties

4 code implementations • 31 May 2019 • Christopher Hahn, Marvin Stenger, Leander Tentrup

Verifying hyperproperties at runtime is a challenging problem as hyperproperties, such as non-interference and observational determinism, relate multiple computation traces with each other.

Logic in Computer Science

Teaching Temporal Logics to Neural Networks

2 code implementations • ICLR 2021 • Christopher Hahn, Frederik Schmitt, Jens U. Kreber, Markus N. Rabe, Bernd Finkbeiner

We study two fundamental questions in neuro-symbolic computing: can deep learning tackle challenging problems in logics end-to-end, and can neural networks learn the semantics of logics?

Realizing Omega-regular Hyperproperties

no code implementations • 18 Jan 2021 • Bernd Finkbeiner, Christopher Hahn, Jana Hofmann, Leander Tentrup

We furthermore study the realizability problem of HyperQPTL.

Logic in Computer Science

Efficient Monitoring of Hyperproperties using Prefix Trees

no code implementations • 18 Jan 2021 • Bernd Finkbeiner, Christopher Hahn, Marvin Stenger, Leander Tentrup

Hyperproperties, such as non-interference and observational determinism, relate multiple computation traces with each other and are thus not monitorable by tools that consider computations in isolation.

Logic in Computer Science

Neural Circuit Synthesis from Specification Patterns

1 code implementation • NeurIPS 2021 • Frederik Schmitt, Christopher Hahn, Markus N. Rabe, Bernd Finkbeiner

We train hierarchical Transformers on the task of synthesizing hardware circuits directly out of high-level logical specifications in linear-time temporal logic (LTL).

Generating Symbolic Reasoning Problems with Transformer GANs

1 code implementation • 19 Oct 2021 • Jens U. Kreber, Christopher Hahn

We study the capabilities of GANs and Wasserstein GANs equipped with Transformer encoders to generate sensible and challenging training data for symbolic reasoning domains.

Attention Flows for General Transformers

1 code implementation • 30 May 2022 • Niklas Metzger, Christopher Hahn, Julian Siber, Frederik Schmitt, Bernd Finkbeiner

In this paper, we study the computation of how much an input token in a Transformer model influences its prediction.

Formal Specifications from Natural Language

no code implementations • 4 Jun 2022 • Christopher Hahn, Frederik Schmitt, Julia J. Tillman, Niklas Metzger, Julian Siber, Bernd Finkbeiner

We study the generalization abilities of language models when translating natural language into formal specifications with complex semantics.

Automated Theorem Proving

Iterative Circuit Repair Against Formal Specifications

1 code implementation • 2 Mar 2023 • Matthias Cosler, Frederik Schmitt, Christopher Hahn, Bernd Finkbeiner

We propose a separated hierarchical Transformer for multimodal representation learning of the formal specification and the circuit.

Representation Learning

Lightweight Online Learning for Sets of Related Problems in Automated Reasoning

1 code implementation • 18 May 2023 • Haoze Wu, Christopher Hahn, Florian Lonsing, Makai Mann, Raghuram Ramanujan, Clark Barrett

We present Self-Driven Strategy Learning ($\textit{sdsl}$), a lightweight online learning methodology for automated reasoning tasks that involve solving a set of related problems.

NeuroSynt: A Neuro-symbolic Portfolio Solver for Reactive Synthesis

1 code implementation • 22 Jan 2024 • Matthias Cosler, Christopher Hahn, Ayham Omar, Frederik Schmitt

At the core of the solver lies a seamless integration of neural and symbolic approaches to solving the reactive synthesis problem.
