no code implementations • 8 Dec 2023 • Ouail Kitouni, Niklas Nolte, James Hensman, Bhaskar Mitra
We introduce Diffusion Models of Structured Knowledge (DiSK), a new architecture and training approach specialized for structured data.
1 code implementation • 14 Jul 2023 • Ouail Kitouni, Niklas Nolte, Michael Williams
The monotonic dependence of the outputs of a neural network on some of its inputs is a crucial inductive bias in many scenarios where domain knowledge dictates such behavior.
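One common way to hard-wire such a monotonic inductive bias (a minimal sketch, not necessarily the paper's exact construction) is to add the input, scaled by a slope `lam`, to a component whose derivative is strictly smaller in magnitude than `lam`; the sum then has an everywhere-positive derivative:

```python
import numpy as np

def bounded_slope_component(x, lam=1.0):
    """Toy residual component: tanh has derivative in (0, 1],
    so 0.5 * lam * tanh(x) has derivative magnitude <= 0.5 * lam."""
    return 0.5 * lam * np.tanh(x)

def monotone_model(x, lam=1.0):
    """lam * x + h(x) with |h'| <= 0.5 * lam gives a total derivative
    in [0.5 * lam, 1.5 * lam] > 0, hence a strictly increasing map."""
    return lam * x + bounded_slope_component(x, lam)

xs = np.linspace(-5.0, 5.0, 1001)
ys = monotone_model(xs)
assert np.all(np.diff(ys) > 0)  # monotone on the whole grid
```

In a trained network the same idea applies with a learned component whose Lipschitz constant is constrained below `lam`; the names and constants here are illustrative.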
no code implementations • 9 Jun 2023 • Ouail Kitouni, Niklas Nolte, Sokratis Trifinopoulos, Subhash Kantamneni, Mike Williams
We introduce Nuclear Co-Learned Representations (NuCLR), a deep learning model that predicts various nuclear observables, including binding and decay energies, and nuclear charge radii.
no code implementations • 30 Sep 2022 • Ouail Kitouni, Niklas Nolte, Mike Williams
We present a new direction for this architecture: estimating the Wasserstein metric (Earth Mover's Distance) in optimal transport by employing the Kantorovich-Rubinstein duality, enabling its use in geometric fitting applications.
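The Kantorovich-Rubinstein duality states that W1(P, Q) equals the supremum of E_P[f] - E_Q[f] over all 1-Lipschitz functions f; in the paper a Lipschitz-constrained network plays the role of f. As a sketch (using the known closed form for 1-D empirical distributions rather than a trained network), any 1-Lipschitz function gives a lower bound on W1:

```python
import numpy as np

def w1_sorted(x, y):
    """Exact Wasserstein-1 between two equal-size 1-D empirical
    distributions: mean |x_(i) - y_(i)| over sorted samples."""
    return float(np.mean(np.abs(np.sort(x) - np.sort(y))))

def kr_lower_bound(x, y, f):
    """Kantorovich-Rubinstein: for any 1-Lipschitz f,
    E_P[f] - E_Q[f] <= W1(P, Q)."""
    return float(np.mean(f(x)) - np.mean(f(y)))

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 10_000)   # P = N(0, 1)
y = rng.normal(2.0, 1.0, 10_000)   # Q = N(2, 1)

w1 = w1_sorted(x, y)                        # close to the shift, 2.0
bound = kr_lower_bound(x, y, lambda t: -t)  # f(t) = -t is 1-Lipschitz
assert bound <= w1 + 1e-9                   # dual feasible, never exceeds W1
```

For shifted distributions the linear critic is already near-optimal, so the bound is nearly tight here; in higher dimensions the critic must be learned under the Lipschitz constraint.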
1 code implementation • 20 May 2022 • Ziming Liu, Ouail Kitouni, Niklas Nolte, Eric J. Michaud, Max Tegmark, Mike Williams
We aim to understand grokking, a phenomenon where models generalize long after overfitting their training set.
no code implementations • 30 Nov 2021 • Ouail Kitouni, Niklas Nolte, Mike Williams
The Lipschitz constant of the map between the input and output space represented by a neural network is a natural metric for assessing the robustness of the model.
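A standard (often loose) upper bound on this constant, shown here as a sketch with illustrative weights, is the product of the spectral norms of the weight matrices when the activations are themselves 1-Lipschitz (e.g. ReLU):

```python
import numpy as np

def lipschitz_upper_bound(weights):
    """Upper bound on the L2 Lipschitz constant of a feedforward net
    with 1-Lipschitz activations: product of spectral norms
    (np.linalg.norm with ord=2 returns the largest singular value)."""
    return float(np.prod([np.linalg.norm(W, 2) for W in weights]))

rng = np.random.default_rng(1)
Ws = [rng.normal(size=(16, 8)), rng.normal(size=(4, 16))]  # toy 2-layer net
bound = lipschitz_upper_bound(Ws)

def relu_net(x):
    h = np.maximum(Ws[0] @ x, 0.0)  # ReLU is 1-Lipschitz
    return Ws[1] @ h

# sanity check: no random input pair ever violates the bound
for _ in range(100):
    a, b = rng.normal(size=8), rng.normal(size=8)
    ratio = np.linalg.norm(relu_net(a) - relu_net(b)) / np.linalg.norm(a - b)
    assert ratio <= bound + 1e-9
```

Constraining each layer's spectral norm during training, as in Lipschitz-constrained architectures, turns this bound into a hard robustness guarantee rather than a post-hoc estimate.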
1 code implementation • 19 Oct 2020 • Ouail Kitouni, Benjamin Nachman, Constantin Weisser, Mike Williams
A key challenge in searches for resonant new physics is that classifiers trained to enhance potential signals must not induce localized structures.
High Energy Physics - Phenomenology; High Energy Physics - Experiment; Data Analysis, Statistics and Probability