Search Results for author: Henry Kvinge

Found 40 papers, 6 papers with code

Machines and Mathematical Mutations: Using GNNs to Characterize Quiver Mutation Classes

no code implementations • 12 Nov 2024 • Jesse He, Helen Jenne, Herman Chau, Davis Brown, Mark Raugas, Sara Billey, Henry Kvinge

In this work, we use graph neural networks to investigate quiver mutation -- an operation that transforms one quiver (or directed multigraph) into another -- which is central to the theory of cluster algebras with deep connections to geometry, topology, and physics.
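
For readers unfamiliar with the operation, here is a minimal sketch of quiver mutation in its exchange-matrix form (the standard Fomin-Zelevinsky matrix rule, not the authors' GNN pipeline):

```python
import numpy as np

def mutate(B, k):
    """Mutate a skew-symmetric integer exchange matrix B at vertex k
    (matrix form of quiver mutation)."""
    B = np.asarray(B, dtype=int)
    M = B.copy()
    n = B.shape[0]
    for i in range(n):
        for j in range(n):
            if i == k or j == k:
                M[i, j] = -B[i, j]
            else:
                M[i, j] = B[i, j] + (abs(B[i, k]) * B[k, j]
                                     + B[i, k] * abs(B[k, j])) // 2
    return M

# Example: the A_2 quiver 0 -> 1. Mutating at vertex 0 reverses the arrow,
# and mutating twice returns the original quiver (mutation is an involution).
B = np.array([[0, 1], [-1, 0]])
print(mutate(B, 0))                                  # [[ 0 -1] [ 1  0]]
assert np.array_equal(mutate(mutate(B, 0), 0), B)
```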

ICML Topological Deep Learning Challenge 2024: Beyond the Graph Domain

no code implementations • 8 Sep 2024 • Guillermo Bernárdez, Lev Telyatnikov, Marco Montagna, Federica Baccini, Mathilde Papillon, Miquel Ferriol-Galmés, Mustafa Hajij, Theodore Papamarkou, Maria Sofia Bucarelli, Olga Zaghen, Johan Mathe, Audun Myers, Scott Mahan, Hansen Lillemark, Sharvaree Vadgama, Erik Bekkers, Tim Doster, Tegan Emerson, Henry Kvinge, Katrina Agate, Nesreen K Ahmed, Pengfei Bai, Michael Banf, Claudio Battiloro, Maxim Beketov, Paul Bogdan, Martin Carrasco, Andrea Cavallo, Yun Young Choi, George Dasoulas, Matouš Elphick, Giordan Escalona, Dominik Filipiak, Halley Fritze, Thomas Gebhart, Manel Gil-Sorribes, Salvish Goomanee, Victor Guallar, Liliya Imasheva, Andrei Irimia, Hongwei Jin, Graham Johnson, Nikos Kanakaris, Boshko Koloski, Veljko Kovač, Manuel Lecha, Minho Lee, Pierrick Leroy, Theodore Long, German Magai, Alvaro Martinez, Marissa Masden, Sebastian Mežnar, Bertran Miquel-Oliver, Alexis Molina, Alexander Nikitin, Marco Nurisso, Matt Piekenbrock, Yu Qin, Patryk Rygiel, Alessandro Salatiello, Max Schattauer, Pavel Snopov, Julian Suk, Valentina Sánchez, Mauricio Tec, Francesco Vaccarino, Jonas Verhellen, Frederic Wantiez, Alexander Weers, Patrik Zajec, Blaž Škrlj, Nina Miolane

This paper describes the 2nd edition of the ICML Topological Deep Learning Challenge that was hosted within the ICML 2024 ELLIS Workshop on Geometry-grounded Representation Learning and Generative Modeling (GRaM).

Deep Learning • Representation Learning

Haldane Bundles: A Dataset for Learning to Predict the Chern Number of Line Bundles on the Torus

1 code implementation • 6 Dec 2023 • Cody Tipton, Elizabeth Coda, Davis Brown, Alyson Bittner, Jung Lee, Grayson Jorgenson, Tegan Emerson, Henry Kvinge

Characteristic classes, which are abstract topological invariants associated with vector bundles, have become an important notion in modern physics with surprising real-world consequences.
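
As background for the learning target named in the title (a standard definition, not a detail of the dataset construction), the first Chern number of a complex line bundle L over the torus T^2 with connection curvature two-form F is the integer

```latex
c_1(L) \;=\; \frac{i}{2\pi} \int_{T^2} F \;\in\; \mathbb{Z}.
```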

Understanding the Inner Workings of Language Models Through Representation Dissimilarity

no code implementations • 23 Oct 2023 • Davis Brown, Charles Godfrey, Nicholas Konz, Jonathan Tu, Henry Kvinge

As language models are applied to an increasing number of real-world applications, understanding their inner workings has become an important issue in model trust, interpretability, and transparency.

Language Modelling

Attributing Learned Concepts in Neural Networks to Training Data

no code implementations • 4 Oct 2023 • Nicholas Konz, Charles Godfrey, Madelyn Shapiro, Jonathan Tu, Henry Kvinge, Davis Brown

By now there is substantial evidence that deep learning models learn certain human-interpretable features as part of their internal representations of data.

SCITUNE: Aligning Large Language Models with Scientific Multimodal Instructions

1 code implementation • 3 Jul 2023 • Sameera Horawalavithana, Sai Munikoti, Ian Stewart, Henry Kvinge

Instruction finetuning is a popular paradigm to align large language models (LLMs) with human intent.

ColMix -- A Simple Data Augmentation Framework to Improve Object Detector Performance and Robustness in Aerial Images

no code implementations • 22 May 2023 • Cuong Ly, Grayson Jorgenson, Dan Rosa de Jesus, Henry Kvinge, Adam Attarian, Yijing Watkins

In this work, we present a novel augmentation method, called collage pasting, for increasing the object density without a need for segmentation masks, thereby improving the detector performance.
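
A toy sketch of collage-style pasting in the spirit of the description above, assuming a simple 2x2 tiling of equally sized images with [x1, y1, x2, y2] boxes (the paper's method may differ in details):

```python
import numpy as np

def collage_2x2(images, boxes_per_image):
    """Tile four equally sized (H, W, C) images into a 2x2 collage and
    shift each image's [x1, y1, x2, y2] boxes into collage coordinates.
    Increases object density per image without segmentation masks."""
    h, w, c = images[0].shape
    canvas = np.zeros((2 * h, 2 * w, c), dtype=images[0].dtype)
    offsets = [(0, 0), (0, w), (h, 0), (h, w)]           # (row, col) of each tile
    all_boxes = []
    for img, boxes, (dy, dx) in zip(images, boxes_per_image, offsets):
        canvas[dy:dy + h, dx:dx + w] = img
        for x1, y1, x2, y2 in boxes:
            all_boxes.append((x1 + dx, y1 + dy, x2 + dx, y2 + dy))
    return canvas, all_boxes
```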

Data Augmentation • Object

How many dimensions are required to find an adversarial example?

no code implementations • 24 Mar 2023 • Charles Godfrey, Henry Kvinge, Elise Bishoff, Myles Mckay, Davis Brown, Tim Doster, Eleanor Byler

Past work exploring adversarial vulnerability has focused on situations where an adversary can perturb all dimensions of model input.

Fast computation of permutation equivariant layers with the partition algebra

no code implementations • 10 Mar 2023 • Charles Godfrey, Michael G. Rawson, Davis Brown, Henry Kvinge

The space of permutation equivariant linear layers is a generalization of the partition algebra, an object first discovered in statistical physics with deep connections to the representation theory of the symmetric group, and the basis described above generalizes the so-called orbit basis of the partition algebra.
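
As a concrete instance of the objects being parameterized, in the simplest (order-1) case the space of permutation equivariant linear maps from R^n to R^n is two-dimensional, spanned by the identity and the summation map; a quick numerical check of equivariance (not the paper's general partition-algebra construction):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
a, b = rng.normal(size=2)            # the two free parameters of the layer

def equivariant_layer(x):
    # L(x) = a*x + b*sum(x)*1, the general permutation equivariant
    # linear map R^n -> R^n in the order-1 case.
    return a * x + b * x.sum() * np.ones_like(x)

x = rng.normal(size=n)
perm = rng.permutation(n)
# Equivariance: applying the layer commutes with permuting coordinates.
assert np.allclose(equivariant_layer(x[perm]), equivariant_layer(x)[perm])
```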

Exploring the Representation Manifolds of Stable Diffusion Through the Lens of Intrinsic Dimension

no code implementations • 16 Feb 2023 • Henry Kvinge, Davis Brown, Charles Godfrey

We find that choice of prompt has a substantial impact on the intrinsic dimension of representations at both layers of the model which we explored, but that the nature of this impact depends on the layer being considered.
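
For context, a minimal sketch of one common intrinsic-dimension estimator (the TwoNN estimator of Facco et al.), which works from ratios of nearest-neighbor distances; the estimator used in the paper may differ:

```python
import numpy as np

def twonn_dimension(X):
    """TwoNN intrinsic-dimension estimate from an (N, D) array of points,
    using the ratio of second to first nearest-neighbor distances."""
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)
    nearest_two = np.sort(dists, axis=1)[:, :2]
    mu = nearest_two[:, 1] / nearest_two[:, 0]       # r2 / r1 for each point
    return len(X) / np.sum(np.log(mu))

# Points on a 2-D linear slice of R^10 should give an estimate near 2.
rng = np.random.default_rng(0)
Z = rng.normal(size=(500, 2)) @ rng.normal(size=(2, 10))
print(twonn_dimension(Z))
```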

Parameters, Properties, and Process: Conditional Neural Generation of Realistic SEM Imagery Towards ML-assisted Advanced Manufacturing

no code implementations • 13 Jan 2023 • Scott Howland, Lara Kassab, Keerti Kappagantula, Henry Kvinge, Tegan Emerson

By characterizing microstructure from a topological perspective, we are able to evaluate our models' ability to capture the breadth and diversity of experimental scanning electron microscope (SEM) samples.

Internal Representations of Vision Models Through the Lens of Frames on Data Manifolds

no code implementations • 19 Nov 2022 • Henry Kvinge, Grayson Jorgenson, Davis Brown, Charles Godfrey, Tegan Emerson

While the last five years have seen considerable progress in understanding the internal representations of deep learning models, many questions remain.

Do Neural Networks Trained with Topological Features Learn Different Internal Representations?

no code implementations • 14 Nov 2022 • Sarah McGuire, Shane Jackson, Tegan Emerson, Henry Kvinge

While this field, sometimes known as topological machine learning (TML), has seen some notable successes, an understanding of how the process of learning from topological features differs from the process of learning from raw data is still limited.

Topological Data Analysis

Testing predictions of representation cost theory with CNNs

1 code implementation • 3 Oct 2022 • Charles Godfrey, Elise Bishoff, Myles Mckay, Davis Brown, Grayson Jorgenson, Henry Kvinge, Eleanor Byler

It is widely acknowledged that trained convolutional neural networks (CNNs) have different levels of sensitivity to signals of different frequency.

On the Symmetries of Deep Learning Models and their Internal Representations

2 code implementations • 27 May 2022 • Charles Godfrey, Davis Brown, Tegan Emerson, Henry Kvinge

In this paper we seek to connect the symmetries arising from the architecture of a family of models with the symmetries of that family's internal representation of data.
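
A small illustration of the kind of architecture symmetry in question, using the familiar hidden-unit permutation symmetry of a one-hidden-layer ReLU network (a sketch, not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 4)), rng.normal(size=16)
W2, b2 = rng.normal(size=(3, 16)), rng.normal(size=3)
relu = lambda z: np.maximum(z, 0)

def net(x, W1, b1, W2, b2):
    return W2 @ relu(W1 @ x + b1) + b2

# Permuting the hidden units (rows of W1/b1 and the matching columns of W2)
# changes the weights but not the function the network computes.
P = rng.permutation(16)
x = rng.normal(size=4)
assert np.allclose(net(x, W1, b1, W2, b2),
                   net(x, W1[P], b1[P], W2[:, P], b2))
```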

TopTemp: Parsing Precipitate Structure from Temper Topology

no code implementations • 1 Apr 2022 • Lara Kassab, Scott Howland, Henry Kvinge, Keerti Sahithi Kappagantula, Tegan Emerson

Technological advances are in part enabled by the development of novel manufacturing processes that give rise to new materials or material property improvements.

Fiber Bundle Morphisms as a Framework for Modeling Many-to-Many Maps

no code implementations • 15 Mar 2022 • Elizabeth Coda, Nico Courts, Colby Wight, Loc Truong, Woongjo Choi, Charles Godfrey, Tegan Emerson, Keerti Kappagantula, Henry Kvinge

That is, a single input can potentially yield many different outputs (whether due to noise, imperfect measurement, or intrinsic stochasticity in the process) and many different inputs can yield the same output (that is, the map is not injective).

Benchmarking • Sentiment Analysis

Differential Property Prediction: A Machine Learning Approach to Experimental Design in Advanced Manufacturing

no code implementations • 3 Dec 2021 • Loc Truong, Woongjo Choi, Colby Wight, Lizzy Coda, Tegan Emerson, Keerti Kappagantula, Henry Kvinge

We show that by focusing on the experimenter's need to choose between multiple candidate experimental parameters, we can reframe the challenging regression task of predicting material properties from processing parameters into a classification task on which machine learning models can achieve good performance.
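
A toy sketch of this reframing on synthetic data (the data and model here are illustrative, not the paper's experimental setup): label each pair of candidate parameter settings by which one yields the larger property value, then fit an off-the-shelf classifier.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                              # processing parameters
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=200)    # material property

# Build pairwise comparisons: which of two candidate settings is better?
i, j = rng.integers(0, 200, size=(2, 1000))
pairs = np.hstack([X[i], X[j]])
labels = (y[i] > y[j]).astype(int)

clf = LogisticRegression(max_iter=1000).fit(pairs, labels)
print(clf.score(pairs, labels))
```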

BIG-bench Machine Learning • Experimental Design +1

Making Corgis Important for Honeycomb Classification: Adversarial Attacks on Concept-based Explainability Tools

no code implementations • 14 Oct 2021 • Davis Brown, Henry Kvinge

Methods for model explainability have become increasingly critical for testing the fairness and soundness of deep learning.

Adversarial Attack • Fairness

Bundle Networks: Fiber Bundles, Local Trivializations, and a Generative Approach to Exploring Many-to-one Maps

1 code implementation • ICLR 2022 • Nico Courts, Henry Kvinge

Many-to-one maps are ubiquitous in machine learning, from the image recognition model that assigns a multitude of distinct images to the concept of "cat" to the time series forecasting model which assigns a range of distinct time-series to a single scalar regression value.

Time Series • Time Series Forecasting

A Topological-Framework to Improve Analysis of Machine Learning Model Performance

no code implementations • 9 Jul 2021 • Henry Kvinge, Colby Wight, Sarah Akers, Scott Howland, Woongjo Choi, Xiaolong Ma, Luke Gosink, Elizabeth Jurrus, Keerti Kappagantula, Tegan H. Emerson

As both machine learning models and the datasets on which they are evaluated have grown in size and complexity, the practice of using a few summary statistics to understand model performance has become increasingly problematic.

BIG-bench Machine Learning

Rotating spiders and reflecting dogs: a class conditional approach to learning data augmentation distributions

no code implementations • 7 Jun 2021 • Scott Mahan, Henry Kvinge, Tim Doster

Building invariance to non-meaningful transformations is essential to building efficient and generalizable machine learning models.

Data Augmentation

One Representation to Rule Them All: Identifying Out-of-Support Examples in Few-shot Learning with Generic Representations

no code implementations • 2 Jun 2021 • Henry Kvinge, Scott Howland, Nico Courts, Lauren A. Phillips, John Buckheit, Zachary New, Elliott Skomski, Jung H. Lee, Sandeep Tiwari, Jessica Hibler, Courtney D. Corley, Nathan O. Hodas

We describe how this problem is subtly different from out-of-distribution detection and describe a new method of identifying OOS examples within the Prototypical Networks framework using a fixed point which we call the generic representation.
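
A minimal sketch of the Prototypical Networks ingredients referred to above (a prototype is the mean embedding of a class's support examples), together with a simple nearest-prototype distance score; the paper's generic-representation fixed point is a different, learned construction:

```python
import numpy as np

def prototypes(support_embeddings, support_labels):
    """Class prototypes = mean embedding per class."""
    classes = np.unique(support_labels)
    return classes, np.stack([support_embeddings[support_labels == c].mean(0)
                              for c in classes])

def min_prototype_distance(query_embeddings, protos):
    """Distance from each query to its nearest prototype; a large value is
    one simple signal that a query may be out of support."""
    d = np.linalg.norm(query_embeddings[:, None, :] - protos[None, :, :], axis=-1)
    return d.min(axis=1)

# Example: two tight support classes and one far-away query.
emb = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
classes, protos = prototypes(emb, np.array([0, 0, 1, 1]))
print(min_prototype_distance(np.array([[10.0, 10.0]]), protos))
```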

Few-Shot Learning • Out-of-Distribution Detection

Sheaves as a Framework for Understanding and Interpreting Model Fit

no code implementations • 21 May 2021 • Henry Kvinge, Brett Jefferson, Cliff Joslyn, Emilie Purvine

As data grows in size and complexity, finding frameworks which aid in interpretation and analysis has become critical.

Prototypical Region Proposal Networks for Few-Shot Localization and Classification

no code implementations • 8 Apr 2021 • Elliott Skomski, Aaron Tuor, Andrew Avila, Lauren Phillips, Zachary New, Henry Kvinge, Courtney D. Corley, Nathan Hodas

Recently proposed few-shot image classification methods have generally focused on use cases where the objects to be classified are the central subject of images.

Classification • Few-Shot Image Classification +2

Hypergraph Models of Biological Networks to Identify Genes Critical to Pathogenic Viral Response

no code implementations • 6 Oct 2020 • Song Feng, Emily Heath, Brett Jefferson, Cliff Joslyn, Henry Kvinge, Hugh D. Mitchell, Brenda Praggastis, Amie J. Eisfeld, Amy C. Sims, Larissa B. Thackray, Shufang Fan, Kevin B. Walters, Peter J. Halfmann, Danielle Westhoff-Smith, Qing Tan, Vineet D. Menachery, Timothy P. Sheahan, Adam S. Cockrell, Jacob F. Kocher, Kelly G. Stratton, Natalie C. Heller, Lisa M. Bramer, Michael S. Diamond, Ralph S. Baric, Katrina M. Waters, Yoshihiro Kawaoka, Jason E. McDermott, Emilie Purvine

Results: We compiled a novel data set of transcriptional host response to pathogenic viral infections and formulated relationships between genes as a hypergraph where hyperedges represent significantly perturbed genes, and vertices represent individual biological samples with specific experimental conditions.
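
A toy sketch of the hypergraph construction described above, in which each gene contributes a hyperedge containing the samples where it is significantly perturbed (the threshold and data here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
genes = [f"gene_{g}" for g in range(6)]
samples = [f"sample_{s}" for s in range(4)]
log_fold_change = rng.normal(size=(len(genes), len(samples)))

# Hyperedge per gene = the set of sample vertices in which that gene is
# significantly perturbed (here, |log fold change| > 1 as a stand-in).
hyperedges = {
    gene: {s for s, lfc in zip(samples, log_fold_change[g]) if abs(lfc) > 1.0}
    for g, gene in enumerate(genes)
}
print(hyperedges)
```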

More chemical detection through less sampling: amplifying chemical signals in hyperspectral data cubes through compressive sensing

no code implementations • 27 Jun 2019 • Henry Kvinge, Elin Farnell, Julia R. Dupuis, Michael Kirby, Chris Peterson, Elizabeth C. Schundler

In this paper we explore a phenomenon in which bandwise CS sampling of a hyperspectral data cube followed by reconstruction can actually result in amplification of chemical signals contained in the cube.

Compressive Sensing

A data-driven approach to sampling matrix selection for compressive sensing

no code implementations • 20 Jun 2019 • Elin Farnell, Henry Kvinge, John P. Dixon, Julia R. Dupuis, Michael Kirby, Chris Peterson, Elizabeth C. Schundler, Christian W. Smith

We propose a method for defining an order for a sampling basis that is optimal with respect to capturing variance in data, thus allowing for meaningful sensing at any desired level of compression.
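
A minimal sketch of a data-driven ordering in this spirit, ranking candidate sampling vectors by the data variance they capture (assuming roughly unit-norm rows; the paper's exact criterion and basis construction may differ):

```python
import numpy as np

def order_basis_by_variance(basis, data):
    """Return indices of the rows of `basis` (m, D) sorted so that vectors
    capturing more variance of `data` (N, D) come first."""
    centered = data - data.mean(axis=0)
    variance_along = ((centered @ basis.T) ** 2).mean(axis=0)
    return np.argsort(variance_along)[::-1]

# Example: coordinates scaled so later ones carry more variance.
rng = np.random.default_rng(0)
data = rng.normal(size=(100, 8)) * np.arange(1, 9)
print(order_basis_by_variance(np.eye(8), data))   # roughly [7, 6, 5, ...]
```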

Compressive Sensing

Rare geometries: revealing rare categories via dimension-driven statistics

no code implementations • 29 Jan 2019 • Henry Kvinge, Elin Farnell, Jingya Li, Yujia Chen

The first is a general lack of labeled examples of the rare class and the second is the potential non-separability of the rare class from the majority (in terms of available features).

Translation

Multi-Dimensional Scaling on Groups

no code implementations • 8 Dec 2018 • Mark Blumstein, Henry Kvinge

Leveraging the intrinsic symmetries in data for clear and efficient analysis is an important theme in signal processing and other data-driven sciences.

Dimensionality Reduction

Too many secants: a hierarchical approach to secant-based dimensionality reduction on large data sets

no code implementations • 5 Aug 2018 • Henry Kvinge, Elin Farnell, Michael Kirby, Chris Peterson

Intuitively, the SAP algorithm seeks to determine a projection which best preserves the lengths of all secants between points in a data set; by applying the algorithm to find the best projections to vector spaces of various dimensions, one may infer the dimension of the manifold of origination.
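
To make the secant criterion concrete, a small sketch that scores a candidate projection by how badly it shrinks the worst-preserved unit secant; SAP-style methods search for projections that make this score large (this is only the scoring step, not the hierarchical algorithm of the paper):

```python
import numpy as np

def unit_secants(X):
    """All pairwise unit secants of a data set X with shape (N, D)."""
    diffs = X[:, None, :] - X[None, :, :]
    secants = diffs[np.triu_indices(len(X), k=1)]
    return secants / np.linalg.norm(secants, axis=1, keepdims=True)

def worst_secant_shrinkage(P, X):
    """Minimum length of the projected unit secants under a projection P
    whose rows (k, D) are orthonormal."""
    return np.linalg.norm(unit_secants(X) @ P.T, axis=1).min()

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
P = np.linalg.qr(rng.normal(size=(10, 3)))[0].T   # random 3-D orthonormal projection
print(worst_secant_shrinkage(P, X))
```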

Dimensionality Reduction

A GPU-Oriented Algorithm Design for Secant-Based Dimensionality Reduction

no code implementations • 10 Jul 2018 • Henry Kvinge, Elin Farnell, Michael Kirby, Chris Peterson

Dimensionality-reduction techniques are a fundamental tool for extracting useful information from high-dimensional data sets.

Dimensionality Reduction

Endmember Extraction on the Grassmannian

no code implementations • 3 Jul 2018 • Elin Farnell, Henry Kvinge, Michael Kirby, Chris Peterson

Endmember extraction plays a prominent role in a variety of data analysis problems as endmembers often correspond to data representing the purest or best representative of some feature.
