Search Results for author: Joshua Robinson

Found 20 papers, 13 papers with code

Relational Deep Learning: Graph Representation Learning on Relational Databases

no code implementations7 Dec 2023 Matthias Fey, Weihua Hu, Kexin Huang, Jan Eric Lenssen, Rishabh Ranjan, Joshua Robinson, Rex Ying, Jiaxuan You, Jure Leskovec

The core idea is to view relational databases as a temporal, heterogeneous graph, with a node for each row in each table, and edges specified by primary-foreign key links.

Feature Engineering, Graph Representation Learning
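The row-per-node, foreign-key-per-edge construction described above can be sketched in a few lines. This is a minimal illustration with invented toy tables and link triples, not the paper's actual data model:

```python
# Minimal sketch: turn relational tables into a heterogeneous graph with one
# node per row and an edge from each foreign-key column to the primary key
# it references. Table and column names are toy examples.

def tables_to_graph(tables, fk_links):
    """tables: {table_name: list of row dicts, each with an 'id' key}.
    fk_links: list of (child_table, fk_column, parent_table) triples."""
    nodes = [(t, row["id"]) for t, rows in tables.items() for row in rows]
    edges = []
    for child, fk_col, parent in fk_links:
        for row in tables[child]:
            edges.append(((child, row["id"]), (parent, row[fk_col])))
    return nodes, edges

tables = {
    "customer": [{"id": 1}, {"id": 2}],
    "order": [{"id": 10, "customer_id": 1}, {"id": 11, "customer_id": 2}],
}
nodes, edges = tables_to_graph(tables, [("order", "customer_id", "customer")])
```

Node identifiers carry their table name, matching the heterogeneous (typed) graph view; a temporal variant would additionally attach each row's timestamp.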

On Retrieval Augmentation and the Limitations of Language Model Training

no code implementations16 Nov 2023 Ting-Rui Chiang, Xinyan Velocity Yu, Joshua Robinson, Ollie Liu, Isabelle Lee, Dani Yogatama

Augmenting a language model (LM) with $k$-nearest neighbors (kNN) retrieval on its training data alone can decrease its perplexity, though the underlying reasons for this remain elusive.

Language Modelling, Memorization +1
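The retrieval augmentation in question follows the kNN-LM recipe: the final next-token distribution interpolates the LM's softmax with a distribution induced by the nearest neighbors in a datastore built from the training data. A minimal sketch, with an illustrative interpolation weight and toy distributions rather than values from the paper:

```python
import numpy as np

# Sketch of kNN-LM-style interpolation: the output distribution mixes the
# parametric LM distribution with one derived from retrieved neighbors.
# lam and the toy distributions below are illustrative assumptions.

def knn_lm_interpolate(p_lm, p_knn, lam=0.25):
    """Return (1 - lam) * p_lm + lam * p_knn over the vocabulary."""
    p_lm, p_knn = np.asarray(p_lm, float), np.asarray(p_knn, float)
    return (1.0 - lam) * p_lm + lam * p_knn

# Retrieval boosts a token the LM alone considered unlikely.
p = knn_lm_interpolate([0.7, 0.2, 0.1], [0.1, 0.8, 0.1])
```

Because both inputs are probability distributions and the weights sum to one, the result remains a valid distribution without renormalization.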

Structuring Representation Geometry with Rotationally Equivariant Contrastive Learning

1 code implementation24 Jun 2023 Sharut Gupta, Joshua Robinson, Derek Lim, Soledad Villar, Stefanie Jegelka

Specifically, in the contrastive learning setting, we introduce an equivariance objective and theoretically prove that its minima force augmentations in input space to correspond to rotations on the spherical embedding space.

Contrastive Learning, Self-Supervised Learning
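The equivariance idea can be made concrete as a residual on the unit sphere: an augmentation t should act on embeddings as some rotation R_t, so the gap between R_t f(x) and f(t(x)) is penalized. The function, rotation, and vectors below are illustrative stand-ins, not the paper's objective:

```python
import numpy as np

# Sketch of a rotation-equivariance residual on the sphere: if augmentation
# t corresponds to rotation R_t in embedding space, then R_t @ f(x) should
# align with f(t(x)). Zero residual means perfect equivariance.

def sphere_normalize(z):
    return z / np.linalg.norm(z)

def equivariance_residual(z_x, z_tx, R_t):
    """1 - cos(R_t z_x, z_tx); zero when t acts exactly as rotation R_t."""
    a, b = sphere_normalize(R_t @ z_x), sphere_normalize(z_tx)
    return 1.0 - float(a @ b)

# Sanity check: the identity augmentation paired with the identity rotation
# should incur no penalty.
z = sphere_normalize(np.array([1.0, 2.0, 2.0]))
res = equivariance_residual(z, z, np.eye(3))
```

In practice such a residual would be minimized jointly with a contrastive loss over batches of augmented pairs.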

A deep learning approach to using wearable seismocardiography (SCG) for diagnosing aortic valve stenosis and predicting aortic hemodynamics obtained by 4D flow MRI

no code implementations5 Jan 2023 Mahmoud E. Khani, Ethan M. I. Johnson, Aparna Sodhi, Joshua Robinson, Cynthia K. Rigsby, Bradly D. Allen, Michael Markl

We also investigated the ability of this deep learning technique to differentiate between patients diagnosed with aortic valve stenosis (AS), non-AS patients with a bicuspid aortic valve (BAV), non-AS patients with a mechanical aortic valve (MAV), and healthy subjects with a normal tricuspid aortic valve (TAV).

A simple, efficient and scalable contrastive masked autoencoder for learning visual representations

1 code implementation30 Oct 2022 Shlok Mishra, Joshua Robinson, Huiwen Chang, David Jacobs, Aaron Sarna, Aaron Maschinot, Dilip Krishnan

Our framework is a minimal and conceptually clean synthesis of (C) contrastive learning, (A) masked autoencoders, and (N) the noise prediction approach used in diffusion models.

Contrastive Learning, Self-Supervised Learning +1

Leveraging Large Language Models for Multiple Choice Question Answering

1 code implementation22 Oct 2022 Joshua Robinson, Christopher Michael Rytting, David Wingate

A more natural prompting approach is to present the question and answer options to the LLM jointly and have it output the symbol (e.g., "A") associated with its chosen answer option.

Answer Selection, Multiple-choice +1
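The joint-presentation approach amounts to formatting all options into one prompt and comparing the LM's probabilities on the answer symbols alone. A minimal sketch, where `pick_answer` consumes hypothetical symbol probabilities standing in for a real LLM query:

```python
# Sketch of "multiple choice prompting": show the question with labeled
# options, then score only the answer symbols ("A", "B", ...) and take the
# argmax. The probability dict below is a stand-in for an actual LM call.

def build_mcp_prompt(question, options):
    """Format question and options jointly, ending with an answer cue."""
    lines = [question]
    for letter, opt in zip("ABCD", options):
        lines.append(f"{letter}. {opt}")
    lines.append("Answer:")
    return "\n".join(lines)

def pick_answer(symbol_probs):
    """symbol_probs: {symbol: LM probability}; return the argmax symbol."""
    return max(symbol_probs, key=symbol_probs.get)

prompt = build_mcp_prompt("What color is the sky?", ["Green", "Blue", "Red"])
choice = pick_answer({"A": 0.05, "B": 0.9, "C": 0.05})
```

Scoring one token per symbol is also cheaper than the alternative of scoring each full answer string independently.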

An Information-theoretic Approach to Prompt Engineering Without Ground Truth Labels

no code implementations ACL 2022 Taylor Sorensen, Joshua Robinson, Christopher Michael Rytting, Alexander Glenn Shaw, Kyle Jeffrey Rogers, Alexia Pauline Delorey, Mahmoud Khalil, Nancy Fulda, David Wingate

Pre-trained language models derive substantial linguistic and factual knowledge from the massive corpora on which they are trained, and prompt engineering seeks to align these models to specific tasks.

Prompt Engineering

Sign and Basis Invariant Networks for Spectral Graph Representation Learning

2 code implementations25 Feb 2022 Derek Lim, Joshua Robinson, Lingxiao Zhao, Tess Smidt, Suvrit Sra, Haggai Maron, Stefanie Jegelka

We introduce SignNet and BasisNet -- new neural architectures that are invariant to two key symmetries displayed by eigenvectors: (i) sign flips, since if $v$ is an eigenvector then so is $-v$; and (ii) more general basis symmetries, which occur in higher dimensional eigenspaces with infinitely many choices of basis eigenvectors.

Graph Regression, Graph Representation Learning
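The sign-flip symmetry (i) admits a simple architectural fix: apply a network phi to both v and -v and combine symmetrically, so the output cannot depend on the sign. A toy sketch with a fixed random map in place of a learned network (SignNet additionally composes a second network on top of this sum):

```python
import numpy as np

# Sketch of sign invariance by construction: f(v) = phi(v) + phi(-v)
# satisfies f(v) == f(-v) for any phi. Here phi is a tiny fixed nonlinear
# map with random weights, standing in for a learned network.

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))

def phi(v):
    return np.tanh(W @ v)  # stand-in for a learned network

def sign_invariant(v):
    """phi(v) + phi(-v): unchanged under the eigenvector sign flip v -> -v."""
    return phi(v) + phi(-v)

v = np.array([0.5, -1.0, 2.0])
out_pos, out_neg = sign_invariant(v), sign_invariant(-v)
```

The basis symmetry (ii) requires the more involved BasisNet construction, which must be invariant to any orthogonal change of basis within an eigenspace, not just sign flips.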

Can contrastive learning avoid shortcut solutions?

1 code implementation NeurIPS 2021 Joshua Robinson, Li Sun, Ke Yu, Kayhan Batmanghelich, Stefanie Jegelka, Suvrit Sra

However, we observe that the contrastive loss does not always sufficiently guide which features are extracted, a behavior that can negatively impact the performance on downstream tasks via "shortcuts", i.e., by inadvertently suppressing important predictive features.

Contrastive Learning

A Matrix Autoencoder Framework to Align the Functional and Structural Connectivity Manifolds as Guided by Behavioral Phenotypes

1 code implementation30 May 2021 Niharika Shimona D'Souza, Mary Beth Nebel, Deana Crocetti, Nicholas Wymbs, Joshua Robinson, Stewart Mostofsky, Archana Venkataraman

We propose a novel matrix autoencoder to map functional connectomes from resting state fMRI (rs-fMRI) to structural connectomes from Diffusion Tensor Imaging (DTI), as guided by subject-level phenotypic measures.

Deep sr-DDL: Deep Structurally Regularized Dynamic Dictionary Learning to Integrate Multimodal and Dynamic Functional Connectomics data for Multidimensional Clinical Characterizations

no code implementations27 Aug 2020 Niharika Shimona D'Souza, Mary Beth Nebel, Deana Crocetti, Nicholas Wymbs, Joshua Robinson, Stewart H. Mostofsky, Archana Venkataraman

The generative component is a structurally-regularized Dynamic Dictionary Learning (sr-DDL) model that decomposes the dynamic rs-fMRI correlation matrices into a collection of shared basis networks and time varying subject-specific loadings.

Dictionary Learning
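The decomposition described above can be illustrated with a standard dictionary-learning form for correlation matrices: each dynamic matrix is approximated as a nonnegatively weighted sum of rank-one terms built from shared basis columns, i.e., B diag(c_t) B^T. The exact parameterization is an assumption for illustration; shapes and values are toy data:

```python
import numpy as np

# Illustrative sketch of an sr-DDL-style decomposition: a dynamic rs-fMRI
# correlation matrix Gamma_t is approximated as B @ diag(c_t) @ B.T, with
# shared basis networks B and time-varying subject-specific loadings c_t.
# The B diag(c) B^T form and all values here are assumptions, not the
# paper's exact model.

def reconstruct(B, c_t):
    """Rebuild one correlation matrix from shared bases B and loadings c_t."""
    return B @ np.diag(c_t) @ B.T

B = np.array([[1.0, 0.0], [0.5, 1.0], [0.0, 1.0]])  # 3 regions, 2 basis networks
c_t = np.array([0.8, 0.3])                          # loadings at one time point
G = reconstruct(B, c_t)
```

With nonnegative loadings the reconstruction is symmetric positive semidefinite, as a correlation-like matrix should be.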

A Deep-Generative Hybrid Model to Integrate Multimodal and Dynamic Connectivity for Predicting Spectrum-Level Deficits in Autism

1 code implementation3 Jul 2020 Niharika Shimona D'Souza, Mary Beth Nebel, Deana Crocetti, Nicholas Wymbs, Joshua Robinson, Stewart Mostofsky, Archana Venkataraman

The generative part of our framework is a structurally-regularized Dynamic Dictionary Learning (sr-DDL) model that decomposes the dynamic rs-fMRI correlation matrices into a collection of shared basis networks and time varying patient-specific loadings.

Dictionary Learning

Debiased Contrastive Learning

1 code implementation NeurIPS 2020 Ching-Yao Chuang, Joshua Robinson, Lin Yen-Chen, Antonio Torralba, Stefanie Jegelka

A prominent technique for self-supervised representation learning has been to contrast semantically similar and dissimilar pairs of samples.

Contrastive Learning, Generalization Bounds +2
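The contrast between similar and dissimilar pairs is typically instantiated as an InfoNCE-style loss, which the debiasing correction then modifies. A minimal sketch of the base loss only, with toy vectors (the debiased estimator itself is omitted):

```python
import numpy as np

# Sketch of the standard contrastive (InfoNCE-style) loss: pull the anchor
# toward its positive and push it away from negatives via a softmax over
# cosine similarities. Temperature tau and the vectors are toy choices.

def info_nce(anchor, positive, negatives, tau=0.5):
    """-log( e^{s(a,p)/tau} / (e^{s(a,p)/tau} + sum_n e^{s(a,n)/tau}) )."""
    def sim(u, v):
        return float(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    pos = np.exp(sim(anchor, positive) / tau)
    neg = sum(np.exp(sim(anchor, n) / tau) for n in negatives)
    return -np.log(pos / (pos + neg))

a = np.array([1.0, 0.0])
loss = info_nce(a, np.array([0.9, 0.1]), [np.array([-1.0, 0.2])])
```

The debiasing issue arises because, without labels, "negatives" sampled at random can share the anchor's latent class; the paper's estimator corrects for that false-negative contamination.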

Strength from Weakness: Fast Learning Using Weak Supervision

no code implementations ICML 2020 Joshua Robinson, Stefanie Jegelka, Suvrit Sra

Our theoretical results are reflected empirically across a range of tasks and illustrate how weak labels speed up learning on the strong task.

Weakly-supervised Learning

Perceptual Regularization: Visualizing and Learning Generalizable Representations

no code implementations25 Sep 2019 Hongzhou Lin, Joshua Robinson, Stefanie Jegelka

We propose a technique termed perceptual regularization that enables both visualization of the latent representation and control over the generality of the learned representation.

Flexible Modeling of Diversity with Strongly Log-Concave Distributions

1 code implementation NeurIPS 2019 Joshua Robinson, Suvrit Sra, Stefanie Jegelka

We propose SLC as the right extension of SR that enables easier, more intuitive control over diversity, illustrating this via examples of practical importance.
