Search Results for author: Jonathan Richard Schwarz

Found 7 papers, 2 papers with code

Empowering Biomedical Discovery with AI Agents

no code implementations • 3 Apr 2024 • ShangHua Gao, Ada Fang, Yepeng Huang, Valentina Giunchiglia, Ayush Noori, Jonathan Richard Schwarz, Yasha Ektefaie, Jovana Kondic, Marinka Zitnik

We envision 'AI scientists' as systems capable of skeptical learning and reasoning that empower biomedical research through collaborative agents that integrate machine learning tools with experimental platforms.

Continual Learning • Navigate

Unleashing the Power of Meta-tuning for Few-shot Generalization Through Sparse Interpolated Experts

1 code implementation • 13 Mar 2024 • Shengzhuang Chen, Jihoon Tack, Yunqiao Yang, Yee Whye Teh, Jonathan Richard Schwarz, Ying WEI

Conventional wisdom suggests parameter-efficient fine-tuning of foundation models as the state-of-the-art method for transfer learning in vision, replacing the rich literature of alternatives such as meta-learning.

Domain Generalization • Few-Shot Image Classification • +2

Online Adaptation of Language Models with a Memory of Amortized Contexts

1 code implementation • 7 Mar 2024 • Jihoon Tack, Jaehyung Kim, Eric Mitchell, Jinwoo Shin, Yee Whye Teh, Jonathan Richard Schwarz

We propose an amortized feature extraction and memory-augmentation approach to compress and extract information from new documents into compact modulations stored in a memory bank.

Language Modelling • Meta-Learning
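The snippet above describes the mechanism only at a high level. The sketch below is a loose, hypothetical illustration of that idea, not the released implementation (the actual code is linked from the paper entry): an encoder amortizes each new document into a compact modulation vector, a memory bank stores these vectors, and retrieved modulations would condition a frozen language model. All names (ContextEncoder, MemoryBank, d_mod) and design details are assumptions made for illustration.

```python
# Illustrative sketch only -- not the paper's implementation.
# Assumed setup: documents arrive as token embeddings; an amortization encoder
# maps each document to one compact modulation vector, stored in a memory bank
# and later retrieved to condition a frozen language model.
import torch
import torch.nn as nn


class ContextEncoder(nn.Module):
    """Compress a document (sequence of token embeddings) into one modulation vector."""

    def __init__(self, d_model: int, d_mod: int):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(d_model, d_mod), nn.GELU(), nn.Linear(d_mod, d_mod))

    def forward(self, doc_emb: torch.Tensor) -> torch.Tensor:
        # doc_emb: (seq_len, d_model) -> mean-pool over tokens, project to (d_mod,)
        return self.proj(doc_emb.mean(dim=0))


class MemoryBank:
    """Store modulations of past documents; retrieve and aggregate the most similar ones."""

    def __init__(self):
        self.mods = []

    def add(self, mod: torch.Tensor) -> None:
        self.mods.append(mod.detach())

    def retrieve(self, query: torch.Tensor, k: int = 4) -> torch.Tensor:
        bank = torch.stack(self.mods)                                   # (N, d_mod)
        scores = bank @ query / (bank.norm(dim=1) * query.norm() + 1e-8)  # cosine similarity
        topk = scores.topk(min(k, len(self.mods))).indices
        return bank[topk].mean(dim=0)                                   # aggregated modulation


# Toy usage: encode two "documents", then build a conditioning vector for a query.
d_model, d_mod = 32, 16
encoder, memory = ContextEncoder(d_model, d_mod), MemoryBank()
for _ in range(2):
    memory.add(encoder(torch.randn(10, d_model)))    # new document -> compact modulation
query_mod = encoder(torch.randn(10, d_model))
conditioning = memory.retrieve(query_mod)            # would be fed to a frozen LM
print(conditioning.shape)                            # torch.Size([16])
```

The general appeal of amortization here is that adapting to a new document costs a single encoder forward pass rather than gradient-based fine-tuning of the language model.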

C3: High-performance and low-complexity neural compression from a single image or video

no code implementations • 5 Dec 2023 • Hyunjik Kim, Matthias Bauer, Lucas Theis, Jonathan Richard Schwarz, Emilien Dupont

On the UVG video benchmark, we match the rate-distortion (RD) performance of the Video Compression Transformer (Mentzer et al.), a well-established neural video codec, with less than 5k MACs/pixel for decoding.

Video Compression

Spatial Functa: Scaling Functa to ImageNet Classification and Generation

no code implementations • 6 Feb 2023 • Matthias Bauer, Emilien Dupont, Andy Brock, Dan Rosenbaum, Jonathan Richard Schwarz, Hyunjik Kim

Neural fields, also known as implicit neural representations, have emerged as a powerful means to represent complex signals of various modalities.

Classification • Image Generation
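For context on the snippet above: a neural field (implicit neural representation) is simply a network that maps coordinates to signal values, so a single image becomes a function from (x, y) to RGB learned by regression. The example below is a generic minimal neural field, not the Spatial Functa architecture; the network size, activation, and training settings are arbitrary choices made only for illustration.

```python
# Minimal, generic neural-field example: an MLP mapping 2D coordinates to RGB.
# Illustrates the concept only; this is not the Spatial Functa model.
import torch
import torch.nn as nn


class NeuralField(nn.Module):
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),            # (x, y) -> (r, g, b)
        )

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        return self.net(coords)


# Fit the field to a single 32x32 target image by regressing pixel values at coordinates.
target = torch.rand(32 * 32, 3)                                  # stand-in for a real image
ys, xs = torch.meshgrid(torch.linspace(-1, 1, 32), torch.linspace(-1, 1, 32), indexing="ij")
coords = torch.stack([xs.flatten(), ys.flatten()], dim=-1)       # (1024, 2) pixel coordinates

field = NeuralField()
opt = torch.optim.Adam(field.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(field(coords), target)
    loss.backward()
    opt.step()
# After fitting, the image is represented entirely by the network's weights.
```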

Modality-Agnostic Variational Compression of Implicit Neural Representations

no code implementations • 23 Jan 2023 • Jonathan Richard Schwarz, Jihoon Tack, Yee Whye Teh, Jaeho Lee, Jinwoo Shin

We introduce a modality-agnostic neural compression algorithm based on a functional view of data and parameterised as an Implicit Neural Representation (INR).

Data Compression
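Continuing the functional view from the entries above: once a signal is represented by a fitted INR, compressing the signal amounts to storing the network's parameters compactly. The sketch below uses naive 8-bit uniform quantization purely to make that view concrete; it is not the paper's variational compression scheme, and all function names are hypothetical.

```python
# Naive illustration of INR-based compression: the "code" for a signal is the
# parameter vector of the network fitted to it, here crudely quantized to 8 bits.
# This is NOT the paper's variational scheme -- only the functional view of data.
import torch


def quantize_params(model: torch.nn.Module, n_bits: int = 8):
    """Uniformly quantize all parameters; return integer codes plus dequantization info."""
    flat = torch.cat([p.detach().flatten() for p in model.parameters()])
    lo, hi = flat.min(), flat.max()
    levels = 2 ** n_bits - 1
    codes = torch.round((flat - lo) / (hi - lo + 1e-12) * levels).to(torch.uint8)
    return codes, (lo, hi, levels)


def dequantize(codes: torch.Tensor, info) -> torch.Tensor:
    lo, hi, levels = info
    return codes.float() / levels * (hi - lo) + lo


# Toy usage with a small stand-in network (e.g. a fitted neural field):
model = torch.nn.Sequential(torch.nn.Linear(2, 16), torch.nn.ReLU(), torch.nn.Linear(16, 3))
codes, info = quantize_params(model)
recon_weights = dequantize(codes, info)
print(codes.numel(), "bytes at 8 bits per weight")   # size of the compressed representation
```

Because only the coordinate and output dimensionalities change across images, audio, or video, the same recipe applies to any signal type, which is one sense in which such approaches can be called modality-agnostic.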

Meta-Learning Sparse Compression Networks

no code implementations • 18 May 2022 • Jonathan Richard Schwarz, Yee Whye Teh

Recent work in Deep Learning has re-imagined the representation of data as functions mapping from a coordinate space to an underlying continuous signal.

Meta-Learning
