no code implementations • 30 Oct 2023 • Surya Narayanan Hari, Matt Thomson
Over a thousand multi-purpose LLMs currently exist, capable of performing real-world tasks including Q&A, text summarization, and content generation.
1 code implementation • 2 Oct 2023 • Aman Bhargava, Cameron Witkowski, Manav Shah, Matt Thomson
We investigate the reachable set of output token sequences $R_y(\mathbf x_0)$: the set of outputs $\mathbf y$ for which there exists a control input sequence $\mathbf u$ that steers the LLM to output $\mathbf y$ from the initial state sequence $\mathbf x_0$.
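The reachability question can be made concrete with a brute-force sketch. Everything below is an illustrative assumption: a deterministic mock next-token function over a three-token vocabulary stands in for a real LLM, and control sequences are enumerated up to a small length bound; this is not the paper's analysis, only the shape of the computation.

```python
from itertools import product

# Tiny vocabulary and a deterministic stand-in for an LLM's next-token
# map -- purely hypothetical, chosen so the example runs instantly.
VOCAB = ["a", "b", "c"]

def mock_llm(tokens):
    """Deterministically emit one output token from an input sequence."""
    # Hypothetical rule: output depends on counts of "a" and "b" tokens.
    return VOCAB[(tokens.count("a") + 2 * tokens.count("b")) % len(VOCAB)]

def reachable_set(x0, max_control_len=3):
    """Enumerate R_y(x0): outputs y for which some control sequence u
    of length <= max_control_len steers the model to y from state x0."""
    reachable = set()
    for k in range(max_control_len + 1):
        for u in product(VOCAB, repeat=k):
            # Control tokens u are prepended to the initial state x0.
            reachable.add(mock_llm(list(u) + list(x0)))
    return reachable

print(sorted(reachable_set(["a", "b"])))  # which outputs are steerable?
```

With a real LLM the enumeration is intractable, which is why the paper treats reachability analytically rather than by brute force.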
no code implementations • 22 Aug 2023 • Surya Narayanan Hari, Matt Thomson
Here, we propose a context-aware routing system, Tryage, that leverages a language model router for optimal selection of expert models from a model library based on analysis of individual input prompts.
no code implementations • 23 Aug 2022 • Jialong Jiang, Sisi Chen, Tiffany Tsou, Christopher S. McGinnis, Tahmineh Khazaei, Qin Zhu, Jong H. Park, Paul Rivaud, Inna-Marie Strazhnik, Eric D. Chow, David A. Sivak, Zev J. Gartner, Matt Thomson
We develop a unified mathematical model that quantitatively describes the transcriptome scale response of myeloid and lymphoid cell types to individual drugs and drug combinations through a single inferred regulatory network.
1 code implementation • 30 Apr 2022 • Guruprasad Raghavan, Bahey Tharwat, Surya Narayanan Hari, Dhruvil Satani, Matt Thomson
We conceptualize the weight space of a neural network as a curved Riemannian manifold equipped with a metric tensor whose spectrum defines low rank subspaces in weight space that accommodate network adaptation without loss of prior knowledge.
no code implementations • NeurIPS Workshop AI4Science 2021 • Enrique Amaya, Shahriar Shadkhoo, Dominik Schildknecht, Matt Thomson
ML approaches are relevant in active matter, a field spanning many scales that studies the dynamics of far-from-equilibrium systems, where predicting macroscopic behavior from the microscopic interactions of active particles remains a significant challenge.
no code implementations • NeurIPS Workshop AI4Science 2021 • Pranav Bhamidipati, Guruprasad Raghavan, Matt Thomson
Biological tissues reliably grow into precise, functional structures from simple starting states during development.
no code implementations • 16 Jul 2021 • Marco Santorelli, Pranav Bhamidipati, Andriu Kavanagh, Victoria Fitts, Trusha Sondkar, Matt Thomson, Leonardo Morsut
Here, we exploit physical-chemical coupling observed in a synthetic patterning circuit in order to control the size and spatial distribution of patterned synthetic cell sheets.
no code implementations • 2 Jul 2021 • Zitong Jerry Wang, Matt Thomson
By viewing cell surface receptors as a sensor network, we develop an information theoretic framework for computing the optimal spatial organization of a sensing system for a given spatial signaling environment.
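As a rough illustration of casting receptor placement as an information-maximization problem, the toy model below places two receptors on a 1D grid to maximize the mutual information between their binary readings and an unknown signal-source location. The grid, the distance-dependent binding probability, and the uniform prior are all assumptions made for this sketch, not the paper's framework.

```python
import math
from itertools import combinations, product

GRID = range(5)  # candidate sites for both the source and the receptors

def p_fire(receptor, source):
    """Hypothetical binding probability, decaying with distance."""
    return math.exp(-abs(receptor - source))

def mutual_information(placement):
    """I(source; binary readings) for receptors at the given sites,
    under a uniform prior over source locations; readings are
    conditionally independent given the source."""
    prior = 1.0 / len(GRID)
    mi = 0.0
    for readings in product([0, 1], repeat=len(placement)):
        # Likelihood of this reading vector under each source location.
        liks = []
        for s in GRID:
            lik = 1.0
            for r, pos in zip(readings, placement):
                p = p_fire(pos, s)
                lik *= p if r else (1.0 - p)
            liks.append(lik)
        p_read = prior * sum(liks)
        if p_read == 0.0:
            continue
        for lik in liks:
            if lik > 0.0:
                mi += prior * lik * math.log2(lik / p_read)
    return mi

# Exhaustively score every two-receptor placement and keep the best.
best = max(combinations(GRID, 2), key=mutual_information)
print(best, round(mutual_information(best), 3))
```

The optimization over placements is the discrete analogue of the paper's question of how a cell should spatially organize its sensing resources for a given signaling environment.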
no code implementations • 15 Jun 2021 • Xiaoqiao Chen, Sisi Chen, Matt Thomson
Here, we introduce an active learning method (ActiveSVM) that discovers minimal but highly informative gene sets, enabling the identification of cell types, physiological states, and genetic perturbations in single-cell data using a small number of genes.
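A crude sketch of the minimal-gene-set idea: greedy forward selection with a nearest-centroid classifier on synthetic data. The data, the classifier, and the selection loop are all stand-ins chosen for brevity; ActiveSVM itself uses support-vector machines with active selection of informative cells on real expression matrices.

```python
import random

# Synthetic "single-cell" data: 2 cell types, 6 genes, where only genes
# 0 and 3 are informative.  All names and parameters are illustrative.
random.seed(0)

def make_cell(label):
    expr = [random.gauss(0, 1) for _ in range(6)]
    expr[0] += 3 * label          # informative gene
    expr[3] -= 3 * label          # informative gene
    return expr, label

cells = [make_cell(label) for label in (0, 1) for _ in range(50)]

def accuracy(gene_set):
    """Nearest-centroid training-set accuracy using only gene_set."""
    def proj(expr):
        return [expr[g] for g in gene_set]
    cents = {}
    for lab in (0, 1):
        pts = [proj(e) for e, l in cells if l == lab]
        cents[lab] = [sum(col) / len(pts) for col in zip(*pts)]
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    hits = sum(
        min(cents, key=lambda lab: dist2(proj(e), cents[lab])) == l
        for e, l in cells)
    return hits / len(cells)

# Greedy forward selection: repeatedly add the gene that most improves
# classification -- a crude stand-in for the active-learning loop.
selected = []
for _ in range(2):
    best_gene = max((g for g in range(6) if g not in selected),
                    key=lambda g: accuracy(selected + [g]))
    selected.append(best_gene)
print(selected)
```

On this synthetic data the greedy loop should recover the informative genes, mirroring the goal of classifying cells from a small gene panel.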
no code implementations • 5 Jun 2021 • Guruprasad Raghavan, Matt Thomson
Broadly, we introduce a geometric framework that unifies a range of machine learning objectives and that can be applied to multiple classes of neural network architectures.
no code implementations • 28 May 2021 • Dominik Schildknecht, Anastasia N. Popova, Jack Stellwagen, Matt Thomson
The control of far-from-equilibrium physical systems, including active materials, has emerged as an important area for the application of reinforcement learning (RL) strategies to derive control policies.
no code implementations • 21 Jan 2021 • Zijie Qu, Jialong Jiang, Heun Jin Lee, Rob Phillips, Shahriar Shadkhoo, Matt Thomson
Studying a large set of shapes, we observe that the active networks contract in a shape-preserving manner throughout the course of contraction.
Soft Condensed Matter
no code implementations • NeurIPS Workshop DL-IG 2020 • Guruprasad Raghavan, Matt Thomson
The geometry of weight spaces and functional manifolds of neural networks plays an important role in understanding the intricacies of machine learning.
no code implementations • 12 Jun 2020 • Guruprasad Raghavan, Cong Lin, Matt Thomson
Inspired by this strategy, we attempt to efficiently self-organize large neural networks with an arbitrary number of layers into a wide variety of architectures.
no code implementations • 23 May 2020 • Guruprasad Raghavan, Jiayi Li, Matt Thomson
Biological neural networks have evolved to maintain performance despite significant circuit damage.
2 code implementations • NeurIPS 2019 • Guruprasad Raghavan, Matt Thomson
The algorithm is adaptable to a wide range of input-layer geometries and robust to malfunctioning units in the first layer, and so can be used to successfully grow and self-organize pooling architectures of different pool sizes and shapes.
1 code implementation • 25 Mar 2019 • Jialong Jiang, David A. Sivak, Matt Thomson
We apply the framework to the inference of spin network models and find that designed perturbations can reduce the sampling complexity by $10^6$-fold across a variety of network architectures.
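The intuition that designed perturbations cut sampling complexity can be seen in a minimal two-spin Ising example, where the Fisher information about the coupling $J$ equals $\mathrm{Var}(s_1 s_2)$: a strongly coupled pair is nearly frozen in the aligned state, so each sample carries little information, while opposing applied fields repopulate the anti-aligned states and raise the information per sample. The two-spin setup and the specific field values are illustrative assumptions, not the paper's networks or perturbation designs.

```python
import math
from itertools import product

def fisher_info_J(J, h1=0.0, h2=0.0):
    """Exact Fisher information about the coupling J of a two-spin
    Ising model p(s) ~ exp(J*s1*s2 + h1*s1 + h2*s2); for this
    exponential family it equals Var(s1*s2)."""
    states = list(product([-1, 1], repeat=2))
    weights = [math.exp(J * s1 * s2 + h1 * s1 + h2 * s2)
               for s1, s2 in states]
    Z = sum(weights)
    probs = [w / Z for w in weights]
    mean = sum(p * s1 * s2 for p, (s1, s2) in zip(probs, states))
    return sum(p * (s1 * s2 - mean) ** 2
               for p, (s1, s2) in zip(probs, states))

# Opposing designed fields (h1 = -h2) unfreeze the strongly coupled pair
# and raise the Fisher information; by the Cramer-Rao bound the estimator
# variance scales as 1/(n * I(J)), so fewer samples are needed.
frozen = fisher_info_J(J=2.0)
perturbed = fisher_info_J(J=2.0, h1=2.0, h2=-2.0)
print(f"unperturbed I(J) = {frozen:.3f}, perturbed I(J) = {perturbed:.3f}")
```

The paper's $10^6$-fold reductions arise in much larger networks, where the gap between the frozen and perturbed regimes compounds across many couplings.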