Search Results for author: Matthew A. Wright

Found 6 papers, 2 papers with code

Representing Long-Range Context for Graph Neural Networks with Global Attention

1 code implementation • NeurIPS 2021 • Zhanghao Wu, Paras Jain, Matthew A. Wright, Azalia Mirhoseini, Joseph E. Gonzalez, Ion Stoica

Inspired by recent computer vision results that find position-invariant attention performant in learning long-range relationships, our method, which we call GraphTrans, applies a permutation-invariant Transformer module after a standard GNN module.
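The described architecture — a local message-passing GNN followed by a permutation-invariant Transformer-style attention module and a pooled readout — can be sketched in plain NumPy. This is a minimal illustration of the idea, not the paper's implementation; the single-head attention, mean aggregation, and mean pooling are simplifying assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gnn_layer(H, A):
    # local message passing: mean over neighbors (with self-loop), then ReLU
    A_hat = A + np.eye(A.shape[0])
    deg = A_hat.sum(axis=1, keepdims=True)
    return np.maximum((A_hat / deg) @ H, 0.0)

def self_attention(H):
    # single-head dot-product attention over all node pairs;
    # ignores graph structure, so it can capture long-range relationships
    d = H.shape[1]
    weights = softmax(H @ H.T / np.sqrt(d))
    return weights @ H

def graphtrans_sketch(H, A):
    # H: (num_nodes, feature_dim) node features, A: (n, n) adjacency matrix
    local = gnn_layer(H, A)            # structure-aware local features
    global_ = self_attention(local)    # permutation-equivariant global mixing
    return global_.mean(axis=0)        # permutation-invariant graph embedding
```

Because both stages are permutation-equivariant and the final pooling is order-agnostic, relabeling the nodes (and permuting the adjacency matrix consistently) yields the same graph embedding.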

Graph Classification Graph Embedding

Transformers are Deep Infinite-Dimensional Non-Mercer Binary Kernel Machines

no code implementations • 2 Jun 2021 • Matthew A. Wright, Joseph E. Gonzalez

In particular, we show that the "dot-product attention" that is the core of the Transformer's operation can be characterized as a kernel learning method on a pair of Banach spaces.
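For context, the standard scaled dot-product attention the abstract refers to is (this is the well-known definition, not the paper's Banach-space construction):

```latex
\mathrm{Attn}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d}}\right) V,
\qquad
\left(Q K^{\top}\right)_{ij} = q_i^{\top} k_j = \kappa(q_i, k_j),
```

where each pre-softmax score can be read as evaluating a kernel $\kappa$ on a query–key pair; since queries and keys come from different learned projections, $\kappa$ is in general asymmetric and indefinite, i.e. non-Mercer.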

Deep Attention

Contingencies from Observations: Tractable Contingency Planning with Learned Behavior Models

1 code implementation • 21 Apr 2021 • Nicholas Rhinehart, Jeff He, Charles Packer, Matthew A. Wright, Rowan Mcallister, Joseph E. Gonzalez, Sergey Levine

Humans have a remarkable ability to make decisions by accurately reasoning about future events, including the future behaviors and states of mind of other agents.

Attentional Policies for Cross-Context Multi-Agent Reinforcement Learning

no code implementations • 31 May 2019 • Matthew A. Wright, Roberto Horowitz

Many potential applications of reinforcement learning in the real world involve interacting with other agents whose numbers vary over time.

Multi-agent Reinforcement Learning reinforcement-learning +1

Neural-Attention-Based Deep Learning Architectures for Modeling Traffic Dynamics on Lane Graphs

no code implementations • 18 Apr 2019 • Matthew A. Wright, Simon F. G. Ehlers, Roberto Horowitz

Deep neural networks can be powerful tools, but require careful application-specific design to ensure that the most informative relationships in the data are learnable.
