1 code implementation • 1 Jun 2023 • Paul Soulos, Edward Hu, Kate McCurdy, Yunmo Chen, Roland Fernandez, Paul Smolensky, Jianfeng Gao
To facilitate the learning of these symbolic sequences, we introduce a differentiable tree interpreter that compiles high-level symbolic tree operations into subsymbolic matrix operations on tensors.
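For intuition, here is a minimal sketch of how a symbolic tree operation can be "compiled" into a matrix acting on a Tensor Product Representation; the dimensions, symbols, and orthonormal role vectors below are illustrative assumptions, not the paper's actual interpreter.

```python
import numpy as np

rng = np.random.default_rng(0)

d_f, d_r = 4, 2                      # filler and role dimensions (illustrative)
fillers = {s: rng.normal(size=d_f) for s in ["A", "B"]}
r_left, r_right = np.eye(d_r)        # orthonormal role vectors for left/right child

# Bind fillers to roles with outer products and sum: a depth-1 tree (A B).
T = np.outer(fillers["A"], r_left) + np.outer(fillers["B"], r_right)

# "Compile" the symbolic operation extract-left-child into a single linear map
# acting on the flattened tree vector: vec(T) -> T @ r_left.
E_left = np.kron(np.eye(d_f), r_left)          # shape (d_f, d_f * d_r)
left_child = E_left @ T.reshape(-1)

assert np.allclose(left_child, fillers["A"])   # the symbolic result, recovered exactly
```

Composing such fixed matrices yields differentiable analogues of more complex tree operations.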
1 code implementation • 21 Dec 2022 • Najoung Kim, Tal Linzen, Paul Smolensky
Human linguistic capacity is often characterized by compositionality and the generalization it enables -- human learners can produce and comprehend novel complex expressions by composing known parts.
no code implementations • MTSummit 2021 • Paul Soulos, Sudha Rao, Caitlin Smith, Eric Rosen, Asli Celikyilmaz, R. Thomas McCoy, Yichen Jiang, Coleman Haley, Roland Fernandez, Hamid Palangi, Jianfeng Gao, Paul Smolensky
Machine translation has seen rapid progress with the advent of Transformer-based models.
no code implementations • 2 May 2022 • Paul Smolensky, R. Thomas McCoy, Roland Fernandez, Matthew Goldrick, Jianfeng Gao
What explains the dramatic progress from 20th-century to 21st-century AI, and how can the remaining limitations of current AI be overcome?
no code implementations • 18 Nov 2021 • R. Thomas McCoy, Paul Smolensky, Tal Linzen, Jianfeng Gao, Asli Celikyilmaz
We apply these analyses to four neural language models (an LSTM, a Transformer, Transformer-XL, and GPT-2).
1 code implementation • 24 Oct 2021 • Matthias Lalisse, Eric Rosen, Paul Smolensky
We present Harmonic Memory Networks (HMem), a neural architecture for knowledge base completion that models entities as weighted sums of pairwise bindings between an entity's neighbors and corresponding relations.
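A toy sketch of that representational idea, with made-up embeddings, dimensions, and uniform weights standing in for whatever the trained model learns:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 8  # embedding dimension (illustrative)

# An entity's graph neighborhood: (relation, neighbor) pairs.
neighborhood = [("born_in", "Hawaii"), ("occupation", "politician")]
rel_emb = {r: rng.normal(size=d) for r, _ in neighborhood}
ent_emb = {e: rng.normal(size=d) for _, e in neighborhood}

# Attention-style weights over the neighborhood (uniform here for illustration).
weights = np.full(len(neighborhood), 1.0 / len(neighborhood))

# The entity is represented as a weighted sum of pairwise (relation, neighbor)
# bindings, each binding realized as an outer product.
entity_matrix = sum(
    w * np.outer(rel_emb[r], ent_emb[e])
    for w, (r, e) in zip(weights, neighborhood)
)
print(entity_matrix.shape)  # (d, d): a second-order tensor summarizing the neighborhood
```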
no code implementations • 24 Oct 2021 • Matthias Lalisse, Paul Smolensky
A framework and method are proposed for the study of constituent composition in fMRI.
1 code implementation • NAACL 2021 • Yichen Jiang, Asli Celikyilmaz, Paul Smolensky, Paul Soulos, Sudha Rao, Hamid Palangi, Roland Fernandez, Caitlin Smith, Mohit Bansal, Jianfeng Gao
On several syntactic and semantic probing tasks, we demonstrate the emergent structural information in the role vectors and improved syntactic interpretability in the TPR layer outputs.
1 code implementation • 19 May 2021 • Jacob Russin, Roland Fernandez, Hamid Palangi, Eric Rosen, Nebojsa Jojic, Paul Smolensky, Jianfeng Gao
A longstanding question in cognitive science concerns the learning mechanisms underlying compositionality in human cognition.
no code implementations • COLING 2020 • Coleman Haley, Paul Smolensky
We present a novel method for embedding trees in a vector space based on Tensor-Product Representations (TPRs) which allows for inversion: the retrieval of the original tree structure and nodes from the vectorial embedding.
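The inversion property is easiest to see with orthonormal role vectors, where unbinding by each role recovers the corresponding node exactly; the labels, sizes, and zero-padding of shallow roles below are illustrative simplifications, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(2)
d_f = 5
fillers = {s: rng.normal(size=d_f) for s in "SNV"}   # node labels (illustrative)
r = np.eye(2)                                        # r[0] = left, r[1] = right

# Tree with S at the left child and (N V) under the right child; nested
# positions get tensor-product roles, e.g. right-then-left = kron(r[1], r[0]).
roles = {
    "left":        r[0],
    "right.left":  np.kron(r[1], r[0]),
    "right.right": np.kron(r[1], r[1]),
}
nodes = {"left": "S", "right.left": "N", "right.right": "V"}

# Embed: pad shallow roles to a common length (they stay orthonormal here),
# then sum the filler-role outer products into a single matrix.
dim_r = max(v.size for v in roles.values())
pad = lambda v: np.pad(v, (0, dim_r - v.size))
T = sum(np.outer(fillers[nodes[p]], pad(roles[p])) for p in nodes)

# Inversion: unbinding with each role vector retrieves that node's label exactly.
for p, label in nodes.items():
    assert np.allclose(T @ pad(roles[p]), fillers[label])
```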
1 code implementation • 18 Nov 2020 • Hassan Akbari, Hamid Palangi, Jianwei Yang, Sudha Rao, Asli Celikyilmaz, Roland Fernandez, Paul Smolensky, Jianfeng Gao, Shih-Fu Chang
In this paper, we propose a new model architecture for learning multi-modal neuro-symbolic representations for video captioning.
1 code implementation • 29 Jun 2020 • R. Thomas McCoy, Erin Grant, Paul Smolensky, Thomas L. Griffiths, Tal Linzen
To facilitate computational modeling aimed at addressing this question, we introduce a framework for giving particular linguistic inductive biases to a neural network model; such a model can then be used to empirically explore the effects of those inductive biases.
1 code implementation • 25 Oct 2019 • Mehrad Moradshahi, Hamid Palangi, Monica S. Lam, Paul Smolensky, Jianfeng Gao
We introduce HUBERT which combines the structured-representational power of Tensor-Product Representations (TPRs) and BERT, a pre-trained bidirectional Transformer language model.
2 code implementations • EMNLP (BlackboxNLP) 2020 • Paul Soulos, Tom McCoy, Tal Linzen, Paul Smolensky
How can neural networks perform so well on compositional tasks even though they lack explicit compositional representations?
3 code implementations • 15 Oct 2019 • Imanol Schlag, Paul Smolensky, Roland Fernandez, Nebojsa Jojic, Jürgen Schmidhuber, Jianfeng Gao
We incorporate Tensor-Product Representations within the Transformer in order to better support the explicit representation of relation structure.
Ranked #1 on Question Answering on Mathematics Dataset
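As a hedged illustration of what adding TPR structure to attention can look like, the sketch below binds each head's attention output (a filler) to a learned role vector before mixing the heads. This is a simplified stand-in, not necessarily the exact TP-Transformer mechanism; all module names and sizes are invented.

```python
import torch
import torch.nn as nn

class RoleBoundSelfAttention(nn.Module):
    """Hypothetical sketch: multi-head self-attention whose per-head outputs
    (fillers) are bound to learned role vectors before being recombined."""

    def __init__(self, d_model=64, n_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        # One learned role vector per head (an illustrative binding scheme).
        self.roles = nn.Parameter(torch.randn(n_heads, self.d_head))
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x):                      # x: (batch, seq, d_model)
        values, _ = self.attn(x, x, x)         # fillers from ordinary attention
        b, s, _ = values.shape
        heads = values.view(b, s, self.n_heads, self.d_head)
        bound = heads * self.roles             # elementwise filler-role binding
        return self.out(bound.reshape(b, s, -1))

x = torch.randn(2, 10, 64)
print(RoleBoundSelfAttention()(x).shape)       # torch.Size([2, 10, 64])
```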
2 code implementations • ICML 2020 • Kezhen Chen, Qiuyuan Huang, Hamid Palangi, Paul Smolensky, Kenneth D. Forbus, Jianfeng Gao
The encoder of TP-N2F employs TPR "binding" to encode natural-language symbolic structure in vector space and the decoder uses TPR "unbinding" to generate, in symbolic space, a sequential program represented by relational tuples, each consisting of a relation (or operation) and a number of arguments.
no code implementations • 25 Sep 2019 • Kezhen Chen, Qiuyuan Huang, Hamid Palangi, Paul Smolensky, Kenneth D. Forbus, Jianfeng Gao
Generating formal language represented by relational tuples, such as Lisp programs or mathematical expressions, from a natural-language input is an extremely challenging task because it requires explicitly capturing discrete symbolic structural information from the input in order to generate the output.
1 code implementation • ICLR 2019 • R. Thomas McCoy, Tal Linzen, Ewan Dunbar, Paul Smolensky
Recurrent neural networks (RNNs) can learn continuous vector representations of symbolic structures such as sequences and sentences; these representations often exhibit linear regularities (analogies).
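The linear regularities have a simple reading under a TPR hypothesis: if a sequence is encoded as a sum of symbol-position bindings, replacing one symbol shifts the encoding by a constant offset, which is exactly analogy arithmetic. A toy check with invented symbols and dimensions:

```python
import numpy as np

rng = np.random.default_rng(3)
d = 6
sym = {s: rng.normal(size=d) for s in "abcd"}
pos = np.eye(3)                                      # orthonormal positional roles

def encode(seq):
    """Sum of symbol (filler) x position (role) outer products, flattened."""
    return sum(np.outer(sym[s], pos[i]) for i, s in enumerate(seq)).reshape(-1)

# Analogy: "abc" is to "dbc" as "abd" is to "dbd" (swap the first symbol a -> d).
lhs = encode("abc") - encode("dbc")
rhs = encode("abd") - encode("dbd")
print(np.allclose(lhs, rhs))                         # True: a constant offset vector
```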
no code implementations • 20 Dec 2018 • R. Thomas McCoy, Tal Linzen, Ewan Dunbar, Paul Smolensky
Recurrent neural networks (RNNs) can learn continuous vector representations of symbolic structures such as sequences and sentences; these representations often exhibit linear regularities (analogies).
no code implementations • 2 Nov 2018 • Matthias Lalisse, Paul Smolensky
Neural models of Knowledge Base data have typically employed compositional representations of graph objects: entity and relation embeddings are systematically combined to evaluate the truth of a candidate Knowledge Base entry.
Ranked #1 on Knowledge Graphs on FB15k
1 code implementation • 29 Oct 2018 • Shuai Tang, Paul Smolensky, Virginia R. de Sa
Widely used recurrent units, including Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU), perform well on natural language tasks, but their ability to learn structured representations is still questionable.
no code implementations • 20 Sep 2018 • Najoung Kim, Kyle Rawlins, Benjamin Van Durme, Paul Smolensky
Distinguishing between arguments and adjuncts of a verb is a longstanding, nontrivial problem.
no code implementations • 10 Mar 2018 • Roland Fernandez, Asli Celikyilmaz, Rishabh Singh, Paul Smolensky
We present a formal language with expressions denoting general symbol structures and queries which access information in those structures.
no code implementations • 4 Jan 2018 • Paul Tupper, Paul Smolensky, Pyeong Whan Cho
Gradient Symbolic Computation is proposed as a means of solving discrete global optimization problems using a neurally plausible continuous stochastic dynamical system.
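A rough sketch of the general strategy, using a hand-picked Ising-style energy, a quartic quantization term, and an ad hoc annealing schedule; these choices are illustrative assumptions, not the paper's specific dynamical system.

```python
import numpy as np

rng = np.random.default_rng(4)

# Discrete problem: maximize agreement with couplings J over x in {-1, +1}^n.
n = 8
J = rng.normal(size=(n, n))
J = (J + J.T) / 2
np.fill_diagonal(J, 0)

def energy(x, q):
    # Harmony-style objective plus a term that grows to force (near-)discrete states.
    return -0.5 * x @ J @ x + q * np.sum((1 - x**2) ** 2)

x = rng.uniform(-0.1, 0.1, size=n)            # continuous state, relaxed start
for t in range(5000):
    q = 0.001 * t                             # quantization pressure ramps up
    temp = 1.0 / (1 + 0.01 * t)               # noise anneals down
    grad = -J @ x - 4 * q * x * (1 - x**2)    # d(energy)/dx
    x -= 0.01 * grad + np.sqrt(0.02 * temp) * rng.normal(size=n)

print(np.sign(x), energy(np.sign(x), q=0.0))  # a candidate discrete optimum
```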
no code implementations • 29 Oct 2017 • Qiuyuan Huang, Paul Smolensky, Xiaodong He, Li Deng, Dapeng Wu
To address this, the paper promotes image/visual-captioning-based CAPTCHAs, which are robust against machine-learning-based attacks.
2 code implementations • NAACL 2018 • Qiuyuan Huang, Paul Smolensky, Xiaodong He, Li Deng, Dapeng Wu
We present a new approach to the design of deep networks for natural language processing (NLP), based on the general technique of Tensor Product Representations (TPRs) for encoding and processing symbol structures in distributed neural networks.
no code implementations • 23 May 2017 • Hamid Palangi, Paul Smolensky, Xiaodong He, Li Deng
In our application of TPRN, internal representations learned by end-to-end optimization in a deep neural network performing a textual question-answering (QA) task can be interpreted using basic concepts from linguistic theory.
no code implementations • 12 Jan 2016 • Paul Smolensky, Moontae Lee, Xiaodong He, Wen-tau Yih, Jianfeng Gao, Li Deng
In this paper we present the initial development of a general theory for mapping inference in predicate logic to computation over Tensor Product Representations (TPRs; Smolensky (1990), Smolensky & Legendre (2006)).
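A worked miniature of the mapping, with invented predicates, entities, and dimensions: a fact P(a, b) becomes the rank-1 tensor P ⊗ a ⊗ b, a knowledge state sums such tensors, and a query is answered by unbinding (tensor contraction) followed by nearest-neighbor readout.

```python
import numpy as np

rng = np.random.default_rng(5)
d = 50
emb = {s: rng.normal(size=d) / np.sqrt(d)
       for s in ["parent", "sibling", "ann", "bob", "cal"]}

def fact(pred, x, y):
    """Encode the fact pred(x, y) as a rank-1 third-order tensor pred ⊗ x ⊗ y."""
    return np.einsum("i,j,k->ijk", emb[pred], emb[x], emb[y])

# Knowledge state: the superposition (sum) of encoded facts.
K = fact("parent", "ann", "bob") + fact("sibling", "bob", "cal")

# Query parent(ann, ?): unbind the predicate and first-argument axes, then
# read out the nearest entity embedding (retrieval is approximate because
# random embeddings are only approximately orthogonal).
answer = np.einsum("ijk,i,j->k", K, emb["parent"], emb["ann"])
scores = {name: float(answer @ emb[name]) for name in ["ann", "bob", "cal"]}
print(max(scores, key=scores.get))            # expected: bob
```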
no code implementations • 19 Nov 2015 • Moontae Lee, Xiaodong He, Wen-tau Yih, Jianfeng Gao, Li Deng, Paul Smolensky
Question answering tasks have shown remarkable progress with distributed vector representations.