Search Results for author: Michael Witbrock

Found 23 papers, 5 papers with code

Convolutional and Recurrent Neural Networks for Spoken Emotion Recognition

no code implementations ALTA 2020 Aaron Keesing, Ian Watson, Michael Witbrock

We test four models proposed in the speech emotion recognition (SER) literature on 15 public and academically licensed datasets in speaker-independent cross-validation.

Speech Emotion Recognition
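
As a minimal illustration of the speaker-independent protocol mentioned above (a sketch only, not the authors' pipeline; the features, labels, and speaker IDs below are synthetic placeholders), grouped cross-validation keeps every speaker's utterances in exactly one fold:

```python
# Speaker-independent cross-validation sketch: GroupKFold ensures no speaker
# appears in both the training and test folds, so scores reflect performance
# on unseen speakers.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GroupKFold

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))             # toy acoustic features (synthetic)
y = rng.integers(0, 4, size=200)           # 4 emotion classes (synthetic)
speakers = rng.integers(0, 10, size=200)   # speaker ID per utterance

scores = []
for train_idx, test_idx in GroupKFold(n_splits=5).split(X, y, groups=speakers):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores.append(clf.score(X[test_idx], y[test_idx]))
print(f"mean speaker-independent accuracy: {np.mean(scores):.3f}")
```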

Semantic Construction Grammar: Bridging the NL / Logic Divide

no code implementations 10 Dec 2021 Dave Schneider, Michael Witbrock

In this paper, we discuss Semantic Construction Grammar (SCG), a system developed over the past several years to facilitate translation between natural language and logical representations.

Translation

Relating Blindsight and AI: A Review

no code implementations 9 Dec 2021 Joshua Bensemann, Qiming Bao, Gaël Gendron, Tim Hartill, Michael Witbrock

If we assume that artificial networks have no form of visual experience, then deficits caused by blindsight give us insights into the processes occurring within visual experience that we can incorporate into artificial neural networks.

DeepQR: Neural-based Quality Ratings for Learnersourced Multiple-Choice Questions

no code implementations 19 Nov 2021 Lin Ni, Qiming Bao, Xiaoxuan Li, Qianqian Qi, Paul Denny, Jim Warren, Michael Witbrock, Jiamou Liu

We propose DeepQR, a novel neural-network model for automated question quality rating (AQQR) that is trained using multiple-choice-question (MCQ) datasets collected from PeerWise, a widely-used learnersourcing platform.

Contrastive Learning
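
The entry above is tagged with contrastive learning; a heavily simplified, hypothetical sketch of how a contrastive ranking signal could be applied to question-quality scoring follows (the scorer, feature dimensions, and loss are illustrative assumptions, not DeepQR's actual architecture):

```python
# Hypothetical pairwise ranking loss for question quality: given two question
# embeddings where the first received a higher crowd rating, push the model
# to assign it a higher score by at least a margin.
import torch
import torch.nn as nn

scorer = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 1))

def pairwise_quality_loss(feats_hi, feats_lo, margin=0.5):
    s_hi = scorer(feats_hi).squeeze(-1)    # scores for higher-rated questions
    s_lo = scorer(feats_lo).squeeze(-1)    # scores for lower-rated questions
    return torch.relu(margin - (s_hi - s_lo)).mean()

# toy batch: 8 (higher-rated, lower-rated) question-embedding pairs
loss = pairwise_quality_loss(torch.randn(8, 128), torch.randn(8, 128))
loss.backward()
```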

Learning to Guide a Saturation-Based Theorem Prover

no code implementations 7 Jun 2021 Ibrahim Abdelaziz, Maxwell Crouse, Bassem Makni, Vernon Austel, Cristina Cornelio, Shajith Ikbal, Pavan Kapanipathi, Ndivhuwo Makondo, Kavitha Srinivas, Michael Witbrock, Achille Fokoue

In addition, to the best of our knowledge, TRAIL is the first reinforcement learning-based approach to exceed the performance of a state-of-the-art traditional theorem prover on a standard theorem proving benchmark (solving up to 17% more problems).

Automated Theorem Proving

Graph Enhanced Cross-Domain Text-to-SQL Generation

no code implementations WS 2019 Siyu Huo, Tengfei Ma, Jie Chen, Maria Chang, Lingfei Wu, Michael Witbrock

Semantic parsing is a fundamental problem in natural language understanding, as it involves the mapping of natural language to structured forms such as executable queries or logic-like knowledge representations.

Natural Language Understanding · Semantic Parsing +3

A Sequential Set Generation Method for Predicting Set-Valued Outputs

no code implementations 12 Mar 2019 Tian Gao, Jie Chen, Vijil Chenthamarakshan, Michael Witbrock

Though SSG is sequential in nature, it does not penalize the order in which set elements appear, and it can be applied to a variety of set-output problems, such as predicting a set of classification labels or a set of sequences.

General Classification · Multi-Label Classification
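
A toy sketch of an order-invariant training loss in the spirit of the excerpt above (an assumption-laden simplification, not the paper's exact formulation): at each generation step, probability mass on any not-yet-emitted target element counts as correct, so element order is never penalized.

```python
# Order-invariant sequential set loss: credit the probability of emitting ANY
# remaining target label at each step, then retire the likeliest one.
import numpy as np

def order_invariant_loss(step_probs, target_set):
    """step_probs: (T, num_labels) per-step distributions; target_set: label set."""
    remaining = set(target_set)
    loss = 0.0
    for probs in step_probs:
        if not remaining:
            break
        idx = list(remaining)
        mass = probs[idx].sum()          # prob. of emitting any remaining label
        loss -= np.log(mass + 1e-12)
        remaining.discard(idx[int(np.argmax(probs[idx]))])
    return loss

rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(10), size=3)   # 3 steps, 10 candidate labels
print(order_invariant_loss(p, {2, 5, 7}))
```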

From Node Embedding to Graph Embedding: Scalable Global Graph Kernel via Random Features

no code implementations NeurIPS 2018 Lingfei Wu, Ian En-Hsu Yen, Kun Xu, Liang Zhao, Yinglong Xia, Michael Witbrock

Graph kernels are one of the most important methods for graph data analysis and have been successfully applied in diverse applications.

Graph Embedding

Answering Science Exam Questions Using Query Rewriting with Background Knowledge

no code implementations 15 Sep 2018 Ryan Musa, Xiaoyan Wang, Achille Fokoue, Nicholas Mattei, Maria Chang, Pavan Kapanipathi, Bassem Makni, Kartik Talamadupula, Michael Witbrock

Open-domain question answering (QA) is an important problem in AI and NLP that is emerging as a bellwether for progress on the generalizability of AI methods and techniques.

Information Retrieval · Natural Language Inference +1

Random Warping Series: A Random Features Method for Time-Series Embedding

1 code implementation 14 Sep 2018 Lingfei Wu, Ian En-Hsu Yen, Jin-Feng Yi, Fangli Xu, Qi Lei, Michael Witbrock

The proposed kernel does not suffer from the issue of diagonal dominance and naturally admits a Random Features (RF) approximation, which reduces the computational complexity of existing DTW-based techniques from quadratic to linear in both the number and the length of the time series.

Dynamic Time Warping · Time Series
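
The core trick, as a minimal sketch (the distribution the paper samples random series from, and the exact kernel form, are simplified here): embed each time series by its DTW distances to R short random series, so kernel evaluation becomes a linear inner product.

```python
# Random-features-style DTW embedding: phi(x)_r = exp(-gamma * DTW(x, omega_r))
# / sqrt(R); the dot product of two embeddings approximates a DTW-based kernel
# at O(R) distance computations per series instead of all-pairs quadratic cost.
import numpy as np

def dtw(a, b):
    """Plain O(|a|*|b|) dynamic-time-warping distance."""
    D = np.full((len(a) + 1, len(b) + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[-1, -1]

def rws_features(x, random_series, gamma=1.0):
    R = len(random_series)
    return np.array([np.exp(-gamma * dtw(x, w)) for w in random_series]) / np.sqrt(R)

rng = np.random.default_rng(0)
omegas = [rng.normal(size=rng.integers(5, 15)) for _ in range(64)]  # short random series
x, y = rng.normal(size=100), rng.normal(size=120)
print(rws_features(x, omegas) @ rws_features(y, omegas))  # kernel estimate
```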

Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks

4 code implementations ICLR 2019 Kun Xu, Lingfei Wu, Zhiguo Wang, Yansong Feng, Michael Witbrock, Vadim Sheinin

Our method first generates the node and graph embeddings using an improved graph-based neural network with a novel aggregation strategy to incorporate edge direction information in the node embeddings.

Graph-to-Sequence · SQL-to-Text +1
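
A stripped-down sketch of the direction-aware aggregation described above (a simplification with assumed mean pooling and concatenation; Graph2Seq's actual aggregator and attention-based decoder are richer): each node pools its outgoing and incoming neighbors separately, so edge direction survives in the node embedding.

```python
# Direction-aware neighbor aggregation: forward (outgoing) and backward
# (incoming) neighborhoods are pooled separately, then concatenated with
# the node's own features.
import numpy as np

def aggregate_directed(H, edges):
    """H: (N, d) node features; edges: iterable of (src, dst) directed pairs."""
    N, d = H.shape
    fwd, bwd = np.zeros((N, d)), np.zeros((N, d))
    fwd_cnt, bwd_cnt = np.zeros(N), np.zeros(N)
    for s, t in edges:
        fwd[s] += H[t]; fwd_cnt[s] += 1   # s sees its outgoing neighbor t
        bwd[t] += H[s]; bwd_cnt[t] += 1   # t sees its incoming neighbor s
    fwd /= np.maximum(fwd_cnt, 1)[:, None]
    bwd /= np.maximum(bwd_cnt, 1)[:, None]
    return np.concatenate([H, fwd, bwd], axis=1)   # (N, 3d)

H = np.random.default_rng(0).normal(size=(4, 8))
print(aggregate_directed(H, [(0, 1), (1, 2), (2, 0), (2, 3)]).shape)  # (4, 24)
```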

D2KE: From Distance to Kernel and Embedding

no code implementations14 Feb 2018 Lingfei Wu, Ian En-Hsu Yen, Fangli Xu, Pradeep Ravikumar, Michael Witbrock

For many machine learning problem settings, particularly with structured inputs such as sequences or sets of objects, a distance measure between inputs can be specified more naturally than a feature representation.

Time Series
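
The D2KE recipe generalizes the RWS sketch above from DTW to an arbitrary distance (the toy distance below is a placeholder assumption): sample random objects from the input domain and map each input to its exponentiated distances to them, yielding a feature embedding and an implied positive-definite kernel.

```python
# Distance-to-embedding sketch: phi(x)_r = exp(-gamma * d(x, omega_r)) / sqrt(R)
# for random objects omega_r; works for any distance d, not just DTW.
import numpy as np

def d2ke_features(x, random_objects, dist, gamma=1.0):
    R = len(random_objects)
    return np.array([np.exp(-gamma * dist(x, w)) for w in random_objects]) / np.sqrt(R)

def toy_dist(a, b):
    """Crude stand-in for a structured distance (e.g. edit distance or DTW)."""
    n = min(len(a), len(b))
    return np.abs(a[:n] - b[:n]).sum() + abs(len(a) - len(b))

rng = np.random.default_rng(0)
omegas = [rng.normal(size=rng.integers(3, 8)) for _ in range(32)]  # random objects
x, y = rng.normal(size=10), rng.normal(size=12)
print(d2ke_features(x, omegas, toy_dist) @ d2ke_features(y, omegas, toy_dist))
```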

An Implementation of Back-Propagation Learning on GF11, a Large SIMD Parallel Computer

no code implementations4 Jan 2018 Michael Witbrock, Marco Zagha

We describe a neural network simulator for the IBM GF11, an experimental SIMD machine with 566 processors and a peak arithmetic performance of 11 Gigaflops.

Neural Network simulation

Dilated Recurrent Neural Networks

2 code implementations NeurIPS 2017 Shiyu Chang, Yang Zhang, Wei Han, Mo Yu, Xiaoxiao Guo, Wei Tan, Xiaodong Cui, Michael Witbrock, Mark Hasegawa-Johnson, Thomas S. Huang

To provide a theory-based quantification of the architecture's advantages, we introduce a memory capacity measure, the mean recurrent length, which is more suitable for RNNs with long skip connections than existing measures.

Sequential Image Classification
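
A bare-bones sketch of the dilated recurrence itself (a plain tanh cell standing in for the paper's cells; the dilation schedule 1, 2, 4, 8 is illustrative): in a layer with dilation k, the state at step t is updated from the state at step t - k, so stacked layers cover long ranges with short recurrent paths.

```python
# Dilated recurrent layer: skip-length-k recurrence, stacked with
# exponentially increasing dilations.
import numpy as np

def dilated_rnn_layer(X, W, U, dilation):
    """X: (T, d_in) inputs; returns (T, d_hid) hidden states."""
    T, d_hid = X.shape[0], U.shape[0]
    H = np.zeros((T, d_hid))
    for t in range(T):
        h_prev = H[t - dilation] if t >= dilation else np.zeros(d_hid)
        H[t] = np.tanh(X[t] @ W + h_prev @ U)
    return H

rng = np.random.default_rng(0)
H = rng.normal(size=(32, 8))                 # toy input sequence
for k in (1, 2, 4, 8):                       # exponentially increasing dilations
    W = 0.1 * rng.normal(size=(H.shape[1], 16))
    U = 0.1 * rng.normal(size=(16, 16))
    H = dilated_rnn_layer(H, W, U, dilation=k)
print(H.shape)                               # (32, 16)
```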

Controlling Search in Very Large Commonsense Knowledge Bases: A Machine Learning Approach

no code implementations 14 Mar 2016 Abhishek Sharma, Michael Witbrock, Keith Goolsbey

Results show that these methods lead to an order of magnitude reduction in inference time.
