Adversarial attacks pose a serious threat to the accuracy of face anti-spoofing models.
Recent research has found it beneficial to use large state spaces for HMMs and PCFGs; the sketch below illustrates why the state-space size dominates inference cost.
Ranked #1 on Constituency Grammar Induction on PTB
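As a minimal illustration (not the paper's method), here is the classic HMM forward algorithm with random placeholder parameters: each time step costs O(N^2) in the number of hidden states N, which is the bottleneck that large-state-space techniques must work around.

```python
# Classic HMM forward pass; all parameters are random placeholders.
# Each step is O(N^2) via the transition matrix, so doubling the
# number of states N quadruples the work.
import numpy as np

rng = np.random.default_rng(0)
N, V, T = 512, 100, 20               # hidden states, vocab size, sequence length

def row_normalize(x):
    return x / x.sum(axis=-1, keepdims=True)

pi = row_normalize(rng.random(N))        # initial state distribution
A = row_normalize(rng.random((N, N)))    # transition probabilities
B = row_normalize(rng.random((N, V)))    # emission probabilities
obs = rng.integers(0, V, size=T)         # a random observation sequence

# Forward recursion: alpha[s] = P(obs[:t+1], state_t = s)
alpha = pi * B[:, obs[0]]
for t in range(1, T):
    alpha = (alpha @ A) * B[:, obs[t]]   # O(N^2) per step
print("log-likelihood:", np.log(alpha.sum()))
```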
Second-order semantic parsing with end-to-end mean-field inference has been shown to perform well.
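A hedged sketch of the idea of unrolled mean-field updates, with a deliberately simplified factor set (only sibling scores) and random placeholder tensors; a real parser would produce the scores with a neural encoder and use a richer set of second-order factors.

```python
# Unrolled mean-field updates over Bernoulli arc posteriors q[h, m],
# combining unary arc scores with expected sibling scores.
# All score tensors are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 6                                   # sentence length
s1 = rng.normal(size=(n, n))            # unary arc scores s(h, m)
s_sib = rng.normal(size=(n, n, n))      # sibling scores s(h, m, m')

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

q = sigmoid(s1)                         # initialize from unary scores
for _ in range(3):                      # a few unrolled mean-field steps
    # message to arc (h, m): expected sibling score under the current q
    # (a real implementation would mask the m' == m term)
    msg = np.einsum('hms,hs->hm', s_sib, q)
    q = sigmoid(s1 + msg)
print(q.round(2))
```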
They treat nested entities as partially observed constituency trees and propose the masked inside algorithm for partial marginalization (sketched below).
Constituency parsing and nested named entity recognition (NER) are similar tasks since they both aim to predict a collection of nested and non-crossing spans.
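A hedged, unlabeled sketch of the masked-inside idea: a CKY-style inside pass in log space where any span crossing an observed entity span is masked out, so the chart sums only over binary trees compatible with the evidence. Span scores are random placeholders; a full implementation would also handle labels and force the observed spans to appear.

```python
# Masked inside pass (unlabeled, simplified): spans crossing an
# observed entity span are excluded from the dynamic program.
import numpy as np

rng = np.random.default_rng(0)
n = 8                                   # sentence length
score = rng.normal(size=(n, n + 1))     # score[i, j] for span [i, j)
observed = [(2, 5)]                     # observed gold entity spans

def crosses(i, j, spans):
    # [i, j) crosses [a, b) if they overlap without nesting
    return any(i < a < j < b or a < i < b < j for a, b in spans)

NEG_INF = -1e9
inside = np.full((n, n + 1), NEG_INF)
for i in range(n):
    inside[i, i + 1] = score[i, i + 1]  # width-1 spans never cross anything
for width in range(2, n + 1):
    for i in range(n - width + 1):
        j = i + width
        if crosses(i, j, observed):
            continue                    # masked: incompatible with evidence
        # logsumexp over split points k of inside[i, k] + inside[k, j]
        parts = inside[i, i + 1:j] + inside[i + 1:j, j]
        inside[i, j] = score[i, j] + np.logaddexp.reduce(parts)
print("masked log partition:", inside[0, n])
```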
Graph-based methods, which decompose the score of a dependency tree into scores of dependency arcs, have been popular in dependency parsing for decades.
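The arc-factored decomposition in miniature, with placeholder scores: the tree score is just the sum of its arc scores, which is what makes maximum-spanning-tree decoding over the score matrix applicable.

```python
# Arc-factored scoring: a tree's score is the sum of its arc scores.
import numpy as np

rng = np.random.default_rng(0)
n = 5                                    # tokens (position 0 is the root)
arc_score = rng.normal(size=(n, n))      # arc_score[h, m]: head h -> modifier m

def tree_score(heads):
    # heads[m] is the head of token m; token 0 is the artificial root
    return sum(arc_score[h, m] for m, h in enumerate(heads) if m != 0)

print(tree_score([0, 0, 1, 1, 3]))       # score one example tree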
In this work, we present a new parameterization of PCFGs based on tensor decomposition, which has at most quadratic computational complexity in the number of symbols and therefore allows us to use a much larger number of symbols; the underlying low-rank trick is sketched below.
Ranked #4 on Constituency Grammar Induction on PTB
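A minimal sketch of the low-rank inside update under the decomposition T[A,B,C] = sum_k U[A,k] V[B,k] W[C,k]; names and shapes are illustrative, not the paper's exact parameterization. The O(m^3) rule contraction factorizes through the rank dimension into O(m r) matrix-vector products.

```python
# Low-rank binary-rule contraction: replace the m^3 tensor T[A, B, C]
# with rank-r factors U, V, W, so one inside update costs O(m r).
import numpy as np

rng = np.random.default_rng(0)
m, r = 1000, 64                          # number of symbols, tensor rank
U = rng.random((m, r))                   # T[A,B,C] = sum_k U[A,k] V[B,k] W[C,k]
V = rng.random((m, r))
W = rng.random((m, r))

in_left = rng.random(m)                  # inside scores of the two children
in_right = rng.random(m)

# Naive update: einsum('abc,b,c->a', T, in_left, in_right) -- O(m^3).
# The decomposition factorizes it through the rank dimension:
parent = U @ ((V.T @ in_left) * (W.T @ in_right))   # O(m r)
print(parent.shape)                      # (m,) inside scores for the parent
```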
Inspired by second-order supervised dependency parsing, we propose a second-order extension of unsupervised neural dependency models that incorporates grandparent-child or sibling information (sketched below).
Ranked #1 on Dependency Grammar Induction on WSJ10
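A hedged sketch of what "second-order" adds to tree scoring, using grandparent-child chains with random placeholder tensors (not the paper's model): each arc contributes its first-order score plus a score for the chain through its grandparent.

```python
# Second-order tree scoring: first-order arc scores plus scores for
# grandparent-child chains g -> h -> m. All tensors are placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 5
arc = rng.normal(size=(n, n))            # arc[h, m]
grand = rng.normal(size=(n, n, n))       # grand[g, h, m] for chain g -> h -> m

def second_order_score(heads):
    total = 0.0
    for m, h in enumerate(heads):
        if m == 0:
            continue                     # token 0 is the artificial root
        total += arc[h, m]
        if h != 0:                       # the head's own head is the grandparent
            total += grand[heads[h], h, m]
    return total

print(second_order_score([0, 0, 1, 1, 3]))
```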