Search Results for author: Ozan İrsoy

Found 13 papers, 4 papers with code

Learning Syntax from Naturally-Occurring Bracketings

1 code implementation • NAACL 2021 • Tianze Shi, Ozan İrsoy, Igor Malioutov, Lillian Lee

Naturally-occurring bracketings, such as answer fragments to natural language questions and hyperlinks on webpages, can reflect human syntactic intuition regarding phrasal boundaries.

Constituency Parsing

Diversity-Aware Batch Active Learning for Dependency Parsing

1 code implementation • NAACL 2021 • Tianze Shi, Adrian Benton, Igor Malioutov, Ozan İrsoy

While the predictive performance of modern statistical dependency parsers relies heavily on the availability of expensive expert-annotated treebank data, not all annotations contribute equally to the training of the parsers.

Active Learning • Dependency Parsing +1

Corrected CBOW Performs as well as Skip-gram

1 code implementation • EMNLP (insights) 2021 • Ozan İrsoy, Adrian Benton, Karl Stratos

Mikolov et al. (2013a) observed that continuous bag-of-words (CBOW) word embeddings tend to underperform Skip-gram (SG) embeddings, and this finding has been reported in subsequent works.

Word Embeddings

Semantic Role Labeling as Syntactic Dependency Parsing

1 code implementation • EMNLP 2020 • Tianze Shi, Igor Malioutov, Ozan İrsoy

We reduce the task of (span-based) PropBank-style semantic role labeling (SRL) to syntactic dependency parsing.

Dependency Parsing • Semantic Role Labeling

On Expected Accuracy

no code implementations • 1 May 2019 • Ozan İrsoy

We empirically investigate the (negative) expected accuracy as an alternative loss function to cross entropy (negative log likelihood) for classification tasks.

Classification • General Classification +1
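The alternative loss can be sketched in a few lines: negative expected accuracy puts the model's probability mass on the true class directly into the loss, instead of its logarithm as in cross entropy. Function names and the toy logits below are illustrative, not taken from the paper:

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over the last axis
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(logits, y):
    # negative log probability of the true class, averaged over the batch
    p = softmax(logits)
    return -np.log(p[np.arange(len(y)), y]).mean()

def neg_expected_accuracy(logits, y):
    # negative probability of the true class itself, averaged over the batch
    p = softmax(logits)
    return -p[np.arange(len(y)), y].mean()
```

Note that negative expected accuracy is bounded in [-1, 0], whereas cross entropy is unbounded above, which changes how hard misclassified examples are penalized.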

Dropout Regularization in Hierarchical Mixture of Experts

no code implementations • 25 Dec 2018 • Ozan İrsoy, Ethem Alpaydin

Dropout is a very effective method in preventing overfitting and has become the go-to regularizer for multi-layer neural networks in recent years.
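As a reminder of the mechanism, here is a minimal inverted-dropout mask — a generic sketch of standard dropout, not the paper's hierarchical-mixture-of-experts variant:

```python
import numpy as np

rng = np.random.default_rng(42)

def dropout(x, p=0.5, train=True):
    # inverted dropout: zero each unit with probability p at training time,
    # and rescale survivors by 1/(1-p) so the expected activation is unchanged
    if not train:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

a = np.ones(1000)
out = dropout(a, p=0.5)  # roughly half zeros, survivors scaled to 2.0
```

At test time (`train=False`) the input passes through unchanged, which is why the training-time rescaling is needed.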

Continuously Constructive Deep Neural Networks

no code implementations • 7 Apr 2018 • Ozan İrsoy, Ethem Alpaydin

Traditionally, deep learning algorithms update the network weights, while the network architecture is chosen manually through trial and error.

Modeling Compositionality with Multiplicative Recurrent Neural Networks

no code implementations • 20 Dec 2014 • Ozan İrsoy, Claire Cardie

We present the multiplicative recurrent neural network as a general model for compositional meaning in language, and evaluate it on the task of fine-grained sentiment analysis.

Sentiment Analysis
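The core of such a model can be sketched as a recurrence with a bilinear (tensor) term combining the input and the previous hidden state, alongside the usual additive terms. The dimensions, initialization, and tanh nonlinearity below are illustrative assumptions, not the paper's exact parameterization:

```python
import numpy as np

rng = np.random.default_rng(0)
d_x, d_h = 4, 3  # hypothetical input and hidden sizes

# one d_x x d_h bilinear slice per hidden unit, plus the
# ordinary input-to-hidden and hidden-to-hidden matrices
T = rng.normal(scale=0.1, size=(d_h, d_x, d_h))
W = rng.normal(scale=0.1, size=(d_h, d_x))
U = rng.normal(scale=0.1, size=(d_h, d_h))
b = np.zeros(d_h)

def step(x, h_prev):
    # multiplicative term: x^T T[k] h_prev for each hidden unit k,
    # letting the input gate how the previous state is transformed
    bilinear = np.einsum('i,kij,j->k', x, T, h_prev)
    return np.tanh(bilinear + W @ x + U @ h_prev + b)

h = np.zeros(d_h)
for x in rng.normal(size=(5, d_x)):  # a toy sequence of 5 random inputs
    h = step(x, h)
```

The bilinear term is what distinguishes this from a plain recurrent network: the effective recurrent weights depend on the current input, which is the sense in which the model is "multiplicative."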

Distributed Decision Trees

no code implementations • 19 Dec 2014 • Ozan İrsoy, Ethem Alpaydin

The recently proposed budding tree is a decision tree algorithm in which every node is part internal node and part leaf.

Representation Learning

Autoencoder Trees

no code implementations • 26 Sep 2014 • Ozan İrsoy, Ethem Alpaydin

We also observe that the autoencoder tree captures hierarchical representations of the data at different granularities across its levels, with the leaves capturing localities in the input space.

Bidirectional Recursive Neural Networks for Token-Level Labeling with Structure

no code implementations • 2 Dec 2013 • Ozan İrsoy, Claire Cardie

Recently, deep architectures, such as recurrent and recursive neural networks, have been successfully applied to various natural language processing tasks.

Natural Language Processing
