Search Results for author: David Lassner

Found 4 papers, 1 paper with code

Domain-Specific Word Embeddings with Structure Prediction

1 code implementation • 6 Oct 2022 • Stephanie Brandl, David Lassner, Anne Baillot, Shinichi Nakajima

Complementary to finding good general word embeddings, an important question for representation learning is to find dynamic word embeddings, e.g., across time or domain.

Philosophy · Representation Learning +1
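A minimal sketch of the setting this paper addresses: training a separate embedding model per domain slice so that the same word gets a domain-specific vector. The toy corpora, the "philosophy"/"news" domain names, and the hyperparameters are illustrative assumptions, and the independent models are deliberately naive; the paper's structure-prediction approach for sharing information across domains is not reproduced here.

```python
# Hypothetical sketch: one independent Word2Vec model per domain slice.
from gensim.models import Word2Vec

# Tiny toy corpora, one per domain (tokenized sentences) -- made up for illustration.
corpora = {
    "philosophy": [["the", "concept", "of", "being"], ["reason", "and", "judgement"]],
    "news":       [["the", "market", "rallied"], ["the", "concept", "car", "debuted"]],
}

# Train one embedding model per domain slice.
models = {
    domain: Word2Vec(sentences, vector_size=50, window=3, min_count=1, seed=0)
    for domain, sentences in corpora.items()
}

# A word such as "concept" now has a different vector (and different neighbours)
# in each domain. Note the two spaces are not aligned with each other, which is
# exactly the gap that dynamic/structured embedding methods aim to close.
for domain, model in models.items():
    print(domain, model.wv["concept"][:5])
```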

Automatic Identification of Types of Alterations in Historical Manuscripts

no code implementations • 20 Mar 2020 • David Lassner, Anne Baillot, Sergej Dogadov, Klaus-Robert Müller, Shinichi Nakajima

In addition to the findings based on the digital scholarly edition Berlin Intellectuals, we present a general framework for the analysis of text genesis that can be used in the context of other digital resources representing document variants.

BIG-bench Machine Learning

Balancing the composition of word embeddings across heterogenous data sets

no code implementations • 14 Jan 2020 • Stephanie Brandl, David Lassner, Maximilian Alber

Word embeddings capture semantic relationships based on contextual information and are the basis for a wide variety of natural language processing applications.

Word Embeddings · Word Similarity
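To make the word-similarity use case mentioned in this entry concrete, here is a minimal sketch of cosine similarity between word vectors. The 4-dimensional toy vectors are invented for illustration; in practice they would come from a trained embedding model.

```python
import numpy as np

# Toy word vectors (illustrative values only).
embeddings = {
    "king":  np.array([0.8, 0.1, 0.7, 0.2]),
    "queen": np.array([0.75, 0.15, 0.72, 0.25]),
    "apple": np.array([0.1, 0.9, 0.05, 0.8]),
}

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))  # lower: unrelated words
```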

Times Are Changing: Investigating the Pace of Language Change in Diachronic Word Embeddings

no code implementations • WS 2019 • Stephanie Brandl, David Lassner

We propose Word Embedding Networks, a novel method that is able to learn word embeddings of individual data slices while simultaneously aligning and ordering them without feeding temporal information a priori to the model.

Diachronic Word Embeddings · Word Embeddings
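For context on the alignment problem this paper tackles, here is a sketch of the common baseline it departs from: train one embedding matrix per time slice and align the slices afterwards with an orthogonal Procrustes rotation. This is not the Word Embedding Networks method proposed in the paper, and the random matrices, vocabulary size, and noise level are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

rng = np.random.default_rng(0)
vocab_size, dim = 100, 20

# Embeddings for the same vocabulary in two time slices (toy data: the second
# slice is a rotated, slightly perturbed copy of the first).
emb_t0 = rng.normal(size=(vocab_size, dim))
true_rotation = np.linalg.qr(rng.normal(size=(dim, dim)))[0]
emb_t1 = emb_t0 @ true_rotation + 0.01 * rng.normal(size=(vocab_size, dim))

# Find the orthogonal map R that best aligns slice t0 onto slice t1.
R, _ = orthogonal_procrustes(emb_t0, emb_t1)
aligned_t0 = emb_t0 @ R

# After alignment, per-word distances between slices reflect semantic change
# rather than an arbitrary rotation of the embedding space.
print(np.linalg.norm(aligned_t0 - emb_t1) / np.linalg.norm(emb_t1))
```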
