1 code implementation • 6 Oct 2022 • Stephanie Brandl, David Lassner, Anne Baillot, Shinichi Nakajima
Complementary to finding good general word embeddings, an important question for representation learning is how to obtain dynamic word embeddings, e.g., across time or domain.
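The paper concerns dynamic word embeddings across slices such as time periods. As background, a common baseline for comparing embeddings trained on different slices is orthogonal Procrustes alignment (not the method of this paper); the sketch below illustrates it with randomly generated stand-in matrices `E_2000` and `E_2010`, which are assumptions for demonstration only.

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

# Illustrative only: align embeddings from two time slices with
# orthogonal Procrustes, a standard baseline for tracking word
# vectors across time. This is NOT the method of the paper above.
rng = np.random.default_rng(0)
E_2000 = rng.normal(size=(5000, 100))  # hypothetical embeddings, slice 1
E_2010 = rng.normal(size=(5000, 100))  # hypothetical embeddings, slice 2
# Rows are assumed to index the same shared vocabulary in both slices.

# Find the orthogonal matrix R minimizing ||E_2000 @ R - E_2010||_F.
R, _ = orthogonal_procrustes(E_2000, E_2010)
E_2000_aligned = E_2000 @ R

# After alignment, per-word cosine distance is a simple proxy for
# semantic drift between the two slices.
norms = (np.linalg.norm(E_2000_aligned, axis=1)
         * np.linalg.norm(E_2010, axis=1))
drift = 1.0 - np.einsum("ij,ij->i", E_2000_aligned, E_2010) / norms
```

Restricting the map to an orthogonal rotation preserves pairwise cosine similarities within each slice, so only the cross-slice correspondence changes.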
no code implementations • 20 Mar 2020 • David Lassner, Anne Baillot, Sergej Dogadov, Klaus-Robert Müller, Shinichi Nakajima
In addition to the findings based on the digital scholarly edition Berlin Intellectuals, we present a general framework for the analysis of text genesis that can be used in the context of other digital resources representing document variants.
no code implementations • 14 Jan 2020 • Stephanie Brandl, David Lassner, Maximilian Alber
Word embeddings capture semantic relationships based on contextual information and are the basis for a wide variety of natural language processing applications.
no code implementations • WS 2019 • Stephanie Brandl, David Lassner
We propose Word Embedding Networks, a novel method that learns word embeddings for individual data slices while simultaneously aligning and ordering the slices, without providing temporal information to the model a priori.
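To make the task setting concrete, the toy sketch below recovers an ordering of embedding slices from pairwise alignment residuals alone, with no timestamps. It is a minimal sketch under the assumption that well-fitting consecutive slices align with low Procrustes error; the helper names `alignment_residual` and `greedy_order` are hypothetical, and this is not the Word Embedding Networks method itself.

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

def alignment_residual(A, B):
    """Frobenius error after optimally rotating A onto B."""
    R, _ = orthogonal_procrustes(A, B)
    return np.linalg.norm(A @ R - B)

def greedy_order(slices):
    """Greedily chain slices so consecutive ones align best.

    `slices` is a list of (vocab_size, dim) arrays sharing a vocabulary;
    the chain starts arbitrarily at slice 0.
    """
    n = len(slices)
    D = np.array([[alignment_residual(slices[i], slices[j])
                   for j in range(n)] for i in range(n)])
    np.fill_diagonal(D, np.inf)  # never pick a slice as its own successor
    order, remaining = [0], set(range(1, n))
    while remaining:
        nxt = min(remaining, key=lambda j: D[order[-1], j])
        order.append(nxt)
        remaining.discard(nxt)
    return order

# Example with random stand-in slices (real slices would come from data):
slices = [np.random.default_rng(s).normal(size=(1000, 50)) for s in range(4)]
print(greedy_order(slices))
```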