no code implementations • 29 Mar 2024 • Markus J. Hofmann, Markus T. Jansen, Christoph Wigbels, Benny Briesemeister, Arthur M. Jacobs
For training and validation, we relied on 179 participants and held out a test sample of 35 participants.
no code implementations • 2 Feb 2022 • Markus J. Hofmann, Steffen Remus, Chris Biemann, Ralph Radach, Lars Kuchinke
(3) In recurrent neural networks (RNNs), the subsymbolic units are trained to predict the next word, given all preceding words in a sentence.
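The next-word-prediction objective described above can be sketched with a toy Elman-style RNN. This is a minimal, randomly initialised illustration of the mechanism, not the authors' model; all sizes, words, and weight names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["the", "cat", "sat", "on", "mat"]  # toy vocabulary (assumption)
V, H = len(vocab), 8                        # vocab and hidden sizes (toy values)

# Randomly initialised Elman-style RNN weights (illustrative only).
W_xh = rng.normal(0, 0.1, (H, V))   # input -> hidden
W_hh = rng.normal(0, 0.1, (H, H))   # hidden -> hidden (the recurrence)
W_hy = rng.normal(0, 0.1, (V, H))   # hidden -> output logits

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def next_word_distribution(words):
    """Run the RNN over the preceding words; return P(next word)."""
    h = np.zeros(H)
    for w in words:
        x = np.zeros(V)
        x[vocab.index(w)] = 1.0           # one-hot input
        h = np.tanh(W_xh @ x + W_hh @ h)  # hidden state accumulates context
    return softmax(W_hy @ h)

p = next_word_distribution(["the", "cat", "sat"])
```

The hidden state `h` is what makes the prediction conditional on *all* preceding words rather than just the last one: each step mixes the current input with the previous hidden state.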
no code implementations • COLING (CogALex) 2020 • Markus J. Hofmann, Lara Müller, Andre Rölke, Ralph Radach, Chris Biemann
Then we trained word2vec models on individual corpora and on a 70-million-sentence newspaper corpus to obtain individual and norm-based long-term memory structure.
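A word2vec model of this kind is typically trained with the skip-gram objective: each word's embedding is adjusted to predict its neighbouring context words. The following is a minimal numpy sketch of one skip-gram epoch with a full softmax (window size 1); the corpus, dimensions, and variable names are illustrative assumptions, not the authors' setup, which used full corpora and a standard word2vec implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
corpus = [["reading", "requires", "memory"],   # toy corpus (assumption)
          ["memory", "shapes", "reading"]]
vocab = sorted({w for s in corpus for w in s})
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 16                          # vocab size, embedding dim (toy)

W_in = rng.normal(0, 0.1, (V, D))   # input (word) embeddings
W_out = rng.normal(0, 0.1, (V, D))  # output (context) embeddings

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# One epoch of skip-gram training: predict each neighbour of each word.
lr = 0.1
for sent in corpus:
    for i, w in enumerate(sent):
        for j in (i - 1, i + 1):               # window of size 1
            if 0 <= j < len(sent):
                c = idx[sent[j]]               # context word to predict
                v = W_in[idx[w]]
                p = softmax(W_out @ v)         # P(context | centre word)
                err = p.copy(); err[c] -= 1.0  # cross-entropy gradient
                W_in[idx[w]] -= lr * (W_out.T @ err)
                W_out -= lr * np.outer(err, v)

vec = W_in[idx["memory"]]  # learned vector: the "memory structure" for a word
```

After training on a large corpus, the rows of `W_in` place semantically related words near each other, which is what lets such vectors serve as a proxy for long-term memory structure.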
no code implementations • 6 Dec 2019 • Markus J. Hofmann, Mareike A. Kleemann, Andre Roelke, Christian Vorstius, Ralph Radach
Direct associations between stimulus words were controlled, and semantic feature overlap between prime and target words was manipulated by their common associates.