Search Results for author: Peter beim Graben

Found 9 papers, 2 papers with code

A neural network account to Kant's philosophical aesthetics

no code implementations · 12 Apr 2024 · Peter beim Graben

In my reconstruction, the convergence of the GAN algorithm during the reception of art, whether music or fine art, entails the harmony of the faculties and thereby a neural network analogue of subjective purposefulness, i.e., beauty.

Quantum Tonality: A Mathemusical Playground

no code implementations · 3 Apr 2024 · Peter beim Graben, Thomas Noll

Fitting Gaussian Mixture Models (GMM) to the Krumhansl-Kessler (KK) probe tone profiles for static attraction makes it possible to investigate the underlying wave function as the stationary ground state of an anharmonic quantum oscillator with a schematic Hamiltonian involving a perturbation potential.
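The fitting step mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual procedure: it fits a two-component 1-D Gaussian mixture by EM to a KK-style major-key probe-tone profile reordered along the line of fifths (the rating values are the commonly cited KK numbers, used here purely illustratively; component count and ordering are assumptions).

```python
import numpy as np

# Commonly cited Krumhansl-Kessler major-key probe-tone ratings
# (illustrative here; not taken from the paper).
kk_major = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                     2.52, 5.19, 2.39, 3.66, 2.29, 2.88])

# Reorder pitch classes along the line of fifths (C G D A E B F# C# G# D# A# F)
fifths = np.array([0, 7, 2, 9, 4, 11, 6, 1, 8, 3, 10, 5])
x = np.arange(12, dtype=float)       # positions on the line of fifths
w = kk_major[fifths]
w = w / w.sum()                      # normalized profile = target density

def fit_gmm_1d(x, w, k=2, iters=200):
    """EM for a k-component 1-D Gaussian mixture fitted to weighted data."""
    mu = np.linspace(x.min(), x.max(), k + 2)[1:-1]   # spread initial means
    sigma = np.full(k, x.std())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities of each component for each position
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: updates weighted by the profile values
        nk = (w[:, None] * r).sum(axis=0)
        mu = (w[:, None] * r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((w[:, None] * r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
        pi = nk
    return pi, mu, sigma

pi, mu, sigma = fit_gmm_1d(x, w)
print("weights:", pi.round(3), "means:", mu.round(2))
```

The resulting mixture density is the squared-amplitude analogue whose "wave function" (its square root) the paper then interprets as a quantum ground state.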

Invariants for neural automata

1 code implementation · 4 Feb 2023 · Jone Uria-Albizuri, Giovanni Sirio Carmantini, Peter beim Graben, Serafim Rodrigues

Our work could be of substantial importance for related regression studies of real-world measurements with neurosymbolic processors, helping to avoid confounding results that depend on a particular encoding rather than being intrinsic to the dynamics.

Machine Semiotics

no code implementations · 24 Aug 2020 · Peter beim Graben, Markus Huber-Liebl, Peter Klimczak, Günther Wirsching

For speech-assistive devices, the learning of machine-specific meanings of human utterances, i.e., the fossilization of conversational implicatures into conventionalized ones by trial and error through lexicalization, appears to be sufficient.

Implicatures · Speech Recognition +1

Reinforcement learning of minimalist grammars

no code implementations · 30 Apr 2020 · Peter beim Graben, Ronald Römer, Werner Meyer, Markus Huber, Matthias Wolff

In order to develop proper cognitive information and communication technologies, simple slot-filling should be replaced by utterance meaning transducers (UMT) that are based on semantic parsers and a mental lexicon, comprising the syntactic, phonetic, and semantic features of the language under consideration.

Reinforcement Learning (RL) +2

Vector symbolic architectures for context-free grammars

no code implementations · 11 Mar 2020 · Peter beim Graben, Markus Huber, Werner Meyer, Ronald Römer, Matthias Wolff

Our approach could leverage the development of VSA for explainable artificial intelligence (XAI) by means of hyperdimensional deep neural computation.

Explainable Artificial Intelligence (XAI)
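The core VSA operation behind representing grammar rules hyperdimensionally can be sketched with Plate-style holographic reduced representations: roles and fillers are random high-dimensional vectors, binding is circular convolution, and unbinding is correlation. The rule, role names, and dimensionality below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 4096  # hyperdimensional vector size (illustrative)

def rand_vec():
    """Random HRR-style symbol vector with variance 1/D per component."""
    return rng.normal(0, 1 / np.sqrt(D), D)

def bind(a, b):
    """Circular convolution: role-filler binding."""
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=D)

def unbind(trace, role):
    """Approximate unbinding via correlation (involution of the role)."""
    inv = np.concatenate(([role[0]], role[:0:-1]))
    return bind(trace, inv)

def sim(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Encode the CFG rule S -> NP VP as a bundle of role-filler bindings
# (role names "head"/"left"/"right" are hypothetical labels).
symbols = {s: rand_vec() for s in ["S", "NP", "VP", "left", "right", "head"]}
rule = (bind(symbols["head"], symbols["S"])
        + bind(symbols["left"], symbols["NP"])
        + bind(symbols["right"], symbols["VP"]))

# Query the left daughter: the cleaned-up result should resemble NP.
probe = unbind(rule, symbols["left"])
best = max(["S", "NP", "VP"], key=lambda s: sim(probe, symbols[s]))
print(best)
```

The noisy unbinding result is resolved against the symbol dictionary by cosine similarity, which is the usual clean-up-memory step in VSA pipelines.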

Reinforcement Learning of Minimalist Numeral Grammars

no code implementations · 11 Jun 2019 · Peter beim Graben, Ronald Römer, Werner Meyer, Markus Huber, Matthias Wolff

In order to develop proper cognitive information and communication technologies, simple slot-filling should be replaced by utterance meaning transducers (UMT) that are based on semantic parsers and a mental lexicon, comprising the syntactic, phonetic, and semantic features of the language under consideration.

Reinforcement Learning (RL) +2

A modular architecture for transparent computation in Recurrent Neural Networks

1 code implementation · 7 Sep 2016 · Giovanni Sirio Carmantini, Peter beim Graben, Mathieu Desroches, Serafim Rodrigues

We then show that the Gödelization of versatile shifts defines nonlinear dynamical automata: dynamical systems evolving in a vector space.
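Gödelization in this setting can be sketched as follows: a dotted two-sided symbol sequence over an alphabet of size n is mapped to a point in the unit square via n-adic expansions of its left and right halves, and the shift of the dot then acts as a piecewise affine map (a generalized baker's map). The alphabet and sequences below are illustrative, not taken from the paper.

```python
def godelize(left, right, n):
    """Map a dotted sequence (left half read inside-out) to (x, y) in [0,1)^2
    via n-adic expansions of the two halves."""
    x = sum(s * n ** -(i + 1) for i, s in enumerate(left))
    y = sum(s * n ** -(i + 1) for i, s in enumerate(right))
    return x, y

def shift(left, right):
    """Move the dot one symbol to the right: (...a . b c ...) -> (...a b . c ...)."""
    return [right[0]] + left, right[1:]

# Illustrative dotted sequence over the alphabet {0, 1, 2}, so n = 3.
left, right = [1, 0], [2, 1, 0, 2]
x, y = godelize(left, right, 3)

# On Goedel coordinates the shift acts piecewise affinely:
# x' = (x + b1)/n and y' = n*y - b1, where b1 is the symbol crossing the dot.
l2, r2 = shift(left, right)
x2, y2 = godelize(l2, r2, 3)
assert abs(x2 - (x + right[0]) / 3) < 1e-12
assert abs(y2 - (3 * y - right[0])) < 1e-12
print(round(x, 4), round(y, 4))
```

The assertions verify the piecewise affine action of the shift, which is what lets a symbolic automaton be simulated by a dynamical system on the unit square.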

Turing Computation with Recurrent Artificial Neural Networks

no code implementations · 4 Nov 2015 · Giovanni S Carmantini, Peter beim Graben, Mathieu Desroches, Serafim Rodrigues

We improve the results by Siegelmann & Sontag (1995) by providing a novel and parsimonious constructive mapping between Turing Machines and Recurrent Artificial Neural Networks, based on recent developments of Nonlinear Dynamical Automata.
