Search Results for author: Peter Sollich

Found 14 papers, 1 paper with code

Towards Robust Waveform-Based Acoustic Models

no code implementations16 Oct 2021 Dino Oglic, Zoran Cvetkovic, Peter Sollich, Steve Renals, Bin Yu

We study the problem of learning robust acoustic models in adverse environments, characterized by a significant mismatch between training and test conditions.

Data Augmentation, Inductive Bias, +3

Shear Induced Orientational Ordering in Active Glass

no code implementations26 Jan 2021 Rituparno Mandal, Peter Sollich

Using a Fokker-Planck description, we make testable predictions without any fit parameters for the joint distribution of single-particle position and orientation.

Soft Condensed Matter, Disordered Systems and Neural Networks, Statistical Mechanics
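For orientation, a Fokker-Planck description of this kind can be written schematically for the joint distribution $P(\mathbf{r}, \theta, t)$ of particle position and orientation in a shear flow $\dot{\gamma}\, y\, \hat{e}_x$. The form below is a textbook-style sketch, not the paper's specific equation: $v_0$ is a self-propulsion speed, $D_t$ and $D_r$ are translational and rotational diffusion constants, and $\omega(\theta)$ is a flow-induced rotation rate, all illustrative assumptions here.

```latex
\partial_t P
  = -\nabla \cdot \left[ \left( v_0\, \hat{n}(\theta) + \dot{\gamma}\, y\, \hat{e}_x \right) P \right]
    - \partial_\theta \left[ \omega(\theta)\, P \right]
    + D_t \nabla^2 P + D_r\, \partial_\theta^2 P,
\qquad \hat{n}(\theta) = (\cos\theta, \sin\theta).
```

For a spherical particle in simple shear, $\omega$ reduces to the constant vorticity contribution $-\dot{\gamma}/2$.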

Fragmentation in trader preferences among multiple markets: Market coexistence versus single market dominance

no code implementations7 Dec 2020 Robin Nicole, Aleksandra Alorić, Peter Sollich

These changes have re-emphasized the importance of understanding the effects of market competition: does proliferation of trading venues and increased competition lead to dominance of a single market or coexistence of multiple markets?

Systematic model reduction captures the dynamics of extrinsic noise in biochemical subnetworks

no code implementations19 Mar 2020 Barbara Bravi, Katy J. Rubin, Peter Sollich

We consider the general problem of describing the dynamics of subnetworks of larger biochemical reaction networks, e.g. protein interaction networks involving complex formation and dissociation reactions.
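The projection idea behind such model reduction can be illustrated on a linear toy network: eliminating the bulk variables exactly produces a subnetwork equation with a memory kernel. A minimal numpy sketch, where the matrices, sizes, and time step are illustrative assumptions rather than the paper's model:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy linear network dx/dt = M x, split into an observed subnetwork s
# and a hidden bulk b; all matrices here are illustrative.
n_s, n_b = 2, 3
M = -np.eye(n_s + n_b) + 0.2 * rng.standard_normal((n_s + n_b, n_s + n_b))
M_ss, M_sb = M[:n_s, :n_s], M[:n_s, n_s:]
M_bs, M_bb = M[n_s:, :n_s], M[n_s:, n_s:]

evals, V = np.linalg.eig(M_bb)
Vinv = np.linalg.inv(V)

def bulk_propagator(tau):
    """exp(M_bb * tau) via eigendecomposition of the bulk dynamics."""
    return (V @ np.diag(np.exp(evals * tau)) @ Vinv).real

# Full Euler simulation versus the projected subnetwork equation
#   ds/dt = M_ss s + M_sb \int_0^t exp(M_bb (t - t')) M_bs s(t') dt'
# (valid when the bulk starts at zero, so there is no direct bulk drive).
dt, steps = 0.01, 200
x = np.concatenate([np.ones(n_s), np.zeros(n_b)])
s_hist = [np.ones(n_s)]
for k in range(steps):
    x = x + dt * (M @ x)
    # memory integral approximated by a Riemann sum over the stored history
    integral = sum(bulk_propagator((k - j) * dt) @ (M_bs @ s_hist[j])
                   for j in range(k + 1)) * dt
    s_hist.append(s_hist[-1] + dt * (M_ss @ s_hist[-1] + M_sb @ integral))

err = float(np.abs(x[:n_s] - s_hist[-1]).max())
print(f"max full-vs-reduced deviation: {err:.4f}")
```

The two trajectories agree up to discretisation error, showing that the memory term carries exactly the information lost by dropping the bulk.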

Dynamical selection of Nash equilibria using Experience Weighted Attraction Learning: emergence of heterogeneous mixed equilibria

no code implementations29 Jun 2017 Robin Nicole, Peter Sollich

We therefore compare with the results of Experience-Weighted Attraction (EWA) learning, which at long times leads to Nash equilibria in the appropriate limits of large intensity of choice, low noise (long agent memory) and perfect imputation of missing scores (fictitious play).

Imputation
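As background, EWA learning combines an attraction recursion with logit (softmax) choice, where the intensity of choice and memory parameters mentioned in the abstract appear explicitly. A minimal numpy sketch of the standard Camerer-Ho form on a toy two-action game; all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

def ewa_step(A, N, payoffs, chosen, phi=0.9, delta=1.0, rho=0.9):
    """One Experience-Weighted Attraction update (Camerer-Ho form).
    phi, rho control memory; delta = 1 weights forgone payoffs fully,
    the fictitious-play-like limit of perfect score imputation."""
    N_new = rho * N + 1.0
    weight = delta + (1 - delta) * (np.arange(len(A)) == chosen)
    A_new = (phi * N * A + weight * payoffs) / N_new
    return A_new, N_new

def logit_choice(A, beta=5.0):
    """Softmax choice probabilities with intensity of choice beta."""
    p = np.exp(beta * (A - A.max()))
    return p / p.sum()

# Toy two-action game in which action 1 strictly dominates:
# the attractions should come to favour it.
A, N = np.zeros(2), 1.0
for _ in range(200):
    chosen = rng.choice(2, p=logit_choice(A))
    payoffs = np.array([0.0, 1.0])  # payoff to each action this round
    A, N = ewa_step(A, N, payoffs, chosen)
print(logit_choice(A))
```

Large beta concentrates the choice probabilities on the highest-attraction action, which is the "large intensity of choice" limit referred to above.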

Phase Diagram of Restricted Boltzmann Machines and Generalised Hopfield Networks with Arbitrary Priors

no code implementations20 Feb 2017 Adriano Barra, Giuseppe Genovese, Peter Sollich, Daniele Tantari

Restricted Boltzmann Machines are described by the Gibbs measure of a bipartite spin glass, which in turn corresponds to that of a generalised Hopfield network.

Retrieval
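The RBM/Hopfield correspondence in the abstract can be sketched directly: for Gaussian hidden units, integrating them out of the bipartite Gibbs measure leaves a quadratic (generalised Hopfield) energy, up to a rescaling of the inverse temperature. Sizes and couplings below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: N visible spins, P hidden units (not from the paper).
N, P = 8, 3
xi = rng.standard_normal((P, N)) / np.sqrt(N)  # pattern / coupling matrix

def rbm_energy(sigma, z):
    """Bipartite spin-glass energy: E = -sum_{mu,i} z_mu xi_{mu,i} sigma_i."""
    return -z @ xi @ sigma

def hopfield_energy(sigma):
    """Generalised Hopfield energy obtained by integrating Gaussian hidden
    units out of the Gibbs measure: E = -(1/2) sum_mu (xi_mu . sigma)^2."""
    return -0.5 * np.sum((xi @ sigma) ** 2)

sigma = rng.choice([-1.0, 1.0], size=N)  # Boolean visible configuration
z = rng.standard_normal(P)               # Gaussian hidden configuration
print(rbm_energy(sigma, z), hopfield_energy(sigma))
```

The "arbitrary priors" of the title generalise exactly the two distributions chosen here for sigma and z.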

Phase transitions in Restricted Boltzmann Machines with generic priors

no code implementations9 Dec 2016 Adriano Barra, Giuseppe Genovese, Peter Sollich, Daniele Tantari

We study Generalised Restricted Boltzmann Machines with generic priors for units and weights, interpolating between Boolean and Gaussian variables.

Retrieval

A Subband-Based SVM Front-End for Robust ASR

no code implementations24 Dec 2013 Jibran Yousafzai, Zoran Cvetkovic, Peter Sollich, Matthew Ager

This work proposes a novel support vector machine (SVM) based robust automatic speech recognition (ASR) front-end that operates on an ensemble of the subband components of high-dimensional acoustic waveforms.

Automatic Speech Recognition, Automatic Speech Recognition (ASR), +2
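A toy version of the idea, as a hedged sketch rather than the paper's system: classify synthetic "waveforms" from their subband energies with a linear SVM trained by Pegasos-style subgradient descent. The paper uses kernel SVMs on an ensemble of subband components; everything below (signals, features, parameters) is illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def subband_energies(wave, n_bands=4):
    """Log energy of the magnitude spectrum in equal-width frequency bands."""
    spec = np.abs(np.fft.rfft(wave)) ** 2
    return np.log(np.array([b.sum() for b in np.array_split(spec, n_bands)]) + 1e-9)

def make_wave(cls, n=256):
    """Synthetic stand-in 'phoneme': low- vs high-frequency tone plus noise."""
    t = np.arange(n)
    freq = 0.05 if cls == 0 else 0.35
    return np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal(n)

def train_linear_svm(X, y, lam=0.01, epochs=100):
    """Pegasos-style subgradient descent on the regularised hinge loss
    (a simple stand-in for the kernel SVMs used in the paper)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # absorb the bias term
    w = np.zeros(Xb.shape[1])
    for t in range(1, epochs * len(X) + 1):
        i = rng.integers(len(X))
        eta = 1.0 / (lam * t)
        margin = y[i] * (w @ Xb[i])
        w *= 1 - eta * lam
        if margin < 1:
            w += eta * y[i] * Xb[i]
    return w

X = np.array([subband_energies(make_wave(c)) for c in [0, 1] * 100])
y = np.array([-1, 1] * 100)
w = train_linear_svm(X, y)
Xb = np.hstack([X, np.ones((len(X), 1))])
acc = float(np.mean(np.sign(Xb @ w) == y))
print(f"training accuracy: {acc:.2f}")
```

The appeal of subband features for robustness is that narrowband noise corrupts only some bands, leaving the rest of the ensemble informative.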

Speech Recognition Front End Without Information Loss

no code implementations24 Dec 2013 Matthew Ager, Zoran Cvetkovic, Peter Sollich

Speech representation and modelling in high-dimensional spaces of acoustic waveforms, or a linear transformation thereof, is investigated with the aim of improving the robustness of automatic speech recognition to additive noise.

Automatic Speech Recognition, Automatic Speech Recognition (ASR), +2

Learning curves for multi-task Gaussian process regression

no code implementations NeurIPS 2012 Peter Sollich, Simon Ashton

We study the average-case performance of multi-task Gaussian process (GP) regression as captured in the learning curve, i.e. the average Bayes error for a chosen task versus the total number of examples $n$ for all tasks.

Multi-Task Learning, regression
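The learning-curve quantity itself is easy to estimate numerically in the matched (teacher = prior) single-task case, where the Bayes error at a test point is just the GP posterior variance. A numpy sketch, with kernel, lengthscale, and noise level as illustrative assumptions; the paper's multi-task setting additionally couples tasks through an inter-task covariance:

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf_kernel(a, b, ell=0.2):
    """Squared-exponential covariance on the unit interval."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def avg_bayes_error(n, noise=0.1, n_rep=20, n_test=50):
    """Bayes error for a matched GP teacher = posterior variance, averaged
    over random test points and over datasets of size n."""
    x_test = rng.uniform(0, 1, n_test)
    errs = []
    for _ in range(n_rep):
        x = rng.uniform(0, 1, n)
        K = rbf_kernel(x, x) + noise ** 2 * np.eye(n)
        k_star = rbf_kernel(x_test, x)
        # posterior variance: k(x*, x*) - k*^T (K + sigma^2 I)^{-1} k*
        var = 1.0 - np.einsum('ij,jk,ik->i', k_star, np.linalg.inv(K), k_star)
        errs.append(var.mean())
    return float(np.mean(errs))

curve = [avg_bayes_error(n) for n in (2, 8, 32)]
print(curve)
```

The averaged error decreases monotonically with dataset size, which is the learning curve the paper characterises analytically.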

Random walk kernels and learning curves for Gaussian process regression on random graphs

no code implementations6 Nov 2012 Matthew Urry, Peter Sollich

Our method for predicting the learning curves using belief propagation is significantly more accurate than previous approximations and should become exact in the limit of large random graphs.

Gaussian Processes, regression
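One common form of random walk kernel on a graph with adjacency matrix A and degree matrix D is K = ((1 - 1/a) I + (1/a) D^{-1/2} A D^{-1/2})^p. A numpy sketch on an Erdős–Rényi graph; the graph ensemble and parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def erdos_renyi_adjacency(n, p_edge):
    """Erdos-Renyi random graph (a stand-in for the random graph
    ensembles studied in the paper)."""
    A = np.triu((rng.random((n, n)) < p_edge).astype(float), 1)
    return A + A.T

def random_walk_kernel(A, a=2.0, p=3):
    """K = ((1 - 1/a) I + (1/a) D^{-1/2} A D^{-1/2})^p, a lazy random walk
    taken p steps; a >= 2 keeps the eigenvalues in [0, 1]."""
    d = A.sum(axis=1)
    d[d == 0] = 1.0  # guard isolated vertices
    Dinv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_hat = Dinv_sqrt @ A @ Dinv_sqrt
    M = (1 - 1 / a) * np.eye(len(A)) + (1 / a) * A_hat
    return np.linalg.matrix_power(M, p)

A = erdos_renyi_adjacency(50, 0.1)
K = random_walk_kernel(A)
# A valid GP covariance must be symmetric positive semidefinite.
print(np.allclose(K, K.T), float(np.linalg.eigvalsh(K).min()))
```

Larger p smooths the kernel over longer walks, which is the lengthscale-like parameter whose effect on the learning curve the paper analyses.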

Exact learning curves for Gaussian process regression on large random graphs

no code implementations NeurIPS 2010 Matthew Urry, Peter Sollich

We study learning curves for Gaussian process regression which characterise performance in terms of the Bayes error averaged over datasets of a given size.

regression

Kernels and learning curves for Gaussian process regression on random graphs

no code implementations NeurIPS 2009 Peter Sollich, Matthew Urry, Camille Coti

The fully correlated limit is reached only once loops become relevant, and we estimate where the crossover to this regime occurs.

regression
