no code implementations • 10 May 2022 • Ilya Sklyar, Anna Piunova, Christian Osendorfer
Finally, we establish a novel framework for segmentation analysis of multi-party conversations through emission latency metrics.
no code implementations • 9 Dec 2019 • Giorgio Giannone, Saeed Saremi, Jonathan Masci, Christian Osendorfer
To explicitly demonstrate the effect of these higher order objects, we show that the inferred latent transformations reflect interpretable properties in the observation space.
2 code implementations • 13 Jun 2019 • Timon Willi, Jonathan Masci, Jürgen Schmidhuber, Christian Osendorfer
We extend Neural Processes (NPs) to sequential data through Recurrent NPs or RNPs, a family of conditional state space models.
1 code implementation • CVPR 2020 • Jan Svoboda, Asha Anoosheh, Christian Osendorfer, Jonathan Masci
This paper introduces a neural style transfer model that generates a stylized image conditioned on a set of examples describing the desired style.
2 code implementations • CVPR 2020 • Jan Eric Lenssen, Christian Osendorfer, Jonathan Masci
This results in a state-of-the-art surface normal estimator that is robust to noise, outliers and point density variation, preserves sharp features through anisotropic kernels, and achieves equivariance through a local quaternion-based spatial transformer.
Ranked #7 on Surface Normals Estimation on PCPNet
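For context, the classical baseline that learned normal estimators like this one improve upon is plain PCA plane fitting on a local neighbourhood. A minimal sketch (a toy illustration, not the paper's method; all names and sizes here are made up):

```python
import numpy as np

def pca_normal(neighborhood):
    """Estimate the surface normal of a local point neighbourhood by
    plane fitting: the normal is the eigenvector of the covariance
    matrix with the smallest eigenvalue (direction of least variance)."""
    pts = neighborhood - neighborhood.mean(axis=0)
    cov = pts.T @ pts / len(pts)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    return eigvecs[:, 0]                    # unit vector, smallest eigenvalue

# Toy data: points near the z = 0 plane, so the normal should be ~(0, 0, ±1).
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-1, 1, (50, 2)),
                       1e-3 * rng.standard_normal(50)])
normal = pca_normal(pts)
```

Unlike this baseline, the paper's estimator learns anisotropic kernels, which is what preserves sharp features that isotropic plane fitting smooths away.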
1 code implementation • NeurIPS 2018 • Marco Ciccone, Marco Gallieri, Jonathan Masci, Christian Osendorfer, Faustino Gomez
This paper introduces the Non-Autonomous Input-Output Stable Network (NAIS-Net), a very deep architecture where each stacked processing block is derived from a time-invariant non-autonomous dynamical system.
no code implementations • 29 Dec 2014 • Maximilian Karl, Christian Osendorfer
A process-centric view of robust PCA (RPCA) allows a fast approximate implementation based on a special form of deep neural network with weights shared across all layers.
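The weight-sharing intuition can be illustrated by unrolling a simple RPCA-style fixed-point iteration: every "layer" applies the same two operations, just as a network with tied weights would. A toy sketch using a GoDec-style alternation (hard rank truncation plus soft-thresholding), not the paper's construction; all parameter choices are illustrative:

```python
import numpy as np

def soft_threshold(x, t):
    """Entrywise shrinkage operator used for the sparse component."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def rpca_unrolled(M, rank=1, n_layers=25, thresh=1.0):
    """Approximately split M = L + S (L low-rank, S sparse) by unrolling
    a fixed-point iteration: each 'layer' performs identical operations,
    mirroring a deep network with weights shared across all layers."""
    S = np.zeros_like(M)
    for _ in range(n_layers):
        # low-rank step: best rank-r approximation of M - S
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # sparse step: shrink the residual entrywise
        S = soft_threshold(M - L, thresh)
    return L, S

# Toy example: rank-1 matrix corrupted by two sparse spikes.
rng = np.random.default_rng(1)
u, v = rng.standard_normal(20), rng.standard_normal(20)
low_rank = np.outer(u, v)
sparse = np.zeros((20, 20))
sparse[3, 5], sparse[12, 1] = 10.0, -8.0
L, S = rpca_unrolled(low_rank + sparse)
```

After the unrolled iterations, the spikes end up in `S` and the rank-1 structure in `L`; in a learned version the per-layer operators would be trained rather than fixed.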
1 code implementation • 27 Nov 2014 • Justin Bayer, Christian Osendorfer
Leveraging advances in variational inference, we propose to enhance recurrent neural networks with latent variables, resulting in Stochastic Recurrent Networks (STORNs).
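The generative side of a STORN-style model can be sketched as an ordinary recurrent net that injects an independent latent variable at every timestep. A minimal sampling sketch with made-up dimensions and random weights standing in for trained parameters (the actual model is trained with variational inference; none of this mirrors the paper's architecture exactly):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: input, latent, and hidden dimensionality.
n_in, n_z, n_h = 3, 2, 8

# Random matrices stand in for trained weights.
W_h = 0.1 * rng.standard_normal((n_h, n_h))
W_x = 0.1 * rng.standard_normal((n_h, n_in))
W_z = 0.1 * rng.standard_normal((n_h, n_z))    # latents enter the recurrence
W_out = 0.1 * rng.standard_normal((n_in, n_h))

def generate(T):
    """Sample a length-T sequence: at each step an independent latent
    z_t ~ N(0, I) is injected into the recurrent state before emitting
    the (mean of the) observation x_t."""
    h, x, seq = np.zeros(n_h), np.zeros(n_in), []
    for _ in range(T):
        z = rng.standard_normal(n_z)           # per-step stochastic latent
        h = np.tanh(W_h @ h + W_x @ x + W_z @ z)
        x = W_out @ h                          # mean of p(x_t | h_t)
        seq.append(x)
    return np.array(seq)

samples = generate(10)
```

The per-step latent is what distinguishes this from a deterministic RNN: the model can represent multi-modal sequence distributions rather than a single predicted trajectory.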
no code implementations • 6 Jun 2014 • Justin Bayer, Christian Osendorfer
Recent advances in the estimation of deep directed graphical models and recurrent networks allow us to address a blind spot in the probabilistic modelling of time series.
1 code implementation • 4 Nov 2013 • Justin Bayer, Christian Osendorfer, Daniela Korhammer, Nutan Chen, Sebastian Urban, Patrick van der Smagt
Recurrent Neural Networks (RNNs) are rich models for the processing of sequential data.
no code implementations • 30 Apr 2013 • Christian Osendorfer, Justin Bayer, Patrick van der Smagt
A standard deep convolutional neural network paired with a suitable loss function learns compact local image descriptors that perform comparably to state-of-the-art approaches.
no code implementations • 14 Jan 2013 • Christian Osendorfer, Justin Bayer, Sebastian Urban, Patrick van der Smagt
Unsupervised feature learning has shown impressive results for a wide range of input modalities, in particular for object classification tasks in computer vision.
no code implementations • 9 Sep 2011 • Justin Bayer, Christian Osendorfer, Patrick van der Smagt
Recurrent neural networks (RNNs) in combination with a pooling operator and the neighbourhood components analysis (NCA) objective function are able to detect the characterizing dynamics of sequences and embed them into a fixed-length vector space of arbitrary dimensionality.
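The core mechanism, running an RNN over a sequence and pooling its hidden states into a fixed-length vector, can be sketched in a few lines. The weights below are random placeholders; in the paper they would be trained with the NCA objective, and all names and sizes here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_h, n_emb = 4, 16, 8            # hypothetical dimensions

W_h = 0.1 * rng.standard_normal((n_h, n_h))
W_x = 0.1 * rng.standard_normal((n_h, n_in))
W_p = 0.1 * rng.standard_normal((n_emb, n_h))  # projection to embedding space

def embed(seq):
    """Map a variable-length sequence to a fixed-length vector: run an
    RNN over the inputs, mean-pool the hidden states, then project to
    the embedding space of chosen dimensionality."""
    h, states = np.zeros(n_h), []
    for x in seq:
        h = np.tanh(W_h @ h + W_x @ x)
        states.append(h)
    return W_p @ np.mean(states, axis=0)  # pooling removes length dependence

# Sequences of different lengths map to vectors of identical shape.
e1 = embed(rng.standard_normal((5, n_in)))
e2 = embed(rng.standard_normal((12, n_in)))
```

The pooling operator is what makes the embedding dimensionality independent of sequence length, which is the property the NCA objective then exploits to pull same-class sequences together.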