Search Results for author: Peter Ochs

Found 13 papers, 4 papers with code

Near-optimal Closed-loop Method via Lyapunov Damping for Convex Optimization

1 code implementation • 16 Nov 2023 • Severin Maier, Camille Castera, Peter Ochs

We introduce an autonomous system with closed-loop damping for first-order convex optimization.
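
A rough illustration of the setting (not the paper's Lyapunov damping, which selects the damping in closed loop from the iterates): a semi-implicit Euler discretization of an inertial ODE x'' + gamma(t) x' + grad f(x) = 0 with a hand-picked open-loop schedule, in Python:

    import numpy as np

    def inertial_ode_step(x, v, grad_f, gamma, dt):
        # One semi-implicit Euler step for x'' + gamma * x' + grad f(x) = 0.
        v = v - dt * (gamma * v + grad_f(x))
        x = x + dt * v
        return x, v

    # Toy objective f(x) = 0.5 * ||x||^2, so grad f(x) = x.
    x, v = np.array([5.0]), np.array([0.0])
    for k in range(200):
        # Open-loop 3/t-style damping schedule (illustrative only).
        x, v = inertial_ode_step(x, v, lambda z: z, gamma=3.0 / (0.01 * k + 1.0), dt=0.01)
    print(x)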

PAC-Bayesian Learning of Optimization Algorithms

no code implementations • 20 Oct 2022 • Michael Sucker, Peter Ochs

We apply the PAC-Bayes theory to the setting of learning-to-optimize.
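
A heavily simplified sketch of the learning-to-optimize setting: sample an optimizer hyperparameter from a distribution and estimate its risk over random problem instances. The Gaussian "posterior" and the toy problem family below are purely illustrative; the paper's PAC-Bayes generalization bound is not reproduced.

    import numpy as np

    rng = np.random.default_rng(0)

    def run_gd(step, grad_f, x0, iters=50):
        # Gradient descent with the sampled step size.
        x = x0
        for _ in range(iters):
            x = x - step * grad_f(x)
        return x

    def sample_problem():
        # Toy distribution over instances: f(x) = 0.5 * a * x^2.
        a = rng.uniform(0.5, 2.0)
        return (lambda x: a * x), a

    mu, sigma = 0.4, 0.05  # hypothetical posterior over the step size
    losses = []
    for _ in range(100):
        step = rng.normal(mu, sigma)
        grad_f, a = sample_problem()
        x = run_gd(step, grad_f, x0=1.0)
        losses.append(0.5 * a * x ** 2)
    print("empirical risk of the sampled optimizer:", np.mean(losses))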

Fixed-Point Automatic Differentiation of Forward-Backward Splitting Algorithms for Partly Smooth Functions

no code implementations • 5 Aug 2022 • Sheheryar Mehmood, Peter Ochs

A large class of non-smooth practical optimization problems can be written as minimization of a sum of smooth and partly smooth functions.

Image Denoising
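
A minimal sketch of the idea for a 1-D lasso-type problem: propagate derivatives with respect to the regularization weight through each forward-backward iteration (the paper's partly-smooth analysis governs when such derivative iterates converge; this toy recursion is only an illustration):

    import numpy as np

    def soft(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    # Forward-backward splitting for min_x 0.5*(x - y)^2 + lam*|x|,
    # with d(iterate)/d(lam) propagated through every step.
    def fbs_with_derivative(y, lam, step=0.5, iters=100):
        x, dx = 0.0, 0.0
        for _ in range(iters):
            z = x - step * (x - y)          # forward (gradient) step
            dz = dx - step * dx             # its derivative w.r.t. lam
            active = abs(z) > step * lam    # smooth region of the prox
            x = soft(z, step * lam)         # backward (prox) step
            dx = (dz - step * np.sign(z)) * active  # chain rule through soft
        return x, dx

    x, dx = fbs_with_derivative(y=2.0, lam=0.5)
    print(x, dx)  # solution 1.5, sensitivity dx/dlam = -1.0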

Global Convergence of Model Function Based Bregman Proximal Minimization Algorithms

no code implementations • 24 Dec 2020 • Mahesh Chandra Mukkamala, Jalal Fadili, Peter Ochs

We fix this issue by proposing the MAP property, which generalizes the $L$-smad property and is also valid for a large class of nonconvex nonsmooth composite problems.

Retrieval
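
For orientation, one Bregman proximal gradient step with g = 0 solves grad_h(x_new) = grad_h(x) - tau * grad_f(x). An assumed illustration with the Boltzmann-Shannon entropy h(x) = sum x*log(x) on the positive orthant, where the step is explicit (the MAP property itself is a convergence condition and is not reproduced here):

    import numpy as np

    def bpg_step(x, grad_f, tau):
        # With h(x) = sum x*log(x), grad_h(x) = 1 + log(x), so the update
        # grad_h(x_new) = grad_h(x) - tau*grad_f(x) is an exponentiated step.
        return x * np.exp(-tau * grad_f(x))

    grad_f = lambda z: z - np.array([2.0, 0.5])  # f(x) = 0.5*||x - b||^2
    x = np.array([1.0, 1.0])
    for _ in range(200):
        x = bpg_step(x, grad_f, tau=0.1)
    print(x)  # approaches b = [2.0, 0.5] while staying positive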

Bregman Proximal Framework for Deep Linear Neural Networks

no code implementations • 8 Oct 2019 • Mahesh Chandra Mukkamala, Felix Westerkamp, Emanuel Laude, Daniel Cremers, Peter Ochs

This initiated the development of the Bregman proximal gradient (BPG) algorithm and its inertial (momentum-based) variant CoCaIn BPG, which, however, rely on problem-dependent Bregman distances.
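
A sketch of what such a problem-dependent distance looks like: a Bregman distance generated by a quartic-plus-quadratic reference function, as used for objectives of quartic growth (the constants below are illustrative, not the paper's):

    import numpy as np

    a, b = 1.0, 1.0  # illustrative constants

    def h(x):
        # Reference function h(x) = (a/4)*||x||^4 + (b/2)*||x||^2 (convex).
        s = np.dot(x, x)
        return 0.25 * a * s ** 2 + 0.5 * b * s

    def grad_h(x):
        return (a * np.dot(x, x) + b) * x

    def bregman_dist(u, x):
        # D_h(u, x) = h(u) - h(x) - <grad_h(x), u - x> >= 0.
        return h(u) - h(x) - np.dot(grad_h(x), u - x)

    print(bregman_dist(np.array([1.0, 0.0]), np.array([0.0, 1.0])))  # 2.0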

Beyond Alternating Updates for Matrix Factorization with Inertial Bregman Proximal Gradient Algorithms

2 code implementations • NeurIPS 2019 • Mahesh Chandra Mukkamala, Peter Ochs

Matrix Factorization is a popular non-convex optimization problem, for which alternating minimization schemes are mostly used.
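
The alternative pursued here is joint, non-alternating updates of both factors; a bare-bones sketch of a simultaneous gradient step on 0.5*||A - U @ V.T||^2 (the paper's inertial Bregman scheme with its closed-form step sizes is not reproduced):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 15))
    U = rng.standard_normal((20, 3))
    V = rng.standard_normal((15, 3))

    step = 1e-2
    for _ in range(500):
        R = U @ V.T - A  # residual
        # Both factors are updated from the same residual (no alternation).
        U, V = U - step * (R @ V), V - step * (R.T @ U)
    print(np.linalg.norm(U @ V.T - A))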

Convex-Concave Backtracking for Inertial Bregman Proximal Gradient Algorithms in Non-Convex Optimization

2 code implementations • 6 Apr 2019 • Mahesh Chandra Mukkamala, Peter Ochs, Thomas Pock, Shoham Sabach

Backtracking line-search is an old yet powerful strategy for finding better step sizes to be used in proximal gradient algorithms.
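
For reference, the classical one-sided Armijo-type backtracking for proximal gradient that this builds on (the paper's convex-concave test is a two-sided refinement for Bregman distances and is not reproduced):

    import numpy as np

    def soft(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def prox_grad_backtracking(x, f, grad_f, lam, step=1.0, shrink=0.5):
        # Shrink the step until the quadratic upper bound holds at the
        # candidate point (sufficient-decrease test).
        while True:
            x_new = soft(x - step * grad_f(x), step * lam)
            d = x_new - x
            if f(x_new) <= f(x) + grad_f(x) @ d + (0.5 / step) * (d @ d):
                return x_new, step
            step *= shrink

    f = lambda z: 0.5 * (z @ z)
    grad_f = lambda z: z
    x = np.array([3.0, -2.0])
    for _ in range(10):
        x, step = prox_grad_backtracking(x, f, grad_f, lam=0.1)
    print(x)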

Model Function Based Conditional Gradient Method with Armijo-like Line Search

no code implementations • 23 Jan 2019 • Yura Malitsky, Peter Ochs

The Conditional Gradient Method is generalized to a class of non-smooth non-convex optimization problems with many applications in machine learning.

BIG-bench Machine Learning
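
A sketch of the classical conditional gradient (Frank-Wolfe) method with an Armijo-like backtracking step over the simplex; the model-function generalization of the paper is not reproduced:

    import numpy as np

    def frank_wolfe_simplex(f, grad_f, x, iters=50, beta=0.5, c=1e-4):
        for _ in range(iters):
            g = grad_f(x)
            s = np.zeros_like(x)
            s[np.argmin(g)] = 1.0   # linear minimization oracle on the simplex
            d = s - x
            gap = -(g @ d)          # Frank-Wolfe gap, zero at a solution
            if gap <= 1e-12:
                return x
            t = 1.0
            while f(x + t * d) > f(x) - c * t * gap:  # Armijo-like test
                t *= beta
            x = x + t * d
        return x

    Q = np.diag([1.0, 4.0, 9.0])
    f = lambda z: 0.5 * (z @ Q @ z)
    grad_f = lambda z: Q @ z
    # Optimum puts mass proportional to 1/diag(Q): about [0.73, 0.18, 0.08].
    print(frank_wolfe_simplex(f, grad_f, np.array([1.0, 0.0, 0.0])))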

Lifting Layers: Analysis and Applications

1 code implementation • ECCV 2018 • Peter Ochs, Tim Meinhardt, Laura Leal-Taixe, Michael Moeller

A lifting layer increases the dimensionality of the input, naturally yields a linear spline when combined with a fully connected layer, and therefore closes the gap between low and high dimensional approximation problems.

Denoising • Image Classification
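
A toy 1-D version of the construction: map each scalar input to barycentric (hat-function) coordinates with respect to fixed knots; any linear layer applied on top then realizes a continuous linear spline. The knots and target below are assumptions for illustration:

    import numpy as np

    def lift(x, knots):
        # Barycentric coordinates of each scalar w.r.t. fixed knots.
        x = np.clip(x, knots[0], knots[-1])
        out = np.zeros((len(x), len(knots)))
        idx = np.clip(np.searchsorted(knots, x) - 1, 0, len(knots) - 2)
        t = (x - knots[idx]) / (knots[idx + 1] - knots[idx])
        rows = np.arange(len(x))
        out[rows, idx] = 1.0 - t
        out[rows, idx + 1] = t
        return out

    knots = np.linspace(-1.0, 1.0, 5)
    L = lift(np.array([-0.3, 0.5]), knots)
    w = np.abs(knots)   # "linear layer": spline values at the knots
    print(L @ w)        # piecewise-linear interpolation of |x|: [0.3, 0.5]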

iPiano: Inertial Proximal Algorithm for Non-Convex Optimization

no code implementations • 18 Apr 2014 • Peter Ochs, Yunjin Chen, Thomas Brox, Thomas Pock

A rigorous analysis of the algorithm for the proposed class of problems yields global convergence of the function values and the arguments.

Image Compression • Image Denoising
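
The iPiano update adds a heavy-ball inertial term to a forward-backward step: x_{k+1} = prox_{alpha*g}(x_k - alpha*grad f(x_k) + beta*(x_k - x_{k-1})). A minimal sketch for an L1-regularized toy problem (the parameter values are illustrative):

    import numpy as np

    def soft(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def ipiano(grad_f, x0, lam, alpha=0.1, beta=0.5, iters=200):
        x_prev = x = x0
        for _ in range(iters):
            y = x - alpha * grad_f(x) + beta * (x - x_prev)  # inertial forward step
            x_prev, x = x, soft(y, alpha * lam)              # backward (prox) step
        return x

    grad_f = lambda z: z - np.array([2.0, -1.0])  # f(x) = 0.5*||x - b||^2
    print(ipiano(grad_f, np.zeros(2), lam=0.5))   # -> soft(b, 0.5) = [1.5, -0.5]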

An Iterated L1 Algorithm for Non-smooth Non-convex Optimization in Computer Vision

no code implementations CVPR 2013 Peter Ochs, Alexey Dosovitskiy, Thomas Brox, Thomas Pock

Here we extend the problem class to linearly constrained optimization of a Lipschitz continuous function that is the sum of a convex function and a function which is concave and increasing on the non-negative orthant (and possibly neither convex nor concave on the whole space).

Image Denoising • Optical Flow Estimation
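
A common instance of such iterated L1 schemes: linearize the concave penalty at the current iterate and solve the resulting weighted-L1 convex subproblem, here in closed form per coordinate (the log-penalty and the omission of the paper's linear constraints are simplifications for illustration):

    import numpy as np

    def soft(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    # Non-convex penalty lam * log(1 + |x|/eps): concave and increasing in |x|.
    def iterated_l1(b, lam=0.5, eps=0.1, outer=10):
        x = b.copy()
        for _ in range(outer):
            w = lam / (eps + np.abs(x))  # weights from linearizing the penalty
            x = soft(b, w)               # min_x 0.5*(x - b)^2 + w*|x|, per coordinate
        return x

    print(iterated_l1(np.array([2.0, 0.05, -1.0])))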
