Search Results for author: Mario Lezcano-Casado

Found 6 papers, 2 papers with code

Automatic Differentiation: Theory and Practice

no code implementations • 13 Jul 2022 • Mario Lezcano-Casado

We present the classical coordinate-free formalism for forward- and backward-mode automatic differentiation (AD) in the real and complex settings.
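This formalism maps directly onto how AD systems expose the two modes. Below is a minimal sketch, assuming PyTorch as the AD engine and a made-up map f (neither is from the paper): forward mode computes a Jacobian-vector product, backward mode a vector-Jacobian product.

```python
# Sketch only: forward- and reverse-mode AD as Jacobian-vector and
# vector-Jacobian products, using PyTorch's functional autograd API.
import torch
from torch.autograd.functional import jvp, vjp

def f(x):
    # A made-up map f: R^3 -> R^2 for illustration.
    return torch.stack([x[0] * x[1], x[2].sin()])

x = torch.randn(3)
v = torch.randn(3)        # tangent vector at x (forward mode input)
u = torch.randn(2)        # cotangent vector at f(x) (backward mode input)

y, Jv = jvp(f, x, v)      # forward mode: Jv = df_x(v)
y2, Jtu = vjp(f, x, u)    # backward mode: Jtu = (df_x)^*(u) = J^T u
```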

Geometric Optimisation on Manifolds with Applications to Deep Learning

no code implementations • 9 Mar 2022 • Mario Lezcano-Casado

We design and implement a Python library to help the non-expert use all these powerful tools in a way that is efficient, extensible, and simple to incorporate into the workflow of the data scientist, practitioner, and applied researcher.

Tasks: Time Series, Time Series Analysis
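The abstract does not name the library here. As an illustrative sketch of the workflow it describes, the snippet below uses PyTorch's built-in parametrizations API, which follows the same "constrain a weight, then use any optimizer" design; the layer size, loss, and learning rate are made up.

```python
# Sketch of the intended workflow: constrain a layer's weight to be
# orthogonal, then train with any standard optimizer, unchanged.
import torch
import torch.nn as nn
from torch.nn.utils import parametrizations

layer = nn.Linear(64, 64)
parametrizations.orthogonal(layer, "weight")  # weight now lives on O(64)

# Downstream code is untouched: the optimizer acts on the unconstrained
# underlying parameters, while layer.weight stays orthogonal at every step.
opt = torch.optim.Adam(layer.parameters(), lr=1e-3)
x = torch.randn(8, 64)
loss = layer(x).pow(2).mean()
loss.backward()
opt.step()
print(torch.allclose(layer.weight @ layer.weight.T,
                     torch.eye(64), atol=1e-5))  # True
```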

Adaptive and Momentum Methods on Manifolds Through Trivializations

no code implementations • 9 Oct 2020 • Mario Lezcano-Casado

We introduce a framework to generalize adaptive and momentum methods to arbitrary manifolds by noting that, for every differentiable manifold, there exists a radially convex open set that covers almost all of the manifold.
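A minimal sketch of the idea, assuming SO(n) as the manifold and the matrix exponential of a skew-symmetric matrix as the map from the unconstrained space onto the manifold; the target, step count, and learning rate are made up. Because the parameter A is unconstrained, any adaptive optimizer applies off the shelf.

```python
# Sketch: Adam on SO(n) through an unconstrained parametrization.
import torch

n = 16
A = torch.zeros(n, n, requires_grad=True)   # unconstrained parameter
opt = torch.optim.Adam([A], lr=1e-2)        # any adaptive method works here

S = torch.randn(n, n)
target = torch.matrix_exp(S - S.T)          # a made-up target in SO(n)

for _ in range(200):
    opt.zero_grad()
    Q = torch.matrix_exp(A - A.T)           # maps R^{n x n} onto SO(n)
    loss = (Q - target).pow(2).sum()
    loss.backward()
    opt.step()
```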

Curvature-Dependant Global Convergence Rates for Optimization on Manifolds of Bounded Geometry

no code implementations • 6 Aug 2020 • Mario Lezcano-Casado

We give curvature-dependent convergence rates for the optimization of weakly convex functions defined on a manifold of 1-bounded geometry via Riemannian gradient descent and via the dynamic trivialization algorithm.
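As a toy instance of the setting, here is Riemannian gradient descent on the unit sphere with a made-up linear objective. The tangent-space projection and exponential map are the standard ones for the sphere; this is not code from the paper.

```python
# Sketch: Riemannian gradient descent on the unit sphere S^{n-1}.
import torch

def sphere_exp(x, v):
    # Exponential map on the sphere: geodesic step from x along tangent v.
    t = v.norm()
    if t < 1e-12:
        return x
    return torch.cos(t) * x + torch.sin(t) * (v / t)

n = 10
a = torch.randn(n)                   # f(x) = -<a, x>, minimized at a/|a|
x = torch.randn(n)
x = x / x.norm()                     # start on the sphere

for _ in range(100):
    egrad = -a                       # Euclidean gradient of f
    rgrad = egrad - (egrad @ x) * x  # project onto the tangent space at x
    x = sphere_exp(x, -0.1 * rgrad)  # step along the geodesic

print(torch.dot(x, a / a.norm()))    # approaches 1
```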

Trivializations for Gradient-Based Optimization on Manifolds

2 code implementations • 20 Sep 2019 • Mario Lezcano-Casado

We prove conditions under which a trivialization is sound in the context of gradient-based optimization, and we show that two large families of trivializations have overall favorable properties but also suffer from a performance issue.
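A rough sketch of the dynamic-trivialization pattern on SO(n): optimize through a map anchored at a base point and periodically move the base to the current iterate. The schedule, learning rate, optimizer, and target below are made up for illustration.

```python
# Sketch: dynamic trivialization on SO(n), phi_B(A) = B exp(A - A^T),
# with the base point B pulled back to the current iterate every K steps.
import torch

n, K, lr = 16, 50, 1e-2
B = torch.eye(n)                            # base point on SO(n)
A = torch.zeros(n, n, requires_grad=True)   # unconstrained coordinates
opt = torch.optim.SGD([A], lr=lr, momentum=0.9)

S = torch.randn(n, n)
target = torch.matrix_exp(S - S.T)          # a made-up target in SO(n)

for step in range(500):
    opt.zero_grad()
    Q = B @ torch.matrix_exp(A - A.T)
    (Q - target).pow(2).sum().backward()
    opt.step()
    if (step + 1) % K == 0:                 # move the base point
        with torch.no_grad():
            B = B @ torch.matrix_exp(A - A.T)
            A.zero_()                       # restart at the new base
```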

Cheap Orthogonal Constraints in Neural Networks: A Simple Parametrization of the Orthogonal and Unitary Group

3 code implementations • 24 Jan 2019 • Mario Lezcano-Casado, David Martínez-Rubio

We demonstrate how our method constitutes a more robust approach to optimization with orthogonal constraints, showing faster, more accurate, and more stable convergence in several tasks designed to test RNNs.
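A minimal sketch of the parametrization idea: the recurrent matrix is the matrix exponential of a skew-symmetric parameter, hence exactly orthogonal by construction. The cell below is illustrative (sizes, nonlinearity, and names are made up), not the authors' released implementation.

```python
# Sketch: an RNN cell whose recurrent matrix W = exp(A - A^T) is
# orthogonal by construction, so no projection step is ever needed.
import torch
import torch.nn as nn

class OrthogonalRNNCell(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.A = nn.Parameter(torch.zeros(hidden_size, hidden_size))
        self.U = nn.Linear(input_size, hidden_size)

    def forward(self, x, h):
        W = torch.matrix_exp(self.A - self.A.T)  # orthogonal recurrence
        return torch.tanh(h @ W.T + self.U(x))

cell = OrthogonalRNNCell(32, 64)
h = torch.zeros(8, 64)
for x in torch.randn(10, 8, 32):  # a short made-up input sequence
    h = cell(x, h)
```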
