no code implementations • 28 Mar 2024 • Yiping Ji, Hemanth Saratchandran, Cameron Gordon, Zeyu Zhang, Simon Lucey
Low-rank decomposition has emerged as a vital tool for enhancing parameter efficiency in neural network architectures, gaining traction across diverse applications in machine learning.
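The core idea can be sketched in a few lines: replace a dense weight matrix W (m x n) with a rank-r product A·B, cutting the parameter count from m·n to r·(m + n). This is a minimal illustrative sketch via truncated SVD, not the method of the paper; the function name and the choice of rank are assumptions.

```python
import numpy as np

def compress_weight(W, r):
    """Illustrative rank-r factorization of a weight matrix via truncated SVD.

    Returns A (m x r) and B (r x n) with A @ B the best rank-r
    approximation of W in the Frobenius norm (Eckart-Young).
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :r] * s[:r]   # absorb singular values into the left factor
    B = Vt[:r, :]
    return A, B

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 32))
A, B = compress_weight(W, r=8)

# Parameter count drops from 64*32 = 2048 to 64*8 + 8*32 = 768.
params_full = W.size
params_lowrank = A.size + B.size
```

In a network, A and B would be trained directly (two stacked linear layers with no activation between them) rather than obtained from an SVD of a pre-trained W.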
no code implementations • 28 Mar 2024 • Hemanth Saratchandran, Sameera Ramasinghe, Simon Lucey
In the realm of computer vision, Neural Fields have gained prominence as a contemporary tool harnessing neural networks for signal representation.
no code implementations • 28 Mar 2024 • Cameron Gordon, Lachlan Ewen MacDonald, Hemanth Saratchandran, Simon Lucey
We instead present a strategy for the initialization of run-time deep implicit functions for single-instance signals through a Decoder-Only randomly projected Hypernetwork (D'OH).
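The "randomly projected" part of the idea can be sketched as follows: the decoder's weights are not stored or trained directly, but generated by pushing a small trainable latent code through a fixed random matrix. This is a hedged illustration of the general principle only; the dimensions and variable names are assumptions, not taken from D'OH.

```python
import numpy as np

rng = np.random.default_rng(0)

latent_dim, n_weights = 16, 1024

# Fixed random projection: sampled once, never trained.
P = rng.standard_normal((n_weights, latent_dim)) / np.sqrt(latent_dim)

# The latent code is the ONLY trainable parameter vector.
z = rng.standard_normal(latent_dim)

# Decoder weights are generated on the fly, never stored directly.
weights = P @ z
```

Only `latent_dim` numbers need to be optimised and stored per signal, while the decoder still operates with `n_weights` effective parameters.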
no code implementations • 13 Feb 2024 • Shin-Fang Chng, Hemanth Saratchandran, Simon Lucey
Implicit neural representations have emerged as a powerful technique for encoding complex continuous multidimensional signals as neural networks, enabling a wide range of applications in computer vision, robotics, and geometry.
no code implementations • 8 Feb 2024 • Hemanth Saratchandran, Sameera Ramasinghe, Violetta Shevchenko, Alexander Long, Simon Lucey
Implicit Neural Representations (INRs) have gained popularity for encoding signals as compact, differentiable entities.
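The basic INR idea, a learned map from coordinates to signal values, can be sketched without a training loop by fixing a random sinusoidal first layer and fitting the output layer in closed form. This random-feature stand-in is an assumption made for brevity; real INRs train a full MLP with gradient descent.

```python
import numpy as np

def features(x, W, b):
    """Fixed random sinusoidal first layer mapping coordinates to features."""
    return np.sin(np.outer(x, W) + b)

rng = np.random.default_rng(0)
W = rng.standard_normal(64) * 10.0           # random frequencies
b = rng.uniform(-np.pi, np.pi, 64)           # random phases

x = np.linspace(0.0, 1.0, 128)               # input coordinates
signal = np.sin(2 * np.pi * x) + 0.5 * np.cos(6 * np.pi * x)  # target signal

# Fit the last layer by least squares instead of SGD.
coef, *_ = np.linalg.lstsq(features(x, W, b), signal, rcond=None)
recon = features(x, W, b) @ coef             # the "encoded" signal
```

The signal is now represented by the 64 coefficients (plus the fixed random layer) rather than by its 128 samples, and can be evaluated at any continuous coordinate.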
no code implementations • 7 Feb 2024 • Hemanth Saratchandran, Shin-Fang Chng, Simon Lucey
In this paper, we aim to address this gap by providing a theoretical understanding of periodically activated networks through an analysis of their Neural Tangent Kernel (NTK).
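For a finite network the NTK can be probed empirically: K(x, x') is the inner product of the parameter gradients of the network output at x and x'. Below is a small sketch for a one-hidden-layer sine-activated network f(x) = Σᵢ aᵢ sin(wᵢ x + bᵢ); the function names and the high-frequency initialisation are illustrative assumptions, and the paper's analysis concerns the analytic (infinite-width) NTK rather than this finite-width estimate.

```python
import numpy as np

def param_grad(x, a, w, b):
    """Gradient of f(x) = sum_i a_i * sin(w_i * x + b_i) w.r.t. (a, w, b)."""
    pre = w * x + b
    return np.concatenate([np.sin(pre),           # d f / d a_i
                           a * x * np.cos(pre),   # d f / d w_i
                           a * np.cos(pre)])      # d f / d b_i

def empirical_ntk(xs, a, w, b):
    """K[i, j] = <grad_theta f(x_i), grad_theta f(x_j)>."""
    G = np.stack([param_grad(x, a, w, b) for x in xs])
    return G @ G.T

rng = np.random.default_rng(0)
width = 256
a = rng.standard_normal(width) / np.sqrt(width)
w = rng.standard_normal(width) * 30.0    # high-frequency init, SIREN-style
b = rng.uniform(-np.pi, np.pi, width)

K = empirical_ntk(np.linspace(-1.0, 1.0, 5), a, w, b)
```

As a Gram matrix of gradients, K is symmetric positive semi-definite by construction; the NTK analysis studies how choices such as the periodic activation and its frequency scale shape this kernel.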
no code implementations • 5 Feb 2024 • Hemanth Saratchandran, Shin-Fang Chng, Simon Lucey
Physics-informed neural networks (PINNs) offer a promising avenue for tackling both forward and inverse problems in partial differential equations (PDEs) by integrating deep learning with fundamental physics principles.
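The PINN recipe is to train a model so that it simultaneously satisfies the governing equation (a residual term) and the boundary or initial data. The sketch below shows the loss structure for the toy ODE u'(x) = -u(x), u(0) = 1; a cubic polynomial and a finite-difference derivative stand in for the neural network and automatic differentiation, so every name here is an illustrative assumption.

```python
import numpy as np

def model(x, theta):
    """Toy surrogate: a cubic polynomial in place of a neural network."""
    return theta[0] + theta[1] * x + theta[2] * x**2 + theta[3] * x**3

def pinn_loss(theta, xs, h=1e-4):
    """Physics residual for u' + u = 0 plus a boundary term for u(0) = 1."""
    u = model(xs, theta)
    du = (model(xs + h, theta) - model(xs - h, theta)) / (2 * h)  # FD derivative
    residual = du + u                      # physics term: u' + u should be 0
    boundary = model(0.0, theta) - 1.0     # data term: u(0) should be 1
    return np.mean(residual**2) + boundary**2

xs = np.linspace(0.0, 1.0, 32)

# Taylor coefficients of the true solution exp(-x) give a near-zero loss;
# an arbitrary parameter vector does not.
good = np.array([1.0, -1.0, 0.5, -1.0 / 6.0])
bad = np.array([0.0, 1.0, 0.0, 0.0])
```

Minimising this combined loss over `theta` is what PINN training does, with the forward problem recovering u and the inverse problem additionally fitting unknown PDE coefficients.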
no code implementations • ICCV 2023 • Hemanth Saratchandran, Shin-Fang Chng, Sameera Ramasinghe, Lachlan MacDonald, Simon Lucey
Coordinate networks are widely used in computer vision due to their ability to represent signals as compressed, continuous entities.
no code implementations • 10 Mar 2023 • Sameera Ramasinghe, Hemanth Saratchandran, Violetta Shevchenko, Simon Lucey
Modelling dynamical systems is an integral component of understanding the natural world.
no code implementations • NeurIPS 2023 • Lachlan Ewen MacDonald, Jack Valmadre, Hemanth Saratchandran, Simon Lucey
We introduce a general theoretical framework, designed for the study of gradient optimisation of deep neural networks, that encompasses ubiquitous architecture choices including batch normalisation, weight normalisation and skip connections.
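One of the architecture choices the framework covers, weight normalisation, reparametrises each weight vector as w = g · v / ‖v‖, decoupling its direction from its magnitude. The sketch below illustrates only this reparametrisation, not the framework itself; the names are illustrative.

```python
import numpy as np

def weight_norm(v, g):
    """Weight normalisation: w = g * v / ||v||.

    The direction comes from v and the magnitude is the scalar g,
    so the two can be optimised independently.
    """
    return g * v / np.linalg.norm(v)

rng = np.random.default_rng(0)
v = rng.standard_normal(8)
w = weight_norm(v, g=2.0)
# The resulting weight always has norm exactly g, regardless of v.
```

Batch normalisation and skip connections similarly reshape the optimisation landscape, which is why a framework encompassing all three is useful for studying gradient descent on deep networks.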
no code implementations • 17 Jun 2022 • Sameera Ramasinghe, Lachlan MacDonald, Moshiur Farazi, Hemanth Saratchandran, Simon Lucey
Characterizing the remarkable generalization properties of over-parameterized neural networks remains an open problem.