Search Results for author: Matthew Brand

Found 3 papers, 0 papers with code

SuperLoRA: Parameter-Efficient Unified Adaptation of Multi-Layer Attention Modules

no code implementations 18 Mar 2024 Xiangyu Chen, Jing Liu, Ye Wang, Pu Wang, Matthew Brand, Guanghui Wang, Toshiaki Koike-Akino

Low-rank adaptation (LoRA) and its variants are widely employed in fine-tuning large models, including large language models for natural language processing and diffusion models for computer vision.

Transfer Learning
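
For context, a minimal sketch of a plain LoRA update (not SuperLoRA's unified grouping of multi-layer attention modules, which the listing does not detail). The layer sizes, rank r, and scaling alpha below are illustrative assumptions.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Generic LoRA layer: y = x W^T + (alpha / r) * x (B A)^T.
    Only the low-rank factors A and B are trained; W stays frozen."""
    def __init__(self, in_features, out_features, r=4, alpha=8):
        super().__init__()
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)                       # frozen pretrained weight
        self.A = nn.Parameter(torch.randn(r, in_features) * 0.01)    # low-rank factor A
        self.B = nn.Parameter(torch.zeros(out_features, r))          # low-rank factor B, zero init
        self.scale = alpha / r

    def forward(self, x):
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

# Usage: during fine-tuning, only A and B receive gradients.
layer = LoRALinear(768, 768)
y = layer(torch.randn(2, 768))
```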

G-RepsNet: A Fast and General Construction of Equivariant Networks for Arbitrary Matrix Groups

no code implementations 23 Feb 2024 Sourya Basu, Suhas Lohit, Matthew Brand

Recent work by Finzi et al. (2021) directly solves the equivariance constraint for arbitrary matrix groups to obtain equivariant MLPs (EMLPs).

Image Classification · Inductive Bias
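
The equivariance constraint referenced here (following Finzi et al., 2021) requires a linear map to commute with the group's input and output representations. A short statement of that constraint; the symbols ρ_in, ρ_out, and G are standard notation, not taken from the listing itself.

```latex
% Equivariance of a linear layer f(x) = W x under a matrix group G:
f(\rho_{\mathrm{in}}(g)\, x) = \rho_{\mathrm{out}}(g)\, f(x)
\quad \forall g \in G,
\qquad\text{equivalently}\qquad
\rho_{\mathrm{out}}(g)\, W = W\, \rho_{\mathrm{in}}(g).
```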

Convergent Block Coordinate Descent for Training Tikhonov Regularized Deep Neural Networks

no code implementations NeurIPS 2017 Ziming Zhang, Matthew Brand

By lifting the ReLU function into a higher dimensional space, we develop a smooth multi-convex formulation for training feed-forward deep neural networks (DNNs).
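
The lifting mentioned in the abstract can be illustrated with the standard variational form of ReLU; this is a sketch of the general idea in my own notation, not necessarily the paper's exact formulation.

```latex
% ReLU as the solution of a small convex problem (the "lifting"):
\max(0, x) \;=\; \arg\min_{u \ge 0} \;(u - x)^2
% Replacing the hard per-layer constraint u = max(0, W z) by the penalty
% ||u - W z||^2 with u >= 0 gives an objective that is convex in each block
% of variables (weights W, activations u) when the others are held fixed,
% i.e. multi-convex, which is what block coordinate descent exploits.
```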
