Search Results for author: Diana-Nicoleta Grigore

Found 2 papers, 0 papers with code

Weight Copy and Low-Rank Adaptation for Few-Shot Distillation of Vision Transformers

no code implementations • 14 Apr 2024 • Diana-Nicoleta Grigore, Mariana-Iuliana Georgescu, Jon Alvarez Justo, Tor Johansen, Andreea Iuliana Ionescu, Radu Tudor Ionescu

Few-shot knowledge distillation recently emerged as a viable approach to harness the knowledge of large-scale pre-trained models, using limited data and computational resources.

Knowledge Distillation
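The entry above combines three standard ingredients: a student initialized by copying teacher weights, trainable low-rank (LoRA-style) updates, and a distillation loss. As the paper has no code release, the sketch below is a minimal generic illustration of those ingredients, not the authors' implementation; the class name `LoRALinear`, the temperature value, and all initialization choices are assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Standard knowledge-distillation objective: KL divergence between
    # the softened teacher and student distributions, scaled by T^2.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(kl.mean() * T * T)

class LoRALinear:
    # Hypothetical sketch: a frozen weight W (here standing in for a
    # weight copied from the teacher) plus a trainable low-rank update
    # B @ A, as in LoRA. Only A and B would be updated during training.
    def __init__(self, W, rank=4, seed=0):
        rng = np.random.default_rng(seed)
        d_out, d_in = W.shape
        self.W = W                                          # frozen copy
        self.A = rng.standard_normal((rank, d_in)) * 0.01   # trainable
        self.B = np.zeros((d_out, rank))                    # zero-init

    def __call__(self, x):
        # Effective weight is W + B @ A; at init B = 0, so the layer
        # reproduces the copied weights exactly.
        return x @ (self.W + self.B @ self.A).T
```

Because `B` starts at zero, the adapted layer initially behaves identically to the copied teacher layer, so training only has to learn a small low-rank correction, which is what makes the few-shot regime tractable.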

Discriminability-enforcing loss to improve representation learning

no code implementations • 14 Feb 2022 • Florinel-Alin Croitoru, Diana-Nicoleta Grigore, Radu Tudor Ionescu

During the training process, deep neural networks implicitly learn to represent the input data samples through a hierarchy of features, where the size of the hierarchy is determined by the number of layers.

Image Classification Representation Learning
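The abstract snippet above does not specify the loss, so the following is only a generic, Fisher-style illustration of what "enforcing discriminability" on learned features can mean: penalize intra-class scatter relative to the separation of class centroids. The function name and the exact ratio form are assumptions, not the paper's objective.

```python
import numpy as np

def discriminability_loss(features, labels):
    # Generic illustration (not the paper's exact loss): the ratio of
    # average intra-class variance to inter-class centroid spread.
    # Lower values mean tighter, better-separated class clusters.
    labels = np.asarray(labels)
    classes = np.unique(labels)
    centroids = np.stack([features[labels == c].mean(axis=0) for c in classes])
    # Mean squared distance of each sample to its class centroid.
    intra = np.mean([
        ((features[labels == c] - centroids[i]) ** 2).sum(axis=1).mean()
        for i, c in enumerate(classes)
    ])
    # Mean squared distance of class centroids to the global mean.
    inter = ((centroids - features.mean(axis=0)) ** 2).sum(axis=1).mean()
    return intra / (inter + 1e-12)
```

Minimizing such a term alongside the usual classification loss pushes the feature hierarchy toward representations where classes are compact and well separated.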
