1 code implementation • 9 May 2023 • Tianle Chen, Zheda Mai, Ruiwen Li, Wei-Lun Chao
Weakly supervised semantic segmentation (WSSS) aims to bypass the need for laborious pixel-level annotation by using only image-level annotation.
1 code implementation • CVPR 2023 • Cheng-Hao Tu, Zheda Mai, Wei-Lun Chao
By introducing a handful of learnable "query" tokens to each layer, VQT leverages the inner workings of Transformers to "summarize" the rich intermediate features of each layer, which can then be used to train prediction heads for downstream tasks.
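The query-token idea can be sketched roughly as follows — a minimal, hypothetical single-layer summarizer in the spirit of VQT, not the paper's implementation; the class name, dimensions, and initialization are all illustrative assumptions:

```python
import torch
import torch.nn as nn

class QueryTokenSummarizer(nn.Module):
    """Hypothetical sketch: a few learnable query tokens attend to one
    layer's intermediate token features and return a fixed-size summary."""

    def __init__(self, dim, num_queries=4):
        super().__init__()
        # learnable query tokens, shared across all inputs
        self.queries = nn.Parameter(torch.randn(num_queries, dim) * 0.02)

    def forward(self, layer_tokens):
        # layer_tokens: (batch, seq_len, dim) features from a frozen layer
        scale = layer_tokens.shape[-1] ** 0.5
        # attention of each query over the layer's tokens
        attn = torch.softmax(
            self.queries @ layer_tokens.transpose(1, 2) / scale, dim=-1
        )  # (batch, num_queries, seq_len)
        # weighted summary of the layer's features
        return attn @ layer_tokens  # (batch, num_queries, dim)
```

In this sketch, one such summary per layer would be concatenated and fed to a lightweight downstream head, leaving the backbone frozen.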
1 code implementation • 14 Mar 2022 • Ruiwen Li, Zheda Mai, Chiheb Trabelsi, Zhibo Zhang, Jongseong Jang, Scott Sanner
In this paper, we propose TransCAM, a Conformer-based solution to WSSS that explicitly leverages the attention weights from the transformer branch of the Conformer to refine the CAM generated from the CNN branch.
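The refinement step can be sketched as follows — a hypothetical minimal version of attention-based CAM refinement in the spirit of TransCAM, not the authors' code; the function name and shapes are illustrative assumptions:

```python
import numpy as np

def refine_cam_with_attention(cam, attn):
    """Hypothetical sketch: propagate coarse CAM scores across spatial
    positions using transformer attention weights.

    cam:  (num_classes, h, w) coarse class activation maps (CNN branch)
    attn: (h*w, h*w) row-normalized attention over spatial tokens
    """
    c, h, w = cam.shape
    flat = cam.reshape(c, h * w)
    # each position aggregates activations from the positions it attends to
    refined = flat @ attn.T
    return refined.reshape(c, h, w)
```

With identity attention this leaves the CAM unchanged; non-trivial attention spreads activation to semantically related regions, which is the intuition behind using the transformer branch to complete the CNN branch's partial activations.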
no code implementations • 17 Jan 2022 • Tianshu Shen, Jiaru Li, Mohamed Reda Bouadjenek, Zheda Mai, Scott Sanner
Conversational Recommendation Systems (CRSs) have recently started to leverage pretrained language models (LMs) such as BERT for their ability to semantically interpret a wide range of preference statement variations.
3 code implementations • 22 Mar 2021 • Zheda Mai, Ruiwen Li, Hyunwoo Kim, Scott Sanner
Online class-incremental continual learning (CL) studies the problem of learning new classes continually from an online non-stationary data stream, with the aim of adapting to new data while mitigating catastrophic forgetting.
1 code implementation • 25 Jan 2021 • Zheda Mai, Ruiwen Li, Jihwan Jeong, David Quispe, Hyunwoo Kim, Scott Sanner
To better understand the relative advantages of various approaches and the settings where they work best, this survey aims to (1) compare state-of-the-art methods such as MIR, iCaRL, and GDumb and determine which works best in different experimental settings; (2) determine whether the best class-incremental methods are also competitive in the domain-incremental setting; and (3) evaluate the performance of seven simple but effective tricks, such as the "review" trick and the nearest class mean (NCM) classifier, to assess their relative impact.
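The nearest class mean (NCM) classifier mentioned above is simple enough to sketch directly — classify a feature vector by its distance to per-class feature means. This is a generic illustration of the technique, not the survey's code; the class name and interface are assumptions:

```python
import numpy as np

class NCMClassifier:
    """Sketch of a nearest-class-mean classifier: each class is
    represented by the mean of its feature vectors, and a query is
    assigned to the class with the closest mean."""

    def fit(self, feats, labels):
        # feats: (n_samples, dim), labels: (n_samples,)
        self.classes_ = np.unique(labels)
        self.means_ = np.stack(
            [feats[labels == c].mean(axis=0) for c in self.classes_]
        )
        return self

    def predict(self, feats):
        # Euclidean distance from each query to each class mean
        dists = np.linalg.norm(
            feats[:, None, :] - self.means_[None, :, :], axis=-1
        )
        return self.classes_[dists.argmin(axis=1)]
```

In continual learning this is attractive because the class means can be updated incrementally as new classes arrive, without retraining a softmax head.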
no code implementations • 24 Oct 2020 • Zheda Mai, Ga Wu, Kai Luo, Scott Sanner
In order to capture multifaceted user preferences, existing recommender systems either increase the encoding complexity or extend the latent representation dimension.
1 code implementation • 14 Sep 2020 • Vincenzo Lomonaco, Lorenzo Pellegrini, Pau Rodriguez, Massimo Caccia, Qi She, Yu Chen, Quentin Jodelet, Ruiping Wang, Zheda Mai, David Vazquez, German I. Parisi, Nikhil Churamani, Marc Pickett, Issam Laradji, Davide Maltoni
In the last few years, we have witnessed a renewed and fast-growing interest in continual learning with deep neural networks with the shared objective of making current AI systems more adaptive, efficient and autonomous.
3 code implementations • 31 Aug 2020 • Dongsub Shim, Zheda Mai, Jihwan Jeong, Scott Sanner, Hyunwoo Kim, Jongseong Jang
As image-based deep learning becomes pervasive on every device, from cell phones to smart watches, there is a growing need to develop methods that continually learn from data while minimizing memory footprint and power consumption.
no code implementations • 3 Aug 2020 • Jin Peng Zhou, Ga Wu, Zheda Mai, Scott Sanner
One-class collaborative filtering (OC-CF) is a common class of recommendation problem where only the positive class is explicitly observed (e.g., purchases, clicks).
1 code implementation • 11 Jul 2020 • Zheda Mai, Hyunwoo Kim, Jihwan Jeong, Scott Sanner
Continual learning is a branch of deep learning that seeks to strike a balance between learning stability and plasticity.