Search Results for author: Penghao Jiang

Found 5 papers, 0 papers with code

Jaeger: A Concatenation-Based Multi-Transformer VQA Model

no code implementations • 11 Oct 2023 • Jieting Long, Zewei Shi, Penghao Jiang, Yidong Gan

By leveraging pre-trained models for feature extraction, our approach has the potential to amplify the performance of these models through concatenation.

Dimensionality Reduction, Question Answering, +1
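
A minimal sketch of the concatenation idea described in the abstract above (not the authors' code): features from two frozen pre-trained encoders are concatenated and passed to a small trainable answer classifier. The class name, encoder stand-ins, and dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ConcatVQAHead(nn.Module):
    """Hypothetical fusion head: concatenate image and question features."""
    def __init__(self, image_encoder, text_encoder, img_dim, txt_dim, num_answers):
        super().__init__()
        self.image_encoder = image_encoder
        self.text_encoder = text_encoder
        # Freeze the pre-trained feature extractors; only the head is trained.
        for p in self.image_encoder.parameters():
            p.requires_grad = False
        for p in self.text_encoder.parameters():
            p.requires_grad = False
        self.classifier = nn.Sequential(
            nn.Linear(img_dim + txt_dim, 512),
            nn.ReLU(),
            nn.Linear(512, num_answers),
        )

    def forward(self, image, question):
        img_feat = self.image_encoder(image)    # (B, img_dim)
        txt_feat = self.text_encoder(question)  # (B, txt_dim)
        fused = torch.cat([img_feat, txt_feat], dim=-1)  # the concatenation step
        return self.classifier(fused)

# Dummy stand-ins for pre-trained transformers, just to show the wiring.
img_enc = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256))
txt_enc = nn.Linear(768, 128)
model = ConcatVQAHead(img_enc, txt_enc, img_dim=256, txt_dim=128, num_answers=100)
logits = model(torch.randn(4, 3, 32, 32), torch.randn(4, 768))  # (4, 100)
```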

Device Tuning for Multi-Task Large Model

no code implementations • 21 Feb 2023 • Penghao Jiang, Xuanchen Hou, Yinsi Zhou

Unsupervised pre-training approaches have achieved great success in many fields, such as Computer Vision (CV) and Natural Language Processing (NLP).

Multi-Task Learning, Unsupervised Pre-training

Deep Transfer Tensor Factorization for Multi-View Learning

no code implementations • 13 Feb 2023 • Penghao Jiang, Ke Xin, Chunxi Li

To solve the data sparsity problem in multi-view ratings, we propose a generic architecture of deep transfer tensor factorization (DTTF) by integrating deep learning and cross-domain tensor factorization, where the side information is embedded to provide effective compensation for the tensor sparsity.

Denoising, Multi-View Learning
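
A minimal illustrative sketch of the kind of model the DTTF abstract describes (an assumption, not the paper's exact formulation): a CP-style factorization of a (user, item, view) rating tensor in which a small network embeds item side information into the latent factor space, compensating for sparse tensor entries. The class name, network shape, and dimensions are hypothetical.

```python
import torch
import torch.nn as nn

class SideInfoTensorFactorization(nn.Module):
    """Hypothetical sketch: tensor factorization with a deep side-information branch."""
    def __init__(self, n_users, n_items, n_views, side_dim, rank=16):
        super().__init__()
        self.user = nn.Embedding(n_users, rank)
        self.item = nn.Embedding(n_items, rank)
        self.view = nn.Embedding(n_views, rank)
        # Deep component: projects side information into the latent factor space.
        self.side_net = nn.Sequential(
            nn.Linear(side_dim, 64), nn.ReLU(), nn.Linear(64, rank)
        )

    def forward(self, u, i, v, item_side):
        # Item factor = latent embedding plus the side-information embedding,
        # so items with few observed ratings still get an informative factor.
        item_factor = self.item(i) + self.side_net(item_side)
        # CP-style trilinear interaction predicts the (user, item, view) rating.
        return (self.user(u) * item_factor * self.view(v)).sum(-1)

# Example forward pass with made-up sizes.
model = SideInfoTensorFactorization(n_users=1000, n_items=500, n_views=3, side_dim=20)
pred = model(torch.tensor([0]), torch.tensor([42]), torch.tensor([1]), torch.randn(1, 20))
```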

Robust Meta Learning for Image based tasks

no code implementations • 30 Jan 2023 • Penghao Jiang, Xin Ke, Zifeng Wang, Chunxi Li

However, learning such a model is not possible in standard machine learning frameworks as the distribution of the test data is unknown.

Meta-Learning

Invariant Meta Learning for Out-of-Distribution Generalization

no code implementations • 26 Jan 2023 • Penghao Jiang, Ke Xin, Zifeng Wang, Chunxi Li

Modern deep learning techniques have demonstrated excellent capabilities in many areas, but they rely on large amounts of training data.

Meta-Learning, Out-of-Distribution Generalization
