Search Results for author: Zijia Lin

Found 22 papers, 14 papers with code

PYRA: Parallel Yielding Re-Activation for Training-Inference Efficient Task Adaptation

no code implementations14 Mar 2024 Yizhe Xiong, Hui Chen, Tianxiang Hao, Zijia Lin, Jungong Han, Yuesong Zhang, Guoxin Wang, Yongjun Bao, Guiguang Ding

Consequently, simply combining the two cannot guarantee both training efficiency and inference efficiency at minimal cost.

Model Compression

Drop your Decoder: Pre-training with Bag-of-Word Prediction for Dense Passage Retrieval

1 code implementation20 Jan 2024 Guangyuan Ma, Xing Wu, Zijia Lin, Songlin Hu

In this study, we aim to shed light on this issue by revealing that masked auto-encoder (MAE) pre-training with enhanced decoding significantly improves the term coverage of input tokens in dense representations, compared to vanilla BERT checkpoints.

Passage Retrieval Retrieval +1
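A minimal sketch of the bag-of-word prediction idea above, under simplifying assumptions: a single dense passage vector is projected to vocabulary logits and trained with a multi-label binary cross-entropy against a multi-hot target of the tokens present in the passage. The projection matrix and token IDs are illustrative, not the paper's actual architecture.

```python
import numpy as np

def bow_targets(token_ids, vocab_size):
    """Multi-hot target marking which vocabulary tokens appear in the passage."""
    target = np.zeros(vocab_size)
    target[list(set(token_ids))] = 1.0
    return target

def bow_prediction_loss(passage_vec, vocab_proj, token_ids):
    """Multi-label BCE between sigmoid vocabulary logits (predicted from the
    single dense passage vector) and the bag-of-words target."""
    logits = vocab_proj @ passage_vec                 # (vocab_size,)
    probs = 1.0 / (1.0 + np.exp(-logits))
    target = bow_targets(token_ids, vocab_proj.shape[0])
    eps = 1e-9
    return -np.mean(target * np.log(probs + eps)
                    + (1 - target) * np.log(1 - probs + eps))

rng = np.random.default_rng(0)
passage_vec = rng.normal(size=16)
vocab_proj = rng.normal(size=(100, 16)) * 0.1       # toy vocab of 100 tokens
loss = bow_prediction_loss(passage_vec, vocab_proj, [3, 7, 7, 42])
```

Because the target only records token presence, no autoregressive decoder is needed — which is the sense in which the title says "drop your decoder".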

RepViT-SAM: Towards Real-Time Segmenting Anything

2 code implementations10 Dec 2023 Ao Wang, Hui Chen, Zijia Lin, Jungong Han, Guiguang Ding

Here, to achieve real-time segmenting anything on mobile devices, following MobileSAM, we replace the heavyweight image encoder in SAM with the RepViT model, yielding the RepViT-SAM model.

KwaiYiiMath: Technical Report

no code implementations11 Oct 2023 Jiayi Fu, Lei Lin, Xiaoyang Gao, Pengli Liu, Zhengzong Chen, Zhirui Yang, ShengNan Zhang, Xue Zheng, Yan Li, Yuliang Liu, Xucheng Ye, Yiqiao Liao, Chao Liao, Bin Chen, Chengru Song, Junchen Wan, Zijia Lin, Fuzheng Zhang, Zhongyuan Wang, Di Zhang, Kun Gai

Recent advancements in large language models (LLMs) have demonstrated remarkable abilities in handling a variety of natural language processing (NLP) downstream tasks, even on mathematical tasks requiring multi-step reasoning.

Ranked #87 on Arithmetic Reasoning on GSM8K (using extra training data)

Arithmetic Reasoning GSM8K +1

Confidence-based Visual Dispersal for Few-shot Unsupervised Domain Adaptation

1 code implementation ICCV 2023 Yizhe Xiong, Hui Chen, Zijia Lin, Sicheng Zhao, Guiguang Ding

To address this issue, recent works consider the Few-shot Unsupervised Domain Adaptation (FUDA) where only a few source samples are labeled, and conduct knowledge transfer via self-supervised learning methods.

Self-Supervised Learning Transfer Learning +1

Pre-training with Large Language Model-based Document Expansion for Dense Passage Retrieval

no code implementations16 Aug 2023 Guangyuan Ma, Xing Wu, Peng Wang, Zijia Lin, Songlin Hu

Concretely, we leverage the capabilities of LLMs for document expansion, i.e., query generation, and effectively transfer expanded knowledge to retrievers using pre-training strategies tailored for passage retrieval.

Contrastive Learning Language Modelling +3
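A toy sketch of the document-expansion step described above: each passage is paired with pseudo-queries produced by a generator to form (query, positive passage) pre-training pairs for a retriever. The `toy_generator` below is a hypothetical stand-in for the LLM; the pairing logic, not the generator, is the point.

```python
def expand_with_pseudo_queries(passages, generate_queries):
    """Pair each passage with generated pseudo-queries to form
    (query, positive-passage-id) pre-training pairs for a retriever."""
    pairs = []
    for pid, text in enumerate(passages):
        for query in generate_queries(text):
            pairs.append((query, pid))
    return pairs

# hypothetical stand-in for an LLM-based query generator
def toy_generator(text):
    return [f"what is discussed in: {text[:20]}"]

pairs = expand_with_pseudo_queries(
    ["dense retrieval maps text to vectors"], toy_generator)
```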

RepViT: Revisiting Mobile CNN From ViT Perspective

7 code implementations18 Jul 2023 Ao Wang, Hui Chen, Zijia Lin, Jungong Han, Guiguang Ding

Recently, lightweight Vision Transformers (ViTs) have demonstrated superior performance and lower latency, compared with lightweight Convolutional Neural Networks (CNNs), on resource-constrained mobile devices.

CoT-MAE v2: Contextual Masked Auto-Encoder with Multi-view Modeling for Passage Retrieval

no code implementations5 Apr 2023 Xing Wu, Guangyuan Ma, Peng Wang, Meng Lin, Zijia Lin, Fuzheng Zhang, Songlin Hu

As an effective representation bottleneck pretraining technique, the contextual masked auto-encoder utilizes contextual embedding to assist in the reconstruction of passages.

Passage Retrieval Retrieval +1

Query-as-context Pre-training for Dense Passage Retrieval

2 code implementations19 Dec 2022 Xing Wu, Guangyuan Ma, Wanhui Qian, Zijia Lin, Songlin Hu

Recently, methods have been developed to improve the performance of dense passage retrieval by using context-supervised pre-training.

Contrastive Learning Passage Retrieval +1

RaP: Redundancy-aware Video-language Pre-training for Text-Video Retrieval

1 code implementation13 Oct 2022 Xing Wu, Chaochen Gao, Zijia Lin, Zhongyuan Wang, Jizhong Han, Songlin Hu

Sparse sampling is also likely to miss important frames corresponding to some text portions, resulting in textual redundancy.

Contrastive Learning Retrieval +1

InfoCSE: Information-aggregated Contrastive Learning of Sentence Embeddings

2 code implementations8 Oct 2022 Xing Wu, Chaochen Gao, Zijia Lin, Jizhong Han, Zhongyuan Wang, Songlin Hu

Contrastive learning has been extensively studied in sentence embedding learning, which assumes that the embeddings of different views of the same sentence are closer to each other than to those of other sentences.

Contrastive Learning Language Modelling +5
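The contrastive objective described above can be sketched as an in-batch InfoNCE loss, assuming cosine similarity and a temperature hyperparameter — a generic formulation, not InfoCSE's specific information-aggregation design:

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.05):
    """In-batch InfoNCE loss: row i of `positives` is the positive for row i
    of `anchors`; every other row in the batch serves as a negative."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sims = (a @ p.T) / temperature           # cosine similarity matrix
    sims -= sims.max(axis=1, keepdims=True)  # stabilize the softmax
    log_probs = sims - np.log(np.exp(sims).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
views = rng.normal(size=(8, 32))
aligned_loss = info_nce(views, views)           # identical views: near-zero loss
shuffled_loss = info_nce(views, np.roll(views, 1, axis=0))
```

When each anchor matches its own positive, the loss is near zero; mismatched pairs drive it up, which is exactly the "different views are closer" assumption made explicit.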

ConTextual Masked Auto-Encoder for Dense Passage Retrieval

2 code implementations16 Aug 2022 Xing Wu, Guangyuan Ma, Meng Lin, Zijia Lin, Zhongyuan Wang, Songlin Hu

Dense passage retrieval aims to retrieve the relevant passages of a query from a large corpus based on dense representations (i.e., vectors) of the query and the passages.

Passage Retrieval Retrieval +1
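The retrieval step defined above — scoring passages by the similarity of their dense vectors to the query vector — can be sketched minimally (toy 2-dimensional vectors; real systems use high-dimensional encoder outputs and approximate nearest-neighbor search):

```python
import numpy as np

def retrieve(query_vec, passage_vecs, k=2):
    """Score passages by inner product with the query vector; return top-k indices."""
    scores = passage_vecs @ query_vec
    return np.argsort(-scores)[:k].tolist()

passage_vecs = np.array([[1.0, 0.0],
                         [0.0, 1.0],
                         [0.7, 0.7]])
query_vec = np.array([0.9, 0.1])
top = retrieve(query_vec, passage_vecs)  # passage 0 scores highest, then 2
```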

UniTrans: Unifying Model Transfer and Data Transfer for Cross-Lingual Named Entity Recognition with Unlabeled Data

1 code implementation15 Jul 2020 Qianhui Wu, Zijia Lin, Börje F. Karlsson, Biqing Huang, Jian-Guang Lou

Prior works in cross-lingual named entity recognition (NER) with no/little labeled data fall into two primary categories: model transfer based and data transfer based methods.

Cross-Lingual NER Knowledge Distillation +4

Shallow Feature Based Dense Attention Network for Crowd Counting

no code implementations17 Jun 2020 Yunqi Miao, Zijia Lin, Guiguang Ding, Jungong Han

In this paper, we propose a Shallow feature based Dense Attention Network (SDANet) for crowd counting from still images, which diminishes the impact of backgrounds via involving a shallow feature based attention model, and meanwhile, captures multi-scale information via densely connecting hierarchical image features.

Crowd Counting

Single-/Multi-Source Cross-Lingual NER via Teacher-Student Learning on Unlabeled Data in Target Language

1 code implementation ACL 2020 Qianhui Wu, Zijia Lin, Börje F. Karlsson, Jian-Guang Lou, Biqing Huang

However, such methods either are not applicable if the labeled data in the source languages is unavailable, or do not leverage information contained in unlabeled data in the target language.

Cross-Lingual NER named-entity-recognition +2
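The teacher-student transfer on unlabeled target-language data can be sketched as a soft-label distillation loss — the student matches the teacher's per-token label distribution. The temperature and the toy logits below are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def softmax(logits, T):
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def soft_label_loss(student_logits, teacher_logits, T=2.0):
    """Cross-entropy of the student's softened distribution against the
    teacher's soft labels, averaged over tokens."""
    t = softmax(teacher_logits, T)
    s = softmax(student_logits, T)
    return -np.mean(np.sum(t * np.log(s + 1e-9), axis=-1))

teacher = np.array([[2.0, 0.5, 0.1]])        # per-token label logits from the teacher
matched = soft_label_loss(teacher, teacher)  # student agrees with the teacher
shifted = soft_label_loss(teacher[:, ::-1], teacher)
```

Because only the teacher's predictions are needed, no gold labels in the target language are required — which is what makes the approach work on unlabeled data.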

Enhanced Meta-Learning for Cross-lingual Named Entity Recognition with Minimal Resources

1 code implementation14 Nov 2019 Qianhui Wu, Zijia Lin, Guoxin Wang, Hui Chen, Börje F. Karlsson, Biqing Huang, Chin-Yew Lin

For languages with no annotated resources, transferring knowledge from rich-resource languages is an effective solution for named entity recognition (NER).

Cross-Lingual NER Meta-Learning +4

Semantics-Preserving Hashing for Cross-View Retrieval

no code implementations CVPR 2015 Zijia Lin, Guiguang Ding, Mingqing Hu, Jian-Min Wang

With benefits of low storage costs and high query speeds, hashing methods are widely researched for efficiently retrieving large-scale data, which commonly contains multiple views, e.g., a news report with images, videos and texts.

Retrieval
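The "low storage costs and high query speeds" of hashing come from comparing compact binary codes by Hamming distance instead of comparing real-valued vectors. A minimal sketch with 4-bit toy codes (real systems use 32-128-bit codes and learned hash functions):

```python
def hamming_distance(a, b):
    """Number of differing bits between two integer-packed hash codes."""
    return bin(a ^ b).count("1")

def hash_retrieve(query_code, db_codes, k=2):
    """Rank database items by Hamming distance to the query code; return top-k indices."""
    order = sorted(range(len(db_codes)),
                   key=lambda i: hamming_distance(query_code, db_codes[i]))
    return order[:k]

db_codes = [0b1010, 0b0101, 0b1011]
nearest = hash_retrieve(0b1010, db_codes)  # the identical code ranks first
```

In the cross-view setting of the paper, the learned hash functions map different views (image, video, text) into a shared Hamming space so this same distance works across modalities.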

Image Tag Completion via Image-Specific and Tag-Specific Linear Sparse Reconstructions

no code implementations CVPR 2013 Zijia Lin, Guiguang Ding, Mingqing Hu, Jian-Min Wang, Xiaojun Ye

Though widely utilized for facilitating image management, user-provided image tags are usually incomplete and insufficient to describe the full semantic content of the corresponding images. This degrades performance in tag-dependent applications and thus necessitates effective tag completion methods.

Management TAG
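A toy sketch of completion by linear reconstruction: an image's incomplete tag vector is reconstructed as a combination of its neighbors' tag vectors, and tags whose reconstructed score passes a threshold are added. Plain least squares stands in here for the paper's sparse solver, and the threshold and tag matrix are illustrative assumptions:

```python
import numpy as np

def complete_tags(partial_tags, neighbor_tags, threshold=0.4):
    """Reconstruct an image's tag vector as a least-squares combination of its
    neighbors' tag vectors, then add any tag whose score passes the threshold."""
    weights, *_ = np.linalg.lstsq(neighbor_tags.T, partial_tags, rcond=None)
    scores = neighbor_tags.T @ weights
    return np.maximum(partial_tags, (scores > threshold).astype(float))

neighbor_tags = np.array([[1., 1., 0.],   # neighbors where tags 0 and 1 co-occur
                          [1., 1., 0.],
                          [0., 0., 1.]])
partial_tags = np.array([1., 0., 0.])     # only tag 0 was provided by the user
completed = complete_tags(partial_tags, neighbor_tags)
```

Tag 1 gets proposed because it co-occurs with the observed tag 0 among the neighbors, while tag 2 does not.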
