Search Results for author: BingYi Jing

Found 10 papers, 2 papers with code

Parametric Scaling Law of Tuning Bias in Conformal Prediction

no code implementations • 5 Feb 2025 • Hao Zeng, Kangdao Liu, BingYi Jing, Hongxin Wei

In this work, we empirically find that the tuning bias (the coverage gap introduced by using the same dataset for both tuning and calibration) is negligible for simple parameter tuning in many conformal prediction methods.

Conformal Prediction • Holdout Set +2
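The coverage guarantee at stake here can be illustrated with a minimal split conformal prediction sketch in plain NumPy. This is a generic illustration, not the paper's code; the toy model and all names are invented for this example:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.1  # target miscoverage rate

# Toy regression: a fixed pretrained model f(x) = 2x with Gaussian noise.
x_cal = rng.normal(size=500)
y_cal = 2 * x_cal + rng.normal(size=500)
x_test = rng.normal(size=500)
y_test = 2 * x_test + rng.normal(size=500)

# Nonconformity scores on a held-out calibration set: absolute residuals.
scores = np.abs(y_cal - 2 * x_cal)

# Conformal quantile with the finite-sample (n + 1) correction.
n = len(scores)
level = np.ceil((n + 1) * (1 - alpha)) / n
q = np.quantile(scores, level, method="higher")

# Prediction interval [f(x) - q, f(x) + q]; check empirical coverage.
covered = np.abs(y_test - 2 * x_test) <= q
print(f"empirical coverage: {covered.mean():.3f}")  # ~= 1 - alpha = 0.9
```

With a separate calibration set, as here, the empirical coverage concentrates around 1 - alpha; the tuning bias the paper studies arises when the same calibration data is also used to tune the method's parameters.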

ChineseSafe: A Chinese Benchmark for Evaluating Safety in Large Language Models

no code implementations • 24 Oct 2024 • Hengxiang Zhang, Hongfu Gao, Qiang Hu, Guanhua Chen, Lili Yang, BingYi Jing, Hongxin Wei, Bing Wang, Haifeng Bai, Lei Yang

While previous works have introduced several benchmarks to evaluate the safety risk of LLMs, the community still has a limited understanding of current LLMs' capability to recognize illegal and unsafe content in Chinese contexts.

Fine-tuning can Help Detect Pretraining Data from Large Language Models

no code implementations • 9 Oct 2024 • Hengxiang Zhang, Songxin Zhang, BingYi Jing, Hongxin Wei

In light of this, we introduce a novel and effective method termed Fine-tuned Score Deviation (FSD), which improves the performance of current scoring functions for pretraining data detection.

Knowledge Distillation with Multi-granularity Mixture of Priors for Image Super-Resolution

no code implementations • 3 Apr 2024 • Simiao Li, Yun Zhang, Wei Li, Hanting Chen, Wenjia Wang, BingYi Jing, Shaohui Lin, Jie Hu

Knowledge distillation (KD) is a promising yet challenging model compression technique that transfers rich learning representations from a well-performing but cumbersome teacher model to a compact student model.

Image Super-Resolution • Knowledge Distillation +1
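The distillation setup described above can be sketched with the standard soft-label KD objective that such work builds on. This is the generic Hinton-style loss, not the paper's multi-granularity mixture-of-priors method; all names are illustrative:

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Soft-label distillation: KL(teacher || student) on temperature-softened
    logits, scaled by T^2 so gradient magnitudes stay comparable across T."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    return (T ** 2) * kl.mean()

# Identical logits give zero loss; diverging logits increase it.
t = np.array([[2.0, 0.5, -1.0]])
print(kd_loss(t, t))                  # 0.0
print(kd_loss(np.zeros((1, 3)), t))  # > 0
```

The temperature T spreads probability mass over non-target classes, which is what lets the student learn the teacher's "dark knowledge" about class similarities rather than just the hard label.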

Enhanced Bayesian Personalized Ranking for Robust Hard Negative Sampling in Recommender Systems

no code implementations • 28 Mar 2024 • Kexin Shi, Jing Zhang, Linjiajie Fang, Wenjia Wang, BingYi Jing

In implicit collaborative filtering, hard negative mining techniques have been developed to accelerate and improve recommendation model learning.

Collaborative Filtering • Recommendation Systems
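The plain Bayesian Personalized Ranking objective that hard negative mining plugs into can be sketched as follows. This is the standard BPR loss, not the paper's enhanced variant; the embeddings and names are invented for illustration:

```python
import numpy as np

def bpr_loss(u_emb, pos_emb, neg_emb):
    """Standard BPR: -log sigmoid(score(u, pos) - score(u, neg)) with
    dot-product scores. Hard negative mining would pick neg_emb with a
    high score for u, which makes this gradient more informative."""
    x_uij = np.sum(u_emb * pos_emb, axis=1) - np.sum(u_emb * neg_emb, axis=1)
    return -np.mean(np.log(1.0 / (1.0 + np.exp(-x_uij))))

rng = np.random.default_rng(0)
u = rng.normal(size=(8, 16))
pos = u + 0.1 * rng.normal(size=(8, 16))  # items aligned with each user
neg = rng.normal(size=(8, 16))            # random (easy) negatives
print(bpr_loss(u, pos, neg))  # small: positives already outscore negatives
```

Random negatives like these are "easy" and yield near-zero gradients; the false-negative risk the paper targets appears when aggressively mined hard negatives are actually unobserved positives.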

TorchCP: A Python Library for Conformal Prediction

1 code implementation • 20 Feb 2024 • Jianguo Huang, Jianqing Song, Xuanning Zhou, BingYi Jing, Hongxin Wei

Conformal Prediction (CP) has attracted great attention from the research community due to its strict theoretical guarantees.

Conformal Prediction • Deep Learning +2

Exploring Learning Complexity for Efficient Downstream Dataset Pruning

no code implementations • 8 Feb 2024 • Wenyu Jiang, Zhenlong Liu, Zejian Xie, Songxin Zhang, BingYi Jing, Hongxin Wei

In this paper, we propose Distorting-based Learning Complexity (DLC), a straightforward, novel, and training-free hardness score that efficiently identifies informative images and instructions in the downstream dataset.

Informativeness

Lyrics: Boosting Fine-grained Language-Vision Alignment and Comprehension via Semantic-aware Visual Objects

no code implementations • 8 Dec 2023 • Junyu Lu, Dixiang Zhang, Songxin Zhang, Zejian Xie, Zhuoyang Song, Cong Lin, Jiaxing Zhang, BingYi Jing, Pingjian Zhang

During the instruction fine-tuning stage, we introduce semantic-aware visual feature extraction, a crucial method that enables the model to extract informative features from concrete visual objects.

Image Captioning • object-detection +5

Data Upcycling Knowledge Distillation for Image Super-Resolution

1 code implementation • 25 Sep 2023 • Yun Zhang, Wei Li, Simiao Li, Hanting Chen, Zhijun Tu, Wenjia Wang, BingYi Jing, Shaohui Lin, Jie Hu

Knowledge distillation (KD) compresses deep neural networks by transferring task-related knowledge from cumbersome pre-trained teacher models to compact student models.

Image Super-Resolution • Knowledge Distillation +1

Enhancing Recommender Systems: A Strategy to Mitigate False Negative Impact

no code implementations • 25 Nov 2022 • Kexin Shi, Yun Zhang, BingYi Jing, Wenjia Wang

In the implicit collaborative filtering (CF) task of recommender systems, recent works mainly focus on model structure design with promising techniques such as graph neural networks (GNNs).

Collaborative Filtering • Recommendation Systems
