Search Results for author: Ge Jin

Found 7 papers, 3 papers with code

Cross-to-merge training with class balance strategy for learning with noisy labels

1 code implementation • Expert Systems with Applications 2024 • Qian Zhang, Yi Zhu, Ming Yang, Ge Jin, YingWen Zhu, Qiu Chen

Although sample selection is a mainstream method for learning with noisy labels, aiming to mitigate the impact of noisy labels during model training, the testing performance of these methods fluctuates significantly across different noise rates and noise types.

Learning with noisy labels

pTSE: A Multi-model Ensemble Method for Probabilistic Time Series Forecasting

no code implementations • 16 May 2023 • Yunyi Zhou, Zhixuan Chu, Yijia Ruan, Ge Jin, Yuchen Huang, Sheng Li

However, the choice of model relies heavily on the characteristics of the input time series and on the fixed distribution that the model is based on.

Probabilistic Time Series Forecasting • Time Series

MixSeq: Connecting Macroscopic Time Series Forecasting with Microscopic Time Series Data

no code implementations • NeurIPS 2021 • Zhibo Zhu, Ziqi Liu, Ge Jin, Zhiqiang Zhang, Lei Chen, Jun Zhou, Jianyong Zhou

Time series forecasting is widely used in business intelligence, e.g., to forecast stock market prices and sales and to help analyze data trends.

Time Series • Time Series Forecasting

FANDA: A Novel Approach to Perform Follow-up Query Analysis

1 code implementation • 24 Jan 2019 • Qian Liu, Bei Chen, Jian-Guang Lou, Ge Jin, Dongmei Zhang

NLIDB allows users to search databases using natural language instead of SQL-like query languages.

Highly Efficient 8-bit Low Precision Inference of Convolutional Neural Networks with IntelCaffe

1 code implementation • 4 May 2018 • Jiong Gong, Haihao Shen, Guoming Zhang, Xiaoli Liu, Shane Li, Ge Jin, Niharika Maheshwari, Evarist Fomenko, Eden Segal

High-throughput and low-latency inference of deep neural networks is critical for the deployment of deep learning applications.

Model Optimization
