no code implementations • 20 Dec 2024 • Jianming Chen, Yawen Wang, Junjie Wang, Xiaofei Xie, Jun Hu, Qing Wang, Fanjiang Xu
Inspired by counterfactual reasoning, a larger change in reward caused by randomizing an agent's action indicates that the agent is more important.
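As a rough illustration of this counterfactual idea, the sketch below estimates an agent's importance as the change in episode return when its action is replaced by a random one; the rollout helper and its arguments are hypothetical stand-ins, not the paper's implementation.

```python
import numpy as np

def action_importance(env_rollout, agent_idx, policy, n_samples=20):
    """Estimate an agent's importance as the mean absolute change in
    episode return when its action is randomized (counterfactual probe).

    `env_rollout(policy, override)` is a hypothetical helper that runs one
    episode and returns the total reward; `override` maps an agent index
    to a function replacing that agent's action with a random sample.
    """
    baseline = np.mean([env_rollout(policy, override=None)
                        for _ in range(n_samples)])
    randomized = np.mean([
        env_rollout(policy,
                    override={agent_idx: lambda act, space: space.sample()})
        for _ in range(n_samples)
    ])
    # A larger reward change under the randomized action indicates a more
    # important agent.
    return abs(baseline - randomized)
```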
1 code implementation • 18 Dec 2024 • Jun Hu, Bryan Hooi, Bingsheng He, Yinwei Wei
Our results indicate that the optimal $K$ for certain modalities on specific datasets can be as low as 1 or 2, which may restrict the GNNs' capacity to capture global information.
Ranked #1 on Multi-modal Recommendation on Amazon Clothing
no code implementations • 3 Dec 2024 • Jun Hu, Zhang Chen, Zhong Li, Yi Xu, Juyong Zhang
In this work, we propose SparseLGS to address the challenge of 3D scene understanding with pose-free and sparse view input images.
no code implementations • 3 Oct 2024 • Yizhang Zou, Xuegang Hu, Peipei Li, Jun Hu, You Wu
Motivated by this, we propose an online multi-label classification algorithm under Noisy and Changing Label Distribution (NCLD).
no code implementations • 5 Sep 2024 • Yihao Zhao, Enhao Zhong, Cuiyun Yuan, Yang Li, Man Zhao, Chunxia Li, Jun Hu, Chenbin Liu
We propose TG-LMM (Text-Guided Large Multi-Modal Model), a novel approach that leverages textual descriptions of organs to enhance segmentation accuracy in medical images.
no code implementations • 1 Aug 2024 • Houye Ji, Ye Tang, Zhaoxin Chen, Lixi Deng, Jun Hu, Lei Su
In this paper, we first leverage the dual graph to model the coexistence of user-video and user-item interactions in video-driven e-commerce, and innovatively reduce user preference understanding to a graph matching problem.
no code implementations • 20 May 2024 • Yihao Zhao, Cuiyun Yuan, Ying Liang, Yang Li, Chunxia Li, Man Zhao, Jun Hu, Wei Liu, Chenbin Liu
Automatic segmentation can be used to reduce physician workload and improve consistency.
no code implementations • 11 Feb 2024 • Jun Hu, Pengzhan Jin
We propose a hybrid iterative method for PDEs based on MIONet, which combines a traditional numerical iterative solver with the recent, powerful machine-learning approach of neural operators. We further systematically analyze its theoretical properties, including the convergence condition, the spectral behavior, and the convergence rate, in terms of the discretization and model-inference errors.
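A minimal sketch of such a hybrid iteration, assuming a linear system A u = b obtained by discretizing the PDE and a trained operator `neural_operator` that maps a residual to an approximate error correction (a stand-in for MIONet, not the paper's implementation):

```python
import numpy as np

def hybrid_solve(A, b, neural_operator, n_outer=50, n_smooth=5, omega=0.8):
    """Alternate a few damped-Jacobi smoothing steps with a learned
    residual-to-error correction. `neural_operator(residual)` stands in
    for a trained operator-learning model; this is an illustrative sketch
    of the hybrid idea, not the method analyzed in the paper."""
    u = np.zeros_like(b, dtype=float)
    D_inv = 1.0 / np.diag(A)
    for _ in range(n_outer):
        for _ in range(n_smooth):            # classical iterative smoother
            u = u + omega * D_inv * (b - A @ u)
        r = b - A @ u                        # current residual
        u = u + neural_operator(r)           # learned correction step
    return u
```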
no code implementations • ACM Transactions on Intelligent Systems and Technology 2023 • Junwei Lv, Yuqi Chu, Jun Hu, Peipei Li, Xuegang Hu
Existing approaches mainly utilize heuristic stopping rules to capture stopping signals from the prediction results of time series classifiers.
Ranked #1 on Early Classification on ECG200
1 code implementation • 7 Nov 2023 • Enhong Liu, Joseph Suarez, Chenhui You, Bo Wu, BingCheng Chen, Jun Hu, Jiaxin Chen, Xiaolong Zhu, Clare Zhu, Julian Togelius, Sharada Mohanty, Weijun Hong, Rui Du, Yibing Zhang, Qinwen Wang, Xinhang Li, Zheng Yuan, Xiang Li, Yuejia Huang, Kun Zhang, Hanhui Yang, Shiqi Tang, Phillip Isola
In this paper, we present the results of the NeurIPS-2022 Neural MMO Challenge, which attracted 500 participants and received over 1,600 submissions.
no code implementations • CCKS 2023 • Guandong Feng, Guoliang Zhu, Shengze Shi, Yue Sun, Zhongyi Fan, Sulin Gao, Jun Hu
Knowledge Base Question Answering (KBQA) is a significant task in natural language processing, aiming to retrieve answers from structured knowledge bases in response to natural language questions.
1 code implementation • 23 Oct 2023 • Jun Hu, Bryan Hooi, Bingsheng He
To achieve low information loss, we introduce a Relation-wise Neighbor Collection component with an Even-odd Propagation Scheme, which aims to collect information from neighbors in a finer-grained way.
Ranked #1 on Heterogeneous Node Classification on OAG-L1-Field
no code implementations • 22 Oct 2023 • Zuoli Tang, ZhaoXin Huan, Zihao Li, Xiaolu Zhang, Jun Hu, Chilin Fu, Jun Zhou, Chenliang Li
We expect that by mixing the user's behaviors across different domains, we can exploit the common knowledge encoded in the pre-trained language model to alleviate the data sparsity and cold start problems.
no code implementations • 3 Aug 2023 • Zachary A. Daniels, Jun Hu, Michael Lomnitz, Phil Miller, Aswin Raghavan, Joe Zhang, Michael Piacentino, David Zhang
This paper presents the Encoder-Adaptor-Reconfigurator (EAR) framework for efficient continual learning under domain shifts.
no code implementations • 28 Feb 2023 • Bo Peng, Jun Hu, Jingtao Zhou, Xuan Gao, Juyong Zhang
To achieve this target, we introduce a continuous and optimizable intrinsic coordinate rather than the original explicit Euclidean coordinate in the hash encoding module of instant-NGP.
no code implementations • 1 Feb 2023 • Jun Hu, Pengzhan Jin
Here we utilize a low-rank tensor model (LTM) as a function approximator, combined with the gradient descent method, to solve eigenvalue problems including the Laplacian operator and the harmonic oscillator.
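As a toy example of the underlying optimization, the sketch below minimizes the Rayleigh quotient of the discretized 1D harmonic oscillator by plain gradient descent; it uses a full grid vector rather than the paper's low-rank tensor model, so it only illustrates the eigenvalue-by-gradient-descent idea.

```python
import numpy as np

# 1D harmonic oscillator H = -1/2 d^2/dx^2 + 1/2 x^2 on a finite-difference
# grid; the exact ground-state energy is 0.5.
n, L = 128, 12.0
x = np.linspace(-L / 2, L / 2, n)
h = x[1] - x[0]
lap = (np.diag(np.full(n - 1, 1.0), -1) - 2 * np.eye(n)
       + np.diag(np.full(n - 1, 1.0), 1)) / h**2
H = -0.5 * lap + 0.5 * np.diag(x**2)

v = np.exp(-x**2)                       # initial guess on the grid
lr = 0.002
for _ in range(4000):
    v = v / np.linalg.norm(v)
    Hv = H @ v
    energy = v @ Hv                     # Rayleigh quotient (||v|| = 1)
    grad = 2 * (Hv - energy * v)        # gradient of the quotient
    v = v - lr * grad

v = v / np.linalg.norm(v)
print(f"estimated ground-state energy: {v @ H @ v:.4f}")  # ~0.5
```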
no code implementations • CVPR 2023 • Mingzhen Huang, Xiaoxing Li, Jun Hu, Honghong Peng, Siwei Lyu
DETracker outperforms existing state-of-the-art methods on the DogThruGlasses and YouTube-Hand datasets.
no code implementations • 4 Oct 2022 • Bo Peng, Jun Hu, Jingtao Zhou, Juyong Zhang
Extensive experimental results on several different datasets demonstrate the effectiveness and efficiency of SelfNeRF on challenging monocular videos.
no code implementations • 10 Jun 2022 • Indhumathi Kandaswamy, Saurabh Farkya, Zachary Daniels, Gooitzen van der Wal, Aswin Raghavan, Yuzheng Zhang, Jun Hu, Michael Lomnitz, Michael Isnardi, David Zhang, Michael Piacentino
In this paper we present Hyper-Dimensional Reconfigurable Analytics at the Tactical Edge (HyDRATE) using low-SWaP embedded hardware that can perform real-time reconfiguration at the edge leveraging non-MAC (free of floating-point MultiplyACcumulate operations) deep neural nets (DNN) combined with hyperdimensional (HD) computing accelerators.
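For readers unfamiliar with hyperdimensional computing, the sketch below shows the style of non-MAC computation such accelerators target (bipolar hypervectors combined by binding and bundling, compared by similarity); it is a generic illustration, not the HyDRATE design.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000                      # hypervector dimensionality

def random_hv():
    """Random bipolar hypervector."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding (element-wise multiply) associates two hypervectors."""
    return a * b

def bundle(hvs):
    """Bundling (element-wise majority) superposes several hypervectors."""
    return np.sign(np.sum(hvs, axis=0))

# Encode a sample by binding per-feature "role" vectors with quantized
# "level" vectors and bundling the results, then score it against a class
# prototype by normalized dot-product similarity.
roles = [random_hv() for _ in range(5)]
levels = [random_hv() for _ in range(5)]
sample = bundle([bind(r, l) for r, l in zip(roles, levels)])
prototype = random_hv()
similarity = float(sample @ prototype) / D
```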
2 code implementations • 5 Apr 2022 • Jun Hu, Bryan Hooi, Shengsheng Qian, Quan Fang, Changsheng Xu
Based on a Markov process that trades off two types of distances, we present Markov Graph Diffusion Collaborative Filtering (MGDCF) to generalize some state-of-the-art GNN-based CF models.
Ranked #4 on Multi-modal Recommendation on Amazon Sports
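A generic sketch of the kind of Markov-style graph diffusion this builds on (an APPNP / personalized-PageRank form that trades off staying close to the initial embeddings against smoothing over the graph); it is illustrative and not the exact MGDCF formulation.

```python
import numpy as np

def markov_diffusion(A_hat, X, alpha=0.1, k=10):
    """Iterative diffusion over a normalized adjacency matrix A_hat:
    each step mixes the initial embeddings X (weight alpha) with the
    neighborhood-smoothed embeddings (weight 1 - alpha)."""
    Z = X.copy()
    for _ in range(k):
        Z = alpha * X + (1.0 - alpha) * (A_hat @ Z)
    return Z
```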
no code implementations • 11 Mar 2022 • Hanxing Chi, Baihong Lin, Jun Hu, Liang Wang
Recently, attention mechanisms have been extensively investigated in computer vision, but few of them show excellent performance on both large and mobile networks.
1 code implementation • 2 Dec 2021 • Jun Hu, Shengsheng Qian, Quan Fang, Changsheng Xu
Recently the field has advanced from local propagation schemes that focus on local neighbors towards extended propagation schemes that can directly deal with extended neighbors consisting of both local and high-order neighbors.
1 code implementation • 19 Nov 2021 • Desheng Cai, Jun Hu, Quan Zhao, Shengsheng Qian, Quan Fang, Changsheng Xu
In this paper, we present GRecX, an open-source TensorFlow framework for benchmarking GNN-based recommendation models in an efficient and unified way.
no code implementations • ICCV 2021 • Ze Wang, Zichen Miao, Jun Hu, Qiang Qiu
Applying feature-dependent network weights has been proven effective in many fields.
no code implementations • 12 Aug 2021 • Shoubin Li, Xuyan Ma, Shuaiqun Pan, Jun Hu, Lin Shi, Qing Wang
In the second stage, the deep visual, shallow visual, and text features are extracted for fusion to identify the category blocks of documents.
no code implementations • 2 Mar 2021 • Lucas D. Young, Fitsum A. Reda, Rakesh Ranjan, Jon Morton, Jun Hu, Yazhu Ling, Xiaoyu Xiang, David Liu, Vikas Chandra
(2) A novel Feature Matching Loss that allows knowledge distillation from large denoising networks in the form of a perceptual content loss.
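A minimal sketch of a feature matching loss of this general kind, assuming aligned intermediate feature maps from a small student denoiser and a frozen large teacher; the exact loss used in the paper may differ.

```python
import torch
import torch.nn.functional as F

def feature_matching_loss(student_feats, teacher_feats):
    """Average L1 distance between corresponding intermediate feature maps
    of the student and a frozen teacher (a perceptual-content-style loss
    for knowledge distillation). Assumes both lists hold tensors of
    matching shapes; illustrative sketch only."""
    loss = 0.0
    for fs, ft in zip(student_feats, teacher_feats):
        loss = loss + F.l1_loss(fs, ft.detach())
    return loss / len(student_feats)
```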
1 code implementation • 27 Jan 2021 • Jun Hu, Shengsheng Qian, Quan Fang, Youze Wang, Quan Zhao, Huaiwen Zhang, Changsheng Xu
We introduce tf_geometric, an efficient and friendly library for graph deep learning, which is compatible with both TensorFlow 1.x and 2.x.
no code implementations • 22 Dec 2020 • Shiqi Sheng, Haijun Yang, Liuhua Mu, Zixin Wang, Jihong Wang, Peng Xiu, Jun Hu, Xin Zhang, Feng Zhang, Haiping Fang
We experimentally demonstrated that the AYFFF self-assemblies adsorbed with various monovalent cations (Na+, K+, and Li+) show unexpectedly super strong paramagnetism.
Biological Physics
no code implementations • 1 Nov 2020 • Kunjin Chen, Tomáš Vantuch, Yu Zhang, Jun Hu, Jinliang He
The detection and characterization of partial discharge (PD) are crucial for the insulation diagnosis of overhead lines with covered conductors.
no code implementations • 28 Jan 2020 • Shoubin Li, Wenzao Cui, Yujiang Liu, Xuran Ming, Jun Hu, Yuanzhe Hu, Qing Wang
Pre-trained models such as BERT are widely used in NLP tasks and are fine-tuned to improve the performance of various NLP tasks consistently.
no code implementations • 17 Nov 2019 • Kunjin Chen, Yu Zhang, Qin Wang, Jun Hu, Hang Fan, Jinliang He
Non-intrusive load monitoring addresses the challenging task of decomposing the aggregate signal of a household's electricity consumption into appliance-level data without installing dedicated meters.
1 code implementation • 22 Dec 2018 • Kunjin Chen, Jun Hu, Yu Zhang, Zhanqing Yu, Jinliang He
This paper develops a novel graph convolutional network (GCN) framework for fault location in power distribution networks.
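For context, one graph convolution layer in the standard form H = sigma(D^-1/2 (A + I) D^-1/2 X W), applied to bus-level features of a distribution network whose topology is given by the adjacency matrix A, might look like the sketch below; the paper's architecture and input features are richer than this.

```python
import numpy as np

def gcn_layer(A, X, W, activation=np.tanh):
    """One standard graph convolution step: add self-loops, symmetrically
    normalize the adjacency, propagate the node features X, and apply a
    linear transform W followed by a nonlinearity. Illustrative only."""
    A_tilde = A + np.eye(A.shape[0])             # add self-loops
    d = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt    # symmetric normalization
    return activation(A_hat @ X @ W)
```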
no code implementations • 6 Jun 2018 • Kunjin Chen, Qin Wang, Ziyu He, Kunlong Chen, Jun Hu, Jinliang He
A convolutional sequence to sequence non-intrusive load monitoring model is proposed in this paper.
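A minimal sketch of a convolutional sequence-to-sequence mapping from a window of aggregate mains readings to a same-length appliance trace; channel and kernel sizes here are illustrative assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class ConvSeq2SeqNILM(nn.Module):
    """Stack of 1D convolutions that maps a window of aggregate power
    readings to a same-length estimate of one appliance's consumption."""
    def __init__(self, hidden=32, kernel=5):
        super().__init__()
        pad = kernel // 2
        self.net = nn.Sequential(
            nn.Conv1d(1, hidden, kernel, padding=pad), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel, padding=pad), nn.ReLU(),
            nn.Conv1d(hidden, 1, kernel, padding=pad),
        )

    def forward(self, mains):          # mains: (batch, 1, window_length)
        return self.net(mains)         # appliance estimate, same length
```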
1 code implementation • 30 May 2018 • Kunjin Chen, Kunlong Chen, Qin Wang, Ziyu He, Jun Hu, Jinliang He
We present in this paper a model for forecasting short-term power loads based on deep residual networks.
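A toy residual block of the kind such forecasters stack, shown only to illustrate the skip-connection structure; the layer sizes are assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Fully connected residual block: the block learns a correction that
    is added back onto its input via a skip connection."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x):
        return x + self.net(x)   # skip connection
```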
no code implementations • 7 Apr 2015 • Orazio Gallo, Alejandro Troccoli, Jun Hu, Kari Pulli, Jan Kautz
Image registration for stack-based HDR photography is challenging.
1 code implementation • CVPR 2013 • Jun Hu, Orazio Gallo, Kari Pulli, Xiaobai Sun
We present a novel method for aligning images in an HDR (high-dynamic-range) image stack to produce a new exposure stack where all the images are aligned and appear as if they were taken simultaneously, even in the case of highly dynamic scenes.