1 code implementation • 11 Mar 2025 • Xiwen Chen, Wenhui Zhu, Peijie Qiu, Hao Wang, Huayu Li, Haiyu Wu, Aristeidis Sotiras, Yalin Wang, Abolfazl Razi
Vision-language models (VLMs) such as CLIP demonstrate strong performance but struggle when adapted to downstream tasks.
no code implementations • 20 Feb 2025 • Mingfu Liang, Xi Liu, Rong Jin, Boyang Liu, Qiuling Suo, Qinghai Zhou, Song Zhou, Laming Chen, Hua Zheng, Zhiyuan Li, Shali Jiang, Jiyan Yang, Xiaozhen Xia, Fan Yang, Yasmine Badr, Ellie Wen, Shuyu Xu, Hansey Chen, Zhengyu Zhang, Jade Nie, Chunzhi Yang, Zhichen Zeng, Weilin Zhang, Xingliang Huang, Qianru Li, Shiquan Wang, Evelyn Lyu, Wenjing Lu, Rui Zhang, Wenjun Wang, Jason Rudy, Mengyue Hang, Kai Wang, Yinbin Ma, Shuaiwen Wang, Sihan Zeng, Tongyi Tang, Xiaohan Wei, Longhao Jin, Jamey Zhang, Marcus Chen, Jiayi Zhang, Angie Huang, Chi Zhang, Zhengli Zhao, Jared Yang, Qiang Jin, Xian Chen, Amit Anand Amlesahwaram, Lexi Song, Liang Luo, Yuchen Hao, Nan Xiao, Yavuz Yetim, Luoshang Pan, Gaoxiang Liu, Yuxi Hu, Yuzhen Huang, Jackie Xu, Rich Zhu, Xin Zhang, Yiqun Liu, Hang Yin, Yuxin Chen, Buyun Zhang, Xiaoyi Liu, Xingyuan Wang, Wenguang Mao, Zhijing Li, Zhehui Zhou, Feifan Gu, Qin Huang, Chonglin Sun, Nancy Yu, Shuo Gu, Shupin Mao, Benjamin Au, Jingzheng Qin, Peggy Yao, Jae-Woo Choi, Bin Gao, Ernest Wang, Lei Zhang, Wen-Yen Chen, Ted Lee, Jay Zha, Yi Meng, Alex Gong, Edison Gao, Alireza Vahdatpour, Yiping Han, Yantao Yao, Toshinari Kureha, Shuo Chang, Musharaf Sultan, John Bocharov, Sagar Chordia, Xiaorui Gan, Peng Sun, Rocky Liu, Bo Long, Wenlin Chen, Santanu Kolay, Huayu Li
Second, large volumes of data arrive in streaming mode, with data distributions shifting dynamically as new users/ads join and existing users/ads leave the system.
1 code implementation • 27 Jan 2025 • Huayu Li, Xiwen Chen, Ci Zhang, Stuart F. Quan, William D. S. Killgore, Shu-Fen Wung, Chen X. Chen, Geng Yuan, Jin Lu, Ao Li
Large language models (LLMs) exhibit remarkable capabilities in visual inspection of medical time-series data, achieving proficiency comparable to human clinicians.
no code implementations • 6 Jan 2025 • Xiwen Chen, Peijie Qiu, Wenhui Zhu, Huayu Li, Hao Wang, Aristeidis Sotiras, Yalin Wang, Abolfazl Razi
Since its introduction, the transformer has shifted the development trajectory away from traditional models (e.g., RNN, MLP) in time series forecasting, which is attributed to its ability to capture global dependencies within temporal tokens.
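To make the notion of global dependencies concrete, here is a minimal, self-contained sketch (not from the paper) of scaled dot-product self-attention over temporal tokens: every time step attends to every other, so the mixing is global by construction. The shapes and random projections below are illustrative assumptions.

```python
import numpy as np

def self_attention(tokens, d_k=16, seed=0):
    """tokens: (T, d) array of temporal tokens; returns (T, d_k) globally mixed tokens."""
    rng = np.random.default_rng(seed)
    T, d = tokens.shape
    W_q, W_k, W_v = (rng.standard_normal((d, d_k)) for _ in range(3))
    Q, K, V = tokens @ W_q, tokens @ W_k, tokens @ W_v
    scores = Q @ K.T / np.sqrt(d_k)                 # (T, T): every token scores every other token
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over all time steps
    return weights @ V                              # each output mixes all T time steps

series = np.random.randn(96, 8)   # 96 time steps, 8 features per temporal token (assumed sizes)
print(self_attention(series).shape)  # (96, 16)
```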
no code implementations • 4 Jan 2025 • Huixue Zhou, Hengrui Gu, Xi Liu, Kaixiong Zhou, Mingfu Liang, Yongkang Xiao, Srinivas Govindan, Piyush Chawla, Jiyan Yang, Xiangfei Meng, Huayu Li, Buyun Zhang, Liang Luo, Wen-Yen Chen, Yiping Han, Bo Long, Rui Zhang, Tianlong Chen
The deployment of Large Language Models (LLMs) in recommender systems for predicting Click-Through Rates (CTR) necessitates a delicate balance between computational efficiency and predictive accuracy.
no code implementations • 15 Nov 2024 • Zhichen Zeng, Xiaolong Liu, Mengyue Hang, Xiaoyi Liu, Qinghai Zhou, Chaofei Yang, Yiqun Liu, Yichen Ruan, Laming Chen, Yuxin Chen, Yujia Hao, Jiaqi Xu, Jade Nie, Xi Liu, Buyun Zhang, Wei Wen, Siyang Yuan, Kai Wang, Wen-Yen Chen, Yiping Han, Huayu Li, Chunzhi Yang, Bo Long, Philip S. Yu, Hanghang Tong, Jiyan Yang
A mutually beneficial integration of heterogeneous information is the cornerstone towards the success of CTR prediction.
1 code implementation • 4 Nov 2024 • Xiwen Chen, Sayed Pedram Haeri Boroujeni, Xin Shu, Huayu Li, Abolfazl Razi
Extensive experiments utilizing real-world data across states and cities in the USA demonstrate that integrating CP into 12 state-of-the-art GNN architectures leads to significant improvements, with gains ranging from 3% to 13% in F1 score and 1.3% to 9% in AUC.
no code implementations • 9 Oct 2024 • Fang Zhou, Yaning Huang, Dong Liang, Dai Li, Zhongke Zhang, Kai Wang, Xiao Xin, Abdallah Aboelela, Zheliang Jiang, Yang Wang, Jeff Song, Wei zhang, Chen Liang, Huayu Li, Chonglin Sun, Hang Yang, Lei Qu, Zhan Shu, Mindi Yuan, Emanuele Maccherani, Taha Hayat, John Guo, Varna Puvvada, Uladzimir Pashkevich
The increasing complexity of deep learning models used for calculating user representations presents significant challenges, particularly with limited computational resources and strict service-level agreements (SLAs).
1 code implementation • 3 Aug 2024 • Zijian Wang, Bin Wang, Haifeng Jing, Huayu Li, Hongbo Dou
The high-level agent walks on the simplified knowledge graph to provide stage-wise hints for the low-level agent walking on the original knowledge graph.
3 code implementations • 6 May 2024 • Xiwen Chen, Peijie Qiu, Wenhui Zhu, Huayu Li, Hao Wang, Aristeidis Sotiras, Yalin Wang, Abolfazl Razi
Deep neural networks, including transformers and convolutional neural networks, have significantly improved multivariate time series classification (MTSC).
1 code implementation • 6 Mar 2024 • Hao Wang, Sayed Pedram Haeri Boroujeni, Xiwen Chen, Ashish Bastola, Huayu Li, Wenhui Zhu, Abolfazl Razi
Specifically, the fusion of Perlin noise in this work significantly improved the quality of synthesized images.
no code implementations • 11 Jan 2024 • Xiwen Chen, Hao Wang, Zhao Zhang, Zhenmin Li, Huayu Li, Tong Ye, Abolfazl Razi
Untrained Physics-based Deep Learning (DL) methods for digital holography have gained significant attention due to their benefits, such as not requiring an annotated training dataset and providing interpretability, since they utilize the governing laws of hologram formation.
no code implementations • 19 Nov 2023 • Haiyu Wu, Sicong Tian, Huayu Li, Kevin W. Bowyer
We explore the potential reasons for this oversight and introduce two pressing challenges to the field: 1) How can we ensure that a model, when trained with data checked for logical consistency, yields predictions that are logically consistent?
no code implementations • 18 Nov 2023 • Huayu Li, Gregory Ditzler
In this work, we demonstrate that continual learning systems can be manipulated by malicious misinformation and present a new category of data poisoning attacks specific to continual learners, which we refer to as Poisoning Attacks Against Continual Learners (PACOL).
no code implementations • 16 Nov 2023 • Wei zhang, Dai Li, Chen Liang, Fang Zhou, Zhongke Zhang, Xuewei Wang, Ru Li, Yi Zhou, Yaning Huang, Dong Liang, Kai Wang, Zhangyuan Wang, Zhengxing Chen, Fenggang Wu, Minghai Chen, Huayu Li, Yunnan Wu, Zhan Shu, Mindi Yuan, Sri Reddy
To address these challenges, we present Scaling User Modeling (SUM), a framework widely deployed in Meta's ads ranking system, designed to facilitate efficient and scalable sharing of online user representation across hundreds of ads models.
no code implementations • 14 Nov 2023 • Hang Yin, Kuang-Hung Liu, Mengying Sun, Yuxin Chen, Buyun Zhang, Jiang Liu, Vivek Sehgal, Rudresh Rajnikant Panchal, Eugen Hotaj, Xi Liu, Daifeng Guo, Jamey Zhang, Zhou Wang, Shali Jiang, Huayu Li, Zhengxing Chen, Wen-Yen Chen, Jiyan Yang, Wei Wen
The large scale of the models and the tight production schedule require AutoML to outperform human baselines using only a small number of model evaluation trials (around 100).
1 code implementation • 19 Oct 2023 • Huayu Li, Ana S. Carreon-Rascon, Xiwen Chen, Geng Yuan, Ao Li
MTS-LOF leverages the strengths of contrastive learning and Masked Autoencoder (MAE) methods, offering a unique approach to representation learning for medical time series data.
no code implementations • 4 Jun 2023 • Xiwen Chen, Huayu Li, Rahul Amin, Abolfazl Razi
A determinant-preserved sparse representation of the selected samples is used to perform sample precoding at local sources before processing by the DPP.
no code implementations • 19 Apr 2023 • Huayu Li, Xiwen Chen, Gregory Ditzler, Janet Roveda, Ao Li
Within this context, softmax regression representation learning serves as a widely adopted approach, leveraging a pre-trained teacher network to guide the learning of a compact student network.
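As a rough illustration of this teacher-guided setup (a hedged sketch, not the paper's implementation): the student's features are passed through the teacher's frozen softmax classifier, and a KL term pulls the resulting posteriors toward the teacher's. All module names and sizes below are assumed for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher_backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 64))
teacher_head = nn.Linear(64, 10)                                # teacher's softmax classifier (frozen)
student_backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU())  # smaller student network

for p in list(teacher_backbone.parameters()) + list(teacher_head.parameters()):
    p.requires_grad_(False)                                     # teacher stays fixed

x = torch.randn(8, 32)                                          # a batch of inputs
with torch.no_grad():
    t_logits = teacher_head(teacher_backbone(x))                # teacher posteriors
s_logits = teacher_head(student_backbone(x))                    # student features through teacher's head

# KL between teacher and student posteriors guides the student's representation.
loss = F.kl_div(F.log_softmax(s_logits, dim=-1),
                F.softmax(t_logits, dim=-1), reduction="batchmean")
loss.backward()
print(float(loss))
```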
no code implementations • 9 Apr 2023 • Xiwen Chen, Huayu Li, Rahul Amin, Abolfazl Razi
However, the number of selected samples is restricted to the rank of the kernel matrix implied by the dimensionality of data samples.
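The rank restriction can be checked numerically with a small sketch (assumed setup, not the paper's code): for an L-ensemble DPP with kernel L = XXᵀ built from d-dimensional features, any subset larger than rank(L) = d has a singular principal minor and therefore zero probability.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 5                        # 50 candidate samples with 5-dimensional features (assumed)
X = rng.standard_normal((n, d))
L = X @ X.T                         # similarity kernel; rank(L) <= d

eigvals = np.linalg.eigvalsh(L)
rank = int(np.sum(eigvals > 1e-8))
print(rank)                         # 5: a DPP over L assigns zero probability to any subset
                                    # with more than `rank` items, so at most d samples
                                    # can ever be selected regardless of n.
```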
1 code implementation • 16 Jan 2023 • Ping Chang, Huayu Li, Stuart F. Quan, Shuyang Lu, Shu-Fen Wung, Janet Roveda, Ao Li
The TDSTF model achieved state-of-the-art performance in predicting vital signs in the ICU, outperforming other models in predicting distributions of vital signs while being more computationally efficient.
no code implementations • 10 Oct 2022 • Wentao Ding, Hao Chen, Huayu Li, Yuzhong Qu
Answering factual questions with temporal intent over knowledge graphs (temporal KGQA) has attracted rising attention in recent years.
Ranked #1 on Question Answering on TempQuestions
1 code implementation • 31 Jul 2022 • Huayu Li, Gregory Ditzler, Janet Roveda, Ao Li
Significance: This study is one of the first to extend the conditional diffusion-based generative model for ECG noise removal, and the DeScoD-ECG has the potential to be widely used in biomedical applications.
Ranked #2 on ECG Denoising on QT-NSTDB
no code implementations • 11 Mar 2022 • Buyun Zhang, Liang Luo, Xi Liu, Jay Li, Zeliang Chen, Weilin Zhang, Xiaohan Wei, Yuchen Hao, Michael Tsang, Wenjun Wang, Yang Liu, Huayu Li, Yasmine Badr, Jongsoo Park, Jiyan Yang, Dheevatsa Mudigere, Ellie Wen
To overcome the challenge brought by DHEN's deeper and multi-layer structure in training, we propose a novel co-designed training system that can further improve the training efficiency of DHEN.
no code implementations • 7 Mar 2022 • Abolfazl Razi, Xiwen Chen, Huayu Li, Hao Wang, Brendan Russo, Yan Chen, Hongbin Yu
This paper explores Deep Learning (DL) methods that are used or have the potential to be used for traffic video analysis, emphasizing driving safety for both Autonomous Vehicles (AVs) and human-operated vehicles.
no code implementations • 12 Apr 2021 • Dheevatsa Mudigere, Yuchen Hao, Jianyu Huang, Zhihao Jia, Andrew Tulloch, Srinivas Sridharan, Xing Liu, Mustafa Ozdal, Jade Nie, Jongsoo Park, Liang Luo, Jie Amy Yang, Leon Gao, Dmytro Ivchenko, Aarti Basant, Yuxi Hu, Jiyan Yang, Ehsan K. Ardestani, Xiaodong Wang, Rakesh Komuravelli, Ching-Hsiang Chu, Serhat Yilmaz, Huayu Li, Jiyuan Qian, Zhuobo Feng, Yinbin Ma, Junjie Yang, Ellie Wen, Hong Li, Lin Yang, Chonglin Sun, Whitney Zhao, Dimitry Melts, Krishna Dhulipala, KR Kishore, Tyler Graf, Assaf Eisenman, Kiran Kumar Matam, Adi Gangidi, Guoqiang Jerry Chen, Manoj Krishnan, Avinash Nayak, Krishnakumar Nair, Bharath Muthiah, Mahmoud khorashadi, Pallab Bhattacharya, Petr Lapukhov, Maxim Naumov, Ajit Mathews, Lin Qiao, Mikhail Smelyanskiy, Bill Jia, Vijay Rao
Deep learning recommendation models (DLRMs) are used across many business-critical services at Facebook and are the single largest AI application in terms of infrastructure demand in its data-centers.
no code implementations • 23 Dec 2020 • Huayu Li, Haiyu Wu, Xiwen Chen, Hanning Zhang, Abolfazl Razi
We equip a network designed for real image denoising with the SPA blocks.
1 code implementation • 25 Apr 2020 • Huayu Li, Xiwen Chen, Haiyu Wu, Zaoyi Chi, Christopher Mann, Abolfazl Razi
Recently, end-to-end deep learning-based methods have been utilized to reconstruct the object wavefront (as a surrogate for the 3D structure of the object) directly from a single-shot in-line digital hologram.
no code implementations • 6 Jan 2019 • Huayu Li
In this paper, we propose a variant of the SE block based on channel locality.
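For context, below is a minimal sketch of the standard Squeeze-and-Excitation (SE) block that this work builds on; the proposed channel-locality variant changes how the channel descriptors interact, and the layer sizes here are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Standard SE block: squeeze spatial context per channel, then gate channels."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)            # squeeze: global average per channel
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),                              # excitation: per-channel gates in [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                                   # reweight channels

x = torch.randn(2, 64, 32, 32)
print(SEBlock(64)(x).shape)                            # torch.Size([2, 64, 32, 32])
```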
no code implementations • 27 Aug 2018 • Huayu Li
Artificial Neural Networks (ANNs) are computational models inspired by the central nervous system (especially the brain) of animals and are used to estimate or approximate unknown functions that depend on large amounts of inputs.
no code implementations • 22 Dec 2016 • Huayu Li, Martin Renqiang Min, Yong Ge, Asim Kadav
Employing these attention mechanisms, our model accurately understands when it can output an answer and when it needs to generate a supplementary question to request additional input, depending on the context.
1 code implementation • 1 Aug 2016 • Huayu Li, Yong Ge, Richang Hong, Hengshu Zhu
The emergence of Location-based Social Network (LBSN) services provides a wonderful opportunity to build personalized Point-of-Interest (POI) recommender systems.