no code implementations • ACL (ECNLP) 2021 • Ying Lin, Han Wang, Jiangning Chen, Tong Wang, Yue Liu, Heng Ji, Yang Liu, Premkumar Natarajan
We first build a cross-source heterogeneous knowledge graph from customer purchase history and product knowledge graph to jointly learn customer and product embeddings.
no code implementations • 10 Apr 2025 • Tianyi Wu, Zhiwei Xue, Yue Liu, Jiaheng Zhang, Bryan Hooi, See-Kiong Ng
Despite achieving promising attack success rates under dictionary-based evaluation, existing jailbreak attack methods fail to output detailed content that satisfies the harmful request, leading to poor performance under GPT-based evaluation.
no code implementations • 1 Apr 2025 • Songran Bai, Yuheng Ji, Yue Liu, Xingwei Zhang, Xiaolong Zheng, Daniel Dajun Zeng
To address these issues, we propose MinGRE, a framework for Minority Class Gradients and Representations Enhancement.
no code implementations • 30 Mar 2025 • Shihao Cheng, Jinlu Zhang, Yue Liu, Zhigang Tu
However, the existing approaches overlook the full utilization of brightness information throughout the training phase, leading to suboptimal performance.
1 code implementation • 29 Mar 2025 • Yue Liu, Jiaying Wu, Yufei He, Hongcheng Gao, Hongyu Chen, Baolong Bi, Jiaheng Zhang, Zhiqi Huang, Bryan Hooi
Large Reasoning Models (LRMs) significantly improve the reasoning ability of Large Language Models (LLMs) by learning to reason, exhibiting promising performance in complex task-solving.
1 code implementation • 25 Mar 2025 • Hongcheng Gao, Jiashu Qu, Jingyi Tang, Baolong Bi, Yue Liu, Hongyu Chen, Li Liang, Li Su, Qingming Huang
Hallucination in large multimodal models (LMMs), i.e., providing responses that appear correct but are actually incorrect, limits their reliability and applicability.
no code implementations • 2 Feb 2025 • Yufei He, Yuan Sui, Xiaoxin He, Yue Liu, Yifei Sun, Bryan Hooi
Multimodal graphs (MMGs) represent such graphs where each node is associated with features from different modalities, while the edges capture the relationships between these entities.
1 code implementation • 30 Jan 2025 • Yue Liu, Hongcheng Gao, Shengfang Zhai, Jun Xia, Tianyi Wu, Zhiwei Xue, Yulin Chen, Kenji Kawaguchi, Jiaheng Zhang, Bryan Hooi
Then, we introduce reasoning SFT to unlock the reasoning capability of guard models.
no code implementations • 15 Jan 2025 • Qianniu Chen, Xiaoyang Hao, Bowen Li, Yue Liu, Li Lu
Furthermore, we present a two-stage self-distillation framework that constructs parallel data pairs for effectively disentangling linguistic content and speakers from the perspective of training data.
no code implementations • 19 Dec 2024 • Yide Yu, Yue Liu, Xiaochen Yuan, Dennis Wong, Huijie Li, Yan Ma
In this problem, states and observations are in a many-to-one relationship.
no code implementations • 11 Dec 2024 • Mu Zhang, Yunfan Liu, Yue Liu, Hongtian Yu, Qixiang Ye
Accurately depicting real-world landscapes in remote sensing (RS) images requires precise alignment between objects and their environment.
1 code implementation • 11 Dec 2024 • Jiayuan Ma, Hongbin Na, Zimu Wang, Yining Hua, Yue Liu, Wei Wang, Ling Chen
Mental manipulation severely undermines mental wellness by covertly and negatively distorting decision-making.
no code implementations • 5 Dec 2024 • Yuhao Wang, Junwei Pan, Xiangyu Zhao, Pengyue Jia, Wanyu Wang, YuAn Wang, Yue Liu, Dapeng Liu, Jie Jiang
Sequential recommendation (SR) aims to model the sequential dependencies in users' historical interactions to better capture their evolving interests.
no code implementations • 1 Dec 2024 • Yue Liu, Chakkrit Tantithamthavorn, Li Li
Recent years have witnessed the emerging trend of extensions in modern Integrated Development Environments (IDEs) like Visual Studio Code (VSCode) that significantly enhance developer productivity.
no code implementations • 25 Nov 2024 • Wang Bill Zhu, Deqing Fu, Kai Sun, Yi Lu, Zhaojiang Lin, Seungwhan Moon, Kanika Narang, Mustafa Canim, Yue Liu, Anuj Kumar, Xin Luna Dong
We hypothesize that a user's visual history, with images reflecting their daily life, offers valuable insights into their interests and preferences and can be leveraged for personalization.
no code implementations • 21 Nov 2024 • Fengxin Li, Yi Li, Yue Liu, Chao Zhou, YuAn Wang, Xiaoxiang Deng, Wei Xue, Dapeng Liu, Lei Xiao, Haijie Gu, Jie Jiang, Hongyan Liu, Biao Qin, Jun He
Display advertising provides significant value to advertisers, publishers, and users.
1 code implementation • 31 Oct 2024 • Yue Liu, Shihao Zhu, Tianyuan Yang, Jian Ma, Wenliang Zhong
Subsequently, at the self-supervised learning stage, a pull-and-repulsion pretext task is proposed to optimize the user-group distribution.
no code implementations • 29 Oct 2024 • Xin Zhang, Zhen Xu, Yue Liu, Mengfang Sun, Tong Zhou, Wenying Sun
In the current context of accelerated globalization and digitalization, the complexity and uncertainty of financial markets are increasing, and the identification and prevention of economic risks have become key to maintaining the stability of the financial system.
no code implementations • 23 Oct 2024 • Jianjun Wei, Yue Liu, Xin Huang, Xin Zhang, Wenyi Liu, Xu Yan
This paper explores the applications and challenges of graph neural networks (GNNs) in processing complex graph data brought about by the rapid development of the Internet.
1 code implementation • 19 Oct 2024 • Sizhe Liu, Jun Xia, Lecheng Zhang, Yuchen Liu, Yue Liu, Wenjie Du, Zhangyang Gao, Bozhen Hu, Cheng Tan, Hongxin Xiang, Stan Z. Li
Molecular relational learning (MRL) is crucial for understanding the interaction behaviors between molecular pairs, a critical aspect of drug discovery and development.
1 code implementation • 8 Oct 2024 • Xiaoxia Xu, Xidong Mu, Yuanwei Liu, Hong Xing, Yue Liu, Arumugam Nallanathan
First, a DM-driven communication architecture is proposed, which introduces two key paradigms, i.e., conditional DM and DM-driven deep reinforcement learning (DRL), for wireless data generation and communication management, respectively.
2 code implementations • 2 Oct 2024 • Yue Liu, Xiaoxin He, Miao Xiong, Jinlan Fu, Shumin Deng, Bryan Hooi
Second, we verify the strong ability of LLMs to perform the text-flipping task, and then develop 4 variants to guide LLMs to denoise, understand, and execute harmful behaviors accurately.
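A minimal sketch of the kind of text-flipping transformation the excerpt describes, using a benign prompt; the two variants below (character-level and word-order flipping) and their function names are illustrative, not the paper's exact set of variants:

```python
def flip_characters(text: str) -> str:
    """Reverse the characters of the whole prompt."""
    return text[::-1]

def flip_word_order(text: str) -> str:
    """Reverse the order of words while keeping each word intact."""
    return " ".join(reversed(text.split()))

prompt = "describe the weather in Paris"
print(flip_characters(prompt))   # 'siraP ni rehtaew eht ebircsed'
print(flip_word_order(prompt))   # 'Paris in weather the describe'
```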
no code implementations • 30 Aug 2024 • Harsha Perera, Sung Une Lee, Yue Liu, Boming Xia, Qinghua Lu, Liming Zhu, Jessica Cairns, Moana Nottage
As Artificial Intelligence (AI) becomes integral to business operations, integrating Responsible AI (RAI) within Environmental, Social, and Governance (ESG) frameworks is essential for ethical and sustainable AI deployment.
no code implementations • 16 Aug 2024 • Yue Liu, Dawen Zhang, Boming Xia, Julia Anticev, Tunde Adebayo, Zhenchang Xing, Moses Machao
In the era of advanced artificial intelligence, highlighted by large-scale generative models like GPT-4, ensuring the traceability, verifiability, and reproducibility of datasets throughout their lifecycle is paramount for research institutions and technology companies.
no code implementations • 15 Aug 2024 • Qingyuan Zheng, Yue Liu, Yangbo He
Then we introduce criteria for identifying causal relationships based solely on the local structure in the presence of prior knowledge.
no code implementations • 2 Aug 2024 • Sung Une Lee, Harsha Perera, Yue Liu, Boming Xia, Qinghua Lu, Liming Zhu, Jessica Cairns, Moana Nottage
The framework provides a structured approach to this integration, developed in collaboration with industry practitioners.
no code implementations • 2 Aug 2024 • Sung Une Lee, Harsha Perera, Yue Liu, Boming Xia, Qinghua Lu, Liming Zhu, Olivier Salvado, Jon Whittle
The rapid growth of Artificial Intelligence (AI) has underscored the urgent need for responsible AI practices.
no code implementations • 16 Jul 2024 • Dongnan Jin, Yali Liu, Qiuzhi Song, Xunju Ma, Yue Liu, Dehao Wu
To effectively search for the optimal motion template in a dynamic multidimensional space, this paper proposes a novel optimization algorithm, Dynamic Dimension Wrapping (DDW). The algorithm combines Dynamic Time Warping (DTW) with Euclidean distance and designs a fitness function adapted to dynamic multidimensional space by establishing a time-data chain mapping across dimensions.
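As a rough illustration of the DTW-plus-Euclidean building block mentioned above, here is a minimal sketch of dynamic time warping with a Euclidean ground distance over multidimensional frames; the array shapes and the sensor-channel example are assumptions, not details from the paper:

```python
import numpy as np

def dtw_distance(template: np.ndarray, sequence: np.ndarray) -> float:
    """Classic DTW with Euclidean ground distance between multidimensional frames.

    template: (T1, D) array, sequence: (T2, D) array.
    """
    t1, t2 = len(template), len(sequence)
    cost = np.full((t1 + 1, t2 + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, t1 + 1):
        for j in range(1, t2 + 1):
            d = np.linalg.norm(template[i - 1] - sequence[j - 1])  # Euclidean frame distance
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[t1, t2]

# Hypothetical fitness use: a smaller warped distance means a better template match.
rng = np.random.default_rng(0)
template = rng.normal(size=(50, 6))   # e.g., 6 sensor channels
sequence = rng.normal(size=(80, 6))
print(dtw_distance(template, sequence))
```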
no code implementations • 16 Jun 2024 • Jingbo Zhou, Shaorong Chen, Jun Xia, Sizhe Liu, Tianze Ling, Wenjie Du, Yue Liu, Jianwei Yin, Stan Z. Li
In this work, we present the first unified benchmark NovoBench for \emph{de novo} peptide sequencing, which comprises diverse mass spectrum data, integrated models, and comprehensive evaluation metrics.
1 code implementation • 7 Jun 2024 • Xiao Yang, Kai Sun, Hao Xin, Yushi Sun, Nikita Bhalla, Xiangsen Chen, Sajal Choudhary, Rongze Daniel Gui, Ziran Will Jiang, Ziyu Jiang, Lingkun Kong, Brian Moran, Jiaqi Wang, Yifan Ethan Xu, An Yan, Chenyu Yang, Eting Yuan, Hanwen Zha, Nan Tang, Lei Chen, Nicolas Scheffer, Yue Liu, Nirav Shah, Rakesh Wanga, Anuj Kumar, Wen-tau Yih, Xin Luna Dong
To bridge this gap, we introduce the Comprehensive RAG Benchmark (CRAG), a factual question answering benchmark of 4,409 question-answer pairs and mock APIs to simulate web and Knowledge Graph (KG) search.
1 code implementation • 29 May 2024 • Zhangkai Ni, Yue Liu, Keyan Ding, Wenhan Yang, Hanli Wang, Shiqi Wang
To bridge these gaps, we propose integrating deep features from pre-trained visual models with a statistical analysis model into a Multi-scale Deep Feature Statistics (MDFS) model for achieving opinion-unaware BIQA (OU-BIQA), thereby eliminating the reliance on human rating data and significantly improving training efficiency.
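A hedged sketch of the general opinion-unaware recipe the excerpt points at: fit a multivariate Gaussian to deep-feature statistics of pristine images and score a test image by its distance to that model, so no human ratings are needed. The feature dimensions and function names below are illustrative, not the MDFS implementation:

```python
import numpy as np

def fit_pristine_statistics(features: np.ndarray):
    """Fit a multivariate Gaussian to deep features of pristine images, shape (N, D)."""
    mu = features.mean(axis=0)
    cov = np.cov(features, rowvar=False)
    return mu, cov

def quality_score(test_feature: np.ndarray, mu: np.ndarray, cov: np.ndarray) -> float:
    """Mahalanobis-style distance to the pristine statistics; larger means worse quality."""
    diff = test_feature - mu
    return float(np.sqrt(diff @ np.linalg.pinv(cov) @ diff))

rng = np.random.default_rng(0)
pristine = rng.normal(size=(500, 64))          # stand-in for pooled deep features
mu, cov = fit_pristine_statistics(pristine)
print(quality_score(rng.normal(size=64), mu, cov))
```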
1 code implementation • 26 May 2024 • Zhaozhi Wang, Yue Liu, Yunfan Liu, Hongtian Yu, YaoWei Wang, Qixiang Ye, Yunjie Tian
A fundamental problem in learning robust and expressive visual representations lies in efficiently estimating the spatial relationships of visual semantics throughout the entire image.
1 code implementation • 25 May 2024 • Yuzhong Zhao, Feng Liu, Yue Liu, Mingxiang Liao, Chen Gong, Qixiang Ye, Fang Wan
Unfortunately, most existing methods use fixed visual inputs and lack the resolution adaptability needed to find precise language descriptions.
no code implementations • 16 May 2024 • Yue Liu, Sin Kit Lo, Qinghua Lu, Liming Zhu, Dehai Zhao, Xiwei Xu, Stefan Harrer, Jon Whittle
Foundation model-enabled generative artificial intelligence facilitates the development and implementation of agents, which can leverage distinguished reasoning and language processing capabilities to take a proactive, autonomous role in pursuing users' goals.
no code implementations • 20 Apr 2024 • Yuheng Ji, Yue Liu, Zhicheng Zhang, Zhao Zhang, YuTing Zhao, Gang Zhou, Xingwei Zhang, Xinwang Liu, Xiaolong Zheng
Different from LoRA, we improve the efficiency and robustness of adversarial adaptation by designing a novel reparameterizing method based on parameter clustering and parameter alignment.
1 code implementation • 17 Apr 2024 • Xin Li, Kun Yuan, Yajing Pei, Yiting Lu, Ming Sun, Chao Zhou, Zhibo Chen, Radu Timofte, Wei Sun, HaoNing Wu, ZiCheng Zhang, Jun Jia, Zhichao Zhang, Linhan Cao, Qiubo Chen, Xiongkuo Min, Weisi Lin, Guangtao Zhai, Jianhui Sun, Tianyi Wang, Lei LI, Han Kong, Wenxuan Wang, Bing Li, Cheng Luo, Haiqiang Wang, Xiangguang Chen, Wenhui Meng, Xiang Pan, Huiying Shi, Han Zhu, Xiaozhong Xu, Lei Sun, Zhenzhong Chen, Shan Liu, Fangyuan Kong, Haotian Fan, Yifang Xu, Haoran Xu, Mengduo Yang, Jie zhou, Jiaze Li, Shijie Wen, Mai Xu, Da Li, Shunyu Yao, Jiazhi Du, WangMeng Zuo, Zhibo Li, Shuai He, Anlong Ming, Huiyuan Fu, Huadong Ma, Yong Wu, Fie Xue, Guozhi Zhao, Lina Du, Jie Guo, Yu Zhang, huimin zheng, JunHao Chen, Yue Liu, Dulan Zhou, Kele Xu, Qisheng Xu, Tao Sun, Zhixiang Ding, Yuhang Hu
This paper reviews the NTIRE 2024 Challenge on Short-form UGC Video Quality Assessment (S-UGC VQA), where various excellent solutions were submitted and evaluated on the KVQ dataset collected from a popular short-form video platform, i.e., the Kuaishou/Kwai platform.
no code implementations • 31 Jan 2024 • Tim Tse, Zhitang Chen, Shengyu Zhu, Yue Liu
Capturing these discrepancies between cause and effect remains a challenge, and many current state-of-the-art algorithms propose comparing the norms of the kernel mean embeddings of the conditional distributions.
1 code implementation • 31 Jan 2024 • Yuzhong Zhao, Yue Liu, Zonghao Guo, Weijia Wu, Chen Gong, Fang Wan, Qixiang Ye
The multimodal model is constrained to generate captions within a few sub-spaces containing the control words, which increases the opportunity of hitting less frequent captions, alleviating the caption degeneration issue.
Ranked #1 on Dense Captioning on Visual Genome
no code implementations • 21 Jan 2024 • Man Luo, Xin Xu, Yue Liu, Panupong Pasupat, Mehran Kazemi
Language models, especially pre-trained large language models, have showcased remarkable abilities as few-shot in-context learners (ICL), adept at adapting to new tasks with just a few demonstrations in the input context.
12 code implementations • 18 Jan 2024 • Yue Liu, Yunjie Tian, Yuzhong Zhao, Hongtian Yu, Lingxi Xie, YaoWei Wang, Qixiang Ye, Jianbin Jiao, Yunfan Liu
At the core of VMamba is a stack of Visual State-Space (VSS) blocks with the 2D Selective Scan (SS2D) module.
2 code implementations • 11 Jan 2024 • Yue Liu, Shihao Zhu, Jun Xia, Yingwei Ma, Jian Ma, Xinwang Liu, Shengju Yu, Kejun Zhang, Wenliang Zhong
Concretely, we encode user behavior sequences and initialize the cluster centers (latent intents) as learnable neurons.
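A small PyTorch sketch of the idea of treating latent intents as learnable cluster-center neurons with soft assignment of encoded behavior sequences; the module name, intent count, and dimensionality are illustrative rather than the paper's exact design:

```python
import torch
import torch.nn as nn

class LearnableIntents(nn.Module):
    """Latent intents as learnable cluster-center neurons (illustrative sketch)."""
    def __init__(self, num_intents: int = 8, dim: int = 64):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_intents, dim))

    def forward(self, seq_emb: torch.Tensor) -> torch.Tensor:
        # seq_emb: (batch, dim) encoded user behavior sequences.
        dist = torch.cdist(seq_emb, self.centers)   # (batch, num_intents)
        return torch.softmax(-dist, dim=-1)          # soft cluster assignment

model = LearnableIntents()
assign = model(torch.randn(4, 64))
print(assign.shape, assign.sum(dim=-1))  # each row sums to 1
```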
no code implementations • 4 Dec 2023 • Xiaobo Hu, Youfang Lin, Yue Liu, Jinwen Wang, Shuo Wang, Hehe Fan, Kai Lv
Visual reinforcement learning has proven effective in solving control tasks with high-dimensional observations.
no code implementations • 30 Nov 2023 • Dawen Zhang, Boming Xia, Yue Liu, Xiwei Xu, Thong Hoang, Zhenchang Xing, Mark Staples, Qinghua Lu, Liming Zhu
The advent of Generative AI has marked a significant milestone in artificial intelligence, demonstrating remarkable capabilities in generating realistic images, texts, and data patterns.
no code implementations • 15 Nov 2023 • Yue Liu, Shanlin Xiao, Bo Li, Zhiyi Yu
As the third-generation neural network, the Spiking Neural Network (SNN) has the advantages of low power consumption and high energy efficiency, making it suitable for implementation on edge devices.
1 code implementation • 27 Oct 2023 • Xinyu She, Yue Liu, Yanjie Zhao, Yiling He, Li Li, Chakkrit Tantithamthavorn, Zhan Qin, Haoyu Wang
After carefully examining these studies, we designed a taxonomy of pitfalls in LM4Code research and conducted a systematic study to summarize the issues, implications, current solutions, and challenges of different pitfalls for LM4Code systems.
1 code implementation • 28 Sep 2023 • Yingwei Ma, Yue Liu, Yue Yu, Yuanliang Zhang, Yu Jiang, Changjian Wang, Shanshan Li
Inspired by the great success of code data in training LLMs, we naturally wonder at which training stage introducing code data can really help LLMs reasoning.
no code implementations • 28 Sep 2023 • Yuhang Zhang, Yue Liu, Zhihua Zhang
Motivated by the synthetic control method, we construct a synthetic treatment group for the target population by a weighted mixture of treatment groups of source populations.
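A minimal sketch of the synthetic-control step described above: find non-negative weights summing to one so that a mixture of source treatment groups matches the target population's covariate profile. The optimizer choice (SciPy SLSQP) and toy data are assumptions for illustration:

```python
import numpy as np
from scipy.optimize import minimize

def synthetic_control_weights(source_means: np.ndarray, target_mean: np.ndarray) -> np.ndarray:
    """Non-negative weights summing to 1 so the weighted mixture of source
    treatment-group covariate means matches the target population's mean."""
    k = source_means.shape[0]

    def objective(w):
        return np.sum((w @ source_means - target_mean) ** 2)

    cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * k
    res = minimize(objective, np.full(k, 1.0 / k), bounds=bounds, constraints=cons)
    return res.x

rng = np.random.default_rng(0)
sources = rng.normal(size=(5, 10))      # 5 source populations, 10 covariates each
target = sources.T @ np.array([0.4, 0.3, 0.2, 0.1, 0.0])
print(np.round(synthetic_control_weights(sources, target), 3))
```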
1 code implementation • 27 Sep 2023 • Seungwhan Moon, Andrea Madotto, Zhaojiang Lin, Tushar Nagarajan, Matt Smith, Shashank Jain, Chun-Fu Yeh, Prakash Murugesan, Peyman Heidari, Yue Liu, Kavya Srinet, Babak Damavandi, Anuj Kumar
We present Any-Modality Augmented Language Model (AnyMAL), a unified model that reasons over diverse input modality signals (i.e., text, image, video, audio, IMU motion sensor), and generates textual responses.
Ranked #9 on Video Question Answering on STAR Benchmark
1 code implementation • 21 Sep 2023 • Meng Liu, Ke Liang, Dayu Hu, Hao Yu, Yue Liu, Lingyuan Meng, Wenxuan Tu, Sihang Zhou, Xinwang Liu
We observe that these audiovisual data naturally have temporal attributes, such as the time information for each frame in the video.
1 code implementation • 4 Sep 2023 • Yue Liu, Kevin Suh, Philip K. Maini, Daniel J. Cohen, Ruth E. Baker
When employing mechanistic models to study biological phenomena, practical parameter identifiability is important for making accurate predictions across a wide range of unseen scenarios, as well as for understanding the underlying mechanisms.
1 code implementation • 21 Aug 2023 • Xinyi Hou, Yanjie Zhao, Yue Liu, Zhou Yang, Kailong Wang, Li Li, Xiapu Luo, David Lo, John Grundy, Haoyu Wang
Nevertheless, a comprehensive understanding of the application, effects, and possible limitations of LLMs on SE is still in its early stages.
1 code implementation • 20 Aug 2023 • Kai Sun, Yifan Ethan Xu, Hanwen Zha, Yue Liu, Xin Luna Dong
Since the recent prosperity of Large Language Models (LLMs), there have been interleaved discussions regarding how to reduce hallucinations from LLM responses, how to increase the factuality of LLMs, and whether Knowledge Graphs (KGs), which store the world knowledge in a symbolic form, will be replaced with LLMs.
1 code implementation • 17 Aug 2023 • Xihong Yang, Jiaqi Jin, Siwei Wang, Ke Liang, Yue Liu, Yi Wen, Suyuan Liu, Sihang Zhou, Xinwang Liu, En Zhu
Then, a global contrastive calibration loss is proposed by aligning the view feature similarity graph and the high-confidence pseudo-label graph.
2 code implementations • 17 Aug 2023 • Xihong Yang, Cheng Tan, Yue Liu, Ke Liang, Siwei Wang, Sihang Zhou, Jun Xia, Stan Z. Li, Xinwang Liu, En Zhu
To address these problems, we propose a novel CONtrastiVe Graph ClustEring network with Reliable AugmenTation (CONVERT).
1 code implementation • 15 Aug 2023 • Jifeng Shen, Yifei Chen, Yue Liu, Xin Zuo, Heng Fan, Wankou Yang
Effective feature fusion of multispectral images plays a crucial role in multi-spectral object detection.
Ranked #2 on Object Detection on VEDAI
2 code implementations • 13 Aug 2023 • Yue Liu, Ke Liang, Jun Xia, Xihong Yang, Sihang Zhou, Meng Liu, Xinwang Liu, Stan Z. Li
To enable the deep graph clustering algorithms to work without the guidance of the predefined cluster number, we propose a new deep graph clustering method termed Reinforcement Graph Clustering (RGC).
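As a stand-in for the reinforcement mechanism that selects the cluster number, the sketch below runs an epsilon-greedy bandit over candidate K values with silhouette score as the reward; this is an illustrative simplification, not RGC's actual algorithm:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

def select_cluster_number(embeddings, candidates=range(2, 11), episodes=30, eps=0.2, seed=0):
    """Epsilon-greedy bandit over candidate K, rewarded by silhouette score."""
    rng = np.random.default_rng(seed)
    candidates = list(candidates)
    values, counts = np.zeros(len(candidates)), np.zeros(len(candidates))
    for _ in range(episodes):
        # Explore a random K with probability eps, otherwise exploit the best so far.
        idx = rng.integers(len(candidates)) if rng.random() < eps else int(values.argmax())
        labels = KMeans(n_clusters=candidates[idx], n_init=5, random_state=seed).fit_predict(embeddings)
        reward = silhouette_score(embeddings, labels)
        counts[idx] += 1
        values[idx] += (reward - values[idx]) / counts[idx]   # incremental mean update
    return candidates[int(values.argmax())]

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)
print(select_cluster_number(X))   # recovers K = 4 on this toy data
```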
no code implementations • 11 Aug 2023 • Yue Liu, Qinghua Lu, Liming Zhu, Hye-Young Paik
Foundation models including large language models (LLMs) are increasingly attracting interest worldwide for their distinguished capabilities and potential to perform a wide variety of tasks.
no code implementations • 6 Jul 2023 • Ke Liang, Sihang Zhou, Yue Liu, Lingyuan Meng, Meng Liu, Xinwang Liu
To this end, we propose the graph Structure Guided Multimodal Pretrained Transformer for knowledge graph reasoning, termed SGMPT.
no code implementations • 8 Jun 2023 • Meng Liu, Ke Liang, Yue Liu, Siwei Wang, Sihang Zhou, Xinwang Liu
It makes evaluating models for large-scale temporal graph clustering challenging.
no code implementations • 31 May 2023 • Zhisheng Wang, Yue Liu, Shunli Wang, Xingyuan Bian, Zongfeng Li, Junning Cui
This paper investigates high-quality analytical reconstructions of multiple source-translation computed tomography (mSTCT) under an extended field of view (FOV).
3 code implementations • 28 May 2023 • Yue Liu, Ke Liang, Jun Xia, Sihang Zhou, Xihong Yang, Xinwang Liu, Stan Z. Li
Subsequently, the clustering distribution is optimized by minimizing the proposed cluster dilation loss and cluster shrink loss in an adversarial manner.
no code implementations • 25 May 2023 • Sin Kit Lo, Yue Liu, Guangsheng Yu, Qinghua Lu, Xiwei Xu, Liming Zhu
Distributed trust is a nebulous concept that has evolved from different perspectives in recent years.
no code implementations • 23 May 2023 • Ke Liang, Lingyuan Meng, Sihang Zhou, Siwei Wang, Wenxuan Tu, Yue Liu, Meng Liu, Xinwang Liu
However, the uni-directional message-passing mechanism hinders such models from exploiting hidden mutual relations between entities in directed graphs.
2 code implementations • 18 May 2023 • Meng Liu, Yue Liu, Ke Liang, Wenxuan Tu, Siwei Wang, Sihang Zhou, Xinwang Liu
To solve the problem, we propose a general framework for deep Temporal Graph Clustering called TGC, which introduces deep clustering techniques to suit the interaction sequence-based batch-processing pattern of temporal graphs.
Ranked #1 on Node Clustering on arXivCS
no code implementations • 9 May 2023 • Qinghua Lu, Liming Zhu, Xiwei Xu, Yue Liu, Zhenchang Xing, Jon Whittle
The recent release of large language model (LLM) based chatbots, such as ChatGPT, has attracted huge interest in foundation models.
1 code implementation • 21 Apr 2023 • Cheng Tan, Zhangyang Gao, Lirong Wu, Jun Xia, Jiangbin Zheng, Xihong Yang, Yue Liu, Bozhen Hu, Stan Z. Li
In this paper, we propose a \textit{simple yet effective} model that can co-design 1D sequences and 3D structures of CDRs in a one-shot manner.
no code implementations • 20 Apr 2023 • Lingyuan Meng, Ke Liang, Bin Xiao, Sihang Zhou, Yue Liu, Meng Liu, Xihong Yang, Xinwang Liu
Moreover, most existing methods fail to leverage the beneficial information from aliasing relations (AR), i.e., data-rich relations with contextual semantics similar to the target data-poor relation.
no code implementations • 22 Feb 2023 • Haiyi Mao, Cong Peng, Yue Liu, Jinping Tang, Hua Peng, Wei Yi
A variety of filters with track-before-detect (TBD) strategies have been developed and applied to low signal-to-noise ratio (SNR) scenarios, including the probability hypothesis density (PHD) filter.
no code implementations • 6 Feb 2023 • Zhenxing Cheng, Peng Wang, Yue Liu, Wei Qin, Zidi Tang
Power capacitors are widely used reactive power compensation equipment in power transmission and distribution systems; they are prone to internal faults, which affect the safe operation of the power system.
1 code implementation • 3 Jan 2023 • Xihong Yang, Yue Liu, Sihang Zhou, Siwei Wang, Wenxuan Tu, Qun Zheng, Xinwang Liu, Liming Fang, En Zhu
Then, guided by the high-confidence clustering information, we carefully select and construct the positive samples from the same high-confidence cluster in two views.
no code implementations • 3 Jan 2023 • Yue Liu, Tao Lin, Anastasia Koloskova, Sebastian U. Stich
Gradient tracking (GT) is an algorithm designed for solving decentralized optimization problems over a network (such as training a machine learning model).
1 code implementation • 28 Dec 2022 • Zi'an Xu, Yin Dai, Fayu Liu, Weibing Chen, Yue Liu, Lifu Shi, Sheng Liu, YuHang Zhou
The development of deep learning models in medical image analysis is majorly limited by the lack of large-sized and well-annotated datasets.
2 code implementations • 16 Dec 2022 • Yue Liu, Xihong Yang, Sihang Zhou, Xinwang Liu, Zhen Wang, Ke Liang, Wenxuan Tu, Liang Li, Jingcan Duan, Cancan Chen
Moreover, under the guidance of the carefully collected high-confidence clustering information, our proposed weight modulating function will first recognize the positive and negative samples and then dynamically up-weight the hard sample pairs while down-weighting the easy ones.
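A toy sketch of a focal-style weight modulating function in the spirit of the excerpt: pairs that are hard (low-similarity positives, high-similarity negatives) receive larger weights, easy ones smaller; the exact form and the `gamma` exponent are illustrative assumptions, not the paper's formulation:

```python
import torch

def modulated_contrastive_weights(sim: torch.Tensor, pos_mask: torch.Tensor, gamma: float = 2.0):
    """Up-weight hard sample pairs and down-weight easy ones.

    sim: (N, N) cosine similarities in [-1, 1]; pos_mask: (N, N) bool, True for positive pairs.
    Hard positives have low similarity; hard negatives have high similarity.
    """
    hardness = torch.where(pos_mask, (1 - sim) / 2, (1 + sim) / 2)  # mapped into [0, 1]
    return hardness.pow(gamma)

sim = torch.tensor([[1.0, 0.9, -0.2], [0.9, 1.0, 0.1], [-0.2, 0.1, 1.0]])
pos = torch.eye(3, dtype=torch.bool)
print(modulated_contrastive_weights(sim, pos))
```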
1 code implementation • 12 Dec 2022 • Ke Liang, Lingyuan Meng, Meng Liu, Yue Liu, Wenxuan Tu, Siwei Wang, Sihang Zhou, Xinwang Liu, Fuchun Sun
According to the graph types, existing KGR models can be roughly divided into three categories, i.e., static models, temporal models, and multi-modal models.
2 code implementations • 7 Dec 2022 • Xihong Yang, Erxue Min, Ke Liang, Yue Liu, Siwei Wang, Sihang Zhou, Huijun Wu, Xinwang Liu, En Zhu
During the training procedure, we notice the distinct optimization goals for training learnable augmentors and contrastive learning networks.
no code implementations • 1 Dec 2022 • Jingcan Duan, Siwei Wang, Pei Zhang, En Zhu, Jingtao Hu, Hu Jin, Yue Liu, Zhibin Dong
However, they neglect subgraph-subgraph comparison information, even though normal and abnormal subgraph pairs behave differently in terms of embeddings and structures in GAD, resulting in sub-optimal task performance.
2 code implementations • 23 Nov 2022 • Yue Liu, Jun Xia, Sihang Zhou, Xihong Yang, Ke Liang, Chenchen Fan, Yan Zhuang, Stan Z. Li, Xinwang Liu, Kunlun He
However, corresponding survey papers are relatively scarce, and a summary of this field is urgently needed.
no code implementations • 19 Nov 2022 • Ke Liang, Yue Liu, Sihang Zhou, Wenxuan Tu, Yi Wen, Xihong Yang, Xiangjun Dong, Xinwang Liu
To this end, we propose a knowledge graph contrastive learning framework based on relation-symmetrical structure, KGE-SymCL, which mines symmetrical structure information in KGs to enhance the discriminative ability of KGE models.
1 code implementation • 6 Sep 2022 • Yue Liu, Zhangkai Ni, Shiqi Wang, Hanli Wang, Sam Kwong
In this paper, a novel and effective image quality assessment (IQA) algorithm based on frequency disparity for high dynamic range (HDR) images is proposed, termed as local-global frequency feature-based model (LGFM).
1 code implementation • 10 Aug 2022 • Yue Liu, Christos Matsoukas, Fredrik Strand, Hossein Azizpour, Kevin Smith
This simple approach, PatchDropout, reduces FLOPs and memory by at least 50% in standard natural image datasets such as ImageNet, and those savings only increase with image size.
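A minimal sketch of the patch-dropping idea behind PatchDropout: during training, randomly keep only a fraction of the patch tokens (always retaining the [CLS] token), which shrinks the attention cost roughly quadratically. The keep ratio and function signature are illustrative assumptions:

```python
import torch

def patch_dropout(tokens: torch.Tensor, keep_ratio: float = 0.5, training: bool = True):
    """Randomly keep a subset of patch tokens; token 0 (the [CLS] token) is always kept.

    tokens: (batch, num_tokens, dim).
    """
    if not training or keep_ratio >= 1.0:
        return tokens
    b, n, d = tokens.shape
    num_keep = max(1, int((n - 1) * keep_ratio))
    scores = torch.rand(b, n - 1, device=tokens.device)
    keep_idx = scores.topk(num_keep, dim=1).indices + 1           # +1 skips the [CLS] token
    keep_idx, _ = keep_idx.sort(dim=1)
    cls_tok = tokens[:, :1]
    patches = torch.gather(tokens, 1, keep_idx.unsqueeze(-1).expand(-1, -1, d))
    return torch.cat([cls_tok, patches], dim=1)

x = torch.randn(2, 197, 768)     # ViT-B/16 on 224x224: 196 patches + [CLS]
print(patch_dropout(x).shape)    # roughly half the patch tokens remain
```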
no code implementations • 25 Jul 2022 • Huaying Hao, Cong Xu, Dan Zhang, Qifeng Yan, Jiong Zhang, Yue Liu, Yitian Zhao
To be more specific, we first perform a simple degradation of the 3×3 mm² high-resolution (HR) image to obtain the synthetic LR image.
no code implementations • 13 Jul 2022 • Junpu Zhang, Liang Li, Siwei Wang, Jiyuan Liu, Yue Liu, Xinwang Liu, En Zhu
As a representative, late fusion MKC first decomposes the kernels into orthogonal partition matrices, then learns a consensus one from them, achieving promising performance recently.
no code implementations • 10 Jul 2022 • Zhuangyan Fang, Ruiqi Zhao, Yue Liu, Yangbo He
Causal background knowledge about the existence or the absence of causal edges and paths is frequently encountered in observational studies.
no code implementations • 6 Jul 2022 • Bo Zhang, Yue Liu, Kaixin Lu, Li Niu, Liqing Zhang
Instead, we propose a novel correspondence learning network (CorrelNet) to model the correspondence between foreground and background using cross-attention maps, based on which we can predict the target coordinate that each source coordinate of foreground should be mapped to on the background.
no code implementations • 6 Jun 2022 • Xihong Yang, Yue Liu, Sihang Zhou, Xinwang Liu, En Zhu
Graph Neural Networks (GNNs) have achieved promising performance in semi-supervised node classification in recent years.
no code implementations • 11 May 2022 • Yue Liu, Xihong Yang, Sihang Zhou, Xinwang Liu
To solve this problem, we propose a Simple Contrastive Graph Clustering (SCGC) algorithm to improve the existing methods from the perspectives of network architecture, data augmentation, and objective function.
1 code implementation • 4 Apr 2022 • Shengyuan Hu, Jack Goetz, Kshitiz Malik, Hongyuan Zhan, Zhe Liu, Yue Liu
Model compression is important in federated learning (FL) with large models to reduce communication cost.
no code implementations • 25 Feb 2022 • Yue Liu, Sihang Zhou, Xinwang Liu, Wenxuan Tu, Xihong Yang
Deep graph clustering, which aims to reveal the underlying graph structure and divide the nodes into different clusters without human annotations, is a fundamental yet challenging task.
2 code implementations • 29 Dec 2021 • Yue Liu, Wenxuan Tu, Sihang Zhou, Xinwang Liu, Linxuan Song, Xihong Yang, En Zhu
To address this issue, we propose a novel self-supervised deep graph clustering method termed Dual Correlation Reduction Network (DCRN) by reducing information correlation in a dual manner.
no code implementations • 24 Dec 2021 • Hong-Li Zeng, Yue Liu, Vito Dichio, Erik Aurell
We use Direct Coupling Analysis (DCA) to determine epistatic interactions between loci of variability of the SARS-CoV-2 virus, segmenting genomes by month of sampling.
no code implementations • 9 Dec 2021 • Wenxuan Tu, Sihang Zhou, Yue Liu, Xinwang Liu
First, we entangle the attribute embedding and structure embedding by introducing a siamese network structure to share the parameters learned by both processes, which allows the network training to benefit from more abundant and diverse information.
2 code implementations • 2 Dec 2021 • Moein Sorkhei, Yue Liu, Hossein Azizpour, Edward Azavedo, Karin Dembrower, Dimitra Ntoula, Athanasios Zouzos, Fredrik Strand, Kevin Smith
Interval and large invasive breast cancers, which are associated with worse prognosis than other cancers, are usually detected at a late stage due to false negative assessments of screening mammograms.
1 code implementation • 15 Oct 2021 • Yue Liu, Philip K. Maini, Ruth E. Baker
In certain biological contexts, such as the plumage patterns of birds and stripes on certain species of fishes, pattern formation takes place behind a so-called "wave of competency".
no code implementations • 12 Oct 2021 • Pooja Sethi, Denis Savenkov, Forough Arabshahi, Jack Goetz, Micaela Tolliver, Nicolas Scheffer, Ilknur Kabul, Yue Liu, Ahmed Aly
Improving the quality of Natural Language Understanding (NLU) models, and more specifically, task-oriented semantic parsing models, in production is a cumbersome task.
no code implementations • 1 Oct 2021 • Yue Liu, Ethan X. Fang, Junwei Lu
Our proposed method aims to infer general ranking properties of the BTL model.
no code implementations • 7 Sep 2021 • Hong-Li Zeng, Yue Liu, Vito Dichio, Kaisa Thorell, Rickard Nordén, Erik Aurell
We compute the allele frequencies of the alpha (B.1.1.7), beta (B.1.351), and delta (B.1.617.2) variants of SARS-CoV-2 from almost two million genome sequences in the GISAID repository.
no code implementations • 16 Aug 2021 • Sin Kit Lo, Yue Liu, Qinghua Lu, Chen Wang, Xiwei Xu, Hye-Young Paik, Liming Zhu
To enhance the accountability and fairness of federated learning systems, we present a blockchain-based trustworthy federated learning architecture.
no code implementations • NAACL 2021 • Mingyue Shang, Tong Wang, Mihail Eric, Jiangning Chen, Jiyang Wang, Matthew Welch, Tiantong Deng, Akshay Grewal, Han Wang, Yue Liu, Yang Liu, Dilek Hakkani-Tur
In recent years, incorporating external knowledge for response generation in open-domain conversation systems has attracted great interest.
no code implementations • NAACL 2021 • Tong Wang, Jiangning Chen, Mohsen Malmir, Shuyan Dong, Xin He, Han Wang, Chengwei Su, Yue Liu, Yang Liu
In dialog systems, the Natural Language Understanding (NLU) component typically makes the interpretation decision (including domain, intent and slots) for an utterance before the mentioned entities are resolved.
1 code implementation • 17 May 2021 • Jiawei Jiang, Shaoduo Gan, Yue Liu, Fanlin Wang, Gustavo Alonso, Ana Klimovic, Ankit Singla, Wentao Wu, Ce Zhang
The appeal of serverless (FaaS) has triggered a growing interest on how to use it in data-intensive applications such as ETL, query processing, or machine learning (ML).
no code implementations • 10 Apr 2021 • Yue Liu, Lixin Tian, Zhuyun Xie, Zaili Zhen, Huaping Sun
Considering the impact of price fluctuations in carbon emission right allowances, we investigate the operation of Chinese thermal power plants by modeling the decision-making as an optimal stopping problem in a stochastic environment, where the carbon emission allowance price process is simulated by geometric Brownian motion.
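A short Monte Carlo sketch of the geometric Brownian motion price process mentioned in the excerpt; the drift, volatility, and horizon values are placeholders, not estimates from the paper:

```python
import numpy as np

def simulate_gbm(s0=50.0, mu=0.03, sigma=0.2, horizon_years=1.0, steps=252, paths=1000, seed=0):
    """Simulate GBM price paths: S_t = S_0 * exp((mu - sigma^2/2) * t + sigma * W_t)."""
    rng = np.random.default_rng(seed)
    dt = horizon_years / steps
    shocks = rng.normal(size=(paths, steps)) * np.sqrt(dt)
    log_paths = np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * shocks, axis=1)
    return s0 * np.exp(np.hstack([np.zeros((paths, 1)), log_paths]))

prices = simulate_gbm()
print(prices.shape, prices[:, -1].mean())   # terminal mean is close to s0 * exp(mu * T)
```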
no code implementations • 6 Apr 2021 • Ying Lin, Han Wang, Jiangning Chen, Tong Wang, Yue Liu, Heng Ji, Yang Liu, Premkumar Natarajan
For example, with "add milk to my cart", a customer may refer to a certain organic product, while some customers may want to re-order products they regularly purchase.
no code implementations • 24 Mar 2021 • Wei Wei, Li Guan, Yue Liu, Hao Kang, Haoxiang Li, Ying Wu, Gang Hua
By the proposed physical regularization, our method can generate HDRs which are not only visually appealing but also physically plausible.
1 code implementation • 9 Mar 2021 • Yue Liu, Chakkrit Tantithamthavorn, Li Li, Yepang Liu
In this paper, we conducted a systematic literature review to search and analyze how deep learning approaches have been applied in the context of malware defenses in the Android environment.
no code implementations • 25 Feb 2021 • Zhuangyan Fang, Yue Liu, Zhi Geng, Shengyu Zhu, Yangbo He
We propose a local approach to identify whether a variable is a cause of a given target under the framework of causal graphical models of directed acyclic graphs (DAGs).
no code implementations • 7 Feb 2021 • Yue Liu, Ihor Korolov, Torben Hemke, Lena Bischoff, Gerrit Hübner, Julian Schulze, Thomas Mussenbrock
A two-dimensional fluid model is used to investigate the electron heating dynamics and the production of neutral species in a capacitively coupled radio-frequency micro atmospheric pressure helium plasma jet -- specifically the COST jet -- with a small oxygen admixture.
Plasma Physics
no code implementations • 21 Jan 2021 • Haotian Ye, Chuanlong Xie, Yue Liu, Zhenguo Li
One of the definitions of OOD accuracy is worst-domain accuracy.
no code implementations • 4 Dec 2020 • Panpan Zhou, Liyang Chen, Yue Liu, Ilya Sochnikov, Anthony T. Bollinger, Myung-Geun Han, Yimei Zhu, Xi He, Ivan Bozovic, Douglas Natelson
In the quest to understand high-temperature superconductivity in copper oxides, a vigorous debate has been focused on the pseudogap - a partial gap that opens over portions of the Fermi surface in the 'normal' state above the bulk critical temperature ($T_{c}$).
Superconductivity • Mesoscale and Nanoscale Physics • Strongly Correlated Electrons
no code implementations • 29 Nov 2020 • Yan He, Jifang Qiu, Chang Liu, Yue Liu, Jian Wu
The latest theoretical advances in the field of unlimited sampling framework (USF) show the potential to avoid clipping problems of analog-to-digital converters (ADC).
no code implementations • 14 Sep 2020 • Yue Liu, Alex Colburn, Mehlika Inanici
The proposed DNN model can faithfully predict high-quality annual panoramic luminance maps within 30 minutes of training time from one of three options: a) point-in-time luminance imagery spanning 5% of the year, evenly distributed during daylight hours; b) one month of hourly imagery generated or collected continuously during daylight hours around the equinoxes (8% of the year); or c) 9 days of hourly data collected around the spring equinox and the summer and winter solstices (2.5% of the year). Any of these suffices to predict the luminance maps for the rest of the year.
no code implementations • 6 Sep 2020 • Weishan Zhang, Qinghua Lu, Qiuyu Yu, Zhaotong Li, Yue Liu, Sin Kit Lo, Shiping Chen, Xiwei Xu, Liming Zhu
Therefore, in this paper, we present a platform architecture of blockchain-based federated learning systems for failure detection in IIoT.
no code implementations • 4 Sep 2020 • Xinli Yu, Mohsen Malmir, Cynthia He, Yue Liu, Rex Wu
However, inference time is not a problem for our model, since its simple architecture enables efficient training and inference.
2 code implementations • ICML 2020 • Christos Matsoukas, Albert Bou I Hernandez, Yue Liu, Karin Dembrower, Gisele Miranda, Emir Konuk, Johan Fredin Haslum, Athanasios Zouzos, Peter Lindholm, Fredrik Strand, Kevin Smith
Evidence suggests that networks trained on large datasets generalize well not solely because of the numerous training examples, but also class diversity which encourages learning of enriched features.
1 code implementation • 11 Jul 2020 • Yue Liu, Hossein Azizpour, Fredrik Strand, Kevin Smith
With this in mind, we trained networks using three different criteria to select the positive training data (i.e., images from patients who will develop cancer): an inherent risk model trained on images with no visible signs of cancer, a cancer signs model trained on images containing cancer or early signs of cancer, and a conflated model trained on all images from patients with a cancer diagnosis.
no code implementations • 6 Jul 2020 • Yue Liu, Adam Ghandar, Georgios Theodoropoulos
In this paper, we describe the application of Neuroevolution to a P2P lending problem in which a credit evaluation model is updated based on streaming data.
no code implementations • 13 Jun 2020 • Chuanlong Xie, Haotian Ye, Fei Chen, Yue Liu, Rui Sun, Zhenguo Li
The key of the out-of-distribution (OOD) generalization is to generalize invariance from training domains to target domains.
no code implementations • 10 Jun 2020 • Zhuangyan Fang, Shengyu Zhu, Jiji Zhang, Yue Liu, Zhitang Chen, Yangbo He
Despite several advances in recent years, learning causal structures represented by directed acyclic graphs (DAGs) remains a challenging task in high dimensional settings when the graphs to be learned are not sparse.
no code implementations • 9 Jun 2020 • Kun Kuang, Bo Li, Peng Cui, Yue Liu, Jianrong Tao, Yueting Zhuang, Fei Wu
To address this problem, assuming that the relationships between causal variables and the response variable are invariant across data, we propose a conditional independence test based algorithm to separate causal variables with a seed variable as a prior, and adopt them for stable prediction.
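A hedged sketch of one standard conditional independence test that could play the role described above: a partial-correlation (Fisher z) test of whether two variables are independent given a conditioning set such as the seed variable. The test choice and toy data are assumptions, not the paper's exact procedure:

```python
import numpy as np
from scipy import stats

def fisher_z_ci_test(data, x, y, cond, alpha=0.05):
    """Partial-correlation conditional independence test (Fisher z transform).

    data: (n, d) array; x, y: column indices; cond: list of conditioning columns.
    Returns True if independence of X and Y given cond cannot be rejected at level alpha.
    """
    n = data.shape[0]
    cols = [x, y] + list(cond)
    corr = np.corrcoef(data[:, cols], rowvar=False)
    prec = np.linalg.pinv(corr)
    partial_r = -prec[0, 1] / np.sqrt(prec[0, 0] * prec[1, 1])
    z = 0.5 * np.log((1 + partial_r) / (1 - partial_r)) * np.sqrt(n - len(cond) - 3)
    p_value = 2 * (1 - stats.norm.cdf(abs(z)))
    return p_value > alpha

rng = np.random.default_rng(0)
seed_var = rng.normal(size=1000)
x = seed_var + 0.1 * rng.normal(size=1000)
y = seed_var + 0.1 * rng.normal(size=1000)
data = np.column_stack([x, y, seed_var])
print(fisher_z_ci_test(data, 0, 1, [2]))   # True: x and y are independent given the seed variable
```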
no code implementations • 5 Jun 2020 • Xin Cheng, Lei Zhang, Yin Tang, Yue Liu, Hao Wu, Jun He
For deep learning, performance improvements rely heavily on increasing model size or capacity to scale to ever-larger datasets, which inevitably increases the number of operations.
1 code implementation • 5 Mar 2020 • Helena H. Lee, Ke Shu, Palakorn Achananuparp, Philips Kokoh Prasetyo, Yue Liu, Ee-Peng Lim, Lav R. Varshney
Interest in the automatic generation of cooking recipes has been growing steadily over the past few years thanks to the large number of cooking recipes available online.
no code implementations • 23 Sep 2019 • Yue Liu, Elisabeth G. Rens, Leah Edelstein-Keshet
The polarization and motility of eukaryotic cells depend on the assembly and contraction of the actin cytoskeleton and its regulation by proteins called GTPases.
1 code implementation • 17 Sep 2019 • Helena Lee, Palakorn Achananuparp, Yue Liu, Ee-Peng Lim, Lav R. Varshney
Consumption of diets with low glycemic impact is highly recommended for diabetics and pre-diabetics as it helps maintain their blood glucose levels.
1 code implementation • 17 Sep 2019 • Yue Liu, Helena Lee, Palakorn Achananuparp, Ee-Peng Lim, Tzu-Ling Cheng, Shou-De Lin
Human beings are creatures of habit.
no code implementations • 2 Sep 2019 • Zhitang Chen, Shengyu Zhu, Yue Liu, Tim Tse
We show our algorithm can be reduced to an eigen-decomposition task on a kernel matrix measuring intrinsic deviance/invariance.
no code implementations • 28 Aug 2019 • Andreas Buttenschön, Yue Liu, Leah Edelstein-Keshet
We further consider the feedback between mechanical tension, GTPase activation, and cell deformation in static, growing, shrinking, and moving cells.
no code implementations • 17 Feb 2019 • Guang-Yu Nie, Yun Liu, Cong Wang, Yue Liu, Yongtian Wang
Three-dimensional (3-D) scene reconstruction is one of the key techniques in Augmented Reality (AR), which is related to the integration of image processing and display systems of complex information.
3 code implementations • 4 Jul 2018 • Yue Liu, Tongtao Zhang, Zhicheng Liang, Heng Ji, Deborah L. McGuinness
Inspired by recent successes in neural machine translation, we treat the triples within a given knowledge graph as an independent graph language and propose an encoder-decoder framework with an attention mechanism that leverages knowledge graph embeddings.
no code implementations • 18 May 2018 • Silvia L. Pintea, Yue Liu, Jan C. van Gemert
Knowledge distillation compacts deep networks by letting a small student network learn from a large teacher network.
no code implementations • WS 2015 • Yue Liu, Tao Ge, Kusum S. Mathews, Heng Ji, Deborah L. McGuinness
In the medical domain, identifying and expanding abbreviations in clinical texts is a vital task for both better human and machine understanding.
no code implementations • ICCV 2015 • Weipeng Xu, Mathieu Salzmann, Yongtian Wang, Yue Liu
Capturing the 3D motion of dynamic, non-rigid objects has attracted significant attention in computer vision.