no code implementations • 25 Nov 2024 • Shenghe Zheng, Hongzhi Wang
In the current era of rapidly expanding model scale, open-source model weights for various tasks are increasingly available.
no code implementations • 24 Sep 2024 • Satyananda Kashyap, Niharika S. D'Souza, Luyao Shi, Ken C. L. Wong, Hongzhi Wang, Tanveer Syeda-Mahmood
Content-addressable memories such as Modern Hopfield Networks (MHN) have been studied as mathematical models of auto-association and of storage/retrieval in human declarative memory, yet their practical use for large-scale content storage faces challenges.
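The auto-association idea behind MHNs can be illustrated with the standard softmax retrieval update; the sketch below shows only the generic update rule, not the storage scheme studied in this paper, and `beta` and the pattern matrix are illustrative.

```python
import numpy as np

def mhn_retrieve(patterns, query, beta=8.0):
    """One retrieval step of a Modern Hopfield Network: the query is pulled
    toward the stored pattern it most resembles (auto-association)."""
    scores = beta * patterns @ query          # similarity of the query to each stored pattern
    attn = np.exp(scores - scores.max())
    attn /= attn.sum()                        # softmax over stored patterns
    return patterns.T @ attn                  # retrieved (denoised) pattern

# Usage: patterns has shape (num_patterns, dim), query has shape (dim,)
```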
1 code implementation • 16 Aug 2024 • Huaiyuan Liu, Xianzhang Liu, Donghua Yang, Hongzhi Wang, Yingchi Long, Mengtong Ji, Dongjing Miao, Zhiyu Liang
The unsupervised solver is inspired by a relaxation-plus-rounding approach: the relaxed solution is parameterized by graph neural networks, and the cost and penalty terms of MMCP are written out explicitly, so the model can be trained end-to-end.
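A minimal sketch of the relaxation-plus-rounding idea, assuming hypothetical `cost` and `penalty` relaxations rather than the paper's exact MMCP formulation:

```python
import torch

def unsupervised_loss(probs, cost, penalty, lam=10.0):
    """probs: per-element selection probabilities output by a GNN.
    cost/penalty: differentiable relaxations of the objective and of its
    constraint violations. Minimizing this trains the solver end-to-end."""
    return cost(probs) + lam * penalty(probs)

def round_solution(probs, threshold=0.5):
    """At inference, the relaxed (fractional) solution is rounded to a discrete one."""
    return (probs > threshold).long()
```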
no code implementations • 20 Jun 2024 • Hongzhi Wang, Xiubo Liang, Mengjian Li, Tao Zhang
Spiking Neural Networks (SNNs), renowned for their bio-inspired operational mechanism and energy efficiency, mirror the neural activity of the human brain.
no code implementations • 16 Jun 2024 • Prashanth Vijayaraghavan, Hongzhi Wang, Luyao Shi, Tyler Baldwin, David Beymer, Ehsan Degan
Recently, there has been a growing availability of pre-trained text models on various model repositories.
1 code implementation • 2 May 2024 • Shenghe Zheng, Hongzhi Wang, Xianglong Liu
IntraMix efficiently tackles both issues faced by graphs and challenges the prior notion of the limited effectiveness of Mixup in node classification.
no code implementations • 9 Dec 2023 • Chen Liang, Donghua Yang, Zhiyu Liang, Hongzhi Wang, Zheng Liang, Xiyang Zhang, Jianfeng Huang
In contrast to conventional methods that fuse features from multiple modalities, our proposed approach simplifies the neural architecture by retaining a single time series encoder, thereby preserving scalability.
no code implementations • 21 Nov 2023 • Ken C. L. Wong, Levente Klein, Ademir Ferreira da Silva, Hongzhi Wang, Jitendra Singh, Tanveer Syeda-Mahmood
To study the use of CNNs for SOC remote sensing, we propose FNO-DenseNet, which is based on the Fourier neural operator (FNO).
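The core FNO operation is a spectral convolution; the sketch below is a simplified single-channel version (the retained negative-frequency modes and the DenseNet-style connections of FNO-DenseNet are omitted):

```python
import numpy as np

def spectral_conv2d(x, weights, modes=12):
    """Fourier layer sketch: FFT the input, mix only the lowest `modes` frequencies
    with learned complex weights of shape (modes, modes), then inverse FFT."""
    x_ft = np.fft.rfft2(x)                       # frequency-domain representation
    out_ft = np.zeros_like(x_ft)
    out_ft[:modes, :modes] = x_ft[:modes, :modes] * weights
    return np.fft.irfft2(out_ft, s=x.shape)      # back to the spatial domain
```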
1 code implementation • 5 Oct 2023 • Ken C. L. Wong, Hongzhi Wang, Tanveer Syeda-Mahmood
Due to the computational complexity of 3D medical image segmentation, training with downsampled images is a common remedy for out-of-memory errors in deep learning.
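The downsampling remedy itself is straightforward; a minimal sketch (using scipy, with an illustrative factor) of preparing a 3D volume and restoring a predicted label map:

```python
import numpy as np
from scipy.ndimage import zoom

def downsample_volume(volume, factor=2):
    """Trilinear-style downsampling: training memory drops roughly by factor**3."""
    return zoom(volume, 1.0 / factor, order=1)

def upsample_labels(label_map, factor=2):
    """Nearest-neighbor upsampling so that discrete segmentation labels are preserved."""
    return zoom(label_map, float(factor), order=0)
```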
1 code implementation • 5 Oct 2023 • Ken C. L. Wong, Hongzhi Wang, Tanveer Syeda-Mahmood
With the introduction of Transformers, different attention-based models have been proposed for image segmentation with promising results.
1 code implementation • 25 Jul 2023 • Kaixin Zhang, Hongzhi Wang, Yabin Lu, ZiQi Li, Chang Shu, Yu Yan, Donghua Yang
Although both data-driven and hybrid methods have been proposed to avoid this problem, most of them suffer from high training and estimation costs, limited scalability, instability, and long-tail distribution problems on high-dimensional tables, which seriously limit the practical application of learned cardinality estimators.
no code implementations • 13 Jul 2023 • Niharika S. D'Souza, Hongzhi Wang, Andrea Giovannini, Antonio Foncubierta-Rodriguez, Kristen L. Beck, Orest Boyko, Tanveer Syeda-Mahmood
With the emergence of multimodal electronic health records, the evidence for an outcome may be captured across multiple modalities ranging from clinical to imaging and genomic data.
1 code implementation • 30 May 2023 • Zhiyu Liang, Jianfeng Zhang, Chen Liang, Hongzhi Wang, Zheng Liang, Lujia Pan
Recent studies have shown great promise in unsupervised representation learning (URL) for multivariate time series, because URL is capable of learning generalizable representations for many downstream tasks without using inaccessible labels.
1 code implementation • 11 Apr 2023 • Huaiyuan Liu, Xianzhang Liu, Donghua Yang, Zhiyu Liang, Hongzhi Wang, Yong Cui, Jun Gu
Unfortunately, existing deep learning-based methods neglect the hidden dependencies across different dimensions and rarely consider the unique dynamic features of time series, so they lack sufficient feature extraction capability to achieve satisfactory classification accuracy.
1 code implementation • 24 Mar 2023 • Zhiyu Liang, Chen Liang, Zheng Liang, Hongzhi Wang, Bo Zheng
Machine learning has emerged as a powerful tool for time series analysis.
1 code implementation • 25 Feb 2023 • Shenghe Zheng, Hongzhi Wang, Tianyu Mu
Therefore, the critical issue in utilizing predictors for NAS is to train a high-performance predictor using as few trained neural networks as possible.
no code implementations • 21 Feb 2023 • Zhiyu Liang, Hongzhi Wang
We identify the federated shapelet search step as the kernel of FedST.
no code implementations • 25 Oct 2022 • Niharika S. D'Souza, Hongzhi Wang, Andrea Giovannini, Antonio Foncubierta-Rodriguez, Kristen L. Beck, Orest Boyko, Tanveer Syeda-Mahmood
All these nuances make simple methods of early, late, or intermediate fusion of features inadequate for outcome prediction.
no code implementations • 19 Oct 2022 • Bozhou Chen, Hongzhi Wang, Chenmin Ba
Learning rate adaptation is a popular topic in machine learning.
no code implementations • 18 Jun 2022 • Geng Li, Boyuan Ren, Hongzhi Wang
To accelerate the learning process with few samples, meta-learning resorts to prior knowledge from previous tasks.
no code implementations • 20 May 2022 • Xinyue Shao, Hongzhi Wang, Xiao Zhu, Feng Xiong
Meta-learning enables the efficient, automatic selection of machine learning models by combining data and prior knowledge.
no code implementations • 26 Mar 2022 • Chunnan Wang, Xingyu Chen, Chengyue Wu, Hongzhi Wang
We allow design experience from different sources to be combined effectively, so as to create a search space containing a variety of TSF models to support different TSF tasks.
1 code implementation • 24 Jan 2022 • Chunnan Wang, Hongzhi Wang, Xiangyu Shi
Model compression methods can reduce model complexity while maintaining acceptable performance, and thus promote the application of deep neural networks in resource-constrained environments.
no code implementations • 9 Jan 2022 • Chunnan Wang, Chen Liang, Xiang Chen, Hongzhi Wang
They lack the ability to evaluate themselves, that is, to examine the rationality of their prediction results, and thus fail to guide users in identifying high-quality results among their candidates.
no code implementations • CVPR 2022 • Chunnan Wang, Xiang Chen, Junzhe Wang, Hongzhi Wang
Although Trajectory Prediction (TP) models have achieved great success in the computer vision and robotics fields, their architecture and training scheme design rely on heavy manual work and domain knowledge, which is not friendly to ordinary users.
no code implementations • 10 Dec 2021 • Ken C. L. Wong, Hongzhi Wang, Etienne E. Vos, Bianca Zadrozny, Campbell D. Watson, Tanveer Syeda-Mahmood
Global warming leads to an increase in the frequency and intensity of climate extremes, which cause tremendous loss of life and property.
no code implementations • 29 Sep 2021 • Bozhou Chen, Hongzhi Wang, Chenmin Ba
We apply our method to the optimization of the hyper-parameters of various neural network layers and compare it with multiple benchmark hyper-parameter optimization models.
no code implementations • 21 Sep 2021 • Guosheng Feng, Chunnan Wang, Hongzhi Wang
Current GNN-oriented NAS methods focus on searching for different layer aggregation components with shallow and simple architectures, which are limited by the 'over-smoothing' problem.
no code implementations • 27 May 2021 • Kaixin Zhang, Hongzhi Wang, Han Hu, Songling Zou, Jiye Qiu, Tongxin Li, Zhishun Wang
In this paper, we demonstrate TENSILE, a method for managing GPU memory at tensor granularity to reduce the GPU memory peak while accounting for multiple dynamic workloads.
no code implementations • 9 Apr 2021 • Chunnan Wang, Bozhou Chen, Geng Li, Hongzhi Wang
Recently, some Neural Architecture Search (NAS) techniques have been proposed for the automatic design of Graph Convolutional Network (GCN) architectures.
no code implementations • 8 Jan 2021 • Meifan Zhang, Hongzhi Wang
Online sampling chooses samples for the given query at query time, but it incurs long latency.
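A minimal sketch of the online-sampling idea for an approximate aggregate (illustrative names; the sample is uniform and drawn at query time, which is the source of the latency):

```python
import numpy as np

def approx_sum(column, sample_frac=0.01, seed=0):
    """Estimate SUM(column) from a uniform sample drawn at query time,
    scaling the sample sum back up by the inverse sampling rate."""
    rng = np.random.default_rng(seed)
    n = len(column)
    k = max(1, int(n * sample_frac))
    idx = rng.choice(n, size=k, replace=False)
    return column[idx].sum() * (n / k)
```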
no code implementations • 15 Oct 2020 • Chunnan Wang, Kaixin Zhang, Hongzhi Wang, Bozhou Chen
In recent years, many spatial-temporal graph convolutional network (STGCN) models have been proposed to deal with the spatial-temporal network data forecasting problem.
1 code implementation • 18 Sep 2020 • Zhaochong An, Bozhou Chen, Houde Quan, Qihui Lin, Hongzhi Wang
To solve this problem, in this paper we propose a general framework named EM-RBR (embedding and rule-based reasoning), which combines the advantages of rule-based reasoning and state-of-the-art embedding models.
no code implementations • 15 Aug 2020 • Hongzhi Wang, Yan Wei, Hao Yan
Therefore, database users need to select the storage engine and design the data model according to the workload they encounter.
no code implementations • 7 Jul 2020 • Tianyu Mu, Hongzhi Wang, Chunnan Wang, Zheng Liang
In our work, we present Auto-CASH, a pre-trained model based on meta-learning, to solve the CASH problem more efficiently.
1 code implementation • 6 Jul 2020 • Chunnan Wang, Hongzhi Wang, Guosheng Feng, Fei Geng
To reduce search cost, most NAS algorithms use a fixed outer network-level structure and search only the repeatable cell structure.
no code implementations • 16 Jun 2020 • Shun Yao, Hongzhi Wang, Yu Yan
We propose a new approach to NoSQL database index selection.
no code implementations • 20 Apr 2020 • John Paul Francis, Hongzhi Wang, Kate White, Tanveer Syeda-Mahmood, Raymond Stevens
The pancreatic beta cell is an important target in diabetes research.
1 code implementation • 9 Apr 2020 • Hongzhi Wang, Bozhou Chen, Yueyang Xu, Kaixin Zhang, Shengwen Zheng
In this demo, we present ConsciousControlFlow (CCF), a prototype system to demonstrate conscious Artificial Intelligence (AI).
no code implementations • 5 Mar 2020 • Meifan Zhang, Hongzhi Wang
Querying big data is a challenging task due to the rapid growth of data volume.
no code implementations • 3 Mar 2020 • Bozhou Chen, Kaixin Zhang, Longshen Ou, Chenmin Ba, Hongzhi Wang, Chunnan Wang
However, most machine learning algorithms are sensitive to their hyper-parameters.
no code implementations • 2 Dec 2019 • Chunnan Wang, Hongzhi Wang, Chang Zhou, Hanxiao Chen
Motivated by this, we propose the ExperienceThinking algorithm to quickly find the best possible hyperparameter configuration of machine learning algorithms within a few configuration evaluations.
no code implementations • 24 Oct 2019 • Chunnan Wang, Hongzhi Wang, Tianyu Mu, Jianzhong Li, Hong Gao
In many fields, a large number of algorithms with completely different hyperparameters have been developed to address the same type of problem.
no code implementations • 22 Aug 2019 • Hongzhi Wang, Yijie Yang, Yang song
In industrial data analytics, one of the fundamental problems is to utilize the temporal correlation of the industrial data to make timely predictions in the production process, such as fault prediction and yield prediction.
no code implementations • 10 Aug 2019 • Xi Chen, Hongzhi Wang, Yanjie Wei, Jianzhong Li, Hong Gao
In this paper, we focus on online methods for AR-model-based time series prediction with missing values.
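A minimal sketch of one simple baseline for this setting, not the paper's algorithm: an AR(p) model whose coefficients are updated online by stochastic gradient descent, with missing observations imputed by the model's own prediction.

```python
import numpy as np

def online_ar(stream, p=3, lr=0.01):
    """Online AR(p) prediction over a stream that may contain np.nan (missing values)."""
    w = np.zeros(p)               # AR coefficients
    history = np.zeros(p)         # last p (possibly imputed) observations
    predictions = []
    for x in stream:
        y_hat = w @ history
        predictions.append(y_hat)
        if np.isnan(x):           # missing value: fall back to the prediction
            x = y_hat
        w += lr * (x - y_hat) * history   # gradient step on squared error
        history = np.roll(history, 1)
        history[0] = x
    return predictions
```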
no code implementations • 9 Aug 2019 • Hongzhi Wang, Yang song, Shihan Tang
In this paper, we propose a method for predicting continuous time series variables from production or flow processes: an LSTM algorithm based on multivariate tuning.
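For orientation, a generic multivariate LSTM forecaster in PyTorch (a sketch only; the paper's multivariate tuning procedure is not reproduced here):

```python
import torch
import torch.nn as nn

class MultivariateLSTMForecaster(nn.Module):
    """Predict the next value of a target variable from a window of multivariate history."""
    def __init__(self, n_features, hidden_size=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                 # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])      # forecast from the last hidden state
```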
no code implementations • 2 Jul 2019 • Vaishnavi Subramanian, Hongzhi Wang, Joy T. Wu, Ken C. L. Wong, Arjun Sharma, Tanveer Syeda-Mahmood
Central venous catheters (CVCs) are commonly used in critical care settings for monitoring body functions and administering medications.
no code implementations • 5 Aug 2018 • Zhemin Liu, Feng Xiong, Kaifa Zou, Hongzhi Wang
Real-time and open online course resources of MOOCs have attracted a large number of learners in recent years.
no code implementations • 5 Aug 2018 • Sifan Liu, Hongzhi Wang
Motivated by this, we measure the relation between two concepts by the distance between their corresponding instances and detect errors within the intersection of the conflicting concept sets.
no code implementations • 16 Mar 2018 • Zhixin Qi, Hongzhi Wang, Jianzhong Li, Hong Gao
Data quality issues have attracted widespread attention due to the negative impacts of dirty data on data mining and machine learning results.
1 code implementation • 27 Nov 2017 • Xin Li, Qiao Liu, Nana Fan, Zhenyu He, Hongzhi Wang
In this paper, we cast the TIR tracking problem as a similarity verification task, which aligns well with the objective of the tracking task.
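Casting tracking as similarity verification amounts to scoring candidate regions against a template embedding; a minimal cosine-similarity sketch (illustrative, not the paper's network):

```python
import numpy as np

def verify(template_feat, candidate_feats):
    """Rank candidate regions by cosine similarity to the template feature;
    the highest-scoring candidate is taken as the tracked target."""
    t = template_feat / np.linalg.norm(template_feat)
    c = candidate_feats / np.linalg.norm(candidate_feats, axis=1, keepdims=True)
    scores = c @ t
    return int(np.argmax(scores)), scores
```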
no code implementations • 24 Nov 2014 • Junxiong Wang, Hongzhi Wang, Chenxu Zhao
Currently, many machine learning algorithms involve a large number of iterations.