Search Results for author: Yao Lei Xu

Found 9 papers, 1 paper with code

TensorGPT: Efficient Compression of the Embedding Layer in LLMs based on the Tensor-Train Decomposition

no code implementations · 2 Jul 2023 · Mingxue Xu, Yao Lei Xu, Danilo P. Mandic

High-dimensional token embeddings underpin Large Language Models (LLMs), as they can capture subtle semantic information and significantly enhance the modelling of complex language patterns.
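
The paper's own factorisation is not reproduced here; as a minimal sketch of the idea named in the title, the snippet below reshapes a small embedding matrix into a 4-way tensor and compresses it with a sequential truncated-SVD (TT-SVD style) decomposition. The vocabulary size, embedding size, reshaping, and TT ranks are illustrative assumptions, not values from the paper.

import numpy as np

def tt_svd(tensor, max_rank):
    # Split one mode off at a time with a truncated SVD, collecting 3-way TT cores.
    dims = tensor.shape
    cores, rank, mat = [], 1, tensor
    for n in dims[:-1]:
        mat = mat.reshape(rank * n, -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(u[:, :r].reshape(rank, n, r))
        mat = s[:r, None] * vt[:r]   # residual factor carried into the next core
        rank = r
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

# Toy "embedding layer" (sizes are assumptions): 1024 tokens x 256 dimensions,
# viewed as an 8 x 128 x 16 x 16 tensor before decomposition.
embedding = np.random.randn(1024, 256).astype(np.float32)
cores = tt_svd(embedding.reshape(8, 128, 16, 16), max_rank=16)
print("dense parameters:", embedding.size)              # 262144
print("TT parameters:", sum(c.size for c in cores))     # 20800 with these ranks

With these arbitrary ranks the TT cores hold roughly an order of magnitude fewer parameters than the dense matrix; the compression/accuracy trade-off studied in the paper depends on choices not shown here.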

Graph Tensor Networks: An Intuitive Framework for Designing Large-Scale Neural Learning Systems on Multiple Domains

no code implementations · 23 Mar 2023 · Yao Lei Xu, Kriton Konstantinidis, Danilo P. Mandic

Despite the omnipresence of tensors and tensor operations in modern deep learning, the use of tensor mathematics to formally design and describe neural networks is still under-explored within the deep learning community.

Tensor Networks

Complexity-based Financial Stress Evaluation

no code implementations · 5 Dec 2022 · Hongjian Xiao, Yao Lei Xu, Danilo P. Mandic

Financial markets typically exhibit dynamically complex properties as they undergo continuous interactions with economic and environmental factors.

Time Series · Time Series Analysis

Graph-Regularized Tensor Regression: A Domain-Aware Framework for Interpretable Multi-Way Financial Modelling

no code implementations · 26 Oct 2022 · Yao Lei Xu, Kriton Konstantinidis, Danilo P. Mandic

This represents a challenge for modern machine learning models, as the number of model parameters needed to process such data grows exponentially with the data dimensions; an effect known as the Curse-of-Dimensionality.

regression · Tensor Decomposition
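
To make the Curse-of-Dimensionality claim in the snippet above concrete with a simple, purely illustrative count: a dense linear map from an order-5 input tensor with 10 entries per mode to a single output already needs 10^5 = 100,000 weights, and each extra mode multiplies that count by another factor of 10, whereas a tensor-network parameterisation with fixed ranks grows only linearly in the number of modes.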

Graph Theory for Metro Traffic Modelling

no code implementations · 11 May 2021 · Bruno Scalzo Dees, Yao Lei Xu, Anthony G. Constantinides, Danilo P. Mandic

Finally, we also explore the application of modern deep learning models, such as graph neural networks and hyper-graph neural networks, as general-purpose models for the modelling and forecasting of underground data, especially in the context of the morning and evening rush hours.

Management

Tensor-Train Recurrent Neural Networks for Interpretable Multi-Way Financial Forecasting

no code implementations · 11 May 2021 · Yao Lei Xu, Giuseppe G. Calvi, Danilo P. Mandic

Recurrent Neural Networks (RNNs) represent the de facto standard machine learning tool for sequence modelling, owing to their expressive power and memory.

Tensor Decomposition

Tensor Networks for Multi-Modal Non-Euclidean Data

no code implementations · 27 Mar 2021 · Yao Lei Xu, Kriton Konstantinidis, Danilo P. Mandic

Modern data sources are typically of a large-scale and multi-modal nature, and are acquired on irregular domains, which poses serious challenges to traditional deep learning models.

Tensor Networks

Multi-Graph Tensor Networks

1 code implementation · 25 Oct 2020 · Yao Lei Xu, Kriton Konstantinidis, Danilo P. Mandic

The irregular and multi-modal nature of numerous modern data sources poses serious challenges for traditional deep learning algorithms.

Algorithmic Trading · Tensor Networks

Recurrent Graph Tensor Networks: A Low-Complexity Framework for Modelling High-Dimensional Multi-Way Sequence

no code implementations · 18 Sep 2020 · Yao Lei Xu, Danilo P. Mandic

Recurrent Neural Networks (RNNs) are among the most successful machine learning models for sequence modelling, but tend to suffer from an exponential increase in the number of parameters when dealing with large multidimensional data.

Tensor Networks · Time Series Forecasting
