Search Results for author: Lionel M. Ni

Found 16 papers, 7 papers with code

QuantAgent: Seeking Holy Grail in Trading by Self-Improving Large Language Model

no code implementations • 6 Feb 2024 • Saizhuo Wang, Hang Yuan, Lionel M. Ni, Jian Guo

Autonomous agents based on Large Language Models (LLMs) that devise plans and tackle real-world challenges have gained prominence. However, tailoring these agents for specialized domains like quantitative investment remains a formidable task.

Language Modelling, Large Language Model

Alpha-GPT: Human-AI Interactive Alpha Mining for Quantitative Investment

no code implementations • 31 Jul 2023 • Saizhuo Wang, Hang Yuan, Leon Zhou, Lionel M. Ni, Heung-Yeung Shum, Jian Guo

One of the most important tasks in quantitative investment research is mining new alphas (effective trading signals or factors).

Prompt Engineering

Think-on-Graph: Deep and Responsible Reasoning of Large Language Model on Knowledge Graph

3 code implementations • 15 Jul 2023 • Jiashuo Sun, Chengjin Xu, Lumingyuan Tang, Saizhuo Wang, Chen Lin, Yeyun Gong, Lionel M. Ni, Heung-Yeung Shum, Jian Guo

Although large language models (LLMs) have achieved significant success in various tasks, they often struggle with hallucination problems, especially in scenarios requiring deep and responsible reasoning.

Hallucination, Knowledge Graphs, +3

Closed-Loop Transcription via Convolutional Sparse Coding

no code implementations • 18 Feb 2023 • Xili Dai, Ke Chen, Shengbang Tong, Jingyuan Zhang, Xingjian Gao, Mingyang Li, Druv Pai, Yuexiang Zhai, Xiaojun Yuan, Heung-Yeung Shum, Lionel M. Ni, Yi Ma

Our method is arguably the first to demonstrate that a concatenation of multiple convolution sparse coding/decoding layers leads to an interpretable and effective autoencoder for modeling the distribution of large-scale natural image datasets.

Rolling Shutter Correction

Quant 4.0: Engineering Quantitative Investment with Automated, Explainable and Knowledge-driven Artificial Intelligence

no code implementations • 13 Dec 2022 • Jian Guo, Saizhuo Wang, Lionel M. Ni, Heung-Yeung Shum

Quant has become one of the mainstream investment methodologies over the past decades, and has experienced three generations: Quant 1.0, trading by mathematical modeling to discover mis-priced assets in markets; Quant 2.0, shifting the quant research pipeline from small "strategy workshops" to large "alpha factories"; Quant 3.0, applying deep learning techniques to discover complex nonlinear pricing rules.

DINO: DETR with Improved DeNoising Anchor Boxes for End-to-End Object Detection

14 code implementations • 7 Mar 2022 • Hao Zhang, Feng Li, Shilong Liu, Lei Zhang, Hang Su, Jun Zhu, Lionel M. Ni, Heung-Yeung Shum

Compared to other models on the leaderboard, DINO significantly reduces its model size and pre-training data size while achieving better results.

Real-Time Object Detection

Vision-Language Intelligence: Tasks, Representation Learning, and Large Models

no code implementations • 3 Mar 2022 • Feng Li, Hao Zhang, Yi-Fan Zhang, Shilong Liu, Jian Guo, Lionel M. Ni, Pengchuan Zhang, Lei Zhang

This survey is inspired by the remarkable progress in both computer vision and natural language processing, and recent trends shifting from single modality processing to multiple modality comprehension.

Few-Shot Learning, Representation Learning

DN-DETR: Accelerate DETR Training by Introducing Query DeNoising

16 code implementations • CVPR 2022 • Feng Li, Hao Zhang, Shilong Liu, Jian Guo, Lionel M. Ni, Lei Zhang

Our method is universal and can be easily plugged into any DETR-like methods by adding dozens of lines of code to achieve a remarkable improvement.
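The core idea of DN-DETR's query denoising, as the abstract describes, is easy to bolt onto DETR-like detectors: noised copies of ground-truth boxes are fed as extra decoder queries and the model is trained to reconstruct the originals. The NumPy sketch below illustrates only the box-noising step; the function name, parameter names, and the exact noise scheme are illustrative assumptions, not the paper's precise implementation.

```python
import numpy as np

def noise_boxes(gt_boxes, box_noise=0.4, rng=None):
    """Jitter ground-truth (cx, cy, w, h) boxes to build denoising queries.

    Centers are shifted by up to box_noise * half the box size, and widths/
    heights are rescaled by a random factor in [1 - box_noise, 1 + box_noise].
    """
    rng = rng or np.random.default_rng()
    cx, cy, w, h = gt_boxes.T
    cx = cx + rng.uniform(-1, 1, cx.shape) * box_noise * w / 2
    cy = cy + rng.uniform(-1, 1, cy.shape) * box_noise * h / 2
    w = w * (1 + rng.uniform(-1, 1, w.shape) * box_noise)
    h = h * (1 + rng.uniform(-1, 1, h.shape) * box_noise)
    return np.stack([cx, cy, w, h], axis=1)

# Two normalized ground-truth boxes; the noised copies would be appended to
# the decoder queries, with a reconstruction loss back to `gt`.
gt = np.array([[0.5, 0.5, 0.2, 0.2], [0.3, 0.7, 0.1, 0.4]])
noised = noise_boxes(gt, rng=np.random.default_rng(0))
```

In training, the denoising branch gives the decoder direct supervision on box refinement, which is what stabilizes and accelerates DETR's bipartite-matching-based learning.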

Object Detection

Generalizing from a Few Examples: A Survey on Few-Shot Learning

4 code implementations • 10 Apr 2019 • Yaqing Wang, Quanming Yao, James Kwok, Lionel M. Ni

Machine learning has been highly successful in data-intensive applications but is often hampered when the data set is small.

BIG-bench Machine Learning, Few-Shot Learning

General Convolutional Sparse Coding with Unknown Noise

no code implementations • 8 Mar 2019 • Yaqing Wang, James T. Kwok, Lionel M. Ni

However, existing CSC methods can only model noise drawn from a Gaussian distribution, which is restrictive and unrealistic.

Online Convolutional Sparse Coding with Sample-Dependent Dictionary

no code implementations • ICML 2018 • Yaqing Wang, Quanming Yao, James T. Kwok, Lionel M. Ni

Convolutional sparse coding (CSC) has been popularly used for the learning of shift-invariant dictionaries in image and signal processing.

Scalable Online Convolutional Sparse Coding

no code implementations • 21 Jun 2017 • Yaqing Wang, Quanming Yao, James T. Kwok, Lionel M. Ni

Convolutional sparse coding (CSC) improves sparse coding by learning a shift-invariant dictionary from the data.
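The CSC model shared by the three papers above approximates a signal x as a sum of shift-invariant dictionary filters d_k convolved with sparse code maps z_k. The sketch below is a minimal 1-D illustration of that model, using a generic ISTA (gradient step plus soft-thresholding) update for the codes; the filter, step size, and sparsity weight are illustrative assumptions, not the algorithms from these papers.

```python
import numpy as np

def reconstruct(filters, codes):
    """Reconstruct a 1-D signal as sum_k conv(d_k, z_k)."""
    return sum(np.convolve(z, d, mode="same") for d, z in zip(filters, codes))

def ista_step(x, filters, codes, step=0.1, lam=0.05):
    """One ISTA update on the codes: a gradient step on the reconstruction
    error, followed by soft-thresholding (the sparsity-inducing proximal op)."""
    residual = reconstruct(filters, codes) - x
    new_codes = []
    for d, z in zip(filters, codes):
        # Cross-correlating the residual with the filter gives the gradient
        # of the squared reconstruction error with respect to z.
        grad = np.correlate(residual, d, mode="same")
        u = z - step * grad
        new_codes.append(np.sign(u) * np.maximum(np.abs(u) - step * lam, 0.0))
    return new_codes

x = np.zeros(64)
x[10], x[40] = 1.0, -0.5                 # a sparse spike train to encode
filters = [np.array([0.25, 0.5, 0.25])]  # one shift-invariant filter
codes = [np.zeros_like(x)]
for _ in range(50):
    codes = ista_step(x, filters, codes)

err0 = np.linalg.norm(x)                                # error with zero codes
err = np.linalg.norm(x - reconstruct(filters, codes))   # error after ISTA
```

Because the same small filter is convolved across every position, the dictionary is shift-invariant: a feature learned at one location applies everywhere, which is what distinguishes CSC from patch-based sparse coding.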
