Search Results for author: Shasha Guo

Found 6 papers, 1 paper with code

SGSH: Stimulate Large Language Models with Skeleton Heuristics for Knowledge Base Question Generation

1 code implementation • 2 Apr 2024 • Shasha Guo, Lizi Liao, Jing Zhang, Yanling Wang, Cuiping Li, Hong Chen

Knowledge base question generation (KBQG) aims to generate natural language questions from a set of triplet facts extracted from a knowledge base (KB).
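As a toy illustration of the KBQG task (not the paper's SGSH method, which prompts a large language model with skeleton heuristics), a KB triple can be mapped to a question whose answer is one element of the triple. The templates below are hypothetical:

```python
# Toy KBQG sketch: turn a (subject, relation, object) triple into a
# natural-language question. Templates here are invented for illustration;
# SGSH itself steers an LLM with question skeletons rather than templates.
TEMPLATES = {
    "capital_of": "What is the capital of {subject}?",   # answer: object
    "author_of": "Who wrote {object}?",                  # answer: subject
}

def generate_question(subject: str, relation: str, obj: str) -> str:
    template = TEMPLATES.get(relation)
    if template is None:
        # Fallback: a generic wh-question derived from the relation name.
        return f"What is the {relation.replace('_', ' ')} of {subject}?"
    return template.format(subject=subject, object=obj)

print(generate_question("France", "capital_of", "Paris"))
# -> What is the capital of France?
```

Template approaches like this are brittle, which is exactly what motivates neural and LLM-based KBQG.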

Question Generation

A Survey on Neural Question Generation: Methods, Applications, and Prospects

no code implementations • 28 Feb 2024 • Shasha Guo, Lizi Liao, Cuiping Li, Tat-Seng Chua

In this survey, we present a detailed examination of the advancements in Neural Question Generation (NQG), a field leveraging neural network techniques to generate relevant questions from diverse inputs like knowledge bases, texts, and images.

Question Generation

Diversifying Question Generation over Knowledge Base via External Natural Questions

no code implementations • 23 Sep 2023 • Shasha Guo, Jing Zhang, Xirui Ke, Cuiping Li, Hong Chen

These insights make diversifying question generation an intriguing task, where the first challenge is designing evaluation metrics for diversity.
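One widely used family of diversity metrics is Distinct-n, the ratio of unique n-grams to total n-grams across the generated questions; whether this paper adopts it is an assumption, but it illustrates what a diversity metric measures:

```python
# Distinct-n: fraction of n-grams across all generated questions that are
# unique. Higher values indicate more lexically diverse generations.
# Shown as a generic example; not claimed to be this paper's metric.
def distinct_n(questions, n=2):
    ngrams = []
    for q in questions:
        toks = q.split()
        ngrams.extend(tuple(toks[i:i + n]) for i in range(len(toks) - n + 1))
    return len(set(ngrams)) / len(ngrams) if ngrams else 0.0

print(distinct_n(["who wrote hamlet", "who wrote macbeth"], n=2))
# repeated "who wrote" bigram lowers the score below 1.0
```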

Natural Questions Question Answering +2

SeqXFilter: A Memory-efficient Denoising Filter for Dynamic Vision Sensors

no code implementations • 2 Jun 2020 • Shasha Guo, Lei Wang, Xiaofan Chen, Limeng Zhang, Ziyang Kang, Weixia Xu

We not only show the visual denoising effect of the filter but also use two metrics to quantitatively analyze its performance.

Denoising

A Noise Filter for Dynamic Vision Sensors using Self-adjusting Threshold

no code implementations • 8 Apr 2020 • Shasha Guo, Ziyang Kang, Lei Wang, Limeng Zhang, Xiaofan Chen, Shiming Li, Weixia Xu

Neuromorphic event-based dynamic vision sensors (DVS) have much faster sampling rates and a higher dynamic range than frame-based imagers.
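DVS noise filters typically exploit the fact that real events are correlated in space and time while noise events are isolated. A minimal sketch of a generic spatiotemporal-correlation filter follows (an illustration only; the paper's self-adjusting-threshold design differs):

```python
# Generic background-activity filter for a DVS event stream.
# An event (x, y, t) is kept only if one of its 8 neighbouring pixels
# fired within `window` microseconds; isolated events are treated as noise.
# This is a textbook-style sketch, not the paper's specific algorithm.
def denoise(events, width, height, window=1000):
    last_ts = [[None] * width for _ in range(height)]  # last event time per pixel
    kept = []
    for x, y, t in events:
        supported = False
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if dx == 0 and dy == 0:
                    continue
                nx, ny = x + dx, y + dy
                if 0 <= nx < width and 0 <= ny < height:
                    ts = last_ts[ny][nx]
                    if ts is not None and t - ts <= window:
                        supported = True
        if supported:
            kept.append((x, y, t))
        last_ts[y][x] = t  # update timestamp map regardless of the decision
    return kept
```

A fixed `window` is the simplest choice; a self-adjusting threshold, as in the paper's title, would adapt this decision boundary to local activity.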

Emerging Technologies Signal Processing

Exploration of Input Patterns for Enhancing the Performance of Liquid State Machines

no code implementations • 6 Apr 2020 • Shasha Guo, Lianhua Qu, Lei Wang, Shuo Tian, Shiming Li, Weixia Xu

To mitigate the difficulty of effectively handling the huge input space of an LSM, and to determine whether input reduction can enhance LSM performance, we explore several input patterns, namely fullscale, scanline, chessboard, and patch.
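The named patterns can be understood as binary masks that select which input pixels reach the liquid. The definitions below are plausible readings of the pattern names, not the paper's actual code:

```python
# Input-reduction masks for a 2D frame, guessed from the pattern names in
# the abstract (scanline, chessboard, patch); fullscale would be an
# all-ones mask. Each mask entry is 1 (keep pixel) or 0 (drop pixel).
def scanline_mask(h, w, step=2):
    # Keep every `step`-th row of the frame.
    return [[1 if r % step == 0 else 0 for _ in range(w)] for r in range(h)]

def chessboard_mask(h, w):
    # Alternate kept/dropped pixels like a checkerboard.
    return [[(r + c) % 2 for c in range(w)] for r in range(h)]

def patch_mask(h, w, top, left, ph, pw):
    # Keep a single rectangular ph x pw region starting at (top, left).
    return [[1 if top <= r < top + ph and left <= c < left + pw else 0
             for c in range(w)] for r in range(h)]

def apply_mask(frame, mask):
    # Element-wise product: masked-out pixels become 0.
    return [[v * m for v, m in zip(fr, mr)] for fr, mr in zip(frame, mask)]
```

Each mask reduces the number of active input channels fed to the LSM, which is the input-reduction question the paper investigates.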
