Search Results for author: Haoxiang Shi

Found 7 papers, 2 papers with code

EM-TTS: Efficiently Trained Low-Resource Mongolian Lightweight Text-to-Speech

no code implementations • 13 Mar 2024 • ZiQi Liang, Haoxiang Shi, Jiawei Wang, Keda Lu

Recurrent neural networks have become a standard modeling technique for sequential data and are widely used in TTS systems.

Speech Synthesis

Towards Consistency Filtering-Free Unsupervised Learning for Dense Retrieval

no code implementations • 5 Aug 2023 • Haoxiang Shi, Sumio Fujita, Tetsuya Sakai

In addition, consistency filtering often struggles to identify retrieval intentions and recognize query and corpus distributions in a target domain.

Information Retrieval • Retrieval

Is ChatGPT a Good NLG Evaluator? A Preliminary Study

1 code implementation • 7 Mar 2023 • Jiaan Wang, Yunlong Liang, Fandong Meng, Zengkui Sun, Haoxiang Shi, Zhixu Li, Jinan Xu, Jianfeng Qu, Jie Zhou

In detail, we regard ChatGPT as a human evaluator and give task-specific (e.g., summarization) and aspect-specific (e.g., relevance) instructions to prompt ChatGPT to evaluate the generated results of NLG models.

NLG Evaluation • Story Generation
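As a rough illustration of that setup, the sketch below assembles a task- and aspect-specific evaluation instruction of the kind the abstract describes. The `build_eval_prompt` helper and its wording are invented for illustration; they are not the paper's actual prompts.

```python
def build_eval_prompt(task: str, aspect: str, source: str, output: str) -> str:
    """Build a task- and aspect-specific instruction asking an LLM to act
    as a human evaluator and score a generated text.

    Illustrative sketch only: the exact phrasing, scale, and format used
    in the paper may differ.
    """
    return (
        f"You are a human evaluator for {task}.\n"
        f"Rate the {aspect} of the generated {task} below "
        f"on a scale from 1 to 5 (5 is best).\n\n"
        f"Source:\n{source}\n\n"
        f"Generated {task}:\n{output}\n\n"
        f"{aspect.capitalize()} score:"
    )


prompt = build_eval_prompt(
    task="summarization",
    aspect="relevance",
    source="The city council approved the new transit budget on Monday.",
    output="The council approved the transit budget.",
)
```

The resulting string would then be sent to the model as a single instruction, with the numeric score parsed from its reply.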

GOAL: Towards Benchmarking Few-Shot Sports Game Summarization

1 code implementation • 18 Jul 2022 • Jiaan Wang, Tingyi Zhang, Haoxiang Shi

Sports game summarization aims to generate sports news based on real-time commentaries.

Benchmarking

A Siamese CNN Architecture for Learning Chinese Sentence Similarity

no code implementations • Asian Chapter of the Association for Computational Linguistics 2020 • Haoxiang Shi, Cen Wang, Tetsuya Sakai

This paper presents a deep neural architecture that applies a Siamese convolutional neural network with shared model parameters to learn a semantic similarity metric between two sentences.

Semantic Similarity • Semantic Textual Similarity +2
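A minimal PyTorch sketch of such a Siamese setup: one CNN encoder whose parameters are shared between the two branches, with a cosine-similarity head. The layer sizes, max-over-time pooling, and similarity head are assumptions for illustration, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SiameseCNN(nn.Module):
    """Siamese CNN sentence encoder: both inputs pass through the SAME
    embedding and convolution, so the similarity metric is symmetric.

    Illustrative sketch; hyperparameters are arbitrary.
    """

    def __init__(self, vocab_size: int, emb_dim: int = 64,
                 n_filters: int = 32, kernel: int = 3):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel, padding=1)

    def encode(self, ids: torch.Tensor) -> torch.Tensor:
        x = self.emb(ids).transpose(1, 2)   # (batch, emb_dim, seq_len)
        x = F.relu(self.conv(x))            # (batch, n_filters, seq_len)
        return x.max(dim=2).values          # max-over-time pooling

    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        # Both branches reuse the same (shared) parameters.
        return F.cosine_similarity(self.encode(a), self.encode(b))


torch.manual_seed(0)
model = SiameseCNN(vocab_size=100)
a = torch.randint(0, 100, (2, 7))  # two token-id sequences of length 7
b = torch.randint(0, 100, (2, 7))
scores = model(a, b)               # one similarity per sentence pair
```

Training would typically push `scores` toward 1 for paraphrase pairs and toward a low value for unrelated pairs, e.g. with a contrastive or mean-squared-error loss.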

Self-supervised Document Clustering Based on BERT with Data Augment

no code implementations • 17 Nov 2020 • Haoxiang Shi, Cen Wang

Contrastive learning is a promising approach to unsupervised learning, as it inherits the advantages of well-studied deep models without a dedicated and complex model design.

Clustering • Contrastive Learning +2
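A common objective in this kind of self-supervised setup is a normalized temperature-scaled cross-entropy (NT-Xent) loss between a document embedding and the embedding of its augmented view, with the rest of the batch as negatives. The NumPy sketch below is a generic illustration under that assumption, not necessarily the paper's exact loss.

```python
import numpy as np


def nt_xent(z1: np.ndarray, z2: np.ndarray, tau: float = 0.5) -> float:
    """NT-Xent contrastive loss over a batch of paired views.

    Row i of z1 and row i of z2 are two views (e.g. a document and its
    augmentation) and form a positive pair; every other row in the batch
    serves as a negative. Simplified sketch for illustration.
    """
    z = np.concatenate([z1, z2])                       # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # unit-normalize rows
    sim = z @ z.T / tau                                # scaled cosine sims
    np.fill_diagonal(sim, -np.inf)                     # exclude self-pairs
    n = len(z1)
    # Index of each row's positive: i <-> i + n
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    logprob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return float(-logprob[np.arange(2 * n), pos].mean())


rng = np.random.default_rng(0)
docs = rng.normal(size=(8, 16))            # stand-in document embeddings
loss_aligned = nt_xent(docs, docs.copy())  # views agree: low loss
loss_random = nt_xent(docs, rng.normal(size=(8, 16)))  # unrelated views
```

Minimizing this loss pulls each document toward its augmented view and pushes it away from the rest of the batch; cluster assignments can then be read off the learned embeddings.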
