Search Results for author: Shiqi Gao

Found 3 papers, 1 paper with code

MIKO: Multimodal Intention Knowledge Distillation from Large Language Models for Social-Media Commonsense Discovery

no code implementations • 28 Feb 2024 • Feihong Lu, Weiqi Wang, Yangyifei Luo, Ziqin Zhu, Qingyun Sun, Baixuan Xu, Haochen Shi, Shiqi Gao, Qian Li, Yangqiu Song, JianXin Li

However, understanding the intention behind social media posts remains challenging due to the implicitness of intentions in social media posts, the need for cross-modality understanding of both text and images, and the presence of noisy information such as hashtags, misspelled words, and complicated abbreviations.

Knowledge Distillation • Language Modelling • +2

SSGD: A safe and efficient method of gradient descent

1 code implementation • 3 Dec 2020 • Jinhuan Duan, Xianxian Li, Shiqi Gao, Jinyan Wang, Zili Zhong

In this paper, to prevent gradient leakage while preserving model accuracy, we propose the super stochastic gradient descent approach, which updates parameters by concealing the modulus length of each gradient vector and converting it into a unit vector.
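The core idea in the snippet above can be sketched as a parameter update that discards the gradient's magnitude and keeps only its direction. This is a minimal illustration, not the paper's implementation; the function name `super_sgd_step` and the learning rate `lr` are assumptions for the sketch.

```python
import numpy as np

def super_sgd_step(params, grad, lr=0.1):
    """Hypothetical sketch: update parameters with a unit-normalized gradient,
    concealing the modulus length of the gradient vector."""
    norm = np.linalg.norm(grad)
    if norm == 0:
        return params  # zero gradient: nothing to update
    unit_grad = grad / norm  # keep direction, conceal magnitude
    return params - lr * unit_grad

params = np.array([1.0, 2.0])
grad = np.array([3.0, 4.0])  # modulus length 5; unit vector is [0.6, 0.8]
new_params = super_sgd_step(params, grad)
print(new_params)  # -> [0.94 1.92]
```

Because only the unit vector leaves the local worker, an observer of the shared update cannot recover the gradient's true modulus length, which is the leakage channel the snippet describes.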

Building a Computer Mahjong Player via Deep Convolutional Neural Networks

no code implementations • 5 Jun 2019 • Shiqi Gao, Fuminori Okuya, Yoshihiro Kawahara, Yoshimasa Tsuruoka

The evaluation function for imperfect information games is always hard to define, but it has a significant impact on the playing strength of a program.

Game of Go
