no code implementations • 16 Dec 2023 • Canlin Zhang, Xiuwen Liu
Embedding-based models usually require fine-tuning to obtain embeddings for new entities, and are therefore difficult to apply directly to inductive link prediction tasks.
no code implementations • 12 Nov 2020 • Canlin Zhang, Chun-Nan Hsu, Yannis Katsis, Ho-Cheol Kim, Yoshiki Vazquez-Baeza
Discovering precise and interpretable rules from knowledge graphs is regarded as an essential challenge: such rules can improve the performance of many downstream tasks and even offer new ways to approach certain Natural Language Processing research topics.
1 code implementation • 22 Apr 2020 • Canlin Zhang, Xiuwen Liu
In order to create suitable labels for training the sense spectra, we designed a new similarity measure for noun and verb synsets in WordNet.
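As background for synset similarity in WordNet, the sketch below implements a generic Wu-Palmer-style, path-based similarity over a tiny hand-made noun hierarchy. This is illustrative only: the paper designs its own measure, and the toy taxonomy and function names here are hypothetical.

```python
# Toy Wu-Palmer-style synset similarity on a hand-made hierarchy.
# The taxonomy below is hypothetical; real measures operate on WordNet.
PARENT = {
    "dog": "animal", "cat": "animal",
    "car": "artifact",
    "animal": "entity", "artifact": "entity",
    "entity": None,  # root of the toy taxonomy
}

def path_to_root(synset):
    """Return the chain of hypernyms from synset up to the root."""
    path = [synset]
    while PARENT[synset] is not None:
        synset = PARENT[synset]
        path.append(synset)
    return path  # e.g. ["dog", "animal", "entity"]

def depth(synset):
    """Depth in the taxonomy; the root has depth 1."""
    return len(path_to_root(synset))

def lcs(a, b):
    """Least common subsumer: deepest shared ancestor of a and b."""
    ancestors_a = set(path_to_root(a))
    for node in path_to_root(b):  # walk upward from b
        if node in ancestors_a:
            return node
    return None

def wup_similarity(a, b):
    """Wu-Palmer similarity: 2 * depth(LCS) / (depth(a) + depth(b))."""
    return 2 * depth(lcs(a, b)) / (depth(a) + depth(b))
```

For example, `wup_similarity("dog", "cat")` is 2·2/(3+3) ≈ 0.667, while `wup_similarity("dog", "car")` is 2·1/(3+3) ≈ 0.333, so siblings under a deeper shared ancestor score higher.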
1 code implementation • 18 Mar 2020 • Canlin Zhang, Xiuwen Liu, Daniel Bis
To improve generalization in natural language processing tasks, words are commonly represented as vectors, with distances among the vectors reflecting the similarity of the corresponding words.
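The idea that vector distances track word similarity is commonly operationalized with cosine similarity. A minimal sketch, using tiny hand-made vectors (the values are hypothetical, chosen only for illustration):

```python
import math

# Tiny hand-made word vectors; the values are hypothetical.
VECTORS = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.1],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity: dot product divided by the product of norms."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Related words should score higher than unrelated ones:
# cosine(VECTORS["king"], VECTORS["queen"]) > cosine(VECTORS["king"], VECTORS["apple"])
```

In this toy setup, "king" and "queen" point in nearly the same direction and score close to 1, while "king" and "apple" are nearly orthogonal and score much lower.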
no code implementations • 18 Oct 2019 • Shaeke Salman, Canlin Zhang, Xiuwen Liu, Washington Mio
We show that the generalization intervals of a ReLU network behave similarly along pairwise directions between samples of the same label in both real and random cases on the MNIST and CIFAR-10 datasets.