Search Results for author: Hai-Tao Yu

Found 5 papers, 2 papers with code

A Decoupling and Aggregating Framework for Joint Extraction of Entities and Relations

no code implementations • 14 May 2024 • Yao Wang, Xin Liu, Weikun Kong, Hai-Tao Yu, Teeradaj Racharak, Kyoung-Sook Kim, Minh Le Nguyen

Second, information interaction mainly focuses on the two subtasks, leaving the fine-grained information interaction among the subtask-specific features of encoding subjects, relations, and objects unexplored.

Named Entity Recognition +1

MM-Point: Multi-View Information-Enhanced Multi-Modal Self-Supervised 3D Point Cloud Understanding

1 code implementation • 15 Feb 2024 • Hai-Tao Yu, Mofei Song

In perception, information from multiple senses is integrated to map visual information from 2D views onto 3D objects, which benefits understanding in 3D environments.

3D Part Segmentation 3D Semantic Segmentation +2

Combining Spiking Neural Network and Artificial Neural Network for Enhanced Image Classification

no code implementations • 21 Feb 2021 • Naoya Muramatsu, Hai-Tao Yu

With the continued innovations of deep neural networks, spiking neural networks (SNNs), which more closely resemble biological brain synapses, have attracted attention owing to their low power consumption. However, for continuous data values, they must employ a coding process to convert the values to spike trains. Thus, they have not yet exceeded the performance of artificial neural networks (ANNs), which handle such values directly. To this end, we combine an ANN and an SNN to build versatile hybrid neural networks (HNNs) that improve performance on this front. To quantify this performance, the MNIST and CIFAR-10 image datasets are used for various classification tasks in which the training and coding methods change. In addition, we present simultaneous and separate methods to train the artificial and spiking layers, considering the coding methods of each. We find that increasing the number of artificial layers at the expense of spiking layers improves HNN performance. For straightforward datasets such as MNIST, it is easy to achieve the same performance as ANNs by using duplicate coding and separate learning. However, for more complex tasks, the use of Gaussian coding and simultaneous learning is found to improve the accuracy of HNNs while utilizing a smaller number of artificial layers.
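The Gaussian coding mentioned in the abstract turns a continuous input value into graded responses of several spiking neurons via overlapping Gaussian receptive fields. A minimal sketch of that idea, assuming evenly spaced receptive fields over a known input range (the function name and parameters are illustrative, not taken from the paper):

```python
import math

def gaussian_encode(value, n_neurons=8, v_min=0.0, v_max=1.0, sigma=0.1):
    """Encode a continuous value as the response intensities of n_neurons
    Gaussian receptive fields evenly spaced over [v_min, v_max].
    In an SNN front end, a higher intensity would be translated into
    earlier or denser spikes for that neuron."""
    centers = [v_min + i * (v_max - v_min) / (n_neurons - 1)
               for i in range(n_neurons)]
    return [math.exp(-((value - c) ** 2) / (2 * sigma ** 2)) for c in centers]

# The neuron whose receptive-field center is nearest the input responds most.
intensities = gaussian_encode(0.3)
```

This contrasts with the "duplicate coding" route, where a value is simply repeated, and shows why a population of neurons can represent a scalar more faithfully than a single rate-coded unit.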

General Classification Image Classification

Optimize What You Evaluate With: A Simple Yet Effective Framework For Direct Optimization Of IR Metrics

no code implementations • 31 Aug 2020 • Hai-Tao Yu

To validate the effectiveness of the proposed framework for direct optimization of IR metrics, we conduct a series of experiments on the widely used benchmark collection MSLR-WEB30K.
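Direct optimization targets rank-based IR metrics rather than surrogate losses. As background, a plain-Python computation of one such metric, nDCG, using the common exponential gain and log-position discount (the helper names are illustrative, not the paper's API):

```python
import math

def dcg(relevances):
    """Discounted cumulative gain with gain 2^rel - 1 and log2 discount."""
    return sum((2 ** r - 1) / math.log2(i + 2)
               for i, r in enumerate(relevances))

def ndcg(ranked_relevances):
    """Normalize DCG of the predicted ranking by the ideal (sorted) ranking."""
    ideal = dcg(sorted(ranked_relevances, reverse=True))
    return dcg(ranked_relevances) / ideal if ideal > 0 else 0.0
```

The difficulty such frameworks address is that `ndcg` depends on the sorted order of documents, which is piecewise constant in the scores, so it cannot be differentiated directly for gradient-based training.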

Information Retrieval Learning-To-Rank +1

PT-Ranking: A Benchmarking Platform for Neural Learning-to-Rank

1 code implementation • 31 Aug 2020 • Hai-Tao Yu

We further conducted a series of demo experiments to clearly show the effect of different factors on neural learning-to-rank methods, such as the activation function, the number of layers, and the optimization strategy.

Benchmarking Learning-To-Rank
