Search Results for author: Hiroya Takamura

Found 64 papers, 14 papers with code

Revisiting Statistical Laws of Semantic Shift in Romance Cognates

no code implementations COLING 2022 Yoshifumi Kawasaki, Maëlys Salingre, Marzena Karpinska, Hiroya Takamura, Ryo Nagata

This article revisits statistical relationships across Romance cognates between lexical semantic shift and six intra-linguistic variables, such as frequency and polysemy.

Word Embeddings

Generating Racing Game Commentary from Vision, Language, and Structured Data

no code implementations INLG (ACL) 2021 Tatsuya Ishigaki, Goran Topic, Yumi Hamazono, Hiroshi Noji, Ichiro Kobayashi, Yusuke Miyao, Hiroya Takamura

In this study, we introduce a new large-scale dataset that contains aligned video data, structured numerical data, and transcribed commentaries that consist of 129,226 utterances in 1,389 races in a game.

Abstractive Document Summarization with Word Embedding Reconstruction

no code implementations RANLP 2021 Jingyi You, Chenlong Hu, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura

Neural sequence-to-sequence (Seq2Seq) models and BERT have achieved substantial improvements in abstractive document summarization (ADS) without and with pre-training, respectively.

Document Summarization Word Embeddings

Making Your Tweets More Fancy: Emoji Insertion to Texts

no code implementations RANLP 2021 Jingun Kwon, Naoki Kobayashi, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura

The results demonstrate that the position of emojis in texts is a good clue to boost the performance of emoji label prediction.

Position

Improving Character-Aware Neural Language Model by Warming up Character Encoder under Skip-gram Architecture

no code implementations RANLP 2021 Yukun Feng, Chenlong Hu, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura

Character-aware neural language models can capture the relationship between words by exploiting character-level information and are particularly effective for languages with rich morphology.

Language Modelling

Prompting for Numerical Sequences: A Case Study on Market Comment Generation

1 code implementation 3 Apr 2024 Masayuki Kawarada, Tatsuya Ishigaki, Hiroya Takamura

Large language models (LLMs) have been applied to a wide range of data-to-text generation tasks, including tables, graphs, and time-series numerical data-to-text settings.

Comment Generation Data-to-Text Generation +1

Contextualized Word Vector-based Methods for Discovering Semantic Differences with No Training nor Word Alignment

no code implementations 19 May 2023 Ryo Nagata, Hiroya Takamura, Naoki Otani, Yoshifumi Kawasaki

In this paper, we propose methods for discovering semantic differences in words appearing in two corpora based on the norms of contextualized word vectors.

Word Alignment

FinTech for Social Good: A Research Agenda from NLP Perspective

no code implementations 13 Nov 2022 Chung-Chi Chen, Hiroya Takamura, Hsin-Hsi Chen

Making our research results positively impact society and the environment is one of the goals our community has been pursuing recently.

StoryER: Automatic Story Evaluation via Ranking, Rating and Reasoning

1 code implementation 16 Oct 2022 Hong Chen, Duc Minh Vo, Hiroya Takamura, Yusuke Miyao, Hideki Nakayama

Existing automatic story evaluation methods place a premium on a story's lexical-level coherence, deviating from human preference.

Comment Generation

Towards Parameter-Efficient Integration of Pre-Trained Language Models In Temporal Video Grounding

1 code implementation 26 Sep 2022 Erica K. Shimomoto, Edison Marrese-Taylor, Hiroya Takamura, Ichiro Kobayashi, Hideki Nakayama, Yusuke Miyao

This paper explores the task of Temporal Video Grounding (TVG) where, given an untrimmed video and a natural language sentence query, the goal is to recognize and determine temporal boundaries of action instances in the video described by the query.

Benchmarking Natural Language Queries +2

Aspect-based Analysis of Advertising Appeals for Search Engine Advertising

no code implementations NAACL (ACL) 2022 Soichiro Murakami, Peinan Zhang, Sho Hoshino, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura

Writing an ad text that attracts people and persuades them to click or act is essential for the success of search engine advertising.

LocFormer: Enabling Transformers to Perform Temporal Moment Localization on Long Untrimmed Videos With a Feature Sampling Approach

no code implementations 19 Dec 2021 Cristian Rodriguez-Opazo, Edison Marrese-Taylor, Basura Fernando, Hiroya Takamura, Qi Wu

We propose LocFormer, a Transformer-based model for video grounding which operates at a constant memory footprint regardless of the video length, i.e., the number of frames.

Inductive Bias Video Grounding

SciXGen: A Scientific Paper Dataset for Context-Aware Text Generation

no code implementations Findings (EMNLP) 2021 Hong Chen, Hiroya Takamura, Hideki Nakayama

Generating texts in scientific papers requires not only capturing the content contained within the given input but also frequently acquiring the external information called "context".

Text Generation

Towards Table-to-Text Generation with Numerical Reasoning

1 code implementation ACL 2021 Lya Hulliyyatus Suadaa, Hidetaka Kamigaito, Kotaro Funakoshi, Manabu Okumura, Hiroya Takamura

In summary, our contributions are (1) a new dataset for numerical table-to-text generation using pairs of a table and a paragraph of a table description with richer inference from scientific papers, and (2) a table-to-text generation framework enriched with numerical reasoning.

Descriptive Table-to-Text Generation

An Empirical Study of Generating Texts for Search Engine Advertising

no code implementations NAACL 2021 Hidetaka Kamigaito, Peinan Zhang, Hiroya Takamura, Manabu Okumura

Although there are many studies on neural language generation (NLG), few have been tried in the real world, especially in the advertising domain.

Text Generation

Generating Weather Comments from Meteorological Simulations

1 code implementation EACL 2021 Soichiro Murakami, Sora Tanaka, Masatsugu Hangyo, Hidetaka Kamigaito, Kotaro Funakoshi, Hiroya Takamura, Manabu Okumura

The task of generating weather-forecast comments from meteorological simulations has the following requirements: (i) the changes in numerical values for various physical quantities need to be considered, (ii) the weather comments should be dependent on delivery time and area information, and (iii) the comments should provide useful information for users.

Informativeness

GraphPlan: Story Generation by Planning with Event Graph

no code implementations INLG (ACL) 2021 Hong Chen, Raphael Shu, Hiroya Takamura, Hideki Nakayama

In this paper, we focus on planning a sequence of events assisted by event graphs, and use the events to guide the generator.

Story Generation

Commonsense Knowledge Aware Concept Selection For Diverse and Informative Visual Storytelling

no code implementations 5 Feb 2021 Hong Chen, Yifei Huang, Hiroya Takamura, Hideki Nakayama

To enrich the candidate concepts, a commonsense knowledge graph is created for each image sequence from which the concept candidates are proposed.

Informativeness Visual Storytelling

An empirical analysis of existing systems and datasets toward general simple question answering

1 code implementation COLING 2020 Namgi Han, Goran Topic, Hiroshi Noji, Hiroya Takamura, Yusuke Miyao

Our analysis, including shifting of training and test datasets and training on a union of the datasets, suggests that our progress in solving SimpleQuestions dataset does not indicate the success of more general simple question answering.

Natural Language Understanding Question Answering

Learning with Contrastive Examples for Data-to-Text Generation

1 code implementation COLING 2020 Yui Uehara, Tatsuya Ishigaki, Kasumi Aoki, Hiroshi Noji, Keiichi Goshima, Ichiro Kobayashi, Hiroya Takamura, Yusuke Miyao

Existing models for data-to-text tasks generate fluent but sometimes incorrect sentences, e.g., "Nikkei gains" is generated when "Nikkei drops" is expected.

Comment Generation Data-to-Text Generation

An Analysis of the Utility of Explicit Negative Examples to Improve the Syntactic Abilities of Neural Language Models

1 code implementation ACL 2020 Hiroshi Noji, Hiroya Takamura

Neural language models are commonly trained only on positive examples, a set of sentences in the training data, but recent studies suggest that the models trained in this way are not capable of robustly handling complex syntactic constructions, such as long-distance agreement.

Sentence

A Neural Pipeline Approach for the PharmaCoNER Shared Task using Contextual Exhaustive Models

no code implementations WS 2019 Mohammad Golam Sohrab, Minh Thang Pham, Makoto Miwa, Hiroya Takamura

We present a neural pipeline approach that performs named entity recognition (NER) and concept indexing (CI), which links them to concept unique identifiers (CUIs) in a knowledge base, for the PharmaCoNER shared task on pharmaceutical drugs and chemical entities.

Entity Embeddings named-entity-recognition +3

Discourse-Aware Hierarchical Attention Network for Extractive Single-Document Summarization

no code implementations RANLP 2019 Tatsuya Ishigaki, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura

To incorporate the information of a discourse tree structure into the neural network-based summarizers, we propose a discourse-aware neural extractive summarizer which can explicitly take into account the discourse dependency tree structure of the source document.

Document Summarization Sentence

Numeracy-600K: Learning Numeracy for Detecting Exaggerated Information in Market Comments

1 code implementation ACL 2019 Chung-Chi Chen, Hen-Hsen Huang, Hiroya Takamura, Hsin-Hsi Chen

In this paper, we attempt to answer the question of whether neural network models can learn numeracy, which is the ability to predict the magnitude of a numeral at some specific position in a text description.

Position

Global Optimization under Length Constraint for Neural Text Summarization

no code implementations ACL 2019 Takuya Makino, Tomoya Iwakura, Hiroya Takamura, Manabu Okumura

The experimental results show that a state-of-the-art neural summarization model optimized with GOLC generates fewer overlength summaries while maintaining the fastest processing speed; only 6.70% overlength summaries on CNN/Daily Mail and 7.8% on long summaries of Mainichi, compared to approximately 20% to 50% on CNN/Daily Mail and 10% to 30% on Mainichi with the other optimization methods.

Document Summarization

Generating Market Comments Referring to External Resources

1 code implementation WS 2018 Tatsuya Aoki, Akira Miyazawa, Tatsuya Ishigaki, Keiichi Goshima, Kasumi Aoki, Ichiro Kobayashi, Hiroya Takamura, Yusuke Miyao

Comments on a stock market often include the reason or cause of changes in stock prices, such as "Nikkei turns lower as yen's rise hits exporters."

Text Generation

Semi-supervised Learning with Multi-Domain Sentiment Word Embeddings

no code implementations 27 Sep 2018 Ran Tian, Yash Agrawal, Kento Watanabe, Hiroya Takamura

Word embeddings are known to boost the performance of many NLP tasks such as text classification; meanwhile, they can be enhanced by labels at the document level to capture nuanced meaning such as sentiment and topic.

Domain Adaptation text-classification +2

Exploring the Influence of Spelling Errors on Lexical Variation Measures

no code implementations COLING 2018 Ryo Nagata, Taisei Sato, Hiroya Takamura

This paper introduces and examines the hypothesis that lexical richness measures become unstable in learner English because of spelling errors.

Neural Machine Translation Incorporating Named Entity

no code implementations COLING 2018 Arata Ugawa, Akihiro Tamura, Takashi Ninomiya, Hiroya Takamura, Manabu Okumura

To alleviate these problems, the encoder of the proposed model encodes the input word on the basis of its NE tag at each time step, which could reduce the ambiguity of the input word.

Machine Translation NMT +3

Distinguishing Japanese Non-standard Usages from Standard Ones

no code implementations EMNLP 2017 Tatsuya Aoki, Ryohei Sasano, Hiroya Takamura, Manabu Okumura

Our experimental results show that the model leveraging the context embedding outperforms other methods and provides us with findings, for example, on how to construct context embeddings and which corpus to use.

Machine Translation Word Embeddings

Japanese Sentence Compression with a Large Training Dataset

no code implementations ACL 2017 Shun Hasegawa, Yuta Kikuchi, Hiroya Takamura, Manabu Okumura

In English, high-quality sentence compression models by deleting words have been trained on automatically created large training datasets.

Sentence Sentence Compression

Analyzing Semantic Change in Japanese Loanwords

no code implementations EACL 2017 Hiroya Takamura, Ryo Nagata, Yoshifumi Kawasaki

We analyze semantic changes in loanwords from English that are used in Japanese (Japanese loanwords).

Word Embeddings

Discriminative Analysis of Linguistic Features for Typological Study

no code implementations LREC 2016 Hiroya Takamura, Ryo Nagata, Yoshifumi Kawasaki

We address the task of automatically estimating the missing values of linguistic features by making use of the fact that some linguistic features in typological databases are informative to each other.

Attribute
