Search Results for author: Shih-Lun Wu

Found 10 papers, 6 papers with code

Listener Model for the PhotoBook Referential Game with CLIPScores as Implicit Reference Chain

1 code implementation • 16 Jun 2023 • Shih-Lun Wu, Yi-Hui Chou, Liangze Li

PhotoBook is a collaborative dialogue game where two players receive private, partially-overlapping sets of images and resolve which images they have in common.

Compose & Embellish: Well-Structured Piano Performance Generation via A Two-Stage Approach

1 code implementation • 17 Sep 2022 • Shih-Lun Wu, Yi-Hsuan Yang

Even with strong sequence models like Transformers, generating expressive piano performances with long-range musical structures remains challenging.

Theme Transformer: Symbolic Music Generation with Theme-Conditioned Transformer

1 code implementation • 7 Nov 2021 • Yi-Jen Shih, Shih-Lun Wu, Frank Zalkow, Meinard Müller, Yi-Hsuan Yang

To condition the generation process of such a model with a user-specified sequence, a popular approach is to take that conditioning sequence as a priming sequence and ask a Transformer decoder to generate a continuation.

Music Generation • Representation Learning • Sound • Multimedia • Audio and Speech Processing
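The Theme Transformer entry above describes the standard priming approach: the user-specified conditioning sequence is prepended as a prompt and a Transformer decoder is asked to generate a continuation. Below is a minimal, hypothetical PyTorch sketch of that baseline only; the model size, vocabulary, and sampling settings are placeholders, and the paper's actual method adds explicit theme-based conditioning on top of this.

```python
# Sketch of priming-based continuation with a small decoder-only language
# model over symbolic music tokens. Everything here is illustrative, not the
# Theme Transformer's actual architecture.
import torch
import torch.nn as nn


class TinyMusicLM(nn.Module):
    """A stand-in decoder-only language model over music event tokens."""

    def __init__(self, vocab_size=512, d_model=256, n_layers=4, n_heads=4, max_len=2048):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        # An encoder stack with a causal attention mask behaves as a decoder-only LM.
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        seq_len = tokens.size(1)
        positions = torch.arange(seq_len, device=tokens.device)
        x = self.embed(tokens) + self.pos(positions)
        causal_mask = nn.Transformer.generate_square_subsequent_mask(seq_len).to(tokens.device)
        h = self.blocks(x, mask=causal_mask)
        return self.head(h)  # (batch, seq_len, vocab_size)


@torch.no_grad()
def continue_from_prime(model, prime, steps=64, temperature=1.0):
    """Treat `prime` (the conditioning sequence) as a prompt and sample
    `steps` continuation tokens autoregressively."""
    model.eval()
    tokens = prime.clone()                      # (1, prime_len)
    for _ in range(steps):
        logits = model(tokens)[:, -1, :] / temperature
        next_token = torch.multinomial(torch.softmax(logits, dim=-1), 1)
        tokens = torch.cat([tokens, next_token], dim=1)
    return tokens


model = TinyMusicLM()
theme = torch.randint(0, 512, (1, 32))          # hypothetical theme token ids
generated = continue_from_prime(model, theme)   # prime followed by 64 sampled tokens
```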

MuseMorphose: Full-Song and Fine-Grained Piano Music Style Transfer with One Transformer VAE

1 code implementation • 10 May 2021 • Shih-Lun Wu, Yi-Hsuan Yang

Transformers and variational autoencoders (VAEs) have been extensively employed for symbolic (e.g., MIDI) domain music generation.

Music Generation • Music Style Transfer • +1

Deep Learning for Automatic Quality Grading of Mangoes: Methods and Insights

no code implementations • 23 Nov 2020 • Shih-Lun Wu, Hsiao-Yen Tung, Yu-Lun Hsu

The quality grading of mangoes is a crucial task for mango growers as it vastly affects their profit.

Multi-Task Learning

The Jazz Transformer on the Front Line: Exploring the Shortcomings of AI-composed Music through Quantitative Measures

2 code implementations • 4 Aug 2020 • Shih-Lun Wu, Yi-Hsuan Yang

This paper presents the Jazz Transformer, a generative model that utilizes a neural sequence model called the Transformer-XL for modeling lead sheets of Jazz music.
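Modeling lead sheets with a sequence model such as Transformer-XL presupposes flattening the lead sheet (melody plus chord symbols) into a stream of event tokens. The toy sketch below shows one hypothetical way to do that flattening; the event vocabulary, grid resolution, and data classes are illustrative only, not the Jazz Transformer's actual event encoding.

```python
# Toy illustration: flatten a lead sheet into event tokens that a
# Transformer-XL-style language model could be trained on. The vocabulary
# here is hypothetical, not the paper's encoding.
from dataclasses import dataclass
from typing import List


@dataclass
class Note:
    pitch: int      # MIDI pitch number
    position: int   # onset index within the bar (e.g., 16th-note grid)
    duration: int   # duration in grid steps


@dataclass
class Bar:
    chord: str      # chord symbol, e.g. "Dm7"
    notes: List[Note]


def lead_sheet_to_events(bars: List[Bar]) -> List[str]:
    """Emit one Bar token per bar, then chord, position, pitch, and
    duration events in temporal order."""
    events = []
    for bar in bars:
        events.append("Bar")
        events.append(f"Chord_{bar.chord}")
        for note in sorted(bar.notes, key=lambda n: n.position):
            events.append(f"Position_{note.position}")
            events.append(f"Pitch_{note.pitch}")
            events.append(f"Duration_{note.duration}")
    return events


# Two bars of a made-up ii-V line.
sheet = [
    Bar("Dm7", [Note(62, 0, 4), Note(65, 4, 4), Note(69, 8, 8)]),
    Bar("G7",  [Note(71, 0, 8), Note(67, 8, 8)]),
]
print(lead_sheet_to_events(sheet))
```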
