Recipe Generation

10 papers with code • 5 benchmarks • 3 datasets


Latest papers with no code

Recipe Generation from Unsegmented Cooking Videos

no code yet • 21 Sep 2022

However, unlike DVC, recipe generation requires awareness of the recipe's story: a model should extract an appropriate number of events in the correct order and generate accurate sentences based on them, as sketched below.
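A minimal sketch of that two-stage idea (not the paper's model): first propose an ordered set of event segments from unsegmented video features, then caption each segment. All module names, dimensions, and the boundary threshold are illustrative assumptions.

```python
# Sketch only: a boundary scorer splits an unsegmented video into ordered
# events; a captioning decoder (omitted) would then generate one sentence each.
import torch
import torch.nn as nn

class EventProposer(nn.Module):
    """Scores each frame as an event boundary; spans between boundaries are events."""
    def __init__(self, feat_dim=512, hidden=256):
        super().__init__()
        self.rnn = nn.GRU(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.boundary = nn.Linear(2 * hidden, 1)

    def forward(self, frames):                              # frames: (1, T, feat_dim)
        h, _ = self.rnn(frames)                             # (1, T, 2*hidden)
        return torch.sigmoid(self.boundary(h)).squeeze(-1)  # (1, T) boundary probs

def segments_from_boundaries(probs, thresh=0.5):
    """Split the frame axis at predicted boundaries, preserving temporal order."""
    cuts = (probs[0] > thresh).nonzero().flatten().tolist()
    edges = [0] + cuts + [probs.shape[1]]
    return [(s, e) for s, e in zip(edges, edges[1:]) if e > s]

frames = torch.randn(1, 120, 512)   # dummy features for 120 frames
proposer = EventProposer()
events = segments_from_boundaries(proposer(frames))
print(events)  # ordered (start, end) spans, each to be captioned by a decoder
```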

Ratatouille: A tool for Novel Recipe Generation

no code yet • 10 May 2022

Due to the availability of a large number of cooking recipes online, there is growing interest in using them as data to create novel recipes.

Learning Structural Representations for Recipe Generation and Food Retrieval

no code yet • 4 Oct 2021

Our approach brings together several novel ideas in a systematic framework: (1) exploiting an unsupervised learning approach to obtain sentence-level tree structure labels before training; (2) generating trees of target recipes from images, supervised by the tree structure labels learned in (1); and (3) integrating the learned tree structures into the recipe generation and cross-modal food retrieval procedure.
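One hedged way to picture step (3): a predicted recipe tree can be linearized into bracketed tokens and fed to the text decoder as extra conditioning alongside the image features. This is a loose illustration under assumed shapes, not the paper's architecture.

```python
# Sketch: linearize a recipe tree and prepend a pooled image feature, forming
# the memory a decoder would attend over. Vocabulary and tree are toy examples.
import torch
import torch.nn as nn

def linearize(tree):
    """('mix', [('flour', []), ('eggs', [])]) -> ['(', 'mix', '(', 'flour', ')', ...]"""
    label, children = tree
    out = ['(', label]
    for child in children:
        out += linearize(child)
    return out + [')']

vocab = {tok: i for i, tok in enumerate(['(', ')', 'mix', 'flour', 'eggs'])}
tree = ('mix', [('flour', []), ('eggs', [])])
tree_ids = torch.tensor([[vocab[t] for t in linearize(tree)]])  # (1, L)

embed = nn.Embedding(len(vocab), 64)
img_feat = torch.randn(1, 1, 64)                                # pooled image feature
# Conditioning sequence = image feature followed by embedded tree tokens;
# the generation decoder attends over this instead of the image alone.
memory = torch.cat([img_feat, embed(tree_ids)], dim=1)          # (1, 1+L, 64)
print(memory.shape)
```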

Building Hierarchically Disentangled Language Models for Text Generation with Named Entities

no code yet • COLING 2020

Named entities pose a unique challenge to traditional methods of language modeling.

Decomposing Generation Networks with Structure Prediction for Recipe Generation

no code yet • 27 Jul 2020

Recipe generation from food images and ingredients is a challenging task, which requires interpreting information from another modality.

Set to Ordered Text: Generating Discharge Instructions from Medical Billing Codes

no code yet • IJCNLP 2019

This task differs from other natural language generation tasks in the following ways: (1) The input is a set of identifiable entities (ICD codes) where the relations between individual entities are not explicitly specified.
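Because the input is a set rather than a sequence, the encoder should not depend on the order the codes arrive in. A minimal sketch of one standard option (attention pooling over embedded codes; the architecture and dimensions are assumptions, not the paper's model):

```python
# Sketch: ICD codes are embedded and pooled with a permutation-invariant
# attention operation, so shuffling the input set leaves the encoding unchanged.
import torch
import torch.nn as nn

class SetEncoder(nn.Module):
    def __init__(self, n_codes, dim=64):
        super().__init__()
        self.embed = nn.Embedding(n_codes, dim)
        self.query = nn.Parameter(torch.randn(1, 1, dim))  # learned pooling query
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)

    def forward(self, codes):                  # codes: (batch, set_size) ids
        x = self.embed(codes)                  # (batch, set_size, dim)
        pooled, _ = self.attn(self.query.expand(codes.size(0), -1, -1), x, x)
        return pooled                          # (batch, 1, dim), order-invariant

enc = SetEncoder(n_codes=100)
codes = torch.tensor([[3, 17, 42]])
shuffled = torch.tensor([[42, 3, 17]])
print(torch.allclose(enc(codes), enc(shuffled), atol=1e-6))  # True: same encoding
```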

Reference-Aware Language Models

no code yet • EMNLP 2017

We propose a general class of language models that treat reference as an explicit stochastic latent variable.
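A hedged sketch of the general idea (not the paper's exact parameterization): a binary latent variable z decides per step whether the next token is generated from the vocabulary or copied from a reference list, and the token's marginal probability sums over both choices. All weights and names below are illustrative.

```python
# Sketch: log p(token) = log sum_z p(z) p(token | z), z in {generate, refer}.
import torch
import torch.nn.functional as F

def token_log_prob(hidden, W_vocab, ref_embs, ref_ids, w_gate, token_id, vocab_size):
    p_refer = torch.sigmoid(hidden @ w_gate)         # p(z = refer), from decoder state
    gen = F.softmax(hidden @ W_vocab, dim=-1)        # p(token | z = generate)
    ref_attn = F.softmax(ref_embs @ hidden, dim=-1)  # attention over reference entities
    # scatter reference attention into vocabulary space: p(token | z = refer)
    copy = torch.zeros(vocab_size).scatter_add(0, ref_ids, ref_attn)
    marginal = (1 - p_refer) * gen[token_id] + p_refer * copy[token_id]
    return torch.log(marginal + 1e-12)

hidden = torch.randn(32)                 # decoder hidden state
W_vocab = torch.randn(32, 1000)          # output projection to a 1000-word vocab
ref_embs = torch.randn(5, 32)            # embeddings of 5 referable entities
ref_ids = torch.randint(0, 1000, (5,))   # their vocabulary ids
w_gate = torch.randn(32)
print(token_log_prob(hidden, W_vocab, ref_embs, ref_ids, w_gate, token_id=7, vocab_size=1000))
```

Marginalizing over z rather than hard-selecting keeps the objective differentiable, which is the usual motivation for treating the reference decision as a stochastic latent variable.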