1 code implementation • 3 Aug 2023 • Giovanni Bonetta, Davide Zago, Rossella Cancelliere, Andrea Grosso
We applied and tested our method on benchmark instances of the Job Shop Problem, but the technique is general enough to tackle other optimal job scheduling tasks with minimal intervention.
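To make the setting concrete, here is a minimal, self-contained sketch of a Job Shop instance and a naive greedy dispatching schedule. Both the toy instance and the scheduling rule are illustrative assumptions, not the method from the paper.

```python
# Hypothetical toy Job Shop instance: each job is an ordered list of
# (machine, duration) operations that must run in sequence.
jobs = [
    [(0, 3), (1, 2), (2, 2)],
    [(0, 2), (2, 1), (1, 4)],
    [(1, 4), (2, 3)],
]

def greedy_makespan(jobs):
    """Schedule jobs one after another in a fixed order, respecting
    machine availability and within-job operation precedence.
    Returns the makespan (completion time of the last operation)."""
    machine_free = {}            # machine id -> time it becomes free
    job_free = [0] * len(jobs)   # job id -> time its last op finished
    for j, ops in enumerate(jobs):
        for machine, dur in ops:
            start = max(job_free[j], machine_free.get(machine, 0))
            end = start + dur
            job_free[j] = end
            machine_free[machine] = end
    return max(job_free)

print(greedy_makespan(jobs))  # makespan of this naive schedule: 19
```

Optimal solvers (or learned policies, as in the paper) aim to beat such naive dispatching rules by interleaving operations across jobs.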
1 code implementation • 11 Apr 2022 • Giovanni Bonetta, Matteo Ribero, Rossella Cancelliere
Deep neural networks with millions of parameters are nowadays the norm in deep learning applications.
no code implementations • 19 May 2021 • Giovanni Bonetta, Rossella Cancelliere, Ding Liu, Paul Vozila
Transformer-based models have demonstrated excellent capabilities at capturing patterns and structures in natural language generation, achieving state-of-the-art results in many tasks.
1 code implementation • 4 Feb 2021 • Clément Rebuffel, Marco Roberti, Laure Soulier, Geoffrey Scoutheeten, Rossella Cancelliere, Patrick Gallinari
Specifically, we propose a Multi-Branch Decoder which is able to leverage word-level labels to learn the relevant parts of each training instance.
Ranked #3 on Table-to-Text Generation on WikiBio
1 code implementation • 26 Apr 2019 • Marco Roberti, Giovanni Bonetta, Rossella Cancelliere, Patrick Gallinari
In the last few years, many methods have focused on using deep recurrent neural networks for natural language generation.
Ranked #2 on Data-to-Text Generation on E2E NLG Challenge
no code implementations • 25 Aug 2015 • Rossella Cancelliere, Mario Gai, Patrick Gallinari, Luca Rubini
In this paper we consider the training of single hidden layer neural networks by pseudoinversion, which, in spite of its popularity, is sometimes affected by numerical instability issues.
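As a minimal sketch of the general idea (in the style of extreme learning machines, with fixed random input weights): the output weights of a single hidden layer network can be obtained in closed form via the Moore-Penrose pseudoinverse of the hidden-activation matrix. The toy data and layer sizes below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: learn y = sin(x) on [-3, 3] (illustrative only)
X = np.linspace(-3, 3, 200).reshape(-1, 1)
T = np.sin(X)

# Single hidden layer with random, fixed input weights and biases
n_hidden = 50
W_in = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W_in + b)          # hidden-layer activation matrix

# Output weights in closed form: W_out = H^+ T (pseudoinversion step)
W_out = np.linalg.pinv(H) @ T

# Fit quality of the resulting network
Y = H @ W_out
mse = float(np.mean((Y - T) ** 2))
```

The numerical instability the paper refers to arises in computing the pseudoinverse when `H` is ill-conditioned; regularized variants (e.g. ridge-style solutions) are a common remedy.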