We propose a new methodology to assign and learn embeddings for numbers.
It has been shown that training multi-task models with auxiliary tasks can improve performance on the target task through cross-task transfer.
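To make this setup concrete, below is a minimal sketch of joint training with an auxiliary task, assuming PyTorch; the shared encoder, the two heads, the layer sizes, and the 0.3 auxiliary-loss weight are all illustrative placeholders, not a specific published architecture.

```python
import torch
import torch.nn as nn

class MultiTaskModel(nn.Module):
    """Shared encoder feeding one head per task (all sizes illustrative)."""
    def __init__(self, vocab_size=1000, hidden=64, target_classes=3, aux_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.target_head = nn.Linear(hidden, target_classes)  # main task
        self.aux_head = nn.Linear(hidden, aux_classes)        # auxiliary task

    def forward(self, tokens):
        _, h = self.encoder(self.embed(tokens))  # h: (1, batch, hidden)
        h = h.squeeze(0)
        return self.target_head(h), self.aux_head(h)

model = MultiTaskModel()
loss_fn = nn.CrossEntropyLoss()
tokens = torch.randint(0, 1000, (8, 12))   # toy batch of token ids
target_y = torch.randint(0, 3, (8,))
aux_y = torch.randint(0, 2, (8,))
target_logits, aux_logits = model(tokens)
# Joint objective: the auxiliary loss is down-weighted so the target task
# dominates; gradients through the shared encoder carry the transfer.
loss = loss_fn(target_logits, target_y) + 0.3 * loss_fn(aux_logits, aux_y)
loss.backward()
```

The transfer happens through the shared encoder: gradients from the auxiliary head shape the same representation the target head reads from.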
Numbers, like other word tokens, are essential components of the text on which natural language processing (NLP) models are trained and deployed.
Attention-based models have shown significant improvements over traditional algorithms on several NLP tasks.
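The mechanism these models share can be illustrated with scaled dot-product attention, softmax(QK^T / sqrt(d_k))V; the following is a minimal NumPy sketch with toy shapes, not the implementation of any particular model.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, computed row-wise over queries."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise query-key similarities
    # Numerically stable row-wise softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V               # each output is a weighted sum of values

# Toy example: 4 query positions attending over 6 key/value positions.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(6, 8))
V = rng.normal(size=(6, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```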
Vector representations of sentences, trained on massive text corpora, are widely used as generic sentence embeddings across a variety of NLP problems.
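As a simple illustration of how such embeddings are consumed downstream, here is a hedged sketch that mean-pools token vectors into a fixed-size sentence vector and compares sentences by cosine similarity; the pooling choice and the random stand-in vectors are assumptions for illustration, not the method of any particular cited system.

```python
import numpy as np

def sentence_embedding(token_vectors):
    """Mean-pool per-token vectors into one fixed-size sentence vector."""
    return np.mean(token_vectors, axis=0)

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Random vectors stand in for pretrained token embeddings.
rng = np.random.default_rng(1)
sent_a = rng.normal(size=(5, 16))  # 5 tokens, 16-dim embeddings
sent_b = rng.normal(size=(7, 16))  # 7 tokens, same embedding width
print(cosine(sentence_embedding(sent_a), sentence_embedding(sent_b)))
```

Because the pooled vector has a fixed size regardless of sentence length, the same representation can feed classifiers, retrieval indexes, or similarity scoring across tasks.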