Don't Settle for Average, Go for the Max: Fuzzy Sets and Max-Pooled Word Vectors

Recent literature suggests that averaged word vectors followed by simple post-processing outperform many deep learning methods on semantic textual similarity tasks. Furthermore, when averaged word vectors are trained in a supervised manner on large corpora of paraphrases, they achieve state-of-the-art results on standard STS benchmarks...
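
To make the two pooling strategies contrasted in the title concrete, the sketch below builds a sentence representation both by element-wise averaging and by element-wise max over a matrix of word vectors. It is a minimal illustration, not the paper's implementation: the random word vectors, the 300-dimensional size, and the variable names are assumptions for demonstration only.

    import numpy as np

    # Hypothetical token embeddings for one sentence: (num_tokens, dim).
    # In practice these would come from pretrained embeddings (e.g. GloVe, fastText).
    word_vectors = np.random.randn(7, 300)

    # Averaged sentence vector: the common baseline discussed in the abstract.
    avg_sentence = word_vectors.mean(axis=0)

    # Max-pooled sentence vector: element-wise maximum over tokens,
    # the alternative highlighted in the paper's title.
    max_sentence = word_vectors.max(axis=0)

    print(avg_sentence.shape, max_sentence.shape)  # (300,) (300,)

Either vector can then be compared across sentences with cosine similarity, which is the usual scoring step on STS benchmarks.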
