15 Jun 2021 • Zhaozhuo Xu, Minghao Yan, Junyan Zhang, Anshumali Shrivastava
The dot-product self-attention in the Transformer allows us to model interactions between words.
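As a minimal sketch of the mechanism this entry refers to, (scaled) dot-product attention can be written in a few lines of NumPy; the function names, dimensions, and random inputs below are illustrative and not taken from the paper:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dot_product_attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)      # pairwise word-word interaction scores
    weights = softmax(scores, axis=-1) # each row sums to 1
    return weights @ V

rng = np.random.default_rng(0)
n, d = 4, 8  # 4 words, 8-dimensional embeddings (arbitrary sizes)
Q = rng.standard_normal((n, d))
K = rng.standard_normal((n, d))
V = rng.standard_normal((n, d))
out = dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

The quadratic cost of the `Q @ K.T` score matrix over all word pairs is what the paper's title alludes to when it discusses the attention mechanism's scalability.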
16 Feb 2021 • Junyan Zhang
In this paper, we give a new proof of the local well-posedness by combining the classical energy method with a hyperbolic approach, and we also establish the incompressible limit.
Analysis of PDEs