Recurrent Neural Networks

The Legendre Memory Unit (LMU) is mathematically derived to orthogonalize its continuous-time history. It does this by solving d coupled ordinary differential equations (ODEs) whose phase space maps linearly onto sliding windows of time via the Legendre polynomials up to degree d - 1, making it optimal for compressing a sliding window of temporal information into d state variables.

The memory state m(t) ∈ R^d evolves according to the linear time-invariant system

    θ m'(t) = A m(t) + B u(t)

where the matrices are given in closed form by the derivation:

    A[i, j] = (2i + 1) * (-1 if i < j else (-1)^(i - j + 1)),   i, j = 0, ..., d - 1
    B[i] = (2i + 1) * (-1)^i

and the sliding window is recovered from the state via the Legendre polynomials P_i:

    u(t - θ') ≈ Σ_{i=0}^{d-1} P_i(2θ'/θ - 1) m_i(t),   0 ≤ θ' ≤ θ.
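The state-space system from the paper can be sketched in a few lines of NumPy. This is a minimal illustration, not the official implementation: the helper names are hypothetical, and it uses a simple forward-Euler discretization (the paper and its released code use zero-order hold), so it is only accurate for small dt/θ.

```python
# Sketch of the LMU's linear memory (hypothetical helper names; forward-Euler
# discretization chosen for simplicity -- the official code uses zero-order hold).
import numpy as np
from numpy.polynomial.legendre import legval


def lmu_matrices(d):
    """Closed-form (A, B) from the LMU derivation."""
    A = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            A[i, j] = (2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))
    B = np.array([(2 * i + 1) * (-1.0) ** i for i in range(d)])
    return A, B


def lmu_step(m, u, A, B, dt, theta):
    """One Euler step of theta * m'(t) = A m(t) + B u(t)."""
    return m + (dt / theta) * (A @ m + B * u)


def lmu_decode(m, r):
    """Reconstruct u(t - r*theta) ~= sum_i P_i(2r - 1) * m_i(t), 0 <= r <= 1."""
    return legval(2.0 * r - 1.0, m)


if __name__ == "__main__":
    d, theta, dt = 6, 1.0, 1e-3
    A, B = lmu_matrices(d)
    m = np.zeros(d)
    for _ in range(10000):           # feed a constant input for 10*theta seconds
        m = lmu_step(m, 1.0, A, B, dt, theta)
    # once the state settles, the decoded window should be ~1 everywhere
    print([round(float(lmu_decode(m, r)), 3) for r in (0.0, 0.5, 1.0)])
```

A useful sanity check on the matrices: for a constant input u = 1, the steady state A m + B = 0 gives m = e_0 (only the degree-0 coefficient is nonzero), so the decoded window is exactly 1 at every delay θ'.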

Official GitHub repo:

Source: Voelker, Kajić, and Eliasmith, "Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks", NeurIPS 2019.

