Pointer Networks tackle problems where both the input and the output are sequences, but which cannot be solved by standard seq2seq models because the number of output categories depends on the variable input length and is therefore not fixed in advance.
A Pointer Network learns the conditional probability of an output sequence whose elements are discrete tokens corresponding to positions in the input sequence. Pointer Networks solve the problem of variable-size output dictionaries using additive attention: instead of using attention to blend the encoder's hidden states into a context vector at each decoder step, they use the attention distribution itself as a pointer that selects a member of the input sequence as the output.
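The pointing step above can be sketched in plain Python. This is a minimal illustration, not the paper's implementation: the weight matrices `W1`, `W2` and the vector `v` below are random stand-ins for learned parameters, and the dimensions are hypothetical.

```python
import math
import random

def matvec(M, x):
    """Multiply matrix M (a list of rows) by vector x."""
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

def pointer_attention(encoder_states, decoder_state, W1, W2, v):
    """Additive attention used as a pointer.

    Scores u_i = v . tanh(W1 @ e_i + W2 @ d) are softmax-normalised
    over input positions, so the result is a probability distribution
    over the inputs themselves rather than a blended context vector.
    """
    wd = matvec(W2, decoder_state)
    scores = []
    for e in encoder_states:
        hidden = [math.tanh(a + b) for a, b in zip(matvec(W1, e), wd)]
        scores.append(sum(vi * hi for vi, hi in zip(v, hidden)))
    m = max(scores)                              # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [x / z for x in exps]

# Toy usage with random "learned" parameters (hypothetical sizes).
random.seed(0)
n, d_enc, d_att = 5, 4, 6
rand_mat = lambda r, c: [[random.gauss(0, 1) for _ in range(c)] for _ in range(r)]
enc = [[random.gauss(0, 1) for _ in range(d_enc)] for _ in range(n)]  # encoder states
dec = [random.gauss(0, 1) for _ in range(d_enc)]                      # decoder state
W1, W2 = rand_mat(d_att, d_enc), rand_mat(d_att, d_enc)
v = [random.gauss(0, 1) for _ in range(d_att)]

probs = pointer_attention(enc, dec, W1, W2, v)
pointed = max(range(n), key=lambda i: probs[i])  # index of the selected input element
```

Because the softmax runs over the `n` input positions, the output vocabulary automatically grows and shrinks with the input length, which is exactly what fixed-dictionary seq2seq models cannot do.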
PointerNets can be used to learn approximate solutions to challenging geometric problems such as finding planar convex hulls, computing Delaunay triangulations, and the planar Travelling Salesman Problem.
Source: Pointer Networks

| Task | Papers | Share |
|---|---|---|
| Combinatorial Optimization | 8 | 8.33% |
| Language Modelling | 5 | 5.21% |
| Text Generation | 5 | 5.21% |
| Question Answering | 4 | 4.17% |
| Dialogue State Tracking | 4 | 4.17% |
| Abstractive Text Summarization | 4 | 4.17% |
| Document Summarization | 4 | 4.17% |
| Knowledge Graphs | 3 | 3.13% |
| Starcraft II | 3 | 3.13% |
| Component | Type |
|---|---|
| Additive Attention | Attention Mechanisms |
| LSTM | Recurrent Neural Networks |
| Softmax | Output Functions |