A Convolutional Gated Recurrent Unit (ConvGRU) is a type of GRU that combines the GRU update with the convolution operation. The update rule for input $x_{t}$ and the previous output $h_{t-1}$ is given by the following:
$$ r = \sigma\left(W_{r} \star_{n}\left[h_{t-1};x_{t}\right] + b_{r}\right) $$
$$ u = \sigma\left(W_{u} \star_{n}\left[h_{t-1};x_{t}\right] + b_{u} \right) $$
$$ c = \rho\left(W_{c} \star_{n}\left[x_{t}; r \odot h_{t-1}\right] + b_{c} \right) $$
$$ h_{t} = u \odot h_{t-1} + \left(1-u\right) \odot c $$
In these equations, $\sigma$ and $\rho$ are the element-wise sigmoid and ReLU functions respectively, and $\star_{n}$ denotes a convolution with a kernel of size $n \times n$. Brackets denote feature concatenation.
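The update above can be sketched in plain NumPy. This is a minimal illustration, not the paper's implementation: it assumes a single-channel input and hidden state, a naive 3×3 "same"-padded convolution, and random weights; the function names (`conv2d_same`, `convgru_step`) and the parameter dictionary layout are hypothetical. It uses the fact that convolving the concatenation $[h; x]$ with one kernel is equivalent to convolving $h$ and $x$ separately with the corresponding kernel slices and summing the results.

```python
import numpy as np

def conv2d_same(x, w):
    """Naive single-channel 2D cross-correlation with 'same' padding (odd kernel)."""
    n = w.shape[0]
    p = n // 2
    xp = np.pad(x, p)
    out = np.empty_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + n, j:j + n] * w)
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def convgru_step(x_t, h_prev, p):
    """One ConvGRU update following the equations above.
    Each gate's convolution over [h; x] is computed as the sum of two
    convolutions with the kernel slices acting on h and x separately."""
    r = sigmoid(conv2d_same(h_prev, p["Wr_h"]) + conv2d_same(x_t, p["Wr_x"]) + p["br"])
    u = sigmoid(conv2d_same(h_prev, p["Wu_h"]) + conv2d_same(x_t, p["Wu_x"]) + p["bu"])
    # candidate state: ReLU of a convolution over [x; r * h]
    c = np.maximum(0.0, conv2d_same(x_t, p["Wc_x"]) + conv2d_same(r * h_prev, p["Wc_h"]) + p["bc"])
    return u * h_prev + (1.0 - u) * c

# toy parameters: random 3x3 kernels, zero biases (illustrative only)
rng = np.random.default_rng(0)
n = 3
params = {k: 0.1 * rng.standard_normal((n, n))
          for k in ["Wr_h", "Wr_x", "Wu_h", "Wu_x", "Wc_h", "Wc_x"]}
params.update(br=0.0, bu=0.0, bc=0.0)

x_t = rng.standard_normal((8, 8))
h_prev = np.zeros((8, 8))
h_t = convgru_step(x_t, h_prev, params)
print(h_t.shape)  # (8, 8): the hidden state keeps the spatial dimensions
```

Because every transformation is a convolution, the hidden state retains the spatial layout of the input, which is what makes this cell suitable for video and other spatio-temporal data.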
Source: Delving Deeper into Convolutional Networks for Learning Video Representations

| Task | Papers | Share |
| --- | --- | --- |
| 3D Character Animation From A Single Photo | 1 | 14.29% |
| Video Generation | 1 | 14.29% |
| Video Prediction | 1 | 14.29% |
| Action Recognition | 1 | 14.29% |
| Decoder | 1 | 14.29% |
| Temporal Action Localization | 1 | 14.29% |
| Video Captioning | 1 | 14.29% |
| Component | Type |
| --- | --- |
| Convolution | Convolutions |
| GRU | Recurrent Neural Networks |
| ReLU | Activation Functions |
| Sigmoid Activation | Activation Functions |