Cycle Consistency Loss is a loss used in generative adversarial networks for unpaired image-to-image translation. It was introduced with the CycleGAN architecture. For two domains $X$ and $Y$, we want to learn mappings $G : X \rightarrow Y$ and $F : Y \rightarrow X$. We want to enforce the intuition that these mappings should be inverses of each other and that both mappings should be bijections. Cycle Consistency Loss encourages $F\left(G\left(x\right)\right) \approx x$ and $G\left(F\left(y\right)\right) \approx y$. It reduces the space of possible mapping functions by enforcing forward and backward consistency:
$$ \mathcal{L}_{cyc}\left(G, F\right) = \mathbb{E}_{x \sim p_{data}\left(x\right)}\left[\left\|F\left(G\left(x\right)\right) - x\right\|_{1}\right] + \mathbb{E}_{y \sim p_{data}\left(y\right)}\left[\left\|G\left(F\left(y\right)\right) - y\right\|_{1}\right] $$
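The loss above can be sketched in a few lines of NumPy. This is a minimal illustration, not the CycleGAN implementation: the toy mappings `G` and `F` below are hypothetical stand-ins (simple linear maps) for the deep convolutional generators used in the paper, and the $\ell_1$ norm is taken as a mean over elements.

```python
import numpy as np

# Toy stand-ins for the learned generators (assumption: linear maps,
# chosen so that F is the exact inverse of G for demonstration).
def G(x):
    # forward mapping X -> Y
    return 2.0 * x

def F(y):
    # backward mapping Y -> X
    return 0.5 * y

def cycle_consistency_loss(x_batch, y_batch):
    """L_cyc = E[||F(G(x)) - x||_1] + E[||G(F(y)) - y||_1],
    with the expectation approximated by a mean over the batch."""
    forward_term = np.mean(np.abs(F(G(x_batch)) - x_batch))
    backward_term = np.mean(np.abs(G(F(y_batch)) - y_batch))
    return forward_term + backward_term

x = np.array([1.0, -2.0, 3.0])
y = np.array([0.5, 4.0])
loss = cycle_consistency_loss(x, y)
print(loss)  # 0.0 here, because F inverts G exactly in this toy setup
```

In training, this term is weighted by a hyperparameter and added to the adversarial losses of both generators; a nonzero value penalizes mappings whose round trip fails to reconstruct the input.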
Source: Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks

| Task | Papers | Share |
| --- | --- | --- |
| Image-to-Image Translation | 57 | 15.36% |
| Domain Adaptation | 28 | 7.55% |
| Image Generation | 26 | 7.01% |
| Semantic Segmentation | 17 | 4.58% |
| Style Transfer | 14 | 3.77% |
| Unsupervised Domain Adaptation | 11 | 2.96% |
| Voice Conversion | 11 | 2.96% |
| Unsupervised Image-To-Image Translation | 9 | 2.43% |
| Super-Resolution | 9 | 2.43% |