Optimization

Two Time-scale Update Rule

Introduced by Heusel et al. in GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium

The Two Time-scale Update Rule (TTUR) is an update rule for generative adversarial networks trained with stochastic gradient descent. TTUR uses a separate learning rate for the discriminator and for the generator, with the discriminator updated on the faster time scale. The main premise is that the discriminator converges to a local minimum when the generator is fixed. If the generator changes slowly enough, the discriminator still converges, since the perturbations caused by the generator are small. Besides ensuring convergence, performance may also improve, since the discriminator must first learn new patterns before they are transferred to the generator. In contrast, an overly fast generator drives the discriminator steadily into new regions without capturing its gathered information.
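The two-time-scale idea can be illustrated on a toy min-max game (the game, the parameter values, and the function `ttur_train` below are hypothetical choices for illustration, not from the paper): the "discriminator" scalar `d` maximizes V(d, g) = 2·d·g − d², the "generator" scalar `g` minimizes the same V, and the unique equilibrium is (0, 0). With a fast learning rate for `d` and a slow one for `g`, `d` quickly tracks its optimum d* = g while `g` drifts slowly toward the equilibrium:

```python
def ttur_train(steps=1000, lr_d=0.4, lr_g=0.01):
    """Simultaneous gradient updates with two time scales (TTUR):
    a fast learning rate for the discriminator d, a slow one for
    the generator g, on the toy game V(d, g) = 2*d*g - d**2."""
    d, g = 1.0, 1.0
    for _ in range(steps):
        grad_d = 2 * g - 2 * d   # dV/dd (ascent: d maximizes V)
        grad_g = 2 * d           # dV/dg (descent: g minimizes V)
        d, g = d + lr_d * grad_d, g - lr_g * grad_g
    return d, g

d, g = ttur_train()
print(d, g)  # both converge toward the equilibrium (0, 0)
```

Because `lr_d` is much larger than `lr_g`, the inner player effectively re-solves its own optimization between each small generator step, which is the convergence mechanism TTUR relies on; with the rates reversed, the same game can oscillate or diverge.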

Source: GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium
