weight_norm_and_equal_lr

class torchelie.nn.utils.weight_norm_and_equal_lr(m: T_Module, leak: float = 0.0, mode: str = 'fan_in', init_gain: float = 1.0, lr_gain: float = 1.0, name: str = 'weight')

Apply weight normalization and an equalized learning rate to module m, in the manner of the demodulated convolutions of StyleGAN2.

The weight matrix is initialized for a leaky ReLU nonlinearity of negative slope leak. An extra multiplicative gain can be specified with init_gain, and lr_gain sets a differential learning rate multiplier for the parameter.

See the StyleGAN2 paper (Karras et al., 2020) for more details.
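Equalized learning rate stores the weight at unit scale and applies the He initialization constant at runtime instead of at initialization, so that Adam-style optimizers see the same effective step size for every layer. A minimal sketch of the runtime scale factor (this helper is illustrative, not part of torchelie's API; the exact factorization inside the library may differ):

```python
import math

def equal_lr_scale(fan_in: int, leak: float = 0.0,
                   init_gain: float = 1.0, lr_gain: float = 1.0) -> float:
    """Runtime multiplier for an equalized-LR weight (illustrative sketch).

    The stored weight is kept roughly N(0, 1); multiplying it by this
    constant at forward time reproduces He/Kaiming init for a leaky ReLU
    of negative slope `leak`, while `lr_gain` rescales the effective
    learning rate of the parameter.
    """
    # Kaiming gain for leaky ReLU: sqrt(2 / (1 + leak^2))
    relu_gain = math.sqrt(2.0 / (1.0 + leak ** 2))
    return init_gain * lr_gain * relu_gain / math.sqrt(fan_in)

# A plain ReLU layer (leak=0) with 512 input features:
scale = equal_lr_scale(512)  # sqrt(2/512) = 0.0625
```

In practice you would simply wrap a module, e.g. `weight_norm_and_equal_lr(nn.Conv2d(3, 64, 3), leak=0.2)`, and let the hook handle the scaling on every forward pass.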