SelfAttention2d

class torchelie.nn.SelfAttention2d(ch: int, num_heads: int = 1, out_ch: Optional[int] = None, channels_per_head: Optional[int] = None, shape: Optional[Tuple[int, int]] = None, checkpoint: bool = True)

Self-attention layer as used in SAGAN or BigGAN.

Parameters

ch (int) – number of input channels; also the number of output channels unless out_ch is specified

forward(x: torch.Tensor) → torch.Tensor
training: bool
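
Example (a minimal usage sketch; assumes torchelie is installed, and the argument values below are illustrative rather than defaults):

    import torch
    import torchelie.nn as tnn

    # Self-attention block for 64-channel feature maps with 4 heads.
    # Only `ch` is required; num_heads=4 is an illustrative choice.
    attn = tnn.SelfAttention2d(ch=64, num_heads=4)

    x = torch.randn(8, 64, 32, 32)  # (batch, channels, height, width)
    y = attn(x)                      # expected: same shape as x, since out_ch is not set
    print(y.shape)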