ClassificationHead

class torchelie.models.ClassificationHead(in_channels: int, num_classes: int)

A one-layer classification head that turns activations / features into class log probabilities.

It initially uses an avgpool-flatten-linear architecture.

Parameters
  • in_channels (int) – the number of features in the last layer of the feature extractor

  • num_classes (int) – the number of output classes
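
A minimal usage sketch; the feature-map shapes and the forward behavior shown here are assumptions based on the default avgpool-flatten-linear head described above:

    import torch
    from torchelie.models import ClassificationHead

    # Head for a backbone that outputs 512-channel feature maps, 10 target classes.
    head = ClassificationHead(in_channels=512, num_classes=10)

    # Fake NCHW feature maps, e.g. from the last stage of a ResNet-style backbone.
    features = torch.randn(4, 512, 7, 7)
    logits = head(features)  # expected shape: (4, 10)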

leaky() → torchelie.models.classifier.ClassificationHead

Make ReLUs leaky.

remove_pool(spatial_size: int) → torchelie.models.classifier.ClassificationHead

Remove the pooling operation; spatial_size gives the spatial size of the incoming feature maps so the first Linear can be sized accordingly.

rm_dropout() → torchelie.models.classifier.ClassificationHead

Experimental: Remove the dropout layers, if any.

Warning: ClassificationHead.rm_dropout() is experimental, and may change or be deleted soon if not already broken.

set_num_classes(classes: int) → torchelie.models.classifier.ClassificationHead

Change the number of output classes.

set_pool_size(size: int) → torchelie.models.classifier.ClassificationHead

Average-pool to a spatial size of size rather than 1, and recreate the first Linear to accommodate the change.

to_concat_pool() → torchelie.models.classifier.ClassificationHead

Replace the average pool with a concat pool (a concatenation of average and max pooling).
to_convolutional() → torchelie.models.classifier.ClassificationHead

Remove the pooling and flattening operations and convert the linear layers to 1x1 convolutions, making the head fully convolutional.
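
A sketch of using to_convolutional() for dense, per-location class scores; the output shape is an assumption following from the conv1x1 conversion described above:

    import torch
    from torchelie.models import ClassificationHead

    # Fully convolutional head: no pooling or flattening, linears become 1x1 convs.
    head = ClassificationHead(512, 10).to_convolutional()

    features = torch.randn(1, 512, 14, 14)
    class_map = head(features)  # expected shape: (1, 10, 14, 14), one score map per class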

to_resnet_style() → torchelie.models.classifier.ClassificationHead

Set the classifier architecture to avgpool-flatten-linear.

to_two_layers(hidden_channels: int) → torchelie.models.classifier.ClassificationHead

Set the classifier architecture to avgpool-flatten-linear1-relu-linear2.

to_vgg_style(hidden_channels: int) → torchelie.models.classifier.ClassificationHead

Set the classifier architecture to avgpool-flatten-linear1-relu-dropout-linear2-relu-dropout-linear3, as originally done in VGG.
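
Since each configuration method returns a ClassificationHead (see the signatures above), calls can presumably be chained; a sketch where the hidden width and pool size are arbitrary example values:

    from torchelie.models import ClassificationHead

    # VGG-like head: three linear layers with a 4096-unit hidden width,
    # average-pooling to 7x7 rather than 1x1 before flattening.
    head = (ClassificationHead(512, 1000)
            .to_vgg_style(4096)
            .set_pool_size(7))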

linear1: torch.nn.modules.linear.Linear