entropy_weights
EntropyWeights
Bases: torch.nn.Module
Implementation of entropy weighting described in
Conditional Adversarial Domain Adaptation.
Computes the entropy `x` per row of the input, and returns `1 + exp(-x)`.
This can be used to weight losses, such that the most
confidently scored samples have a higher weighting.
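As a from-scratch illustration of the idea (a hedged sketch using plain Python lists, not the library's tensor implementation — the helper names are hypothetical), the weighting computes the Shannon entropy of each row of softmax probabilities, then maps it through `1 + exp(-entropy)` so low-entropy (confident) rows receive larger weights:

```python
import math

def row_entropy(probs):
    # Shannon entropy of one probability distribution
    # (a row of already-softmaxed scores)
    return -sum(p * math.log(p) for p in probs if p > 0)

def entropy_weights(rows):
    # Map each row's entropy through 1 + exp(-entropy):
    # confident (low-entropy) rows get weights closer to 2,
    # uncertain (high-entropy) rows get weights closer to 1.
    return [1 + math.exp(-row_entropy(r)) for r in rows]

confident = [0.98, 0.01, 0.01]      # nearly one-hot prediction
uncertain = [1 / 3, 1 / 3, 1 / 3]   # uniform prediction
weights = entropy_weights([confident, uncertain])
```

The confident row's entropy is near 0, so its weight approaches 2, while the uniform row's entropy is `log(3)`, giving the smaller weight `1 + exp(-log(3)) = 4/3`.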
Source code in pytorch_adapt\layers\entropy_weights.py
__init__(after_softmax=False, normalizer=None)
Parameters:

Name | Type | Description | Default
---|---|---|---
`after_softmax` | `bool` | If `True`, the input is assumed to already have softmax applied to it. | `False`
`normalizer` | `Callable[[torch.Tensor], torch.Tensor]` | A callable for normalizing (e.g. min-max normalization) the weights. If `None`, a default normalization is applied. | `None`
forward(logits)
Parameters:

Name | Type | Description | Default
---|---|---|---
`logits` | `torch.Tensor` | Raw logits if `after_softmax` is `False`; otherwise, each row should already be softmax probabilities. | required
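When `after_softmax` is `False`, the forward pass must first convert raw logits into probabilities. A hedged from-scratch sketch of such a pass (plain Python, not the library code): softmax each row, take its entropy, then map through `1 + exp(-entropy)`:

```python
import math

def softmax(logits):
    # Numerically stable softmax over one row of raw logits
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def forward(logit_rows):
    # Raw logits in, one weight per row out
    # (normalization of the weights is omitted here for clarity)
    weights = []
    for row in logit_rows:
        probs = softmax(row)
        entropy = -sum(p * math.log(p) for p in probs if p > 0)
        weights.append(1 + math.exp(-entropy))
    return weights

fwd_weights = forward([[5.0, 0.1, 0.1], [1.0, 1.0, 1.0]])
```

The first row is sharply peaked, so it gets a weight near 2; the second row's logits are uniform, so its softmax is uniform and its weight is exactly `1 + exp(-log(3)) = 4/3`.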