
mcd_loss

MCDLoss

Bases: torch.nn.Module

Implementation of the loss function used in [Maximum Classifier Discrepancy for Unsupervised Domain Adaptation](https://arxiv.org/abs/1712.02560).

Source code in pytorch_adapt/layers/mcd_loss.py
class MCDLoss(torch.nn.Module):
    """
    Implementation of the loss function used in
    [Maximum Classifier Discrepancy for Unsupervised Domain Adaptation](https://arxiv.org/abs/1712.02560).
    """

    def __init__(self, dist_fn: Callable[[torch.Tensor], torch.Tensor] = None):
        """
        Arguments:
            dist_fn: Computes the mean distance between two softmaxed tensors.
                If ```None```, then ```torch.nn.L1Loss``` is used.
        """
        super().__init__()
        self.dist_fn = c_f.default(dist_fn, torch.nn.L1Loss, {})

    def forward(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        """
        Arguments:
            x: a batch of class logits
            y: the other batch of class logits
        Returns:
            The discrepancy between the two batches of class logits.
        """
        return mcd_loss(x, y, self.dist_fn)
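
A minimal usage sketch (the batch size, number of classes, and random logits below are illustrative; the import path pytorch_adapt.layers is assumed):

import torch
from pytorch_adapt.layers import MCDLoss

loss_fn = MCDLoss()  # defaults to torch.nn.L1Loss as the distance function

# Two batches of class logits, e.g. from two classifier heads on the same features
logits1 = torch.randn(32, 10)
logits2 = torch.randn(32, 10)

discrepancy = loss_fn(logits1, logits2)  # scalar tensor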

__init__(dist_fn=None)

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| dist_fn | Callable[[torch.Tensor], torch.Tensor] | Computes the mean distance between two softmaxed tensors. If ```None```, then ```torch.nn.L1Loss``` is used. | None |
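
For example, a custom distance function can be passed instead of the default. The sketch below assumes dist_fn receives the two softmaxed prediction tensors and returns a scalar mean distance, as described above:

import torch
from pytorch_adapt.layers import MCDLoss

def mean_squared_distance(preds1, preds2):
    # preds1 and preds2 are softmaxed class probabilities of the same shape
    return torch.mean((preds1 - preds2) ** 2)

loss_fn = MCDLoss(dist_fn=mean_squared_distance)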

forward(x, y)

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| x | torch.Tensor | a batch of class logits | required |
| y | torch.Tensor | the other batch of class logits | required |

Returns:

| Type | Description |
| --- | --- |
| torch.Tensor | The discrepancy between the two batches of class logits. |
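
As an illustrative sketch (not from the library's docs), identical logits produce zero discrepancy under the default L1 distance, while differing classifier outputs produce a positive value:

import torch
from pytorch_adapt.layers import MCDLoss

loss_fn = MCDLoss()
x = torch.randn(8, 5)

agree = loss_fn(x, x.clone())             # tensor(0.): the two heads agree exactly
disagree = loss_fn(x, torch.randn(8, 5))  # positive: the heads disagree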
