megengine.functional.nn.binary_cross_entropy

binary_cross_entropy(pred, label, with_logits=True, reduction='mean')[source]

Computes the binary cross entropy loss (using logits by default).

Parameters:
  • pred (Tensor) – (N, *), where * means any number of additional dimensions.

  • label (Tensor) – (N, *), same shape as pred.

  • with_logits (bool) – whether pred is given as logits; if True, sigmoid is applied to pred internally. Default: True

  • reduction (str) – the reduction to apply to the output: ‘none’ | ‘mean’ | ‘sum’. Default: ‘mean’

Return type:

Tensor

Returns:

loss value.

Examples

By default (with_logits is True), pred is assumed to be logits and class probabilities are given by sigmoid. This is more numerically stable than calling sigmoid and binary_cross_entropy sequentially.

>>> from megengine import Tensor
>>> import megengine.functional as F
>>> pred = Tensor([0.9, 0.7, 0.3])
>>> label = Tensor([1., 1., 1.])
>>> F.nn.binary_cross_entropy(pred, label)
Tensor(0.4328984, device=xpux:0)
>>> F.nn.binary_cross_entropy(pred, label, reduction="none")
Tensor([0.3412 0.4032 0.5544], device=xpux:0)
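
Because with_logits=True folds the sigmoid into the loss, applying F.sigmoid explicitly and passing with_logits=False gives the same result up to floating-point rounding (a quick consistency check, reusing pred and label from above):

>>> F.nn.binary_cross_entropy(F.sigmoid(pred), label, with_logits=False)
Tensor(0.4328984, device=xpux:0)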

If pred already holds probabilities, set with_logits to False:

>>> pred = Tensor([0.9, 0.7, 0.3])
>>> label = Tensor([1., 1., 1.])
>>> F.nn.binary_cross_entropy(pred, label, with_logits=False)
Tensor(0.5553361, device=xpux:0)
>>> F.nn.binary_cross_entropy(pred, label, with_logits=False, reduction="none")
Tensor([0.1054 0.3567 1.204 ], device=xpux:0)
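
These per-element values follow directly from the binary cross entropy definition -(y * log(p) + (1 - y) * log(1 - p)). As a sanity check, the reduction="none" output above can be reproduced in plain NumPy (an illustrative sketch, independent of the MegEngine API):

>>> import numpy as np
>>> p = np.array([0.9, 0.7, 0.3])
>>> y = np.array([1., 1., 1.])
>>> -(y * np.log(p) + (1 - y) * np.log(1 - p))
array([0.10536052, 0.35667494, 1.2039728 ])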