# Losses

Collection of Ivy loss functions.

ivy.binary_cross_entropy(true, pred, /, *, epsilon=1e-07, reduction='none', out=None)[source]

Computes the binary cross entropy loss.

Parameters
• true (`Union`[`Array`, `NativeArray`]) – input array containing true labels.

• pred (`Union`[`Array`, `NativeArray`]) – input array containing predicted labels.

• epsilon (`float`) – a float in [0.0, 1.0] specifying the amount of smoothing when calculating the loss. If epsilon is `0`, no smoothing will be applied. Default: `1e-7`.

• out (`Optional`[`Array`]) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to. Default: `None`.

Return type

`Array`

Returns

ret – The binary cross entropy between the given distributions.
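Concretely, the per-element loss is `-(t * log(p) + (1 - t) * log(1 - p))`. The plain-Python sketch below illustrates this formula with epsilon clipping; note that exactly how ivy applies epsilon internally is an assumption here, not a reading of its source.

```python
import math

def bce(true, pred, epsilon=1e-7):
    # Clip predictions away from 0 and 1 so log() stays finite.
    # (How ivy applies epsilon internally is an assumption.)
    out = []
    for t, p in zip(true, pred):
        p = min(max(p, epsilon), 1 - epsilon)
        out.append(-(t * math.log(p) + (1 - t) * math.log(1 - p)))
    return out

print([round(v, 3) for v in bce([0, 1, 0, 0], [0.2, 0.8, 0.3, 0.8])])
# [0.223, 0.223, 0.357, 1.609]
```

The values agree (up to rounding in the printed repr) with the `ivy.binary_cross_entropy` examples below.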

Functional Examples

With `ivy.Array` input:

```
>>> x = ivy.array([0, 1, 0, 0])
>>> y = ivy.array([0.2, 0.8, 0.3, 0.8])
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
ivy.array([0.223, 0.223, 0.357, 1.61])
```
```
>>> x = ivy.array([[0, 1, 0, 0]])
>>> y = ivy.array([[0.6, 0.2, 0.7, 0.3]])
>>> z = ivy.binary_cross_entropy(x, y, epsilon=1e-3)
>>> print(z)
ivy.array([[0.916, 1.61, 1.2, 0.357]])
```

With `ivy.NativeArray` input:

```
>>> x = ivy.native_array([0, 1, 0, 1])
>>> y = ivy.native_array([0.2, 0.7, 0.2, 0.6])
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
ivy.array([0.223, 0.357, 0.223, 0.511])
```

With a mix of `ivy.Array` and `ivy.NativeArray` inputs:

```
>>> x = ivy.array([0, 0, 1, 1])
>>> y = ivy.native_array([0.1, 0.2, 0.8, 0.6])
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
ivy.array([0.105, 0.223, 0.223, 0.511])
```

With `ivy.Container` input:

```
>>> x = ivy.Container(a=ivy.array([1, 0, 0]), b=ivy.array([0, 0, 1]))
>>> y = ivy.Container(a=ivy.array([0.6, 0.2, 0.3]), b=ivy.array([0.8, 0.2, 0.2]))
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
{
    a: ivy.array([0.511, 0.223, 0.357]),
    b: ivy.array([1.61, 0.223, 1.61])
}
```

With a mix of `ivy.Array` and `ivy.Container` inputs:

```
>>> x = ivy.array([1, 1, 0])
>>> y = ivy.Container(a=ivy.array([0.7, 0.8, 0.2]))
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
{
a: ivy.array([0.357, 0.223, 0.223])
}
```

Instance Method Examples

Using `ivy.Array` instance method:

```
>>> x = ivy.array([1, 0, 0, 0])
>>> y = ivy.array([0.8, 0.2, 0.2, 0.2])
>>> z = x.binary_cross_entropy(y)
>>> print(z)
ivy.array([0.223, 0.223, 0.223, 0.223])
```
ivy.cross_entropy(true, pred, /, *, axis=-1, epsilon=1e-07, reduction='sum', out=None)[source]

Computes cross-entropy between predicted and true discrete distributions.

Parameters
• true (`Union`[`Array`, `NativeArray`]) – input array containing true labels.

• pred (`Union`[`Array`, `NativeArray`]) – input array containing the predicted labels.

• axis (`int`) – the axis along which to compute the cross-entropy. If axis is `-1`, the cross-entropy will be computed along the last dimension. Default: `-1`.

• epsilon (`float`) – a float in [0.0, 1.0] specifying the amount of smoothing when calculating the loss. If epsilon is `0`, no smoothing will be applied. Default: `1e-7`.

• out (`Optional`[`Array`]) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to. Default: `None`.

Return type

`Array`

Returns

ret – The cross-entropy loss between the given distributions.
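The returned value corresponds to `-sum(true * log(pred))` over the class axis. Below is a minimal plain-Python sketch of that formula for a single 1-D pair of distributions; the exact epsilon-smoothing scheme is an assumption, not ivy's actual implementation.

```python
import math

def cross_entropy(true, pred, epsilon=1e-7):
    # -sum(true * log(pred)) over the class axis,
    # with pred clipped away from 0 and 1 (clipping scheme assumed).
    total = 0.0
    for t, p in zip(true, pred):
        p = min(max(p, epsilon), 1 - epsilon)
        total -= t * math.log(p)
    return total

print(round(cross_entropy([0, 0, 1, 0], [0.25, 0.25, 0.25, 0.25]), 7))
# 1.3862944
```

This reproduces the value in the first example below: a one-hot label against a uniform 4-class distribution gives `-log(0.25) ≈ 1.3862944`.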

Examples

```
>>> x = ivy.array([0, 0, 1, 0])
>>> y = ivy.array([0.25, 0.25, 0.25, 0.25])
>>> print(ivy.cross_entropy(x, y))
ivy.array(1.3862944)
```
```
>>> z = ivy.array([0.1, 0.1, 0.7, 0.1])
>>> print(ivy.cross_entropy(x, z))
ivy.array(0.35667497)
```
ivy.sparse_cross_entropy(true, pred, /, *, axis=-1, epsilon=1e-07, reduction='sum', out=None)[source]

Computes the sparse cross-entropy loss between predicted class probabilities and true labels given as class indices.

Parameters
• true (`Union`[`Array`, `NativeArray`]) – input array containing the true labels as class indices.

• pred (`Union`[`Array`, `NativeArray`]) – input array containing the predicted class probabilities.

• axis (`int`) – the axis along which to compute the cross-entropy. If axis is `-1`, the cross-entropy will be computed along the last dimension. Default: `-1`.

• epsilon (`float`) – a float in [0.0, 1.0] specifying the amount of smoothing when calculating the loss. If epsilon is `0`, no smoothing will be applied. Default: `1e-7`.

• out (`Optional`[`Array`]) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to. Default: `None`.

Return type

`Array`

Returns

ret – The sparse cross-entropy loss between the given inputs.
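Since `true` holds class indices rather than one-hot vectors, each label's loss reduces to `-log(pred[index])`, i.e. cross-entropy against an implicit one-hot vector. A plain-Python sketch for the 1-D case (the epsilon-clipping scheme is an assumption):

```python
import math

def sparse_cross_entropy(true, pred, epsilon=1e-7):
    # true holds class indices; pick out the predicted probability
    # of the correct class and take its negative log.
    out = []
    for idx in true:
        p = min(max(pred[idx], epsilon), 1 - epsilon)
        out.append(-math.log(p))
    return out

print([round(v, 3) for v in sparse_cross_entropy([2], [0.1, 0.1, 0.7, 0.1])])
# [0.357]
```

This matches the first example below: label `2` selects probability `0.7`, and `-log(0.7) ≈ 0.357`.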

Functional Examples

With `ivy.Array` input:

```
>>> x = ivy.array([2])
>>> y = ivy.array([0.1, 0.1, 0.7, 0.1])
>>> print(ivy.sparse_cross_entropy(x, y))
ivy.array([0.357])
```
```
>>> x = ivy.array([3])
>>> y = ivy.array([0.1, 0.1, 0.7, 0.1])
>>> print(ivy.sparse_cross_entropy(x, y))
ivy.array([2.3])
```

With `ivy.NativeArray` input:

```
>>> x = ivy.native_array([4])
>>> y = ivy.native_array([0.1, 0.2, 0.1, 0.1, 0.5])
>>> print(ivy.sparse_cross_entropy(x, y))
ivy.array([0.693])
```

With `ivy.Container` input:

```
>>> x = ivy.Container(a=ivy.array([4]))
>>> y = ivy.Container(a=ivy.array([0.1, 0.2, 0.1, 0.1, 0.5]))
>>> print(ivy.sparse_cross_entropy(x, y))
{
a: ivy.array([0.693])
}
```

With a mix of `ivy.Array` and `ivy.NativeArray` inputs:

```
>>> x = ivy.array([0])
>>> y = ivy.native_array([0.1, 0.2, 0.6, 0.1])
>>> print(ivy.sparse_cross_entropy(x, y))
ivy.array([2.3])
```

With a mix of `ivy.Array` and `ivy.Container` inputs:

```
>>> x = ivy.array([0])
>>> y = ivy.Container(a=ivy.array([0.1, 0.2, 0.6, 0.1]))
>>> print(ivy.sparse_cross_entropy(x, y))
{
a: ivy.array([2.3])
}
```

Instance Method Examples

Using `ivy.Array` instance method:

```
>>> x = ivy.array([2])
>>> y = ivy.array([0.1, 0.1, 0.7, 0.1])
>>> print(x.sparse_cross_entropy(y))
ivy.array([0.357])
```

Using `ivy.Container` instance method:

```
>>> x = ivy.Container(a=ivy.array([2]))
>>> y = ivy.Container(a=ivy.array([0.1, 0.1, 0.7, 0.1]))
>>> print(x.sparse_cross_entropy(y))
{
a: ivy.array([0.357])
}
```

This should hopefully have given you an overview of the losses submodule. If you have any questions, please feel free to reach out on our discord in the losses channel or in the losses forum!