Fix example title in probabilistic_metrics.py (keras-team#20662)
dhantule authored and shashaka committed Dec 20, 2024
1 parent 7faa622 commit d9eb698
Showing 1 changed file with 4 additions and 12 deletions.
16 changes: 4 additions & 12 deletions keras/src/metrics/probabilistic_metrics.py
@@ -69,9 +69,7 @@ class Poisson(reduction_metrics.MeanMetricWrapper):
name: (Optional) string name of the metric instance.
dtype: (Optional) data type of the metric result.
-Example:
-
-Example:
+Examples:
>>> m = keras.metrics.Poisson()
>>> m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]])
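For reference, a minimal usage sketch of the Poisson metric touched in this hunk, assuming the standard Keras update_state()/result() metric workflow (the exact printed value depends on the backend epsilon):

import keras

# Poisson metric: averages `y_pred - y_true * log(y_pred)` over the batch.
m = keras.metrics.Poisson()
m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]])
print(float(m.result()))  # roughly 0.5 for these inputs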
@@ -119,9 +117,7 @@ class BinaryCrossentropy(reduction_metrics.MeanMetricWrapper):
e.g. `label_smoothing=0.2` means that we will use
a value of 0.1 for label "0" and 0.9 for label "1".
-Example:
-
-Example:
+Examples:
>>> m = keras.metrics.BinaryCrossentropy()
>>> m.update_state([[0, 1], [0, 0]], [[0.6, 0.4], [0.4, 0.6]])
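Similarly, a minimal sketch of BinaryCrossentropy showing the label_smoothing behaviour described in the context above (the printed value is not asserted here):

import keras

# With label_smoothing=0.2, hard labels 0 and 1 are treated as 0.1 and 0.9
# before the binary cross-entropy is averaged.
m = keras.metrics.BinaryCrossentropy(label_smoothing=0.2)
m.update_state([[0, 1], [0, 0]], [[0.6, 0.4], [0.4, 0.6]])
print(float(m.result()))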
@@ -195,9 +191,7 @@ class CategoricalCrossentropy(reduction_metrics.MeanMetricWrapper):
axis: (Optional) Defaults to `-1`.
The dimension along which entropy is computed.
-Example:
-
-Example:
+Examples:
>>> # EPSILON = 1e-7, y = y_true, y` = y_pred
>>> # y` = clip_ops.clip_by_value(output, EPSILON, 1. - EPSILON)
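A minimal sketch of CategoricalCrossentropy with one-hot targets; axis=-1 (the default noted above) computes the cross-entropy over the last, class, dimension:

import keras

# y_true is one-hot; the metric averages -sum(y_true * log(clip(y_pred))) over samples.
m = keras.metrics.CategoricalCrossentropy()
m.update_state([[0, 1, 0], [0, 0, 1]],
               [[0.05, 0.95, 0.0], [0.1, 0.8, 0.1]])
print(float(m.result()))  # approximately (-log(0.95) - log(0.1)) / 2, i.e. about 1.18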
@@ -282,9 +276,7 @@ class SparseCategoricalCrossentropy(reduction_metrics.MeanMetricWrapper):
axis: (Optional) Defaults to `-1`.
The dimension along which entropy is computed.
-Example:
-
-Example:
+Examples:
>>> # y_true = one_hot(y_true) = [[0, 1, 0], [0, 0, 1]]
>>> # logits = log(y_pred)
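And a sketch of the sparse variant, where y_true holds integer class indices rather than one-hot vectors, matching the one_hot(y_true) relationship noted in the comment above:

import keras

# Sparse labels: class indices 1 and 2 correspond to the one-hot rows
# [0, 1, 0] and [0, 0, 1] from the CategoricalCrossentropy example.
m = keras.metrics.SparseCategoricalCrossentropy()
m.update_state([1, 2],
               [[0.05, 0.95, 0.0], [0.1, 0.8, 0.1]])
print(float(m.result()))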