langfair.metrics.classification.metrics.false_omission.FalseOmissionRateParity#

class langfair.metrics.classification.metrics.false_omission.FalseOmissionRateParity#

Bases: Metric

__init__()#

This class computes false omission rate parity. The user may specify whether to compute this metric as a difference or a ratio. For more information on these metrics, see Bellamy et al. (2018) [1] and Saleiro et al. (2019) [2].
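Conceptually, a group's false omission rate (FOR) is the share of its negative predictions that are actually positive, FOR = FN / (FN + TN); parity compares the FORs of the two groups. A small illustrative computation of the FOR for a single group (not langfair's implementation):

```python
# Illustrative only: false omission rate = FN / (FN + TN),
# i.e., the fraction of predicted negatives that are truly positive.
y_true = [1, 0, 1, 0, 0, 1]
y_pred = [0, 0, 1, 0, 1, 0]

fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # true negatives
false_omission_rate = fn / (fn + tn)  # 2 / (2 + 2) = 0.5
```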

Methods

__init__()

This class computes false omission rate parity.

binary_confusion_matrix(y_true, y_pred)

Method for computing a binary confusion matrix.

evaluate(groups, y_pred, y_true[, ratio])

This method computes disparity in false omission rates between two groups.

static binary_confusion_matrix(y_true, y_pred)#

Method for computing a binary confusion matrix.

Parameters:
  • y_true (Array-like) – Binary labels (ground truth values)

  • y_pred (Array-like) – Binary model predictions

Returns:

2x2 confusion matrix

Return type:

List[List[float]]
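A minimal pure-Python sketch of what such a 2x2 confusion matrix computation might look like, using the common [[TN, FP], [FN, TP]] layout; langfair's actual implementation and cell ordering may differ:

```python
from typing import List, Sequence

def binary_confusion_matrix(y_true: Sequence[int], y_pred: Sequence[int]) -> List[List[float]]:
    """Illustrative 2x2 confusion matrix in [[TN, FP], [FN, TP]] layout."""
    tn = fp = fn = tp = 0
    for t, p in zip(y_true, y_pred):
        if t == 0 and p == 0:
            tn += 1          # true negative
        elif t == 0 and p == 1:
            fp += 1          # false positive
        elif t == 1 and p == 0:
            fn += 1          # false negative
        else:
            tp += 1          # true positive
    return [[tn, fp], [fn, tp]]

# Pairs: (1,0)=FN, (0,0)=TN, (1,1)=TP, (0,1)=FP
cm = binary_confusion_matrix([1, 0, 1, 0], [0, 0, 1, 1])
```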

evaluate(groups, y_pred, y_true, ratio=False)#

This method computes disparity in false omission rates between two groups.

Parameters:
  • groups (Array-like) – Group indicators. Must contain exactly two unique values.

  • y_pred (Array-like) – Binary model predictions. Positive and negative predictions must be 1 and 0, respectively.

  • y_true (Array-like) – Binary labels (ground truth values). Positive and negative labels must be 1 and 0, respectively.

  • ratio (bool, default=False) – If True, the metric is computed as a ratio; otherwise, as a difference.

Returns:

Value of false omission rate parity

Return type:

float
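The computation can be sketched as follows. This is a hypothetical pure-Python reimplementation, not langfair's actual code: it assumes the difference form is an absolute difference of the two groups' FORs and the ratio form divides the first (sorted) group's FOR by the second's, which may not match langfair's group ordering.

```python
def false_omission_rate(y_true, y_pred):
    """FOR = FN / (FN + TN): share of predicted negatives that are truly positive."""
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return fn / (fn + tn)

def evaluate(groups, y_pred, y_true, ratio=False):
    """Illustrative disparity between the false omission rates of two groups."""
    labels = sorted(set(groups))
    assert len(labels) == 2, "groups must contain exactly two unique values"
    fors = []
    for g in labels:
        idx = [i for i, grp in enumerate(groups) if grp == g]
        fors.append(false_omission_rate([y_true[i] for i in idx],
                                        [y_pred[i] for i in idx]))
    return fors[0] / fors[1] if ratio else abs(fors[0] - fors[1])

groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
y_true = [1, 0, 1, 0, 1, 1, 0, 0]
y_pred = [0, 0, 1, 0, 0, 0, 0, 1]

difference = evaluate(groups, y_pred, y_true)               # |1/3 - 2/3| = 1/3
ratio_value = evaluate(groups, y_pred, y_true, ratio=True)  # (1/3) / (2/3) = 0.5
```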

References

[1] Bellamy, R. K. E., et al. (2018). AI Fairness 360: An Extensible Toolkit for Detecting, Understanding, and Mitigating Unwanted Algorithmic Bias. arXiv:1810.01943.

[2] Saleiro, P., et al. (2019). Aequitas: A Bias and Fairness Audit Toolkit. arXiv:1811.05577.