What is sensitivity in machine learning?
Sensitivity in machine learning is a measure of a machine learning model's performance. Specifically, it measures how well a model identifies positive instances. Before we can define sensitivity, we need to understand four terms: True Positive, True Negative, False Positive, and False Negative.
True Positive (TP): True Positives (TP) are the outputs that are predicted to be positive and are actually positive.
True Negative (TN): True Negatives (TN) are the outputs that are predicted to be negative and are actually negative.
False Positive (FP): False Positives (FP) are the outputs that are predicted to be positive but are actually negative.
False Negative (FN): False Negatives (FN) are the outputs that are predicted to be negative but are actually positive.
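To make these four counts concrete, here is a minimal sketch that tallies them by hand for a small set of made-up labels (the data is illustrative only):

# Tally TP, TN, FP, FN by hand for a made-up set of labels
y_true = [True, False, True, True, False]
y_pred = [True, False, False, True, True]

tp = sum(1 for t, p in zip(y_true, y_pred) if t and p)          # predicted positive, actually positive
tn = sum(1 for t, p in zip(y_true, y_pred) if not t and not p)  # predicted negative, actually negative
fp = sum(1 for t, p in zip(y_true, y_pred) if not t and p)      # predicted positive, actually negative
fn = sum(1 for t, p in zip(y_true, y_pred) if t and not p)      # predicted negative, actually positive

print(tp, tn, fp, fn)  # 2 1 1 1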
Sensitivity in machine learning is defined as:
Sensitivity = TP / (TP + FN)
Sensitivity is also known as recall, hit rate, or true positive rate.
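As a quick illustration, the formula can be applied directly to the two counts (the sensitivity helper below is hypothetical, not part of any library):

def sensitivity(tp, fn):
    # Sensitivity (recall) = TP / (TP + FN)
    return tp / (tp + fn)

print(sensitivity(3, 2))  # 3 / (3 + 2) = 0.6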
How to calculate sensitivity using sklearn in Python?
We can use the following Python code to calculate sensitivity using sklearn.
from sklearn.metrics import recall_score

# True values of the target variable
y_true = [True, False, True, True, False, False, False, False, True, True]
# Values predicted by the model
y_pred = [False, False, True, True, False, False, True, False, True, False]

# recall_score computes TP / (TP + FN)
recall = recall_score(y_true, y_pred)
print("Sensitivity or Recall:", recall)
Here, y_true holds the true values of the target variable, and y_pred holds the values predicted by the model. The output of the above program will be:
Sensitivity or Recall: 0.6
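You can verify this by hand: y_pred correctly identifies 3 of the 5 actual positives (TP = 3) and misses 2 (FN = 2), so sensitivity = 3 / (3 + 2) = 0.6. The same counts can also be read off sklearn's confusion_matrix; for binary labels, ravel() returns them in the order tn, fp, fn, tp:

from sklearn.metrics import confusion_matrix

y_true = [True, False, True, True, False, False, False, False, True, True]
y_pred = [False, False, True, True, False, False, True, False, True, False]

# For binary labels, ravel() flattens the 2x2 matrix to tn, fp, fn, tp
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(tp, fn)          # 3 2
print(tp / (tp + fn))  # 0.6, matching recall_score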