Tech Stories

EP-22 How Deep Learning will ReCall-Score my New Resolution in 2022



Happy New Year!

Every year we commit to new resolutions. How would deep learning measure their accuracy, precision, and sensitivity?

In this episode, I cover the following performance metrics of machine learning:

 Confusion Matrix: It is the easiest way to measure the performance of a classification problem whose output can be two or more classes. A confusion matrix is simply a table with two dimensions, “Actual” and “Predicted”, and each dimension is split into “True Positives (TP)”, “True Negatives (TN)”, “False Positives (FP)”, and “False Negatives (FN)”.

  • True Positives (TP) − the case when both the actual and predicted class of the data point are 1.
  • True Negatives (TN) − the case when both the actual and predicted class of the data point are 0.
  • False Positives (FP) − the case when the actual class of the data point is 0 and the predicted class is 1.
  • False Negatives (FN) − the case when the actual class of the data point is 1 and the predicted class is 0.
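As a quick sketch, these four counts can be tallied directly from a pair of label lists. The labels below are made up purely for illustration:

```python
# Toy labels (made-up data) for a binary classifier.
actual    = [1, 1, 1, 1, 1, 0, 0, 0]
predicted = [1, 1, 1, 1, 0, 1, 0, 0]

# Tally the four confusion-matrix cells by comparing each pair.
tp = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))
tn = sum(a == 0 and p == 0 for a, p in zip(actual, predicted))
fp = sum(a == 0 and p == 1 for a, p in zip(actual, predicted))
fn = sum(a == 1 and p == 0 for a, p in zip(actual, predicted))

print(tp, tn, fp, fn)  # 4 2 1 1
```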
  • Accuracy

    It is the most common performance metric for classification algorithms: the number of correct predictions as a fraction of all predictions made. It can be calculated from the confusion matrix with the following formula −

    Accuracy = (TP + TN) / (TP + TN + FP + FN)
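A minimal sketch of the accuracy formula, using hypothetical counts (TP=4, TN=2, FP=1, FN=1) chosen only for illustration:

```python
# Hypothetical confusion-matrix counts.
tp, tn, fp, fn = 4, 2, 1, 1

# Accuracy: correct predictions over all predictions.
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(accuracy)  # 0.75
```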

    Precision

    Precision, often used in document retrieval, may be defined as the fraction of predicted positives that are actually positive − in other words, how many of the documents returned by our ML model are correct. It can be calculated from the confusion matrix with the following formula −

    Precision = TP / (TP + FP)
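With the same hypothetical counts as above (TP=4, FP=1), precision works out as:

```python
tp, fp = 4, 1  # hypothetical counts

# Precision: of everything predicted positive, how much was right?
precision = tp / (tp + fp)
print(precision)  # 0.8
```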

    Recall or Sensitivity

    Recall may be defined as the fraction of actual positives that our ML model correctly returns. It can be calculated from the confusion matrix with the following formula −

    Recall = TP / (TP + FN)
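Again with hypothetical counts (TP=4, FN=1), recall is:

```python
tp, fn = 4, 1  # hypothetical counts

# Recall: of all actual positives, how many did we catch?
recall = tp / (tp + fn)
print(recall)  # 0.8
```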

    Specificity

    Specificity, in contrast to recall, may be defined as the fraction of actual negatives that our ML model correctly identifies. It can be calculated from the confusion matrix with the following formula −

    Specificity = TN / (TN + FP)

    F1 Score

    This score gives us the harmonic mean of precision and recall. The best value of F1 is 1 and the worst is 0. We can calculate the F1 score with the help of the following formula −

    F1 = 2 * (precision * recall) / (precision + recall)

    Precision and recall contribute equally to the F1 score.
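Tying it together with the same hypothetical counts (TP=4, FP=1, FN=1), the F1 score can be computed from precision and recall:

```python
tp, fp, fn = 4, 1, 1  # hypothetical counts

precision = tp / (tp + fp)  # 0.8
recall = tp / (tp + fn)     # 0.8

# F1: harmonic mean of precision and recall.
f1 = 2 * (precision * recall) / (precision + recall)
print(round(f1, 3))  # 0.8
```

Because the harmonic mean is dominated by the smaller of the two values, a model cannot achieve a high F1 by doing well on only one of precision or recall.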


    Listen to the episode on any podcast platform and share your feedback in the comments here.

    Check out the episode on these platforms:

    • follow me on instagram  https://www.instagram.com/podcasteramit
    • Apple https://podcasts.apple.com/us/podcast/id1544510362
    • Hubhopper https://hubhopper.com/podcast/tech-stories/318515
    • Amazon https://music.amazon.com/podcasts/2fdb5c45-2016-459e-ba6a-3cbae5a1fa4d
    • Spotify https://open.spotify.com/show/2GhCrAjQuVMFYBq8GbLbwa

      Tech Stories, by Amit Bhatt
