Difference between revisions of "Evaluation - Measures"

[http://www.youtube.com/results?search_query=Evaluation+Matrices+Confusion+Matrix+Precision+Recall+Score+ROC+Classification+Performance+Error+Metric+artificial+intelligence YouTube search...]
  
 
* [http://medium.com/greyatom/performance-metrics-for-classification-problems-in-machine-learning-part-i-b085d432082b Performance Metrics for Classification problems in Machine Learning | Mohammed Sunasra - Medium]
 
<youtube>sHHnCmy6q00</youtube>
  
 
=== Accuracy ===
 
<youtube>g3sxDtlGlAM</youtube>
 
<youtube>rdiQy119Wp4</youtube>
=== Precision & Recall ===

[http://www.youtube.com/results?search_query=Precision+Recall+artificial+intelligence YouTube search...]

Precision (also called positive predictive value) is the fraction of relevant instances among the retrieved instances, while recall (also known as sensitivity) is the fraction of relevant instances that were retrieved out of the total number of relevant instances. Both precision and recall are therefore based on an understanding and measure of relevance. [http://en.wikipedia.org/wiki/Precision_and_recall Precision and recall | Wikipedia]

http://cdn-images-1.medium.com/max/600/1*KhlD7Js9leo0B0zfsIfAIA.png

http://upload.wikimedia.org/wikipedia/commons/thumb/2/26/Precisionrecall.svg/525px-Precisionrecall.svg.png

<youtube>iIjtgrjgAug</youtube>

<youtube>j-EB6RqqjGI</youtube>
  
 
=== F1 Score (F-Measure) ===

Revision as of 12:10, 22 September 2018


=== Error Metric ===

YouTube search...

Predictive modeling works on a constructive-feedback principle: build a model, get feedback from metrics, make improvements, and continue until you achieve the desired accuracy. Evaluation metrics explain the performance of a model. An important aspect of evaluation metrics is their ability to discriminate among model results. 7 Important Model Evaluation Error Metrics Everyone should know | Tavish Srivastava

=== Confusion Matrix ===

YouTube search...

A confusion matrix is a performance measurement for machine learning classification. Understanding Confusion Matrix | Sarang Narkhede - Medium
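A quick sketch, in plain Python with made-up labels, of how the four cells of a binary confusion matrix are tallied from true labels and predictions:

```python
# Hypothetical ground-truth labels and model predictions (1 = positive).
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# Tally the four cells of the binary confusion matrix.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

confusion = [[tn, fp],   # actual negative: [true negatives, false positives]
             [fn, tp]]   # actual positive: [false negatives, true positives]
print(confusion)
```

Every other measure in this article (accuracy, precision, recall, F1) can be read directly off these four counts.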


=== Accuracy ===

YouTube search...

Accuracy is the number of correct predictions made by the model over all kinds of predictions made.
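The definition above amounts to one division; a minimal sketch with hypothetical labels:

```python
# Hypothetical ground-truth labels and model predictions.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# Accuracy: correct predictions over all predictions made.
correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
accuracy = correct / len(y_true)
print(accuracy)
```

Accuracy can be misleading on imbalanced data: a model that always predicts the majority class scores high while detecting nothing, which is why precision and recall are examined separately.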


=== F1 Score (F-Measure) ===

YouTube search...
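The F1 score is the harmonic mean of precision and recall, so it is only high when both are high. A one-line sketch with hypothetical values:

```python
# Hypothetical precision and recall for some classifier.
precision = 0.8
recall = 0.6

# F1: harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(f1)
```

Unlike the arithmetic mean, the harmonic mean is pulled toward the smaller of the two values, penalizing a model that trades one measure away for the other.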

=== Receiver Operator Curves (ROC) and Area Under the Curve (AUC) ===

YouTube search...

Example Use: Tradeoffs

* 'Sensitivity' & 'Specificity': sensitivity (the true positive rate, TP/(TP+FN)) is the fraction of actual positives correctly identified; specificity (TN/(TN+FP)) is the fraction of actual negatives correctly identified.
* 'True Positive Rate' & 'False Positive Rate': the ROC curve plots the true positive rate, TP/(TP+FN), against the false positive rate, FP/(FP+TN), as the classification threshold varies.
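A sketch of the tradeoff in plain Python: sweeping the decision threshold over hypothetical classifier scores traces the ROC points, and the trapezoidal rule gives the area under them. Scores, labels, and thresholds here are all made up for illustration:

```python
# Hypothetical classifier scores and true labels (1 = positive).
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   0,   1,   0,   1,   0,   0]

def tpr_fpr(threshold):
    """True/false positive rates when predicting positive at score >= threshold."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    tn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 0)
    return tp / (tp + fn), fp / (fp + tn)  # sensitivity, 1 - specificity

# Lowering the threshold raises the true positive rate but also the
# false positive rate -- that is the tradeoff the ROC curve displays.
roc = [tpr_fpr(t) for t in (0.95, 0.75, 0.5, 0.25, 0.05)]

# Area under the ROC curve via the trapezoidal rule (points are already
# ordered by increasing false positive rate).
auc = sum((f2 - f1) * (t1 + t2) / 2
          for (t1, f1), (t2, f2) in zip(roc, roc[1:]))
print(roc, auc)
```

An AUC of 0.5 corresponds to random guessing and 1.0 to perfect ranking of positives above negatives.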