Evaluation - Measures
Revision as of 11:29, 22 September 2018

YouTube search...

Confusion Matrix, Precision, Recall, F Score, ROC Curves, trade-off between True Positive Rate and False Positive Rate.

* Error Metric - Predictive Modeling works on the constructive feedback principle: you build a model, get feedback from metrics, make improvements, and continue until you achieve the desired accuracy. Evaluation metrics explain the performance of a model. An important aspect of evaluation metrics is their capability to discriminate among model results. [http://www.analyticsvidhya.com/blog/2016/02/7-important-model-evaluation-error-metrics/ 7 Important Model Evaluation Error Metrics Everyone should know | Tavish Srivastava]


https://upload.wikimedia.org/wikipedia/commons/thumb/2/26/Precisionrecall.svg/525px-Precisionrecall.svg.png

== Confusion Matrix ==

YouTube search...
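A confusion matrix simply tabulates how predicted labels line up with true labels. As a minimal sketch (the function name and example data below are illustrative, not from the article), the four cells of a binary confusion matrix can be counted directly:

```python
def confusion_matrix(y_true, y_pred):
    """Return counts (tp, fp, fn, tn) for binary labels in {0, 1}."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

# Toy data: 8 examples, 4 of them actually positive.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
tp, fp, fn, tn = confusion_matrix(y_true, y_pred)
print(tp, fp, fn, tn)  # 3 1 1 3
```

All of the later measures (precision, recall, F score, TPR/FPR) are ratios of these four counts.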

== Precision & Recall ==

[http://www.youtube.com/results?search_query=Precision+Recall+artificial+intelligence YouTube search...]

Precision (also called positive predictive value) is the fraction of relevant instances among the retrieved instances, while recall (also known as sensitivity) is the fraction of relevant instances that have been retrieved over the total amount of relevant instances. Both precision and recall are therefore based on an understanding and measure of relevance. [http://en.wikipedia.org/wiki/Precision_and_recall Precision and recall | Wikipedia]

<youtube>iIjtgrjgAug</youtube>
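In confusion-matrix terms, the definitions above reduce to precision = TP / (TP + FP) and recall = TP / (TP + FN). A short illustrative sketch (the retrieval scenario and counts are made up for the example):

```python
def precision(tp, fp):
    """Fraction of retrieved instances that are relevant."""
    return tp / (tp + fp)

def recall(tp, fn):
    """Fraction of relevant instances that were retrieved."""
    return tp / (tp + fn)

# Suppose a system retrieves 8 documents, 6 of them relevant,
# out of 12 relevant documents in total:
tp, fp, fn = 6, 2, 6
print(precision(tp, fp))  # 0.75
print(recall(tp, fn))     # 0.5
```

Note the tension: retrieving more documents can only raise recall, but usually lowers precision.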

== F Score ==

[http://www.youtube.com/results?search_query=%22F+score%22+artificial+intelligence YouTube search...]
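The F1 score is the harmonic mean of precision and recall; the more general F-beta score weights recall beta times as much as precision. A minimal sketch with illustrative values:

```python
def f_beta(precision, recall, beta=1.0):
    """F-beta score: beta > 1 favors recall, beta < 1 favors precision."""
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Harmonic mean of precision 0.75 and recall 0.5:
print(f_beta(0.75, 0.5))  # 0.6
```

Because the harmonic mean is dominated by the smaller value, a model cannot score a high F1 by being strong on only one of precision or recall.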

== Receiver Operator Curves (ROC) and Area Under the Curve (AUC) ==

YouTube search...

Trade-off between sensitivity and specificity.
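An ROC curve is traced by sweeping a decision threshold over the classifier's scores and recording the (False Positive Rate, True Positive Rate) pair at each threshold; AUC is the area under that curve. The sketch below (scores, labels, and function names are illustrative, and it assumes both classes are present) estimates AUC with the trapezoidal rule:

```python
def roc_points(y_true, scores):
    """Return (FPR, TPR) points, assuming both classes occur in y_true."""
    pos = sum(y_true)
    neg = len(y_true) - pos
    points = [(0.0, 0.0)]
    for th in sorted(set(scores), reverse=True):
        tp = sum(1 for t, s in zip(y_true, scores) if t == 1 and s >= th)
        fp = sum(1 for t, s in zip(y_true, scores) if t == 0 and s >= th)
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    """Area under the curve via the trapezoidal rule."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area

# Toy scores: a perfect ranker would put all 1-labels first.
y_true = [1, 1, 0, 1, 0, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.2]
print(auc(roc_points(y_true, scores)))
```

Lowering the threshold moves along the curve: TPR (sensitivity) rises, but so does FPR (1 - specificity) — exactly the trade-off named above. A random classifier gives AUC ≈ 0.5; a perfect one gives 1.0.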