Logistic Regression (LR)

[http://www.youtube.com/results?search_query=Logistic+Regression+artificial+intelligence YouTube search...]
[http://www.google.com/search?q=Logistic+Regression+machine+learning+ML ...Google search]

* [[AI Solver]]
Logistic Regression is another technique borrowed by machine learning from the field of statistics. It is the go-to method for binary classification problems (problems with two class values). Logistic regression is like linear regression in that the goal is to find the values for the coefficients that weight each input variable. Unlike linear regression, the prediction for the output is transformed using a non-linear function called the logistic function. The logistic function looks like a big S and transforms any value into the range 0 to 1. This is useful because we can apply a rule to the output of the logistic function to snap values to 0 or 1 (e.g. IF the output is less than 0.5 THEN predict class 0, otherwise class 1) and so predict a class value. Because of the way the model is learned, the predictions made by logistic regression can also be used as the probability of a given data instance belonging to class 0 or class 1, which is useful on problems where you need to give more rationale for a prediction. Like linear regression, logistic regression works better when you remove attributes that are unrelated to the output variable as well as attributes that are very similar (correlated) to each other. It is a fast model to learn and effective on binary classification problems. A Short Introduction – Logistic Regression Algorithm | EOF (The Ultimate Computing & Technology Blog)
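To make the S-shaped curve and the thresholding rule concrete, here is a minimal Python sketch; the 0.5 cut-off is the usual default and the sample inputs are illustrative choices, not something prescribed by the quoted source.

<syntaxhighlight lang="python">
import math

def logistic(z):
    """Squash any real value into the range (0, 1) -- the big S-shaped curve."""
    return 1.0 / (1.0 + math.exp(-z))

def predict_class(z, threshold=0.5):
    """Snap the logistic output to a class label: 0 below the threshold, 1 otherwise."""
    return 1 if logistic(z) >= threshold else 0

# A strongly negative input maps close to 0, a positive one close to 1.
print(logistic(-4.0))      # ~0.018
print(logistic(2.0))       # ~0.881
print(predict_class(2.0))  # 1
</syntaxhighlight>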
Logistic Regression is used where a discrete output is expected, such as the occurrence of some event (e.g. predicting whether or not it will rain). Logistic regression usually uses a function to squeeze values into a particular range. The sigmoid (logistic function) is one such function; it has an S-shaped curve and is used for binary classification. It converts values into the range 0 to 1, which is interpreted as the probability of the event occurring. y = e^(b0 + b1*x) / (1 + e^(b0 + b1*x)) is a simple logistic regression equation, where b0 and b1 are constants. During training, the values of these coefficients are calculated so that the error between the prediction and the actual value becomes minimal. 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium
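The single-input equation above can be turned into a short sketch. The toy data, learning rate and iteration count below are assumptions made for illustration; the coefficients b0 and b1 are fitted with plain gradient descent so that the error between prediction and actual label shrinks, as the text describes.

<syntaxhighlight lang="python">
import math

def predict(x, b0, b1):
    """y = e^(b0 + b1*x) / (1 + e^(b0 + b1*x)), i.e. the logistic of a line."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

def fit(xs, ys, lr=0.3, epochs=1000):
    """Adjust b0 and b1 to reduce the gap between prediction and actual label."""
    b0, b1 = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            error = y - predict(x, b0, b1)
            # Gradient-descent updates derived from the log-loss.
            b0 += lr * error
            b1 += lr * error * x
    return b0, b1

# Toy binary data: small x -> class 0, large x -> class 1 (illustrative only).
xs = [0.5, 1.0, 1.5, 3.0, 3.5, 4.0]
ys = [0, 0, 0, 1, 1, 1]
b0, b1 = fit(xs, ys)
print(round(predict(1.0, b0, b1)))  # 0
print(round(predict(3.5, b0, b1)))  # 1
</syntaxhighlight>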
Linear Discriminant Analysis (LDA)
This is a branch of the logistic regression model that can be used when more than two classes exist in the output. Statistical properties of the data, such as the mean value of each class separately and the variance summed over all classes, are calculated in this model. Predictions are made by computing a discriminant value for each class and choosing the class with the largest value. To work correctly, this model requires the data to be distributed according to the Gaussian bell curve, so all major outliers should be removed beforehand. This is a great and quite simple model for data classification and for building predictive models. Vladimir Fedak | DZone: Top 10 Most Popular AI Models
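A minimal multi-class sketch of this idea, using scikit-learn's LinearDiscriminantAnalysis on synthetic data; the three Gaussian clusters, their locations and the sample counts are illustrative assumptions, chosen because the model expects roughly Gaussian classes.

<syntaxhighlight lang="python">
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Three Gaussian clusters, one per class, sharing the same spread.
X = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=1.0, size=(50, 2)),
    rng.normal(loc=[5.0, 5.0], scale=1.0, size=(50, 2)),
    rng.normal(loc=[0.0, 5.0], scale=1.0, size=(50, 2)),
])
y = np.repeat([0, 1, 2], 50)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
print(lda.predict([[4.8, 5.2]]))        # most likely class 1
print(lda.predict_proba([[4.8, 5.2]]))  # per-class probabilities
</syntaxhighlight>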