Logistic regression
Logistic regression, also known as logit regression or the logit model, is a mathematical model used in statistics to estimate (guess) the probability of an event occurring given some previous data. Logistic regression works with binary data, where the event either happens (y = 1) or does not happen (y = 0). Given some feature x, the model tries to work out the probability that the event y happens. For example, if y represents whether a sports team wins a match, then y is 1 if they win the match and 0 if they do not. This is known as binomial logistic regression. There is also another form of logistic regression in which the variable y can take more than two values; this form is known as multinomial logistic regression.
Logistic regression uses the logistic function to find a model that fits the data points. The function gives an 'S' shaped curve. Because the curve is restricted between 0 and 1, it is easy to apply when y is binary. Logistic regression can therefore model such events better than linear regression, as it gives the probability that y is 1 for a given value of x. It is used in statistics and machine learning to predict outcomes from previous test data.
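To illustrate (this sketch is not part of the original article, and the use of Python with NumPy is an assumption), the logistic function that produces the 'S' shaped curve can be computed as follows:

```python
import numpy as np

def logistic(x):
    """Logistic (sigmoid) function: maps any real number to a value between 0 and 1."""
    return 1.0 / (1.0 + np.exp(-x))

# A few sample points show the 'S' shape: large negative inputs give values
# near 0, inputs near 0 give values near 0.5, and large positive inputs give
# values near 1.
for x in [-6, -2, 0, 2, 6]:
    print(x, round(float(logistic(x)), 3))
```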
Basics
Logistic regression is an alternative to the simpler linear regression. Linear regression tries to predict the data by finding a linear (straight line) equation to model or predict future data points. Logistic regression does not look at the relationship between the two variables as a straight line. Instead, logistic regression uses the natural logarithm function to find the relationship between the variables, and uses test data to find the coefficients. The function can then predict future results using these coefficients in the logistic equation.
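As a rough sketch of how the coefficients might be found from test data in practice (the choice of scikit-learn and the toy data here are assumptions, not something described in the original article):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy training data, invented for illustration: one feature x and a binary outcome y.
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
y = np.array([0, 0, 0, 1, 1, 1])

model = LogisticRegression()
model.fit(X, y)

# The fitted intercept and coefficient play the roles of a and b in the
# logistic equation described below.
print("a =", model.intercept_[0], "b =", model.coef_[0][0])

# Estimated probability that y = 1 for a new x value.
print("p(y=1 | x=3.5) =", model.predict_proba([[3.5]])[0][1])
```

Note that scikit-learn estimates these coefficients by maximizing a (regularized) likelihood rather than by the least-squares fit used in linear regression.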
Logistic regression uses the concept of odds to calculate the probability. The odds of an event is the ratio of the probability of the event happening to the probability of it not happening. For example, the probability of a sports team winning a certain match might be 0.75. The probability of that team losing would be 1 - 0.75 = 0.25. The odds of that team winning would then be 0.75/0.25 = 3. In other words, the odds of the team winning are 3 to 1.[1]
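Written out as a single calculation, the example above is:

$$\text{odds of winning} = \frac{0.75}{1 - 0.75} = \frac{0.75}{0.25} = 3$$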
The odds can be defined as:

$$\text{odds} = \frac{p}{1-p}$$

where p is the probability that the event happens.
The natural logarithm of the odds is then taken to create a new equation, known as the logit:

$$\text{logit}(p) = \ln\left(\frac{p}{1-p}\right)$$
In logistic regression the logit of the probability is taken to be linear with respect to x, so the logit becomes:

$$\text{logit}(p) = a + bx$$
Using the two equations together then gives the following:

$$\ln\left(\frac{p}{1-p}\right) = a + bx$$
This then leads to the probability:

$$p = \frac{e^{a+bx}}{1 + e^{a+bx}} = \frac{1}{1 + e^{-(a+bx)}}$$
This final equation is the logistic curve for logistic regression. It models the non-linear relationship between x and y with an 'S'-shaped curve for the probability that y = 1, that is, that the event occurs. Here a and b represent the gradients of the logistic function, just like in linear regression. The logit equation can then be expanded to handle multiple gradients, which gives more freedom in how the logistic curve matches the data. The multiplication of two vectors can then be used to model more gradient values, giving the following equation:

$$\text{logit}(p) = \mathbf{w} \cdot \mathbf{x}$$
In this equation $\mathbf{w} = [w_0, w_1, w_2, \dots, w_n]$ represents the gradients for the equation, and the powers of x are given by the vector $\mathbf{x} = [1, x, x^2, \dots, x^n]$. These two vectors give the new logit equation with multiple gradients. The logistic equation can then be changed to show this:

$$p = \frac{1}{1 + e^{-\mathbf{w} \cdot \mathbf{x}}}$$
This is then a more general logistic equation allowing for more gradient values.
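As a brief sketch of this general form (the NumPy code and the example gradient values are assumptions made for illustration, not part of the original article):

```python
import numpy as np

def logistic_curve(w, x):
    """General logistic equation p = 1 / (1 + exp(-w . x_vec)),
    where x_vec = [1, x, x**2, ..., x**n] holds the powers of x."""
    n = len(w) - 1
    x_vec = np.array([x ** k for k in range(n + 1)])  # [1, x, x^2, ..., x^n]
    return 1.0 / (1.0 + np.exp(-np.dot(w, x_vec)))

# Example gradients w0, w1, w2 (invented values for illustration).
w = np.array([-1.0, 0.5, 0.25])
print(logistic_curve(w, 2.0))  # probability that y = 1 when x = 2
```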
References
1. "What is Logistic Regression?". University of Strathclyde. Archived from the original on 2015-05-08. Retrieved 2015-05-01.
2. "Logistic Regression". faculty.cas.usf.edu.