Logistic Regression Prediction Tool
Understanding Logistic Regression & Export
Logistic Regression for Binary Outcomes
Logistic regression is a statistical method used to model the probability of a binary outcome (an event with two possible results, e.g., yes/no, pass/fail, win/lose) based on one or more predictor (independent) variables.
Unlike linear regression, which predicts a continuous value, logistic regression predicts a probability, which always lies between 0 and 1.
The Logistic Function (Sigmoid):
The core of logistic regression is the logistic function (or sigmoid function), which takes any real-valued number and maps it to a value between 0 and 1:
P(Y=1) = 1 / (1 + e^(-z))
Where:
- P(Y=1) is the probability of the outcome being 1.
- e is the base of the natural logarithm (Euler's number, approx. 2.71828).
- z is the linear combination of the input variables and their coefficients (also known as the logit or log-odds):
z = β₀ + β₁X₁ + β₂X₂ + ... + βₙXₙ
- β₀ is the intercept (bias).
- β₁, β₂, ..., βₙ are the coefficients for the predictor variables X₁, X₂, ..., Xₙ. These coefficients are typically learned from data using a training process (not performed by this simplified tool).
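A minimal Python sketch of this calculation (the function name predict_probability and its argument layout are illustrative assumptions, not part of this tool):

    import math

    def predict_probability(intercept, coefficients, features):
        # intercept    -- β₀ from the already trained model
        # coefficients -- [β₁, ..., βₙ]
        # features     -- [X₁, ..., Xₙ] for the new observation
        # z is the linear combination of inputs and coefficients (the log-odds)
        z = intercept + sum(b * x for b, x in zip(coefficients, features))
        # The sigmoid maps the log-odds to a probability between 0 and 1
        return 1.0 / (1.0 + math.exp(-z))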
Interpretation:
- The output P(Y=1) is the estimated probability that the event of interest will occur given the specific values of the predictor variables.
- This probability can then be used to make a classification by comparing it to a threshold (commonly 0.5). If P(Y=1) > threshold, the outcome is classified as 1; otherwise, it's classified as 0.
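A hedged sketch of this thresholding step, reusing the predict_probability helper above (the 0.5 threshold is only the common default and can be adjusted):

    def classify(intercept, coefficients, features, threshold=0.5):
        # Classify as 1 when the predicted probability exceeds the threshold
        p = predict_probability(intercept, coefficients, features)
        return 1 if p > threshold else 0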
Note: This tool requires you to provide the intercept (β₀) and coefficients (βᵢ) from an already trained logistic regression model.
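For example, with hypothetical values β₀ = -1.5, β₁ = 0.8, β₂ = 0.05 taken from a previously trained model and a new observation with X₁ = 2.0 and X₂ = 10.0:

    # z = -1.5 + 0.8*2.0 + 0.05*10.0 = 0.6, so P(Y=1) = 1 / (1 + e^(-0.6)) ≈ 0.646
    p = predict_probability(-1.5, [0.8, 0.05], [2.0, 10.0])
    print(f"P(Y=1) = {p:.3f}")                                   # 0.646
    print("Class:", classify(-1.5, [0.8, 0.05], [2.0, 10.0]))    # 1 (since 0.646 > 0.5)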