Logistic Regression Probability Explorer
Paste in logistic regression coefficients to translate log-odds into an easy-to-read probability for the observation you are analysing. The calculator multiplies each coefficient by its feature value, adds the intercept, and applies the logistic function.
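Under the hood this is a dot product followed by a sigmoid. A minimal Python sketch of the same computation (the function name is illustrative, not part of the tool):

```python
import math

def predicted_probability(intercept, coef_value_pairs):
    """Multiply each coefficient by its feature value, add the intercept,
    then apply the logistic function to the linear predictor."""
    z = intercept + sum(beta * x for beta, x in coef_value_pairs)
    return 1.0 / (1.0 + math.exp(-z))
```

`predicted_probability(0.7, [(1.2, 1)])` reproduces the second example below.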
Examples
- β₀ = -5.3, β₁ = 0.04 with x₁ = 42, β₂ = 0.00005 with x₂ = 68,000 ⇒ Result: 44.52% predicted probability
- β₀ = 0.7, β₁ = 1.2 with x₁ = 1 ⇒ Result: 86.99% predicted probability
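Both results can be checked by hand; a quick sketch of the arithmetic:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# First example: z = -5.3 + 0.04*42 + 0.00005*68000 = -0.22
p1 = sigmoid(-5.3 + 0.04 * 42 + 0.00005 * 68000)
# Second example: z = 0.7 + 1.2*1 = 1.9
p2 = sigmoid(0.7 + 1.2 * 1)
print(round(p1 * 100, 2), round(p2 * 100, 2))  # 44.52 86.99
```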
FAQ
How do I include categorical variables?
Use the encoded numeric dummy (0/1) values and multiply by their coefficients. Add separate coefficient/value pairs for each dummy column.
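For instance, a three-level category encoded with two dummies and a dropped reference level (coefficients and values here are made up purely for illustration):

```python
import math

# Hypothetical coefficients for a three-level "region" variable,
# encoded as two 0/1 dummies with level A as the reference.
beta_region_b, beta_region_c = 0.8, -0.3
is_b, is_c = 1, 0                  # this observation falls in level B

z = -1.0 + beta_region_b * is_b + beta_region_c * is_c
p = 1.0 / (1.0 + math.exp(-z))     # about 0.45 for this observation
```

In the calculator, each dummy column simply becomes one coefficient/value pair.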
Can I convert the probability back to log-odds?
Yes. Take the output probability p and compute ln(p / (1 − p)). That equals the linear predictor before the logistic transform, which is useful for checking manual calculations.
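A sketch of the round trip (valid when p is strictly between 0 and 1):

```python
import math

def logit(p):
    """Recover the linear predictor (log-odds) from a probability."""
    return math.log(p / (1.0 - p))

p = 1.0 / (1.0 + math.exp(-1.9))   # forward pass from the second example
z = logit(p)                       # recovers 1.9 up to floating-point rounding
```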
What about more than three features?
Fold extra predictors into the intercept: add each fixed β × x contribution to β₀ and enter the adjusted value. Alternatively, rerun the calculator with one feature varied at a time to inspect its marginal contribution, which is useful when you want to showcase odds shifts feature by feature.
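One way to do the folding, reusing the first example's figures (variable names are illustrative):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hold one predictor fixed and fold its contribution into an adjusted
# intercept, then sweep the remaining feature to see how p responds.
base = -5.3 + 0.00005 * 68000      # beta0 plus the held-fixed contribution
probs = {age: sigmoid(base + 0.04 * age) for age in (30, 42, 55)}
```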
Additional Information
- Output is the logistic function applied to the linear predictor, expressed as a percentage with two decimals.
- Leave optional coefficient/value pairs blank to ignore unused predictors; they default to zero contribution.
- Probabilities automatically fall between 0% and 100% regardless of the input coefficients.
- For marginal effects, compute the derivative separately: p × (1 − p) × β for the feature of interest.
- Result unit: %
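As a sketch, the marginal-effect formula from the notes above, evaluated at the first example's operating point:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

beta1 = 0.04                     # coefficient of the feature of interest
p = sigmoid(-0.22)               # predicted probability from the first example
marginal = p * (1 - p) * beta1   # dp/dx1 at this observation, per unit of x1
```

This is a local slope: here a one-unit increase in x₁ moves the predicted probability by roughly one percentage point.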