
Logistic Regression: Understanding Confidence and Predicted Probability

Join Brian Chu for a session on logistic regression, exploring the concept of confidence and how a linear combination of features can predict the probability of an outcome.





Presentation Transcript


  1. CS 189 Brian Chu brian.c@berkeley.edu Office Hours: Cory 246, 6-7p Mon. (hackerspace lounge) twitter: @brrrianchu brianchu.com

  2. Agenda • Email me for slides • Questions? • Random / HW • Why logistic regression • Worksheet

  3. Questions • Any grad students? • Thoughts on final project? • Who would be able to make my 12-1pm section? • Lecture / worksheet split section • Questions? Concerns? • Lecture pace / content / coverage?

  4. Features • sklearn HOG, sklearn tf-idf, bag of words, etc.

  5. Terminology • Shrinkage (regularization) • Variable with a hat (ŷ) → estimated/predicted • P(Y | X) ∝ P(X | Y) P(Y) • posterior ∝ likelihood * prior

  6. Why logistic regression • Odds • measure of relative confidence • P = .9998; 4999:1 • P = .9999; 9999:1 • Doubled confidence! • .5001 → .5002; 1.0004:1 → 1.0008:1 • (basically no change in confidence) • “relative increase or decrease of a factor by one unit becomes more pronounced as the factor’s absolute difference increases.”
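The slide's arithmetic can be checked directly. A minimal sketch (the `odds` helper is for illustration, not from the slides):

```python
def odds(p):
    # Odds in favor of an event: P(event) / P(not event)
    return p / (1 - p)

# Near certainty, a tiny change in probability doubles the odds:
high_a = odds(0.9998)   # ~4999:1
high_b = odds(0.9999)   # ~9999:1, roughly double

# Near 0.5, the same size change barely moves the odds:
mid_a = odds(0.5001)    # ~1.0004:1
mid_b = odds(0.5002)    # ~1.0008:1
```

This is exactly the slide's point: odds measure *relative* confidence, so identical absolute changes in probability look very different depending on how close you already are to certainty.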

  7. Log-odds (calculations in base 10) • (0, 1) → (-∞, ∞) • Symmetric: .99 ≈ 2, .01 ≈ -2 • X units of log-odds → same Y% change in confidence • 0.5 → 0.91 ≈ 0 → 1 • .999 → .9999 ≈ 3 → 4 • “Log-odds make it clear that increasing from 99.9% to 99.99% is just as hard as increasing from 50% to 91%” Credit: https://dl.dropboxusercontent.com/u/34547557/log-probability.pdf
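The symmetry and the "equal steps" claims above are easy to verify numerically. A small sketch, using base 10 to match the slide's calculations (standard logistic regression uses the natural log):

```python
import math

def log_odds(p, base=10):
    # Log-odds: log of P(event) / P(not event), here in base 10
    return math.log(p / (1 - p), base)

# Symmetric around p = 0.5 (where log-odds = 0):
log_odds(0.99)    # ~ +2
log_odds(0.01)    # ~ -2

# Equal log-odds steps are equally "hard" gains in confidence:
log_odds(0.5)     # 0
log_odds(0.91)    # ~ 1
log_odds(0.999)   # ~ 3
log_odds(0.9999)  # ~ 4
```

So moving from 50% to 91% and from 99.9% to 99.99% are both roughly one unit of log-odds, even though the probability changes look wildly different in size.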

  8. Logistic Regression • w·x = log[ P(Y=1|x) / (1 − P(Y=1|x)) ] • Intuition: some linear combination of the features tells us the log-odds that Y = 1 • Intuition: some linear combination of the features tells us the “confidence” that Y = 1
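Inverting the slide's equation gives the predicted probability: if w·x is the log-odds, then P(Y=1|x) is the logistic (sigmoid) of w·x. A minimal sketch with made-up weights and features (the values are hypothetical, for illustration only; standard logistic regression uses the natural log here):

```python
import math

def predict_proba(w, x):
    # w·x is the log-odds that Y = 1; the sigmoid inverts the log-odds
    z = sum(wi * xi for wi, xi in zip(w, x))   # linear combination of features
    return 1 / (1 + math.exp(-z))

# Hypothetical weight vector and feature vector:
w = [1.5, -0.5, 0.25]
x = [1.0, 2.0, 4.0]
p = predict_proba(w, x)            # z = 1.5, so p = sigmoid(1.5) ≈ 0.82

# Round trip: recover the log-odds from the predicted probability
z_back = math.log(p / (1 - p))     # ≈ 1.5, the original w·x
```

The round trip makes the two intuitions on the slide concrete: the linear combination *is* the log-odds, and squashing it through the sigmoid turns that confidence into a probability in (0, 1).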
