• MLG 007 Logistic Regression

  • Feb 19 2017
  • Length: 35 mins
  • Podcast

  • Summary

    Full notes at ocdevel.com/mlg/7. See Andrew Ng Week 3 Lecture Notes

    Overview
    • Logistic Function: A sigmoid function that squashes the linear regression output (the logit) into a probability between 0 and 1.
    • Binary Classification: Logistic regression handles binary outcomes, assigning 0 or 1 by thresholding the predicted probability (e.g., at 0.5).
    • Error Function: Uses the negative log likelihood (log loss) to measure how wrong the predicted probabilities are.
    • Gradient Descent: Optimizes the model by adjusting weights to minimize the error function; see the sketch after this list.
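
    A minimal NumPy sketch of how these pieces fit together, assuming a design matrix X whose first column is all ones for the bias term (the learning rate and step count are illustrative, not from the episode):

    ```python
    import numpy as np

    def sigmoid(z):
        # Squash the linear output (the logit) into a probability in (0, 1)
        return 1.0 / (1.0 + np.exp(-z))

    def log_loss(y, y_hat):
        # Negative log likelihood (cross-entropy), averaged over examples
        eps = 1e-12  # guard against log(0)
        y_hat = np.clip(y_hat, eps, 1 - eps)
        return -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

    def train(X, y, lr=0.1, steps=1000):
        # Gradient descent: repeatedly nudge the weights downhill on the error
        w = np.zeros(X.shape[1])
        for _ in range(steps):
            y_hat = sigmoid(X @ w)             # linear function, then sigmoid
            grad = X.T @ (y_hat - y) / len(y)  # gradient of the log loss
            w -= lr * grad
        return w
    ```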
    Classification vs Regression
    • Classification: Predicts a discrete label (e.g., a cat or dog).
    • Regression: Predicts a continuous outcome (e.g., house price).
    Practical Example
    • Train on a labeled dataset of house features to predict whether a house is 'expensive'.
    • Through training and gradient descent, the model learns to categorize houses as 0 (not expensive) or 1 (expensive); a sketch follows below.
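
    As a sketch of this example in scikit-learn (the feature values and labels below are invented toy data, not from the episode):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Toy features: [square footage, number of bedrooms] (made-up values)
    X = np.array([[800, 2], [1200, 3], [2500, 4], [3000, 5],
                  [900, 2], [2800, 4], [1100, 3], [3500, 5]])
    # Labels: 1 = 'expensive', 0 = 'not expensive'
    y = np.array([0, 0, 1, 1, 0, 1, 0, 1])

    # fit() runs a gradient-based optimizer on the log loss internally
    model = LogisticRegression(max_iter=1000)
    model.fit(X, y)

    house = np.array([[2000, 3]])
    print(model.predict_proba(house)[0, 1])  # probability of 'expensive'
    print(model.predict(house)[0])           # thresholded at 0.5 -> 0 or 1
    ```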
    Logistic Regression in Machine Learning
    • Neurons in Neural Networks: Logistic regression units act as building blocks; a neuron in a neural network is essentially a logistic regression, and stacking them yields more complex models.
    • Composable Functions: Demonstrates the compositional nature of machine learning algorithms, where functions are built on other functions (e.g., the logistic function applied on top of a linear function), as illustrated below.
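
    A small sketch of that composition, assuming hand-picked placeholder weights just to show the wiring (in a real network these would be learned):

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def neuron(x, w, b):
        # One 'neuron' is just logistic regression: a linear function fed into a sigmoid
        return sigmoid(np.dot(w, x) + b)

    # Compose: feed the outputs of two neurons into a third neuron
    x = np.array([1.0, 2.0])
    h1 = neuron(x, np.array([0.5, -0.3]), 0.1)
    h2 = neuron(x, np.array([-0.2, 0.8]), -0.4)
    out = neuron(np.array([h1, h2]), np.array([1.0, 1.0]), 0.0)
    print(out)  # a tiny two-layer 'network' built from logistic units
    ```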