Course Information
4.8
3,962 ratings
764 reviews
Specialization

Course 1 of 1

100% online

Start instantly and learn at your own schedule.

Flexible deadlines

Reset deadlines in accordance to your schedule.

Approx. 36 hours to complete

Suggested: 6 weeks of study, 5-8 hours/week...

English

Subtitles: English, Arabic

Skills You'll Gain

Linear Regression, Ridge Regression, Lasso (Statistics), Regression Analysis

Syllabus - What You Will Learn From This Course

1
1 hour to complete

Welcome

Regression is one of the most important and broadly used machine learning and statistics tools out there. It allows you to make predictions from data by learning the relationship between features of your data and some observed, continuous-valued response. Regression is used in a massive number of applications ranging from predicting stock prices to understanding gene regulatory networks.

This introduction to the course provides you with an overview of the topics we will cover and the background knowledge and resources we assume you have....
5 videos (Total 20 min), 3 readings
5 videos
Welcome! 1 min
What is the course about? 3 min
Outlining the first half of the course 5 min
Outlining the second half of the course 5 min
Assumed background 4 min
3 readings
Important Update regarding the Machine Learning Specialization 10 min
Slides presented in this module 10 min
Reading: Software tools you'll need 10 min
3 hours to complete

Simple Linear Regression

Our course starts from the most basic regression model: Just fitting a line to data. This simple model for forming predictions from a single, univariate feature of the data is appropriately called "simple linear regression".

In this module, we describe the high-level regression task and then specialize these concepts to the simple linear regression case. You will learn how to formulate a simple regression model and fit the model to data using both a closed-form solution as well as an iterative optimization algorithm called gradient descent. Based on this fitted function, you will interpret the estimated model parameters and form predictions. You will also analyze the sensitivity of your fit to outlying observations.

You will examine all of these concepts in the context of a case study of predicting house prices from the square feet of the house....
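As a rough, hedged illustration of the two fitting approaches this module describes, the sketch below fits a line both with the closed-form least squares solution and with gradient descent on the residual sum of squares (RSS). It is not the course's assignment code; the data, step size, and tolerance are all made up for illustration.

```python
import numpy as np

# Made-up 1-D data for illustration (think: square feet in 1000s vs. price in $1000s).
x = np.array([1.0, 1.5, 1.8, 2.2, 3.0])
y = np.array([250., 330., 360., 440., 560.])

# Approach 1: closed-form least squares solution for y = w0 + w1 * x.
w1_closed = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
w0_closed = y.mean() - w1_closed * x.mean()

# Approach 2: gradient descent on RSS(w0, w1) = sum_i (w0 + w1 * x_i - y_i)^2.
w0, w1 = 0.0, 0.0
step_size, tolerance = 0.01, 1e-6   # illustrative values, not tuned for real data
for _ in range(100_000):
    errors = (w0 + w1 * x) - y
    grad_w0 = 2 * errors.sum()          # partial derivative of RSS w.r.t. the intercept
    grad_w1 = 2 * (errors * x).sum()    # partial derivative of RSS w.r.t. the slope
    w0 -= step_size * grad_w0
    w1 -= step_size * grad_w1
    if np.hypot(grad_w0, grad_w1) < tolerance:   # stop once the gradient is tiny
        break

print("closed form:      w0 = %.2f, w1 = %.2f" % (w0_closed, w1_closed))
print("gradient descent: w0 = %.2f, w1 = %.2f" % (w0, w1))
```

The two estimates should agree to within the gradient-descent tolerance; picking the step size and stopping rule is exactly the issue the "Choosing stepsize and convergence criteria" video addresses.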
25 videos (Total 122 min), 5 readings, 2 quizzes
25 videos
Regression fundamentals: data & model 8 min
Regression fundamentals: the task 2 min
Regression ML block diagram 4 min
The simple linear regression model 2 min
The cost of using a given line 6 min
Using the fitted line 6 min
Interpreting the fitted line 6 min
Defining our least squares optimization objective 3 min
Finding maxima or minima analytically 7 min
Maximizing a 1d function: a worked example 2 min
Finding the max via hill climbing 6 min
Finding the min via hill descent 3 min
Choosing stepsize and convergence criteria 6 min
Gradients: derivatives in multiple dimensions 5 min
Gradient descent: multidimensional hill descent 6 min
Computing the gradient of RSS 7 min
Approach 1: closed-form solution 5 min
Approach 2: gradient descent 7 min
Comparing the approaches 1 min
Influence of high leverage points: exploring the data 4 min
Influence of high leverage points: removing Center City 7 min
Influence of high leverage points: removing high-end towns 3 min
Asymmetric cost functions 3 min
A brief recap 1 min
5 readings
Slides presented in this module 10 min
Optional reading: worked-out example for closed-form solution 10 min
Optional reading: worked-out example for gradient descent 10 min
Download notebooks to follow along 10 min
Reading: Fitting a simple linear regression model on housing data 10 min
2 practice exercises
Simple Linear Regression 14 min
Fitting a simple linear regression model on housing data 8 min
2
3 hours to complete

Multiple Regression

The next step in moving beyond simple linear regression is to consider "multiple regression", where multiple features of the data are used to form predictions.

More specifically, in this module, you will learn how to build models of more complex relationships between a single variable (e.g., 'square feet') and the observed response (like 'house sales price'). This includes things like fitting a polynomial to your data, or capturing seasonal changes in the response value. You will also learn how to incorporate multiple input variables (e.g., 'square feet', '# bedrooms', '# bathrooms'). You will then be able to describe how all of these models can still be cast within the linear regression framework, but now using multiple "features". Within this multiple regression framework, you will fit models to data, interpret estimated coefficients, and form predictions.

Here, you will also implement a gradient descent algorithm for fitting a multiple regression model....
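To make the matrix notation concrete, here is a hedged NumPy sketch of the two approaches covered in this module: the closed-form solution of the normal equations and gradient descent on RSS. The design matrix H, the feature choices, and all numbers are invented for illustration and are not the course's housing data.

```python
import numpy as np

# Made-up design matrix H: one row per observation, columns = [constant 1,
# square feet in 1000s, number of bedrooms]; y = observed prices in $1000s.
H = np.array([
    [1.0, 1.18, 2.0],
    [1.0, 2.57, 4.0],
    [1.0, 0.77, 3.0],
    [1.0, 1.96, 2.0],
    [1.0, 1.40, 4.0],
])
y = np.array([340., 610., 300., 450., 435.])

# Approach 1: closed-form solution of the normal equations  H^T H w = H^T y.
w_closed = np.linalg.solve(H.T @ H, H.T @ y)

# Approach 2: gradient descent on RSS(w) = (y - Hw)^T (y - Hw),
# whose gradient is -2 H^T (y - Hw).
w = np.zeros(H.shape[1])
step_size, tolerance = 5e-3, 1e-6   # illustrative values only
for _ in range(100_000):
    gradient = -2 * H.T @ (y - H @ w)
    w -= step_size * gradient
    if np.linalg.norm(gradient) < tolerance:
        break

print("closed form:     ", np.round(w_closed, 2))
print("gradient descent:", np.round(w, 2))
```

The gradient used here, -2 H^T (y - Hw), is the RSS gradient derived in this module; the "Feature-by-feature update" video walks through the same update one coefficient at a time.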
19 videos (Total 87 min), 5 readings, 3 quizzes
19 videos
Polynomial regression 3 min
Modeling seasonality 8 min
Where we see seasonality 3 min
Regression with general features of 1 input 2 min
Motivating the use of multiple inputs 4 min
Defining notation 3 min
Regression with features of multiple inputs 3 min
Interpreting the multiple regression fit 7 min
Rewriting the single observation model in vector notation 6 min
Rewriting the model for all observations in matrix notation 4 min
Computing the cost of a D-dimensional curve 9 min
Computing the gradient of RSS 3 min
Approach 1: closed-form solution 3 min
Discussing the closed-form solution 4 min
Approach 2: gradient descent 2 min
Feature-by-feature update 9 min
Algorithmic summary of gradient descent approach 4 min
A brief recap 1 min
5 readings
Slides presented in this module 10 min
Optional reading: review of matrix algebra 10 min
Reading: Exploring different multiple regression models for house price prediction 10 min
Numpy tutorial 10 min
Reading: Implementing gradient descent for multiple regression 10 min
3 practice exercises
Multiple Regression 18 min
Exploring different multiple regression models for house price prediction 16 min
Implementing gradient descent for multiple regression 10 min
3
2 hours to complete

Assessing Performance

Having learned about linear regression models and algorithms for estimating the parameters of such models, you are now ready to assess how well your considered method should perform in predicting new data. You are also ready to select amongst possible models to choose the best-performing one.

This module is all about these important topics of model selection and assessment. You will examine both theoretical and practical aspects of such analyses. You will first explore the concept of measuring the "loss" of your predictions, and use this to define training, test, and generalization error. For these measures of error, you will analyze how they vary with model complexity and how they might be utilized to form a valid assessment of predictive performance. This leads directly to an important conversation about the bias-variance tradeoff, which is fundamental to machine learning. Finally, you will devise a method to first select amongst models and then assess the performance of the selected model.

The concepts described in this module are key to all machine learning problems, well beyond the regression setting addressed in this course....
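As a small illustration of the training-versus-test-error behavior described above, the sketch below fits polynomials of increasing degree to a random training split of made-up data and reports the residual sum of squares on both splits. The data, split sizes, and degrees are invented for illustration, not taken from the course.

```python
import numpy as np

# Made-up data: a noisy sine curve. Polynomial degree plays the role of model
# complexity; RSS on the held-out split plays the role of test error.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 40)
y = np.sin(4 * x) + rng.normal(scale=0.3, size=x.size)

# Random training/test split (30 training points, 10 test points).
idx = rng.permutation(x.size)
train, test = idx[:30], idx[30:]

def rss(coeffs, xs, ys):
    """Residual sum of squares of a fitted polynomial on (xs, ys)."""
    return np.sum((ys - np.polyval(coeffs, xs)) ** 2)

for degree in (1, 3, 5, 9, 15):
    coeffs = np.polyfit(x[train], y[train], degree)   # fit on training data only
    print("degree %2d: train RSS = %7.3f, test RSS = %7.3f"
          % (degree, rss(coeffs, x[train], y[train]), rss(coeffs, x[test], y[test])))
```

Training RSS keeps shrinking as the degree grows, while test RSS typically bottoms out and then climbs once the model overfits, which is the pattern the bias-variance discussion in this module formalizes.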
14 videos (Total 93 min), 2 readings, 2 quizzes
14 videos
What do we mean by "loss"? 4 min
Training error: assessing loss on the training set 7 min
Generalization error: what we really want 8 min
Test error: what we can actually compute 4 min
Defining overfitting 2 min
Training/test split 1 min
Irreducible error and bias 6 min
Variance and the bias-variance tradeoff 6 min
Error vs. amount of data 6 min
Formally defining the 3 sources of error 14 min
Formally deriving why 3 sources of error 20 min
Training/validation/test split for model selection, fitting, and assessment 7 min
A brief recap 1 min
2 readings
Slides presented in this module 10 min
Reading: Exploring the bias-variance tradeoff 10 min
2 practice exercises
Assessing Performance 26 min
Exploring the bias-variance tradeoff 8 min
4
3 hours to complete

Ridge Regression

You have examined how the performance of a model varies with increasing model complexity, and can describe the potential pitfall of complex models becoming overfit to the training data. In this module, you will explore a very simple, but extremely effective technique for automatically coping with this issue. This method is called "ridge regression". You start out with a complex model, but now fit the model in a manner that not only incorporates a measure of fit to the training data, but also a term that biases the solution away from overfitted functions. To this end, you will explore symptoms of overfitted functions and use this to define a quantitative measure to use in your revised optimization objective. You will derive both a closed-form and gradient descent algorithm for fitting the ridge regression objective; these forms are small modifications from the original algorithms you derived for multiple regression. To select the strength of the bias away from overfitting, you will explore a general-purpose method called "cross validation".

You will implement both cross-validation and gradient descent to fit a ridge regression model and select the regularization constant....
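The sketch below shows one way the pieces described above can fit together: a ridge fit via the closed-form solution (leaving the intercept unpenalized, one option from the "How to handle the intercept" video) and k-fold cross validation to choose the L2 penalty. The data, penalty grid, and helper names are made up for illustration; the course's own assignment implements the gradient descent variant instead.

```python
import numpy as np

def ridge_closed_form(H, y, l2_penalty):
    """Solve (H^T H + lambda * I_mod) w = H^T y, leaving the intercept unpenalized."""
    identity = np.eye(H.shape[1])
    identity[0, 0] = 0.0                 # do not shrink the intercept column
    return np.linalg.solve(H.T @ H + l2_penalty * identity, H.T @ y)

def k_fold_cv_error(H, y, l2_penalty, k=5):
    """Average validation RSS of the ridge fit over k folds."""
    n = H.shape[0]
    folds = np.array_split(np.arange(n), k)
    errors = []
    for fold in folds:
        train_mask = np.ones(n, dtype=bool)
        train_mask[fold] = False          # hold out this fold, train on the rest
        w = ridge_closed_form(H[train_mask], y[train_mask], l2_penalty)
        errors.append(np.sum((y[fold] - H[fold] @ w) ** 2))
    return np.mean(errors)

# Made-up degree-2 polynomial data: design matrix columns = [1, x, x^2].
rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 50)
H = np.column_stack([np.ones_like(x), x, x ** 2])
y = 1.0 + 2.0 * x - 3.0 * x ** 2 + rng.normal(scale=0.2, size=x.size)

penalties = [0.0, 1e-3, 1e-1, 1.0, 10.0]
cv_errors = [k_fold_cv_error(H, y, lam) for lam in penalties]
best = penalties[int(np.argmin(cv_errors))]
print("average validation RSS per penalty:", np.round(cv_errors, 3))
print("selected L2 penalty:", best)
print("ridge weights at selected penalty:", np.round(ridge_closed_form(H, y, best), 3))
```

The penalty with the lowest average validation error is the one cross validation selects; refitting on all of the data at that penalty gives the final coefficients.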
16 videos (Total 85 min), 5 readings, 3 quizzes
16 videos
Overfitting demo 7 min
Overfitting for more general multiple regression models 3 min
Balancing fit and magnitude of coefficients 7 min
The resulting ridge objective and its extreme solutions 5 min
How ridge regression balances bias and variance 1 min
Ridge regression demo 9 min
The ridge coefficient path 4 min
Computing the gradient of the ridge objective 5 min
Approach 1: closed-form solution 6 min
Discussing the closed-form solution 5 min
Approach 2: gradient descent 9 min
Selecting tuning parameters via cross validation 3 min
K-fold cross validation 5 min
How to handle the intercept 6 min
A brief recap 1 min
5 readings
Slides presented in this module 10 min
Download the notebook and follow along 10 min
Download the notebook and follow along 10 min
Reading: Observing effects of L2 penalty in polynomial regression 10 min
Reading: Implementing ridge regression via gradient descent 10 min
3 practice exercises
Ridge Regression 18 min
Observing effects of L2 penalty in polynomial regression 14 min
Implementing ridge regression via gradient descent 16 min
4.8
764 reviews
Career direction

41%

started a new career after completing these courses
Career benefit

40%

got a tangible career benefit from this course
Career promotion

17%

got a pay increase or promotion

Top Reviews

By PD, Mar 17th 2016

I really enjoyed all the concepts and implementations I did along this course....except during the Lasso module. I found this module harder than the others but very interesting as well. Great course!

By CM, Jan 27th 2016

I really like the top-down approach of this specialization. The iPython code assignments are very well structured. They are presented in a step-by-step manner while still being challenging and fun!

Instructors


Emily Fox

Amazon Professor of Machine Learning
Statistics

Carlos Guestrin

Amazon Professor of Machine Learning
Computer Science and Engineering

About the University of Washington

Founded in 1861, the University of Washington is one of the oldest state-supported institutions of higher education on the West Coast and is one of the preeminent research universities in the world....

About the Machine Learning Specialization

This Specialization from leading researchers at the University of Washington introduces you to the exciting, high-demand field of Machine Learning. Through a series of practical case studies, you will gain applied experience in major areas of Machine Learning including Prediction, Classification, Clustering, and Information Retrieval. You will learn to analyze large and complex datasets, create systems that adapt and improve over time, and build intelligent applications that can make predictions from data....
Machine Learning

Frequently Asked Questions

  • Once you enroll for a Certificate, you'll have access to all videos, quizzes, and programming assignments (if applicable). Peer review assignments can only be submitted and reviewed once your session has begun. If you choose to explore the course without purchasing, you may not be able to access certain assignments.

  • When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. Your electronic Certificate will be added to your Accomplishments page, from where you can print your Certificate or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

More questions? Visit the Learner Help Center.