Back to Machine Learning: Regression

Learner reviews and feedback for Machine Learning: Regression by the University of Washington

4.8
5,480 ratings

About the Course

Case Study - Predicting Housing Prices

In our first case study, predicting house prices, you will create models that predict a continuous value (price) from input features (square footage, number of bedrooms and bathrooms, ...). This is just one of the many places where regression can be applied. Other applications range from predicting health outcomes in medicine, stock prices in finance, and power usage in high-performance computing, to analyzing which regulators are important for gene expression.

In this course, you will explore regularized linear regression models for the task of prediction and feature selection. You will be able to handle very large sets of features and select between models of various complexity. You will also analyze the impact of aspects of your data -- such as outliers -- on your selected models and predictions. To fit these models, you will implement optimization algorithms that scale to large datasets.

Learning Outcomes: By the end of this course, you will be able to:

- Describe the input and output of a regression model.
- Compare and contrast bias and variance when modeling data.
- Estimate model parameters using optimization algorithms.
- Tune parameters with cross validation.
- Analyze the performance of the model.
- Describe the notion of sparsity and how LASSO leads to sparse solutions.
- Deploy methods to select between models.
- Exploit the model to form predictions.
- Build a regression model to predict prices using a housing dataset.
- Implement these techniques in Python.
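The sparsity idea from the learning outcomes can be sketched in a few lines. This is an illustrative example only: the course itself uses GraphLab Create / turicreate, while the sketch below uses scikit-learn, and the housing-style features and coefficients are synthetic, not the course's dataset.

```python
# Sketch (assumption: scikit-learn stands in for the course's library) of
# fitting a LASSO model and observing the sparse solution it produces.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 200
# Synthetic features: think sqft, bedrooms, bathrooms, plus two
# irrelevant columns that should be pruned by feature selection.
X = rng.normal(size=(n, 5))
# The "price" depends only on the first three features, plus noise.
y = 300 * X[:, 0] + 50 * X[:, 1] + 30 * X[:, 2] + rng.normal(scale=5, size=n)

# Standardize so the L1 penalty treats every feature on the same scale.
X_std = StandardScaler().fit_transform(X)
model = Lasso(alpha=10.0).fit(X_std, y)

# The L1 penalty drives the irrelevant coefficients to exactly zero,
# which is the sparsity property the course attributes to LASSO.
print(model.coef_)
```

Increasing `alpha` zeroes out more coefficients (a simpler model, higher bias); decreasing it recovers ordinary least squares (lower bias, higher variance) -- the trade-off the course tunes with cross validation.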

Top Reviews

KM

May 4, 2020

Excellent professor. Fundamentals and math are provided as well. Very good notebooks for the assignments... it's just the turicreate library that caused some issues; still, the course deserves a 5/5.

PD

March 16, 2016

I really enjoyed all the concepts and implementations I did throughout this course... except during the Lasso module. I found this module harder than the others, but very interesting as well. Great course!

Filter by:

Reviews 776–800 of 984 for Machine Learning: Regression

By emmanuel p

August 4, 2019

great

By 李真

February 19, 2016

Great

By Vaibhav K

September 20, 2020

good

By YASA S K R

August 31, 2020

good

By ANKAN M

August 16, 2020

nice

By Saurabh A

July 19, 2020

good

By Keyur M

June 9, 2020

good

By Vaibhav S

May 16, 2020

Good

By Vansh S

May 10, 2019

nice

By 王曾

September 25, 2017

good

By Birbal

October 13, 2016

good

By Vasthavayi a V

January 28, 2022

Nice

By FW Y

August 16, 2017

Learning by doing

By Ablaikhan N

March 14, 2021

A+

By Ganji R

November 8, 2018

E

By Anunathan G S

August 28, 2018

L

By IDOWU H A

May 20, 2018

I

By Ruchi S

November 8, 2017

e

By Alessandro B

September 27, 2017

e

By Navinkumar

February 17, 2017

g

By ngoduyvu

February 16, 2016

v

By Miguel P

December 2, 2015

I

By manuel S

August 13, 2017

Interesting course. However, I have some mixed feelings:

I have a BS in mathematics, in Mexico (a "licenciatura", which is just between "BS" and "MS")

So, I'd say I have pretty good knowledge of statistics. So, now it is "training" instead of "fitting". It's "overfitting" instead of "multicollinearity". There are some algorithms to remove/add features (Ridge/Lasso), which -as noted- induce bias in the parameters. However, more "formal" methods, such as stepwise regression and Bayesian sequences, are completely ignored.

That would be fine, except that there is not even the slightest attempt to address statistical significance, either for the model or for the individual parameters.

Some other methods (moving averages, Henderson MA, Mahalanobis distances) should also be covered.

So, in summary, an interesting course in the sense that it gives an idea of where the state of the art lies, but a little disappointing in the sense that -except for some new labels for the same tricks, and humongous computing power- there is still nothing new under the sun. Still, worth the time invested.

By Grant V

February 29, 2016

An excellent and quite extensive foray into regression analyses, from single-variable linear regression to nearest-neighbor and kernel regression techniques, including how to use gradient vs. coordinate descent for optimization and proper L1 and L2 regularization methods. The lecture slides have some questionable pedagogical and aesthetic qualities, and they could use some more polish from someone who specializes in teaching presentation methods, but the meat of the course comes from its quizzes and programming assignments, which are well split between practical use (via Graphlab Create and SFrame) and nuts-and-bolts assignments that have you implement these methods from scratch. An extremely valuable course for someone who wants to use these for a data science application but also wants to understand the mathematics and statistics behind them to an appreciable degree.

By William K

August 23, 2017

The only complaint I have is that the programming exercises were not challenging enough. The lecture videos were great to build up an understanding from fundamentals, but the assignments did not fully test the concepts. There were too many exercises that were fill-in-the-blank with most of the code already written. I would appreciate more rigorous programming exercises to facilitate an in-depth understanding of the topics. Moreover, the programming exercises were not applicable to real-world applications because all the data was already neatly presented and the desired outcome was known ahead of time. In order to mimic real-world machine learning problems, we should be required to clean the data and answer open-ended questions that require exploring and understanding the data before developing machine learning models to extract usable information.