
Learner reviews and feedback for Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

40,056 ratings
4,261 reviews


This course will teach you the "magic" of getting deep learning to work well. Rather than the deep learning process being a black box, you will understand what drives performance and be able to get good results more systematically. You will also learn TensorFlow. After 3 weeks, you will:
- Understand industry best practices for building deep learning applications.
- Be able to effectively use the common neural network "tricks", including initialization, L2 and dropout regularization, batch normalization, and gradient checking.
- Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check for their convergence.
- Understand new best practices for the deep learning era of how to set up train/dev/test sets and analyze bias/variance.
- Be able to implement a neural network in TensorFlow.
This is the second course of the Deep Learning Specialization.
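The description above names mini-batch gradient descent, Momentum, RMSprop, and Adam among the optimizers taught. As a rough illustration only (not course material), here is a minimal plain-Python sketch of a single-parameter Adam update, which combines the Momentum (first-moment) and RMSprop (second-moment) ideas; the function name `adam_step` and the toy usage are my own, with commonly cited default hyperparameters.

```python
def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter theta at iteration t (1-based).

    Returns the updated (theta, m, v)."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment (Momentum-style) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment (RMSprop-style) estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction for early iterations
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (v_hat ** 0.5 + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = theta**2, whose gradient is 2*theta.
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.1)
# theta should now be close to the minimum at 0
```

On this toy quadratic the parameter approaches the minimum at 0; in practice, frameworks such as TensorFlow provide Adam as a built-in optimizer.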



Oct 09, 2019

I really enjoyed this course. Many details are given here that are crucial to gain experience, and tips on things that look easy at first sight but are important for a faster ML project implementation.


Dec 24, 2017

Exceptional course. The hyperparameter explanations are excellent; every tip and piece of advice provided helped me so much to build better models. I also really liked the introduction of TensorFlow.

Thanks.


126 - 150 of 4,191 reviews for Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

By Tan T J

Jan 03, 2019

I found the content very well organised and comprehensive. It truly is a good place to quickly build up the foundation as a Machine Learning Engineer.

By Hafize g g

Jan 03, 2019

Excellent course to understand how NNs work without extensively using any deep learning frameworks (except TensorFlow at the end).


Jan 02, 2019

Valuable content :)


Jan 03, 2019

Very good course.

By pratik a

Jan 04, 2019

A great course!!

By Subham K

Jan 05, 2019

It was so awesome. I got to know the minute details, which will certainly help me in making a better deep learning model.

By Deepinder D

Jan 04, 2019

Awesome course!

By Ricardo A

Jan 04, 2019

Very practical tools to apply to training!

By Duy-Hung N

Jan 04, 2019

Thank you, Andrew Ng. You are my idol.

By Shayan A B

Jan 05, 2019

Another well-taught course. Can't wait to complete more in the specialization.

By Jorge B

Jan 04, 2019

Very practical, clear and useful.


Jan 04, 2019

Great learning. Thanks a lot.

By Amir K

Jan 18, 2019

To the point and effective!

By Kirk B

Jan 17, 2019

Andrew Ng is hands down the best teacher in this space. Excellent lectures and a well-run course.

By Raj

Jan 17, 2019

Awesome course.

By Elvis S

Jan 18, 2019

Loved this part of the course... it allowed me to understand more about optimization and regularization tricks such as RMSprop and Dropout.

By Shravan M

Jan 17, 2019

Thank You!!!

By Chen N

Jan 18, 2019

Awesome as always.

By Ayush S

Feb 16, 2019

One of the best courses on hyperparameters, regularization, and more.

By Yosuke N

Feb 15, 2019

Great course to learn the basics of DNNs and TensorFlow; the direction on hyperparameter tuning in particular will be very useful knowledge in my job. It will help us escape from the maze of parameter optimization.

By Md. Y H

Feb 17, 2019

I learned a lot from this course. Thank you so much for offering such a thorough course.

By Myunggwan C

Feb 17, 2019

I'm on the road to improvement with my deep learning skills with the current specialization.

Thank you for providing such a great quality course online.

I also appreciate the mentors who reply to every post in the discussion group.

Keep up the good work!


Feb 14, 2019

Awesome course :)

By Guillermo F

Feb 15, 2019

Excellent course, thank you!

By Erick A

Feb 16, 2019

Great, really enjoyed it.