
Learner reviews and feedback for Sequence Models by deeplearning.ai

4.8
17,761 ratings
1,931 reviews

About the Course

This course will teach you how to build models for natural language, audio, and other sequence data. Thanks to deep learning, sequence algorithms are working far better than just two years ago, and this is enabling numerous exciting applications in speech recognition, music synthesis, chatbots, machine translation, natural language understanding, and many others. You will:

- Understand how to build and train Recurrent Neural Networks (RNNs), and commonly-used variants such as GRUs and LSTMs.
- Be able to apply sequence models to natural language problems, including text synthesis.
- Be able to apply sequence models to audio applications, including speech recognition and music synthesis.

This is the fifth and final course of the Deep Learning Specialization. deeplearning.ai is also partnering with the NVIDIA Deep Learning Institute (DLI) in Course 5, Sequence Models, to provide a programming assignment on Machine Translation with deep learning. You will have the opportunity to build a deep learning project with cutting-edge, industry-relevant content....
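Several of the reviews below mention the Keras programming assignments. For readers who have not seen one, here is a minimal illustrative sketch, not taken from the course materials, of the kind of next-token LSTM model the course covers. It assumes TensorFlow's Keras API, and all sizes and variable names are hypothetical.

import numpy as np
from tensorflow.keras import layers, models

# Hypothetical sizes, chosen only for illustration.
vocab_size = 64   # number of distinct tokens (e.g. characters)
seq_len = 30      # length of each input sequence

# Embed tokens, run them through an LSTM (a GRU would work similarly),
# and predict a distribution over the next token.
model = models.Sequential([
    layers.Embedding(input_dim=vocab_size, output_dim=32),
    layers.LSTM(128),
    layers.Dense(vocab_size, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Random dummy data just to demonstrate the training call; the real
# assignments use text and audio corpora instead.
x = np.random.randint(0, vocab_size, size=(100, seq_len))
y = np.random.randint(0, vocab_size, size=(100,))
model.fit(x, y, epochs=1, verbose=0)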

Top Reviews

AM

Jul 01, 2019

The course is very good and has taught me all the important concepts required to build a sequence model. The assignments are also very neatly and precisely designed for real-world applications.

JY

Oct 30, 2018

The lectures cover lots of SOTA deep learning algorithms and are well designed and easy to understand. The programming assignments are really good at enhancing understanding of the lectures.

Filter by:

1851 - 1875 of 1,910 Reviews for Sequence Models

By Hans E

Mar 03, 2018

Great lectures, great teacher!

I would have given 5 stars but for the problems in the exercises/grader. Some problems that have been known for weeks or even months are not resolved. This causes many wasted hours for hundreds of students. Please fix this and make it a 5-star course.

Many thanks to Andrew Ng and the mentors!

By Tushar B

Jun 12, 2018

Issues with assignments. Took more than 4 hours to figure out the problem.

By Kerry D

May 15, 2018

Too many things introduced in the programming assignments without explanation. Why the high dropout values? Why sometimes one dropout layer, sometimes two? Many things are just given as a formula and not explained in a way that would let me build my own network for my own problem.

By Lyn S

Apr 04, 2019

Quite a few bugs or abstractions in this course; in comparison to the others, the projects feel a bit rushed and pushed together. Andrew's explanations and video lectures were still great, though.

By Bradly M

Apr 17, 2019

The scope of this course was highly relevant to me, but unfortunately many of the class materials were broken or otherwise incorrect, making some ungraded portions of the assignments difficult or impossible to complete. Activity on the discussion boards indicates many people have tripped over this for at least the better part of a year, but no corrections have been made. This was quite frustrating and wasted a good amount of my time.

By Sravan

Apr 19, 2019

Works as a primer. Assignments aren't that great.

By Yun W

Apr 06, 2019

I feel this course is not as carefully designed as the previous courses.

By Romain L

Mar 25, 2019

The course was great, as ever. But some of the programming exercises were very frustrating, oscillating from very easy to very difficult, with some unclear (and sometimes erroneous) instructions. I felt this was in sharp contrast with the previous 4 courses of this specialisation, for which the courses and exercises were perfect.

By Eero L

Jun 07, 2019

The course content and Andrew Ng are great. The submission process for the assignments is absolutely dreadful. You might get 0 points for correct answers or not, depending on... well, I have no idea what. Maybe it's Jupyter Notebook, maybe it's Keras, or maybe it's something else. But you must have good search engine skills, since you will most likely spend a lot of time searching the discussion forum for answers.

By Saumya T

Jun 09, 2019

The code is not explained. Some code files are simply given.

By Gautam D

Jun 17, 2019

To be completely honest, I loved Dr. Andrew's method of teaching. But the assignments just flew over my head because I didn't have enough hours of Keras practice under my belt. I know Keras is there to make things easy, but it's very difficult when you're just trying to pass the grader. The goal of the assignments was fantastic, I mean, generating music, etc. sounds really amazing, but I feel that if some more time were given to make us better at Keras and other technicalities, I would've loved this course much more!

By Gaetan J d B

Jun 17, 2019

Fairly more complex and deeper than the previous courses. Nice exercises, however.

By Farzad E

Jun 19, 2019

I gave 5 stars to other courses in this series but this one doesn't deserve 5 stars. There were many typos and bugs in the assignments compared to the other courses of the specialization.

By karishma d

Jun 20, 2019

Very basic. Would have wanted a much more advanced level.

By Ben R

Jun 27, 2019

The course had some issues with the grader, and there were some instances where the expected output in the assignment didn't match the actual output, despite the answer being correct.

See forums for a range of complaints on the matter.

By Rudolf S

Jul 15, 2019

Quite a lot of bugs in the first week's examples. It took me too much time before I browsed the discussion forums.

By Yevgen S

Jul 22, 2019

I took this course after a long pause after I finished the first 3 courses. I would NOT recommend doing it that way. As a result, I felt rusty on some of the coding practices.

I think the course gives great introductory information on RNNs and LSTMs. The first two weeks of the course are spot on. However, I think the third week is lacking. I had a hard time making a connection between the lecture material and the assignments.

By Yue

Apr 26, 2019

I expected the examples to be done differently.

By Aditya B

May 09, 2019

Really interesting course with fascinating applications. However, in terms of difficulty, it is a significant step up from all the previous courses. A lot of time is spent figuring out the syntax even though the concepts are crystal clear. (Probably because it is a collaboration with NVIDIA.) The programming assignments could be improved.

By Archana A

Oct 07, 2019

This felt like the least prepared and organized course of the series, unfortunately.

By 赵凌乔

Sep 20, 2019

The lectures were great, but the errors in the programming assignments (especially in the formally typeset formulas) really wasted a lot of time and confused me at first.

By Bill F

Sep 17, 2019

Toward the end of the specialization, there seemed to be a noticeable drop in both the quality of instruction and the programming assignments. Course 5 on sequence models was much more "hand wavy" than Course 4 on convolution models. At the end of Course 5, I'm still not sure if I learned anything meaningful other than filling in a few blank lines of code to complete the assignments. There was much less intuition provided about the nature of recurrent nets, and translating that intuition to code was foggy. More attention needs to be paid to how and what the framework is actually doing, not just giving hints for filling in the blanks.

Finally, the grader, especially in week 3, caused me many, many hours of wasted time and frustration chasing phantom problems in the notebook. Coursera and/or deeplearning.ai does not pay much attention, if any, to solving the grader or other systemic problems.

By Ragav S

Sep 18, 2019

Would like to learn a bit about how back-prop works when using attention.

By Jorge B S

Sep 23, 2019

This course gives a nice overview of sequence models. While it is true that I do not have an engineering background, I felt it sometimes got a little too abstract compared to other courses of the specialisation. However, I recommend it.

By Loic R W

Sep 26, 2019

The course was especially interesting in weeks 2 and 3, but the assignments for week 1 were confusing, and sometimes it was hard to follow where the logic was coming from.