Back to Natural Language Processing with Sequence Models

Learner reviews and feedback for Natural Language Processing with Sequence Models

892 ratings
178 reviews


In Course 3 of the Natural Language Processing Specialization, you will: a) Train a neural network with GloVe word embeddings to perform sentiment analysis of tweets, b) Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model, c) Train a recurrent neural network to perform named entity recognition (NER) using LSTMs with linear layers, and d) Use so-called ‘Siamese’ LSTM models to compare questions in a corpus and identify those that are worded differently but have the same meaning. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper...
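The ‘Siamese’ comparison described in (d) can be sketched in plain NumPy. This is an illustrative stand-in, not the course's model: the real encoder is an LSTM over learned embeddings, whereas here a mean-pool of made-up word vectors plays that role.

```python
import numpy as np

def encode(word_vectors):
    """Stand-in encoder: mean-pool word vectors and L2-normalize.
    In the course this role is played by an LSTM over embeddings."""
    v = word_vectors.mean(axis=0)
    return v / np.linalg.norm(v)

rng = np.random.default_rng(0)
shared = rng.normal(size=(3, 8))                 # overlapping content words
q1 = np.vstack([shared, rng.normal(size=(1, 8))])  # question 1
q2 = np.vstack([shared, rng.normal(size=(1, 8))])  # question 2, worded differently

similarity = float(encode(q1) @ encode(q2))      # cosine similarity in [-1, 1]
is_duplicate = similarity > 0.7                  # decision threshold (tunable)
```

Because both encodings are unit vectors, the dot product is exactly the cosine similarity, and duplicate detection reduces to comparing it against a threshold.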






Reviews 26–50 of 187 for Natural Language Processing with Sequence Models

By Julian D


In comparison to the earlier courses, I found rewriting the generator in each assignment a little repetitive. In contrast, I had to understand very little of the triplet loss to complete the task, because the instructions were so specific. Either don't make the loss function this intricate, provide it outright, or let the learners struggle a little; as it is, it feels scripted rather than like figuring things out. Overall, I love the series and would encourage extending it to include some background knowledge. Like 10% more math :-).
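For readers unfamiliar with the triplet loss this reviewer mentions, here is the standard hinge-style formulation in NumPy. This is a generic sketch; the course's exact variant (e.g. any batch-hard negative mining) may differ, and the vectors below are made up.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def triplet_loss(anchor, positive, negative, margin=0.25):
    """Hinge-style triplet loss on L2-normalized encodings:
    the anchor-positive cosine similarity should exceed the
    anchor-negative similarity by at least `margin`."""
    sim_pos = np.sum(anchor * positive, axis=-1)
    sim_neg = np.sum(anchor * negative, axis=-1)
    return np.maximum(0.0, margin - sim_pos + sim_neg)

a = normalize(np.array([[1.0, 0.0]]))
p = normalize(np.array([[0.9, 0.1]]))   # close to the anchor
n = normalize(np.array([[0.0, 1.0]]))   # orthogonal to the anchor
easy = triplet_loss(a, p, n)            # margin satisfied -> zero loss
hard = triplet_loss(a, n, p)            # margin violated -> positive loss
```

Training minimizes this loss so that duplicate questions end up closer in embedding space than non-duplicates, by at least the margin.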

By Li Z


The LSTM explanation is not very clear; I had to revisit some external links. The coding exercises are frustrating: even when they run properly step by step, I hit many glitches when submitting them. I spent more time fixing submission issues than taking the lessons.

By Paul J L I


There were a lot of strange errors: issues with the model, odd language about elementwise vector addition (all vector addition is elementwise), and quizzes with incorrect wording.

By Corine M


The exercises were very easy and did not really build an understanding of NLP.

By Rutvij W


The course could be more in depth.

By Vincent R


Very superficial information about different types of neural networks and their uses. Use of Trax makes it nearly impossible to Google anything helpful - a lot of the assignments just tell you to read the documentation. To finish the assignments, you can basically copy-paste the code you're given to set up Trax neural networks and generators without having any idea what you're doing (because, again, the content doesn't go over it).

For example, Week 1 in this course introduces you to Trax (which can't be run on Windows), covers aspects of object-oriented programming, and talks at a very high level about how to do things in Trax before moving on to a cursory discussion of generators. Then the assignment has you increment counters, set limits in for loops, copy negative-sentiment code from positive-sentiment code that's already completed, and fill in some code that's basically given right above where you write it.
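The generator pattern the assignments keep rebuilding looks roughly like the following. This is an illustrative sketch, not the assignment code; the function name and shuffling details are my own.

```python
import random

def data_generator(examples, batch_size, shuffle=True, seed=0):
    """Yield batches of `batch_size` examples forever,
    reshuffling the order after each full pass (epoch)."""
    rng = random.Random(seed)
    order = list(range(len(examples)))
    index = 0
    while True:
        batch = []
        while len(batch) < batch_size:
            if index == len(order):   # exhausted one epoch
                index = 0
                if shuffle:
                    rng.shuffle(order)
            batch.append(examples[order[index]])
            index += 1
        yield batch

gen = data_generator(list("abcde"), batch_size=2)
first, second, third = next(gen), next(gen), next(gen)
```

The first pass yields examples in the original order, so `first` is `['a', 'b']`; the third batch straddles the epoch boundary, which is exactly the kind of counter/limit bookkeeping the assignments have you fill in.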

Overall, any code you write is usually very simple but often easy to get wrong because of the lack of direction, e.g. model(x,y) vs model((x,y)). The discussion boards are invaluable because the mistake might have been 3 functions earlier that the built-in tests didn't catch. It feels like a good effort was put in all around to establish this course, but it feels like a first draft course that was never updated.

By Julio W


I learned to hate Trax in this course. In the assignments, Trax is used only for toy problems, and then we use a precomputed model. Even fastnp is used in a very slow mode. Why learn NLP in an obscure and really badly documented framework if we end up using precomputed models anyway?

Moreover, when I tried to replicate the results on my own machine (or even in Colab), it did not work, because Trax changes a lot between versions. Again, why use a framework that is not stable in a course?

In my opinion, using a new and obscure framework to teach new concepts, only because you love it, is (at least) antipedagogical.

By DANG M K


This course's material is not as good as the Deep Learning Specialization's. I hope the instructor will write things out and explain the details, not just read from the slides.

By bdug


I was disappointed by this course:

I did not like the use of Trax at all. At our level (students), we need a well-established and well-documented library like Keras or PyTorch to illustrate the concepts. Trax is badly documented, and since installing the Trax version used in the assignments fails in Google Colab (!!), I had a hard time reproducing the assignments there.

Week 3 is just a scam, since it says "go and read this blog" or "watch this video in another specialization". At that moment I simply felt robbed.

By Dimitry I


Very superficial course, just like the rest in the specialization. Quizzes and assignments are a joke. Didn't want to give negative feedback at first, but now that I am doing course #4 in the specialization, which covers material I don't know much about (Attention), I've realized how bad these courses are. Very sad.

By Yuri C


Among the first three courses of the NLP Specialization, this is by far the most exciting. I very much enjoyed all four weeks and the syllabus as a whole! Although many complained about the use of Trax as a DL framework, I must say I found it fantastic to be able to learn it from people involved in its development! This per se is already an A+. I congratulate the team for taking this decision and pushing it forward. Trax is intuitive and *very* elegant. Chapeau to the devs! If it is as performant as they say for large data sets, this is the future, and I am very pleased that the instructors decided to prepare us for it.

Apart from all this positive side, I again saw in this third course some content at the end of the assignments that was not introduced during the corresponding week. For example, the Gumbel sampling at the end of Week 2. This was not a graded exercise, so it is not a major problem. Nevertheless, it comes out of the blue for the student, and it is hard to connect the dots and understand why we are performing this operation at all for the text generation. So there are a couple of loose threads here and there along the course, but it is a minimal problem.

On the other hand, the presentation and discussion of the sequential models in all four weeks are very good, again an optimal balance between mathematical formalism, intuition, and ease of coding. Moreover, the choice of applications in the four weeks is just right: classification, generation, NER, and one-shot learning. All in all, an awesome package, congratulations!
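The Gumbel sampling this reviewer mentions (used for text generation at the end of Week 2) is the Gumbel-max trick: adding Gumbel noise to log-probabilities and taking the argmax draws an exact sample from the categorical distribution. A minimal sketch, with a made-up next-token distribution:

```python
import numpy as np

def gumbel_sample(log_probs, temperature=1.0, rng=None):
    """Sample a token index via the Gumbel-max trick:
    argmax(log_probs / T + Gumbel noise) is an exact
    categorical sample when T = 1."""
    rng = rng or np.random.default_rng()
    u = rng.uniform(low=1e-9, high=1.0, size=log_probs.shape)
    gumbel_noise = -np.log(-np.log(u))
    return int(np.argmax(log_probs / temperature + gumbel_noise))

log_probs = np.log(np.array([0.7, 0.2, 0.1]))  # toy next-token distribution
rng = np.random.default_rng(42)
samples = [gumbel_sample(log_probs, rng=rng) for _ in range(1000)]
```

Over many draws, token 0 (probability 0.7) is sampled far more often than token 2 (probability 0.1), which is why this trick is used instead of plain argmax: it keeps generated text varied rather than deterministic.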

By John Y


This was another great course. I had previously put learning (or reviewing) classes on my to-do list, and I was happy to see them covered here. I enjoyed learning about data manipulation, sampling, the iteration/generation process, and Trax. At first I was a little hesitant about learning a new library like Trax, but I found Łukasz's talk helpful and convincing. I feel Trax does simplify the coding process quite nicely. The homework seemed repetitive, but I found that approach very useful, because I think the intent was to help us get familiar with the coding process and Trax more quickly. I previously completed the DL Specialization and appreciated this course very much. IMO, someone new to DL and RNNs might find this course confusing, because the concepts are not explained in as much depth as in the DL course.

By Nishant M K


Great course! I needed to check some of the discussions in the discussion forums for this one, so the forums are especially useful (for the Week 3 and 4 assignments). As in the first two courses in this specialization, this one also adds most of its value in its 'lab' and assignment Jupyter notebooks. The videos serve as a gentle introduction to the topics, and the concepts from the lectures are reinforced in the assignments/labs. Great introductory course overall!

By James M


Very good course. The only issue I have is that when you have questions about the code, and no one else has had your problem, you seem to be on your own. Sometimes I had purely conceptual coding questions, and you can't ask why the code is doing what it is doing. I did learn a lot, and for the price it is still worth it.

By Dustin Z


A really good and detailed course on sequence models. This was definitely the most challenging course in the specialization so far in part because of the use of the Trax framework. I really enjoyed reading the source code of Trax and understanding how the ML framework was constructed. This was a very unique part of this course.



Excellent course. I would like to learn a little more about how to adjust the classification threshold in the Siamese network, how to tune the parameters of the LSTM network, and how to solve common errors in model performance. This course is a good base to introduce you to sequence models.
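Adjusting the Siamese classification threshold this reviewer asks about is often done with a simple sweep over a held-out set. A hypothetical sketch (the similarities and labels below are made up, and the function name is my own):

```python
import numpy as np

def best_threshold(similarities, labels, candidates=None):
    """Return the decision threshold (and its accuracy) that
    maximizes validation accuracy for the rule
    'duplicate if similarity > threshold'."""
    if candidates is None:
        candidates = np.linspace(-1.0, 1.0, 201)
    accs = [np.mean((similarities > t) == labels) for t in candidates]
    best = int(np.argmax(accs))
    return float(candidates[best]), float(accs[best])

# Made-up validation cosine similarities from a Siamese model.
sims = np.array([0.9, 0.8, 0.75, 0.4, 0.3, 0.1])
labels = np.array([True, True, True, False, False, False])
threshold, accuracy = best_threshold(sims, labels)
```

With cleanly separated classes like this toy set, any threshold between the highest non-duplicate similarity and the lowest duplicate similarity achieves perfect validation accuracy; on real data you would also look at precision/recall trade-offs.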

By Ram N P


It would have been more useful if all the code snippets, labs, and assignments were in TensorFlow or PyTorch. I understand that Trax is easier to use and deploy, but until companies really start using this library, it is of little benefit to learners.

By Sarwar A


Overall it was a great course. A little weak on theory, but I think it was sufficient for practical purposes. The question-duplication detection model was very cool. I enjoyed it a lot.

By Ahammad U


This is the third course of the NLP Specialization. It was a great course and the instructors were amazing. I really learned and understood everything they taught, like LSTMs, GRUs, Siamese networks, etc.

By Nikesh B


Awesome course!! Younes explains all the concepts very nicely :) I enjoyed this course a lot and learned many new things, which I am planning to use in my current project. Thanks a lot, Younes

By Hieu D T


This course is much more difficult than the two previous ones in the series, not because of the way the instructor teaches but because of the material itself. Totally worth taking.

By Sebastián G A


Excellent course on sequence models and how to solve problems in industry and academia with them. Beautifully structured assignments and well-explained lectures, quite enjoyable!

By Christopher R


I wish the neural networks were described in greater detail.

Everything else is really nice, Younes explains very well. Assignments are very nicely prepared.

By Sabita B


Amazing course. The material is very well presented and explained! I really loved the data generator part of the code; it really drilled in its importance!

By Shaida M


Interesting course. I like this specialization very much. I don't understand why one instructor introduces the topic and another instructor explains it.