Great introduction to GANs, focused on the building blocks of neural nets/GANs, plus a few frequently used models. Might need a small update on what's considered "state-of-the-art" in the course.
The course provides good insight into the world of GANs. I really enjoyed Sharon's explanations which were deep and easy to understand. I really recommend this course to anyone interested in AI.
By Si T P
Perfect GANs course. Deep explanations, useful code assignments. Thank you.
By Sayak P
Right mix of theory, practical exercises, and most importantly fun!
By Yongzhong X
Her explanation was clear but deep, I really enjoyed this course.
It would be better to provide more details for week 4's content.
By Sriharsha V
learned a lot about Generative Adversarial Networks
By guillaume s
Very good introductory course
By Buoy R
I enjoyed this course a lot
By Yunfeng C
Basic introduction to GANs
By Đạt Đ T
This course is awesome.
By Phillip L
Good introduction to GANs. Concepts are explained very well; however, this course does not go into depth. But the lecturers provide you with enough references if you want to dive deeper.
The obvious philosophy of DeepLearning.ai is to make machine learning easy and accessible for anyone. This is an honorable goal, but it is also dangerous, because at the end of the course you might believe you have mastered GANs when in truth you did not understand much at all. For instance, in the last week I was a little tired, so instead of trying to understand each line of code, I just did the exercise, and I solved it on the first try without really understanding the code. 95% of the code is already there; you have to write less than 5% yourself. There is not even a final exam with a longer and harder task.
The problem with these easy courses is that the certificates have zero value. If it were just about the certificates, I could do the entire course in one day. No company will take Coursera certificates seriously because of such easy courses. At the least, the course creators should be more honest and declare this a one-week course.
By Daniel Y
This is generally a good course to take. However, compared to the Deep Learning Specialization, there are a few shortcomings. First, the course touches only high-level concepts, which is good in some ways, but I expected more low-level detail as well. Second, Sharon speaks way too fast. Later in the course, I set the speed to 0.75x and it was better. I feel like Andrew spoke a little slowly in the Deep Learning courses, and now I feel slower is better than fast. Lastly, I hope the course makes the slides available so that we can refer to them later. Moreover, some slow handwriting interaction would be good (like Andrew's).
By Bhargav D P
It is good, but it is not up to the mark set by Professor Andrew Ng; a detailed explanation like his is necessary.
Another thing is that the instructor talks too fast. She should pause while explaining things.
By Zaneta S -
the teacher is speaking too fast...
By Jordan M B
Started to audit the course, but all the meaningful content is locked unless you subscribe. Pointless.
By Huynh N H
Very poor support from Mentors. They didn't answer my questions.
By Aladdin P
I've just completed the specialization and my thoughts are that everyone should take it (that is, everyone interested in GANs!). I feel Sharon is a great teacher, and the entire team did a really good job putting together these courses. After completing it, I definitely have a much better view of GANs, their architectures, successes, and limitations, and have a solid background to tackle reading papers and implementing them on my own. Thank you for making this specialization!
With all the positives (which is why I rate it 5/5), there are, in my opinion, things that can be improved. In particular, I think there is too much hand-holding in the labs: out of 100 lines of code, I write maybe 2-3%. Many of these lines don't add much coding value, but I want to feel like I did it! Unfortunately, I am now left guessing whether I have truly mastered the material (and I'm quite sure I haven't, so I will need to re-implement these on my own). Also, since you state that calculus and linear algebra are prerequisites, stick to it! You are trying to be too inclusive, and there are several parts of the courses that I thought were entirely unnecessary, because everyone who has taken calc and linear algebra already has this knowledge. I would prefer you spend this time making other videos that go into more depth, perhaps working through some of the difficult math. Hopefully you try to improve this in future courses by deeplearning.ai.
By Vivek V
The course is an introduction to GANs. You won't build anything particularly powerful but it provides a springboard to the future courses in the series. This course is light on video and instruction and relies more on exercises. This is fine and possibly better since presumably you already understand neural networks well and are just looking to understand how to build GANs. If you do not have a good foundation in deep learning, you should check out Andrew Ng's courses on deep learning first.
The exercises can be easier than they should be, if you will. Sometimes the setup code that they give you "for free" includes critical insights. Make sure to carefully read over and understand the code outside of the few lines that you need to write for each assignment.
Also, if you are interested, I encourage you to read some of the works cited, each of which made important contributions. Focus on those that are most relevant to your work. Personally, I found "Interpreting the Latent Space of GANs for Semantic Face Editing" the most compelling.
By B S C
Good class, it actually touches on mathematical aspects, and the text comprises contemporary work in the field. The programming assignments are well-designed so that, while there are usually only 10-20 lines of code to fill in (at most), one must actually think carefully about what the algorithms are doing, read the pytorch manual, and try some test scripts to make sure tensors are being handled correctly.
This is my first experience with PyTorch, and so far I like working with it better in this context than I have working with TF in other classes and books - pytorch seems to be more of a straightforward extension to the numpy / pandas / sklearn paradigm. The focus is on "what the algorithm does" rather than on "the mechanics of the framework" - although part of that may be due to instructional styles as well.
By Anri L
Sharon Zhou is one of the best teachers I've ever had. She (1) reaches and almost surpasses Andrew's standards, (2) has a great history of being a great learner herself (similar to Andrew), and (3) is overall infectiously enthusiastic. As an incoming sophomore in university, I can say this course is likely the best resource for starting to learn GANs I've come across, and I've scoured the internet.
For students who completed Andrew Ng's deep learning course or had a similar course in university, this course only builds on it, and most difficult concepts are easy to grasp with that background.
I cannot speak for experts and whether this course will benefit you. I'll hazard a prediction and say "yes", or "try the programming assignments and see whether you can breeze through or need to learn some more".
By Martynov E E
I would like to dive into the GAN math more deeply, because in modern research it matters to understand the ideas behind these methods through the math. I saw some great examples of that while completing the cs231n course from Stanford University. I would like to see more of that here! I also think there is too little programming. You would usually expect people with a fairly strong (by that I mean stronger than beginner) background in DL, and probably experience in PyTorch, so in the next courses I would like to see more hands-on coding, because it is so important to do stuff yourself to actually learn it. Thank you guys for the great course!
By Archil K
This is the best course ever.
Before this course I didn't know anything about GANs, but now I can understand them.
In week 1 I learned about the basics of GANs, other generative models, and their components.
In week 2 I learned about deep convolutional GANs (DCGANs), which are nowadays used in many applications.
In week 3 I learned about the Wasserstein loss and its importance to GANs.
In week 4 I learned about controllable generation using GANs, where we can control features of the generated output.
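The week-by-week topics above start from the basic adversarial training loop of weeks 1-2, which can be sketched roughly as follows. This is a hedged illustration, not the course's assignment code; the network sizes, learning rates, and variable names are made-up stand-ins.

```python
# Minimal sketch of one GAN training step: a generator maps noise to samples,
# a discriminator scores real vs. fake, and each is trained with BCE loss.
import torch
from torch import nn

z_dim, data_dim = 16, 8  # illustrative sizes, not from the course

gen = nn.Sequential(nn.Linear(z_dim, 32), nn.ReLU(), nn.Linear(32, data_dim))
disc = nn.Sequential(nn.Linear(data_dim, 32), nn.ReLU(), nn.Linear(32, 1))

g_opt = torch.optim.Adam(gen.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(disc.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real = torch.randn(64, data_dim)   # stand-in for a batch of real data
noise = torch.randn(64, z_dim)

# Discriminator step: real samples labeled 1, fakes labeled 0.
# detach() keeps generator gradients out of the discriminator update.
fake = gen(noise).detach()
d_loss = bce(disc(real), torch.ones(64, 1)) + bce(disc(fake), torch.zeros(64, 1))
d_opt.zero_grad()
d_loss.backward()
d_opt.step()

# Generator step: try to make the discriminator output 1 on fresh fakes.
g_loss = bce(disc(gen(noise)), torch.ones(64, 1))
g_opt.zero_grad()
g_loss.backward()
g_opt.step()
```

Week 3's Wasserstein loss then replaces the BCE objective with a critic trained under a gradient penalty, and week 4 builds on the trained generator's latent space.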
This course was great for me; I learned a lot from it.
I want to thank Prof. Andrew for this amazing course.
By Chen G
While much of the basic GAN theory should be well known to people in the DL community, not many have actually had relevant hands-on experience. Therefore, the exercises in this course are priceless. Not only do they let you avoid A LOT of boilerplate code, they also set your expectations as to what a GAN can ACTUALLY produce (often pretty bad results). Also, the course did a great job providing intuition for some of the more mathematically perplexing sections (e.g., the Wasserstein GAN). Overall, I would probably recommend it to a colleague.
By ARTEM B
Great course. The only thing I don't like is the attempt to increase the level of challenge in the assignments. I know that some people in reviews complain about assignments that are too easy, with just one-line code changes. But in my opinion that's not so bad, if an assignment is well designed with an emphasis on the important things. In the evening after a long day at work, it can be very exhausting to spend time figuring out which parameters I have to pass to torch.norm to make the test pass. I think at least the hints could be made more helpful.
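The torch.norm trap mentioned above typically arises in the WGAN-GP gradient-penalty exercise. Here is a hedged sketch of that penalty term, not the course's actual starter code; the critic architecture and shapes are invented for illustration. The key detail is passing dim=1 so torch.norm returns one gradient norm per sample rather than a single norm over the whole batch.

```python
# Sketch of the WGAN-GP gradient penalty: push the per-sample norm of the
# critic's gradient at interpolated inputs toward 1.
import torch
from torch import nn

critic = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
real = torch.randn(32, 8)   # stand-ins for real and generated batches
fake = torch.randn(32, 8)

eps = torch.rand(32, 1)     # per-sample mixing weights
mixed = (eps * real + (1 - eps) * fake).requires_grad_(True)
scores = critic(mixed)

# Gradient of the critic's scores with respect to the mixed inputs;
# create_graph=True so the penalty itself is differentiable.
grad = torch.autograd.grad(
    outputs=scores,
    inputs=mixed,
    grad_outputs=torch.ones_like(scores),
    create_graph=True,
)[0]

# The gotcha: without dim=1, torch.norm collapses the whole batch to a
# single scalar instead of giving one gradient norm per sample.
grad_norm = torch.norm(grad, p=2, dim=1)   # shape: (32,)
penalty = ((grad_norm - 1) ** 2).mean()
```

In practice this penalty is scaled by a coefficient (commonly written lambda) and added to the critic's Wasserstein loss.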
By Iván H G
Overall, this is a basic course that provides the elements needed to understand how GANs work. It requires knowledge of Python for faster progress, since the activities are 100% programming using PyTorch. The content lacks some mathematical rigor, although it is complemented by the cited articles for more depth on the topics covered. Even so, I think it could be improved by developing the theoretical points further (in the mathematical sense).