
Get to the point! Summarization with pointer-generator networks

Course video 36 of 43

Nearly any task in NLP can be formulated as a sequence-to-sequence task: machine translation, summarization, question answering, and many more. In this module we will learn a general encoder-decoder-attention architecture that can be used to solve them. We will cover machine translation in more detail, and you will see how the attention technique resembles the word alignment task in a traditional pipeline (a minimal sketch of this attention step follows below).
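
The following is a minimal sketch, not material from the course itself, of the attention step in an encoder-decoder model. It assumes the simple dot-product scoring variant (the course may instead present additive attention); the function name `attention` and the toy shapes are illustrative choices. The alignment weights it computes are the soft analogue of the word alignments used in traditional translation pipelines.

```python
import numpy as np

def attention(decoder_state, encoder_states):
    """Dot-product attention over encoder states (illustrative sketch).

    decoder_state:  shape (hidden,)         current decoder hidden state
    encoder_states: shape (src_len, hidden) one hidden state per source token
    Returns the context vector and the alignment weights.
    """
    scores = encoder_states @ decoder_state      # (src_len,) similarity scores
    weights = np.exp(scores - scores.max())      # numerically stable softmax
    weights /= weights.sum()                     # soft alignment over source words
    context = weights @ encoder_states           # (hidden,) weighted summary
    return context, weights

# Toy example: 4 source tokens, hidden size 3 (values are random, for shape only).
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 3))
dec = rng.normal(size=(3,))
context, align = attention(dec, enc)
print("alignment weights:", np.round(align, 3))  # which source words the decoder attends to
```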

National Research University Higher School of Economics
4.6 (366 ratings) | 35K students enrolled
Course 6 of 7 in the Advanced Machine Learning Specialization
