Masked Self Attention

[Video lecture]

Skills You'll Learn

Reformer Models, Neural Machine Translation, Chatterbot, T5+BERT Models, Attention Models

Reviews

4.3 (727 ratings)

  • 5 stars
    67.26%
  • 4 stars
    14.16%
  • 3 stars
    9.07%
  • 2 stars
    5.36%
  • 1 star
    4.12%

SB

November 20, 2020

5 stars

The course is a very comprehensive one and covers all the state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.

AM

October 12, 2020

5 stars

Great course! I really enjoyed the extensive non-graded notebooks on LSH attention. Some content was pretty challenging, but always very rewarding! Thank you!

From the Lesson

Text Summarization

Compare RNNs and other sequential models to the more modern Transformer architecture, then create a tool that generates text summaries.
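The lecture's topic, masked (causal) self-attention, is what lets a Transformer decoder generate a summary one token at a time without attending to future tokens. The sketch below is only an illustration of that idea, not code from the course notebooks: a single-head, NumPy-only version in which the function name masked_self_attention and the toy shapes are assumptions made for this example.

```python
import numpy as np

def masked_self_attention(Q, K, V):
    """Single-head scaled dot-product attention with a causal mask.

    Q, K, V: arrays of shape (seq_len, d_k). Each position may only
    attend to itself and earlier positions; future tokens are hidden.
    (Illustrative sketch, not the course's implementation.)
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (seq_len, seq_len)

    # Causal mask: entries with j > i are set to a large negative value
    # before the softmax, so their attention weights become ~0.
    seq_len = scores.shape[0]
    mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    scores = np.where(mask, -1e9, scores)

    # Row-wise softmax (numerically stabilized).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)

    return weights @ V                            # (seq_len, d_k)

# Toy usage: 4 tokens, 8-dimensional queries/keys/values.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = masked_self_attention(x, x, x)
print(out.shape)  # (4, 8)
```

Each row of the attention weights sums to 1 and is zero above the diagonal, which is exactly the property that allows left-to-right generation of the summary.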

Taught By

  • Younes Bensouda Mourri

    Instructor

  • Łukasz Kaiser

    Instructor

  • Eddy Shyu

    Senior Curriculum Developer
