
Explicit and implicit matrix factorization

Course video 21 of 43

This module is devoted to a higher abstraction for texts: we will learn vectors that represent meanings. First, we will discuss traditional models of distributional semantics. They are based on a very intuitive idea: "you shall know a word by the company it keeps". Second, we will cover modern tools for word and sentence embeddings, such as word2vec, FastText, and StarSpace. Finally, we will discuss how to embed whole documents with topic models and how these models can be used for search and data exploration.
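
To make the "company it keeps" idea concrete, here is a minimal sketch of the explicit route from the video title: count word-context co-occurrences, reweight them with PPMI, and factorize the matrix with truncated SVD. The toy corpus, window size, and vector dimensionality below are illustrative assumptions, not material from the course.

```python
# Explicit matrix factorization for word vectors (illustrative sketch):
# co-occurrence counts -> PPMI reweighting -> truncated SVD.
import numpy as np

# Toy corpus and hyperparameters chosen for illustration only.
corpus = [
    "you shall know a word by the company it keeps".split(),
    "a word is known by the company it keeps".split(),
]
window = 2
dim = 5

vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Count co-occurrences within a symmetric context window.
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                counts[idx[w], idx[sent[j]]] += 1

# Positive pointwise mutual information (PPMI) reweighting.
total = counts.sum()
p_word = counts.sum(axis=1, keepdims=True) / total
p_ctx = counts.sum(axis=0, keepdims=True) / total
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log((counts / total) / (p_word * p_ctx))
ppmi = np.maximum(pmi, 0)
ppmi[~np.isfinite(ppmi)] = 0.0

# Truncated SVD of the PPMI matrix gives dense word vectors.
U, S, _ = np.linalg.svd(ppmi)
word_vectors = U[:, :dim] * S[:dim]

print(word_vectors[idx["word"]])
```

The "implicit" side of the title presumably refers to the observation that word2vec's skip-gram with negative sampling can be viewed as implicitly factorizing a shifted PMI matrix of this same kind (Levy and Goldberg, 2014).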

National Research University Higher School of Economics
4.6 (366 ratings) | 35K students enrolled
Course 6 of 7 in the Advanced Machine Learning Specialization
