Deep Learning NLP: Training GPT-2 from scratch

4.1
49 ratings
Offered by
Coursera Project Network
3,477 already enrolled
In this Guided Project, you will:

Understand the history of GPT-2 and Transformer Architecture Basics

Learn the requirements for a custom training set

Learn how to use functions available in public repositories to fine-tune or train GPT-2 on custom data and generate text

2 hours
Beginner
No download needed
Split-screen video
English
Desktop only

In this 1-hour long project-based course, we will explore Transformer-based Natural Language Processing. Specifically, we will be taking a look at re-training or fine-tuning GPT-2, which is an NLP machine learning model based on the Transformer architecture. We will cover the history and development of GPT-2, review the basics of the Transformer architecture, learn what type of training data to use and how to collect it, and finally perform the fine-tuning process. In the final task, we will discuss use cases and what the future holds for Transformer-based NLP. I would encourage learners to do further research and experimentation with the GPT-2 model, as well as other NLP models! Note: This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
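As a rough illustration of the "custom training data" part of the course (not course material), here is a minimal sketch of preparing a text corpus for language-model training: cleaning the raw text and chunking it into fixed-length blocks. A plain whitespace split stands in for GPT-2's BPE tokenizer; in practice you would tokenize to integer IDs with the tokenizer from a public repository such as Hugging Face `transformers`, but the chunking logic is the same.

```python
import re

def clean_text(raw: str) -> str:
    """Collapse whitespace and strip control characters from scraped text."""
    text = re.sub(r"[\x00-\x08\x0b-\x1f]", " ", raw)  # drop control characters
    return re.sub(r"\s+", " ", text).strip()          # normalize whitespace

def make_training_blocks(corpus: str, block_size: int = 8) -> list[list[str]]:
    """Split the corpus into fixed-length token blocks.

    A whitespace split stands in for GPT-2's BPE tokenizer; real fine-tuning
    scripts work with integer token IDs, but chunk the stream the same way.
    """
    tokens = clean_text(corpus).split()
    # Drop the trailing remainder so every block has exactly block_size tokens.
    n_blocks = len(tokens) // block_size
    return [tokens[i * block_size:(i + 1) * block_size] for i in range(n_blocks)]

corpus = "GPT-2 is a Transformer language model. " * 5  # 35 tokens
blocks = make_training_blocks(corpus, block_size=8)
print(len(blocks), len(blocks[0]))  # → 4 8
```

Fixed-length blocks matter because GPT-2 is trained on sequences of a set context length (1024 tokens for the released models); a ragged final chunk is usually either dropped, as here, or padded.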

Skills you will develop

Artificial Intelligence (AI), TensorFlow, Machine Learning, Natural Language Processing

Learn step by step

In a video that plays in a split screen with your work area, your instructor will guide you through each step:

  1. Introducing GPT-2

  2. Intro to Transformers

  3. Gathering a Dataset

  4. Training and Fine Tuning our Model

  5. Use Cases and the Future
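To make the generation part of step 4 concrete (an illustration, not course material): Transformer language models like GPT-2 generate text autoregressively, repeatedly sampling the next token from the model's predicted distribution, often sharpened or flattened by a temperature parameter. The sketch below runs that sampling loop with a toy bigram table standing in for GPT-2's softmax output; with the `transformers` library, `model.generate()` performs the same loop over real logits.

```python
import random

# Toy next-token distributions standing in for GPT-2's softmax output.
# A real model predicts a distribution over ~50k BPE tokens at every step.
BIGRAM = {
    "<s>":       {"the": 0.6, "a": 0.4},
    "the":       {"model": 0.5, "text": 0.5},
    "a":         {"model": 0.7, "text": 0.3},
    "model":     {"generates": 1.0},
    "text":      {"generates": 1.0},
    "generates": {"text": 0.5, "</s>": 0.5},
}

def sample_next(dist: dict[str, float], temperature: float, rng: random.Random) -> str:
    """Temperature-scale the distribution, renormalize, and sample one token."""
    weights = {tok: p ** (1.0 / temperature) for tok, p in dist.items()}
    total = sum(weights.values())
    r, acc = rng.random() * total, 0.0
    for tok, w in weights.items():
        acc += w
        if r <= acc:
            return tok
    return tok  # numerical edge case: fall back to the last token

def generate(max_tokens: int = 10, temperature: float = 1.0, seed: int = 0) -> list[str]:
    """Autoregressive sampling loop: feed each sampled token back in as context."""
    rng = random.Random(seed)
    out, tok = [], "<s>"
    for _ in range(max_tokens):
        tok = sample_next(BIGRAM[tok], temperature, rng)
        if tok == "</s>":  # stop at the end-of-sequence token
            break
        out.append(tok)
    return out

print(" ".join(generate()))
```

Low temperatures push sampling toward the most likely token (more repetitive, "safer" text); high temperatures flatten the distribution (more diverse, less coherent text). The same knob applies to real GPT-2 sampling.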

How Guided Projects work

Your workspace is a cloud desktop right in your browser; no download is required.

In a split-screen video, your instructor gives you step-by-step guidance.

Reviews

Top reviews from DEEP LEARNING NLP: TRAINING GPT-2 FROM SCRATCH


Frequently Asked Questions

Still have questions? Visit the Learner Help Center.