Generating New Recipes using GPT-2

4.5
18 ratings
Offered by
Coursera Project Network
2,270 already enrolled
In this Guided Project, you will:

Clean and preprocess text data for modeling

Create datasets for large-scale language generation

Fine-tune a large-scale language model on the small, niche task of generating recipes
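The first two objectives above can be sketched in plain Python. The cleaning rules and the 90/10 split ratio below are illustrative assumptions, not the course's exact pipeline:

```python
import random
import re

def clean_recipe(text):
    """Normalize a raw recipe string (illustrative cleaning rules)."""
    text = re.sub(r"[^\x20-\x7E\n]", "", text)  # drop non-ASCII extraction debris
    text = re.sub(r"[ \t]+", " ", text)         # collapse runs of spaces and tabs
    return text.strip()

def make_splits(recipes, val_fraction=0.1, seed=42):
    """Clean, shuffle, and split recipes into training and validation sets."""
    cleaned = [clean_recipe(r) for r in recipes]
    rng = random.Random(seed)
    rng.shuffle(cleaned)
    n_val = max(1, int(len(cleaned) * val_fraction))
    return cleaned[n_val:], cleaned[:n_val]

recipes = ["Pasta\twith  garlic\n", "Tomato soup à la maison", "Pancakes!"] * 10
train, val = make_splits(recipes)
print(len(train), len(val))  # 27 3
```

A fixed seed keeps the split reproducible across runs, which matters when comparing fine-tuning runs against the same validation set.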

90-120 minutes
Intermediate
No download needed
Split-screen video
English
Desktop only

In this 2-hour-long project, you will learn how to preprocess a text dataset comprising recipes and split it into a training and validation set. You will learn how to use the HuggingFace library to fine-tune a deep generative model, and specifically how to train such a model on Google Colab. Finally, you will learn how to use GPT-2 effectively to create realistic and unique recipes from lists of ingredients based on the aforementioned dataset. This project aims to teach you how to fine-tune a large-scale model, and the sheer magnitude of resources it takes for these models to learn. You will also learn about knowledge distillation and its efficacy in use cases such as this one. Note: This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
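The description mentions knowledge distillation, the technique behind compressed models such as DistilGPT-2. A minimal sketch of the distillation loss, where a student matches the teacher's temperature-softened next-token distribution; the toy logits and function names are illustrative, not from the course:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy between the teacher's softened distribution and the
    student's; a higher temperature exposes more of the teacher's
    low-probability 'dark knowledge'."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

# Toy next-token logits over a 4-token vocabulary
teacher = [3.0, 1.0, 0.2, -1.0]
student = [2.5, 1.2, 0.1, -0.8]
print(round(distillation_loss(teacher, student), 4))
```

The loss is minimized (down to the teacher's entropy) when the student's distribution matches the teacher's exactly, which is why it makes a useful training signal for a smaller model.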

Skills you will develop

Python Programming · Machine Learning · Natural Language Processing

Learn step-by-step

In a video that plays in a split-screen with your work area, your instructor will walk you through these steps:

  1. Introduction to the task and demo

  2. Exploratory data analysis and visualizations

  3. Dataset preparation

  4. GPT-2 theory and related machine learning concepts

  5. Model training on Google Colab

  6. Evaluating model performance empirically
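Steps 3 to 5 above might look roughly like the following sketch. The `<INGR>`/`<RECIPE>` control tokens, file paths, and hyperparameters are illustrative assumptions, and the actual training call (which needs the `transformers` library and a GPU runtime such as Google Colab) is kept inside a function so the prompt formatting can be shown on its own:

```python
def format_example(ingredients, instructions):
    """Turn one recipe into a single training string; the <INGR>/<RECIPE>/<END>
    control tokens are an illustrative scheme, not the course's exact one."""
    return "<INGR> " + "; ".join(ingredients) + " <RECIPE> " + instructions + " <END>"

def fine_tune(train_texts, output_dir="gpt2-recipes"):
    """Sketch of fine-tuning GPT-2 with the HuggingFace Trainer
    (requires `transformers` and `torch`; run on a GPU, e.g. Google Colab)."""
    from transformers import (DataCollatorForLanguageModeling, GPT2LMHeadModel,
                              GPT2TokenizerFast, Trainer, TrainingArguments)

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    enc = tokenizer(train_texts, truncation=True, max_length=256)
    dataset = [{"input_ids": ids} for ids in enc["input_ids"]]
    # mlm=False gives the causal-LM objective GPT-2 is trained with
    collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

    args = TrainingArguments(output_dir=output_dir,
                             num_train_epochs=1,
                             per_device_train_batch_size=4)
    Trainer(model=model, args=args,
            train_dataset=dataset, data_collator=collator).train()
    model.save_pretrained(output_dir)

print(format_example(["2 eggs", "flour", "milk"], "Whisk and fry."))
# <INGR> 2 eggs; flour; milk <RECIPE> Whisk and fry. <END>
```

At generation time, prompting the fine-tuned model with only the `<INGR> ... <RECIPE>` prefix and sampling until `<END>` yields a recipe conditioned on the ingredient list.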

How Guided Projects work

Your workspace is a cloud desktop right in your browser; no download required

In a split-screen video, your instructor guides you step-by-step

Reviews

Top reviews from GENERATING NEW RECIPES USING GPT-2


Frequently asked questions

Still have questions? Visit the Learner Help Center.