Use the Apache Spark Structured Streaming API with MongoDB

Offered by
Coursera Project Network
In this Guided Project, you will:

Use the Apache Spark Structured Streaming API with Python to stream data from two different sources

Use the Apache Spark Structured Streaming API with Python to store a dataset in the MongoDB database and join two datasets

2 hours
Intermediate
No download needed
Split-screen video
English
Desktop only

By the end of this project, you will use the Apache Spark Structured Streaming API with Python to stream data from two different sources, store a dataset in the MongoDB database, and join two datasets. The Apache Spark Structured Streaming API continuously streams data from sources such as a file system or a TCP/IP socket. One application is to continuously capture data from weather stations for historical analysis.

Skills you will develop

  • Apache Spark SQL
  • MongoDB
  • Apache Spark Structured Streaming API
  • Apache Spark Schema
  • Apache Spark

Learn step-by-step

In a video that plays in a split-screen alongside your workspace, your instructor will guide you through these steps:

  1. Create a PySpark program in Python to read streaming structured data.

  2. Persist Apache Spark data to MongoDB.

  3. Use Spark SQL (Structured Query Language) to query data.

  4. Use Spark to stream from two different structured data sources.

  5. Use the Spark Structured Streaming API to join two streaming datasets.

How Guided Projects work

Your workspace is a cloud desktop right in your browser; no download required

In a split-screen video, your instructor guides you through each step

Frequently asked questions

Still have questions? Visit the Learner Help Center.