Create a Streaming Data Pipeline with Cloud Pub/Sub

Lab Details:

  1. This lab walks you through creating a streaming data pipeline with Cloud Pub/Sub, Dataflow, and BigQuery.

  2. You will create a Dataflow job.

  3. You will process the published data into a BigQuery table.

  4. Region: us-central1 

  5. Duration: 60 minutes

Note: Do not refresh the page after you click Start Lab; wait a few seconds for the credentials to appear.
If Google asks for verification during login, enter your mobile number and verify with the OTP. Don't worry, this Google account will be deleted after the lab.

What is a Streaming Data Pipeline?

  • Simply put, a streaming pipeline processes data as soon as it arrives, without any manual intervention. In this lab, you will publish data to Cloud Pub/Sub; a Dataflow job then processes each message and writes it to a BigQuery table. The processing flow itself is defined by the Dataflow job.
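As a sketch of that flow (the topic, dataset, and table names here are illustrative assumptions, not the lab's actual values), publishing a JSON message and then checking for it in BigQuery from Cloud Shell looks like:

```shell
# Hypothetical names -- substitute the topic, dataset, and table
# you create in the lab tasks.
# Publish a JSON message; the running Dataflow job picks it up
# automatically and streams it into BigQuery.
gcloud pubsub topics publish my-topic \
    --message '{"name": "sensor-1", "value": 42}'

# A few moments later, the row appears in the table
# without any further manual step.
bq query --use_legacy_sql=false \
    'SELECT * FROM `my_dataset.my_table` LIMIT 10'
```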

Lab Tasks:

  1. Creating a Cloud Pub/Sub Topic.

  2. Creating a BigQuery Dataset and Table.

  3. Creating a Cloud Storage Bucket.

  4. Creating a Dataflow Job.

  5. Publishing the Data.

  6. Checking the Published Data.
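The lab performs these tasks in the Cloud Console, but as a rough equivalent, tasks 1 through 4 can be sketched from Cloud Shell as below. All resource names and the table schema are illustrative assumptions; the Dataflow job uses Google's provided "Pub/Sub Topic to BigQuery" template.

```shell
# Assumes you are authenticated in Cloud Shell with a project selected.
PROJECT_ID=$(gcloud config get-value project)

# 1. Create a Cloud Pub/Sub topic.
gcloud pubsub topics create my-topic

# 2. Create a BigQuery dataset and table (schema is an assumption;
#    it must match the JSON messages you publish).
bq mk --dataset "${PROJECT_ID}:my_dataset"
bq mk --table "${PROJECT_ID}:my_dataset.my_table" name:STRING,value:INTEGER

# 3. Create a Cloud Storage bucket in us-central1 for Dataflow temp files.
gsutil mb -l us-central1 "gs://${PROJECT_ID}-dataflow-temp"

# 4. Run a Dataflow job from the "Pub/Sub Topic to BigQuery" template,
#    reading from the topic and writing to the table created above.
gcloud dataflow jobs run my-streaming-job \
    --region us-central1 \
    --gcs-location gs://dataflow-templates-us-central1/latest/PubSub_to_BigQuery \
    --staging-location "gs://${PROJECT_ID}-dataflow-temp/temp" \
    --parameters "inputTopic=projects/${PROJECT_ID}/topics/my-topic,outputTableSpec=${PROJECT_ID}:my_dataset.my_table"
```

Tasks 5 and 6 are then `gcloud pubsub topics publish` and a `bq query` against the table.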


Open Console