Import data from Storage Bucket to Cloud SQL using Cloud Function

Lab Details:

  1. This lab walks you through GCP Cloud Functions, Cloud Storage buckets, and Cloud SQL.

  2. Make sure you have completed the previous labs on GCP Cloud Functions, Storage Buckets, and Cloud SQL; this lab does not go into the theory of those services but connects the three of them. If you have not completed the previous labs, we recommend completing them first.

  3. In this lab, we will create a bucket and upload a sample CSV file into it (we will provide the file).

  4. We will then create a Cloud SQL instance and database, where a table will be created and records will be inserted using a Cloud Function.

  5. We will then create a Cloud Function which will connect to Cloud SQL and the Storage Bucket.

  6. Duration: 60 minutes

Note: Do not refresh the page after you click Start Lab; wait a few seconds for the credentials to appear.
If Google asks for verification while you are logging in, enter your mobile number and verify with the OTP. Don't worry, this Google account will be deleted after the lab.

Lab Overview:

In this lab, we will create a Google Cloud Storage bucket and upload a sample CSV containing user details. Using a Cloud Function, we will first read the GCS object, then connect to Cloud SQL from the same function, create a table, and insert the data into the database.

Lab Tasks:

  1. Logging in to the GCP Console.

  2. Creating a Cloud Storage bucket.

  3. Creating a Cloud SQL instance.

  4. Creating a SQL database.

  5. Creating a Cloud Function, deploying it, and testing it.

  6. Using Cloud Shell to connect to the DB instance and check whether the records were inserted successfully.

 

Launching Lab Environment:

  1. Launch the lab environment by clicking on . This will create a GCP environment with the resources and roles required for this lab.

  2. Once the lab environment is ready,  will be active. Click on , which will open the Google Sign-In page. (Make sure you are using Incognito mode.)

  3. On your previous tab, click on   and copy your credentials to sign in to the console.

  4. Once logged in, you will be redirected to the GCP Console.

Note: If you have completed one lab, make sure to sign out of the GCP account before starting a new one. If you face any issues, please go through the FAQs and Troubleshooting for Labs.

Creating a Bucket and uploading a Sample file to it:

  1. Click on the hamburger icon on the top left corner 

  2. In the left sidebar, click on Cloud Storage under the Storage section.                                                                          

  3. Click on Create Bucket 

  4. Enter the name of the bucket as whizlabs-bucket. Make sure you use this exact name, or include it in whatever name you choose, for example whizlabs-bucket-martha or test-whizlabs-bucket; this is required to validate the lab. Keep a note of the bucket name, you'll need it later.

  5. Choose the location type as Multi-region and the location as US, then click on Continue.

  6. Make sure you choose the Standard storage class, since this lab only needs the data for a short term.

  7. Uncheck the option that enforces public access prevention on this bucket.

  8. Choose the access control as Uniform so that all objects in the bucket have the same access.

  9. Keep the encryption as the default Google-managed key.                                                                                                      

  10. Click on Create to create the bucket

  11. Once the bucket is created, click on Upload files and upload the sample CSV file provided in the supporting documents, or download it by clicking here.

  12. You can see the file has been uploaded successfully. (An optional gsutil equivalent of this section is sketched below.) Let's move on to creating the Cloud SQL instance.
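
Note: If you prefer the command line, the same bucket setup can be done from Cloud Shell. The commands below are an optional sketch, not required for the lab; they assume the bucket name whizlabs-bucket and a local copy of samplecsv.csv, so adjust the names if yours differ.

    # Optional CLI equivalent: create a Standard, multi-region US bucket with uniform access
    gsutil mb -c standard -l US -b on gs://whizlabs-bucket

    # Upload the sample CSV and confirm it is in the bucket
    gsutil cp samplecsv.csv gs://whizlabs-bucket/
    gsutil ls gs://whizlabs-bucket/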

Creating a Google Cloud SQL Instance:

  1. Click on the hamburger icon on the top left corner 

  2. Click on SQL.

  3. Click on Create Instance to create your database instance.                                      

  4. Choose your database engine as MySQL.                                                                         

  5. Enter an instance ID like import-csv-<your-role>, for example import-csv-admin. Make a note of the instance ID; you'll need it later in the lab.

  6. Set a password for your root user and keep a note of it; you'll need it later in the lab.

  7. Choose database version as MySQL 5.7 

  8. Select the region as us-central1 and the zone as Any. Please note that the user interface might differ for you; if you are asked to pick a specific zone, choose us-central1-a.

  9. Click on show configuration options 

  10. Make sure the machine type is Lightweight with 1 vCPU and 3.75 GB of memory. If the machine type is different, change it by clicking the down arrow on the right side.

  11. Click on the dropdown and set the machine type to Lightweight, 1 vCPU, 3.75 GB. If the UI is different for you, choose the machine type db-n1-standard-1. Kindly do not choose any other option; doing so might block your account.

  12. Change the Storage type by clicking the down arrow on the right side.

  13. You can choose any storage type. In our case, we are choosing SSD with a storage capacity of 10 GB.

  14. Expand the Backups Tab.

  15. There will be two Advanced options sections; expand the first one.

  16. Choose the backup location type as Region and the location as us-central1.

  17. Keep the other options as is.

  18. Click on Create instance to finally create your SQL DB Instance.
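
Note: The same instance can also be created from Cloud Shell. The command below is an optional sketch of the configuration chosen above; it assumes the instance ID import-csv-admin, and <your-root-password> is a placeholder for the root password you noted earlier. The instance takes a few minutes to provision either way.

    # Optional CLI equivalent: MySQL 5.7 instance matching the console settings above
    gcloud sql instances create import-csv-admin \
      --database-version=MYSQL_5_7 \
      --tier=db-n1-standard-1 \
      --region=us-central1 \
      --zone=us-central1-a \
      --storage-type=SSD \
      --storage-size=10GB \
      --root-password=<your-root-password>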

Creating a Google Cloud SQL Database:

  1. Once the DB instance is created, click on Databases.

  2. Click on Create Database. 

  3. Enter the database name as csv_db. Make a note of the database name, you'll need this later in the lab.

  4. Keep the character set as utf8 and Collation as Default collation.                                                                                  

  5. Click on Create.
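
Note: As an optional alternative to the console, the database can also be created with gcloud. This sketch assumes the instance ID import-csv-admin and the database name csv_db used in this lab.

    # Optional CLI equivalent: create the csv_db database with the utf8 character set
    gcloud sql databases create csv_db \
      --instance=import-csv-admin \
      --charset=utf8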

Creating a Cloud Function:

  1. Click on Cloud Functions under the Compute section.

  2. Click on Create Function.                                                                                         

  3. Enter the function name as import_csv. Do not choose any other name; this exact name is required to validate the lab.

  4. Choose the region as us-central1

  5. Choose trigger type as HTTP

  6. Choose authentication as Allow unauthenticated invocations. This is only for learning purposes; in real-world deployments you should always require authentication.

  7. Click on Save to continue

  8. Click on Next

  9. Choose runtime as Python 3.7

  10. Change the entry point to import_csv.

  11. Paste the code below into main.py. Make the necessary changes (connection name, password, database name, and bucket name, marked with #TODO) and review the code.

    import sqlalchemy
    from google.cloud import storage
    import pandas as pd
    import pymysql

    # Cloud SQL connection settings -- replace the TODO values with your own
    connection_name = "<project_id>:us-central1:<cloud_sql_instance_id>"  #TODO
    db_name = "csv_db"  #TODO
    db_user = "root"
    db_password = "WhizSql123"  #TODO

    # MySQL driver and the Cloud SQL unix socket the function connects through
    driver_name = 'mysql+pymysql'
    query_string = {"unix_socket": "/cloudsql/{}".format(connection_name)}

    def import_csv(request):
        # initialize the Cloud Storage client
        client = storage.Client()
        # read the sample CSV from the bucket -- change the bucket name in the URI
        temp = pd.read_csv('gs://<bucket_name>/samplecsv.csv', encoding='utf-8')  #TODO
        # create a connection pool to the Cloud SQL instance over its unix socket
        db = sqlalchemy.create_engine(
            sqlalchemy.engine.url.URL(
                drivername=driver_name,
                username=db_user,
                password=db_password,
                database=db_name,
                query=query_string,
            ),
            pool_size=5,
            max_overflow=2,
            pool_timeout=30,
            pool_recycle=1800
        )
        try:
            with db.connect() as conn:
                # create the table (this fails if user_details already exists; drop it before re-running)
                create = "CREATE TABLE user_details (id INT NOT NULL AUTO_INCREMENT, username VARCHAR(255), email VARCHAR(255), age INT, PRIMARY KEY(id))"
                conn.execute(create)
                # insert one row per record in the CSV
                for row in temp.itertuples():
                    stmt = "INSERT INTO user_details (id,username,email,age) VALUES ({id},'{username}','{email}',{age})"
                    stmt = stmt.format(id=row.id, username=row.username, email=row.email, age=row.age)
                    conn.execute(stmt)
        except Exception as e:
            return 'Error: {}'.format(str(e))
        return 'ok'
  12. Paste the below code in the requirements.txt file.                                                                                                                                        

    google-cloud-storage==1.30.0
    gcsfs==0.6.2
    pandas==1.1.0
    # pin SQLAlchemy to the 1.3.x series so the URL() and raw-string execute() calls in main.py keep working
    sqlalchemy<1.4
    pymysql
  13. Click on Deploy to deploy your function. (An optional gcloud equivalent for deploying and invoking the function is sketched at the end of this section.)

  14. Once your function is successfully deployed, click on import_csv (your function name).

  15. Navigate to the Testing tab.

  16. Click on Test the function.

  17. If you see the output ok, it means the table was created and the records were inserted.

  18. Now click on the Activate Cloud Shell icon in the top right corner.

  19. Enter the command below to connect to your database instance (replace import-csv-admin with your instance ID if you chose a different one):

    gcloud sql connect import-csv-admin --user=root
  20. Click on Authorize to authorize the request 

  21. When prompted, enter the password you set while creating the DB instance.

  22. Enter the command below to select the database:

    use csv_db;
  23. Enter the query below to fetch the records from the table:

    select * from user_details;
  24. You should see the records from the sample CSV file returned by the query.
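
Note: The function you configured, deployed, and tested above can also be deployed and invoked from Cloud Shell. The commands below are an optional sketch; they assume main.py and requirements.txt are in the current directory and that <project_id> is a placeholder for your project ID.

    # Optional CLI equivalent: deploy the HTTP-triggered function with the settings used above
    gcloud functions deploy import_csv \
      --runtime=python37 \
      --trigger-http \
      --allow-unauthenticated \
      --region=us-central1 \
      --entry-point=import_csv \
      --source=.

    # Invoke it over HTTP; a response of ok means the table was created and the rows were inserted
    curl https://us-central1-<project_id>.cloudfunctions.net/import_csv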

Completion and Conclusion:

  1. In this lab, you connected a Cloud Function with Cloud Storage and Cloud SQL.

  2. You inserted records into Cloud SQL using the Cloud Function.

End Lab:

  1. You have successfully completed the lab.

  2. Once you have completed the steps, click on  from your Whizlabs dashboard.


