Getting started with Azure Databricks

Introduction

What is Azure Databricks?

Azure Databricks is the Databricks platform offered as a managed, first-party service on Azure. This managed service allows data scientists, developers, and analysts to create, analyse and visualize data science projects in the cloud.

Databricks is a user-friendly analytics platform built on top of Apache Spark. Databricks acts as a UI layer, a WYSIWYG dashboard where you can create clusters, manage notebooks, write code and analyse data without knowing the internals of the system. Apache Spark is a unified analytics engine for large-scale data processing, and it currently supports popular languages such as Python, Scala, SQL and R.

About the article

If you already know Databricks, then a tutorial is not necessary to get started, because Azure Databricks uses the same management portal as Databricks.

Though there are different strategies possible to create and manage Databricks projects, I have followed the flow below in this article:

image

Screenshots and steps provided in this article are valid as of 20 Sept 2018. Technology advances at a fast pace, and the Azure portal upgrades along with it. So, please be aware of possible portal flow changes when you try out the same steps. I will try to keep this tutorial up to date.

Login to Azure Portal

You need at least a trial account to get started. Visit the Azure home page to get one – https://azure.microsoft.com/

Step 1: Create your first Databricks workspace

The first step in creating a Databricks project is to create a workspace.

Typical steps will be to click “+ Create a resource” → “Analytics” → “Azure Databricks”

image

In the workspace creation wizard, you will have to provide the details below:

A. Workspace name: Give a unique name (retry until you get a green tick mark at the right; a red X mark means someone already took your favourite name).

B. Subscription: Choose an appropriate subscription plan, or leave the default value if you are not sure what this is about.

C. Resource Group: Choose an existing resource group, or create a new one. (Provide a new name if you are not sure what this box is about.)

D. Location: This is the data center region. Select your nearest location in the dropdown, or keep the default.

E. Pricing Tier: Now this is about cost, so be careful. I would prefer to go with the Free trial if I am doing this for learning purposes. You can read more about the pricing tiers here.

image

Click the “Create” button and wait till the workspace gets created. This will take a couple of minutes, and you will get a notification once it is completed.

image

Once the workspace is created, you can go to “All resources” and click your newly created workspace name in the list.

image

The resource dashboard will look like this:

image

Now it is time for some action. Click the “Launch Workspace” button, and you will be directed to a new browser page. You will be signed into the portal automatically.

Your Azure Databricks journey starts here.

image

From here, there are different strategies possible to execute projects. Since a full-fledged project with a meaningful data analysis is out of scope for this article, we will try out a simple example, such as querying a dataset and plotting a bar chart.

Let us load a dataset and visualize it using a notebook.

For this purpose, I have downloaded a dataset from the internet about the literacy rate in India. You may also download a freely available one, or create a dataset of your own. We are not going to do any complex analysis in this example, so a simple dataset is enough. Note that the values in the dataset are not real values. My CSV file looks like this, with the first row as the header row.

image
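If you want to follow along without hunting for a dataset, you can generate a small stand-in file yourself. The column names and values below are hypothetical, made up purely for illustration; your own CSV can have any columns, as long as the first row is the header.

```python
import csv
import io

# A hypothetical stand-in for the dataset used in this article.
# Column names and values are made up; the only requirement mirrored
# here is that the first row is the header row.
sample_csv = """State,Year,LiteracyRate
Kerala,2011,94.0
Goa,2011,88.7
Mizoram,2011,91.3
Bihar,2011,61.8
"""

# Parse it the same way the notebook will: first row as header.
reader = csv.DictReader(io.StringIO(sample_csv))
rows = list(reader)

print(reader.fieldnames)        # header columns
print(rows[0]["State"], rows[0]["LiteracyRate"])
```

Write `sample_csv` out to a `.csv` file and you have something small enough to upload in seconds in the later steps.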

Create Cluster

For storing the data and doing the processing, we need some powerful machines. In Databricks these are called clusters; let us create one in this section.

On the dashboard, click on “New Cluster”.

I am giving the cluster the name “MyFirstCluster”. If you are already familiar with the Azure portal, then you will recognize most of the input parameters on the page. If you are a beginner, I suggest you leave all the other settings as they are and click the “Create Cluster” button to proceed further.

image

It will take some time to complete the cluster creation; for me it took about 5-10 minutes. You can see the status of the cluster creation in the next screen.

image

Once the cluster is created, the status will change from “Pending” to “Running”.

image

Once the cluster is created, we are ready to upload data and create notebooks. Let us upload the data first.

Upload data

Upload the already prepared/downloaded dataset to the newly created cluster.

Go back to the dashboard and click “Upload Data”.

image

In the next screen, give the dataset a name and upload it. In my case I am using a CSV file with some 35 rows. Your dataset can be bigger, but note that depending on its size, the upload and processing can take more time.

image

Once the upload is completed, you can create the notebook.

Create Notebook

A notebook in this context is an interactive, web-based editor which allows data scientists, analysts and developers to write and collaborate on scripts and notes to analyse and visualize data.

You can either create the notebook by clicking “Create Table” in the dashboard screen, or as a continuation of the last step. When you click the “Create Table in Notebook” button in the above screen, the Databricks service will create a sample notebook for you with sufficient sample code, with Python as the default language.

image
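The auto-generated notebook typically contains a cell along these lines. This is a sketch, not the exact code Databricks will generate for you: the file path below is an assumed example (Databricks fills in the path of your own upload under /FileStore/tables/), and the `spark` and `display` names exist only inside the notebook runtime, so this cell will not run outside Databricks.

```python
# File location and type (path is an assumed example;
# Databricks fills in the path of your uploaded file)
file_location = "/FileStore/tables/literacy_rate.csv"
file_type = "csv"

# Read the CSV: our file has a header row, so tell Spark about it
df = (spark.read.format(file_type)     # `spark` is the SparkSession Databricks pre-creates
      .option("header", "true")        # first row is the header
      .option("inferSchema", "true")   # let Spark guess column types
      .load(file_location))

display(df)  # `display` is a Databricks notebook helper that renders the DataFrame as a table
```

If the generated cell reads the header as data (you see column names like `_c0`, `_c1`), flipping the `header` option to `"true"` as above is usually the fix.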

Make sure that you have a cluster attached to this notebook. If you see a “Detached” status at the top-left, choose a cluster by clicking on the “Detached” text. Without a cluster, you cannot run the scripts.

image

Now it is time to test the script. You can see the sample Python scripts in various script boxes on the page. Click the play button at the top-right of any script snippet box:

image

You should be able to see the script getting executed, and the result will be displayed below in the form of a table. If there are errors, you will get proper error messages which you can use to debug the script.

image
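Inside the notebook you would query the table with Spark SQL, which needs the cluster. To show the same query shape without a cluster, here is a local stand-in using Python's built-in sqlite3 module; the table name, columns and values are hypothetical, matching the literacy-rate theme of the example.

```python
import sqlite3

# Hypothetical columns and values standing in for the literacy dataset;
# in the notebook, the same SELECT would run against your uploaded table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE literacy (state TEXT, year INTEGER, rate REAL)")
conn.executemany(
    "INSERT INTO literacy VALUES (?, ?, ?)",
    [("Kerala", 2011, 94.0), ("Goa", 2011, 88.7), ("Bihar", 2011, 61.8)],
)

# The kind of query you might paste into a notebook cell
high = conn.execute(
    "SELECT state, rate FROM literacy WHERE rate > 80 ORDER BY rate DESC"
).fetchall()
print(high)  # → [('Kerala', 94.0), ('Goa', 88.7)]
```

The SQL itself transfers almost verbatim to a notebook cell; only the engine underneath changes.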

Now it is your turn to experiment and learn more.

As a bonus, let us see how to visualize the same data using a bar chart. Click on the bar chart icon. If no chart is auto-generated, click “Plot Options” and play around with the parameters.

image

image

Click “Apply”, and now you can see the bar chart updated in the Notebook.

image

Happy Learning!

References:

  1. https://docs.microsoft.com/en-us/azure/azure-databricks/what-is-azure-databricks
  2. https://databricks.com/
  3. http://spark.apache.org/

100-day Azure Trainings at Orion

I have started an Azure training series under the brand ‘Tech Hour’ at the company I work for – Orion. The plan is to deliver short sessions of 30 to 45 minutes, spanning 100 days. Below are the Azure topics planned:

  • Day 001 Azure: Cloud Computing
  • Day 002 Azure: Portal
  • Day 003 Azure: XaaS
  • Day 004 Azure: Web Apps
  • Day 005 Azure: App Service
  • Day 006 Azure: Virtual Machines
  • Day 007 Azure: Linux
  • Day 008 Azure: Functions – Part I
  • Day 009 Azure: Functions – Part II
  • Day 010 Azure: SQL Database – Part I
  • Day 011 Azure: SQL Database – Part II
  • Day 012 Azure: Storage – Part I
  • Day 013 Azure: Storage – Part II
  • Day 014 Azure: Storage – Part III
  • Day 015 Azure: Logic Apps – Part I
  • Day 016 Azure: Logic Apps – Part II
  • Day 017 Azure: Logic Apps – Part III
  • Day 018 Azure: Service Fabric
  • Day 019 Azure: Cloud Services
  • Day 020 Azure: Cognitive Services – Part I
  • Day 021 Azure: Cognitive Services – Part II
  • Day 022 Azure: Cognitive Services – Part III
  • Day 023 Azure: Cognitive Services – Part IV
  • Day 024 Azure: Key Vault
  • Day 025 Azure: Data and BigData
  • Day 026 Azure: Data Factory – Part I
  • Day 027 Azure: Data Factory – Part II
  • Day 028 Azure: HDInsight
  • Day 029 Azure: API Management
  • Day 030 Azure: Machine Learning – Part I
  • Day 031 Azure: Machine Learning – Part II
  • Day 032 Azure: Application Insights
  • Day 033 Azure: Unstructured Data
  • Day 034 Azure: Cosmos DB
  • Day 035 Azure: Spark for HDInsight
  • Day 036 Azure: Storm for HDInsight
  • Day 037 Azure: R Server for HDInsight
  • Day 038 Azure: IoT Suite
  • Day 039 Azure: Active Directory
  • Day 040 Azure: Mobile Services
  • Day 041 Azure: CDN
  • Day 042 Azure: SQL Data Warehouse
  • Day 043 Azure: Multi-Factor Authentication
  • Day 044 Azure: Media Services
  • Day 045 Azure: Stream Analytics
  • Day 046 Azure: Event Hubs
  • Day 047 Azure: Service Bus
  • Day 048 Azure: Scheduler
  • Day 049 Azure: Notification Hub
  • Day 050 Azure: Automation
  • Day 051 Azure: Log Analytics
  • Day 052 Azure: Redis Cache
  • Day 053 Azure: Search
  • Day 054 Azure: Application Gateway
  • Day 055 Azure: Data Catalog
  • Day 056 Azure: Data Lake Store
  • Day 057 Azure: Data Lake Analytics
  • Day 058 Azure: Bot Service
  • Day 059 Azure: Containers
  • Day 060 Azure: Container Service
  • Day 061 Azure: SQL Server Stretch Database
  • Day 062 Azure: Media Player
  • Day 063 Azure: Monitor
  • Day 064 Azure: Insight & Analytics
  • Day 065 Azure: Analysis Services
  • Day 066 Azure: Time Series Insights
  • Day 067 Azure: MySQL
  • Day 068 Azure: PostgreSQL
  • Day 069 Azure: Virtual Machine Scale Sets
  • Day 070 Azure: Bing integration
  • Day 071 Azure: PowerShell – Part I
  • Day 072 Azure: PowerShell – Part II
  • Day 073 Azure: Cloud Shell
  • Day 074 Azure: Service Bus
  • Day 075 Azure: Serverless computing
  • Day 076 Azure: High Availability
  • Day 077 Azure: DevOps
  • Day 078 Azure: Load Balancer
  • Day 079 Azure: Virtual Private Networks
  • Day 080 Azure: Web App for Containers
  • Day 081 Azure: SQL Elastic database pool
  • Day 082 Azure: DaaS for MongoDB
  • Day 083 Azure: Developer Tools
  • Day 084 Azure: Design of applications
  • Day 085 Azure: ASP.NET Core – Part I
  • Day 086 Azure: ASP.NET Core – Part II
  • Day 087 Azure: Securing your Apps
  • Day 088 Azure: Event Grid
  • Day 089 Azure: Stack
  • Day 090 Azure: Cost Consciousness
  • Day 091 Azure: vs Amazon Web Services
  • Day 092 Azure: with PowerBI
  • Day 093 Azure: Flow vs LA vs Functions
  • Day 094 Azure: Best Practices
  • Day 095 Azure: Resource Manager
  • Day 096 Azure: Deep Learning with CNTK
  • Day 097 Azure: Case Study – Part I
  • Day 098 Azure: Case Study – Part II
  • Day 099 Azure: Case Study – Part III
  • Day 100 Azure: Learning Roadmap

Free video course: Architecting Distributed Cloud Applications

Learn how to architect distributed cloud applications with the correct developer mindset using the right technologies and the best cloud patterns. This technology-agnostic course begins by explaining the benefits of distributed cloud applications with an emphasis on maintaining high-availability and scalability in a cost-effective way, while also dealing with inevitable hardware and software failures.

More Info here.