Creating Azure Data Lake Analytics Using Microsoft Azure Portal

Introduction

In this article, I am going to demonstrate how to create an Azure Data Lake Analytics account and how to use it.
 
Azure Data Lake

Microsoft Azure Data Lake is a highly scalable data storage and analytics service. Because Azure Data Lake is a cloud service, it gives customers a faster and more efficient alternative to deploying and managing big data infrastructure in their own data centers.

Prerequisites
  • An Azure account. You can get a free Azure trial.
  • An Azure Data Lake Analytics account (this account is created as part of the walkthrough below).
To create a Data Lake Analytics account, follow these steps.

Step 1

Sign in to the Azure portal.



Step 2

Click New (+) --> Data + Analytics --> Data Lake Analytics.



Step 3

In the Data Lake Analytics blade, click Data Lake Store and choose "Create New Data Lake Store". Then, give your Data Lake Store account a name and press OK to create the store.



Step 4

After creating the store account, enter a name for the Data Lake Analytics account, select a resource group, and check the "Pin to dashboard" option.

Press the "Create" button to create the Data Lake Analytics project.



Step 5

Once our Data Lake Analytics account is deployed successfully, it is shown in the Notifications tab.



On the dashboard, our Data Lake project tile is shown. Click on it to open the project.



Step 6

This opens the mydatalakenew blade, and our Data Lake Analytics account is created successfully. To get additional options for using Data Lake, install the sample data.



After pressing Sample Scripts, the samples are loaded into your account on the Azure portal.



Step 7

In the Sample Scripts blade, select "Query a TSV file". Then, choose the Data Explorer option to explore our sample data.



To explore the sample data, open Data Explorer. It lists the stored files, where you can choose any file; press Upload to upload a file of your own into the project.



Click the SearchLog.tsv file to explore its log data. From here, you can download, rename, or delete that particular file.
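SearchLog.tsv is a plain tab-separated file, so it is easy to see locally what the portal's Data Explorer is showing you. The short Python sketch below parses rows of that shape; the column names follow the published Azure sample schema and are an assumption here if your copy of the sample differs.

```python
import csv
import io

# A tiny stand-in for a SearchLog.tsv row (the real sample has more
# rows and columns); the values here are illustrative only.
sample = "399266\t2012-02-15 11:53:16\ten-us\thow to make nachos\t73\n"

def parse_searchlog(text):
    """Parse tab-separated search-log rows into dicts.

    Column names follow the standard Azure sample schema
    (UserId, Start, Region, Query, Duration) -- an assumption.
    """
    fields = ["UserId", "Start", "Region", "Query", "Duration"]
    reader = csv.reader(io.StringIO(text), delimiter="\t")
    return [dict(zip(fields, row)) for row in reader]

rows = parse_searchlog(sample)
print(rows[0]["Query"])  # how to make nachos
```

This is only a local illustration of the file format; inside Data Lake Analytics, the same extraction is done by the sample U-SQL script.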



Step 8

Return to the mydatalakenew project and press the "Submit Job" button to submit a new job. We can give the job any name we like. The query for our selected SearchLog.tsv file is filled in by default.
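The default sample query simply extracts every row of SearchLog.tsv and writes it back out to an output file. In plain Python terms, the transformation the job performs amounts to something like this sketch (the pass-through logic mirrors the default sample script; the function name is made up for illustration):

```python
import csv
import io

def run_default_job(input_tsv):
    """Mimic the default sample job: read every row of the input
    TSV and emit it unchanged to the output TSV."""
    reader = csv.reader(io.StringIO(input_tsv), delimiter="\t")
    out = io.StringIO()
    writer = csv.writer(out, delimiter="\t", lineterminator="\n")
    for row in reader:
        writer.writerow(row)  # pass-through, no transformation
    return out.getvalue()

input_data = "399266\ten-us\thow to make nachos\n"
output_data = run_default_job(input_data)
print(output_data == input_data)  # True
```

In the real service, this extract-and-output step runs as a distributed U-SQL job rather than a local loop, but the input/output relationship is the same.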



Step 9

Pressing the Submit Job button opens the job status blade. Just wait until the job state changes to Succeeded.



The job processes the file and saves the result to the output file "SearchLog_output.tsv". To explore the output file, just open it.



Summary

I hope this article helped you understand how to create a Data Lake Store account and a Data Lake Analytics account using the Azure portal.
