Understanding Azure Data Lake Analytics

Introduction

This article covers the following topics:
  • Brief concepts and prerequisites
  • Implementation – Create a Data Lake Analytics account
  • Implementation – Create a U-SQL job and execute it on the Data Lake Analytics account

Brief concepts and prerequisites

What is Azure Data Lake?

  • It is a highly scalable data storage and analytics service.
  • It is hosted in Azure, Microsoft's public cloud, and is largely intended for Big Data storage and analysis.
  • As a cloud computing service, Data Lake gives customers a faster and more efficient alternative to deploying and managing Big Data infrastructure in their own data centers.

Prerequisites

Read the following article to create an Azure account and to get some basic information about Azure before getting started.

Implementation – Create a Data Lake Analytics account

Steps to be followed

  • Open the Azure portal and click on Add >> “Data + Analytics”.

  • Click “Data Lake Analytics”, enter the account name, use the existing resource group, and then click “Create”.

  • Before that, you also need to create a “Data Lake Store” for the account and click the “OK” button.

  • The newly created account, “data-lake060817”, now appears in the Data Lake Analytics section.

Implementation – Create a U-SQL job and execute it on the Data Lake Analytics account

Steps to be followed

  • From the Data Lake Analytics account, click “New Job”.

  • Using this job, write a simple query that stores customers and their amounts and outputs the result to a .csv file (a sketch of such a script is shown after this list), then click “Submit Job”.

  • After submission, see the job details along with the job graph.

  • Click the “Output” tab and then click “customer-amount.csv” to see the file.
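
For reference, below is a minimal sketch of what such a U-SQL script could look like. The customer names, the amounts, and the /output/ folder in the path are illustrative assumptions, not the exact script from this walkthrough; the file name customer-amount.csv matches the one shown on the Output tab.

    // Hypothetical example: build a small rowset of customers and their amounts in memory.
    @customers =
        SELECT *
        FROM (VALUES
                ("Alice", 120.50),
                ("Bob",    75.00),
                ("Carol", 310.25)
             ) AS T(Customer, Amount);

    // Write the rowset to a CSV file in the account's default Data Lake Store;
    // the /output/ folder is an assumption for this sketch.
    OUTPUT @customers
    TO "/output/customer-amount.csv"
    USING Outputters.Csv(outputHeader: true);

Once the job succeeds, the file written by the OUTPUT statement is what the “customer-amount.csv” link on the Output tab points to.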

Happy learning!