
Seamless Data Orchestration in Microsoft Fabric Using Shared Access Signature (SAS)

Imagine this: your organization stores terabytes of data across multiple Azure Data Lake Storage (ADLS) containers, and you need to automate movement, transformation, and integration of that data securely—without exposing credentials. Enter Shared Access Signature (SAS), the unsung hero of secure, fine-grained access in the cloud, now empowering modern data orchestration within Microsoft Fabric Data Pipelines.

With the rise of Fabric’s unified data platform, data engineers and analysts can now build end-to-end data pipelines that integrate seamlessly with OneLake, Azure Blob, or ADLS Gen2—while maintaining strict governance and access control through SAS tokens.

In this article, we’ll break down how to use Shared Access Signatures (SAS) effectively to orchestrate your data using Microsoft Fabric Data Pipeline—from setup to execution.

What is a Shared Access Signature (SAS)?

A Shared Access Signature is a signed, time-limited token, appended to a resource URL, that grants controlled access to Azure Storage resources such as containers, blobs, or files without exposing your primary account keys.

For example, you can use a SAS token to allow your Fabric Data Pipeline to read from a storage account for 2 hours, or write processed data back into a different container.

This means:

  • No need to store or rotate storage account keys.

  • Fine-grained access control (read, write, delete, list, etc.).

  • Expiry-based access revocation.
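Returning to the two-hour example above, here is a minimal sketch of generating such a token with the azure-storage-blob Python SDK; the account name, key, and container name are placeholders, and the permissions are scoped to read and list only.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

# Placeholder values -- substitute your own storage account details.
ACCOUNT_NAME = "mystorageaccount"
ACCOUNT_KEY = "<storage-account-key>"
CONTAINER_NAME = "sales"

# Read + list access to a single container, expiring after two hours.
sas_token = generate_container_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER_NAME,
    account_key=ACCOUNT_KEY,
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=2),
)

# The token is appended as a query string to the resource URL.
print(f"https://{ACCOUNT_NAME}.dfs.core.windows.net/{CONTAINER_NAME}?{sas_token}")
```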

Why SAS for Fabric Data Pipeline?

Fabric Data Pipelines enable low-code orchestration of ingestion, transformation, and movement of data across cloud environments. But when dealing with external storage systems like Azure Data Lake, using SAS tokens offers these advantages:

  • 🔒 Security: SAS tokens are temporary and scoped.

  • ⚙️ Automation: Easily injected into pipeline parameters (see the expression sketch after this list).

  • 🔄 Scalability: Connect multiple data sources securely.

  • 🧩 Flexibility: Supports both source and destination authentication.
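For example, rather than hard-coding the token in the connection, it can be supplied as a pipeline parameter and referenced with the pipeline expression language; sasToken is a hypothetical parameter name here.

```
@pipeline().parameters.sasToken
```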

Step-by-Step: Using SAS in Microsoft Fabric Data Pipeline

For this demonstration, I have provisioned a Fabric Lakehouse, which serves as the destination for landing the data, and a Fabric Data Pipeline, which handles the data integration, as seen below.

Figure 1

In addition, I have three fact_work_event CSV files, covering 2020 to 2022, in the Azure Data Lake Storage container, as seen below.

Figure 2

Next, I launched the Fabric Data Pipeline canvas and added a Copy data activity, which I named Copy data using SAS, as can be seen in the General tab.

Figure 3

I then switched to the Source tab and searched for the Azure Data Lake Storage Gen2 connector, as seen below.

Figure 4

After selecting the ADLS Gen2 source, I needed to provide the URL that points to the ADLS Gen2 account. Back in the Azure portal, I located the storage account's endpoints, copied the primary Data Lake Storage endpoint, which is the one I needed, and pasted it into the Fabric Data Pipeline source URL box.

Figure 5
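For reference, the primary Data Lake Storage endpoint follows this pattern, with the storage account name as a placeholder:

```
https://<storage-account-name>.dfs.core.windows.net/
```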

Since the focus of this article is learning how to use a Shared Access Signature to authenticate to ADLS Gen2, I navigated back to the storage account in the Azure portal and searched for Shared access signature. I opened it, allowed the SAS to be used for the Blob and File services, and then clicked Generate SAS and connection string.

Figure 6

Next, I copied the SAS token as seen below.

Figure 7
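The copied token is simply a URL query string of signed key/value pairs. A representative (non-functional) account SAS looks like the following, where sv is the service version, ss the services (b = Blob, f = File), srt the resource types, sp the permissions, st/se the start and expiry times, and sig the signature:

```
sv=2022-11-02&ss=bf&srt=sco&sp=rl&st=2024-06-01T08:00:00Z&se=2024-06-01T10:00:00Z&spr=https&sig=<signature>
```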

Then, I navigated back to the Fabric Data Pipeline connection settings and pasted the SAS token, as seen below.

Figure 8
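Before running the pipeline, the token can also be sanity-checked outside Fabric, for example by listing the container contents with the azure-storage-blob SDK; the account name, container name, and token below are placeholders.

```python
from azure.storage.blob import ContainerClient

# Placeholder account/container; the credential is the SAS token copied from the portal.
container = ContainerClient(
    account_url="https://mystorageaccount.blob.core.windows.net",
    container_name="sales",
    credential="<sas-token>",
)

# If the token grants read/list access, this prints the CSV blobs in the container.
for blob in container.list_blobs():
    print(blob.name)
```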

Then, I typed sales in the file path, which is the name of the container that holds the three CSV files. I also turned off the Recursively option, as it is not applicable here, and selected DelimitedText as the file format, as seen below.

Figure 9

Next, I configured the Lakehouse destination, as seen below.

Figure 10

Then, I ran the pipeline, which executed successfully as seen below.

Figure 11

Finally, I checked the data in the Lakehouse, which was integrated perfectly.

Figure 12
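As an optional check, the landed data can also be queried from a notebook attached to the Lakehouse; the table name below is an assumption based on the source file names, and it presumes the copy activity landed the files as a Lakehouse table.

```python
# Runs inside a Microsoft Fabric notebook, where the `spark` session is predefined.
df = spark.read.table("fact_work_event")

# Inspect the schema and confirm that rows from all three CSV files arrived.
df.printSchema()
print(f"Row count: {df.count()}")
```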