Azure: Managed Identity Integration with Storage Accounts

Prashanth Kumar
5 min read · Jan 10, 2025


Problem Statement

Managing and processing data in Azure Storage Containers can be a complex task when dealing with secure and automated operations. Specifically:

File Listing:

Automating the listing of files in a source container for processing or auditing.

File Upload:

Uploading new or processed files to a destination container in a secure and scalable way.

Secure Authentication:

Ensuring sensitive credentials like keys or secrets are not exposed in code while maintaining seamless integration with Azure services.

The challenge lies in creating a solution that is secure, efficient, and utilizes Azure’s robust features, such as Managed Identity and Role-Based Access Control (RBAC).

Most importantly, we don’t want to use the traditional approach of authenticating with the Azure Storage account key.

Solution Overview

To address the problem, I have built a Python-based automation solution that can run independently or be used as an Azure Function. It:

Generates Secure Access Tokens: Uses Managed Identity to obtain SAS (Shared Access Signature) tokens for source and destination containers.
Lists Files: Retrieves and prints the list of blobs (files) in a specific Azure Storage container.
Uploads Files: Automates the creation and upload of new file(s) to a designated container.
The solution leverages Azure services, including Azure Storage, DefaultAzureCredential for Managed Identity, and the Azure SDK for Python.

Key Components and Steps

1. Setting Up Managed Identity

Managed Identity allows the application to authenticate with Azure services without storing any secrets or credentials in the code. Here’s how we set it up:

Enable Managed Identity for the Azure environment (e.g., Virtual Machine, Azure Function, etc.).
Grant the Managed Identity the appropriate Role Assignments:
Storage Blob Data Reader: For listing files in the source container.
Storage Blob Data Contributor: For uploading files to the destination container.

2. Validating Managed Identity

To test and validate the Managed Identity setup, I used Azure Cloud Shell.

Log in using the Managed Identity; you can get the client ID from your Managed Identity resource:

az login --identity --username "465a6036-xxxx-xxx-xxxx-ae94d85f8489"

To assign roles to your Managed Identity, you can use the command below (a Python-based validation sketch follows it):

az role assignment create \
--assignee "465a6036-xxxx-xxxx-xxxx-ae94d85f8489" \
--role "Storage Blob Data Reader" \
--scope "/subscriptions/935d32fb-xxxx-xxxx-xxxx-xxxxx/resourceGroups/myresourcegroup/providers/Microsoft.Storage/storageAccounts/prademostgvs001"

3. Generating SAS Tokens Securely

The application uses Managed Identity to authenticate and generate SAS tokens dynamically. The generate_sas_token() function:

Acquires an Azure AD token using DefaultAzureCredential.
Calls Azure’s REST API to generate a service-level SAS token with required permissions (e.g., read, list, write).

4. Listing Files in Source Container

The function list_files_in_container():

Connects to the Azure Storage account using the SAS token.
Lists blobs in the source container using the Azure Blob SDK.

5. Uploading Files to Destination Container

The function upload_file_to_container():

Connects to the destination container using the SAS token.
Uploads a .txt file with sample content to the specified container.

6. Python Implementation

Below is the code I used; it requires the azure-identity, azure-storage-blob, and requests packages.

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient
import requests
from datetime import datetime, timedelta

# Azure configuration -- replace the placeholders with your own values
SUBSCRIPTION_ID = "your_subscription_id"
RESOURCE_GROUP = "your_resource_group"
STORAGE_ACCOUNT_NAME = "your_storage_account"
SOURCE_CONTAINER_NAME = "source_container"
DESTINATION_CONTAINER_NAME = "destination_container"
NEW_FILE_NAME = "example_file.txt"
NEW_FILE_CONTENT = "Sample content for upload"

def generate_sas_token(container_name, permissions):
    """Generates a service SAS token using Managed Identity via the ARM listServiceSas API."""
    credential = DefaultAzureCredential()
    # ARM call, so the token must be scoped to management.azure.com (see the scope discussion below)
    token = credential.get_token("https://management.azure.com/.default").token
    expiry_time = (datetime.utcnow() + timedelta(hours=1)).strftime("%Y-%m-%dT%H:%M:%SZ")
    sas_url = (
        f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}/"
        f"resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.Storage/"
        f"storageAccounts/{STORAGE_ACCOUNT_NAME}/listServiceSas?api-version=2021-09-01"
    )
    body = {
        "signedVersion": "2020-02-10",
        "canonicalizedResource": f"/blob/{STORAGE_ACCOUNT_NAME}/{container_name}",
        "signedResource": "c",  # container-scoped SAS
        "signedPermission": permissions,
        "signedProtocol": "https",
        "signedExpiry": expiry_time,
    }
    response = requests.post(sas_url, headers={"Authorization": f"Bearer {token}"}, json=body)
    if response.status_code == 200:
        return response.json()["serviceSasToken"]
    raise Exception(f"Failed to generate SAS token: {response.text}")

def list_files_in_container(sas_token):
    """Lists all files in the source container."""
    client = BlobServiceClient(f"https://{STORAGE_ACCOUNT_NAME}.blob.core.windows.net", credential=sas_token)
    container_client = client.get_container_client(SOURCE_CONTAINER_NAME)
    print(f"Files in {SOURCE_CONTAINER_NAME}:")
    for blob in container_client.list_blobs():
        print(f" - {blob.name}")

def upload_file_to_container(sas_token):
    """Uploads a sample text file to the destination container."""
    client = BlobServiceClient(f"https://{STORAGE_ACCOUNT_NAME}.blob.core.windows.net", credential=sas_token)
    container_client = client.get_container_client(DESTINATION_CONTAINER_NAME)
    blob_client = container_client.get_blob_client(NEW_FILE_NAME)
    blob_client.upload_blob(NEW_FILE_CONTENT, overwrite=True)
    print(f"File {NEW_FILE_NAME} uploaded to {DESTINATION_CONTAINER_NAME}.")

if __name__ == "__main__":
    # Generate SAS tokens with the minimum permissions each operation needs
    source_sas_token = generate_sas_token(SOURCE_CONTAINER_NAME, "rl")            # Read and List
    destination_sas_token = generate_sas_token(DESTINATION_CONTAINER_NAME, "rw")  # Read and Write

    # Execute operations
    list_files_in_container(source_sas_token)
    upload_file_to_container(destination_sas_token)
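With the placeholders filled in, a run prints something along these lines (the blob names here are illustrative):

Files in source_container:
 - report1.csv
 - report2.csv
File example_file.txt uploaded to destination_container.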

Conclusion

This solution demonstrates how to securely automate Azure Storage operations using Managed Identity and the Azure SDK for Python. With the ability to dynamically generate SAS tokens and authenticate securely via Azure AD, the process eliminates hardcoded credentials and reduces security risks.

By leveraging Azure Cloud Shell for validation and testing, we ensured that the solution operates seamlessly in real-world scenarios. This integration approach showcases the power of Azure’s Managed Identity and how it simplifies secure access to Azure services.

References:

This shell script really helped me build my Python code:

https://gist.githubusercontent.com/ChrisRomp/844de85b40caca878a81e22ff93faefe/raw/8986c05b3ae95631f7443c2b925c23a4ce337974/script.sh

Difference between using the management.azure.com and storage.azure.com scopes

The token scope (the .default audience) you request depends on the API you are interacting with; the snippet after this list makes the difference concrete:

https://management.azure.com/.default:

Used when interacting with Azure Resource Manager (ARM) APIs.
This is required for operations like generating a Service SAS using the listServiceSas API since it’s part of ARM.
https://storage.azure.com/.default:

Used when interacting directly with Azure Storage APIs (e.g., reading/writing blobs without a SAS token).
This is required when accessing blobs using an identity (Managed Identity or Service Principal) without generating a SAS token.
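Here is a minimal sketch of requesting both token types with the same credential; only the audience baked into each token differs:

from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()

# Token for ARM (control-plane) calls such as listServiceSas
arm_token = credential.get_token("https://management.azure.com/.default")

# Token for direct data-plane calls against Blob Storage
storage_token = credential.get_token("https://storage.azure.com/.default")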

Why management.azure.com/.default is used here

In this scenario, we are generating a Service SAS using the ARM API, which manages Azure resources, including storage accounts.
To call the listServiceSas endpoint, the token needs to be scoped to management.azure.com.
If you use https://storage.azure.com/.default, the token will not have sufficient permissions for the listServiceSas API because that token is scoped only for direct access to storage APIs, not ARM.

When to Use storage.azure.com/.default
You would use storage.azure.com/.default if you want to (see the sketch after this list):

Directly read/write blobs using the Azure Storage SDK or REST API without using a SAS token.
Use DefaultAzureCredential for direct authentication to Azure Blob Storage, assuming the Managed Identity has the Storage Blob Data Contributor role.
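A minimal sketch of that direct-access path (the account and container names are placeholders, and it assumes the identity holds the Storage Blob Data Contributor role):

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

credential = DefaultAzureCredential()  # internally requests a storage.azure.com-scoped token
client = BlobServiceClient("https://your_storage_account.blob.core.windows.net", credential=credential)

# Read and write blobs directly -- no SAS token anywhere
container = client.get_container_client("destination_container")
container.upload_blob("direct_upload.txt", "uploaded without a SAS token", overwrite=True)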

Also, if you want to remove the dependency on the management.azure.com scope and use the Managed Identity directly from your Azure Function, you can use the following pattern:

import logging

import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

MANAGED_IDENTITY_CLIENT_ID = "xxx-xxx-xxx-xxx-xxx"  # client ID of the user-assigned identity
STORAGE_ACCOUNT_URL = "https://your_storage_account.blob.core.windows.net"

app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)

blob_service_client = None

@app.route(route="triggername")
def asrtrigger(req: func.HttpRequest) -> func.HttpResponse:
    global blob_service_client
    logging.info('Python HTTP trigger function processed a request.')

    try:
        # Authenticate with the user-assigned Managed Identity -- no SAS token involved
        user_assigned_identity = DefaultAzureCredential(managed_identity_client_id=MANAGED_IDENTITY_CLIENT_ID)
        blob_service_client = BlobServiceClient(account_url=STORAGE_ACCOUNT_URL, credential=user_assigned_identity)
        logging.info("Successfully authenticated with Blob Storage using Managed Identity.")
    except Exception as e:
        logging.error(f"Failed to authenticate with Blob Storage: {e}")
        return func.HttpResponse("Authentication with Blob Storage failed.", status_code=500)

    name = req.params.get('name')
    if not name:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            name = req_body.get('name')

    # Process files (DefinitionName is the author's processing routine, not shown here)
    try:
        DefinitionName()
        return func.HttpResponse(f"Hello, {name}. This HTTP triggered function executed successfully.")
    except Exception as e:
        logging.error(f"Error processing files: {e}")
        return func.HttpResponse(f"Error processing files: {e}", status_code=500)

Summary

For this specific use case of generating a SAS token using listServiceSas:

management.azure.com/.default is the correct scope.
If you later need to interact with storage APIs directly using DefaultAzureCredential, you can switch to storage.azure.com/.default.


Written by Prashanth Kumar

IT professional with 20+ years of experience. Feel free to contact me at Prashanth.kumar.ms@outlook.com.
