WELCOME TO CLOUD MATTER

  • Roman Guoussev-Donskoi

Cross-organization data sharing with Azure SAS

SAS access to a storage account is very convenient and easy to use, and while Microsoft recommends using Azure AD credentials when possible as a security best practice, SAS is sometimes hard to avoid.


Let's consider a simple scenario: we are sharing Azure Storage data with multiple organizations, some of which store and process their data on-premises. (It would be much easier to secure data sharing if all organizations were in Azure, but unfortunately that is not our scenario.)




While easy to use, SAS requires effort and care to ensure storage access is secure.

Consider, for example, the following quotes from the Microsoft documentation:

Any client that possesses a valid SAS can access data in your storage account as permitted by that SAS. It's important to protect a SAS from malicious or unintended use. Use discretion in distributing a SAS, and have a plan in place for revoking a compromised SAS.

from the Microsoft documentation About the user delegation SAS

or

Storage doesn't track the number of shared access signatures that have been generated for a storage account, and no API can provide this detail. If you need to know the number of shared access signatures that have been generated for a storage account, you must track the number manually.

from the Microsoft documentation Best practices when using SAS

When your application design requires shared access signatures, Microsoft recommends using Azure AD credentials to create a user delegation SAS for superior security.


For our scenario we only want to:

  • allow users to access only the Storage data shared with their organization

  • control the client IPs each organization is allowed to use when accessing shared data

  • limit the maximum SAS timeframe


We could use the following approach:

  1. Create a managed identity representing each organization for generating the user delegation SAS (it must be assigned the Microsoft.Storage/storageAccounts/blobServices/generateUserDelegationKey action)

  2. Use persistent storage (e.g. a storage account dedicated to configuration data) to keep the information associating users with their organization

  3. Create a simple web app that allows users to request a SAS for their organization and tracks each allocated SAS provided to a user.


Something like below:
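The issuance logic behind step 3 can be sketched as follows. This is only an illustration of the policy checks, not the Azure SDK calls; all names here (SasRequestPolicy, USER_ORG, ORG_ALLOWED_IP, the eight-hour cap) are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Hypothetical configuration store: which organization a user belongs to,
# and which client IP range that organization may use.
USER_ORG = {"alice@contoso.com": "contoso"}
ORG_ALLOWED_IP = {"contoso": "203.0.113.0-203.0.113.255"}

MAX_SAS_LIFETIME = timedelta(hours=8)  # policy: cap SAS validity


@dataclass
class SasRequestPolicy:
    issued: list = field(default_factory=list)  # manual tracking of allocated SAS

    def build_sas_parameters(self, user: str, requested_hours: float) -> dict:
        """Validate a user's request and return the parameters the web app
        would pass to the user delegation SAS generator."""
        org = USER_ORG.get(user)
        if org is None:
            raise PermissionError(f"{user} is not mapped to an organization")
        # Clamp the requested lifetime to the policy maximum.
        lifetime = min(timedelta(hours=requested_hours), MAX_SAS_LIFETIME)
        start = datetime.now(timezone.utc)
        params = {
            "container": f"shared-{org}",     # org-scoped data only
            "ip_range": ORG_ALLOWED_IP[org],  # restrict allowed client IPs
            "start": start,
            "expiry": start + lifetime,       # bounded timeframe
        }
        # Storage does not track issued SAS, so the app records each one.
        self.issued.append({"user": user, **params})
        return params
```

Since no Azure API reports how many SAS have been generated, the `issued` list here stands in for the persistent tracking store from step 2.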


Monitoring

Azure Storage makes access easy to monitor: when OAuth authentication is used, the identity of the requester is present in the StorageBlobLogs records. But how do we track access requests that use a specific SAS? It turns out to be quite easy.


Let's download a file from storage using SAS and look at the hash values in StorageBlobLogs in Log Analytics:

./azcopy.exe copy "https://[your-storage-account].blob.core.windows.net/adf-staging/testcommentname?[Your-URL-escaped-SAS]" "[local-file-path]" --overwrite=prompt --check-md5 FailIfDifferent --from-to=BlobLocal --recursive --trusted-microsoft-suffixes=;

$SASUnescaped=[uri]::UnescapeDataString("[Your-azcopy-SAS]")



If we compare the "st" (signedStart) and "se" (signedExpiry) fields in $SASUnescaped and in StorageBlobLogs, we will see they match, e.g.

st=2021-11-14T23:19:25Z
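The same comparison can be done with a few lines of standard query-string parsing. A sketch, with a made-up SAS value standing in for the real one from azcopy:

```python
from urllib.parse import parse_qs, unquote

# Made-up URL-escaped SAS query string for illustration only.
escaped_sas = "st=2021-11-14T23%3A19%3A25Z&se=2021-11-15T07%3A19%3A25Z&sp=r&sig=abc123"

# Unescape, then split into fields; st and se are what we compare
# against the StorageBlobLogs records.
fields = parse_qs(unquote(escaped_sas))
print("signedStart :", fields["st"][0])
print("signedExpiry:", fields["se"][0])
```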

Alternatively, we can compute a SHA-256 hash of the azcopy SAS signature and use it to find the corresponding records in Log Analytics. Just feed the part after "sig=" into the block below, and it will produce a hash that can be used to search records in StorageBlobLogs.


$azcopysig=[uri]::UnescapeDataString("[your-SAS-signature]")
$stringAsStream = [System.IO.MemoryStream]::new()
$writer = [System.IO.StreamWriter]::new($stringAsStream)
$writer.write($azcopysig)
$writer.Flush()
$stringAsStream.Position = 0
Get-FileHash -InputStream $stringAsStream | Select-Object Hash
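The same hash can be computed in Python. A sketch with a made-up signature, assuming UTF-8 encoding of the signature string (which is what the StreamWriter above defaults to); Get-FileHash defaults to SHA-256 and prints uppercase hex, so we match that here:

```python
import hashlib
from urllib.parse import unquote

# Made-up URL-escaped signature for illustration; use the part after
# "sig=" from your actual SAS instead.
escaped_sig = "dGhpcyBpcyBub3QgYSByZWFsIHNpZ25hdHVyZQ%3D%3D"

sig = unquote(escaped_sig)
# SHA-256 over the UTF-8 bytes, uppercased to match Get-FileHash output.
sig_hash = hashlib.sha256(sig.encode("utf-8")).hexdigest().upper()
print(sig_hash)
```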

Then use the resulting value to find the respective records in Log Analytics:


StorageBlobLogs
| where AuthenticationType == 'SAS'
| project TimeGenerated, OperationName, AuthenticationType, StatusCode, caller = iff(isempty(RequesterUpn), RequesterObjectId, RequesterUpn), AuthenticationHash
| extend SigHash = extract("SasSignature[(](.*)[)]", 1, AuthenticationHash)
| where SigHash == "[Output-from-Get-FileHash-above]"
| sort by TimeGenerated desc

Web Application Code example

We used the tutorial from Microsoft: Web app accesses storage by using managed identities - Azure App Service | Microsoft Docs

On GitHub: https://github.com/Azure-Samples/ms-identity-easyauth-dotnet-storage-graphapi.git


The only change we made in StorageHelper.cs is to use a user-assigned identity for the credentials:


string userAssignedClientId = "[YOUR-CLIENT-ID]";
var credential = new DefaultAzureCredential(new DefaultAzureCredentialOptions { ManagedIdentityClientId = userAssignedClientId });

BlobContainerClient containerClient = new BlobContainerClient(new Uri(containerEndpoint), credential);

That is all.

With much gratitude to David Ma from Microsoft for the guidance he provided while we were working on this scenario.


Appendix

Authorization

A very useful feature of user delegation SAS authorization is that the permissions granted to a client who possesses the SAS are the intersection of the permissions granted to the security principal that requested the user delegation key and the permissions granted on the SAS token via the signedPermissions (sp) field.
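As an illustrative sketch (not the Azure implementation), the effective rights can be modeled as a set intersection. Assuming the delegating principal holds only read and list rights while the SAS token requests read, write and list via sp, the client ends up with read and list:

```python
# Rights held by the security principal that requested the user delegation key.
principal_permissions = {"read", "list"}
# Rights requested on the SAS token via the signedPermissions (sp) field.
sas_permissions = {"read", "write", "list"}

# The client's effective rights are the intersection of the two,
# so a write attempt with this SAS would be denied.
effective = principal_permissions & sas_permissions
print(sorted(effective))  # → ['list', 'read']
```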


Examples

A PowerShell example for a user delegation SAS is here: Generate a User Identity container SAS token with storage context based on OAuth authentication



