Roman Guoussev-Donskoi

Monitor access to sensitive data: easy with Azure Storage

These days much sensitive data is stored in the cloud, including in Azure storage accounts, and monitoring access to that data is a critical requirement. Fortunately, Microsoft provides a preview capability to log each request made against Azure Storage or Azure Data Lake Storage Gen2 (ADLS Gen2). These logs record the operation performed, the requester's identity and IP address, and whether a request was made anonymously, with an OAuth 2.0 token, with Shared Key, or with a shared access signature (SAS), along with many other useful attributes and metrics.


In the Azure Storage account, go to Monitoring -> "Diagnostic settings (preview)" and click the service you need to monitor: in our case, blob (which also covers ADLS Gen2 monitoring).

Then click "Add diagnostic setting".

Specify the destination (in our case a Sentinel Log Analytics workspace, so that the storage access audit data can be used to build Sentinel analytics rules). Also specify the access types to be monitored, using Categories or Groups.

Setup complete.
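Once the setting is saved, logs typically start flowing within a few minutes. A quick sketch of a KQL query (run in the Log Analytics workspace) to confirm data is arriving; the one-hour window is just an example:

```kusto
// Confirm storage audit logs are flowing into the workspace
StorageBlobLogs
| where TimeGenerated > ago(1h)
| summarize RequestCount = count() by OperationName, AuthenticationType
| order by RequestCount desc
```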

Accessing logs in a Log Analytics workspace

In the Log Analytics workspace, storage access audit data is stored in the StorageBlobLogs table. Logs for Data Lake Storage Gen2 appear in the same table because Data Lake Storage Gen2 is not a separate service: it is a set of capabilities you enable on a storage account. If you've enabled those capabilities, logs still appear in the StorageBlobLogs table.

Best practice

To improve audit and traceability (so the audit record identifies the principal accessing the storage), we should restrict and, if possible, completely block SAS and Shared Key authorization, and instead use OAuth as much as possible.

This is documented by Microsoft in "Prevent Shared Key authorization for an Azure Storage account".
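Before disabling Shared Key authorization, it helps to check whether any clients still depend on it. A sketch, assuming the standard StorageBlobLogs columns (UserAgentHeader can help identify the client application):

```kusto
// Find requests still authorized with something other than OAuth
StorageBlobLogs
| where AuthenticationType != "OAuth"
| summarize RequestCount = count()
    by AuthenticationType, CallerIpAddress, UserAgentHeader
| order by RequestCount desc
```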

The "AuthenticationType" column in StorageBlobLogs shows the type of authentication used to access storage. When AuthenticationType is "OAuth", we can see the identity of the caller accessing storage. (If you have to use SAS, there are mitigations, which we will describe in another blog post.)

The "OperationName" column shows the operation performed: for example, "PutBlob" indicates an object was uploaded to storage.

Let's look at the data attributes in more detail:

StorageBlobLogs
| where AuthenticationType == 'OAuth'
    and RequesterUpn contains '@'
    and not(OperationName has_any ('GetBlobServiceProperties', 'GetUserDelegationKey', 'CreatePathDir', 'ListBlobs'))
| project TimeGenerated, Protocol, OperationName, AuthenticationType, Uri, RequesterUpn, CallerIpAddress, Category, TlsVersion

  1. Operation is PutBlob

  2. The specific Blob accessed

  3. Identity of the requester that performed the operation

  4. IP address of the requester

  5. TLS version (these days only TLS 1.2 and above should be accepted)
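As an example of turning this audit data into a Sentinel detection, here is a sketch that flags blob deletions; the container name 'sensitive' is a hypothetical placeholder for wherever your sensitive data lives:

```kusto
// Detection sketch: blob deletions in a hypothetical 'sensitive' container
StorageBlobLogs
| where OperationName == "DeleteBlob"
| where Uri contains "/sensitive/"
| project TimeGenerated, RequesterUpn, CallerIpAddress, Uri, AuthenticationType
```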

That's all for now. I hope you find it useful.
