Preface
Since version 2.4.165, the Custom Application Connector can consume data stored in Azure Storage Accounts. This opens the door to ingesting data from many Azure sources that were not previously available.
With this addition, any Azure data source can be consumed, provided that it writes its data:
- Into an Azure Storage Account's Blob store
  - The account's "Kind" must be StorageV2, so that it can alert a queue on new blobs
- As 1-event-per-line files in clear text (e.g. .log, .txt, or .json for the JSON-lines format), where each line is a valid event in JSON format
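For illustration, here is a minimal Python check that a file satisfies the 1-event-per-line requirement ("events.log" is a hypothetical local copy of such a blob):

    import json

    # Minimal sketch: verify each line of a file is a standalone JSON event.
    # "events.log" is a placeholder name - substitute your own file.
    with open("events.log", encoding="utf-8") as f:
        for line_number, line in enumerate(f, start=1):
            if not line.strip():
                continue  # ignore blank lines
            try:
                json.loads(line)  # each line must parse as valid JSON on its own
            except json.JSONDecodeError:
                print(f"Line {line_number} is not valid JSON - file is not compliant")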
Prepare the Storage Account
Go to the storage account that holds the data you want to consume:
- Navigate to https://portal.azure.com
- Navigate to Storage Accounts -> your storage account. Verify its kind is "StorageV2", either in the storage accounts table or under Overview -> Account Kind
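The account kind can also be checked programmatically. A small sketch using the azure-storage-blob SDK; the account URL and key below are placeholders:

    from azure.storage.blob import BlobServiceClient

    # Placeholders - substitute your storage account name and an access key.
    service = BlobServiceClient(
        account_url="https://mystorageaccount.blob.core.windows.net",
        credential="<account-access-key>",
    )
    info = service.get_account_information()
    print(info["account_kind"])  # should print "StorageV2"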
Configure a Queue in that storage account to be notified when new Blobs are added to its blob store:
- Create a queue dedicated to the connector
  - Navigate to Queues
  - Click "+ Queue", name it something unique, e.g. "sk4queue", and click OK
  - Keep that queue name aside
- Navigate to "Events"
- Click "+ Event Subscription"
- Enter a useful name like "notify-sk4queue-on-new-blobs"
- Event Schema should be the default "Event Grid Schema"
- Under Event Types -> "Filter to Event Types", make sure only "Blob Created" is selected
- Under Endpoint Details
  - In "Endpoint Type" select "Storage Queues"
  - In "Endpoint" click "Select an endpoint"
  - In the panel that opens, select the subscription that contains the current Storage Account
  - Select the storage account
  - Select the queue created in the first step, e.g. "sk4queue"
  - Click "Confirm Selection"
- Click "Create"
Technically, the queue and the blob store can live in different storage accounts; for the sake of simplicity, they are configured in the same storage account here.
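To sanity-check the subscription, you can upload a test blob and peek at the queue for the resulting "Blob Created" event. A rough Python sketch, where the account name, the "logs" container, and the "sk4queue" queue are placeholders for your own names:

    import base64
    import json

    from azure.storage.blob import BlobClient
    from azure.storage.queue import QueueClient

    # Placeholders - substitute your own account, container, queue, and key.
    credential = "<account-access-key>"

    # Upload a small JSON-lines test blob...
    blob = BlobClient(
        account_url="https://mystorageaccount.blob.core.windows.net",
        container_name="logs",
        blob_name="test.log",
        credential=credential,
    )
    blob.upload_blob(b'{"event": "test"}\n', overwrite=True)

    # ...then peek at the queue; a Microsoft.Storage.BlobCreated event should arrive shortly.
    queue = QueueClient(
        account_url="https://mystorageaccount.queue.core.windows.net",
        queue_name="sk4queue",
        credential=credential,
    )
    for message in queue.peek_messages(max_messages=5):
        event = json.loads(base64.b64decode(message.content))  # Event Grid Base64-encodes queue messages
        print(event["eventType"], event["data"]["url"])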
Create Minimal-Permissions SAS Tokens
- Navigate to "Shared access signature", under "Settings"
- Create a SAS for the queue
  - Select only "Queue" in the "Allowed services" section
  - Select only "Object" in the "Allowed resource types" section
  - Select only "Process" in the "Allowed permissions" section
  - Set the End time far in the future, e.g. 10 years ahead, so the connector doesn't stop working due to SAS token expiration
  - Click "Generate SAS and connection string"
  - Copy the value from the "SAS Token" field
  - Example value: "?sv=2018-03-28&ss=q&srt=o&sp=p&se=2029-09-23T17:52:57Z&st=2019-09-23T09:52:57Z&spr=https&sig=aASkYB8%2BGf21fMXl3Bnf0Mod7n81Dq8E%2FeggjEQr%2BD8%3D"
- Create a SAS for the blob storage
  - Select only "Blob" in the "Allowed services" section
  - Select only "Object" in the "Allowed resource types" section
  - Select only "Read" in the "Allowed permissions" section
  - Set the End time far in the future, e.g. 10 years ahead, so the connector doesn't stop working due to SAS token expiration
  - Click "Generate SAS and connection string"
  - Copy the value from the "SAS Token" field
  - Example value: "?sv=2018-03-28&ss=b&srt=o&sp=r&se=2029-09-23T17:52:57Z&st=2019-09-23T09:52:57Z&spr=https&sig=2NfpJEsdxkLCfMHBXm9Z7RvojSfLKWiHuMi6vY%2FSDQ4%3D"
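These two tokens are all the connector needs: the queue SAS ("Process" permission) lets it receive and delete notification messages, and the blob SAS ("Read" permission) lets it download the blobs those messages point to. A rough Python sketch of that flow, with placeholder endpoints and tokens:

    import base64
    import json

    from azure.storage.blob import BlobClient
    from azure.storage.queue import QueueClient

    # Placeholders - use your own endpoints and the SAS tokens generated above.
    queue_sas = "?sv=...&ss=q&srt=o&sp=p&..."
    blob_sas = "?sv=...&ss=b&srt=o&sp=r&..."

    queue = QueueClient(
        account_url="https://mystorageaccount.queue.core.windows.net",
        queue_name="sk4queue",
        credential=queue_sas,
    )
    for message in queue.receive_messages():
        event = json.loads(base64.b64decode(message.content))  # Event Grid Base64-encodes queue messages
        blob_url = event["data"]["url"]  # URL of the newly created blob
        blob = BlobClient.from_blob_url(blob_url + blob_sas)  # read-only access via the blob SAS
        for line in blob.download_blob().readall().splitlines():
            print(line)  # each line is one JSON event
        queue.delete_message(message)  # allowed by the "Process" permission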
Obtain other Storage Account properties
- Navigate to "Properties"
- Copy the value from the "Primary Blob Service Endpoint" field, under the "Blob service" section
- Copy the value from the "Primary Queue Service Endpoint" field, under the "Queue service" section
Use the obtained credentials to onboard a Custom Application Connector
- In the Cloud Connectors UI, navigate to Settings -> Accounts
- Click "+ Add Account" and select "Custom Application" from the services list
- Select "Azure Storage - SAS Tokens" in "Authentication Method"
- Fill in the other obtained credentials and properties
- Under "Sync. Strategy" select the appropriate method
- Select "Continuous - Azure Duration Auto-Detection" if the blobs put into the storage are in a format of a duration. e.g. PT1H.json , or PT15M.json .
These blobs are updated by the service sending the data for a long duration and needs to be synced continuously for a period of time. By selecting this option the application will auto-detect the right duration each such blob needs to be monitored for changes, and only collect the added delta to prevent duplicates.
NOTE: Some azure services, e.g. NSG Flow Logs, update the file but only have 1 line in it that gets updated. Such services are NOT compliant with this connector.
Only services that add new lines, 1 per event, are compliant with this connector. - Select "Once" if the blobs put into the storage will not be updated after they are put there.
- Select "Continuous - Azure Duration Auto-Detection" if the blobs put into the storage are in a format of a duration. e.g. PT1H.json , or PT15M.json .
- Click "Test Connection" to test the configuration is successful
- Click "Done" to finish the account creation and "Start" to start its sync. operation
Done!