Azure Connector
Integrate your workflows with Microsoft Azure and tap into powerful cloud services like Blob Storage, Azure Functions, Service Bus, SQL Database, and Virtual Machines.
Overview
The Azure connector gives you 14 operations across core Azure services. Whether you're storing files in Blob Storage, triggering serverless functions, or managing virtual machines, this connector has you covered.
Authentication
The connector authenticates with an Azure service principal (or a managed identity when DeepChain itself runs on Azure). Think of a service principal as a "machine user" that can authenticate securely without needing actual user credentials.
Option 1: Service Principal (Recommended)
This is the standard way to authenticate. You'll need credentials from an Azure AD app registration:
auth_type: service_principal
tenant_id: "your-tenant-id"
client_id: "your-app-id"
client_secret: "your-client-secret"
subscription_id: "your-subscription-id"
Don't have these? Here's how to create them:
- Go to Azure AD > App registrations > New registration
- Give it a name like "DeepChain Connector"
- Copy the Application (client) ID and Directory (tenant) ID
- Create a Client secret and copy its value
- Assign roles under Subscriptions > Access control (IAM)
Tip: Grant only the roles your workflows need. The broad Contributor role is fine for testing; for production, scope down to narrower built-in roles such as Storage Blob Data Contributor for Blob Storage access.
Option 2: Managed Identity (If Running on Azure)
If you're running DeepChain on an Azure VM or in Azure Container Instances, use managed identity—no secrets needed!
auth_type: managed_identity
subscription_id: "your-subscription-id"
Available Operations
Blob Storage (File Management)
Store and retrieve files in the cloud:
| Operation | What It Does |
|---|---|
| listContainers | List all storage containers |
| listBlobs | List blobs in a specific container |
| getBlob | Download a blob |
| uploadBlob | Upload a file |
| deleteBlob | Remove a blob |
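A download step is the mirror image of the upload shown in Example 1 below. A minimal sketch, assuming getBlob takes the same storageAccount, container, and blobName fields as uploadBlob:
- id: fetch_report
  type: azure_connector
  config:
    operation: getBlob                # download the blob's contents
    storageAccount: "myaccount"
    container: "reports"
    blobName: "2025/02/report-2025-02-10.json"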
Azure Functions (Serverless Compute)
Trigger custom code without managing servers:
| Operation | What It Does |
|---|---|
| invokeFunction | Trigger a function and get the result |
| listFunctions | List functions in your function app |
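invokeFunction is covered in Example 2 below. Listing the functions in an app is simpler; a minimal sketch, assuming listFunctions only needs the functionApp field:
- id: list_available_functions
  type: azure_connector
  config:
    operation: listFunctions          # returns the functions in this function app
    functionApp: "order-processor"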
Service Bus (Messaging)
Decouple services with reliable messaging:
| Operation | What It Does |
|---|---|
| sendMessage | Send a message to a queue or topic |
| receiveMessage | Retrieve messages from a queue |
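sendMessage is covered in Example 5 below. The receiving side might look like the following sketch; it assumes receiveMessage reuses the namespace and queueOrTopic fields from sendMessage, and the maxMessages field is an assumption rather than a documented parameter:
- id: pull_email_jobs
  type: azure_connector
  config:
    operation: receiveMessage
    namespace: "my-namespace"
    queueOrTopic: "email-queue"
    maxMessages: 10                   # assumed parameter; how many messages to pull per run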
SQL Database (Relational Data)
Run queries against your SQL databases:
| Operation | What It Does |
|---|---|
| executeQuery | Execute SQL queries and get results |
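Example 3 below shows a parameterized SELECT. If executeQuery also accepts write statements (not confirmed here), a parameterized INSERT would follow the same shape:
- id: record_audit_event
  type: azure_connector
  config:
    operation: executeQuery
    server: "myserver.database.windows.net"
    database: "customers"
    query: "INSERT INTO AuditLog (CustomerId, Action, CreatedAt) VALUES (@customerId, @action, GETDATE())"
    parameters:
      customerId: "{{ input.customer_id }}"
      action: "order_placed"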
Virtual Machines (Infrastructure)
Manage VM lifecycle from your workflows:
| Operation | What It Does |
|---|---|
| listVMs | List all VMs in a resource group |
| startVM | Power on a VM |
| stopVM | Power off a VM |
| getVMStatus | Check if a VM is running |
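startVM and getVMStatus appear in Example 4 and Best Practice 4 below. Stopping a VM once a batch finishes is the mirror image; a sketch using the same resourceGroup and vmName fields:
- id: stop_processing_vm
  type: azure_connector
  config:
    operation: stopVM                 # power off the VM when the batch is done
    resourceGroup: "my-resource-group"
    vmName: "batch-processor"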
Practical Examples
Example 1: Upload a Report to Blob Storage
Save daily reports with automatic organization:
- id: save_report
  type: azure_connector
  config:
    operation: uploadBlob
    storageAccount: "myaccount"
    container: "reports"
    blobName: "{{ formatDate(now(), 'yyyy/MM') }}/report-{{ formatDate(now(), 'yyyy-MM-dd') }}.json"
    content: "{{ json_stringify(input.report_data) }}"
    contentType: "application/json"
This creates a folder structure by date, making it easy to find reports later.
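When you later need that month's reports, listBlobs can pull them back by prefix. A sketch, noting that the prefix field is an assumption rather than a documented parameter:
- id: list_month_reports
  type: azure_connector
  config:
    operation: listBlobs
    storageAccount: "myaccount"
    container: "reports"
    prefix: "{{ formatDate(now(), 'yyyy/MM') }}/"   # assumed parameter for folder-style filtering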
Example 2: Trigger an Azure Function
Use a function to do complex work (data processing, ML predictions, etc.):
- id: process_order
  type: azure_connector
  config:
    operation: invokeFunction
    functionApp: "order-processor"
    functionName: "ProcessOrder"
    body:
      orderId: "{{ input.order_id }}"
      customerId: "{{ input.customer_id }}"
      items: "{{ input.items }}"
Example 3: Query Your SQL Database
Fetch customer data for personalization:
- id: get_customer_orders
  type: azure_connector
  config:
    operation: executeQuery
    server: "myserver.database.windows.net"
    database: "customers"
    query: "SELECT * FROM Orders WHERE CustomerId = @customerId AND OrderDate > DATEADD(month, -1, GETDATE())"
    parameters:
      customerId: "{{ input.customer_id }}"
Example 4: Start a VM
Spin up a VM when you need additional capacity:
- id: start_processing_vm
  type: azure_connector
  config:
    operation: startVM
    resourceGroup: "my-resource-group"
    vmName: "batch-processor"
After the start request succeeds, the VM still takes a minute or two to fully boot, so check getVMStatus (see Best Practice 4) before sending it work.
Example 5: Send a Message via Service Bus
Decouple your workflow by queuing work for another service:
- id: queue_email
  type: azure_connector
  config:
    operation: sendMessage
    namespace: "my-namespace"
    queueOrTopic: "email-queue"
    message:
      body: "{{ json_stringify({ to: input.email, subject: 'Welcome!', template: 'welcome' }) }}"
Rate Limits
Azure services have different limits. Here are the main ones:
- Blob Storage: 20,000 requests per second per account
- Azure Functions: Automatic scaling (consumption plan)
- Service Bus: 1,000 messages per second per unit
- SQL Database: Depends on your DTU or vCore tier
Note: DeepChain automatically handles retries, so you won't usually hit these limits unless you're doing something truly massive.
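If a burst-heavy workflow does need explicit control, retry behavior might be tunable at the node level. This is a sketch with hypothetical settings only; the retry block and its keys are assumptions, not documented fields:
- id: upload_with_backoff
  type: azure_connector
  config:
    operation: uploadBlob
    storageAccount: "myaccount"
    container: "reports"
    blobName: "burst/{{ input.id }}.json"
    content: "{{ json_stringify(input) }}"
  retry:                              # hypothetical block, shown for illustration only
    max_attempts: 5
    backoff: exponential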
Error Handling
Common Azure Errors
| Error | What It Means | How to Fix |
|---|---|---|
| AuthenticationFailed | Bad service principal credentials | Check tenant ID, client ID, and client secret |
| InvalidSubscriptionId | Subscription doesn't exist or you don't have access | Verify the subscription ID in your Azure portal |
| BlobNotFound | The blob doesn't exist | Check the blob name and container |
| ContainerNotFound | The container doesn't exist | Create the container first in your storage account |
| ResourceNotFound | VM, function, or database doesn't exist | Verify the name and resource group |
Debugging
Turn on debug mode to see exactly what's happening:
Node Configuration:
debug: true
logRequest: true
logResponse: true
Check the execution logs for the full request/response details.
Best Practices
1. Organize Blobs with Folders (Prefixes)
Blob storage doesn't have "real" folders, but you can organize with naming conventions:
# Good naming that looks like folders
blobName: "reports/2025/02/report.json"
blobName: "logs/application/2025-02-10.log"
# Create a logical structure
blobName: "{{ input.category }}/{{ input.year }}/{{ input.month }}/{{ input.filename }}"
2. Use Connection Strings Carefully
Keep your storage account connection string in credentials, never in your workflow:
# Good
credential_id: cred_azure_prod
# Bad
connectionString: "DefaultEndpointsProtocol=https;AccountName=..."
3. Set Timeouts for Long-Running Functions
Azure Functions can take a while. Set appropriate timeouts:
- id: long_running_function
  type: azure_connector
  config:
    operation: invokeFunction
    functionApp: "data-processor"
    functionName: "HeavyLifting"
    body: "{{ input }}"
    timeout: 300  # 5 minutes
4. Check VM Status Before Operations
Always verify a VM is running before doing something with it:
- id: check_vm
  type: azure_connector
  config:
    operation: getVMStatus
    resourceGroup: "my-resource-group"
    vmName: "my-vm"

- id: wait_if_needed
  type: conditional
  config:
    condition: "{{ check_vm.response.status != 'running' }}"
    true_path: "wait_node"
    false_path: "proceed"
Next Steps
- Connectors Overview — See all available connectors
- GCP Connector — Compare with Google Cloud
- Azure Documentation — Official Azure docs
- Service Bus vs Event Hubs — Choose the right messaging service