Azure Blob Storage as a Target
Overview
You can configure Azure Blob Storage as a Target in Data Integration to write data from your process into a blob container within your Azure Storage Account.
Data Integration connects to Azure using a Shared Access Signature (SAS) token, which provides secure, time-bound access to your storage resources.
Use this configuration when you want to:
- Export data from Data Integration to Azure for analytics or archival purposes.
- Store output files, logs, or processed data securely in the cloud.
- Integrate Azure Blob as part of a broader ETL or data pipeline.
You will define the connection details (such as the container name, subfolder path, and SAS token) in the Data Integration Connection and Operation components for the Azure Blob Storage connector.
Prerequisites
Before configuring Azure Blob Storage as a target in Data Integration, ensure the following prerequisites are met. Each item is required to establish a secure and valid connection between Data Integration and your Azure environment.
- Azure subscription and access to the target Storage Account.
- Boomi Data Integration environment with permission to configure connections.
- Target container name and optional subfolder path.
- Decision on SAS scope (Account SAS vs. Container (Service) SAS).
Minimum permissions
The minimum permissions refer to the access rights included in the SAS token that Boomi uses to connect to your Azure Blob Storage.
These permissions determine what actions Boomi can perform on your storage.
You’ll configure these permissions when generating the SAS token in the Azure Portal, Azure Storage Explorer, or via the Azure CLI.
Recommended SAS permissions
- Write-only target: Include Create (c) and Write (w) permissions to allow Boomi to upload new blobs.
- If your process lists the container (for example, for test connections, validations, or post-write checks): Add the List (l) permission.
- Protocol: Restrict to HTTPS only (spr=https) to ensure secure data transfer.
- Expiry: Keep the SAS token short-lived (hours to days) and rotate it regularly for security.
- UTC clock skew: When generating the SAS, set the start time (st) about 15 minutes in the past and the expiry (se) in the future (UTC). Tokens with a start time in the future will be rejected by Azure.
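The clock-skew guidance above can be scripted when generating tokens. A minimal sketch, assuming GNU date on Linux (the one-day expiry is only an example; choose a lifetime that fits your rotation policy):

```shell
# Compute a backdated start and a short-lived expiry, both in UTC.
# On macOS/BSD, use: date -u -v-15M ... and date -u -v+1d ... instead.
START=$(date -u -d '15 minutes ago' '+%Y-%m-%dT%H:%M:%SZ')   # st: ~15 min in the past
EXPIRY=$(date -u -d '+1 day' '+%Y-%m-%dT%H:%M:%SZ')          # se: expiry in the future
echo "st=$START"
echo "se=$EXPIRY"
```

These values can then be passed to the --start and --expiry options of the az generate-sas commands shown later in this article.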
Key concepts
- Container ("bucket"): The top-level scope within a storage account where blobs are stored.
- SAS: A time-limited credential granting specific permissions without exposing the storage account key.
- Endpoints: Always use the Blob endpoint for uploads, even when your account uses ADLS Gen2 (Hierarchical Namespace). For example,
https://<account>.blob.core.windows.net/<container>/<path>
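Assembling that endpoint from its parts is simple string concatenation. A sketch with placeholder names (account, container, and blob path here are illustrative, not values from your environment):

```shell
# Build the Blob endpoint URL for an upload target from its components.
ACCOUNT="mystorageacct"        # placeholder storage account name
CONTAINER="boomi-target"       # placeholder container name
BLOB_PATH="landing/output.csv" # placeholder blob path within the container
BLOB_URL="https://${ACCOUNT}.blob.core.windows.net/${CONTAINER}/${BLOB_PATH}"
echo "$BLOB_URL"
```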
Choosing Account SAS vs. Container (Service) SAS
Quick decision matrix
| Criterion | Account SAS (Account-level) | Container SAS (Service SAS, container scope) |
|---|---|---|
| Blast radius | Broader (account-scope; limit to Blob + Service/Container/Object) | Narrow (only the specified container) |
| Granularity | Good | Best (naturally least-privilege) |
| Operational simplicity | Simpler when many containers share one token | Simpler when writing to one target container |
| Least privilege | Good | Best |
| Revocation/rotation | One token may affect multiple containers | Rotate per-container; minimal collateral impact |
| Typical use | Many targets/containers, centralized control | Single target container per Boomi process |
- If your Boomi process writes to one specific container, prefer a Container (Service) SAS for least privilege.
- If you manage many containers or need a single credential for several targets, an Account SAS can reduce operational overhead—limit it to the Blob service and Service/Container/Object resource types.
Step 1: Create the storage account & target container in Azure portal
- In Azure portal, create or select a General-purpose v2 (Standard) storage account.
- Go to Data storage > Containers > + Container.
- Name the container (for example, boomi-target) and set Public access level to Private (no anonymous access).
Works with or without Hierarchical Namespace (ADLS Gen2); always use the blob endpoint for uploads.
Step 2: Generate SAS token in Azure Portal
You can generate the SAS token directly through the Azure Portal. Use this method if you prefer a graphical interface.
Account SAS
Use when Boomi needs access across multiple containers in the same storage account.
- Go to Storage account > Security + networking > Shared access signature.
- Configure the following:
- Services: Blob
- Resource types: Service, Container, Object
- Permissions: Create (c), Write (w), List (l) if required
- Protocol: HTTPS only
- Set UTC times for start and expiry
- Click Generate SAS.
Container (Service) SAS
Use when your integration writes only to one container and you want minimal access scope.
- Go to Storage account > Data storage > Containers > [container name] > Generate SAS.
- Configure the following:
- Permissions: Create (c), Write (w), List (l) if required
- Protocol: HTTPS only
- Set UTC times for start and expiry
- Click Generate SAS.
Generate SAS token using Azure CLI
Use this method if you prefer automation or need to integrate SAS generation into scripts or CI/CD pipelines. The az storage * generate-sas commands require either an account key or AAD (user delegation) context.
A) Account SAS (using Account Key)
ACCOUNT_NAME="<your_storage_account>"
EXPIRY="2025-12-31T00:00:00Z" # UTC, rotate regularly
ACCOUNT_KEY="$(az storage account keys list -n "$ACCOUNT_NAME" --query "[0].value" -o tsv)"
az storage account generate-sas \
--account-name "$ACCOUNT_NAME" \
--account-key "$ACCOUNT_KEY" \
--services b \
--resource-types sco \
--permissions cw \
--https-only \
--expiry "$EXPIRY"
The output is the token (without the leading ?).
Full URL example:
https://<account>.blob.core.windows.net/<container>?<token>
B) Container (Service) SAS (using AAD User Delegation)
ACCOUNT_NAME="<your_storage_account>"
CONTAINER_NAME="<your_container>"
EXPIRY="2025-12-31T00:00:00Z" # UTC
az login # if not already logged in
az storage container generate-sas \
--account-name "$ACCOUNT_NAME" \
--name "$CONTAINER_NAME" \
--permissions cw \
--https-only \
--expiry "$EXPIRY" \
--auth-mode login \
--as-user
Alternative: To use the account key instead of AAD, remove --auth-mode login and --as-user and add --account-key "$ACCOUNT_KEY".
SAS token format:
- The token begins with sv=... (no leading ?).
- When appending to a URL, prefix it with ?.
- Some UIs require only the token, while others require the full URL.
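A small sketch of that normalization, using shell parameter expansion (all values below are fabricated placeholders):

```shell
# Normalize a SAS token: strip a leading '?' if present, then append to the URL.
TOKEN="?sv=2024-11-04&sp=cw&spr=https&sig=REDACTED"   # may arrive with or without '?'
TOKEN="${TOKEN#\?}"                                    # remove one leading '?' if present
URL="https://mystorageacct.blob.core.windows.net/boomi-target?${TOKEN}"
echo "$URL"
```

Paste either the bare token or the full URL depending on what the UI you are configuring expects.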
Step 3: (Optional) Target folder strategy
Use clear and predictable folder structures for your target paths, for example, landing/YYYY/MM/DD/HH/.
If using partitioned formats, for example, Parquet, align folder naming conventions with downstream analytics engines for easier querying.
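As one sketch, an hourly UTC partition path can be derived at runtime (the landing prefix is an illustrative convention, not a Boomi requirement):

```shell
# Derive a UTC-partitioned target folder for the current hour, e.g. landing/2025/06/01/13/
PREFIX="landing"
TARGET_DIR="${PREFIX}/$(date -u +%Y/%m/%d/%H)/"
echo "$TARGET_DIR"
```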
Step 4: Boomi file options
- Supported formats (such as CSV, JSON, or Parquet) depend on your process components.
- Overwrite or append behavior is configured in your process settings.
- Large files are uploaded as Block Blobs by default.
Step 5: Pre-connection validation checklist
Before configuring the Boomi connection, confirm the following:
- Target container exists and is Private.
- SAS token (Account or Container) is active and includes:
- c, w (and l if your process lists)
- spr=https
- Valid UTC start (st) and expiry (se) times
- Correct scope (ss=b, srt=sco for Account SAS; sr=c for Container SAS)
- No restrictive IP rules (sip) unless intentionally enforced.
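The checklist above can be partially automated by inspecting the token's query parameters. A minimal sketch with a hypothetical check helper; the token is fabricated, and the exact-match patterns assume sp=cw (adapt them if you included List, i.e. sp=cwl):

```shell
# Sanity-check SAS query parameters before pasting the token into Boomi.
SAS="sv=2024-11-04&ss=b&srt=sco&sp=cw&spr=https&st=2025-01-01T00:00:00Z&se=2025-12-31T00:00:00Z&sig=REDACTED"

# Report whether a given key=value pair appears in the token.
check() {
  case "&${SAS}&" in
    *"&$1&"*) echo "OK: $1" ;;
    *)        echo "MISSING: $1" ;;
  esac
}

check "sp=cw"      # Create + Write permissions
check "spr=https"  # HTTPS-only protocol
check "ss=b"       # Blob service (Account SAS)
check "srt=sco"    # Service/Container/Object resource types (Account SAS)
```

For a Container SAS, check for sr=c instead of the ss/srt parameters.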