Blob storage setup
Azure Blob Storage integration allows your Jyro scripts to persist files beyond the default 1-hour in-memory expiration. Use it for scheduled exports, report archives, file sharing, and any scenario requiring persistent file storage.
Overview
By default, files generated by GenerateExcel() are stored in memory and expire after 1 hour. Blob storage provides:
- Persistent storage - Files remain until explicitly deleted
- Scalable capacity - Not limited by web server memory
- File management - List, download, and delete files programmatically
- Secure sharing - Generate time-limited download URLs (SAS tokens)
Prerequisites
Before configuring blob storage for your tenant, you need:
- Azure Storage Account - An Azure subscription with a Storage Account
- Container - A blob container in the storage account (will be auto-created if it doesn’t exist)
- Connection String - The storage account connection string
- Admin Access - Tenant administrator permissions in Iris
Configuration Steps
Step 1: Locate your Azure Storage Account
Locate or create an Azure Storage Account.
Step 2: Get Connection String
- Open your Storage Account in Azure Portal
- Go to Security + networking > Access keys
- Click Show next to Connection string for key1
- Copy the connection string (it starts with `DefaultEndpointsProtocol=https;...`)
Warning: This connection string embeds your storage account key. Anyone who obtains it has full access to the storage account.
Step 3: Store Connection String in Iris
Store the connection string as an encrypted secret in Iris:
- Navigate to System > Secrets
- Click + Create Secret
- Configure:
  - Name: `blobstorage.connectionstring`
  - Description: Azure Blob Storage connection string
  - Value: Paste your connection string
- Click Save
The connection string is encrypted at rest and only decrypted when blob storage operations are performed.
Step 4: Configure Tenant Properties
Set up the blob storage properties for your tenant:
- Navigate to Administration > Tenant Settings > Properties
- Configure the following properties:
| Property Name | Value | Description |
|---|---|---|
| blobstorage.enabled | true | Enables blob storage for the tenant |
| blobstorage.container.name | iris-exports | Container name (auto-created if needed) |
| blobstorage.maxfilesize | 100 | Maximum file size in MB |
Property Details:
blobstorage.enabled
- Type: Boolean (`true` or `false`)
- Default: `false`
- Description: Master switch to enable blob storage. Must be `true` for any blob storage functions to work.
blobstorage.container.name
- Type: String
- Default: `iris-exports`
- Description: The name of the blob container. Will be created automatically if it doesn’t exist. Must be lowercase, 3-63 characters, containing only letters, numbers, and hyphens.
blobstorage.maxfilesize
- Type: Integer
- Default: `100`
- Description: Maximum file size limit in megabytes. Uploads exceeding this size will fail.
Step 5: Verify Configuration
Test your configuration with a simple script:
- Navigate to Administration > Scripting > Ad-hoc Scripts
- Create a test script:
```
# Test blob storage configuration
if IsBlobStorageEnabled() then
    Log("Information", "Blob storage is enabled!")

    # Test write
    var testContent = "Hello from Iris! " + Now()
    var success = WriteToBlobStorage("test", "hello.txt", testContent)
    if success then
        Log("Information", "Write test: SUCCESS")

        # Test read
        var content = ReadFromBlobStorage("test", "hello.txt")
        if content != null then
            Log("Information", "Read test: SUCCESS")
            Log("Information", "Content: " + content)
        else
            Log("Error", "Read test: FAILED")
        end

        # Test list
        var files = ListBlobStorage("test", "*")
        Log("Information", "Files in test folder: " + Length(files))

        # Test SAS URL
        var sasUrl = GetBlobStorageSasUrl("test", "hello.txt", 5)
        Log("Information", "SAS URL generated (valid 5 minutes)")

        # Cleanup
        DeleteFromBlobStorage("test", "hello.txt")
        Log("Information", "Cleanup complete")
    else
        Log("Error", "Write test: FAILED - check connection string")
    end
else
    Log("Warning", "Blob storage is NOT enabled")
    Log("Information", "Set blobstorage.enabled = true in tenant properties")
end
```
- Click Execute
- Check the log output for success messages
Troubleshooting
“Blob storage is not enabled for this tenant”
Cause: The blobstorage.enabled property is not set to true.
Solution:
- Go to Administration > Tenant Settings > Properties
- Set `blobstorage.enabled` to `true`
“Blob storage connection string not configured”
Cause: The blobstorage.connectionstring secret is missing or empty.
Solution:
- Go to Administration > Tenant Settings > Secrets
- Add the `blobstorage.connectionstring` secret with your Azure connection string
“Failed to connect to blob storage”
Cause: The connection string is invalid or the storage account is inaccessible.
Solution:
- Verify the connection string is correct (copy it again from Azure Portal)
- Check if the storage account has firewall rules blocking access
- Ensure the storage account exists and hasn’t been deleted
“File size exceeds maximum allowed size”
Cause: The file you’re trying to upload is larger than blobstorage.maxfilesize.
Solution:
- Increase the `blobstorage.maxfilesize` property value
- Or reduce the file size before uploading
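You can also guard against the limit in the script itself before attempting the upload. A minimal sketch, assuming a 100 MB tenant limit and that Length() returns the byte count of generated file content (both are assumptions; adjust to your tenant's configured value):

```
# Hypothetical pre-flight size check; the 100 MB limit and
# Length() returning a byte count are assumptions.
var worksheets = [{ "name": "Data", "data": GetAllOrganizations() }]
var bytes = GenerateExcelBytes(worksheets)
var maxBytes = 100 * 1024 * 1024
if Length(bytes) > maxBytes then
    Log("Warning", "File too large to upload: " + Length(bytes) + " bytes")
else
    WriteToBlobStorage("exports", "large-report.xlsx", bytes)
end
```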
“Invalid container name”
Cause: The container name doesn’t meet Azure naming requirements.
Solution: Container names must be:
- 3-63 characters long
- Start with a letter or number
- Contain only lowercase letters, numbers, and hyphens
- Not contain consecutive hyphens
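If a container name is ever assembled dynamically, a script-side length check can catch obvious violations before they reach Azure. A rough sketch using only functions shown in this guide (it does not verify the allowed character set, which would need string helpers not documented here):

```
# Sanity-check a candidate container name's length (3-63 characters);
# character-set validation is left out on purpose.
var name = "iris-exports"
if Length(name) >= 3 then
    if Length(name) <= 63 then
        Log("Information", "Container name length OK: " + name)
    else
        Log("Error", "Container name too long: " + name)
    end
else
    Log("Error", "Container name too short: " + name)
end
```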
Security Best Practices
1. Use Dedicated Storage Account
Create a dedicated storage account for Iris exports rather than sharing with other applications. This provides:
- Clear cost allocation
- Isolated access control
- Simpler security auditing
2. Enable Soft Delete
Configure soft delete on your storage account to protect against accidental deletions:
- Go to your Storage Account in Azure Portal
- Navigate to Data management > Data protection
- Enable Soft delete for blobs (7-365 days retention)
3. Configure Access Tiers
For cost optimization, configure lifecycle management:
- Go to Data management > Lifecycle management
- Create rules to move old files to Cool or Archive tiers
Example rule: Move files older than 30 days to Cool tier.
4. Monitor Access
Enable Azure Storage Analytics or Azure Monitor to track:
- Who is accessing files
- Download patterns
- Failed access attempts
5. Rotate Connection String
Periodically rotate your storage account keys:
- Generate a new key in Azure Portal
- Update the `blobstorage.connectionstring` secret in Iris
- Delete the old key
File Organization
Path Conventions
Organize files using meaningful paths:
```
# By date
WriteToBlobStorage("exports/2025/01", "report.xlsx", bytes)

# By type
WriteToBlobStorage("reports/monthly", "january.xlsx", bytes)
WriteToBlobStorage("backups/daily", "data.json", bytes)

# By source
WriteToBlobStorage("dynamics-bc/exports", "customers.xlsx", bytes)
WriteToBlobStorage("scheduled/nightly", "summary.xlsx", bytes)
```
Naming Conventions
Include timestamps in filenames for uniqueness:
```
var fileName = "EXPORT_" + FormatDate(Now(), "yyyyMMdd_HHmmss") + ".xlsx"
# Result: EXPORT_20250111_153045.xlsx
```
Usage Examples
Persistent Excel Export
```
# Generate Excel with persistent storage
var worksheets = [
    {
        "name": "Data",
        "data": GetAllOrganizations(),
        "tableName": "OrganizationsTable"
    }
]
var bytes = GenerateExcelBytes(worksheets)
var fileName = "organizations_" + FormatDate(Now(), "yyyyMMdd") + ".xlsx"

if WriteToBlobStorage("exports", fileName, bytes) then
    # Generate download link valid for 7 days (10080 minutes)
    var downloadUrl = GetBlobStorageSasUrl("exports", fileName, 10080)
    Data._payload = {
        "success": true,
        "downloadUrl": downloadUrl,
        "fileName": fileName,
        "expiresIn": "7 days"
    }
    Data._statusCode = 200
else
    Data._payload = {}
    Data._payload.error = "Failed to save file"
    Data._statusCode = 500
end
```
List Recent Exports
```
# List all Excel files from this month
var month = FormatDate(Now(), "yyyy/MM")
var files = ListBlobStorage("exports/" + month, "*.xlsx")
var results = []

foreach file in files do
    var item = {
        "name": file.name,
        "size": file.sizeBytes,
        "modified": file.lastModified,
        "downloadUrl": GetBlobStorageSasUrl("exports/" + month, file.name, 60)
    }
    Append(results, item)
end

Data._payload = {
    "month": month,
    "fileCount": Length(results),
    "files": results
}
Data._statusCode = 200
```
Cleanup Old Files
```
# Delete files older than 30 days
var files = ListBlobStorage("exports", "*")
var now = Now()
var deletedCount = 0

foreach file in files do
    var ageMs = now - file.lastModified
    var ageDays = ageMs / (1000 * 60 * 60 * 24)
    if ageDays > 30 then
        if DeleteFromBlobStorage("exports", file.name) then
            deletedCount = deletedCount + 1
            Log("Information", "Deleted: " + file.name + " (age: " + ageDays + " days)")
        end
    end
end

Log("Information", "Cleanup complete. Deleted " + deletedCount + " files.")
```
Available Functions
Once configured, these functions become available in your scripts:
| Function | Description |
|---|---|
| IsBlobStorageEnabled | Check if blob storage is enabled |
| WriteToBlobStorage | Upload file to blob storage |
| ReadFromBlobStorage | Download file from blob storage |
| ListBlobStorage | List files with wildcard support |
| DeleteFromBlobStorage | Delete file from blob storage |
| BlobStorageExists | Check if file exists |
| GetBlobStorageSasUrl | Generate time-limited download URL |
| GenerateExcelBytes | Generate Excel as bytes for storage |
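BlobStorageExists is the only function in the table that no example in this guide demonstrates. A short sketch, assuming it takes the same (folder, fileName) arguments as ReadFromBlobStorage and returns a boolean (both are assumptions):

```
# Skip regeneration when today's export already exists; the
# (folder, fileName) argument order is assumed to mirror ReadFromBlobStorage.
var fileName = "organizations_" + FormatDate(Now(), "yyyyMMdd") + ".xlsx"
if BlobStorageExists("exports", fileName) then
    Log("Information", "Export already present, reusing: " + fileName)
else
    var bytes = GenerateExcelBytes([{ "name": "Data", "data": GetAllOrganizations() }])
    WriteToBlobStorage("exports", fileName, bytes)
end
```

This pattern is useful in scheduled scripts that may be retried, since it avoids regenerating and re-uploading an identical file.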
Cost Considerations
Azure Blob Storage costs depend on:
- Storage capacity - GB stored per month
- Operations - Read/write/list operations
- Data transfer - Egress from Azure (downloads)
For typical Iris usage (reports, exports):
- Hot tier: ~$0.02/GB/month
- Cool tier: ~$0.01/GB/month (for older files)
Use Azure Cost Management to monitor actual costs.
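For a rough in-script estimate, you can total the bytes reported by ListBlobStorage and apply the ~$0.02/GB/month Hot-tier figure above. A back-of-the-envelope sketch (the price is illustrative, varies by region, and excludes operations and egress):

```
# Rough Hot-tier storage cost estimate; $0.02/GB/month is illustrative only.
var files = ListBlobStorage("exports", "*")
var totalBytes = 0
foreach file in files do
    totalBytes = totalBytes + file.sizeBytes
end
var totalGb = totalBytes / (1024 * 1024 * 1024)
Log("Information", "Stored: " + totalGb + " GB, ~$" + (totalGb * 0.02) + "/month")
```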
Next Steps
- GenerateExcelBytes - Generate Excel files for blob storage
- Dynamic Endpoints - Create file download endpoints
- Script Scheduling - Schedule automated exports