Automatically propagating Global Parameter changes on publish
Make sure to check this Manage hub setting: 'Manage hub' -> 'ARM template' -> 'Include global parameters in ARM template', and your global parameters will be created, updated, or deleted whenever you change them in dev and publish to prod.
– via Microsoft TechCommunity
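With that setting enabled, each global parameter surfaces in the generated ARM template as its own overridable parameter, so you can give it a different value per environment in your release pipeline. A minimal sketch of a parameters-file override, assuming a global parameter named environment (the generated name follows the pattern below, but verify it against your own ARMTemplateForFactory output):

```json
{
  "parameters": {
    "dataFactory_properties_globalParameters_environment_value": {
      "value": "prod"
    }
  }
}
```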
Importing Snowflake data
Note that direct copy requires an Azure Blob Storage linked service with SAS token authentication, as highlighted here. If you're using ADLS Gen2, you can still create a Blob Storage linked service with SAS auth against the same storage account and take it from there.
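A minimal sketch of such a staging linked service, with the account name and SAS token as placeholders (in practice you'd keep the token in Key Vault rather than inline):

```json
{
  "name": "SnowflakeStagingBlob",
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": {
      "sasUri": {
        "type": "SecureString",
        "value": "https://<storage-account>.blob.core.windows.net/?<sas-token>"
      }
    }
  }
}
```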
If you encounter an error such as "Snowflake Export Copy Command validation failed: 'The Snowflake copy command payload is invalid. Cannot specify property: column mapping'",
just reset the mapping on the copy activity. That's it. That's what worked for me (and others 🤷🏻‍♂️).
– via Stack Overflow
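Resetting the mapping effectively removes the translator block from the copy activity JSON, so the Snowflake COPY command runs without an explicit column mapping. Illustrative fragment of what gets cleared (activity and column names are hypothetical):

```json
"translator": {
  "type": "TabularTranslator",
  "mappings": [
    { "source": { "name": "SRC_COL" }, "sink": { "name": "DST_COL" } }
  ]
}
```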
For errors such as "ErrorCode=UserErrorOdbcOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ERROR [22000] Max file size (16777216) exceeded for unload single file mode. We recommend that you increase the max file size parameter, or disable single-file mode in the unload command and combine the unloaded files into a single file after you download them.":
Edit the Additional Snowflake copy options on the copy activity and add a property named SINGLE with the value FALSE. Alternatively, add the MAX_FILE_SIZE property and set it to something like 268435456 (256 MB).
– via Microsoft Q&A
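In the copy activity JSON these options land under the source's exportSettings. A sketch assuming a Snowflake source (use either SINGLE or MAX_FILE_SIZE; both are shown here only to illustrate the shape):

```json
"source": {
  "type": "SnowflakeSource",
  "exportSettings": {
    "type": "SnowflakeExportCopyCommand",
    "additionalCopyOptions": {
      "SINGLE": "FALSE",
      "MAX_FILE_SIZE": "268435456"
    }
  }
}
```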
CI/CD
An in-depth tutorial is here; don't forget to add Azure PowerShell tasks that stop the triggers before deployment and restart them afterwards.
# Pre-deployment: stop every trigger in the target factory
$triggersADF = Get-AzDataFactoryV2Trigger -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName
$triggersADF | ForEach-Object { Stop-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.name -Force }
.....
# Post-deployment: restart the triggers
$triggersADF = Get-AzDataFactoryV2Trigger -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName
$triggersADF | ForEach-Object { Start-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.name -Force }
For messages such as "Job failed due to reason: at Sink 'UpdateSink': Cannot insert explicit value for identity column in table 'X' when IDENTITY_INSERT is set to OFF." in your data flows, set Skip writing key columns to true in the sink settings.
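In the data flow script this shows up as a property on the sink transformation, something like the fragment below. The skipKeyWrites property name is my assumption about how the UI toggle is serialized, and the sink name, keys, and other options are hypothetical; verify against your own data flow script:

```
UpdateStream sink(allowSchemaDrift: true,
    validateSchema: false,
    deletable: false,
    insertable: false,
    updateable: true,
    upsertable: false,
    keys: ['Id'],
    skipKeyWrites: true) ~> UpdateSink
```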