How to use app settings when configuring function bindings

For example, here's how to manage the queue name of a queue-triggered function through app settings.

Wrap the app setting name in percent signs, e.g. %queue_name%, as seen below.

[FunctionName("QueueTriggeredFunc")]
public static void Run(
    [QueueTrigger("%queue_name%")]string queueItem)
{
    //...
}

or using function.json:

{
  "bindings": [
    {
      "name": "order",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "%queue_name%",
    }
  ]
}
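
The %queue_name% setting itself lives in the app settings (or, when running locally, in local.settings.json). A minimal sketch, assuming a hypothetical queue named orders-queue:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "queue_name": "orders-queue"
  }
}
```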

– via Microsoft Docs

How to connect to Azure Data Lake Storage Gen 2 using a Service Principal

Assuming you’ve already registered the service principal and granted it the necessary role on the storage account (e.g. Storage Blob Data Contributor), you need to use an instance of DefaultAzureCredential.

var credentials = new DefaultAzureCredential();
var serviceClient = new DataLakeServiceClient(
    new Uri("https://<your_storage_account_name>.dfs.core.windows.net/"),
    credentials);
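
For the service principal to be picked up, DefaultAzureCredential reads it from environment variables (via EnvironmentCredential in its chain). The values below are placeholders:

```shell
# Read by EnvironmentCredential, part of the DefaultAzureCredential chain
export AZURE_TENANT_ID="<tenant-id>"
export AZURE_CLIENT_ID="<app-registration-client-id>"
export AZURE_CLIENT_SECRET="<client-secret>"
```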

– via Microsoft Docs

Retrying failed executions

With retry policies (all triggers)

Use either fixedDelay or exponentialBackoff, as documented here and here.
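
For C# in-process functions, the same policies can also be applied declaratively with the retry attributes from the WebJobs SDK. A sketch, reusing the queue trigger example from above:

```csharp
[FunctionName("QueueTriggeredFunc")]
[FixedDelayRetry(5, "00:00:10")] // up to 5 retries, 10 seconds apart
public static void Run(
    [QueueTrigger("%queue_name%")] string queueItem)
{
    //...
}

// or, for exponential backoff (max retries, minimum interval, maximum interval):
// [ExponentialBackoffRetry(5, "00:00:04", "00:15:00")]
```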

With maxDequeueCount (for queue triggers)

Use maxDequeueCount as documented here.
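
This goes in host.json; maxDequeueCount defaults to 5, and after the last failed attempt the message is moved to the <queuename>-poison queue. For example:

```json
{
  "extensions": {
    "queues": {
      "maxDequeueCount": 3
    }
  }
}
```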

System.Net.Sockets.SocketException in function apps

Try setting the environment variable DOTNET_SYSTEM_NET_HTTP_USESOCKETSHTTPHANDLER to 0, as documented here.
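
Locally this can go under Values in local.settings.json; in Azure, add it as an application setting (app settings are surfaced as environment variables). A minimal sketch:

```json
{
  "Values": {
    "DOTNET_SYSTEM_NET_HTTP_USESOCKETSHTTPHANDLER": "0"
  }
}
```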

– via Stack Overflow and GitHub

Configuring a queue-triggered function app to execute messages one at a time

Set batchSize to 1 and newBatchThreshold to 0. Per the docs, “the maximum number of concurrent messages being processed per function is batchSize plus newBatchThreshold. This limit applies separately to each queue-triggered function.”

{
  "extensions": {
    "queues": {
      "batchSize": 1,
      "newBatchThreshold": 0
    }
  }
}

To get this to work when testing the function locally, set this in your local.settings.json, as documented here and explained here.

{
  "Values": {
    "AzureFunctionsJobHost__extensions__queues__batchSize": 1
  }
}

CI/CD

An in-depth tutorial is here; don’t forget to add Azure PowerShell tasks to stop the triggers before deployment and start them again afterwards.


# Stop all Data Factory triggers before deploying
$triggersADF = Get-AzDataFactoryV2Trigger -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName

$triggersADF | ForEach-Object { Stop-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.name -Force }

# ... deployment runs here ...

# Start them again after deployment
$triggersADF = Get-AzDataFactoryV2Trigger -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName

$triggersADF | ForEach-Object { Start-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.name -Force }
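
In an Azure DevOps pipeline, the stop script can run in an Azure PowerShell task before the deployment step (and a matching task with the start script after it). A sketch; the service connection name and script path are placeholders:

```yaml
- task: AzurePowerShell@5
  displayName: Stop ADF triggers
  inputs:
    azureSubscription: 'my-service-connection'   # placeholder
    ScriptType: 'FilePath'
    ScriptPath: '$(Pipeline.Workspace)/scripts/stop-triggers.ps1'   # placeholder
    ScriptArguments: '-DataFactoryName $(DataFactoryName) -ResourceGroupName $(ResourceGroupName)'
    azurePowerShellVersion: 'LatestVersion'
```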