How to Set Global Parameters in an Existing Azure Data Factory Using PowerShell
Global parameters in Azure Data Factory (ADF) let you define constants that can be reused across all pipelines in a factory — think environment names, retry counts, API base URLs, and more. Managing them via PowerShell gives you repeatability, version control, and automation that clicking through the Azure Portal simply cannot match.
In this post, I'll walk you through a clean, production-ready script that sets global parameters on an existing ADF instance without recreating it.
Why Global Parameters?
ADF global parameters solve a common problem: hardcoded values scattered across dozens of pipelines. Instead of updating each pipeline individually when you promote from UAT to Production, you update a single global parameter and every pipeline picks it up automatically.
Common use cases:
- Environment: Dev, UAT, Production
- MaxRetryCount: control retry logic centrally
- StorageAccountUrl: swap endpoints per environment
- NotificationEmail: route alerts without pipeline changes
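Inside a pipeline, a global parameter is referenced through the pipeline().globalParameters expression. For example (StorageAccountUrl here is one of the illustrative parameter names from the list above, not something ADF predefines):

```
@pipeline().globalParameters.Environment
@concat(pipeline().globalParameters.StorageAccountUrl, '/input/data.csv')
```

Any activity property that accepts dynamic content can use these expressions, which is what makes a single parameter change propagate everywhere.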
Prerequisites
- PowerShell 5.1 or later
- An Azure subscription with Contributor access on the ADF resource
- The Az.DataFactory PowerShell module (the script installs it automatically)
The Script
```powershell
# Install the Az.DataFactory module only if it is missing
if (-not (Get-Module -ListAvailable -Name Az.DataFactory)) {
    Install-Module -Name Az.DataFactory -Scope CurrentUser -Force
}
Import-Module Az.DataFactory

# Sign in interactively (a browser window opens)
Connect-AzAccount -SubscriptionId "6f50c2eb-40ba-45e8-be26-98443a01899f"

# Define resource details
$resourceGroupName = "rg1"
$dataFactoryName   = "adf1"

# Add or remove parameters in this JSON array
$paramJson = @'
[
    { "Name": "Environment", "Type": "String", "Value": "Production" },
    { "Name": "MaxRetryCount", "Type": "Int", "Value": 3 }
]
'@ | ConvertFrom-Json

# Build the typed dictionary the cmdlet expects
$globalParams = New-Object "System.Collections.Generic.Dictionary[string, Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification]"
foreach ($entry in $paramJson) {
    $spec = New-Object Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification
    $spec.Type  = $entry.Type
    $spec.Value = $entry.Value
    $globalParams.Add($entry.Name, $spec)
}

# Fetch the factory's real location — avoids location mismatch errors
$existingFactory = Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName
$location = $existingFactory.Location

Set-AzDataFactoryV2 `
    -ResourceGroupName $resourceGroupName `
    -Name $dataFactoryName `
    -GlobalParameterDefinition $globalParams `
    -Location $location `
    -Force
```
Key Design Decisions
1. Install the Module Only When Missing
```powershell
if (-not (Get-Module -ListAvailable -Name Az.DataFactory)) {
    Install-Module -Name Az.DataFactory -Scope CurrentUser -Force
}
```
Running Install-Module unconditionally adds several seconds of overhead on every run. This guard makes the script safe to call repeatedly in a CI/CD pipeline.
2. Inline JSON Array for Parameters
Parameters are defined as an inline JSON array, making it trivial to add a new one — no extra files, no hashtable type-conversion issues:
```json
[
    { "Name": "Environment", "Type": "String", "Value": "Production" },
    { "Name": "MaxRetryCount", "Type": "Int", "Value": 3 }
]
```
Supported Type values: String, Int, Float, Bool, Array, Object.
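For the non-scalar types, the Value field simply holds JSON of the matching shape. A sketch with hypothetical parameter names (FeatureFlags, AllowedRegions, ErrorThreshold, IsDryRun are illustrations, not values the script above defines):

```json
[
    { "Name": "FeatureFlags", "Type": "Object", "Value": { "enableRetry": true } },
    { "Name": "AllowedRegions", "Type": "Array", "Value": ["eastus", "westus2"] },
    { "Name": "ErrorThreshold", "Type": "Float", "Value": 0.05 },
    { "Name": "IsDryRun", "Type": "Bool", "Value": false }
]
```

Note that ConvertFrom-Json turns a nested Object value into a PSCustomObject before it is assigned to the specification, so it is worth verifying in the ADF UI that complex values round-trip the way you expect.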
3. Dynamically Fetch the Factory Location
A common gotcha: Set-AzDataFactoryV2 requires the -Location parameter, and if you hard-code the wrong display name (e.g., "East US" instead of "East US 2"), Azure throws a Conflict / InvalidResourceLocation error. Fetching the location from the existing factory object eliminates this entirely:
```powershell
$location = (Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName).Location
```
4. -Force Skips the Confirmation Prompt
Without -Force, PowerShell asks "A data factory with this name exists. Are you sure?" — which breaks unattended runs. Since we are intentionally updating an existing factory, -Force is the correct flag to use.
Adding a New Global Parameter
Open the script and add a line to the JSON array:
```json
[
    { "Name": "Environment", "Type": "String", "Value": "Production" },
    { "Name": "MaxRetryCount", "Type": "Int", "Value": 3 },
    { "Name": "StorageAccountUrl", "Type": "String", "Value": "https://mystorage.blob.core.windows.net" }
]
```
Run the script. Done.
Note: Set-AzDataFactoryV2 replaces the entire global parameter set. Parameters not included in the array will be removed. Always include all parameters you want to keep.
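If you want additive updates instead of the replace-all behavior, one option is to merge the deployed parameter set into the new one before calling Set-AzDataFactoryV2. A sketch, assuming the factory object returned by Get-AzDataFactoryV2 exposes the deployed parameters via a GlobalParameters property, and reusing $globalParams and the resource variables from the main script:

```powershell
# Sketch: merge so parameters already deployed are not dropped
$factory = Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName

$merged = New-Object "System.Collections.Generic.Dictionary[string, Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification]"

# Start from what is already deployed (may be empty on a fresh factory)
if ($factory.GlobalParameters) {
    foreach ($kv in $factory.GlobalParameters.GetEnumerator()) { $merged[$kv.Key] = $kv.Value }
}

# Overlay the parameters defined in this script; script values win on name collisions
foreach ($kv in $globalParams.GetEnumerator()) { $merged[$kv.Key] = $kv.Value }

Set-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName `
    -GlobalParameterDefinition $merged -Location $factory.Location -Force
```

The trade-off is that a merge can never delete a parameter, so stale entries accumulate; the replace-all approach in the main script is the safer default when the JSON array is the single source of truth.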
Running the Script
```powershell
.\Untitled-1.ps1
```
A browser window opens for Azure sign-in. After authentication, the script fetches the factory, applies the parameters, and exits.
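To confirm the update took effect, you can read the parameters back. A sketch, assuming the same $resourceGroupName and $dataFactoryName variables from the script and the GlobalParameters property on the returned factory object:

```powershell
# Sketch: list each global parameter with its type and current value
$factory = Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName
$factory.GlobalParameters.GetEnumerator() |
    ForEach-Object { "{0} ({1}) = {2}" -f $_.Key, $_.Value.Type, $_.Value.Value }
```

The same values should appear in the ADF Studio under Manage, Global parameters.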
Integrating with CI/CD
For automated pipelines (GitHub Actions, Azure DevOps), replace browser sign-in with a service principal:
```powershell
Connect-AzAccount -ServicePrincipal `
    -TenantId $env:ARM_TENANT_ID `
    -Credential (New-Object PSCredential($env:ARM_CLIENT_ID, (ConvertTo-SecureString $env:ARM_CLIENT_SECRET -AsPlainText -Force)))
```
Store credentials as pipeline secrets, never in source code.
Conclusion
Managing ADF global parameters through PowerShell is straightforward once you know the two key pitfalls — location mismatch and the replace-all behavior. With an inline JSON array as your parameter source, the script stays readable, easily extendable, and pipeline-friendly.
#AzureDataFactory, #PowerShell, #ADF, #Azure, #DataEngineering, #CloudAutomation, #ETL, #DevOps, #AzureAutomation, #DataPipeline