
What the heck are they?
How to create Global Parameters
1. Once logged into your Data Factory workspace, navigate to the Manage tab on the left-hand side, then to the Global Parameters section.
2. Click on the “+ New” button just underneath the page heading.



3. In the popup window that appears to the right hand side of the screen:
- Supply the name of the parameter (avoid spaces and dashes in the name; at the time of writing these cause runtime errors because of how parameter names are resolved at runtime)
- Define the data type of the parameter, e.g. string, integer, etc.
- Provide the value. Note that a global parameter is read only at runtime; unlike a variable, its value cannot be updated by pipeline activities
4. Once created, you can edit it if need be. If you need to apply bulk changes, the “Edit All” button can assist with that.


Referencing the Global Parameters at runtime
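Inside a pipeline expression, a global parameter is referenced through the pipeline() object. For example, assuming a global parameter named AzureDataLakeURL (as in the factory file shown further below) and an illustrative container path, a dataset or activity property could use:

```
@pipeline().globalParameters.AzureDataLakeURL

@concat(pipeline().globalParameters.AzureDataLakeURL, 'raw/sales/')
```

The same syntax works anywhere the expression builder is available, which is what makes these useful for environment-specific connection details.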
Store source and target resources connection details
- Connect the UAT data factory to the Git repo’s UAT branch.
- Once all required code changes have been made, raise a pull request from the UAT branch into the PROD branch.
- Commit a change in the PROD branch to update the JSON file containing the global parameters to the PROD values.
- Connect the PROD data factory to the PROD branch.
- Publish the PROD branch’s code to the Live mode of the PROD data factory.
When using CI/CD
- Choose to include them in the ARM template generated at publish time and alter them using Azure Pipelines
- Update them using post deploy PowerShell scripts
Choose to include them in the ARM template generated at publish time
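When global parameters are included in the generated ARM template, each one surfaces as a template parameter named in the form <factoryName>_properties_globalParameters_<parameterName>_value, which your release pipeline can override at deploy time. As a sketch (the resource group, file names, and factory/parameter names below are illustrative), a deployment step could look like:

```
# Hypothetical example: deploy the published ARM template to the PROD
# resource group, overriding the AzureDataLakeURL global parameter.
New-AzResourceGroupDeployment `
    -ResourceGroupName "rg-datafactory-prod" `
    -TemplateFile "ARMTemplateForFactory.json" `
    -TemplateParameterFile "ARMTemplateParametersForFactory.json" `
    -MyDataFactory_properties_globalParameters_AzureDataLakeURL_value "https://myprodlakename.dfs.core.windows.net/"
```

Check the parameter names in your own generated ARMTemplateParametersForFactory.json, as the exact names depend on your factory name.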

Update the parameters using post deploy PowerShell scripts
{
    "name": "MyDataFactory",
    "properties": {
        "globalParameters": {
            "AzureDataLakeURL": {
                "type": "string",
                "value": "https://mydatalakename.dfs.core.windows.net/"
            }
        }
    },
    "location": "southafricanorth",
    "identity": {
        "type": "SystemAssigned",
        "principalId": "12345678-4321-0987-6748-123456789076",
        "tenantId": "12345678-4321-0987-6748-123456789076"
    }
}
Please note, the official documentation here appears to be outdated regarding where these global parameters are stored.
The article states that they are stored in a dedicated JSON file in the Git repo, in a folder called globalParameters.
However, when I set this up myself, I found they were stored as shown above, as a nested attribute in the factory config file.
Microsoft-provided PowerShell for altering the dedicated globalParameters file
# Define the parameters you will provide values for at runtime
param
(
    [parameter(Mandatory = $true)] [String] $globalParametersFilePath,
    [parameter(Mandatory = $true)] [String] $resourceGroupName,
    [parameter(Mandatory = $true)] [String] $dataFactoryName
)

# Import the cmdlets you will need below
Import-Module Az.DataFactory

# Define a new dictionary you will populate and assign to the factory
$newGlobalParameters = New-Object 'System.Collections.Generic.Dictionary[string,Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification]'

# Write out a message for debugging and logging purposes
Write-Host "Getting global parameters JSON from: " $globalParametersFilePath

# Get the current content of the file as a single string (-Raw avoids an array of lines)
$globalParametersJson = Get-Content $globalParametersFilePath -Raw

# Write out a message for debugging and logging purposes
Write-Host "Parsing JSON..."

# Parse the JSON string into a JObject
$globalParametersObject = [Newtonsoft.Json.Linq.JObject]::Parse($globalParametersJson)

# Loop over each parameter in the file
foreach ($gp in $globalParametersObject.GetEnumerator()) {
    # Write out a message for debugging and logging purposes
    Write-Host "Adding global parameter:" $gp.Key

    # Convert the current parameter's value to a GlobalParameterSpecification
    $globalParameterValue = $gp.Value.ToObject([Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification])

    # Add the current parameter to the new collection
    $newGlobalParameters.Add($gp.Key, $globalParameterValue)
}

# Get the target data factory to update
$dataFactory = Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName

# Assign the new parameter values to the in-memory factory object
$dataFactory.GlobalParameters = $newGlobalParameters

# Write out a message for debugging and logging purposes
Write-Host "Updating" $newGlobalParameters.Count "global parameters."

# Force-update the target data factory
Set-AzDataFactoryV2 -InputObject $dataFactory -Force
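Either script is invoked the same way; as a sketch (the script name, file path, resource group, and factory name below are all placeholders), a post-deployment step could run:

```
.\Update-GlobalParameters.ps1 `
    -globalParametersFilePath ".\factory\MyDataFactory.json" `
    -resourceGroupName "rg-datafactory-prod" `
    -dataFactoryName "MyDataFactory-PROD"
```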
Custom PowerShell for altering the globalParameters attribute in the factory config file
# Define the parameters you will provide values for at runtime
param
(
    [parameter(Mandatory = $true)] [String] $globalParametersFilePath,
    [parameter(Mandatory = $true)] [String] $resourceGroupName,
    [parameter(Mandatory = $true)] [String] $dataFactoryName
)

# Import the cmdlets you will need below
Import-Module Az.DataFactory

# Define a new dictionary you will populate and assign to the factory
$newGlobalParameters = New-Object 'System.Collections.Generic.Dictionary[string,Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification]'

# Write out a message for debugging and logging purposes
Write-Host "Getting global parameters JSON from: " $globalParametersFilePath

# Get the current content of the factory config file as a single string
$factoryFileJson = Get-Content $globalParametersFilePath -Raw

# Write out a message for debugging and logging purposes
Write-Host "Parsing JSON..."

# Parse the JSON string into a JObject
$factoryFileObject = [Newtonsoft.Json.Linq.JObject]::Parse($factoryFileJson)

# Loop over each parameter nested under properties.globalParameters
foreach ($gp in $factoryFileObject.properties.globalParameters.GetEnumerator()) {
    # Write out a message for debugging and logging purposes
    Write-Host "Adding global parameter:" $gp.Key

    # Convert the current parameter's value to a GlobalParameterSpecification
    $globalParameterValue = $gp.Value.ToObject([Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification])

    # Add the current parameter to the new collection
    $newGlobalParameters.Add($gp.Key, $globalParameterValue)
}

# Get the target data factory to update
$dataFactory = Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName

# Assign the new parameter values to the in-memory factory object
$dataFactory.GlobalParameters = $newGlobalParameters

# Write out a message for debugging and logging purposes
Write-Host "Updating" $newGlobalParameters.Count "global parameters."

# Force-update the target data factory
Set-AzDataFactoryV2 -InputObject $dataFactory -Force
Disclaimer – I have not yet tried the PowerShell code above myself; I only came across it while researching this article. So if you find any part of the code not working as expected, please leave a comment below with your findings so we can update the article and keep the code usable for anyone referencing it in future.
Pipelines in Synapse Analytics Consideration
- For each “global variable” you need, define a variable in the pipeline itself
- Use a “Set Variable” activity at the start of the pipeline flow to populate this variable
- For the expression, use an if statement that checks the current Synapse workspace name via the system variable that exposes it
- If it is the DEV instance, supply the DEV version of your parameter value; if it is the UAT instance name, supply the UAT value, and so on
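As a sketch of that Set Variable expression (the workspace and storage account names below are illustrative; in Synapse pipelines the pipeline().DataFactory system variable returns the current workspace name):

```
@if(equals(pipeline().DataFactory, 'syn-myworkspace-dev'),
    'https://mydevlake.dfs.core.windows.net/',
    if(equals(pipeline().DataFactory, 'syn-myworkspace-uat'),
        'https://myuatlake.dfs.core.windows.net/',
        'https://myprodlake.dfs.core.windows.net/'))
```

Because the expression keys off the workspace name, the same pipeline definition can be promoted across environments without any code changes.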
If you like what I do please consider supporting me on Ko-Fi