Integration Pipelines

Dynamic Datasets in Azure Data Factory

In a previous post, linked at the bottom, I showed how you can set up global parameters in your Data Factory that are accessible from any pipeline at run time. This post will show you how you can leverage global parameters to minimize the number of datasets you need to create. Specifically, I will show how …
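As a rough sketch of the idea (not the post's actual example), the definition below shows what a single parameterized dataset might look like, written here as a Python dict mirroring the dataset JSON. The dataset name, linked service name, and parameter names are all made up for illustration. The container and folder path are expressions bound to dataset parameters, which a pipeline can in turn populate from global parameters via expressions like `@pipeline().globalParameters.<name>`.

```python
# Minimal sketch of one reusable, parameterized DelimitedText dataset
# (illustrative names throughout; shown as a Python dict for brevity).
dynamic_blob_dataset = {
    "name": "GenericDelimitedText",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLS",  # assumed linked service name
            "type": "LinkedServiceReference",
        },
        # Dataset parameters: one dataset can now point at any container/path
        # instead of needing a dedicated dataset per location.
        "parameters": {
            "container": {"type": "String"},
            "path": {"type": "String"},
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": {"value": "@dataset().container", "type": "Expression"},
                "folderPath": {"value": "@dataset().path", "type": "Expression"},
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": True,
        },
    },
}

# In a pipeline activity, the dataset parameters could then be supplied from
# global parameters, e.g.:
#   "container": "@pipeline().globalParameters.landingContainer"
```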


Parameterize Synapse Analytics Spark Notebooks Efficiently

When creating pipelines in any sort of data flow to move data from an incoming source to a target location, you ideally don’t want to create single-purpose activities that perform only one action, or set of actions, on a specific set of source and target objects. For example: take the source.sales table, filter out where …
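To illustrate the alternative, here is a minimal PySpark sketch of a parameterized notebook; the table name, path, and filter are illustrative, not taken from the post. In Synapse, the first few assignments would live in a cell toggled as the parameters cell, so a pipeline's Notebook activity can override the defaults at run time and reuse the same notebook for many source/target combinations.

```python
from pyspark.sql import SparkSession

# Parameters cell (defaults only; a Synapse Notebook activity would override
# these at run time). All values here are hypothetical examples.
source_table = "source.sales"
target_path = "abfss://curated@mydatalake.dfs.core.windows.net/sales"
filter_condition = "region = 'EMEA'"

# In a Synapse notebook the session already exists; getOrCreate() keeps the
# sketch runnable elsewhere too.
spark = SparkSession.builder.getOrCreate()

# Generic load -> filter -> write flow driven entirely by the parameters,
# rather than a single-purpose activity hard-coded to one table.
df = spark.read.table(source_table)
df = df.filter(filter_condition)
df.write.mode("overwrite").parquet(target_path)
```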
