Overview
This article describes how to set up automated capture of ADF pipeline activity logs for the Fusion to Snowflake pipeline. After every pipeline run, detailed activity-level logs are automatically captured and stored in a dedicated Snowflake table, enabling centralized log storage and easier operational monitoring. This setup improves visibility into pipeline executions and provides historical execution tracking.
Logs are fetched using the ADF REST API (management.azure.com) via a Managed Identity and inserted into Snowflake using a stored procedure.
What Gets Captured
Every activity in a pipeline run is logged as a separate row (an illustrative table sketch follows this list), including:
- Pipeline Name, Pipeline Run ID
- Activity Name, Activity Run ID, Type and Status
- Start Time, End Time and Duration
- Rows Read, Copied and Skipped (Copy activities only)
- Source Table Name
- Integration Runtime Used
- Error message if the activity failed
- Timestamp when the log was inserted
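These fields map one-to-one to columns in the ADF_ACTIVITY_LOGS table. The authoritative DDL ships in the template files described below; the sketch here is only an assumption of the table's shape, with illustrative column names and types:

```sql
-- Illustrative shape only; use the DDL shipped in the template file.
-- Schema, column names, and types here are assumptions.
CREATE TABLE IF NOT EXISTS MY_SCHEMA.ADF_ACTIVITY_LOGS (
    PIPELINE_NAME        VARCHAR,
    PIPELINE_RUN_ID      VARCHAR,
    ACTIVITY_NAME        VARCHAR,
    ACTIVITY_RUN_ID      VARCHAR,
    ACTIVITY_TYPE        VARCHAR,
    STATUS               VARCHAR,
    START_TIME           TIMESTAMP_NTZ,
    END_TIME             TIMESTAMP_NTZ,
    DURATION_SECONDS     NUMBER,
    ROWS_READ            NUMBER,        -- Copy activities only
    ROWS_COPIED          NUMBER,        -- Copy activities only
    ROWS_SKIPPED         NUMBER,        -- Copy activities only
    SOURCE_TABLE_NAME    VARCHAR,
    INTEGRATION_RUNTIME  VARCHAR,
    ERROR_MESSAGE        VARCHAR,
    LOG_INSERTED_AT      TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);
```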
Requirements
Before setting up pipeline log capture, ensure the following are in place:
| Requirement | Details |
|---|---|
| Managed Identity Enabled | Azure Portal → Your ADF → Settings → Managed Identities → System Assigned → Status: ON |
| Reader Role Assigned | Azure Portal → Your ADF → Access Control (IAM) → Add Role Assignment → Role: Reader → Assign to ADF Managed Identity |
| ADF Resource ID | Contains the Subscription ID, Resource Group and ADF Name — see the next section for how to find it |
Note: The Managed Identity and Reader role are required so that ADF can securely call its own REST API without storing any credentials or secrets.
How to Find the Resource ID
The Resource ID contains everything needed to build the API URL — Subscription ID, Resource Group Name and ADF Instance Name — all in one place.
Option 1 — Azure Portal Properties: Azure Portal → Your ADF → Properties → Resource ID
Option 2 — JSON View of the ADF Instance: Azure Portal → Your ADF → top-right corner → JSON View button
This shows the full resource JSON, including:
"id": "/subscriptions/{subId}/resourceGroups/{rg}/providers/Microsoft.DataFactory/factories/{adfName}"
Template Files
As part of this setup, you will find two files included in the template:
- CREATE_TABLE_QUERY_FOR_PIPELINE_ACTIVITY_LOGS_TABLE — contains the CREATE TABLE statement for the ADF_ACTIVITY_LOGS table. Replace the schema name with your actual schema name before execution.
- ADF_ACTIVITY_LOGS — contains the stored procedure ADF_INSERT_ACTIVITY_LOGS. Replace the schema name with your actual schema name before execution.
Setup Steps
Step 1 — Run the Table DDL in Snowflake
Execute the first template file (CREATE_TABLE_QUERY_FOR_PIPELINE_ACTIVITY_LOGS_TABLE) in your Snowflake worksheet to create the ADF_ACTIVITY_LOGS table in your target schema.
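A quick way to confirm the table was created, assuming MY_SCHEMA stands in for your actual schema:

```sql
-- Confirm the table exists and inspect its columns
DESC TABLE MY_SCHEMA.ADF_ACTIVITY_LOGS;
```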
Step 2 — Create the Stored Procedure in Snowflake
Execute the second template file (ADF_ACTIVITY_LOGS) in your Snowflake worksheet to create the ADF_INSERT_ACTIVITY_LOGS stored procedure.
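To confirm the procedure was created, you can run the following (MY_DB and MY_SCHEMA are placeholders for your database and schema):

```sql
-- Confirm the stored procedure was created in the target schema
SHOW PROCEDURES LIKE 'ADF_INSERT_ACTIVITY_LOGS' IN SCHEMA MY_DB.MY_SCHEMA;
```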
Step 3 — Update Web Activity Resource Details
After importing the pipeline, update the Web Activity (GET_PIPELINE_ACTIVITY_LOGS) with your ADF resource details.
In the Web Activity URL field, replace the resourceId placeholder with your actual ADF instance resourceId.
Note: Double-check that the Web Activity URL follows the format below:
@{concat( 'https://management.azure.com/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/resourceGroups/YourResourceGroup/providers/Microsoft.DataFactory/factories/YourADFName/pipelineruns/', pipeline().RunId, '/queryActivityruns?api-version=2018-06-01' )}
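Note that queryActivityruns is a POST endpoint: the management API expects a filter window (lastUpdatedAfter / lastUpdatedBefore) in the request body. The imported template should already configure the Web Activity's method and body; if you ever need to rebuild it, a body along these lines works, though the exact expressions below are illustrative rather than taken from the template:

```json
{
  "lastUpdatedAfter": "@{addhours(pipeline().TriggerTime, -1)}",
  "lastUpdatedBefore": "@{addhours(utcnow(), 1)}"
}
```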
Step 4 — Publish All Changes
- Publish the changes, then click Trigger Now to execute the pipeline.
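Once the run completes, you can verify that logs landed in Snowflake. A minimal check, assuming the illustrative schema and LOG_INSERTED_AT column from the sketch above (align these with your actual DDL):

```sql
-- Inspect the most recently inserted activity-log rows
SELECT *
FROM MY_SCHEMA.ADF_ACTIVITY_LOGS
ORDER BY LOG_INSERTED_AT DESC
LIMIT 20;
```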