Friday, April 24, 2026

Parsing Copilot Studio Conversation Transcripts in Power Apps


Ever stared at a Copilot Studio transcript JSON and wondered how to turn that nested mess into a clean, readable table? 

The raw transcript comes as a single JSON string with dozens of activities: traces, events, debug plans, tool calls, and somewhere buried in there, the actual user prompts and bot responses. Most of it is plumbing. What you really care about are the messages where role = 1 (user) and role = 0 (bot), paired together by the replyToId field.
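For orientation, a heavily trimmed, hypothetical transcript might look like this (real transcripts carry many more activity types and fields, and the exact schema can vary by Copilot Studio version):

```json
{
  "activities": [
    { "type": "trace", "valueType": "..." },
    { "type": "message", "id": "a1",
      "from": { "role": 1 }, "text": "What is our leave policy?",
      "timestamp": 1745470000 },
    { "type": "message", "id": "b1", "replyToId": "a1",
      "from": { "role": 0 }, "text": "Employees receive 20 days of paid leave...",
      "timestamp": 1745470003 },
    { "type": "event", "name": "..." }
  ]
}
```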

The trick in Power Fx is ParseJSON combined with a Clear + Collect pattern. ForAll iterates over each transcript, filters activities down to type = "message", then pairs each user prompt with its matching bot response using the replyToId. Timestamps come through as Unix epoch seconds, so a quick DateAdd against DateTime(1970,1,1,0,0,0) with TimeUnit.Seconds converts them to real datetimes.

Here's the full formula I dropped into App.OnStart:

Clear(colChatTurns);
ForAll(
    ConversationTranscripts As Transcript,
    With(
        { parsed: ParseJSON(Transcript.Content) },
        With(
            {
                allMessages: Filter(
                    Table(parsed.activities),
                    Text(ThisRecord.Value.type) = "message"
                )
            },
            ForAll(
                Filter(allMessages, Text(ThisRecord.Value.from.role) = "1") As UserMsg,
                With(
                    {
                        // Look up the matching bot reply once, instead of twice
                        botMsg: First(
                            Filter(
                                allMessages,
                                Text(ThisRecord.Value.from.role) = "0"
                                And Text(ThisRecord.Value.replyToId) = Text(UserMsg.Value.id)
                            )
                        )
                    },
                    Collect(
                        colChatTurns,
                        {
                            ReplyToId: Text(UserMsg.Value.id),
                            UserPrompt: Text(UserMsg.Value.text),
                            PromptTimestamp: DateAdd(
                                DateTime(1970,1,1,0,0,0),
                                Value(UserMsg.Value.timestamp),
                                TimeUnit.Seconds
                            ),
                            BotResponse: Text(botMsg.Value.text),
                            ResponseTimestamp: DateAdd(
                                DateTime(1970,1,1,0,0,0),
                                Value(botMsg.Value.timestamp),
                                TimeUnit.Seconds
                            )
                        }
                    )
                )
            )
        )
    )
)

A few gotchas worth knowing. ParseJSON returns untyped objects, so every field needs explicit coercion with Text() or Value(). The role field can behave as either a number or a string depending on the source, so comparing with "1" and "0" as strings is safer than numeric comparisons. Ungroup sounds like the right tool for flattening nested results, but it chokes on string identifiers — Clear + Collect inside a ForAll loop is cleaner and easier to debug. And always check View → Collections to see if your data is actually landing where you think it is.
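For sanity-checking a transcript file outside Power Apps, the same filter-and-pair logic can be sketched in Python. This is a minimal sketch assuming the schema described above (activities with type, id, replyToId, from.role, text, and Unix-epoch-seconds timestamps); field names may differ in your transcripts.

```python
# Minimal offline sketch of the same pairing logic. Assumes the transcript
# schema described above; adjust field names if your transcripts differ.
import json
from datetime import datetime, timezone

def chat_turns(transcript_content: str) -> list:
    activities = json.loads(transcript_content).get("activities", [])
    # Keep only real messages; traces, events, and debug plans are plumbing
    messages = [a for a in activities if str(a.get("type")) == "message"]
    # str() tolerates role arriving as either a number or a string
    bots = {str(m.get("replyToId")): m
            for m in messages if str(m.get("from", {}).get("role")) == "0"}
    turns = []
    for user in messages:
        if str(user.get("from", {}).get("role")) != "1":
            continue
        bot = bots.get(str(user.get("id")), {})
        turns.append({
            "UserPrompt": user.get("text"),
            "BotResponse": bot.get("text"),
            "PromptTimestamp": datetime.fromtimestamp(
                int(user["timestamp"]), tz=timezone.utc),
        })
    return turns
```

Indexing bot replies in a dictionary keyed by replyToId mirrors what the nested Filter does in Power Fx, just without scanning the message list once per user prompt.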

Once the collection is populated, binding it to a Data Table gives you an instant conversation viewer. Load it on App.OnStart for speed, or on Screen.OnVisible if you need fresh data every visit. Skip the intermediate gallery and point straight at your data source to keep things lean.

From messy telemetry JSON to a working chat review dashboard.

Tags: PowerApps, PowerFx, CopilotStudio, LowCode, MicrosoftPowerPlatform, JSON, ParseJSON, Dataverse, SharePoint, ConversationAnalytics, ChatbotAnalytics, AIAgents, PowerAppsTutorial, EnterpriseIT, SolutionArchitecture

Wednesday, April 22, 2026

How to Set Global Parameters in an Existing Azure Data Factory Using PowerShell


Global parameters in Azure Data Factory (ADF) let you define constants that can be reused across all pipelines in a factory — think environment names, retry counts, API base URLs, and more. Managing them via PowerShell gives you repeatability, version control, and automation that clicking through the Azure Portal simply cannot match.

In this post, I'll walk you through a clean, production-ready script that sets global parameters on an existing ADF instance without recreating it.


Why Global Parameters?

ADF global parameters solve a common problem: hardcoded values scattered across dozens of pipelines. Instead of updating each pipeline individually when you promote from UAT to Production, you update a single global parameter and every pipeline picks it up automatically.

Common use cases:

  • Environment — Dev, UAT, Production
  • MaxRetryCount — control retry logic centrally
  • StorageAccountUrl — swap endpoints per environment
  • NotificationEmail — route alerts without pipeline changes
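Once defined, pipelines reference a global parameter through the @pipeline().globalParameters accessor. For example, an activity property bound to the Environment parameter looks like this in the pipeline JSON:

```json
{
  "value": "@pipeline().globalParameters.Environment",
  "type": "Expression"
}
```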

Prerequisites

  • PowerShell 5.1 or later
  • An Azure subscription with Contributor access on the ADF resource
  • The Az.DataFactory PowerShell module (the script installs it automatically)

The Script

if (-not (Get-Module -ListAvailable -Name Az.DataFactory)) {
    Install-Module -Name Az.DataFactory -Scope CurrentUser -Force
}

Import-Module Az.DataFactory

Connect-AzAccount -SubscriptionId "6f50c2eb-40ba-45e8-be26-98443a01899f"

# Define Resource Details
$resourceGroupName = "rg1"
$dataFactoryName   = "adf1"

# Add or remove parameters in this JSON array
$paramJson = @'
[
  { "Name": "Environment",   "Type": "String", "Value": "Production" },
  { "Name": "MaxRetryCount", "Type": "Int",    "Value": 3            }
]
'@ | ConvertFrom-Json

$globalParams = New-Object "System.Collections.Generic.Dictionary[string, Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification]"

foreach ($entry in $paramJson) {
    $spec       = New-Object Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification
    $spec.Type  = $entry.Type
    $spec.Value = $entry.Value
    $globalParams.Add($entry.Name, $spec)
}

# Fetch the factory's real location — avoids location mismatch errors
$existingFactory = Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName
$location        = $existingFactory.Location

Set-AzDataFactoryV2 `
  -ResourceGroupName         $resourceGroupName `
  -Name                      $dataFactoryName `
  -GlobalParameterDefinition $globalParams `
  -Location                  $location `
  -Force

Key Design Decisions

1. Install the Module Only When Missing

if (-not (Get-Module -ListAvailable -Name Az.DataFactory)) {
    Install-Module -Name Az.DataFactory -Scope CurrentUser -Force
}

Running Install-Module unconditionally adds several seconds of overhead on every run. This guard makes the script safe to call repeatedly in a CI/CD pipeline.

2. Inline JSON Array for Parameters

Parameters are defined as an inline JSON array, making it trivial to add a new one — no extra files, no hashtable type-conversion issues:

[
  { "Name": "Environment",   "Type": "String", "Value": "Production" },
  { "Name": "MaxRetryCount", "Type": "Int",    "Value": 3            }
]

Supported Type values: String, Int, Float, Bool, Array, Object.

3. Dynamically Fetch the Factory Location

A common gotcha: Set-AzDataFactoryV2 requires the -Location parameter, and if you hard-code the wrong display name (e.g., "East US" instead of "East US 2"), Azure throws a Conflict / InvalidResourceLocation error. Fetching the location from the existing factory object eliminates this entirely:

$location = (Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName).Location

4. -Force Skips the Confirmation Prompt

Without -Force, PowerShell asks "A data factory with this name exists. Are you sure?" — which breaks unattended runs. Since we are intentionally updating an existing factory, -Force is the correct flag to use.


Adding a New Global Parameter

Open the script and add a line to the JSON array:

[
  { "Name": "Environment",       "Type": "String", "Value": "Production"                          },
  { "Name": "MaxRetryCount",     "Type": "Int",    "Value": 3                                     },
  { "Name": "StorageAccountUrl", "Type": "String", "Value": "https://mystorage.blob.core.windows.net" }
]

Run the script. Done.

Note: Set-AzDataFactoryV2 replaces the entire global parameter set. Parameters not included in the array will be removed. Always include all parameters you want to keep.

Running the Script

.\Untitled-1.ps1

A browser window opens for Azure sign-in. After authentication, the script fetches the factory, applies the parameters, and exits.


Integrating with CI/CD

For automated pipelines (GitHub Actions, Azure DevOps), replace browser sign-in with a service principal:

Connect-AzAccount -ServicePrincipal `
  -TenantId   $env:ARM_TENANT_ID `
  -Credential (New-Object PSCredential($env:ARM_CLIENT_ID, (ConvertTo-SecureString $env:ARM_CLIENT_SECRET -AsPlainText -Force)))

Store credentials as pipeline secrets, never in source code.


Conclusion

Managing ADF global parameters through PowerShell is straightforward once you know the two key pitfalls — location mismatch and the replace-all behavior. With an inline JSON array as your parameter source, the script stays readable, easily extendable, and pipeline-friendly.


#AzureDataFactory, #PowerShell, #ADF, #Azure, #DataEngineering, #CloudAutomation, #ETL, #DevOps, #AzureAutomation, #DataPipeline

Wednesday, April 15, 2026

Power Platform + Postman: The Complete Developer Guide

Power Platform Developer Series


✍ Sreekanth Udayagiri  |  📅 April 2026  |  ⏱ 15 min read

Postman is one of the most popular API development and testing tools used by developers worldwide. When working with Microsoft Power Platform, Postman plays multiple roles: as an API client, a testing harness, a debugging tool, a connector generator, and even as an integration target.

In this post, I'll walk you through all 6 ways Postman is used with Power Platform — with sample code and step-by-step instructions for each, based entirely on official Microsoft documentation.

💡 Based on Official Microsoft Docs: Microsoft officially supports Postman for Power Platform development. Their documentation lists Postman alongside Insomnia, Bruno, and curl as recommended API client tools for the Dataverse Web API. All steps in this guide are verified against Microsoft Learn.

01. Dataverse Web API — connect, authenticate, and run CRUD operations on Dataverse tables
02. Power Automate HTTP Trigger — test and trigger Power Automate flows via HTTP endpoints
03. Custom Connectors — import Postman Collections to auto-generate custom connectors
04. Custom APIs in Dataverse — test plugin-backed custom API messages via Postman
05. Power Pages Web API — test portal table permissions and data operations
06. Native Power Automate Integration — back up collections, send monitor results to Power Automate


1. Postman with Dataverse Web API — Setup + CRUD

Connect Postman to your Power Apps / Dataverse environment and run Create, Read, Update, Delete operations

Microsoft Dataverse exposes a RESTful Web API based on OData v4. Postman lets you authenticate to this API and run queries — without writing any code. This is useful for debugging, data exploration, and verifying Power Platform logic.

Step 1 — Get Your Dataverse Web API URL

1. Go to make.powerapps.com and select your environment from the top-right environment picker.
2. Click the Settings (⚙) icon → Developer resources.
3. Copy the Web API endpoint. It looks like:
   https://yourorg.api.crm.dynamics.com/api/data/v9.2
   Remove /api/data/v9.2 — keep only the part ending in .dynamics.com.

Step 2 — Create Postman Environment (6 Variables)

In Postman, go to Environments → + New. Add these 6 variables:

  • url: https://yourorg.api.crm.dynamics.com — your Dataverse org URL, no trailing slash
  • clientid: 51f81489-12ee-4a9e-aaae-a2591f45987d — Microsoft's pre-registered app ID; works for all Dataverse environments, no Azure App Registration needed
  • version: 9.2 — latest stable Web API version
  • webapiurl: {{url}}/api/data/v{{version}}/ — composite variable; base URL for all requests
  • redirecturl: https://localhost — OAuth redirect URI
  • authurl: https://login.microsoftonline.com/common/oauth2/authorize?resource={{url}} — Azure AD authorization endpoint

⚠️ Important: The clientid value 51f81489-12ee-4a9e-aaae-a2591f45987d is a Microsoft-provided application ID pre-approved for all Dataverse environments, as documented on Microsoft Learn. You do NOT need to register your own Azure AD app to get started.

Step 3 — Authenticate with OAuth 2.0

Create a new HTTP request. In the Authorization tab, set:

Postman Auth Tab Configuration — OAuth 2.0
// Authorization Tab → Type: OAuth 2.0
// Add auth to: Request Headers
// Configure New Token section:

Token Name    : DataverseToken
Grant Type    : Implicit
Auth URL      : {{authurl}}
Client ID     : {{clientid}}
Redirect URL  : {{redirecturl}}
Client Auth   : Send client credentials in body

// Click "Get New Access Token" → Sign in with your M365 account
// Click "Use Token" → Token is added to Authorization header automatically

Step 4 — Verify Connection with WhoAmI

WhoAmI — Test Connection (GET)
URL: {{webapiurl}}WhoAmI

// Required Headers (add via Header Manager)
Accept          : application/json
OData-MaxVersion: 4.0
OData-Version   : 4.0

// Expected Response — 200 OK
{
  "@odata.context": "https://yourorg.api.crm.dynamics.com/api/data/v9.2/$metadata#...",
  "BusinessUnitId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  "UserId":         "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  "OrganizationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
}
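Outside Postman, the same connectivity check is easy to script. Here is a minimal Python sketch; the org URL is a placeholder, and it assumes you already obtained a bearer token via the OAuth flow above:

```python
# Build the WhoAmI request with the headers Dataverse requires.
# "yourorg" and the token are placeholders, not real values.
import urllib.request

def build_whoami_request(org_url: str, access_token: str) -> urllib.request.Request:
    return urllib.request.Request(
        f"{org_url}/api/data/v9.2/WhoAmI",
        headers={
            "Accept": "application/json",
            "OData-MaxVersion": "4.0",
            "OData-Version": "4.0",
            "Authorization": f"Bearer {access_token}",
        },
    )

req = build_whoami_request("https://yourorg.api.crm.dynamics.com", "<token>")
# urllib.request.urlopen(req) would return the WhoAmI JSON for a live org and token
```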

CRUD Operations — Sample Requests

GET — Retrieve Accounts (top 3, with filter)

Retrieve Records (GET)
// Get top 3 accounts — select specific fields
URL: {{webapiurl}}accounts?$select=name,telephone1,emailaddress1&$top=3

// Get single record by GUID
URL: {{webapiurl}}accounts(xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)?$select=name,revenue

// Filter — accounts in a specific city
URL: {{webapiurl}}accounts?$filter=address1_city eq 'Bangalore'&$select=name,address1_city

// Expand — contacts linked to an account
URL: {{webapiurl}}accounts?$select=name&$expand=contact_customer_accounts($select=fullname,emailaddress1)

POST — Create a new Contact

Create Record (POST)
URL: {{webapiurl}}contacts

// Headers
Content-Type    : application/json
Accept          : application/json
OData-MaxVersion: 4.0
OData-Version   : 4.0

// Body — raw JSON
{
  "firstname":      "Sreekanth",
  "lastname":       "Test",
  "emailaddress1":  "test@example.com",
  "telephone1":     "+91-9999999999",
  "jobtitle":       "M365 Administrator"
}

// Response: 204 No Content
// New record GUID is in the OData-EntityId response header

PATCH — Update an existing record

Update Record (PATCH)
URL: {{webapiurl}}contacts(xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)

// Headers — include If-Match: * to avoid 412 Precondition Failed
Content-Type    : application/json
OData-MaxVersion: 4.0
OData-Version   : 4.0
If-Match        : *

// Body — only include fields you want to update
{
  "jobtitle":   "Senior M365 Administrator",
  "telephone1": "+91-8888888888"
}

// Response: 204 No Content = success

DELETE — Delete a record

Delete Record (DELETE)
URL: {{webapiurl}}contacts(xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)

// No request body needed
// Response: 204 No Content = successfully deleted

2. Testing Power Automate HTTP Request Triggers

Use Postman to manually fire Power Automate flows via HTTP endpoints

Power Automate's "When an HTTP request is received" trigger generates a unique URL endpoint. Postman is the ideal tool to test these flows — send JSON payloads, inspect responses, and validate flow logic without building a frontend first.

Scenario A — Anonymous Trigger (Anyone with URL)

The simplest case — no authentication required. Suitable for dev/test only; not recommended for production.

Power Automate Anonymous HTTP Trigger (POST)
// Copy the HTTP POST URL from the trigger card in Power Automate
URL: https://prod-xx.westus.logic.azure.com:443/workflows/<workflow-id>/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<signature>

// Headers
Content-Type: application/json
Accept      : application/json

// Body — must match the JSON Schema defined in the trigger
{
  "employeeId":   "EMP001",
  "employeeName": "John Doe",
  "department":   "IT",
  "action":       "offboard"
}

// Response: 202 Accepted — flow triggered successfully

JSON Schema to Define in Power Automate Trigger

JSON Schema — Paste into the Power Automate trigger
{
  "type": "object",
  "properties": {
    "employeeId":   { "type": "string" },
    "employeeName": { "type": "string" },
    "department":   { "type": "string" },
    "action":       { "type": "string" }
  }
}

Scenario B — Tenant-Restricted Trigger (Entra ID OAuth Required)

When a flow is restricted to "Any user in my tenant", Postman must pass a bearer token. Two steps: get the token first, then call the flow.

Step 1 — Get Bearer Token from Entra ID (POST)
URL: https://login.microsoftonline.com/<YOUR_TENANT_ID>/oauth2/v2.0/token

// Body: x-www-form-urlencoded (NOT JSON)
grant_type   : client_credentials
client_id    : <YOUR_ENTRA_APP_CLIENT_ID>
client_secret: <YOUR_ENTRA_APP_CLIENT_SECRET>
scope        : https://service.flow.microsoft.com//.default

// ⚠️ CRITICAL: Use double slash before .default in scope
// Single slash → "MisMatchingOAuthClaims" error

// Response
{
  "access_token": "eyJ0eXAiOiJKV1Qi...",
  "token_type":   "Bearer",
  "expires_in":   3599
}

Step 2 — Call the Tenant-Restricted Flow (POST)
URL: https://<environmentID>.environment.api.powerplatform.com/powerautomate/automations/direct/workflows/<flow-id>/triggers/manual/paths/invoke?api-version=1&sp=/triggers/manual/run&sv=1.0&sig=<sig>

// Headers
Authorization: Bearer <access_token_from_step_1>
Content-Type : application/json
Accept       : application/json

// Body
{
  "message":     "Hello from Postman",
  "requestedBy": "Sreekanth"
}

💡 Pro Tip — Auto Token Refresh with a Pre-request Script: Add a Pre-request Script in Postman to auto-fetch a fresh token before every request. This saves time when testing repeatedly.

Pre-request Script — Auto Token Refresh (JavaScript)
// Paste this in: Pre-request Script tab of your collection/request
const tokenUrl = `https://login.microsoftonline.com/${pm.environment.get('tenantId')}/oauth2/v2.0/token`;

pm.sendRequest({
  url   : tokenUrl,
  method: 'POST',
  header: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body  : {
    mode: 'urlencoded',
    urlencoded: [
      { key: 'grant_type',    value: 'client_credentials' },
      { key: 'client_id',     value: pm.environment.get('clientId') },
      { key: 'client_secret', value: pm.environment.get('clientSecret') },
      { key: 'scope',         value: 'https://service.flow.microsoft.com//.default' }
    ]
  }
}, (err, res) => {
  if (!err) {
    pm.environment.set('bearerToken', res.json().access_token);
  }
});

3. Creating Custom Connectors from a Postman Collection

Import a Postman Collection into Power Automate to auto-generate a Custom Connector — no Swagger file needed

If you have a third-party REST API, you can build a Postman Collection for it, export it, and Power Automate will automatically generate a Swagger definition and create a Custom Connector. This saves hours of manual API definition work.

Step 1 — Build and Export Your Postman Collection

Sample Collection JSON (V1 format)
{
  "info": {
    "name":   "ServiceNow KB API",
    "schema": "https://schema.getpostman.com/json/collection/v1/collection.json"
  },
  "item": [
    {
      "name": "Get KB Articles",
      "request": {
        "method": "GET",
        "header": [
          { "key": "Accept",        "value": "application/json" },
          { "key": "Authorization", "value": "Bearer {{token}}" }
        ],
        "url": "{{baseUrl}}/api/now/table/kb_knowledge?sysparm_limit=10"
      }
    },
    {
      "name": "Create KB Article",
      "request": {
        "method": "POST",
        "header": [
          { "key": "Content-Type", "value": "application/json" }
        ],
        "body": {
          "mode": "raw",
          "raw": "{\"short_description\":\"Test Article\",\"text\":\"Content here\"}"
        },
        "url": "{{baseUrl}}/api/now/table/kb_knowledge"
      }
    }
  ]
}

1. In Postman, right-click your collection → Export.
2. Select the "Collection v1" format → click Export → save the JSON file locally.

Step 2 — Import into Power Automate

1. Go to make.powerautomate.com → Data → Custom Connectors → + New custom connector → Import a Postman collection.
2. Upload your exported V1 JSON file. Power Automate parses it and generates the Swagger definition automatically.
3. Review the auto-generated Actions. Configure the Security tab (OAuth 2.0, API Key, or Basic Auth).
4. Click Create Connector. The connector is now available in Power Automate flows and Power Apps.

🎯 Best Use Case: When you have a third-party API (ServiceNow, Salesforce, any internal REST API) and want to call it from Power Automate or Power Apps without writing code — and you don't have a Swagger/OpenAPI spec ready.

4. Testing Custom APIs in Dataverse

Invoke and test your custom Dataverse API messages — plugin-backed actions, functions, and business events

Dataverse Custom APIs let developers define their own messages (actions or functions) — typically backed by C# plugins. Postman is the recommended tool for testing these during development. All three types of custom API calls are shown below.

Testing a Custom Unbound Action

Custom Unbound Action (POST)
// Unbound = not tied to a specific table record
URL: {{webapiurl}}myapi_CustomUnboundAPI
// Format: [publisher_prefix]_[APIName]

// Headers
Content-Type    : application/json
Accept          : application/json
OData-MaxVersion: 4.0
OData-Version   : 4.0

// Body — your custom API's input parameters
{
  "InputParam1": "SampleValue",
  "InputParam2": 42,
  "FlagParam":   true
}

Testing a Custom Bound Action (tied to a record)

Custom Bound Action (POST)
// Bound to a specific table and record GUID
URL: {{webapiurl}}accounts(xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)/Microsoft.Dynamics.CRM.myapi_BoundAction

// Body
{
  "ActionType": "Approve",
  "Comment":    "Approved via Postman test"
}

Testing a Custom Function (read-only, uses GET)

Custom Function (GET)
// Functions use GET — parameters go in the URL, not the body
URL: {{webapiurl}}myapi_GetEmployeeDetails(EmployeeId='EMP001',Department='IT')

// Expected response — your plugin's output parameters
{
  "@odata.context": "...",
  "EmployeeName":   "Sreekanth",
  "Status":         "Active",
  "RolesAssigned":  ["M365 Admin", "Power Platform Maker"]
}
📌 Create Custom APIs via Postman (Deep Insert): Microsoft supports creating Custom API records directly via Postman using deep insert — a single POST to {{webapiurl}}customapis. Useful for rapid iteration: create, test, delete, recreate — all without going to the maker portal.
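As a sketch, such a deep-insert body can look like the following — the myapi_ prefix and parameter are placeholders, column names follow the customapi table documentation, bindingtype 0 means global (unbound), and request-parameter type 10 corresponds to String; verify option values against Microsoft Learn before relying on them:

```json
{
  "uniquename": "myapi_CustomUnboundAPI",
  "name": "Custom Unbound API",
  "displayname": "Custom Unbound API",
  "description": "Created from Postman for testing",
  "bindingtype": 0,
  "isfunction": false,
  "isprivate": false,
  "allowedcustomprocessingsteptype": 0,
  "CustomAPIRequestParameters": [
    {
      "uniquename": "InputParam1",
      "name": "Input Param 1",
      "displayname": "Input Param 1",
      "description": "Sample string input",
      "type": 10,
      "isoptional": false
    }
  ]
}
```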

5. Power Pages Web API Testing

Test your Power Pages portal's table permissions, anonymous vs. authenticated access, and data operations

Power Pages exposes a subset of Dataverse operations through the Portals Web API. This runs in portal context and respects table permissions (web roles). Postman is useful for verifying whether your portal's permissions are configured correctly.

⚠️ Power Pages Web API ≠ Dataverse Web API: The base URL is your portal URL (not the Dataverse org URL). Authentication uses portal session cookies — not the Dataverse OAuth token. Keep these as separate Postman environments.

GET — Retrieve Data via Power Pages Web API

Power Pages Web API — Retrieve (GET)
// Base URL: your portal URL + /_api/
URL: https://yourportal.powerappsportals.com/_api/contacts?$select=fullname,emailaddress1&$top=5

// Required Headers
Accept                    : application/json
Content-Type              : application/json
__RequestVerificationToken: <antiforgery_token>

// For authenticated portal sessions, pass the cookie:
Cookie: .AspNet.ApplicationCookie=<cookie_value>

// Get antiforgery token first:
// GET https://yourportal.powerappsportals.com/_api/antiforgery/token

POST — Create a Record via Power Pages Web API

Power Pages Web API — Create (POST)
URL: https://yourportal.powerappsportals.com/_api/cr123_customentities

// Headers
Content-Type              : application/json
__RequestVerificationToken: <antiforgery_token>

// Body
{
  "cr123_name":   "Test Record from Postman",
  "cr123_status": "Active"
}

// 201 Created  = success
// 403 Forbidden = table permission not granted for the web role
// 401 Unauthorized = portal login required
💡 Debug Table Permissions Faster: When Power Pages returns 403, test the same request with different portal session cookies (different web roles) to identify which web role is missing the required table permission. Much faster than navigating the portal admin UI repeatedly.

6. Native Postman ↔ Power Automate Integration

Postman natively integrates with Power Automate — backup collections, send monitor results, and post activity feeds

In this scenario, Postman is the event source and Power Automate is the target. Postman sends events — collection changes, monitor results, team activity — into Power Automate flows via webhooks. Useful for team notification workflows and automated backups.

Use Case 1 — Auto-backup a Postman Collection

1. In Power Automate, create a flow with the When an HTTP request is received trigger. Copy the generated webhook URL.
2. In Postman: Home → Integrations → Browse All Integrations → search "Microsoft Power Automate".
3. Select "Back up a collection". Enter the Nickname, choose your Workspace, choose your Collection, paste the Notification URL (webhook). Click Add Integration.
4. Postman detects collection changes and sends them to Power Automate. Your flow can then save the backup to SharePoint, OneDrive, or any storage connector.

Use Case 2 — Monitor Results → Teams/Email Notification

JSON Schema for Monitor Results Webhook
// Use this schema in your Power Automate HTTP trigger to parse Postman monitor payloads
{
  "type": "object",
  "properties": {
    "collection_uid": { "type": "string" },
    "monitor_name":   { "type": "string" },
    "run_id":         { "type": "string" },
    "status":         { "type": "string" },
    "failed_count":   { "type": "integer" },
    "passed_count":   { "type": "integer" },
    "run_at":         { "type": "string" }
  }
}

// In Power Automate after trigger:
// 1. Parse JSON using the schema above
// 2. Condition: if failed_count > 0
// 3. Send Teams notification OR email via Outlook connector

Use Case 3 — Team Activity → Email Notification

Integration Setup — Team Activity
// In Postman Integration setup for "Post collection activity":
Nickname         : MyTeam-PA-Integration
Choose Workspace : Your shared workspace name
Choose Collection: Your collection to monitor
Notification URL : <Power Automate webhook URL>

// Power Automate flow actions after trigger:
// Parse JSON → Send Outlook email with dynamic content:
//   "Collection @{body('Parse_JSON')?['collection_name']} was updated
//    by @{body('Parse_JSON')?['triggered_by']} at @{utcNow()}"

⚡ Common Errors & Quick Fixes

401 Unauthorized

Token expired or missing. Click Get New Access Token in Postman Auth tab. For tenant-restricted flows, verify your Entra app has Power Automate API permissions with admin consent.

403 Forbidden

Authenticated but not authorized. For Dataverse: user lacks the required security role. For Power Pages: table permission not assigned to the web role.

406 Not Acceptable

Wrong or missing Accept header. Add Accept: application/json + OData-MaxVersion: 4.0 + OData-Version: 4.0 to all Dataverse requests.

415 Unsupported Media Type

Missing Content-Type: application/json on POST/PATCH requests. Always include this when sending a JSON body.

404 Not Found

Entity set name wrong or record GUID doesn't exist. Verify the entity set name using {{webapiurl}}$metadata — it's plural and lowercase (e.g., contacts, not Contact).

MisMatchingOAuthClaims

Power Automate API scope URL has a single slash. Must use double slash: https://service.flow.microsoft.com//.default

Required Headers — Quick Reference

Headers Cheat Sheet
// ── Dataverse Web API (all requests) ────────────────────────
Accept          : application/json
OData-MaxVersion: 4.0
OData-Version   : 4.0
Authorization   : Bearer {{accessToken}}

// ── Add for POST / PATCH (requests with a body) ──────────────
Content-Type    : application/json

// ── Add for PATCH to avoid 412 errors ────────────────────────
If-Match        : *

// ── Power Pages Web API ───────────────────────────────────────
Accept                    : application/json
Content-Type              : application/json
__RequestVerificationToken: <antiforgery_token>

// ── Power Automate HTTP Trigger ───────────────────────────────
Content-Type    : application/json
Accept          : application/json
Authorization   : Bearer <entra_token>   // only for tenant-restricted flows
