Power BI tenant to tenant migration often looks simple during planning. In reality, it can become a much more demanding project once you start working through authentication, API sequencing, throttling, report exports, dataset reconfiguration, and post-migration validation.
Most migration guidance focuses on portal-based admin steps, not the API-driven workflow developers actually need to build and maintain. That leaves a practical question: how do you handle Microsoft Power BI tenant to tenant migration programmatically?
This guide breaks down the manual framework step by step. It covers the Power BI REST API endpoints used for workspace discovery and migration, the trade-offs between service principals and master users, the rate limits that affect large-scale jobs, and the cross-tenant limitations that force additional development effort. It also shows how Apps4.Pro reduces this 10-step manual process to a 5-click automated workflow.
Notes:
Power BI APIs, limits, and supported migration behaviors can change over time. Before implementing a production migration framework, verify all endpoints, quotas, and workload limitations against current Microsoft documentation.
- Why Build a REST API Migration Framework
- Authentication: Service Principal vs. Master User
- The 10-Step Developer Migration Framework
- Power BI Migration Rate Limits You Cannot Ignore
- Critical Limitations of the API Approach
- Sample Migration Script Skeleton
- From 10 Manual Steps to 5 Clicks with Apps4.Pro
Why Build a REST API Migration Framework
The Power BI Admin Portal and Desktop client are suitable for small, one-time migrations. However, when handling large-scale tenant migrations, a manual, click-driven approach quickly becomes inefficient and difficult to manage.
A REST API framework gives you the operational control that portal-based migration lacks:
- Repeatable execution across dev, test, and production environments
- Auditable logs for every migrated workspace, report, and dataset
- Retry logic and exception handling for transient API failures
- Parallel execution within Power BI rate-limit boundaries
The trade-off is that you also own every exception path. The Power BI REST API can automate much of the migration workflow, but not every workload migrates cleanly across tenants. Gaps around features such as incremental refresh, paginated reports, and some dataflow scenarios often require custom handling, reconfiguration, or manual follow-up.
Authentication: Service Principal vs. Master User
Before you call any Power BI REST API, you need to choose an authentication model. That decision affects security, operational overhead, and which API operations your migration framework can use.
Service Principal
A service principal is a Microsoft Entra application identity. It is the modern, recommended option for unattended automation.
- No MFA prompts or user-password dependencies
- Granular API permissions through Microsoft Entra ID
- Must be explicitly enabled in the Power BI Admin Portal under developer settings
- Cannot access personal My Workspace content
- Works for many workspace and dataset operations, though some scenarios may still require delegated user authentication
Master User
A master user is a licensed Power BI Pro or PPU account whose credentials your app uses for delegated access.
- Can access personal workspaces and delegated-only scenarios
- Requires a dedicated Power BI Pro or PPU license
- MFA exceptions or workarounds may introduce security and operational risk
- Password expiration and credential management can interrupt unattended migrations
For production migration frameworks, use a service principal as the default for bulk migration tasks. Fall back to a master user only when an API requires delegated authentication, such as selected export, takeover, or personal workspace scenarios.
For service principal authentication, the standard approach is to request an access token from the Microsoft Identity Platform using the client credentials flow:
POST https://login.microsoftonline.com/{tenant-id}/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded
grant_type=client_credentials
&client_id={app-id}
&client_secret={secret}
&scope=https://analysis.windows.net/powerbi/api/.default
Cache the bearer token for its 60-minute lifetime and refresh proactively at the 55-minute mark.
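As a minimal sketch, the token request and proactive-refresh policy above might look like this in Python. The `TokenCache` class name and constructor parameters are illustrative, not part of any SDK; only the token URL, form fields, and scope come from the request shown above.

```python
import time
import requests

TOKEN_URL = "https://login.microsoftonline.com/{tenant}/oauth2/v2.0/token"
SCOPE = "https://analysis.windows.net/powerbi/api/.default"

class TokenCache:
    """Caches a client-credentials token and refreshes it 5 minutes early."""
    REFRESH_MARGIN = 300  # refresh at the 55-minute mark of a 60-minute token

    def __init__(self, tenant_id, client_id, client_secret):
        self._url = TOKEN_URL.format(tenant=tenant_id)
        self._body = {
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
            "scope": SCOPE,
        }
        self._token = None
        self._expires_at = 0.0

    def needs_refresh(self, now=None):
        now = time.time() if now is None else now
        return self._token is None or now >= self._expires_at - self.REFRESH_MARGIN

    def get(self):
        """Return a cached bearer token, requesting a new one when needed."""
        if self.needs_refresh():
            resp = requests.post(self._url, data=self._body, timeout=30)
            resp.raise_for_status()
            payload = resp.json()
            self._token = payload["access_token"]
            self._expires_at = time.time() + int(payload.get("expires_in", 3600))
        return self._token
```

Every API call in the framework then goes through `cache.get()` instead of holding a raw token string.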
The 10-Step Developer Migration Framework
Step 1: Inventory the Source Tenant
Start with GET /admin/groups?$top=5000&$expand=users,reports,datasets,dashboards. This admin endpoint returns every workspace and its contents in one pass. Page through results using $skip for tenants with more than 5,000 workspaces. Persist the inventory to a SQL table or JSONL file. Teams that want a faster starting point can use PowerShell inventory scripts before building a full migration pipeline.
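A sketch of the inventory pass, assuming a rough workspace count is known up front; the `inventory_urls` and `snapshot_tenant` helper names are hypothetical, and the URL construction mirrors the admin endpoint above.

```python
import json
import requests

ADMIN_GROUPS = "https://api.powerbi.com/v1.0/myorg/admin/groups"
PAGE_SIZE = 5000  # maximum $top for the admin groups endpoint

def inventory_urls(total_workspaces, page_size=PAGE_SIZE):
    """Build the paged admin URLs needed to cover the whole tenant."""
    urls = []
    for skip in range(0, total_workspaces, page_size):
        urls.append(
            f"{ADMIN_GROUPS}?$top={page_size}&$skip={skip}"
            "&$expand=users,reports,datasets,dashboards"
        )
    return urls

def snapshot_tenant(token, total_workspaces, out_path="inventory.jsonl"):
    """Fetch every page and persist one workspace per line (JSONL)."""
    headers = {"Authorization": f"Bearer {token}"}
    with open(out_path, "w", encoding="utf-8") as fh:
        for url in inventory_urls(total_workspaces):
            resp = requests.get(url, headers=headers, timeout=60)
            resp.raise_for_status()
            for workspace in resp.json().get("value", []):
                fh.write(json.dumps(workspace) + "\n")
```

JSONL keeps the snapshot appendable and easy to diff between inventory runs.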
Step 2: Create Target Workspaces
For each source workspace, provision a matching target using Groups – Create Group:
POST https://api.powerbi.com/v1.0/myorg/groups?workspaceV2=true
{
"name": "Finance Analytics"
}
Store the returned id in your mapping table alongside the source workspace ID. Every subsequent API call depends on this source-to-target map.
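The source-to-target map can be as simple as a JSON file written after each workspace is provisioned; `record_mapping` and `load_map` below are illustrative helpers, not API calls.

```python
import json
import os

MAP_PATH = "workspace_map.json"

def load_map(path=MAP_PATH):
    """Load the source-to-target workspace map, or start an empty one."""
    if os.path.exists(path):
        with open(path, encoding="utf-8") as fh:
            return json.load(fh)
    return {}

def record_mapping(source_id, target_id, path=MAP_PATH):
    """Persist one mapping immediately so a crash never loses progress."""
    mapping = load_map(path)
    mapping[source_id] = target_id
    with open(path, "w", encoding="utf-8") as fh:
        json.dump(mapping, fh, indent=2)
    return mapping
```

Writing after every workspace, rather than once at the end, keeps the map consistent with the target tenant even if the run is interrupted.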
Step 3: Reassign Capacity
If the source workspace is assigned to Premium or Fabric capacity, map it to the appropriate target capacity after workspace creation using Groups – AssignToCapacity:
POST https://api.powerbi.com/v1.0/myorg/groups/{targetGroupId}/AssignToCapacity
{
"capacityId": "{target-capacity-id}"
}
This step is especially important when performance, refresh behavior, or capacity-specific features must be preserved in the target environment.
Step 4: Export Reports as PBIX
Use Reports – ExportToFile or the simpler Reports – Export File in Group:
GET /groups/{sourceGroupId}/reports/{reportId}/Export
This returns the .pbix binary. Stream it to blob storage rather than holding it in memory; large reports can exceed 1 GB. Reports built on live-connected datasets export without the model; you’ll reconnect them once the target datasets are in place.
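Streaming the export to disk might look like the sketch below. The `write_chunks` helper is a hypothetical split so the disk-writing logic is testable apart from the HTTP call; the URL mirrors the endpoint above.

```python
import requests

EXPORT_URL = ("https://api.powerbi.com/v1.0/myorg/groups/"
              "{group_id}/reports/{report_id}/Export")

def write_chunks(chunks, out_path):
    """Write an iterable of byte chunks to disk; returns bytes written."""
    written = 0
    with open(out_path, "wb") as fh:
        for chunk in chunks:
            if chunk:  # skip keep-alive empty chunks
                fh.write(chunk)
                written += len(chunk)
    return written

def export_report_to_file(group_id, report_id, token, out_path,
                          chunk_size=1 << 20):
    """Stream a PBIX export to disk in 1 MB chunks instead of buffering it."""
    url = EXPORT_URL.format(group_id=group_id, report_id=report_id)
    headers = {"Authorization": f"Bearer {token}"}
    with requests.get(url, headers=headers, stream=True, timeout=600) as resp:
        resp.raise_for_status()
        return write_chunks(resp.iter_content(chunk_size=chunk_size), out_path)
```

With `stream=True`, `requests` never holds more than one chunk in memory at a time.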
Step 5: Import Reports to the Target
Push the PBIX using Imports – Post Import:
POST /groups/{targetGroupId}/imports?datasetDisplayName=FinanceReport.pbix&nameConflict=CreateOrOverwrite
Content-Type: multipart/form-data
Poll GET /imports/{importId} until importState returns Succeeded. Failures here are usually due to missing custom visuals or unsupported data source types — log them and flag for manual review.
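A polling sketch for the import status endpoint; the `wait_for_import` helper and its poll interval and timeout are assumptions, not prescribed values.

```python
import time
import requests

IMPORT_STATUS = "https://api.powerbi.com/v1.0/myorg/imports/{import_id}"
TERMINAL_STATES = {"Succeeded", "Failed"}

def is_terminal(import_state):
    """True once the import can no longer change state."""
    return import_state in TERMINAL_STATES

def wait_for_import(import_id, token, poll_seconds=5, timeout_seconds=600):
    """Poll the import until it reaches a terminal state or times out."""
    headers = {"Authorization": f"Bearer {token}"}
    deadline = time.time() + timeout_seconds
    while time.time() < deadline:
        resp = requests.get(IMPORT_STATUS.format(import_id=import_id),
                            headers=headers, timeout=30)
        resp.raise_for_status()
        state = resp.json().get("importState", "Unknown")
        if is_terminal(state):
            return state
        time.sleep(poll_seconds)
    raise TimeoutError(f"Import {import_id} did not finish in {timeout_seconds}s")
```

Callers should treat a returned "Failed" as a flag for the manual-review queue rather than an exception, since the root cause (missing visuals, unsupported sources) usually needs human triage.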
Step 6: Recreate Dataset Parameters and Credentials
Datasets arrive in the target tenant without credentials. Update parameters first:
POST /groups/{groupId}/datasets/{datasetId}/Default.UpdateParameters
Then patch data source credentials via the gateway data source endpoints. Service principals cannot set credentials on OAuth2 data sources; this is one of the gaps that forces a master-user fallback.
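The Default.UpdateParameters body takes an updateDetails array of name/value pairs; a small builder plus the POST call might look like this (helper names are illustrative).

```python
import requests

UPDATE_PARAMS = ("https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
                 "/datasets/{dataset_id}/Default.UpdateParameters")

def update_parameters_body(parameters):
    """Build the UpdateParameters request body from a name -> value map."""
    return {
        "updateDetails": [
            {"name": name, "newValue": str(value)}
            for name, value in parameters.items()
        ]
    }

def update_dataset_parameters(group_id, dataset_id, parameters, token):
    """POST new parameter values for a dataset in the target workspace."""
    resp = requests.post(
        UPDATE_PARAMS.format(group_id=group_id, dataset_id=dataset_id),
        json=update_parameters_body(parameters),
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
```

Parameter values are strings in the API, so the builder coerces everything with `str()`.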
Step 7: Bind Datasets to the Target Gateway
Use Datasets – Bind To Gateway:
POST /groups/{groupId}/datasets/{datasetId}/Default.BindToGateway
{
"gatewayObjectId": "{target-gateway-id}",
"datasourceObjectIds": ["{datasource-id-1}", "{datasource-id-2}"]
}
This step is where most migrations fail silently. If the target gateway doesn’t have matching data source definitions, the bind succeeds but refreshes fail later.
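One way to guard against that silent failure is a pre-flight check comparing the data source definitions the dataset needs against what the target gateway actually exposes. The sketch below assumes both sides are lists of dicts carrying the `datasourceType` and `connectionDetails` fields returned by the gateway datasource APIs.

```python
def missing_datasources(required, available):
    """Return the required data source definitions absent from the gateway.

    `required` and `available` are lists of dicts, each with
    'datasourceType' and 'connectionDetails' keys. Matching is exact,
    so connection strings must agree character for character.
    """
    have = {(d["datasourceType"], d["connectionDetails"]) for d in available}
    return [
        d for d in required
        if (d["datasourceType"], d["connectionDetails"]) not in have
    ]
```

Run this before every Default.BindToGateway call and abort the bind when the returned list is non-empty, turning a later refresh failure into an immediate, actionable error.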
Step 8: Migrate Row-Level Security Roles
Call POST /groups/{groupId}/datasets/{datasetId}/roles/{roleName}/members to recreate RLS mappings. The source assignments must be read via the admin scanner API (POST /admin/workspaces/getInfo) because the non-admin surface doesn’t expose them fully.
Step 9: Restore Permissions and Sharing
Re-apply workspace roles with POST /groups/{groupId}/users. For app workspaces, republish the app using POST /groups/{groupId}/apps once reports and datasets are stable.
Step 10: Validate and Cut Over
Trigger a test refresh on every dataset via POST /groups/{groupId}/datasets/{datasetId}/refreshes and poll the refresh history. Only after every dataset returns Completed should you update consumer bookmarks and decommission the source.
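A sketch of the trigger-and-poll pattern for Step 10. The `latest_refresh_status` helper assumes the refresh-history payload shape (a `value` array, newest entry first); the helper names and URL constant are illustrative.

```python
import requests

REFRESHES = ("https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
             "/datasets/{dataset_id}/refreshes")

def latest_refresh_status(history):
    """Status of the newest entry in a refresh-history payload."""
    entries = history.get("value", [])
    return entries[0].get("status", "Unknown") if entries else "NoHistory"

def trigger_refresh(group_id, dataset_id, token):
    """Kick off an on-demand refresh; the API answers 202 Accepted."""
    url = REFRESHES.format(group_id=group_id, dataset_id=dataset_id)
    resp = requests.post(url, headers={"Authorization": f"Bearer {token}"},
                         timeout=30)
    resp.raise_for_status()

def check_refresh(group_id, dataset_id, token):
    """Read back the most recent refresh status for validation reporting."""
    url = REFRESHES.format(group_id=group_id, dataset_id=dataset_id) + "?$top=1"
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"},
                        timeout=30)
    resp.raise_for_status()
    return latest_refresh_status(resp.json())
```

Remember the per-day refresh quotas from the rate-limit section below apply to these validation refreshes too.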
Power BI Migration Rate Limits You Cannot Ignore
The Power BI REST API enforces throttling that will absolutely derail an unoptimized migration script.
- Admin APIs: 200 requests per hour per tenant
- Scanner API (GetInfo): 500 workspaces per call, up to 16 concurrent calls, and 10,000 workspaces per hour
- Dataset refresh triggers: 8 per dataset per day on Pro and 48 on Premium. For a deeper look at refresh governance, see our dataset migration guide.
- Export to File: 50 concurrent exports per capacity
- General API throttling: HTTP 429 responses with Retry-After headers
Build exponential backoff into every call. A production-grade migration script spends more code on throttle handling than on actual migration logic.
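A minimal backoff wrapper might look like this; the base delay, cap, and attempt limit are arbitrary starting points, not Microsoft-recommended values.

```python
import random
import time
import requests

def backoff_delay(attempt, retry_after=None, base=2.0, cap=120.0):
    """Delay for this attempt: honor Retry-After, else exponential + jitter."""
    if retry_after is not None:
        return float(retry_after)
    return min(cap, base * (2 ** attempt)) + random.uniform(0, 1)

def call_with_backoff(method, url, max_attempts=6, **kwargs):
    """Issue a request, retrying on HTTP 429 with Retry-After-aware backoff."""
    for attempt in range(max_attempts):
        resp = requests.request(method, url, **kwargs)
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp
        time.sleep(backoff_delay(attempt, resp.headers.get("Retry-After")))
    raise RuntimeError(f"Still throttled after {max_attempts} attempts: {url}")
```

Routing every Power BI call through `call_with_backoff` keeps the throttle handling in one place instead of scattered across ten migration steps.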
Critical Limitations of the API Approach
The Power BI REST API cannot do the following:
- Export datasets with incremental refresh configured – the Export endpoint returns a 400 error, and there is no supported workaround other than rebuilding the dataset in the target tenant
- Migrate paginated reports (RDL) through the same export/import path – you must download the RDL separately via the Power BI Report Server API or recreate them
- Migrate dataflows – there is no Dataflows export/import API at all; you export the JSON model definition manually and recreate each dataflow in the target
- Preserve scheduled refresh history, usage metrics, or audit logs – these are tenant-scoped and do not travel
- Migrate deployment pipelines – pipeline definitions must be recreated manually after workspace migration
- Handle certified and promoted endorsements – these reset to “None” and must be re-applied by a capacity admin
For a mid-size enterprise, these gaps typically affect 15–25% of the estate and are the single largest source of post-migration support tickets.
Skip the gaps.
If rebuilding incremental-refresh datasets, dataflows, and paginated reports by hand isn’t in your project plan, see how Apps4.Pro Power BI Migration handles all three natively before you write another line of code.
Sample Migration Script Skeleton
import requests, time

def migrate_workspace(source_id, token):
    # Helper functions wrap the REST calls from Steps 2-9.
    target = create_group(source_id, token)
    reports = list_reports(source_id, token)
    for r in reports:
        pbix = export_report(source_id, r.id, token)
        import_pbix(target.id, pbix, token)
        time.sleep(0.5)  # respect throttling
    bind_datasets(target.id, token)
    restore_permissions(target.id, token)
    return target.id
Wrap every call in retry logic that honors Retry-After headers, and persist state after every step so a crash at workspace 147 of 200 doesn’t force a full restart.
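One way to persist that state is a per-workspace checkpoint file; the step names and the `mark_step`/`next_step` helpers below are illustrative, not part of any API.

```python
import json
import os

STATE_PATH = "migration_state.json"
STEPS = ["created", "reports_imported", "gateway_bound", "permissions_restored"]

def load_state(path=STATE_PATH):
    """Load checkpoint state, or start fresh if no file exists yet."""
    if os.path.exists(path):
        with open(path, encoding="utf-8") as fh:
            return json.load(fh)
    return {}

def mark_step(workspace_id, step, path=STATE_PATH):
    """Record a completed step and flush to disk immediately."""
    state = load_state(path)
    done = state.setdefault(workspace_id, [])
    if step not in done:
        done.append(step)
    with open(path, "w", encoding="utf-8") as fh:
        json.dump(state, fh, indent=2)
    return state

def next_step(workspace_id, path=STATE_PATH):
    """First step not yet completed for this workspace, or None if done."""
    done = load_state(path).get(workspace_id, [])
    for step in STEPS:
        if step not in done:
            return step
    return None
```

On restart, the driver loop calls `next_step` for each workspace and skips everything already marked done, so a crash at workspace 147 resumes at workspace 147.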
From 10 Manual Steps to 5 Clicks with Apps4.Pro
Everything above works. It’s also a multi-week engineering effort that your team has to maintain, test, and extend as Microsoft deprecates endpoints. Apps4.Pro Power BI Migration collapses the entire framework into a guided workflow:
- Connect source tenant
- Connect target tenant
- Select workspaces to migrate
- Map gateways
- Run – with automatic handling of incremental refresh, paginated reports, dataflows, RLS, and rate-limit backoff
What the manual framework delegates to your developers (retry logic, throttle handling, dataflow recreation, credential rebinding), Apps4.Pro handles natively. You get the auditability of a scripted migration without owning the code.
Apps4.Pro Migration Manager reduces that effort with a guided workflow for Power BI migration from one tenant to another, including workspace migration, report migration, dataset handling, gateway mapping, RLS preservation, and built-in throttle management. Instead of maintaining custom migration code, teams can move faster with a more predictable migration path.