
This post is a part of the DP-600: Implementing Analytics Solutions Using Microsoft Fabric Exam Prep Hub; and this topic falls under these sections:
Maintain a data analytics solution
--> Maintain the analytics development lifecycle
--> Create and configure deployment pipelines
Deployment pipelines in Microsoft Fabric provide a structured, governed way to promote analytics content across environments, typically Development, Test, and Production. They are a core lifecycle-management feature that helps teams deploy changes safely, consistently, and with minimal risk. For the DP-600 exam, you should understand what deployment pipelines are, how they are configured, which items they support, and how they differ from Git-based version control.
What Are Deployment Pipelines?
A deployment pipeline is a Fabric feature that:
- Connects multiple workspaces into an ordered promotion flow
- Enables controlled deployment of items between environments
- Supports validation and testing before production release
Pipelines are especially important for enterprise-scale analytics solutions.
Typical Pipeline Structure
A standard Fabric deployment pipeline consists of three stages by default:
- Development
  - Active development
  - Frequent changes
  - Used by engineers and analysts
- Test
  - Validation and user acceptance testing
  - Data and logic verification
  - Limited access
- Production
  - Certified, trusted content
  - Broad consumer access
  - Minimal direct changes
Each stage is linked to a separate Fabric workspace.
Creating a Deployment Pipeline
At a high level, the process is:
- Create a deployment pipeline in Microsoft Fabric
- Assign a workspace to each stage:
  - Dev workspace
  - Test workspace
  - Prod workspace
- Configure pipeline settings
- Control who can deploy between stages
Once created, the pipeline provides a visual interface showing item differences across stages.
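Besides the portal UI, the steps above can also be driven programmatically. The sketch below builds the HTTP requests for the Power BI REST API's Pipelines operations (Create Pipeline and Assign Workspace); the endpoint shapes follow the documented API, but the pipeline ID, workspace IDs, and names are placeholders, and authentication (a bearer token in the request headers) is omitted.

```python
# Sketch: creating a deployment pipeline and binding a workspace to each
# stage via the Power BI REST API. No network calls are made here; the
# functions only assemble (method, url, body) tuples for illustration.
BASE = "https://api.powerbi.com/v1.0/myorg"

# Stage order used by the API: 0 = Development, 1 = Test, 2 = Production
STAGES = {"Development": 0, "Test": 1, "Production": 2}

def create_pipeline_request(display_name: str):
    """Request that creates a new deployment pipeline."""
    return ("POST", f"{BASE}/pipelines", {"displayName": display_name})

def assign_workspace_request(pipeline_id: str, stage: str, workspace_id: str):
    """Request that assigns one workspace to one pipeline stage."""
    order = STAGES[stage]
    return ("POST",
            f"{BASE}/pipelines/{pipeline_id}/stages/{order}/assignWorkspace",
            {"workspaceId": workspace_id})

if __name__ == "__main__":
    # Create the pipeline, then assign a workspace per stage
    # (all IDs below are hypothetical placeholders).
    print(create_pipeline_request("Sales Analytics Pipeline"))
    for stage, ws in [("Development", "ws-dev-id"),
                      ("Test", "ws-test-id"),
                      ("Production", "ws-prod-id")]:
        print(assign_workspace_request("pipeline-id", stage, ws))
```

In practice each tuple would be sent with an authenticated HTTP client; the point here is that one pipeline object links three separate workspaces, one per stage.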
What Items Can Be Deployed Through Pipelines?
Deployment pipelines can deploy many Fabric items, including:
- Semantic models
- Reports and dashboards
- Dataflows Gen2
- Lakehouses and Warehouses (in supported scenarios)
- Other supported analytics artifacts
Exam note:
Not every Fabric item supports pipeline deployment equally—expect questions to focus on Power BI and core analytics items.
How Deployment Works
Comparing Changes
- Pipelines show differences between stages
- You can review what will change before deploying
Deploying Content
- Deploy from Dev → Test
- Validate
- Deploy from Test → Prod
Deployments:
- Copy item definitions
- Can update existing items or create new ones
- Do not automatically move workspace permissions
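The two-step promotion above also maps to a single REST operation per hop. The sketch below assembles a Deploy All request following the documented Power BI REST API shape; the pipeline ID and notes are placeholders, and the `options` field names are taken from the API's deployment options.

```python
# Sketch: promoting content between stages with the Deploy All operation.
# Only the request payload is built; sending it (with a bearer token) is
# left out of this illustration.
BASE = "https://api.powerbi.com/v1.0/myorg"

def deploy_all_request(pipeline_id: str, source_stage_order: int, note: str = ""):
    """Request that deploys all supported items from one stage to the next.

    source_stage_order 0 deploys Dev -> Test; 1 deploys Test -> Prod.
    """
    return ("POST",
            f"{BASE}/pipelines/{pipeline_id}/deployAll",
            {"sourceStageOrder": source_stage_order,
             "note": note,
             # Let the deployment create items missing in the target stage
             # and update items that already exist there.
             "options": {"allowCreateArtifact": True,
                         "allowOverwriteArtifact": True}})

if __name__ == "__main__":
    # Dev -> Test, then (after validation) Test -> Prod
    print(deploy_all_request("pipeline-id", 0, "Promote to Test"))
    print(deploy_all_request("pipeline-id", 1, "Release to Production"))
```

Note what the payload does not contain: no permissions and no workspace security settings, matching the point above that deployments copy item definitions, not access.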
Deployment Rules and Parameters
Pipelines support deployment rules, such as:
- Changing data source connections per environment
- Switching parameters between Dev, Test, and Prod
- Avoiding hard-coded environment values
This is critical for:
- Separating development and production data
- Supporting safe testing
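Conceptually, a deployment rule is a per-stage override applied on top of an unchanged item definition. The toy sketch below is not a Fabric API, just an illustration of that idea; the server names, database names, and parameter keys are made up.

```python
# Conceptual sketch of deployment rules: the item definition is copied
# as-is, and only the environment-specific values are overridden for the
# target stage. All names here are hypothetical.
RULES = {
    "Test":       {"ServerName": "sql-test.contoso.com", "DatabaseName": "SalesTest"},
    "Production": {"ServerName": "sql-prod.contoso.com", "DatabaseName": "Sales"},
}

def apply_rules(item_parameters: dict, target_stage: str) -> dict:
    """Return the item's parameters with the target stage's overrides applied."""
    overridden = dict(item_parameters)          # copy the definition unchanged
    overridden.update(RULES.get(target_stage, {}))  # then swap per-stage values
    return overridden

if __name__ == "__main__":
    dev_params = {"ServerName": "sql-dev.contoso.com",
                  "DatabaseName": "SalesDev",
                  "RowLimit": "1000"}
    # Connection values are swapped for Production; RowLimit passes through.
    print(apply_rules(dev_params, "Production"))
```

This is why rules matter for the exam scenarios: the report logic is identical in every stage, while each stage reads from its own data source, with no hard-coded environment values.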
Pipelines vs Git Integration (Exam Comparison)
This distinction is frequently tested.
| Feature | Deployment Pipelines | Git Integration |
| --- | --- | --- |
| Purpose | Environment promotion | Source control |
| Focus | Deployment | Versioning |
| Tracks history | No | Yes |
| Supports branching | No | Yes |
| Typical use | Dev → Test → Prod | Code collaboration |
Key insight:
They are complementary, not competing features.
Permissions and Governance
To use pipelines:
- Users need appropriate pipeline permissions
- Workspace access is still required
- Production deployments are often restricted to a small group
Pipelines support governance by:
- Reducing direct changes in production
- Enforcing controlled release processes
- Improving auditability
Common Exam Scenarios
You may be asked to:
- Choose pipelines for controlled promotion of reports
- Identify when pipelines are preferable to manual publishing
- Combine pipelines with Git and PBIP
- Configure different data sources per environment
- Prevent accidental production changes
Example:
A report must be tested before being released to executives.
Correct concept: Use a development pipeline with Dev, Test, and Prod stages.
Best Practices to Remember
- Use separate workspaces per environment
- Restrict production deployment permissions
- Combine pipelines with:
  - PBIP projects
  - Git integration
  - Endorsements and certification
- Avoid direct editing in production
Key Exam Takeaways
- Deployment pipelines manage content promotion across environments
- They connect multiple Fabric workspaces
- Pipelines support comparison, validation, and controlled deployment
- They do not replace Git-based version control
- A core feature of the Fabric analytics lifecycle
Exam Tips
- If a question focuses on moving content safely from development to production, the correct answer is deployment pipelines.
- If it focuses on tracking changes or collaboration, the answer is Git or PBIP.
- Know how pipelines support:
  - Dev/Test/Prod lifecycle
  - Governance & change control
  - Environment-specific configuration
  - Enterprise-scale BI practices
- Common exam traps:
  - Confusing workspace roles with deploy permissions
  - Assuming pipelines manage security or performance
  - Forgetting deployment rules
Practice Questions
Question 1 (Single choice)
What is the PRIMARY purpose of a deployment pipeline in Microsoft Fabric?
A. Schedule dataset refreshes
B. Promote content across lifecycle environments
C. Enable row-level security
D. Optimize DAX performance
Correct Answer: B
Explanation:
Deployment pipelines are designed to promote content across environments (for example, Development → Test → Production) in a controlled and governed manner.
- ❌ A: Refresh scheduling is handled separately
- ❌ C: Security is not the primary purpose
- ❌ D: Performance tuning is unrelated
Question 2 (Multi-select)
Which stages are available by default in a Fabric deployment pipeline? (Select all that apply.)
A. Development
B. Test
C. Production
D. Sandbox
Correct Answers: A, B, C
Explanation:
Fabric deployment pipelines use a three-stage lifecycle:
- Development
- Test
- Production
There is no default Sandbox stage.
Question 3 (Scenario-based)
A team wants analysts to freely modify reports, while only approved changes reach production. Which pipeline stage should analysts primarily work in?
A. Production
B. Test
C. Development
D. Any stage
Correct Answer: C
Explanation:
The Development stage is intended for:
- Frequent changes
- Experimentation
- Initial validation
Higher stages are more controlled.
Question 4 (Single choice)
Which permission is required to deploy content from one stage to the next in a deployment pipeline?
A. Viewer
B. Contributor
C. Admin
D. Pipeline deploy permission
Correct Answer: D
Explanation:
Deploying content requires explicit pipeline deployment permissions in addition to sufficient workspace roles in the source and target stages.
- ❌ Workspace Admin alone is not sufficient without pipeline access
- ❌ Contributor may edit content but cannot deploy
Question 5 (Scenario-based)
You deploy a semantic model from Test to Production. What happens to data source connections by default?
A. They are deleted
B. They remain unchanged
C. They can be overridden per stage
D. They must be manually reconfigured
Correct Answer: C
Explanation:
Deployment pipelines support data source and parameter rules, so each stage can point to environment-specific connections; without rules, the source item's connections are copied as-is.
Question 6 (Multi-select)
Which items can be deployed using deployment pipelines? (Select all that apply.)
A. Reports
B. Semantic models
C. Dashboards
D. Notebooks
Correct Answers: A, B, C
Explanation:
Deployment pipelines support Power BI artifacts, including:
- Reports
- Semantic models
- Dashboards
❌ Notebooks were not part of the original Power BI deployment pipeline scope, although Fabric continues to expand the list of supported items.
Question 7 (Scenario-based)
A deployment shows warnings that some items are skipped. What is the MOST likely cause?
A. The workspace is full
B. Unsupported artifacts exist
C. The dataset is too large
D. Git integration is disabled
Correct Answer: B
Explanation:
Unsupported or incompatible artifacts (for example, unsupported report types) may be skipped during deployment.
Question 8 (Single choice)
Which feature allows different environments to use different data sources during deployment?
A. Row-level security
B. Dynamic format strings
C. Deployment rules
D. Incremental refresh
Correct Answer: C
Explanation:
Deployment rules allow:
- Data source switching
- Parameter overrides
- Environment-specific configuration
Question 9 (Scenario-based)
You want production users to access only certified content. How do deployment pipelines help?
A. By enforcing sensitivity labels
B. By promoting tested content only
C. By encrypting production reports
D. By disabling edit access
Correct Answer: B
Explanation:
Deployment pipelines ensure:
- Content is validated in Test
- Only approved changes reach Production
They support trust and governance, not encryption or labeling.
Question 10 (Multi-select)
Which best practices apply when configuring deployment pipelines? (Select all that apply.)
A. Restrict deploy permissions
B. Use separate data sources per stage
C. Allow all users to deploy to Production
D. Validate content in Test before Production
Correct Answers: A, B, D
Explanation:
Best practices include:
- Limited deploy access
- Environment-specific configurations
- Mandatory testing before production
❌ Allowing everyone to deploy defeats governance.
