Category: Microsoft Certification

Practice Questions: Configure Subscriptions and Data Alerts (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub; and this topic falls under these sections:
Manage and secure Power BI (15–20%)
--> Create and manage workspaces and assets
--> Configure Subscriptions and Data Alerts


Below are 10 practice questions (with answers and explanations) for this topic of the exam.
There are also 2 practice tests for the PL-300 exam with 60 questions each (with answers) available on the hub.

Practice Questions


Question 1

A sales manager wants to receive an email every Monday morning with a snapshot of the latest sales performance report.
Which Power BI feature should you configure?

A. Data alert
B. Subscription
C. Dashboard tile alert
D. Power Automate flow

Correct Answer: B

Explanation:
Subscriptions are schedule-based and designed to send report or dashboard snapshots via email. Data alerts are triggered only by thresholds, not schedules.


Question 2

You need to notify users immediately when total daily revenue drops below $50,000.
Which requirement must be met to configure this?

A. The visual must be in a report
B. The dataset must be DirectQuery
C. The visual must be a dashboard tile showing a single value
D. A Power BI app must be published

Correct Answer: C

Explanation:
Data alerts can only be created on dashboard tiles that display a single numeric value (such as a Card or KPI).


Question 3

A user attempts to create a data alert on a table visual in a report but cannot find the option. Why?

A. Alerts require Power BI Premium
B. Alerts only work on dashboards
C. Alerts cannot be used with imported data
D. Alerts must be created by workspace admins

Correct Answer: B

Explanation:
Data alerts cannot be configured on report visuals. They must be created on dashboard tiles.


Question 4

Which delivery method is supported by both subscriptions and data alerts?

A. Microsoft Teams messages
B. SMS notifications
C. Email notifications
D. Power BI mobile push only

Correct Answer: C

Explanation:
Both subscriptions and data alerts support email notifications. Other delivery methods require additional tools such as Power Automate.


Question 5

A report uses Row-Level Security (RLS). A subscription is created for multiple users.
What will each user see?

A. The same data as the report owner
B. Filtered data based on their RLS role
C. All data in the dataset
D. Only summary-level data

Correct Answer: B

Explanation:
Subscriptions respect Row-Level Security, meaning each user only sees data they are authorized to view.
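
For context, here is a minimal sketch of a dynamic RLS role filter (the UserRegion security table and UserEmail column are hypothetical names used only for illustration); the same filter is applied when the subscription email is rendered for each recipient.

-- Hypothetical dynamic RLS filter defined on a UserRegion security table:
-- each signed-in user only passes rows matching their own sign-in email.
[UserEmail] = USERPRINCIPALNAME ()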


Question 6

Which scenario is the best use case for a data alert?

A. Sending a monthly executive summary
B. Monitoring KPIs for threshold breaches
C. Distributing weekly dashboards
D. Sharing reports with external users

Correct Answer: B

Explanation:
Data alerts are designed for exception-based monitoring, triggering notifications when values cross defined thresholds.


Question 7

Which content types support subscriptions in Power BI?

A. Dashboards only
B. Reports only
C. Reports, report pages, and dashboards
D. Dashboard tiles only

Correct Answer: C

Explanation:
Subscriptions can be created for reports, individual report pages, and dashboards.


Question 8

A user wants to temporarily stop receiving notifications without deleting the configuration.
Which feature supports this requirement?

A. Data alerts only
B. Subscriptions only
C. Both subscriptions and data alerts
D. Neither feature

Correct Answer: C

Explanation:
Both subscriptions and data alerts can be turned on or off without being deleted.


Question 9

When are data alert conditions evaluated?

A. At fixed times defined by the user
B. Continuously in real time
C. After the dataset is refreshed
D. When a report is opened

Correct Answer: C

Explanation:
Data alerts are evaluated after dataset refresh. If the data doesn’t refresh, the alert won’t trigger.


Question 10

You need to send scheduled updates to users who should not edit reports but must receive consistent insights.
Which approach is MOST appropriate?

A. Grant Build permissions
B. Configure data alerts
C. Create report subscriptions
D. Enable Analyze in Excel

Correct Answer: C

Explanation:
Subscriptions are ideal for read-only, passive distribution of insights without requiring users to interact with reports.


Final Exam Tip

If the question mentions:

  • “Threshold,” “exceeds,” “drops below” → Data Alert
  • “Daily, weekly, scheduled email” → Subscription
  • “Dashboard tile” → Data Alert
  • “Passive consumption” → Subscription

Go back to the PL-300 Exam Prep Hub main page

Practice Questions: Promote or certify Power BI content (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub; and this topic falls under these sections:
Manage and secure Power BI (15–20%)
--> Create and manage workspaces and assets
--> Promote or certify Power BI content


Below are 10 practice questions (with answers and explanations) for this topic of the exam.
There are also 2 practice tests for the PL-300 exam with 60 questions each (with answers) available on the hub.

Practice Questions

Question 1

What is the primary purpose of certifying Power BI content?

A. To improve report rendering performance
B. To indicate that content is approved and trusted for organizational use
C. To allow content to refresh more frequently
D. To enable sharing with external users

Correct Answer: B

Explanation:
Certification identifies datasets or dataflows that have been formally reviewed and approved, making them trusted sources for enterprise-wide reporting.


Question 2

Which Power BI artifact can be certified?

A. Report
B. Dashboard
C. Dataset
D. App

Correct Answer: C

Explanation:
Only datasets (semantic models) and dataflows can be certified. Reports and dashboards cannot be certified directly.


Question 3

Who determines which users are allowed to certify Power BI content?

A. Workspace Admin
B. Report Creator
C. Power BI Tenant Administrator
D. Dataset Owner

Correct Answer: C

Explanation:
Certification permissions are controlled at the tenant level by a Power BI administrator.


Question 4

A dataset is reliable but still evolving and has not undergone a formal governance review. What should you do?

A. Certify the dataset
B. Delete the dataset
C. Promote the dataset
D. Publish it as an app

Correct Answer: C

Explanation:
Promotion is appropriate when content is useful and reliable but not formally approved through governance.


Question 5

What happens when a dataset is certified?

A. Only admins can access it
B. It appears higher in search results
C. It automatically refreshes more often
D. Reports using it are locked

Correct Answer: B

Explanation:
Certified datasets are prioritized in search and discovery to encourage reuse of trusted data.


Question 6

Which of the following statements about promoted content is true?

A. It requires tenant admin approval
B. It represents the highest level of trust
C. It can be promoted without a formal review
D. Only IT users can promote it

Correct Answer: C

Explanation:
Promoted content does not require formal approval and signals informal trust.


Question 7

Your organization wants a single source of truth for company-wide KPIs. Which approach is best?

A. Create multiple promoted datasets
B. Certify one centrally governed dataset
C. Share reports individually
D. Use dashboards instead of datasets

Correct Answer: B

Explanation:
Certified datasets are intended to serve as enterprise-wide trusted sources.


Question 8

Where do you promote or certify a dataset?

A. Power BI Desktop
B. Dataset settings in the Power BI Service
C. Power Query Editor
D. Workspace app settings

Correct Answer: B

Explanation:
Promotion and certification are configured in the Power BI Service, not in Desktop.


Question 9

Which of the following is a key benefit of certification?

A. Faster data refresh
B. Reduced report load times
C. Improved data governance and reuse
D. Automatic row-level security

Correct Answer: C

Explanation:
Certification improves governance by guiding users toward trusted, reusable datasets.


Question 10

Why can a report not be certified?

A. Reports do not contain data
B. Reports cannot be shared
C. Certification applies only to datasets and dataflows
D. Reports already inherit certification

Correct Answer: C

Explanation:
Certification is applied to the data source, not the visualization layer.


Exam Tips

  • Promoted ≠ Certified
  • Certification requires tenant admin configuration
  • Certified content supports enterprise BI
  • Reports inherit trust from their dataset, not certification directly

Go back to the PL-300 Exam Prep Hub main page

Practice Questions: Create dashboards (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub; and this topic falls under these sections:
Manage and secure Power BI (15–20%)
--> Create and manage workspaces and assets
--> Create dashboards


Below are 10 practice questions (with answers and explanations) for this topic of the exam.
There are also 2 practice tests for the PL-300 exam with 60 questions each (with answers) available on the hub.

Practice Questions


1. Where can Power BI dashboards be created?

A. Power BI Desktop
B. Power BI Service
C. Power BI Report Server
D. Microsoft Excel

Correct Answer: B

Explanation:
Dashboards can only be created in the Power BI Service. Power BI Desktop is used to create reports, not dashboards.


2. What is the maximum number of pages a Power BI dashboard can have?

A. One
B. Five
C. Ten
D. Unlimited

Correct Answer: A

Explanation:
Dashboards are single-page canvases designed for high-level monitoring.


3. A business user wants to view KPIs from three different datasets on one screen. What should you create?

A. A report with multiple pages
B. A report using composite models
C. A dashboard
D. A paginated report

Correct Answer: C

Explanation:
Dashboards can display visuals from multiple datasets, while reports are limited to one dataset.


4. How is a dashboard tile created?

A. By importing a dataset
B. By saving a report page
C. By pinning content from a report or Q&A
D. By creating a new visual in the Service

Correct Answer: C

Explanation:
Dashboard tiles are created by pinning visuals, pages, or Q&A results to a dashboard.


5. What happens when a user clicks a visual tile on a dashboard?

A. The tile expands in place
B. The dashboard refreshes
C. The source report opens
D. A detailed tooltip appears

Correct Answer: C

Explanation:
Clicking a dashboard tile opens the underlying report that the tile was pinned from.


6. Which tile types support data alerts in Power BI dashboards?

A. Table and matrix
B. KPI, card, and gauge
C. Any visual
D. Image and text box

Correct Answer: B

Explanation:
Data alerts are supported on KPI, card, and gauge tiles, making dashboards useful for monitoring thresholds.


7. A user shares a dashboard with a colleague, but the colleague cannot see the data. What is the most likely reason?

A. The dashboard is unpublished
B. The dataset has not refreshed
C. The colleague lacks access to the dataset
D. The visuals are not interactive

Correct Answer: C

Explanation:
Sharing a dashboard does not grant dataset access. Users must have permission to the underlying dataset.


8. Which statement about dashboards is TRUE?

A. Dashboards store their own data
B. Dashboards can be edited in Power BI Desktop
C. Dashboards support drill-through
D. Dashboards reflect live dataset data

Correct Answer: D

Explanation:
Dashboards do not store data; they reflect the current state of the underlying datasets.


9. What is a key limitation of Power BI dashboards compared to reports?

A. Dashboards cannot be shared
B. Dashboards cannot include visuals
C. Dashboards have limited interactivity
D. Dashboards cannot refresh data

Correct Answer: C

Explanation:
Dashboards are designed for at-a-glance insights and have limited filtering, slicing, and interaction.


10. What is the primary purpose of a Power BI dashboard?

A. Detailed data exploration
B. Data modeling
C. High-level monitoring and summarization
D. Row-level security configuration

Correct Answer: C

Explanation:
Dashboards are built for monitoring key metrics and summarizing performance across the business.


Final Exam Tip

If a question mentions:

  • Multiple datasets
  • Single-page view
  • Pinned visuals
  • Executive or KPI monitoring

➡️ The correct answer is likely to be Dashboard.


Go back to the PL-300 Exam Prep Hub main page

Practice Questions: Choose a distribution method (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub; and this topic falls under these sections:
Manage and secure Power BI (15–20%)
--> Create and manage workspaces and assets
--> Choose a distribution method


Below are 10 practice questions (with answers and explanations) for this topic of the exam.
There are also 2 practice tests for the PL-300 exam with 60 questions each (with answers) available on the hub.

Practice Questions

Question 1

You need to distribute a set of certified reports to 300 business users. Users should not be able to modify the reports, and updates should be centrally managed. What is the best distribution method?

A. Share each report individually
B. Grant Viewer access to the workspace
C. Publish a Power BI app
D. Create email subscriptions

Correct Answer: C

Explanation:
Power BI apps provide centralized, read-only distribution to large audiences and allow content to be updated without re-sharing. This is the recommended enterprise distribution method.


Question 2

Which distribution method is most appropriate for collaboration among report developers?

A. Power BI app
B. Publish to web
C. Workspace access
D. Dashboard subscription

Correct Answer: C

Explanation:
Workspace access allows contributors and members to create, edit, and manage content collaboratively. Apps are designed for consumption, not development.


Question 3

An executive wants to receive a weekly snapshot of a sales dashboard without logging into Power BI. What should you configure?

A. Report sharing
B. Power BI app
C. Dashboard subscription
D. Workspace Viewer access

Correct Answer: C

Explanation:
Subscriptions deliver scheduled email snapshots of dashboards or reports and are ideal for passive consumption.


Question 4

Which distribution method should be used only for non-confidential data?

A. Power BI app
B. Secure embed
C. Workspace Viewer access
D. Publish to web

Correct Answer: D

Explanation:
Publish to web makes reports publicly accessible with no authentication. It is inappropriate for sensitive or internal data.


Question 5

You share a report with a user, but they cannot see the data. What is the most likely cause?

A. The user lacks workspace access
B. The dataset permissions were not granted
C. The report is not certified
D. Row-level security is disabled

Correct Answer: B

Explanation:
Sharing a report does not automatically grant access to the underlying dataset unless permissions are explicitly assigned.


Question 6

Your organization wants users to access Power BI reports inside Microsoft Teams. Which distribution option supports this scenario?

A. Publish to web
B. Power BI Embedded
C. SharePoint or Teams embed
D. Dashboard subscription

Correct Answer: C

Explanation:
Power BI reports can be securely embedded in Teams and SharePoint while respecting Power BI permissions.


Question 7

Which distribution method provides the strongest governance and version control?

A. Individual report sharing
B. Workspace access
C. Dashboard subscriptions
D. Power BI app

Correct Answer: D

Explanation:
Apps allow controlled publishing, versioning, and consistent delivery of approved content to users.


Question 8

You want to distribute reports to external customers using a custom web application. What is the best option?

A. Publish to web
B. Power BI Embedded
C. Share reports
D. Power BI app

Correct Answer: B

Explanation:
Power BI Embedded allows secure integration of reports into custom applications with external users.


Question 9

Which statement about workspace access is true?

A. It is the recommended method for executive distribution
B. It restricts users to read-only access
C. It is intended for collaboration and content management
D. It replaces the need for Power BI apps

Correct Answer: C

Explanation:
Workspace access is designed for creators and collaborators, not for broad report consumption.


Question 10

A question on the PL-300 exam describes a scenario involving production reports, a large audience, and controlled access. Which answer is most likely correct?

A. Share reports
B. Workspace Viewer access
C. Publish to web
D. Power BI app

Correct Answer: D

Explanation:
These keywords strongly indicate the use of a Power BI app, which is the exam’s preferred answer for governed distribution scenarios.


Final Exam Tips

  • Apps = enterprise distribution
  • Workspaces = collaboration
  • Sharing = small, ad hoc access
  • Subscriptions = passive consumption
  • Publish to web = public data only

Go back to the PL-300 Exam Prep Hub main page

Practice Questions: Publish, import, or update items in a workspace (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub; and this topic falls under these sections:
Manage and secure Power BI (15–20%)
--> Create and manage workspaces and assets
--> Publish, import, or update items in a workspace


Below are 10 practice questions (with answers and explanations) for this topic of the exam.
There are also 2 practice tests for the PL-300 exam with 60 questions each (with answers) available on the hub.

Practice Questions


Question 1

What happens when you publish a PBIX file from Power BI Desktop to a workspace that already contains a report with the same name?

A. Power BI creates a duplicate report
B. Power BI merges the changes
C. Power BI overwrites the existing report and semantic model
D. Power BI creates a new dataset only

Correct Answer: C

Explanation:
Publishing a PBIX with the same name replaces both the report and the semantic model in the workspace. This behavior is commonly tested on the exam.


Question 2

Which workspace role is required to publish content from Power BI Desktop to a shared workspace?

A. Viewer
B. Contributor
C. Viewer or Contributor
D. Viewer or Member

Correct Answer: B

Explanation:
A Contributor (or higher) role is required to publish content. Viewers have read-only access and cannot publish or update items.


Question 3

Which method allows you to bring an Excel workbook into a Power BI workspace as a report and dataset?

A. Publish from Power BI Desktop
B. Import the Excel file in the Power BI Service
C. Copy from another workspace
D. Use a deployment pipeline

Correct Answer: B

Explanation:
Excel files can be imported directly in the Power BI Service, which creates reports and datasets in the workspace.


Question 4

You need to update a report’s visuals without changing the underlying semantic model. What is the most appropriate action?

A. Delete and recreate the dataset
B. Republish the PBIX
C. Edit the report in the Power BI Service
D. Import the report again

Correct Answer: C

Explanation:
If permissions allow, editing visuals in the Power BI Service updates the report without replacing the dataset.


Question 5

Which action requires you to manually update a workspace app for users to see the change?

A. Adding a user to the workspace
B. Editing a report in the workspace
C. Refreshing a dataset
D. Applying row-level security

Correct Answer: B

Explanation:
Changes made in the workspace do not automatically appear in the app. The app must be republished.


Question 6

Which item can be edited directly in the Power BI Service without using Power BI Desktop?

A. PBIX file
B. Dataflow
C. Dataset schema
D. Power BI template

Correct Answer: B

Explanation:
Dataflows are created and edited directly in the Power BI Service and support centralized data preparation.


Question 7

What is the primary difference between publishing and importing content into a workspace?

A. Publishing creates dashboards; importing does not
B. Importing always overwrites existing content
C. Publishing is done from Desktop; importing is done from the Service
D. Importing does not create datasets

Correct Answer: C

Explanation:
Publishing is typically done from Power BI Desktop, while importing occurs within the Power BI Service using files or other sources.


Question 8

Which workspace role can update reports but cannot publish or update a workspace app?

A. Viewer
B. Contributor
C. Member
D. Admin

Correct Answer: B

Explanation:
Contributors can publish and update content but cannot manage workspace settings or apps, which is restricted to Members and Admins.


Question 9

You want to move the same report through Development, Test, and Production environments. What is the recommended approach?

A. Use My Workspace
B. Import the report multiple times
C. Publish the PBIX to multiple workspaces
D. Share the report with different users

Correct Answer: C

Explanation:
Publishing the same PBIX to separate workspaces supports lifecycle management and environment separation.


Question 10

After importing a PBIX file into a workspace, which action is often required before users can view updated data?

A. Update the app
B. Configure data source credentials
C. Assign workspace roles
D. Enable row-level security

Correct Answer: B

Explanation:
Imported datasets often require credential configuration before refreshes can occur and data becomes available.


Exam Tip

If a question mentions:

  • Overwriting content
  • Workspace roles
  • Publishing vs importing
  • Updating without breaking access

➡️ Focus on workspace permissions, publish behavior, and app update requirements.


Go back to the PL-300 Exam Prep Hub main page

Practice Questions: Configure and Update a Workspace App (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub; and this topic falls under these sections:
Manage and secure Power BI (15–20%)
--> Create and manage workspaces and assets
--> Configure and Update a Workspace App


Below are 10 practice questions (with answers and explanations) for this topic of the exam.
There are also 2 practice tests for the PL-300 exam with 60 questions each (with answers) available on the hub.

Practice Questions


Question 1

What is the primary purpose of publishing a workspace app in Power BI?

A. To allow multiple developers to edit reports simultaneously
B. To provide a read-only, curated experience for report consumers
C. To improve dataset refresh performance
D. To apply row-level security to reports

Correct Answer: B

Explanation:
Workspace apps are designed for content consumption, not development. They provide a controlled, read-only experience for users, which is why they are preferred over direct report sharing for large audiences.


Question 2

Which workspace roles can publish or update a workspace app?

A. Viewer only
B. Contributor and Viewer
C. Member and Admin
D. Contributor and Member

Correct Answer: C

Explanation:
Only Members and Admins have permission to publish or update workspace apps. Contributors can create content but cannot manage apps.


Question 3

You update a report in a workspace, but users do not see the changes in the app. What must you do?

A. Refresh the dataset
B. Clear the app cache
C. Republish or update the app
D. Reassign user permissions

Correct Answer: C

Explanation:
Changes made in the workspace do not automatically appear in the app. You must explicitly update (republish) the app for consumers to see the changes.


Question 4

Which feature allows different users to see different content within the same workspace app?

A. Row-level security (RLS)
B. Dataset roles
C. App audiences
D. Visual-level filters

Correct Answer: C

Explanation:
App audiences control content visibility, allowing different groups to see different reports or dashboards without duplicating content.


Question 5

Which permission allows users of an app to build their own reports using the app’s semantic model?

A. Viewer permission
B. Allow reshare
C. Build permission
D. Admin permission

Correct Answer: C

Explanation:
Granting Build permission allows users to connect to the underlying semantic model for scenarios such as Analyze in Excel or creating new reports.


Question 6

What happens to users when a workspace app is updated?

A. They must be re-added to the app
B. Their bookmarks are deleted
C. They automatically see the updated content
D. Their permissions are reset

Correct Answer: C

Explanation:
After an app is updated, users retain access and permissions, and the updated content becomes available automatically without additional action.


Question 7

Which scenario is the best use case for a workspace app?

A. Collaborative report development
B. Testing new visuals
C. Distributing finalized reports to a large audience
D. Debugging DAX calculations

Correct Answer: C

Explanation:
Workspace apps are ideal for broad distribution of production-ready content, while workspaces remain the collaboration area for developers.


Question 8

Which of the following is true about workspace apps?

A. Users can edit reports in an app
B. Apps automatically update when workspace content changes
C. Apps provide a controlled navigation experience
D. Apps replace the need for workspaces

Correct Answer: C

Explanation:
Apps allow creators to control navigation, ordering, and visibility of content. They are read-only and require manual updates.


Question 9

You want to hide a report from most users but make it available to executives using the same app. What should you do?

A. Duplicate the report into a new workspace
B. Use row-level security
C. Create a separate app
D. Use app audiences

Correct Answer: D

Explanation:
App audiences allow selective visibility of content without duplication, making them ideal for role-based access to reports.


Question 10

What is the key difference between a workspace and a workspace app?

A. Workspaces support data refresh; apps do not
B. Workspaces are for collaboration; apps are for consumption
C. Apps allow editing; workspaces do not
D. Apps automatically apply security

Correct Answer: B

Explanation:
A workspace is where content is created and maintained, while a workspace app is the distribution layer for end users.


Exam Tip

If a question mentions:

  • Broad distribution
  • Read-only access
  • Controlled release
  • Audiences
  • Updating without disrupting users

➡️ The correct answer is almost always Workspace App.


Go back to the PL-300 Exam Prep Hub main page

Practice Questions: Identify when a gateway is required (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub; and this topic falls under these sections:
Manage and secure Power BI (15–20%)
--> Create and manage workspaces and assets
--> Identify when a gateway is required


Below are 10 practice questions (with answers and explanations) for this topic of the exam.
There are also 2 practice tests for the PL-300 exam with 60 questions each (with answers) available on the hub.

Practice Questions


Question 1

You publish a Power BI report that imports data from an on-premises SQL Server and want to schedule daily refreshes in the Power BI service. What is required?

A. No additional configuration
B. A Power BI app
C. An on-premises data gateway
D. A premium capacity workspace

Correct Answer: C

Explanation:
Scheduled refresh from an on-premises data source requires a gateway to securely connect Power BI service to the local SQL Server.


Question 2

A dataset uses Azure SQL Database in Import mode with scheduled refresh enabled. Is a gateway required?

A. Yes, because scheduled refresh is enabled
B. Yes, because Import mode is used
C. No, because the data source is cloud-based
D. No, because the dataset is small

Correct Answer: C

Explanation:
Azure SQL Database is a cloud data source that Power BI can access directly, so no gateway is needed.


Question 3

You create a Power BI report using DirectQuery to an on-premises SQL Server. When users view the report in the Power BI service, what is required?

A. A gateway
B. A scheduled refresh
C. Import mode
D. Power BI Premium

Correct Answer: A

Explanation:
DirectQuery sends queries at report view time. A gateway is required for on-premises sources.


Question 4

Which scenario does NOT require a Power BI gateway?

A. Importing data from SharePoint Online
B. DirectQuery to an on-premises database
C. Refreshing an on-premises dataflow
D. Live connection to on-premises SSAS

Correct Answer: A

Explanation:
SharePoint Online is a cloud-based service and does not require a gateway.


Question 5

A report combines data from Azure Data Lake Storage and an on-premises file share. What is true?

A. No gateway is required because one source is cloud-based
B. A gateway is required for the on-premises source
C. A gateway is required for both sources
D. Gateways are not supported for mixed data sources

Correct Answer: B

Explanation:
Any on-premises data source used in the Power BI service requires a gateway, even in hybrid datasets.


Question 6

While working in Power BI Desktop, you connect to an on-premises SQL Server and refresh data locally. Is a gateway required?

A. Yes, always
B. Yes, if Import mode is used
C. No, gateways are only needed in the Power BI service
D. No, if DirectQuery is used

Correct Answer: C

Explanation:
Power BI Desktop connects directly to local data sources. Gateways are only required after publishing to the Power BI service.


Question 7

You want to refresh a Power BI dataflow that connects to an on-premises Oracle database. What is required?

A. Power BI Premium
B. A gateway
C. A paginated report
D. An app workspace

Correct Answer: B

Explanation:
Dataflows that use on-premises data sources require a gateway to refresh in the Power BI service.


Question 8

Which connection type always requires a gateway when the data source is on-premises?

A. Import with manual refresh
B. Import with scheduled refresh
C. DirectQuery
D. Both B and C

Correct Answer: D

Explanation:
Scheduled refresh and DirectQuery both require a gateway for on-premises data sources.


Question 9

A report uses a Live connection to an on-premises Analysis Services model. What is required?

A. A dataset refresh schedule
B. A gateway
C. Import mode
D. A certified dataset

Correct Answer: B

Explanation:
Live connections to on-premises Analysis Services require a gateway for real-time queries.


Question 10

Which factor is the most important when deciding if a gateway is required?

A. Dataset size
B. Data refresh frequency
C. Location of the data source
D. Number of report users

Correct Answer: C

Explanation:
Gateway requirements are based on whether the data source is accessible from the cloud or located on-premises.


Exam Tips

  • On-premises + Power BI service = Gateway
  • Cloud sources do not require gateways
  • DirectQuery and Live connections still require gateways
  • Desktop-only work never requires a gateway

Go back to the PL-300 Exam Prep Hub main page

Understanding the Power BI DAX “GENERATE / ROW” Pattern

The GENERATE / ROW pattern is an advanced but powerful DAX technique used to dynamically create rows and expand tables based on calculations. It is especially useful when you need to produce derived rows, combinations, or scenario-based expansions that don’t exist physically in your data model.

This article explains what the pattern is, when to use it, how it works, and provides practical examples. It assumes you are familiar with concepts such as row context, filter context, and iterators.


What Is the GENERATE / ROW Pattern?

At its core, the pattern combines two DAX functions:

  • GENERATE() – Iterates over a table and returns a union of tables generated for each row.
  • ROW() – Creates a single-row table with named columns and expressions.

Together, they allow you to:

  • Loop over an outer table
  • Generate one or more rows per input row
  • Shape those rows using calculated expressions

In effect, this pattern mimics a nested loop or table expansion operation.


Why This Pattern Exists

DAX does not support procedural loops like for or while.
Instead, iteration happens through table functions.

GENERATE() fills a critical gap by allowing you to:

  • Produce variable numbers of rows per input row
  • Apply row-level calculations while preserving relationships and context

Function Overview

GENERATE

GENERATE (
    table1,
    table2
)

  • table1: The outer table being iterated.
  • table2: A table expression evaluated for each row of table1.

The result is a flattened table containing all rows returned by table2 for every row in table1.
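
As a quick illustration of this flattening behavior, here is a minimal sketch (the 'Date'[Year] column is a hypothetical example): each outer year is paired with every row produced by the inner expression.

Year Quarter Grid =
GENERATE (
    VALUES ( 'Date'[Year] ),
    SELECTCOLUMNS (
        GENERATESERIES ( 1, 4 ),
        "Quarter", [Value]
    )
)

The output contains one row per year/quarter combination, exactly like a nested loop.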


ROW

ROW (
    "ColumnName1", Expression1,
    "ColumnName2", Expression2
)

  • Returns a single-row table
  • Expressions are evaluated in the current row context
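
For instance, a standalone ROW() call (the names and values below are purely illustrative) produces a one-row, two-column table:

Target Row =
ROW ( "Year", 2024, "Target", 100000 )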

When Should You Use the GENERATE / ROW Pattern?

This pattern is ideal when:

✅ You Need to Create Derived Rows

Examples:

  • Generating “Start” and “End” rows per record
  • Creating multiple event types per transaction

✅ You Need Scenario or Category Expansion

Examples:

  • Actual vs Forecast vs Budget rows
  • Multiple pricing or discount scenarios

✅ You Need Row-Level Calculations That Produce Rows

Examples:

  • Expanding date ranges into multiple calculated milestones
  • Generating allocation rows per entity

❌ When Not to Use It

  • Simple aggregations → use SUMX, ADDCOLUMNS
  • Static lookup tables → use calculated tables or Power Query
  • High-volume fact tables without filtering (can be expensive)

Basic Example: Expanding Rows with Labels

Scenario

You have a Sales table:

OrderID   Amount
1         100
2         200

You want to generate two rows per order:

  • One for Gross
  • One for Net (90% of gross)

DAX Code

Sales Breakdown =
GENERATE (
    Sales,
    UNION (
        ROW (
            "Type", "Gross",
            "Value", Sales[Amount]
        ),
        ROW (
            "Type", "Net",
            "Value", Sales[Amount] * 0.9
        )
    )
)


Result

OrderID   Type    Value
1         Gross   100
1         Net     90
2         Gross   200
2         Net     180

Key Concept: Row Context

Inside ROW():

  • You are operating in row context
  • Columns from the outer table (Sales) are directly accessible
  • No need for EARLIER() or variables in most cases

This makes the pattern cleaner and easier to reason about.


Intermediate Example: Scenario Modeling

Scenario

You want to model multiple pricing scenarios for each product.

Product   BasePrice
A         50
B         100

Scenarios:

  • Standard (100%)
  • Discounted (90%)
  • Premium (110%)

DAX Code

Product Pricing Scenarios =
GENERATE (
    Products,
    UNION (
        ROW ( "Scenario", "Standard",   "Price", Products[BasePrice] ),
        ROW ( "Scenario", "Discounted", "Price", Products[BasePrice] * 0.9 ),
        ROW ( "Scenario", "Premium",    "Price", Products[BasePrice] * 1.1 )
    )
)


Result

Product   Scenario     Price
A         Standard     50
A         Discounted   45
A         Premium      55
B         Standard     100
B         Discounted   90
B         Premium      110

Advanced Example: Date-Based Expansion

Scenario

For each project, generate two milestone rows:

  • Start Date
  • End Date

Project   StartDate    EndDate
X         2024-01-01   2024-03-01

DAX Code

Project Milestones =
GENERATE (
    Projects,
    UNION (
        ROW (
            "Milestone", "Start",
            "Date", Projects[StartDate]
        ),
        ROW (
            "Milestone", "End",
            "Date", Projects[EndDate]
        )
    )
)

This is especially useful for timeline visuals or event-based reporting.


Performance Considerations ⚠️

The GENERATE / ROW pattern can be computationally expensive.

Best Practices

  • Filter the outer table as early as possible (see the sketch after this list)
  • Avoid using it on very large fact tables
  • Prefer calculated tables over measures when expanding rows
  • Test with realistic data volumes
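
As an example of filtering early, here is a minimal sketch reusing the Sales table from above and assuming it also has an OrderDate column (an assumption for illustration): the outer table is reduced before any rows are generated.

Recent Sales Breakdown =
GENERATE (
    FILTER ( Sales, Sales[OrderDate] >= DATE ( 2024, 1, 1 ) ),
    UNION (
        ROW ( "Type", "Gross", "Value", Sales[Amount] ),
        ROW ( "Type", "Net",   "Value", Sales[Amount] * 0.9 )
    )
)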

Common Mistakes

❌ Using GENERATE When ADDCOLUMNS Is Enough

If you’re only adding columns—not rows—ADDCOLUMNS() is simpler and faster.
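
For comparison, a minimal sketch: if each order only needs a derived Net column rather than additional rows, ADDCOLUMNS() does the job without expanding the table.

Sales With Net =
ADDCOLUMNS (
    Sales,
    "Net", Sales[Amount] * 0.9
)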

❌ Forgetting Table Shape Consistency

All ROW() expressions combined with UNION() must return the same column structure.

❌ Overusing It in Measures

This pattern is usually better suited for calculated tables, not measures.


Mental Model to Remember

Think of the GENERATE / ROW pattern as:

“For each row in this table, generate one or more calculated rows and stack them together.”

If that sentence describes your problem, this pattern is likely the right tool.


Final Thoughts

The GENERATE / ROW pattern is one of those DAX techniques that feels complex at first—but once understood, it unlocks entire classes of modeling and analytical solutions that are otherwise impossible.

Used thoughtfully, it can replace convoluted workarounds, reduce model complexity, and enable powerful scenario-based reporting.

Thanks for reading!

Exam Prep Hub for DP-600: Implementing Analytics Solutions Using Microsoft Fabric

This is your one-stop hub with information for preparing for the DP-600: Implementing Analytics Solutions Using Microsoft Fabric certification exam. Upon successful completion of the exam, you earn the Fabric Analytics Engineer Associate certification.

This hub provides information directly here, links to a number of external resources, tips for preparing for the exam, practice tests, and section questions to help you prepare. Bookmark this page and use it as a guide to ensure that you are fully covering all relevant topics for the exam and using as many of the resources available as possible. We hope you find it convenient and helpful.

Why take the DP-600: Implementing Analytics Solutions Using Microsoft Fabric exam to earn the Fabric Analytics Engineer Associate certification?

Most likely, you already know why you want to earn this certification, but in case you are seeking information on its benefits, here are a few:
(1) career advancement, because Microsoft Fabric is a leading data platform used by companies of all sizes around the world and is likely to become even more popular;
(2) greater job opportunities due to the edge provided by the certification;
(3) higher earnings potential;
(4) expanded knowledge of the Fabric platform, because preparing takes you beyond what you would normally do on the job;
(5) immediate credibility regarding your knowledge; and
(6) greater confidence in your knowledge and skills.


Important DP-600 resources:


DP-600: Skills measured as of October 31, 2025:

Here you can learn in a structured manner by going through the topics of the exam one-by-one to ensure full coverage; click on each hyperlinked topic below to go to more information about it:

Skills at a glance

  • Maintain a data analytics solution (25%-30%)
  • Prepare data (45%-50%)
  • Implement and manage semantic models (25%-30%)

Maintain a data analytics solution (25%-30%)

Implement security and governance

Maintain the analytics development lifecycle

Prepare data (45%-50%)

Get Data

Transform Data

Query and analyze data

Implement and manage semantic models (25%-30%)

Design and build semantic models

Optimize enterprise-scale semantic models


Practice Exams:

We have provided 2 practice exams with answers to help you prepare.

DP-600 Practice Exam 1 (60 questions with answer key)

DP-600 Practice Exam 2 (60 questions with answer key)


Good luck to you passing the DP-600: Implementing Analytics Solutions Using Microsoft Fabric certification exam and earning the Fabric Analytics Engineer Associate certification!

DP-600 Practice Exam 1 (60 questions with answer key)

This post is a part of the DP-600: Implementing Analytics Solutions Using Microsoft Fabric Exam Prep Hub. Bookmark this hub and use it as a guide to help you prepare for the DP-600 certification exam.

This is a practice exam for the DP-600: Implementing Analytics Solutions Using Microsoft Fabric certification exam.
– It contains 60 questions of varying type and difficulty.
– The answer key is located at the end of the exam, after all the questions. We recommend that you try to answer the questions before looking at the answers.
– Upon successful completion of the official certification exam, you earn the Fabric Analytics Engineer Associate certification.

Good luck to you!


SECTION A – Prepare Data (Questions 1–24)

Question 1 (Single Choice)

You need to ingest CSV files from an Azure Data Lake Gen2 account into a Lakehouse with minimal transformation. Which option is most appropriate?

A. Power BI Desktop
B. Dataflow Gen2
C. Warehouse COPY INTO
D. Spark notebook


Question 2 (Multi-Select – Choose TWO)

Which Fabric components support both ingestion and transformation of data?

A. Dataflow Gen2
B. Eventhouse
C. Spark notebooks
D. SQL analytics endpoint
E. Power BI Desktop


Question 3 (Scenario – Single Choice)

Your team wants to browse datasets across workspaces and understand lineage and ownership before using them. Which feature should you use?

A. Deployment pipelines
B. OneLake catalog
C. Power BI lineage view
D. XMLA endpoint


Question 4 (Single Choice)

Which statement best describes Direct Lake?

A. Data is cached in VertiPaq during refresh
B. Queries run directly against Delta tables in OneLake
C. Queries always fall back to DirectQuery
D. Requires incremental refresh


Question 5 (Matching)

Match the Fabric item to its primary use case:

Item             Use Case
1. Lakehouse     A. High-concurrency SQL analytics
2. Warehouse     B. Event streaming and time-series
3. Eventhouse    C. Open data storage + Spark

Question 6 (Single Choice)

Which ingestion option is best for append-only, high-volume streaming telemetry?

A. Dataflow Gen2
B. Eventstream to Eventhouse
C. Warehouse COPY INTO
D. Power Query


Question 7 (Scenario – Single Choice)

You want to join two large datasets without materializing the result. Which approach is most appropriate?

A. Power Query merge
B. SQL VIEW
C. Calculated table in DAX
D. Dataflow Gen2 output table


Question 8 (Multi-Select – Choose TWO)

Which actions help reduce data duplication in Fabric?

A. Using shortcuts in OneLake
B. Creating multiple Lakehouses per workspace
C. Sharing semantic models
D. Importing the same data into multiple models


Question 9 (Single Choice)

Which column type is required for incremental refresh?

A. Integer
B. Text
C. Boolean
D. Date/DateTime


Question 10 (Scenario – Single Choice)

Your dataset contains nulls in a numeric column used for aggregation. What is the best place to handle this?

A. DAX measure
B. Power Query
C. Report visual
D. RLS filter


Question 11 (Single Choice)

Which Power Query transformation is foldable in most SQL sources?

A. Adding an index column
B. Filtering rows
C. Custom M function
D. Merging with fuzzy match


Question 12 (Multi-Select – Choose TWO)

Which scenarios justify denormalizing data?

A. Star schema reporting
B. OLTP transactional workloads
C. High-performance analytics
D. Reducing DAX complexity


Question 13 (Single Choice)

Which operation increases cardinality the most?

A. Removing unused columns
B. Splitting a text column
C. Converting text to integer keys
D. Aggregating rows


Question 14 (Scenario – Single Choice)

You need reusable transformations across multiple datasets. What should you create?

A. Calculated columns
B. Shared semantic model
C. Dataflow Gen2
D. Power BI template


Question 15 (Fill in the Blank)

The two required Power Query parameters for incremental refresh are __________ and __________.


Question 16 (Single Choice)

Which Fabric feature allows querying data without copying it into a workspace?

A. Shortcut
B. Snapshot
C. Deployment pipeline
D. Calculation group


Question 17 (Scenario – Single Choice)

Your SQL query performance degrades after adding many joins. What is the most likely cause?

A. Low concurrency
B. Snowflake schema
C. Too many measures
D. Too many visuals


Question 18 (Multi-Select – Choose TWO)

Which tools can be used to query Lakehouse data?

A. Spark SQL
B. T-SQL via SQL endpoint
C. KQL
D. DAX Studio


Question 19 (Single Choice)

Which language is used primarily with Eventhouse?

A. SQL
B. Python
C. KQL
D. DAX


Question 20 (Scenario – Single Choice)

You want to analyze slowly changing dimensions historically. Which approach is best?

A. Overwrite rows
B. Incremental refresh
C. Type 2 dimension design
D. Dynamic RLS


Question 21 (Single Choice)

Which feature helps understand downstream dependencies?

A. Impact analysis
B. Endorsement
C. Sensitivity labels
D. Git integration


Question 22 (Multi-Select – Choose TWO)

Which options support data aggregation before reporting?

A. SQL views
B. DAX calculated columns
C. Power Query group by
D. Report-level filters


Question 23 (Single Choice)

Which scenario best fits a Warehouse?

A. Machine learning experimentation
B. Real-time telemetry
C. High-concurrency BI queries
D. File-based storage only


Question 24 (Scenario – Single Choice)

You want to reuse report layouts without embedding credentials. What should you use?

A. PBIX
B. PBIP
C. PBIT
D. PBIDS



SECTION B – Implement & Manage Semantic Models (Questions 25–48)

Question 25 (Single Choice)

Which schema is recommended for semantic models?

A. Snowflake
B. Star
C. Fully normalized
D. Graph


Question 26 (Scenario – Single Choice)

You have a many-to-many relationship between Sales and Promotions. What should you implement?

A. Bi-directional filters
B. Bridge table
C. Calculated column
D. Duplicate dimension


Question 27 (Multi-Select – Choose TWO)

Which storage modes support composite models?

A. Import
B. DirectQuery
C. Direct Lake
D. Live connection


Question 28 (Single Choice)

What is the primary purpose of calculation groups?

A. Reduce model size
B. Replace measures
C. Apply reusable calculations
D. Improve refresh speed


Question 29 (Scenario – Single Choice)

You need users to switch between metrics dynamically in visuals. What should you use?

A. Bookmarks
B. Calculation groups
C. Field parameters
D. Perspectives


Question 30 (Single Choice)

Which DAX pattern generally performs best?

A. SUMX(FactTable, [Column])
B. FILTER + CALCULATE
C. Simple aggregations
D. Nested iterators


Question 31 (Multi-Select – Choose TWO)

Which actions improve DAX performance?

A. Use variables
B. Increase cardinality
C. Avoid unnecessary iterators
D. Use bi-directional filters everywhere


Question 32 (Scenario – Single Choice)

Your model exceeds memory limits but queries are fast. What should you configure?

A. Incremental refresh
B. Large semantic model storage
C. DirectQuery fallback
D. Composite model


Question 33 (Single Choice)

Which tool is best for diagnosing slow visuals?

A. Tabular Editor
B. Performance Analyzer
C. Fabric Monitor
D. SQL Profiler


Question 34 (Scenario – Single Choice)

A Direct Lake model fails to read data. What happens next if fallback is enabled?

A. Query fails
B. Switches to Import
C. Switches to DirectQuery
D. Rebuilds partitions


Question 35 (Single Choice)

Which feature enables version control for Power BI artifacts?

A. Deployment pipelines
B. Git integration
C. XMLA endpoint
D. Endorsements


Question 36 (Matching)

Match the DAX function type to its example:

Type                 Function
1. Iterator          A. CALCULATE
2. Filter modifier   B. SUMX
3. Information       C. ISFILTERED

Question 37 (Scenario – Single Choice)

You want recent data queried in real time and historical data cached. What should you use?

A. Import only
B. DirectQuery only
C. Hybrid table
D. Calculated table


Question 38 (Single Choice)

Which relationship direction is recommended by default?

A. Both
B. Single
C. None
D. Many-to-many


Question 39 (Multi-Select – Choose TWO)

Which features help enterprise-scale governance?

A. Sensitivity labels
B. Endorsements
C. Personal bookmarks
D. Private datasets


Question 40 (Scenario – Single Choice)

Which setting most affects model refresh duration?

A. Number of measures
B. Incremental refresh policy
C. Number of visuals
D. Report theme


Question 41 (Single Choice)

What does XMLA primarily enable?

A. Real-time streaming
B. Advanced model management
C. Data ingestion
D. Visualization authoring


Question 42 (Fill in the Blank)

Direct Lake reads data directly from __________ stored in __________.


Question 43 (Scenario – Single Choice)

Your composite model uses both Import and DirectQuery. What is this called?

A. Live model
B. Hybrid model
C. Large model
D. Calculated model


Question 44 (Single Choice)

Which optimization reduces relationship ambiguity?

A. Snowflake schema
B. Bridge tables
C. Bidirectional filters
D. Hidden columns


Question 45 (Scenario – Single Choice)

Which feature allows formatting measures dynamically (e.g., %, currency)?

A. Perspectives
B. Field parameters
C. Dynamic format strings
D. Aggregation tables


Question 46 (Multi-Select – Choose TWO)

Which features support reuse across reports?

A. Shared semantic models
B. PBIT files
C. PBIX imports
D. Report-level measures


Question 47 (Single Choice)

Which modeling choice most improves query speed?

A. Snowflake schema
B. High-cardinality columns
C. Star schema
D. Many calculated columns


Question 48 (Scenario – Single Choice)

You want to prevent unnecessary refreshes when data hasn’t changed. What should you enable?

A. Large model
B. Detect data changes
C. Direct Lake fallback
D. XMLA read-write



SECTION C – Maintain & Govern (Questions 49–60)

Question 49 (Single Choice)

Which role provides full control over a Fabric workspace?

A. Viewer
B. Contributor
C. Admin
D. Member


Question 50 (Multi-Select – Choose TWO)

Which security mechanisms are item-level?

A. RLS
B. CLS
C. Workspace roles
D. Object-level security


Question 51 (Scenario – Single Choice)

You want to mark a dataset as trusted. What should you apply?

A. Sensitivity label
B. Endorsement
C. Certification
D. RLS


Question 52 (Single Choice)

Which pipeline stage is typically used for validation?

A. Development
B. Test
C. Production
D. Sandbox


Question 53 (Single Choice)

Which access control restricts specific tables or columns?

A. Workspace role
B. RLS
C. Object-level security
D. Sensitivity label


Question 54 (Scenario – Single Choice)

Which feature allows reviewing downstream report impact before changes?

A. Lineage view
B. Impact analysis
C. Git diff
D. Performance Analyzer


Question 55 (Multi-Select – Choose TWO)

Which actions help enforce data governance?

A. Sensitivity labels
B. Certified datasets
C. Personal workspaces
D. Shared capacities


Question 56 (Single Choice)

Which permission is required to deploy content via pipelines?

A. Viewer
B. Contributor
C. Admin
D. Member


Question 57 (Fill in the Blank)

Row-level security filters data at the __________ level.


Question 58 (Scenario – Single Choice)

You want Power BI Desktop artifacts to integrate cleanly with Git. What format should you use?

A. PBIX
B. PBIP
C. PBIT
D. PBIDS


Question 59 (Single Choice)

Which governance feature integrates with Microsoft Purview?

A. Endorsements
B. Sensitivity labels
C. Deployment pipelines
D. Field parameters


Question 60 (Scenario – Single Choice)

Which role can certify a dataset?

A. Viewer
B. Contributor
C. Dataset owner or admin
D. Any workspace member

DP-600 PRACTICE EXAM

FULL ANSWER KEY & EXPLANATIONS


SECTION A – Prepare Data (1–24)


Question 1

Correct Answer: B – Dataflow Gen2

Explanation:
Dataflow Gen2 is designed for low-code ingestion and transformation from files, including CSVs, into Fabric Lakehouses.

Why others are wrong:

  • A: Power BI Desktop is not an ingestion tool for Lakehouses
  • C: COPY INTO is SQL-based and less suitable for CSV transformation
  • D: Spark is overkill for simple ingestion

Question 2

Correct Answers: A and C

Explanation:

  • Dataflow Gen2 supports ingestion + transformation via Power Query
  • Spark notebooks support ingestion and complex transformations

Why others are wrong:

  • B: Eventhouse is optimized for streaming analytics
  • D: SQL endpoint is query-only
  • E: Power BI Desktop doesn’t ingest into Fabric storage

Question 3

Correct Answer: B – OneLake catalog

Explanation:
The OneLake catalog allows discovery, metadata browsing, and cross-workspace visibility.

Why others are wrong:

  • A: Pipelines manage deployment
  • C: Lineage view shows dependencies, not discovery
  • D: XMLA is for model management

Question 4

Correct Answer: B

Explanation:
Direct Lake queries Delta tables directly in OneLake without importing data into VertiPaq.

Why others are wrong:

  • A: That describes Import mode
  • C: Fallback is optional
  • D: Incremental refresh is not required

Question 5

Correct Matching:

  • 1 → C
  • 2 → A
  • 3 → B

Explanation:

  • Lakehouse = open storage + Spark
  • Warehouse = high-concurrency SQL
  • Eventhouse = streaming/time-series

Question 6

Correct Answer: B

Explanation:
Eventstream → Eventhouse is optimized for high-volume streaming telemetry.


Question 7

Correct Answer: B – SQL VIEW

Explanation:
Views allow joins without materializing data.

Why others are wrong:

  • A/C/D materialize or duplicate data

Question 8

Correct Answers: A and C

Explanation:

  • Shortcuts avoid copying data
  • Shared semantic models reduce duplication

Question 9

Correct Answer: D

Explanation:
Incremental refresh requires a Date or DateTime column.


Question 10

Correct Answer: B

Explanation:
Handling nulls in Power Query ensures clean data before modeling.


Question 11

Correct Answer: B

Explanation:
Row filtering is highly foldable in SQL sources.


Question 12

Correct Answers: A and C

Explanation:
Denormalization improves performance and simplifies star schemas.


Question 13

Correct Answer: B

Explanation:
Splitting text columns increases cardinality dramatically.


Question 14

Correct Answer: C

Explanation:
Dataflow Gen2 enables reusable transformations.


Question 15

Correct Answer:
RangeStart and RangeEnd


Question 16

Correct Answer: A – Shortcut

Explanation:
Shortcuts allow querying data without copying it.


Question 17

Correct Answer: B

Explanation:
Snowflake schemas introduce excessive joins, hurting performance.


Question 18

Correct Answers: A and B

Explanation:
Lakehouse data can be queried via Spark SQL or SQL endpoint.


Question 19

Correct Answer: C – KQL


Question 20

Correct Answer: C

Explanation:
Type 2 dimensions preserve historical changes.


Question 21

Correct Answer: A – Impact analysis


Question 22

Correct Answers: A and C


Question 23

Correct Answer: C


Question 24

Correct Answer: C – PBIT

Explanation:
Templates exclude data and credentials.



SECTION B – Semantic Models (25–48)


Question 25

Correct Answer: B – Star schema


Question 26

Correct Answer: B – Bridge table


Question 27

Correct Answers: A and B

Explanation:
Composite models combine Import + DirectQuery.


Question 28

Correct Answer: C


Question 29

Correct Answer: C – Field parameters


Question 30

Correct Answer: C

Explanation:
Simple aggregations outperform iterators.
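
As a hedged illustration (hypothetical Sales table and Amount column), both measures below return the same total, but the plain aggregation lets the storage engine do the work without row-by-row iteration:

Total Sales (Simple) = SUM ( Sales[Amount] )

Total Sales (Iterator) = SUMX ( Sales, Sales[Amount] )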


Question 31

Correct Answers: A and C


Question 32

Correct Answer: B


Question 33

Correct Answer: B – Performance Analyzer


Question 34

Correct Answer: C


Question 35

Correct Answer: B – Git integration


Question 36

Correct Matching:

  • 1 → B
  • 2 → A
  • 3 → C

Question 37

Correct Answer: C – Hybrid table


Question 38

Correct Answer: B – Single


Question 39

Correct Answers: A and B


Question 40

Correct Answer: B


Question 41

Correct Answer: B


Question 42

Correct Answer:
Delta tables stored in OneLake


Question 43

Correct Answer: B – Hybrid model


Question 44

Correct Answer: B – Bridge tables


Question 45

Correct Answer: C


Question 46

Correct Answers: A and B


Question 47

Correct Answer: C


Question 48

Correct Answer: B – Detect data changes



SECTION C – Maintain & Govern (49–60)


Question 49

Correct Answer: C – Admin


Question 50

Correct Answers: B and D


Question 51

Correct Answer: B – Endorsement


Question 52

Correct Answer: B – Test


Question 53

Correct Answer: C


Question 54

Correct Answer: B – Impact analysis


Question 55

Correct Answers: A and B


Question 56

Correct Answer: C – Admin


Question 57

Correct Answer:
Row (data) level


Question 58

Correct Answer: B – PBIP


Question 59

Correct Answer: B – Sensitivity labels


Question 60

Correct Answer: C