Create and Modify Parameters (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub, and this topic falls under these sections:
Prepare the data (25–30%)
--> Get or connect to data
--> Create and Modify Parameters


Note that there are 10 practice questions (with answers and explanations) for each section to help you solidify your knowledge of the material. Also, there are 2 practice tests with 60 questions each available on the hub below the exam topics section.

As you prepare for the PL-300: Microsoft Power BI Data Analyst exam, one important skill to master is using parameters in Power Query to build flexible, reusable, and dynamic data connections and transformations. Parameters help you abstract static values—such as server names, file paths, or filter values—and manage them in a centralized way.


What Are Parameters in Power BI?

A parameter is a named value that you can use in Power Query to influence how queries run. Instead of hard-coding values directly into your query steps, a parameter lets you declare the value once and reference it throughout your queries so you can:

  • Change values globally without editing each query step.
  • Switch between environments (Dev, Test, Prod) easily.
  • Apply consistent filters or thresholds.
  • Support dynamic filtering and modular query design.

In many ways, parameters act like variables in the Power Query (M) engine—reusable building blocks you can plug into various parts of your query logic.
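
To make this concrete, here is a minimal sketch of a query that references a Text parameter named SourceFilePath instead of a hard-coded path (the parameter name, file path, and delimiter settings are hypothetical, not from an official example):

```
let
    // SourceFilePath is a Text parameter created via Manage Parameters;
    // its Current Value might be "C:\Data\sales.csv". Re-pointing the
    // query means changing the parameter, not this M code.
    Source = Csv.Document(File.Contents(SourceFilePath), [Delimiter = ",", Encoding = 65001]),
    PromotedHeaders = Table.PromoteHeaders(Source, [PromoteAllScalars = true])
in
    PromotedHeaders
```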


Where You Use Parameters

Parameters are especially useful when connecting and shaping data:

At the data source level

  • Define the server, database, folder, or file path.
  • Change the current environment without rewriting source steps.

In query logic

  • Filter tables based on parameter values.
  • Customize operations like date range filters or limit values.

For reusability

  • Use the same parameter value across multiple queries, reducing maintenance.
  • Make Power BI solutions easier to support if data locations change.

How to Create Parameters

In Power BI Desktop:

  1. Go to Home → Transform data to open the Power Query Editor.
  2. In the Power Query Editor, choose Manage Parameters → New Parameter.
  3. In the Manage Parameters dialog, provide:
    • Name (meaningful identifier for the parameter)
    • Description (optional, useful for documentation)
    • Required (whether a value must be entered)
    • Type (Text, Number, Date, etc.)
    • Suggested values (optional: helps users pick from list, default, etc.)
    • Current value (the value Power Query uses when the parameter is applied)

Once created, the parameter appears in Power Query as its own query and can be referenced in other queries.
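
Behind the scenes, a parameter is just an M expression with metadata attached. As a sketch, the Advanced Editor for a Number parameter named MinSalesThreshold might show something like this (the exact metadata record is generated by the Manage Parameters dialog):

```
// Advanced Editor view of a Number parameter named MinSalesThreshold.
// The metadata record is what marks this query as a parameter.
500 meta [IsParameterQuery = true, Type = "Number", IsParameterQueryRequired = true]
```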


Example Uses of Parameters

Here are typical scenarios where parameters are valuable:

1. Dynamic Data Source Connection

If a report needs to point to Development in one workflow and Production in another, you can create parameters for server name and database name, then reference them in the data source step.

This avoids hard-coding connection strings and simplifies environment switches without editing M code.
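
A hedged sketch of what that source step could look like, assuming two Text parameters named ServerName and DatabaseName (the server, database, and table names are hypothetical):

```
let
    // ServerName and DatabaseName are Text parameters, e.g.
    // "dev-sql.contoso.com" / "SalesDev" versus "prod-sql.contoso.com" / "SalesProd".
    Source = Sql.Database(ServerName, DatabaseName),
    // Navigate to a table; switching environments requires no edits here.
    SalesOrders = Source{[Schema = "dbo", Item = "SalesOrders"]}[Data]
in
    SalesOrders
```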


2. Reusable Filters

Suppose you want to filter a sales table by a minimum sales amount that might change often. Instead of editing the filter step directly each time, you use a parameter called MinSalesThreshold. If business needs change, you update the parameter once, and all queries referencing it update accordingly.
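
As a sketch, the filter step could reference the parameter like this (the server, table, and column names are illustrative):

```
let
    Source = Sql.Database("sql.contoso.com", "SalesDB"),
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // MinSalesThreshold is a Number parameter; updating its Current Value
    // re-applies this filter everywhere the parameter is referenced.
    Filtered = Table.SelectRows(Sales, each [SalesAmount] >= MinSalesThreshold)
in
    Filtered
```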


3. Parameter-Driven Queries

You can embed a parameter into custom functions or use it directly in the M query logic (e.g., in Table.SelectRows), making your query logic adaptable without editing M code every time.
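
For instance, here is a sketch of a small custom function that takes the threshold as an argument; the function and column names are hypothetical:

```
// FilterByThreshold: a reusable function that keeps rows at or above a
// minimum amount. Invoke it as FilterByThreshold(Sales, MinSalesThreshold)
// to drive the filter from the parameter.
(inputTable as table, threshold as number) as table =>
    Table.SelectRows(inputTable, each [SalesAmount] >= threshold)
```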


Modifying Parameters

Once a parameter is created:

  • Return to Manage Parameters in Power Query Editor to change metadata (name, description, type).
  • Update the Current Value to change how queries refresh with different values.
  • If you change suggested values, you may also redefine the list of allowed inputs.

Changing a parameter’s current value typically triggers a refresh of any queries that reference it, applying the new logic or sources immediately.


Best Practices for Parameters

Use clear names and descriptions
Someone else reviewing your model should immediately understand the purpose of each parameter.

Keep parameter types consistent
Ensure the data type matches how you intend to use it in queries to avoid type mismatch errors.

Leverage suggested values
If a parameter is intended to allow only certain options (e.g., environment names), defining suggested values improves usability and reduces errors.

Document your parameters
Include descriptions so others know what each parameter does and why it exists.


How This Appears on the PL-300 Exam

You may see exam scenarios such as:

  • “You need to allow end users to easily switch data sources without editing M code.”
  • “Modify a query so that multiple environments use the same report with different connection settings.”
  • “Filter the data using a parameter and explain how the data refresh behaves.”

To answer confidently:

  • Know how to create and configure parameters.
  • Understand how parameters change the behavior of queries and filtering.
  • Recognize when parameters simplify maintenance and governance.

This skill not only helps you build robust and flexible Power BI solutions but also aligns with the Prepare the data objectives in the PL-300 exam.


Practice Questions

Go to the Practice Exam Questions for this topic.

Choose Between DirectQuery and Import (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub, and this topic falls under these sections:
Prepare the data (25–30%)
--> Get or connect to data
--> Choose Between DirectQuery and Import


Note that there are 10 practice questions (with answers and explanations) at the end of each section to help you solidify your knowledge of the material. Also, there are 2 practice tests with 60 questions each available on the hub below the exam topics section.

One of the critical decisions a Power BI Data Analyst makes is how Power BI retrieves and interacts with data. The choice between DirectQuery and Import affects performance, modeling flexibility, scalability, and how data refresh is managed. This topic is core to the Get or connect to data objective and frequently appears in scenario-based exam questions.


What Are Import and DirectQuery?

Before comparing them, it’s important to understand what each mode does:

  • Import: Loads data from a data source into the Power BI dataset’s in-memory storage. Once imported, Power BI interacts with the data locally and refreshes it periodically.
  • DirectQuery: Does not store data in Power BI. Instead, it queries the source system in real time whenever the report needs data.

Import vs. DirectQuery – Conceptual Overview

Here’s the high-level conceptual difference:

Aspect | Import | DirectQuery
--- | --- | ---
Where data is stored | In Power BI’s in-memory engine | In the source system
Performance | Very fast (in-memory) | Depends on source performance
Data freshness | Requires scheduled refresh | Always up-to-date at query time
Modeling flexibility | Full | Limited
Query folding benefit | Yes | Limited (source dependent)
Refresh required | Yes | No
Suitable for large tables? | Depends on size | Yes
Depends on gateway? | Only for on-premises sources | Yes for on-premises sources

When to Choose Import

Choose Import when:

🟢 Performance and Interactivity Matter

Because data is loaded into memory, visuals and filters respond very quickly, which is ideal for dashboards and interactive exploration.

🟢 Full Modeling Flexibility Is Needed

Import supports:

  • Complex DAX
  • Calculated tables & columns
  • Relationship editing
  • Row-level security (RLS)

This makes it the most flexible and commonly used mode.

🟢 Data Volume is Manageable

For moderately sized datasets that fit comfortably in memory, Import is typically preferred.

Example Scenario

A finance team needs a dashboard that refreshes nightly and delivers fast slicing and dicing of historical sales data.

In this case, Import is the likely choice.


When to Choose DirectQuery

Opt for DirectQuery when:

🔵 Data Must Stay at Source

Some data may be too large or sensitive to copy into Power BI. With DirectQuery, data remains on the original system.

🔵 Real-Time or Near-Real-Time Data is Required

If reports must reflect the most current operational data instantly, DirectQuery ensures up-to-date results.

🔵 Data Volume Is Massive

For very large tables (e.g., billions of rows), Import is impractical. DirectQuery avoids memory constraints by querying only needed data.

🔵 Source System Can Handle Query Load

DirectQuery performance is heavily dependent on the source system’s ability to process queries quickly.

Example Scenario

A company needs a report showing up-to-the-second inventory levels from an operational database.

In this case, DirectQuery is a better fit.


Trade-Offs: What You Lose With DirectQuery

While DirectQuery offers real-time access and handles large data without importing, it has limitations:

Limitation | DirectQuery Impact
--- | ---
Modeling flexibility | Reduced (no calculated tables; limited DAX)
Performance | Depends on underlying source and network
Query load | Heavy impact on source system if not optimized
Transformation options | Limited Power Query support

Because of these trade-offs, exam scenarios often ask you to weigh performance vs. flexibility vs. freshness.


Composite Models: Best of Both Worlds

Power BI supports composite models, which allow mixing Import and DirectQuery within the same dataset.

Use cases:

  • Import smaller, static reference tables
  • DirectQuery larger, frequently updated fact tables

Composite models provide:

  • Query optimization
  • Flexibility across mixed scenarios

Exam questions may include composite model scenarios, so understanding both modes is helpful.


Decision Criteria (Exam-Ready Framework)

When a question asks “Should you use DirectQuery or Import?”, use this decision checklist:

  1. How current must the data be?
    • Real-time → DirectQuery
    • Static or refreshed periodically → Import
  2. Can the source handle query load?
    • Yes → DirectQuery
    • No → Import
  3. Is high performance critical?
    • Yes and data is manageable → Import
  4. Does modeling complexity matter?
    • Yes → Import
    • Minimal modeling needed → DirectQuery
  5. Is dataset very large?
    • Yes → DirectQuery
    • No → Import

Typical Exam Scenarios

Here are some real-feel patterns that may appear in exam questions:

📌 Scenario: Fast Interactive Reporting

Data refreshes once per day, users need fast slicing.
Answer: Import.

📌 Scenario: Very Large Operational Tables

Real-time insight into an operational system required.
Answer: DirectQuery.

📌 Scenario: Need Most Recent Data Without Refresh Schedule

Source changes constantly; data must reflect current state.
Answer: DirectQuery.

📌 Scenario: Mixed Requirements

A small lookup table and a large fact table.
Answer: Composite model (Import + DirectQuery).


Pitfalls and How to Avoid Them

❌ Mistake: Choosing DirectQuery “just in case”

DirectQuery reduces modeling capability and can slow performance if the source isn’t optimized.

Tip: Choose DirectQuery for specific reasons—not by default.

❌ Mistake: Ignoring refresh schedule

Import requires scheduled refreshes. Forgetting this can lead to stale visuals.

Tip: Always confirm refresh requirements before choosing Import.


Key Exam Takeaways

  • Import = data loaded into memory (fast & flexible)
  • DirectQuery = data queried live (real-time & scalable)
  • Composite models combine both
  • Use decision criteria based on performance, freshness, modeling, and source capability
  • Always consider source capacity and query load

Understanding why you choose one mode over the other is more important than memorizing UI steps—especially on the PL-300 exam.


Practice Questions

Go to the Practice Exam Questions for this topic.

Change Data Source Settings (including Credentials & Privacy Levels) (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub, and this topic falls under these sections:
Prepare the data (25–30%)
--> Get or connect to data
--> Change Data Source Settings, including Credentials & Privacy Levels


Note that there are 10 practice questions (with answers and explanations) at the end of each section to help you solidify your knowledge of the material. Also, there are 2 practice tests with 60 questions each available on the hub below the exam topics section.

Managing data source settings is a core responsibility of a Power BI Data Analyst and a frequently tested skill on the PL-300: Microsoft Power BI Data Analyst exam. This topic focuses on understanding how Power BI connects to data, how credentials are stored and updated, and how privacy levels affect data combination and security.

On the exam, you’re rarely asked where to click. The emphasis will likely be on conceptual understanding, decision-making, and troubleshooting. You’re likely to be tested on things like what happens when credentials are wrong, why a refresh fails, or how privacy levels affect query behavior.


What Are Data Source Settings in Power BI?

Data source settings define how Power BI authenticates, accesses, and combines data from one or more sources. These settings exist at two primary levels:

  • Power BI Desktop (development and modeling)
  • Power BI Service (publishing, refresh, and sharing)

Key components include:

  • Authentication credentials
  • Privacy levels
  • Data source permissions
  • Refresh behavior

Managing Credentials

What Are Credentials?

Credentials determine who Power BI is when it connects to a data source. If credentials are invalid, expired, or mismatched between environments, refreshes will fail.

Common Authentication Types (Exam-Relevant)

  • Windows authentication
    • On-premises SQL Server
    • Often paired with gateways
  • Database authentication
    • SQL login stored in Power BI
    • Independent of user identity
  • Microsoft Entra ID (OAuth)
    • Cloud sources (Azure SQL, Power BI datasets)
    • Most common for Power BI Service
  • Anonymous
    • Public web data
    • Rare in enterprise scenarios

Where Credentials Are Managed

In Power BI Desktop

  • File → Options and settings → Data source settings
  • Credentials are stored locally
  • Used during development and testing

In Power BI Service

  • Managed per dataset
  • Required for scheduled refresh
  • Often paired with a gateway for on-premises data

📌 Exam tip:
Credentials in Desktop do not automatically carry over to the Power BI Service.


Updating or Changing Credentials

You may need to update credentials when:

  • Passwords expire
  • Data sources move environments
  • Ownership of a dataset changes
  • Refresh failures occur after publishing

On the exam, expect scenarios like:

“A dataset refresh fails after being published. It worked in Power BI Desktop.”

The likely solution:

  • Update credentials in Power BI Service
  • Configure or reconfigure the gateway

Understanding Privacy Levels

Privacy levels control how Power BI isolates data sources when combining data to prevent unintended data leakage.

The Three Privacy Levels

Public

  • Data is publicly available
  • Lowest level of isolation
  • Examples: public websites

Organizational

  • Internal company data
  • Can be shared within the organization
  • Common for corporate databases

Private

  • Highly sensitive data
  • Strict isolation
  • Cannot be freely combined with other sources

Why Privacy Levels Matter

When Power BI combines multiple data sources:

  • It enforces privacy isolation
  • It may block query folding
  • It may require additional processing

📌 Key exam concept:
Privacy levels affect data combination behavior, not just security.


Privacy Levels and Query Folding

Privacy levels can:

  • Prevent Power BI from pushing transformations back to the source
  • Force local evaluation of queries
  • Reduce performance

Example:

  • Combining a Private SQL database with a Public web source may prevent query folding (see the sketch below).
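
A minimal sketch of such a combination (the server, database, URL, and column names are hypothetical); with mismatched privacy levels, Power Query isolates the two sources, which can block folding or surface a Formula.Firewall error:

```
let
    // Marked Private in data source settings
    Customers = Sql.Database("sql.contoso.com", "CRM"){[Schema = "dbo", Item = "Customer"]}[Data],
    // Marked Public in data source settings; assumed to return a list of records
    Regions = Table.FromRecords(Json.Document(Web.Contents("https://api.example.com/regions"))),
    // Joining a Private source to a Public one forces privacy isolation,
    // so this merge may not fold back to SQL Server.
    Merged = Table.NestedJoin(Customers, {"RegionId"}, Regions, {"RegionId"}, "Region", JoinKind.LeftOuter)
in
    Merged
```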

📌 Exam takeaway:
Privacy settings can impact performance, not just access.


The “Ignore Privacy Levels” Setting

Power BI Desktop allows you to:

  • Ignore privacy levels (for development or testing)

⚠️ Important for the exam:

  • This setting does not apply in Power BI Service
  • Service refresh always enforces privacy levels

Common Exam Scenarios and What They’re Testing

Scenario 1: Refresh Fails in Service

Tests:

  • Understanding of credential scope
  • Knowledge of service-side configuration

Correct thinking:

  • Credentials must be set in the Power BI Service
  • Gateway may be required

Scenario 2: Combining Multiple Data Sources

Tests:

  • Privacy level understanding
  • Data leakage prevention

Correct thinking:

  • Choose appropriate privacy levels
  • Understand impact on query folding

Scenario 3: Dataset Ownership Changes

Tests:

  • Credential reassignment
  • Dataset security awareness

Correct thinking:

  • Credentials may need to be re-entered
  • Refresh ownership matters

Key Exam Takeaways

For the PL-300 exam, remember:

  • Credentials define who Power BI is when accessing data
  • Desktop and Service credentials are separate
  • Privacy levels control data isolation
  • Privacy levels affect data combination and performance
  • Ignoring privacy levels works only in Desktop
  • Refresh failures often point to credential or gateway issues

If you understand why Power BI behaves the way it does with credentials and privacy, you’ll answer these questions confidently—even when the wording is tricky.


Practice Questions

Go to the Practice Exam Questions for this topic.

Identify and Connect to Data Sources or a Shared Semantic Model (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub, and this topic falls under these sections:
Prepare the data (25–30%)
--> Get or connect to data
--> Identify and connect to data sources or a shared semantic model


Note that there are 10 practice questions (with answers and explanations) for each section to help you solidify your knowledge of the material. Also, there are 2 practice tests with 60 questions each available on the hub below the exam topics section.

One of the first and most foundational skills tested in the PL-300: Microsoft Power BI Data Analyst exam is the ability to identify, select, and connect to appropriate data sources. This skill lives within the Prepare the data domain and underpins everything that follows—modeling, visualization, and analysis.

A Power BI Data Analyst must understand where data lives, how to connect to it efficiently, and when to reuse existing models instead of building new ones.


Understanding Data Source Types in Power BI

Power BI supports a wide variety of data sources, and the exam expects familiarity with common enterprise and self-service scenarios rather than obscure connectors.

Common Data Source Categories

File-based sources

  • Excel workbooks
  • CSV and text files
  • XML and JSON files
  • PDF files (structured tables)

Database sources

  • SQL Server
  • Azure SQL Database
  • Azure Synapse Analytics
  • Oracle
  • MySQL / PostgreSQL

Cloud and SaaS sources

  • SharePoint Online lists and files
  • OneDrive
  • Dataverse
  • Azure Blob Storage
  • Azure Data Lake Storage Gen2

Power Platform and Power BI sources

  • Power BI datasets (shared semantic models)
  • Dataflows
  • Datamarts

Exam tip: You are not expected to memorize every connector—focus on recognizing the correct source for a given scenario.


Choosing the Right Connectivity Mode

When connecting to data, Power BI offers three primary connectivity modes, and the exam frequently tests your understanding of when to use each one.

Import Mode

  • Data is loaded into the Power BI model
  • Fast performance
  • Supports full DAX and modeling capabilities
  • Requires dataset refreshes

Use when:
Data size is manageable and performance is critical.


DirectQuery Mode

  • Queries data directly at the source
  • Minimal data stored in Power BI
  • Performance depends on the source system
  • Some modeling and DAX limitations

Use when:
Data is very large or must remain in the source system.


Live Connection

  • Used with shared semantic models (Power BI datasets) or Analysis Services
  • No local model created
  • Modeling is done in the source dataset

Use when:
Connecting to a centrally governed dataset managed by another team.


Connecting to a Shared Semantic Model

A shared semantic model (formerly called a “dataset”) allows analysts to reuse existing data models instead of duplicating logic.

What Is a Shared Semantic Model?

  • A centrally managed Power BI dataset
  • Contains curated tables, relationships, measures, and security
  • Published to the Power BI Service
  • Designed for reuse across multiple reports

Benefits of Using Shared Semantic Models

  • Consistent metrics and definitions
  • Reduced duplication of logic
  • Centralized governance and security
  • Faster report development

How to Connect

In Power BI Desktop:

  1. Select Get data
  2. Choose Power BI datasets
  3. Select an existing dataset from the workspace
  4. Create a report using a Live connection

Exam tip:
When connected to a shared semantic model:

  • You cannot modify relationships
  • You cannot add calculated tables
  • You can create report-level measures (thin reports)

Authentication and Access Considerations

The exam also expects awareness of authentication methods, especially in enterprise environments.

Common authentication types include:

  • Windows authentication
  • Database authentication
  • Microsoft Entra ID (Azure AD)
  • OAuth

Key concept:
Your ability to connect depends on permissions at the source, not just in Power BI.


Identifying the Best Data Source for a Scenario

You may see exam questions that describe a business requirement and ask you to identify the most appropriate data source or connection method.

Example Scenarios

  • A finance team needs fast, interactive reports → Import mode
  • A large transactional database updated every minute → DirectQuery
  • Multiple teams need consistent KPIs → Shared semantic model
  • Files stored in SharePoint and updated weekly → SharePoint Folder connector (see the sketch below)
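
For the SharePoint scenario, here is a hedged sketch of the connection step, assuming a hypothetical site URL and folder path; SharePoint.Files lists every file in the site, which is then filtered to the target folder:

```
let
    // Hypothetical site URL; authentication is handled by the connector.
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/finance", [ApiVersion = 15]),
    // Keep only the weekly drop folder before combining the files.
    WeeklyFiles = Table.SelectRows(Source, each [Folder Path] = "https://contoso.sharepoint.com/sites/finance/Shared Documents/Weekly/")
in
    WeeklyFiles
```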

Key Exam Takeaways

For the PL-300 exam, remember the following:

  • Understand major data source categories
  • Know Import vs DirectQuery vs Live connection
  • Recognize when to use shared semantic models
  • Be aware of limitations when using Live connections
  • Choose data sources based on performance, governance, and scale

This topic is less about clicking buttons and more about making the right architectural decision—a critical skill for a Power BI Data Analyst.


Power BI Connection Modes – Comparison Table (PL-300 Exam Focus)

Feature / Aspect | Import | DirectQuery | Live Connection
--- | --- | --- | ---
Where data is stored | Loaded into Power BI dataset | Remains in source system | Remains in source dataset or Analysis Services
Performance | Fastest (in-memory) | Depends on source performance | Depends on source dataset
Data refresh required | Yes (scheduled or manual) | No (queries run live) | No (always live)
Data volume support | Limited by dataset size | Very large datasets | Very large datasets
Modeling allowed | Full modeling supported | Limited modeling | No model changes allowed
Create relationships | Yes | Limited | No
Create calculated tables | Yes | Limited | No
Create calculated columns | Yes | Limited | No
Create measures (DAX) | Yes | Yes (with limitations) | Yes (report-level only)
Power Query transformations | Fully supported | Limited (query folding dependent) | Not supported
Row-level security (RLS) | Supported | Supported | Inherited from source
Typical sources | Excel, CSV, SQL, SharePoint | Large SQL, Azure SQL, Synapse | Power BI datasets, Analysis Services
Best use case | High performance, small-to-medium data | Near-real-time or massive data | Reusing governed enterprise models
Exam frequency | Very high | High | Very high

When to Choose Each Mode (Exam Scenarios)

Import Mode

Choose Import when:

  • Performance is the top priority
  • Data size is manageable
  • Full modeling flexibility is needed

📌 Most common and safest answer on the exam unless constraints are stated.


DirectQuery

Choose DirectQuery when:

  • Data is extremely large
  • Data must stay at the source
  • Near real-time reporting is required

📌 Expect questions about performance trade-offs and modeling limitations.


Live Connection

Choose Live Connection when:

  • Using a shared semantic model
  • Consistent metrics are required across reports
  • Centralized governance is in place

📌 Often paired with “thin report” exam scenarios.


High-Value Exam Tips ⭐

  • If the question mentions enterprise governance or shared datasets → think Live Connection
  • If it mentions large, frequently changing data → think DirectQuery
  • If it mentions fast visuals and full modeling → think Import
  • Live connections do not allow model changes
  • Import mode requires refresh
  • DirectQuery relies on query folding

Practice Exam Questions

Go to the Practice Exam Questions for this section.

Practice Questions: Identify and connect to data sources or a shared semantic model (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub, and this topic falls under these sections:
Prepare the data (25–30%)
--> Get or connect to data
--> Identify and connect to data sources or a shared semantic model


Below are 10 practice questions (with answers and explanations) for this topic of the exam.
There are also 2 practice tests for the PL-300 exam with 60 questions each (with answers) available on the hub.

Practice Questions

Question 1

A report requires the fastest possible query performance and the ability to create calculated tables and columns. The dataset size is moderate and updates occur nightly. Which connection mode should you use?

A. DirectQuery
B. Live connection
C. Import
D. Composite model

✅ Correct Answer: C

Explanation:
Import mode stores data in memory, provides the best performance, and supports full modeling capabilities.


Question 2

A company has a centrally governed Power BI dataset with standardized KPIs used across multiple departments. You need to build a report using this dataset without modifying the model. What should you use?

A. Import mode
B. DirectQuery mode
C. Live connection
D. Power BI dataflow

✅ Correct Answer: C

Explanation:
Live connections are used to connect to shared semantic models and enforce centralized governance.


Question 3

Which limitation exists when using a Live connection to a Power BI dataset?

A. Measures cannot be created
B. Relationships cannot be modified
C. Data refresh cannot be scheduled
D. Visual interactions are disabled

✅ Correct Answer: B

Explanation:
When using a Live connection, the data model is read-only and cannot be changed.


Question 4

A dataset must query a transactional database containing billions of rows with near real-time requirements. The data must remain at the source system. Which connection mode is most appropriate?

A. Import
B. Live connection
C. DirectQuery
D. Power BI dataflow

✅ Correct Answer: C

Explanation:
DirectQuery queries the data directly at the source and supports very large datasets.


Question 5

You connect to a shared semantic model in Power BI Service. Which type of measures can you create?

A. Calculated column measures
B. Dataset-level measures
C. Report-level measures only
D. No measures can be created

✅ Correct Answer: C

Explanation:
With a Live connection, only report-level measures are allowed.


Question 6

Which data source scenario is best suited for Import mode?

A. Streaming IoT telemetry
B. Large fact table updated every second
C. Excel files stored in SharePoint and refreshed weekly
D. Enterprise semantic model shared across teams

✅ Correct Answer: C

Explanation:
File-based sources with scheduled refreshes are ideal for Import mode.


Question 7

Which statement about DirectQuery is true?

A. All Power Query transformations are supported
B. Data is cached in memory for performance
C. Query performance depends on the source system
D. Calculated tables are fully supported

✅ Correct Answer: C

Explanation:
DirectQuery sends queries to the source system, so performance depends on the source.


Question 8

When should you connect to a Power BI dataset instead of importing data?

A. When you need to modify relationships
B. When consistent metrics across reports are required
C. When data volume is small
D. When Power Query transformations are required

✅ Correct Answer: B

Explanation:
Shared semantic models ensure consistency and governance across reports.


Question 9

Which authentication method is commonly used when connecting to cloud-based Power BI datasets?

A. Database authentication
B. Windows authentication
C. OAuth / Microsoft Entra ID
D. Anonymous authentication

✅ Correct Answer: C

Explanation:
Power BI Service connections typically use Microsoft Entra ID (OAuth).


Question 10

A Power BI report must use data from both an imported Excel file and a DirectQuery SQL source. Which feature enables this scenario?

A. Live connection
B. Power BI dataflow
C. Composite model
D. Shared semantic model

✅ Correct Answer: C

Explanation:
Composite models allow combining Import and DirectQuery data sources.


Exam Readiness Check ✅

If you can:

  • Quickly identify Import vs DirectQuery vs Live
  • Recognize shared semantic model scenarios
  • Recall modeling limitations by connection type

…then you’re in great shape for this PL-300 objective.


Go back to the PL-300 Exam Prep Hub main page

Practice Questions: Change data source settings, including credentials and privacy levels (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub, and this topic falls under these sections:
Prepare the data (25–30%)
--> Get or connect to data
--> Change data source settings, including credentials and privacy levels


Below are 10 practice questions (with answers and explanations) for this topic of the exam.
There are also 2 practice tests for the PL-300 exam with 60 questions each (with answers) available on the hub.

Practice Questions

Question 1

A dataset refresh succeeds in Power BI Desktop but fails after being published to the Power BI Service. What is the most likely cause?

A. Privacy levels were set incorrectly in Desktop
B. Credentials were not configured in the Power BI Service
C. Query folding was disabled
D. The dataset uses Import mode

✅ Correct Answer: B

Explanation:
Credentials used in Power BI Desktop do not automatically transfer to the Power BI Service. They must be configured separately.


Question 2

Which authentication method is most commonly used when connecting to cloud-based data sources in the Power BI Service?

A. Windows authentication
B. Database authentication
C. OAuth (Microsoft Entra ID)
D. Anonymous authentication

✅ Correct Answer: C

Explanation:
Cloud data sources typically use OAuth through Microsoft Entra ID for secure authentication.


Question 3

What is the primary purpose of privacy levels in Power BI?

A. To encrypt data at rest
B. To restrict report sharing
C. To prevent unintended data leakage when combining sources
D. To improve dataset refresh performance

✅ Correct Answer: C

Explanation:
Privacy levels isolate data sources to prevent sensitive data from being unintentionally shared across sources.


Question 4

A Power BI model combines data from a Private SQL database and a Public web source. What is a potential impact of this configuration?

A. Scheduled refresh will be disabled
B. Query folding may be prevented
C. Calculated measures will not work
D. Row-level security will be removed

✅ Correct Answer: B

Explanation:
When combining sources with different privacy levels, Power BI may prevent query folding to enforce data isolation.


Question 5

Which privacy level should be assigned to internal corporate data sources?

A. Public
B. Anonymous
C. Organizational
D. Private

✅ Correct Answer: C

Explanation:
Organizational privacy level is intended for internal company data that can be shared within the organization.


Question 6

Which statement about the “Ignore privacy levels” option is true?

A. It applies to both Desktop and Service
B. It permanently disables privacy enforcement
C. It only applies in Power BI Desktop
D. It improves dataset refresh performance in the Service

✅ Correct Answer: C

Explanation:
Ignoring privacy levels is a Desktop-only setting and is not honored during refresh in the Power BI Service.


Question 7

When are credentials typically required to be re-entered?

A. When a report is viewed by another user
B. When the dataset owner changes
C. When a visual is modified
D. When a slicer is added

✅ Correct Answer: B

Explanation:
Changing dataset ownership often requires credentials to be re-entered to maintain refresh functionality.


Question 8

Which scenario requires the use of an on-premises data gateway?

A. Connecting to Azure SQL Database
B. Connecting to a Power BI dataset
C. Connecting to an on-premises SQL Server from the Service
D. Connecting to SharePoint Online

✅ Correct Answer: C

Explanation:
On-premises data sources require a gateway to allow the Power BI Service to access them.


Question 9

What happens if incorrect credentials are configured in the Power BI Service?

A. The report will not open
B. Visuals will display blank data
C. Scheduled refresh will fail
D. The dataset will be deleted

✅ Correct Answer: C

Explanation:
Incorrect credentials cause scheduled refresh failures, even if reports still open using cached data.


Question 10

Which action should you take first when a scheduled refresh suddenly starts failing?

A. Rebuild the data model
B. Change privacy levels
C. Check and update credentials in the Service
D. Disable query folding

✅ Correct Answer: C

Explanation:
Most unexpected refresh failures are caused by expired or changed credentials and should be checked first.


Exam Readiness Summary ✅

If you can confidently:

  • Distinguish Desktop vs Service credentials
  • Explain why privacy levels exist
  • Identify refresh and gateway issues
  • Understand how privacy levels affect query folding

…then you are well prepared for this PL-300 exam objective.


Go back to the PL-300 Exam Prep Hub main page

Practice Questions: Choose between DirectQuery and Import (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub, and this topic falls under these sections:
Prepare the data (25–30%)
--> Get or connect to data
--> Choose between DirectQuery and Import


Below are 10 practice questions (with answers and explanations) for this topic of the exam.
There are also 2 practice tests for the PL-300 exam with 60 questions each (with answers) available on the hub.

Practice Questions

Question 1

A Power BI report must deliver the fastest possible visual response. The dataset is moderate in size and refreshed once per day. Which connectivity mode should you choose?

A. DirectQuery
B. Live connection
C. Import
D. Composite model

✅ Correct Answer: C

Explanation:
Import mode stores data in memory, providing the fastest performance and full modeling capabilities.


Question 2

A report must show up-to-the-minute transaction data from a large operational database. Data must remain in the source system. What is the best option?

A. Import
B. DirectQuery
C. Live connection
D. Power BI dataflow

✅ Correct Answer: B

Explanation:
DirectQuery retrieves data directly from the source in real time and avoids importing large datasets.


Question 3

Which limitation is most commonly associated with DirectQuery?

A. No scheduled refresh support
B. Reduced modeling and DAX capabilities
C. Inability to use row-level security
D. Inability to connect to SQL Server

✅ Correct Answer: B

Explanation:
DirectQuery limits certain modeling features, including calculated tables and some DAX expressions.


Question 4

A dataset contains a small product lookup table and a very large fact table that updates continuously. What is the most appropriate solution?

A. Import both tables
B. Use DirectQuery for both tables
C. Use a composite model
D. Use a live connection

✅ Correct Answer: C

Explanation:
Composite models allow importing small static tables while using DirectQuery for large, frequently updated tables.


Question 5

Which factor has the greatest impact on report performance when using DirectQuery?

A. Number of visuals on the page
B. Power BI Desktop version
C. Performance of the source system
D. Dataset refresh frequency

✅ Correct Answer: C

Explanation:
DirectQuery sends queries to the source system, so performance depends heavily on the source’s ability to handle queries.


Question 6

When is Import mode generally not recommended?

A. When modeling flexibility is required
B. When dataset size exceeds practical memory limits
C. When reports need fast interactivity
D. When refresh can occur on a schedule

✅ Correct Answer: B

Explanation:
Very large datasets may exceed memory constraints, making Import impractical.


Question 7

Which statement about data freshness is true?

A. Import mode always shows real-time data
B. DirectQuery requires scheduled refresh
C. Import mode relies on dataset refresh
D. DirectQuery stores data in memory

✅ Correct Answer: C

Explanation:
Import mode displays data as of the last refresh, while DirectQuery retrieves data at query time.


Question 8

A Power BI report must support complex DAX measures and calculated tables. Data updates hourly and does not need real-time accuracy. What should you choose?

A. DirectQuery
B. Import
C. Live connection
D. Streaming dataset

✅ Correct Answer: B

Explanation:
Import mode supports full DAX and modeling flexibility and is appropriate when real-time data is not required.


Question 9

Which scenario is the best candidate for DirectQuery?

A. Monthly financial reporting
B. Historical trend analysis
C. Real-time inventory monitoring
D. Static reference data

✅ Correct Answer: C

Explanation:
Real-time or near-real-time monitoring scenarios are ideal for DirectQuery.


Question 10

Why might a Power BI Data Analyst avoid DirectQuery unless necessary?

A. It cannot connect to cloud data sources
B. It disables report sharing
C. It can negatively impact performance and modeling flexibility
D. It does not support security

✅ Correct Answer: C

Explanation:
DirectQuery introduces performance dependencies on the source system and limits modeling features, making Import preferable when possible.


Exam Readiness Check ✅

You’re well prepared for this PL-300 objective if you can:

  • Identify real-time vs scheduled refresh needs
  • Balance performance vs flexibility
  • Recognize large-scale data scenarios
  • Explain why DirectQuery is chosen—not just when

Go back to the PL-300 Exam Prep Hub main page

Practice Questions: Create and modify parameters (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub, and this topic falls under these sections:
Prepare the data (25–30%)
--> Get or connect to data
--> Create and modify parameters


Below are 10 practice questions (with answers and explanations) for this topic of the exam.
There are also 2 practice tests for the PL-300 exam with 60 questions each (with answers) available on the hub.

Practice Questions

Question 1

What is the primary purpose of using parameters in Power Query?

A. To improve visual rendering performance
B. To store calculated DAX expressions
C. To replace hard-coded values with reusable variables
D. To control report-level security

✅ Correct Answer: C

Explanation:
Parameters allow you to define reusable values that replace hard-coded inputs such as server names, file paths, or filter values.


Question 2

Which scenario is the best use case for Power Query parameters?

A. Creating dynamic slicers for end users
B. Switching between development and production databases
C. Defining relationships between tables
D. Creating calculated measures

✅ Correct Answer: B

Explanation:
Parameters are commonly used to manage environment-specific values like server and database names.


Question 3

Where are parameters created and managed in Power BI Desktop?

A. Report view
B. Model view
C. DAX editor
D. Power Query Editor

✅ Correct Answer: D

Explanation:
Parameters are created and modified in the Power Query Editor under Manage Parameters.


Question 4

Which data types can be assigned to a Power Query parameter?

A. Text only
B. Numeric only
C. Any DAX-supported data type
D. Text, Number, Date, DateTime, and Boolean

✅ Correct Answer: D

Explanation:
Power Query parameters support multiple data types, including text, numeric, and date-based types.


Question 5

What happens when you change the current value of a parameter that is used in a query?

A. The report visuals automatically change without refresh
B. Only the parameter value updates, not the data
C. The dependent queries are refreshed using the new value
D. The parameter is deleted and recreated

✅ Correct Answer: C

Explanation:
Changing a parameter’s current value causes any dependent queries to refresh using the updated value.


Question 6

Which option helps restrict users to predefined parameter values?

A. Required
B. Suggested values
C. Parameter type
D. Query folding

✅ Correct Answer: B

Explanation:
Suggested values allow you to define a list or range of acceptable parameter inputs.


Question 7

A Power BI dataset uses a parameter to define a file path. The file location changes. What is the most efficient way to update the dataset?

A. Rewrite the M code
B. Replace the query entirely
C. Update the parameter value
D. Create a calculated column

✅ Correct Answer: C

Explanation:
Updating the parameter value avoids editing query logic and ensures all dependent queries update automatically.


Question 8

Which statement about Power Query parameters is true?

A. Parameters can only be used for filtering rows
B. Parameters must be created using M code
C. Parameters appear as queries in Power Query
D. Parameters are only supported in Import mode

✅ Correct Answer: C

Explanation:
Parameters appear as special queries in Power Query and can be referenced by other queries.


Question 9

A parameter is used to filter data in a query. What happens if the parameter’s value does not match the expected data type?

A. Power BI automatically converts the value
B. The report loads with blank visuals
C. The query refresh may fail
D. The parameter is ignored

✅ Correct Answer: C

Explanation:
A mismatch between parameter type and usage can cause query errors during refresh.


Question 10

How are parameters most commonly tested on the PL-300 exam?

A. Memorizing the M syntax used to create them
B. Identifying UI navigation steps
C. Selecting the correct scenario where parameters improve flexibility
D. Writing custom functions

✅ Correct Answer: C

Explanation:
The exam focuses on when and why to use parameters rather than detailed syntax or UI steps.


Exam Readiness Summary ✅

You are well prepared for this topic if you can:

  • Explain why parameters are used
  • Identify environment and filtering scenarios
  • Understand how parameter changes affect refresh
  • Recognize parameters as a Power Query feature, not DAX

Go back to the PL-300 Exam Prep Hub main page

Practice Questions: Evaluate Data including Data Statistics & Column Properties (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub, and this topic falls under these sections:
Prepare the data (25–30%)
--> Profile and clean the data
--> Evaluate Data including Data Statistics & Column Properties


Below are 10 practice questions (with answers and explanations) for this topic of the exam.
There are also 2 practice tests for the PL-300 exam with 60 questions each (with answers) available on the hub.

Practice Questions

Question 1

Which Power Query feature helps you quickly identify null, error, and valid values in a column?

A. Column distribution
B. Column profile
C. Column quality
D. Data type detection

✅ Correct Answer: C

Explanation:
Column quality visually shows the percentage of valid, error, and empty (null) values in a column.


Question 2

You want to understand how often each value appears in a categorical column. Which tool should you use?

A. Column quality
B. Column distribution
C. Column profile
D. Default summarization

✅ Correct Answer: B

Explanation:
Column distribution displays value frequency and helps identify duplicates or unexpected values.


Question 3

Which Power Query tool provides statistics such as minimum, maximum, and average values?

A. Column quality
B. Column distribution
C. Column profile
D. Data view

✅ Correct Answer: C

Explanation:
Column profile provides detailed statistical information for numeric and date columns.


Question 4

A numeric column cannot be aggregated in a visual because it is treated as text. What should you evaluate first?

A. Column distribution
B. Column format
C. Column data type
D. Default summarization

✅ Correct Answer: C

Explanation:
Incorrect data types prevent aggregation. The data type should be corrected before formatting or summarization.


Question 5

Which setting determines how a numeric column behaves by default when added to a visual?

A. Column profile
B. Data type
C. Column format
D. Default summarization

✅ Correct Answer: D

Explanation:
Default summarization controls whether a numeric column sums, averages, counts, or does not summarize.


Question 6

When should data evaluation and profiling primarily take place?

A. In Report view
B. In Model view
C. In Power Query Editor
D. In the Power BI Service

✅ Correct Answer: C

Explanation:
Data profiling and evaluation are performed in Power Query before data is loaded into the model.


Question 7

You suspect a column contains unexpected negative values. Which feature helps confirm this?

A. Column quality
B. Column distribution
C. Column profile
D. Column format

✅ Correct Answer: C

Explanation:
Column profile provides min/max statistics, making it ideal for detecting invalid ranges.


Question 8

Which column property affects how values are displayed but not how they are calculated?

A. Data type
B. Column format
C. Column profile
D. Default summarization

✅ Correct Answer: B

Explanation:
Formatting controls visual appearance (currency, percentage, date format) but does not affect calculations.


Question 9

A column shows a high percentage of error values during evaluation. What is the most likely implication?

A. The column has duplicate values
B. The column contains invalid or incompatible data
C. The column is incorrectly summarized
D. The column has incorrect formatting

✅ Correct Answer: B

Explanation:
Error values typically indicate issues such as invalid conversions or incompatible data types.


Question 10

Which pairing of Power Query tools and purpose is correct?

A. Column quality → value frequency analysis
B. Column distribution → identifying null values
C. Column profile → statistical analysis
D. Column format → detecting outliers

✅ Correct Answer: C

Explanation:
Column profile is used for statistical analysis, including min, max, and average values.


Exam Readiness Check ✅

You’re well prepared for this PL-300 topic if you can:

  • Distinguish between column quality, distribution, and profile
  • Identify which tool answers which question
  • Understand how column properties affect modeling and visuals
  • Recognize that data evaluation happens before modeling

Go back to the PL-300 Exam Prep Hub main page

Practice Questions: Resolve inconsistencies, unexpected or null values, and data quality issues (PL-300 Exam Prep)

This post is a part of the PL-300: Microsoft Power BI Data Analyst Exam Prep Hub, and this topic falls under these sections:
Prepare the data (25–30%)
--> Profile and clean the data
--> Resolve inconsistencies, unexpected or null values, and data quality issues


Below are 10 practice questions (with answers and explanations) for this topic of the exam.
There are also 2 practice tests for the PL-300 exam with 60 questions each (with answers) available on the hub.

Practice Questions

1. Identifying Null Values

You open Power Query and want to quickly identify which columns contain null values. Which feature should you use?

A. Column distribution
B. Column quality
C. Column profile
D. Query dependencies

Correct Answer: B

Explanation:
Column Quality displays the percentage of Valid, Error, and Empty (null) values, making it the fastest way to identify nulls in a column.


2. Replacing Null Values Appropriately

A numeric column contains null values that should be treated as zero for reporting purposes. What is the BEST approach?

A. Remove rows with null values
B. Replace null values with 0
C. Fill down the column
D. Change the column data type

Correct Answer: B

Explanation:
Replacing nulls with zero is appropriate when the business meaning supports it. Removing rows would result in data loss.


3. Fixing Inconsistent Text Values

A column contains values such as “USA”, “usa”, and “Usa”. What is the MOST efficient way to standardize these values?

A. Replace values manually
B. Create a conditional column
C. Apply text formatting (Uppercase)
D. Remove duplicate rows

Correct Answer: C

Explanation:
Applying a text format such as Uppercase standardizes casing efficiently across the entire column.


4. Detecting Unexpected Values

You suspect a column contains invalid numeric values outside the expected range. Which Power Query feature helps identify this?

A. Column quality
B. Column distribution
C. Column profile
D. Data type conversion

Correct Answer: C

Explanation:
Column Profile shows statistics such as minimum, maximum, and average, making it ideal for detecting outliers and unexpected values.


5. Understanding Data Profiling Scope

Why might column profiling results appear inaccurate in Power Query?

A. Profiling only works for text columns
B. Profiling is disabled by default
C. Profiling is based on a data sample
D. Profiling ignores null values

Correct Answer: C

Explanation:
By default, Power Query profiles only a sample of rows. You must enable “Column profiling based on entire dataset” for full accuracy.


6. Handling Error Values

A column contains error values caused by invalid data type conversions. What is the BEST first step?

A. Remove rows with errors
B. Replace errors with null
C. Fix the transformation causing the error
D. Change the column name

Correct Answer: C

Explanation:
Best practice is to address the root cause of errors rather than simply removing or replacing them.


7. Removing Leading and Trailing Spaces

A text column appears to have duplicate values due to extra spaces. Which transformation should you apply?

A. Clean
B. Trim
C. Replace values
D. Capitalize Each Word

Correct Answer: B

Explanation:
Trim removes leading and trailing spaces, which commonly cause hidden inconsistencies in text data.


8. Cleaning Non-Printable Characters

Some text values contain hidden characters that affect filtering and grouping. Which transformation should you use?

A. Trim
B. Replace errors
C. Clean
D. Format → Lowercase

Correct Answer: C

Explanation:
The Clean transformation removes non-printable characters that are not visible but can impact data quality.


9. Resolving Missing Values in Hierarchical Data

A dataset contains a category value only in the first row of each group, with subsequent rows being null. What is the BEST solution?

A. Replace nulls with “Unknown”
B. Fill down
C. Remove null rows
D. Merge columns

Correct Answer: B

Explanation:
Fill Down propagates the previous non-null value and is commonly used for hierarchical or grouped data structures.


10. Examining Category Inconsistencies

You want to identify inconsistent category names such as misspellings or variations. Which tool is MOST useful?

A. Column quality
B. Column distribution
C. Query parameters
D. Conditional formatting

Correct Answer: B

Explanation:
Column Distribution shows value frequency, making it easy to spot variations and inconsistencies in categorical data.


✅ PL-300 Exam Takeaways

  • Know which profiling tool answers which question
  • Understand why a transformation is used, not just how
  • Prefer fixing issues in Power Query instead of DAX
  • Expect scenario-based questions asking for the best action

Go back to the PL-300 Exam Prep Hub main page