Practice Questions: Describe considerations for data ingestion and processing (DP-900 Exam Prep)

Practice Questions


Question 1

What is the primary purpose of data ingestion?

A. To visualize data
B. To store data permanently
C. To collect and import data into a system
D. To delete outdated data

Answer: C

Explanation:
Data ingestion is the process of bringing data into a storage or analytics system.


Question 2

Which type of ingestion processes data at scheduled intervals?

A. Stream ingestion
B. Batch ingestion
C. Real-time ingestion
D. Event-driven ingestion

Answer: B

Explanation:
Batch ingestion collects data and processes it at scheduled intervals (for example, hourly or nightly), rather than continuously.
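The difference between the two modes can be sketched in a few lines of plain Python. This is only an illustration of the concept, not how any Azure service is implemented; `process` is a hypothetical stand-in for a transformation step:

```python
def process(record: str) -> str:
    """Stand-in for a processing step (here: normalize to uppercase)."""
    return record.upper()

# Stream ingestion: each record is processed the moment it arrives.
stream_results = []
for record in ["r1", "r2", "r3"]:
    stream_results.append(process(record))   # handled immediately, one at a time

# Batch ingestion: records only accumulate at first; a scheduled job
# processes the whole buffer in one run later.
buffer = []
for record in ["r1", "r2", "r3"]:
    buffer.append(record)                     # collected, not yet processed
batch_results = [process(r) for r in buffer]  # runs at the scheduled time
```

Both paths produce the same results; what differs is *when* each record is processed, which is exactly the latency trade-off the exam asks about.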


Question 3

Which Azure service is commonly used for batch data ingestion?

A. Azure Event Hubs
B. Azure Data Factory
C. Azure Stream Analytics
D. Azure Virtual Machines

Answer: B

Explanation:
Azure Data Factory orchestrates scheduled (batch) ETL/ELT pipelines for moving and transforming data.


Question 4

Which scenario requires stream (real-time) ingestion?

A. Monthly sales reporting
B. Archiving old data
C. Monitoring live sensor data from IoT devices
D. Migrating historical records

Answer: C

Explanation:
Streaming ingestion handles continuous, real-time data, such as live telemetry from IoT devices.


Question 5

What is the primary benefit of stream processing?

A. Lower cost
B. Simpler architecture
C. Real-time insights
D. Reduced storage requirements

Answer: C

Explanation:
Stream processing enables low-latency, real-time analysis.


Question 6

Which Azure service is used for real-time data ingestion at scale?

A. Azure Synapse Analytics
B. Azure Blob Storage
C. Azure Event Hubs
D. Azure Files

Answer: C

Explanation:
Azure Event Hubs is designed for high-throughput streaming ingestion.


Question 7

Which type of processing is BEST suited for historical data analysis?

A. Stream processing
B. Batch processing
C. Real-time processing
D. Event-driven processing

Answer: B

Explanation:
Batch processing is ideal for large, historical datasets.


Question 8

Which factor is MOST important when choosing between batch and stream processing?

A. File format
B. Latency requirements
C. Storage account type
D. Programming language

Answer: B

Explanation:
The key decision factor is latency: how soon after data arrives must results be available.
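That latency-driven choice can be expressed as a toy decision rule. The 60-second threshold below is an arbitrary illustration, not an official guideline:

```python
def choose_processing(max_acceptable_delay_seconds: float) -> str:
    """Toy rule: pick a processing style from the latency requirement.

    If results must be available within about a minute of data arriving,
    streaming is the natural fit; otherwise a scheduled batch run suffices.
    """
    return "stream" if max_acceptable_delay_seconds < 60 else "batch"
```

For example, live IoT dashboards would map to `"stream"`, while a monthly sales report tolerates hours of delay and maps to `"batch"`.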


Question 9

Which Azure service is used to process streaming data in real time?

A. Azure Data Factory
B. Azure Stream Analytics
C. Azure SQL Database
D. Azure Files

Answer: B

Explanation:
Azure Stream Analytics processes real-time streaming data.
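Conceptually, a real-time engine like Stream Analytics evaluates queries over time windows as events flow in (the service itself uses a SQL-like query language). A minimal Python sketch of a tumbling-window average, with made-up sensor events, shows the idea:

```python
from collections import defaultdict

# Hypothetical sensor events: (seconds_since_start, temperature).
events = [(1, 20.0), (4, 22.0), (6, 21.0), (9, 23.0), (11, 25.0)]

def tumbling_avg(events, window_seconds=5):
    """Average readings per non-overlapping (tumbling) time window."""
    sums = defaultdict(lambda: [0.0, 0])
    for ts, value in events:
        window = ts // window_seconds  # window 0 covers [0,5), window 1 covers [5,10), ...
        sums[window][0] += value
        sums[window][1] += 1
    return {w: total / count for w, (total, count) in sorted(sums.items())}
```

Each event lands in exactly one window, so aggregates are emitted continuously as windows close, rather than waiting for a full dataset as batch processing would.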


Question 10

Which of the following is a key consideration when designing a data ingestion pipeline?

A. Screen resolution
B. Latency, scalability, and data volume
C. Programming language syntax
D. User interface design

Answer: B

Explanation:
Important considerations include latency, scalability, volume, and data quality.


✅ Quick Exam Takeaways

Data ingestion = bringing data into the system
Data processing = transforming data for analysis

✔ Two main approaches:

  • Batch → scheduled, high latency
  • Streaming → continuous, low latency

✔ Key Azure services:

  • Azure Data Factory → batch ingestion
  • Azure Event Hubs → streaming ingestion
  • Azure Stream Analytics → real-time processing
  • Azure Synapse Analytics → batch processing

✔ Key decision factor:
👉 Do you need real-time insights or not?


Go to the DP-900 Exam Prep Hub main page.
