Free DP-700 Sample Questions — Implementing Data Engineering Solutions Using Microsoft Fabric

Free DP-700 sample questions for the Implementing Data Engineering Solutions Using Microsoft Fabric exam. No account required: study at your own pace.

Want an interactive quiz? Take the full DP-700 practice test.

Looking for more? Click here to get the full PDF with 51+ practice questions ($10) for offline study and deeper preparation.

Question 1

You have a Fabric workspace that contains a lakehouse named Lakehouse1. In an external data source, you have data files that are 500 GB each. A new file is added every day. You need to ingest the data into Lakehouse1 without applying any transformations. The solution must meet the following requirements: • Trigger the process when a new file is added. • Provide the highest throughput. Which type of item should you use to ingest the data?

  • A. Eventstream
  • B. Dataflow Gen2
  • C. Streaming dataset
  • D. Data pipeline
Correct Answer:
D. Data pipeline
Question 2

You have a Fabric workspace that contains a lakehouse named Lakehouse1. In an external data source, you have data files that are 500 GB each. A new file is added every day. You need to ingest the data into Lakehouse1 without applying any transformations. The solution must meet the following requirements: • Trigger the process when a new file is added. • Provide the highest throughput. Which type of item should you use to ingest the data?

  • A. Data pipeline
  • B. Environment
  • C. KQL queryset
  • D. Dataflow Gen2
Correct Answer:
A. Data pipeline
Question 3

Your company has a sales department that uses two Fabric workspaces named Workspace1 and Workspace2. The company decides to implement a domain strategy to organize the workspaces. You need to ensure that a user can perform the following tasks: • Create a new domain for the sales department. • Create two subdomains: one for the east region and one for the west region. • Assign Workspace1 to the east region subdomain. • Assign Workspace2 to the west region subdomain. The solution must follow the principle of least privilege. Which role should you assign to the user?

  • A. Workspace admin
  • B. Domain admin
  • C. Domain contributor
  • D. Fabric admin
Correct Answer:
D. Fabric admin
Question 4

You have a Fabric warehouse named DW1. DW1 contains a table that stores sales data and is used by multiple sales representatives. You plan to implement row-level security (RLS). You need to ensure that the sales representatives can see only their respective data. Which warehouse object do you require to implement RLS?

  • A. STORED PROCEDURE
  • B. CONSTRAINT
  • C. SCHEMA
  • D. FUNCTION
Correct Answer:
D. FUNCTION
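
In a Fabric warehouse, RLS works by binding an inline table-valued function to a table as a filter predicate in a security policy, which is why FUNCTION is the required object. The following is a rough conceptual sketch in plain Python (not T-SQL); the table contents and user names are made up for illustration:

```python
# Conceptual model of an RLS filter predicate: the engine evaluates a
# function against each row, and only rows for which it returns True
# are visible to the querying user. All data here is hypothetical.
sales = [
    {"rep": "alice@contoso.com", "amount": 100},
    {"rep": "bob@contoso.com", "amount": 250},
]

def filter_predicate(row_rep: str, current_user: str) -> bool:
    """Plays the role of the inline table-valued FUNCTION that a
    security policy binds to the sales table."""
    return row_rep == current_user

def query_as(table, current_user):
    """What the warehouse effectively does once RLS is in place:
    every query is silently filtered by the predicate function."""
    return [row for row in table if filter_predicate(row["rep"], current_user)]
```

With this in place, each sales representative querying the same table sees only their own rows, with no change to the queries themselves.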
Question 5

You have a Fabric workspace that contains a write-intensive warehouse named DW1. DW1 stores staging tables that are used to load a dimensional model. The tables are often read once, dropped, and then recreated to process new data. You need to minimize the load time of DW1. What should you do?

  • A. Enable V-Order
  • B. Create statistics
  • C. Drop statistics
  • D. Disable V-Order
Correct Answer:
D. Disable V-Order
Question 6

You have a Fabric workspace named Workspace1. You plan to configure Git integration for Workspace1 by using an Azure DevOps Git repository. An Azure DevOps admin creates the required artifacts to support the integration of Workspace1. Which details do you require to perform the integration?

  • A. the organization, project, Git repository, and branch
  • B. the personal access token (PAT) for Git authentication and the Git repository URL
  • C. the project, Git repository, branch, and Git folder
  • D. the Git repository URL and the Git folder
Correct Answer:
A. the organization, project, Git repository, and branch
Question 7

You have five Fabric workspaces. You are monitoring the execution of items by using Monitoring hub. You need to identify in which workspace a specific item runs. Which column should you view in Monitoring hub?

  • A. Start time
  • B. Capacity
  • C. Activity name
  • D. Submitter
  • E. Item type
  • F. Job type
  • G. Location
Correct Answer:
G. Location
Question 8

You have a Fabric workspace that contains a lakehouse and a semantic model named Model1. You use a notebook named Notebook1 to ingest and transform data from an external data source. You need to execute Notebook1 as part of a data pipeline named Pipeline1. The process must meet the following requirements: • Run daily at 07:00 AM UTC. • Attempt to retry Notebook1 twice if the notebook fails. • After Notebook1 executes successfully, refresh Model1. Which three actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

  • A. Place the Semantic model refresh activity after the Notebook activity and link the activities by using the On success condition
  • B. From the Schedule settings of Pipeline1, set the time zone to UTC
  • C. Set the Retry setting of the Notebook activity to 2
  • D. From the Schedule settings of Notebook1, set the time zone to UTC
  • E. Set the Retry setting of the Semantic model refresh activity to 2
  • F. Place the Semantic model refresh activity after the Notebook activity and link the activities by using an On completion condition
Correct Answer:
  • A. Place the Semantic model refresh activity after the Notebook activity and link the activities by using the On success condition
  • B. From the Schedule settings of Pipeline1, set the time zone to UTC
  • C. Set the Retry setting of the Notebook activity to 2
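
Two details behind this answer are easy to miss: a Retry setting of 2 means up to three total attempts, and an On success condition runs the downstream activity only after the upstream one ultimately succeeds. A hypothetical Python model of that behavior (the activity and counter names are invented for illustration):

```python
def run_with_retry(activity, retries):
    """Model of a pipeline activity's Retry setting:
    retries=2 allows up to 3 attempts in total."""
    attempts = 0
    while True:
        attempts += 1
        try:
            return activity(), attempts
        except Exception:
            if attempts > retries:
                raise  # retries exhausted; the activity fails

calls = {"n": 0}

def flaky_notebook():
    # Hypothetical notebook run that fails twice, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result, attempts = run_with_retry(flaky_notebook, retries=2)
# Only now, after the notebook ultimately succeeded, would an
# On success branch trigger the semantic model refresh activity.
refresh_triggered = result == "ok"
```

If an On completion condition were used instead, the refresh would also run after a final failure, which violates the stated requirement.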
Question 9

You have a Fabric workspace that contains a semantic model named Model1. You need to dynamically execute and monitor the refresh progress of Model1. What should you use?

  • A. dynamic management views in Microsoft SQL Server Management Studio (SSMS)
  • B. Monitoring hub
  • C. dynamic management views in Azure Data Studio
  • D. a semantic link in a notebook
Correct Answer:
D. a semantic link in a notebook
Question 10

You have a Fabric workspace named Workspace1 that contains a notebook named Notebook1. In Workspace1, you create a new notebook named Notebook2. You need to ensure that you can attach Notebook2 to the same Apache Spark session as Notebook1. What should you do?

  • A. Enable high concurrency for notebooks
  • B. Enable dynamic allocation for the Spark pool
  • C. Change the runtime version
  • D. Increase the number of executors
Correct Answer:
A. Enable high concurrency for notebooks
Question 11

You are developing a data pipeline named Pipeline1. You need to add a Copy data activity that will copy data from a Snowflake data source to a Fabric warehouse. What should you configure?

  • A. Degree of copy parallelism
  • B. Fault tolerance
  • C. Enable staging
  • D. Enable logging
Correct Answer:
C. Enable staging
Question 12

You have a Fabric workspace that contains a lakehouse and a notebook named Notebook1. Notebook1 reads data into a DataFrame from a table named Table1 and applies transformation logic. The data from the DataFrame is then written to a new Delta table named Table2 by using a merge operation. You need to consolidate the underlying Parquet files in Table1. Which command should you run?

  • A. VACUUM
  • B. BROADCAST
  • C. OPTIMIZE
  • D. CACHE
Correct Answer:
C. OPTIMIZE
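
OPTIMIZE compacts a Delta table: it rewrites the many small Parquet files that frequent merge operations leave behind into fewer, larger files (the actual command is simply `OPTIMIZE Table1` run against the lakehouse). A loose conceptual sketch of the compaction idea; the file sizes and the target below are made up, though a target around 1 GB is roughly in line with Delta defaults:

```python
def compact(file_sizes_mb, target_mb):
    """Greedy sketch of what OPTIMIZE does conceptually: bin-pack
    many small files into fewer output files near a target size."""
    output_files, current = [], []
    for size in file_sizes_mb:
        if current and sum(current) + size > target_mb:
            output_files.append(current)
            current = []
        current.append(size)
    if current:
        output_files.append(current)
    return output_files

# Hypothetical small-file sizes (MB) left behind by repeated merges.
small_files = [64, 128, 32, 256, 512, 900, 100]
compacted = compact(small_files, target_mb=1024)
```

Note that VACUUM, by contrast, only deletes files no longer referenced by the table; it does not consolidate the active ones.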
Question 13

You have a Fabric workspace that contains a warehouse named Warehouse1. You have an on-premises Microsoft SQL Server database named Database1 that is accessed by using an on-premises data gateway. You need to copy data from Database1 to Warehouse1. Which item should you use?

  • A. a data pipeline
  • B. an Apache Spark job definition
  • C. a streaming dataflow
  • D. a notebook
Correct Answer:
A. a data pipeline
Question 14

You have a Fabric deployment pipeline that uses three workspaces named Dev, Test, and Prod. You need to deploy an Eventhouse as part of the deployment process. What should you use to add the Eventhouse to the deployment process?

  • A. an Azure DevOps pipeline
  • B. an eventstream
  • C. GitHub Actions
Correct Answer:
A. an Azure DevOps pipeline
Question 15

You have a Fabric warehouse named DW1. DW1 contains a table that stores sales data and is used by multiple sales representatives. You plan to implement row-level security (RLS). You need to ensure that the sales representatives can see only their respective data. Which warehouse object do you require to implement RLS?

  • A. TRIGGER
  • B. SCHEMA
  • C. FUNCTION
  • D. DATABASE ROLE
Correct Answer:
C. FUNCTION

Aced these? Get the Full Exam

Download the complete DP-700 study bundle with 51+ questions in a single printable PDF.