
DP-700 Questions and Answers

Question # 6

You need to recommend a solution to resolve the MAR1 connectivity issues. The solution must minimize development effort. What should you recommend?

A. Add a ForEach activity to the data pipeline.
B. Configure retries for the Copy data activity.
C. Configure Fault tolerance for the Copy data activity.
D. Call a notebook from the data pipeline.

Question # 7

You need to ensure that usage of the data in the Amazon S3 bucket meets the technical requirements.

What should you do?

A. Create a workspace identity and enable high concurrency for the notebooks.
B. Create a shortcut and ensure that caching is disabled for the workspace.
C. Create a workspace identity and use the identity in a data pipeline.
D. Create a shortcut and ensure that caching is enabled for the workspace.

Question # 8

You need to ensure that the data analysts can access the gold layer lakehouse.

What should you do?

A. Add the DataAnalyst group to the Viewer role for WorkspaceA.
B. Share the lakehouse with the DataAnalysts group and grant the Build reports on the default semantic model permission.
C. Share the lakehouse with the DataAnalysts group and grant the Read all SQL Endpoint data permission.
D. Share the lakehouse with the DataAnalysts group and grant the Read all Apache Spark permission.

Question # 9

You are building a Fabric notebook named MasterNotebook1 in a workspace. MasterNotebook1 contains the following code.

You need to ensure that the notebooks are executed in the following sequence:

1. Notebook_03

2. Notebook_01

3. Notebook_02

Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

A. Split the Directed Acyclic Graph (DAG) definition into three separate definitions.
B. Change the concurrency to 3.
C. Move the declaration of Notebook_03 to the top of the Directed Acyclic Graph (DAG) definition.
D. Move the declaration of Notebook_02 to the bottom of the Directed Acyclic Graph (DAG) definition.
E. Add dependencies to the execution of Notebook_02.
F. Add dependencies to the execution of Notebook_03.
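A minimal sketch of how the required ordering could be expressed, assuming the notebook uses a `runMultiple`-style DAG definition (the `timeoutPerCellInSeconds` value and paths are assumptions, not taken from the question's code):

```python
# Hypothetical DAG for running child notebooks in order: dependencies,
# not declaration order, force Notebook_03 -> Notebook_01 -> Notebook_02.
dag = {
    "activities": [
        {"name": "Notebook_03", "path": "Notebook_03", "dependencies": []},
        {"name": "Notebook_01", "path": "Notebook_01",
         "dependencies": ["Notebook_03"]},
        {"name": "Notebook_02", "path": "Notebook_02",
         "dependencies": ["Notebook_01"]},
    ],
    "timeoutPerCellInSeconds": 90,  # assumed value
}

# Inside a Fabric notebook this would be submitted as:
# mssparkutils.notebook.runMultiple(dag)
```

Because each activity lists its predecessor in `dependencies`, the runtime cannot start Notebook_01 before Notebook_03 finishes, regardless of concurrency settings.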

Question # 10

HOTSPOT

You have a Fabric workspace that contains two lakehouses named Lakehouse1 and Lakehouse2. Lakehouse1 contains staging data in a Delta table named Orderlines. Lakehouse2 contains a Type 2 slowly changing dimension (SCD) dimension table named Dim_Customer.

You need to build a query that will combine data from Orderlines and Dim_Customer to create a new fact table named Fact_Orders. The new table must meet the following requirements:

Enable the analysis of customer orders based on historical attributes.

Enable the analysis of customer orders based on the current attributes.

How should you complete the statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.
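A minimal sketch of the Type 2 SCD join the question describes, using hypothetical columns and sample rows (the real Orderlines and Dim_Customer schemas are not shown). Storing the surrogate key valid at order time preserves historical attributes, while the business key can always be re-joined to the current dimension row for current attributes:

```python
from datetime import date

# Hypothetical Type 2 dimension: (surrogate_key, customer_id, city,
# valid_from, valid_to, is_current)
dim_customer = [
    (1, "C1", "London", date(2020, 1, 1), date(2023, 6, 30), False),
    (2, "C1", "Paris",  date(2023, 7, 1), date(9999, 12, 31), True),
]

# Hypothetical staging rows: (order_id, customer_id, order_date)
orderlines = [
    ("O1", "C1", date(2022, 5, 1)),
    ("O2", "C1", date(2024, 2, 1)),
]

fact_orders = []
for order_id, cust_id, order_date in orderlines:
    for sk, cid, city, valid_from, valid_to, _ in dim_customer:
        # Range predicate picks the dimension row in effect at order time,
        # so the surrogate key pins the attributes as of that date.
        if cid == cust_id and valid_from <= order_date <= valid_to:
            fact_orders.append((order_id, sk, cust_id, order_date))
```

Order O1 resolves to surrogate key 1 (the London-era row) and O2 to surrogate key 2; keeping `customer_id` in the fact table still allows a join to the `is_current` row for current-attribute analysis.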

Question # 11

You have a Fabric deployment pipeline that uses three workspaces named Dev, Test, and Prod.

You need to deploy an eventhouse as part of the deployment process.

What should you use to add the eventhouse to the deployment process?

A. GitHub Actions
B. a deployment pipeline
C. an Azure DevOps pipeline

Question # 12

You have a Fabric workspace that contains a warehouse named Warehouse1. Warehouse1 contains a table named Customer. Customer contains the following data.

You have an internal Microsoft Entra user named User1 that has an email address of user1@contoso.com.

You need to provide User1 with access to the Customer table. The solution must prevent User1 from accessing the CreditCard column.

How should you complete the statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.
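One way to express this in T-SQL is a column-scoped GRANT that lists only the permitted columns, sketched here as a generated statement. The column names are hypothetical, since the Customer table's data is not reproduced above:

```python
# Hypothetical permitted columns -- everything except CreditCard.
columns = ["CustomerId", "Name", "Email"]

# Column-level security in a warehouse: granting SELECT on an explicit
# column list leaves the omitted CreditCard column inaccessible to User1.
grant = (
    "GRANT SELECT ON dbo.Customer ("
    + ", ".join(columns)
    + ") TO [user1@contoso.com];"
)
print(grant)
```

The equivalent effect can also be reached by granting SELECT on the whole table and issuing a `DENY SELECT` on the CreditCard column alone.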

Question # 13

You have a Fabric workspace that contains an eventhouse named Eventhouse1.

In Eventhouse1, you plan to create a table named DeviceStreamData in a KQL database. The table will contain data based on the following sample.

Question # 14

You have a Fabric workspace named Workspace1 that contains the items shown in the following table.

For Model1, the Keep your Direct Lake data up to date option is disabled.

You need to configure the execution of the items to meet the following requirements:

Notebook1 must execute every weekday at 8:00 AM.

Notebook2 must execute when a file is saved to an Azure Blob Storage container.

Model1 must refresh when Notebook1 has executed successfully.

How should you orchestrate each item? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Question # 15

You have an Azure SQL database named DB1.

In a Fabric workspace, you deploy an eventstream named EventStreamDB1 to stream record changes from DB1 into a lakehouse.

You discover that events are NOT being propagated to EventStreamDB1.

You need to ensure that the events are propagated to EventStreamDB1.

What should you do?

A. Create a read-only replica of DB1.
B. Create an Azure Stream Analytics job.
C. Enable Extended Events for DB1.
D. Enable change data capture (CDC) for DB1.
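For context, an eventstream CDC source can only observe changes once CDC is enabled on the database and on each tracked table. A sketch of the required T-SQL, held here as a string; the schema and table name are assumptions:

```python
# Hypothetical T-SQL to enable CDC on an Azure SQL database and one table.
# sp_cdc_enable_db runs once per database; sp_cdc_enable_table runs per
# table whose record changes should be streamed.
enable_cdc = """
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',  -- assumed table name
    @role_name     = NULL;
"""
print(enable_cdc)
```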

Question # 16

You have a Fabric workspace that contains an eventhouse and a KQL database named Database1. Database1 has the following:

A table named Table1

A table named Table2

An update policy named Policy1

Policy1 sends data from Table1 to Table2.

The following is a sample of the data in Table2.

Recently, the following actions were performed on Table1:

An additional element named temperature was added to the StreamData column.

The data type of the Timestamp column was changed to date.

The data type of the DeviceId column was changed to string.

You plan to load additional records to Table2.

Which two records will load from Table1 to Table2? Each correct answer presents a complete solution.

NOTE: Each correct selection is worth one point.

A)

B)

C)

D)

A. Option A
B. Option B
C. Option C
D. Option D

Question # 17

You have a Fabric workspace that contains an eventstream named EventStream1. EventStream1 outputs events to a table named Table1 in a lakehouse. The streaming data is sourced from motorway sensors and represents the speed of cars.

You need to add a transformation to EventStream1 to average the car speeds. The speeds must be grouped by non-overlapping and contiguous time intervals of one minute. Each event must belong to exactly one window.

Which windowing function should you use?

A. sliding
B. hopping
C. tumbling
D. session
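The tumbling-window behavior the question describes can be sketched in plain Python: integer division assigns each event to exactly one fixed, non-overlapping one-minute bucket. The sample events (epoch-second timestamps, speeds) are hypothetical:

```python
from collections import defaultdict

# Hypothetical sensor events: (timestamp_in_seconds, speed)
events = [
    (0, 100.0), (30, 110.0),   # fall in the window starting at t=0
    (65, 90.0), (90, 70.0),    # window starting at t=60
    (125, 60.0),               # window starting at t=120
]

def tumbling_average(events, window_seconds=60):
    """Average values over non-overlapping, contiguous windows."""
    buckets = defaultdict(list)
    for ts, speed in events:
        # Integer division places each event in exactly one window,
        # which is what distinguishes tumbling from hopping/sliding.
        buckets[ts // window_seconds * window_seconds].append(speed)
    return {start: sum(v) / len(v) for start, v in sorted(buckets.items())}

print(tumbling_average(events))
# → {0: 105.0, 60: 80.0, 120: 60.0}
```

A hopping or sliding window would let one event contribute to several overlapping windows, and a session window has no fixed length, so none of those meet the "exactly one window" requirement.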

Question # 18

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a KQL database that contains two tables named Stream and Reference. Stream contains streaming data in the following format.

Reference contains reference data in the following format.

Both tables contain millions of rows.

You have the following KQL queryset.

You need to reduce how long it takes to run the KQL queryset.

Solution: You add the make_list() function to the output columns.

Does this meet the goal?

A. Yes
B. No

Question # 19

You have a Fabric workspace that contains a warehouse named DW1. DW1 is loaded by using a notebook named Notebook1.

You need to identify which version of Delta was used when Notebook1 was executed.

What should you use?

A. Real-Time hub
B. OneLake data hub
C. the Admin monitoring workspace
D. Fabric Monitor
E. the Microsoft Fabric Capacity Metrics app

Question # 20

You need to create the product dimension.

How should you complete the Apache Spark SQL code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Question # 21

You need to ensure that the data engineers are notified if any step in populating the lakehouses fails. The solution must meet the technical requirements and minimize development effort.

What should you use? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Question # 22

You need to resolve the sales data issue. The solution must minimize the amount of data transferred.

What should you do?

A. Split the dataflow into two dataflows.
B. Configure scheduled refresh for the dataflow.
C. Configure incremental refresh for the dataflow. Set Store rows from the past to 1 Month.
D. Configure incremental refresh for the dataflow. Set Refresh rows from the past to 1 Year.
E. Configure incremental refresh for the dataflow. Set Refresh rows from the past to 1 Month.
