
A Data Engineer is building a simple data pipeline using Lakeflow Declarative Pipelines (LDP) in Databricks to ingest customer data. The raw customer data is stored in a cloud storage location in JSON format. The task is to create a pipeline dataset that reads the raw JSON data and writes it into a Delta table for further processing.

Which code snippet will correctly ingest the raw JSON data and create a Delta table using LDP?

A.

    import dlt

    @dlt.table
    def raw_customers():
        return spark.read.format("csv").load("s3://my-bucket/raw-customers/")

B.

    import dlt

    @dlt.table
    def raw_customers():
        return spark.read.json("s3://my-bucket/raw-customers/")

C.

    import dlt

    @dlt.table
    def raw_customers():
        return spark.read.format("parquet").load("s3://my-bucket/raw-customers/")

D.

    import dlt

    @dlt.view
    def raw_customers():
        return spark.format.json("s3://my-bucket/raw-customers/")
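For reference, only option B pairs a table-producing decorator with a reader that matches the JSON source: A and C read the wrong formats (CSV and Parquet), and D calls `spark.format.json`, which does not exist, while `@dlt.view` would not materialize a Delta table. A minimal sketch of the correct pattern is below; it runs only inside a Databricks pipeline, where `spark` and the `dlt` module are provided by the runtime, and the bucket path is the question's placeholder:

    import dlt

    @dlt.table(comment="Raw customer records ingested from JSON in cloud storage")
    def raw_customers():
        # spark.read.json() parses the JSON files and infers a schema;
        # the @dlt.table decorator materializes the result as a managed Delta table.
        return spark.read.json("s3://my-bucket/raw-customers/")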

Databricks-Certified-Professional-Data-Engineer