A Generative AI Engineer has developed an LLM application to answer questions about internal company policies. The Generative AI Engineer must ensure that the application doesn't hallucinate or leak confidential data.

Which approach should NOT be used to mitigate hallucination or confidential data leakage?

A. Add guardrails to filter outputs from the LLM before they are shown to the user

B. Fine-tune the model on your data, hoping it will learn what is appropriate and what is not

C. Limit the data available based on the user's access level

D. Use a strong system prompt to ensure the model aligns with your needs
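For context on how options A, C, and D translate into concrete controls, below is a minimal Python sketch, not any particular library's API. The helper names (retrieve_documents, output_guardrail, answer_question) and the role/tier scheme are illustrative assumptions; the LLM call is passed in as a plain function so nothing vendor-specific is implied.

```python
import re

# Hypothetical mapping of user roles to the document tiers they may see.
ROLE_ACCESS = {
    "employee": {"public"},
    "hr": {"public", "restricted"},
}

# Toy policy corpus; in practice this would be a document index or vector store.
DOCUMENTS = [
    {"text": "Vacation policy: 20 days per year.", "tier": "public"},
    {"text": "Executive compensation bands.", "tier": "restricted"},
]

# Example output-filter rules; real guardrails would be more sophisticated.
CONFIDENTIAL_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # SSN-like numbers
    re.compile(r"compensation", re.IGNORECASE),   # example keyword rule
]


def retrieve_documents(question: str, role: str) -> list[str]:
    """Option C: only surface documents the user's role is allowed to see."""
    allowed = ROLE_ACCESS.get(role, {"public"})
    return [d["text"] for d in DOCUMENTS if d["tier"] in allowed]


def output_guardrail(answer: str) -> str:
    """Option A: filter the LLM output before it reaches the user."""
    for pattern in CONFIDENTIAL_PATTERNS:
        if pattern.search(answer):
            return "I'm unable to share that information."
    return answer


def answer_question(question: str, role: str, call_llm) -> str:
    """Combine the controls around a generic LLM call."""
    context = "\n".join(retrieve_documents(question, role))
    # Option D: a strong system prompt constraining the model to the context.
    system_prompt = (
        "Answer only from the provided policy excerpts. "
        "If the answer is not in the excerpts, say you don't know."
    )
    raw_answer = call_llm(system_prompt, context, question)
    return output_guardrail(raw_answer)
```

Layering these controls is complementary: access-level filtering limits what the model can see, the system prompt constrains how it answers, and the output guardrail is a final check before anything reaches the user.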
