A customer mentions that the ML team wants to avoid overfitting models. What does this mean?
A.
The team wants to avoid wasting resources on training models with poorly selected hyperparameters.
B.
The team wants to spend less time on creating the code for models and more time training models.
C.
The team wants to avoid training models to the point where they perform less well on new data.
D.
The team wants to spend less time figuring out which CPUs are available for training models.
The Answer Is:
C
Explanation:
Overfitting occurs when a model is fit so closely to its training data that it memorizes noise and idiosyncrasies rather than learning generalizable patterns. Such a model performs very well on the training data but poorly on new data, because it cannot generalize what it has learned. To avoid overfitting, the ML team needs to ensure their models are not overtrained on the training data and retain enough generalization capacity to perform well on unseen data.
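To make the idea concrete, here is a minimal, illustrative sketch (not part of the original question) using scikit-learn: an unconstrained decision tree scores nearly perfectly on its training data but noticeably worse on held-out data, while a depth-limited tree narrows that gap.

```python
# Illustrative sketch of overfitting: compare training vs. held-out accuracy.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# A small, noisy dataset makes overfitting easy to provoke.
X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           flip_y=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5,
                                                    random_state=0)

# An unconstrained tree effectively memorizes the training data (overfits).
overfit = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Limiting depth regularizes the model so it generalizes better.
regularized = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)

for name, model in [("unconstrained", overfit), ("max_depth=3", regularized)]:
    print(name,
          "train:", round(model.score(X_train, y_train), 2),
          "test:", round(model.score(X_test, y_test), 2))
```

A large gap between training and test accuracy is the practical symptom of overfitting that the team wants to avoid.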