A.
The passage of convolutions backward through a neural network to update weights and biases
B.
The passage of accuracy backward through a neural network to update weights and biases
C.
The passage of nodes backward through a neural network to update weights and biases
D.
The passage of errors backward through a neural network to update weights and biases
The Answer Is:
D
This question includes an explanation.
Explanation:
Backpropagation (short for "backward propagation of errors") is the fundamental algorithm for training neural networks. It computes the error at the output layer and propagates it backward through the network, updating weights and biases via gradient descent.
Why the other options are incorrect:
A: Convolutions are specific to CNNs and are not propagated in this manner.
B: Accuracy is an evaluation metric, not used in weight updates.
C: Nodes are structural elements, not passed backward.
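The backward pass described above can be sketched in NumPy for a tiny one-hidden-layer network. This is a minimal illustration, not code from the study guide: the network sizes, learning rate, and sigmoid/MSE choices are assumptions made for brevity. Note how the quantity flowing backward is the error signal, not accuracy, convolutions, or nodes.

```python
import numpy as np

# Illustrative sketch: backpropagation for a 2-4-1 network with sigmoid
# activations and mean-squared-error loss. Sizes and hyperparameters are
# arbitrary choices for this example.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: XOR inputs and targets
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Weights and biases for one hidden layer
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

lr = 0.5
losses = []
for _ in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)        # hidden-layer activations
    out = sigmoid(h @ W2 + b2)      # network output
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: the ERROR is propagated from output toward input
    d_out = 2 * (out - y) / len(X) * out * (1 - out)  # error at output layer
    d_h = (d_out @ W2.T) * h * (1 - h)                # error at hidden layer

    # Gradient-descent updates of weights and biases
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

After training, the loss should drop substantially from its initial value, showing that the backward-propagated error drives the weight and bias updates.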
Official References:
CompTIA DataX (DY0-001) Official Study Guide, Section 4.3: "Backpropagation passes the error backward from the output layer to the input layer to adjust weights using gradient-based optimization."
Deep Learning Textbook, Chapter 6: "The backpropagation algorithm is essential for computing gradients of the loss function with respect to each weight."