1 July, 19:06

Suppose you use Batch Gradient Descent to train a neural network and you plot the training error at every epoch. If you notice that the training error consistently goes up, what is likely going on? How can you fix this?

+4
Answers (1)
  1. 1 July, 22:22
    0
    The answer is: the learning rate is too high.

    Explanation:

    With Batch Gradient Descent the whole training set is used at every step, so the training error should decrease at every epoch if the learning rate is appropriate. If the training error consistently goes up instead, the learning rate is almost certainly too high and the algorithm is diverging; the fix is to reduce the learning rate. See the sketch below.
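
    A minimal sketch of that behavior (plain NumPy on synthetic linear-regression data; all names and values are illustrative, not a fixed recipe): with a learning rate above the stable threshold the per-epoch training error grows, and reducing it makes the error fall again.

    ```python
    # Minimal sketch: full-batch gradient descent on synthetic linear
    # regression, comparing the per-epoch training error for a learning
    # rate that is too high vs. one that is small enough.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.c_[np.ones(100), rng.normal(size=100)]  # bias column + 1 feature
    y = 4 + 3 * X[:, 1] + rng.normal(scale=0.5, size=100)

    def batch_gd(lr, epochs=10):
        """Run Batch Gradient Descent on the MSE; return the error per epoch."""
        theta = np.zeros(2)
        errors = []
        for _ in range(epochs):
            residual = X @ theta - y
            errors.append((residual ** 2).mean())          # training error this epoch
            theta -= lr * (2 / len(y)) * (X.T @ residual)  # gradient of the MSE
        return errors

    print(batch_gd(lr=1.5))  # too high: error grows every epoch (diverges)
    print(batch_gd(lr=0.1))  # reduced: error shrinks every epoch (converges)
    ```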

    Also make sure you are really plotting the training error and not the validation error. The validation error can rise even while the training error keeps falling; in that case the model is overfitting the training set, and you should stop training (early stopping), as in the second sketch below.
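
    And a hedged sketch of early stopping under the same kind of setup (hypothetical synthetic data and hyperparameters): track the validation error every epoch, keep the best parameters seen so far, and stop once the validation error has failed to improve for a while.

    ```python
    # Minimal early-stopping sketch on synthetic data; every name and
    # hyperparameter here is illustrative.
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 20))
    y = X @ rng.normal(size=20) + rng.normal(scale=2.0, size=200)
    X_tr, y_tr = X[:100], y[:100]    # training split
    X_val, y_val = X[100:], y[100:]  # held-out validation split

    theta = np.zeros(20)
    best_err, best_theta = np.inf, theta.copy()
    patience, bad_epochs = 10, 0
    for epoch in range(500):
        grad = (2 / len(y_tr)) * (X_tr.T @ (X_tr @ theta - y_tr))
        theta -= 0.05 * grad  # one full-batch gradient descent step
        val_err = ((X_val @ theta - y_val) ** 2).mean()
        if val_err < best_err:  # validation improved: remember this model
            best_err, best_theta, bad_epochs = val_err, theta.copy(), 0
        else:                   # validation got worse: be patient, then stop
            bad_epochs += 1
            if bad_epochs >= patience:
                break
    print(f"stopped after epoch {epoch}; best validation MSE = {best_err:.3f}")
    ```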