Brain Dump

Epoch

Tags
adaptive-intelligence

A hyper-parameter specifying the number of times that a learning algorithm will work through the entire training dataset.

Defined as a single iteration through all the data points in a sample set, processed in one or more batches.

One epoch means that each sample in the training dataset has had an opportunity to update the internal model parameters.

We generally use epochs to specify how many times the training data should be processed before we stop the learning process. This is an alternative to stopping when the error drops below a certain threshold, which it may never actually reach.
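The epoch/batch relationship above can be sketched as a minimal training loop. This is an illustrative example, not any particular library's API: the counter stands in for a hypothetical per-batch parameter update (e.g. one gradient-descent step), and the names are made up for the sketch.

```python
import random

def train(data, num_epochs=3, batch_size=2):
    """Minimal sketch of epoch-based training.

    One epoch = one full pass over `data`, split into batches.
    Each batch gives the model one chance to update its parameters;
    the `updates` counter is a placeholder for that update step.
    """
    updates = 0
    for epoch in range(num_epochs):       # stop after a fixed number of epochs
        random.shuffle(data)              # common practice: reshuffle each epoch
        for start in range(0, len(data), batch_size):
            batch = data[start:start + batch_size]
            updates += 1                  # placeholder for a real model update
    return updates

# 6 samples, batch_size 2 -> 3 batches per epoch; 3 epochs -> 9 updates
print(train(list(range(6))))
```

With 6 samples and a batch size of 2, every sample is seen exactly once per epoch, so after one epoch each sample has had a chance to influence the model parameters, as described above.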

Links to this note