
How to use PyTorch for deep learning

Terms

Epoch

Refers to one complete pass over the entire training set. As you increase the number of epochs, the loss should generally keep decreasing.
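
A minimal sketch of what an epoch looks like in a PyTorch training loop. The data, model, loss function, and hyperparameters here are made-up placeholders for illustration; here every epoch computes the loss over the full training set at once.

```python
import torch
from torch import nn

# Toy data: 100 samples with 3 features each (made up for this example)
X = torch.randn(100, 3)
y = torch.randn(100, 1)

model = nn.Linear(3, 1)
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for epoch in range(20):              # 20 epochs = 20 passes over the data
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)      # loss over the whole training set
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```
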

Batch

A batch is the set of training samples used to compute one gradient update. Using all training samples at once to compute the loss is called batch gradient descent. Using samples one by one is called stochastic gradient descent. Using a subset of the training samples is called mini-batch gradient descent; a sketch of this follows below.
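
A minimal sketch of mini-batch gradient descent using PyTorch's DataLoader, again with made-up toy data and a placeholder linear model. Setting batch_size to the dataset size would correspond to batch gradient descent, batch_size=1 to stochastic gradient descent, and anything in between to mini-batch gradient descent.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Toy data: 100 samples with 3 features each (made up for this example)
X = torch.randn(100, 3)
y = torch.randn(100, 1)
dataset = TensorDataset(X, y)
loader = DataLoader(dataset, batch_size=16, shuffle=True)  # mini-batches of 16

model = nn.Linear(3, 1)
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for epoch in range(5):
    for xb, yb in loader:             # one gradient step per mini-batch
        optimizer.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        optimizer.step()
```
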

  1. Introduction to PyTorch with examples
  2. What is a convolution kernel?
  3. The log-sum-exp trick for numerical stability
  4. Pixel-level image segmentation using U-Net