Failing to seed the random number generators in PyTorch can lead to reproducibility issues. Randomness enters at several points: the initialization of network weights, the shuffling of training data, and, during training, stochastic layers such as Dropout. Without fixed seeds, repeated executions of the same code produce different results.
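
A minimal sketch of how this is commonly handled: a helper (here called `set_seed`, a name chosen for illustration) seeds Python's built-in RNG, NumPy, and PyTorch in one place, so that re-running the script from the same seed reproduces the same draws.

```python
import random

import numpy as np
import torch


def set_seed(seed: int) -> None:
    """Seed the RNGs that typical PyTorch training code draws from."""
    random.seed(seed)        # Python's built-in RNG (e.g. list shuffling)
    np.random.seed(seed)     # NumPy RNG (e.g. augmentation pipelines)
    torch.manual_seed(seed)  # PyTorch RNG (CPU, and all CUDA devices)


set_seed(0)
a = torch.randn(3)  # weight-init-style draw
set_seed(0)
b = torch.randn(3)  # same seed, same draw
print(torch.equal(a, b))
```

Note that seeding alone does not guarantee bit-identical results across runs on GPU: some CUDA kernels are nondeterministic, and PyTorch exposes additional switches (such as `torch.use_deterministic_algorithms(True)`) for stricter determinism at a performance cost.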