caching - PyTorch: the cache is constantly increasing - Stack Overflow

Date: 2025-01-06

During PyTorch training, GPU memory usage keeps increasing, yet no "out of memory" error is raised; printing torch.cuda.memory_reserved() shows that the cache keeps growing. As training progresses, training gets slower and slower.
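A common cause of steadily growing reserved memory is accumulating a loss tensor across iterations instead of a Python float, which keeps every step's autograd graph alive. The sketch below is a minimal, hypothetical training loop (the model, data, and helper names are assumptions, not from the question) showing the safe pattern and how to watch torch.cuda.memory_reserved():

```python
import torch

# Toy model and optimizer; stand-ins for whatever the asker trains.
model = torch.nn.Linear(10, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.01)

running_loss = 0.0
for step in range(5):
    x = torch.randn(8, 10)
    y = torch.randn(8, 1)
    loss = torch.nn.functional.mse_loss(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    # Accumulate a plain float. Writing `running_loss += loss` instead
    # would retain each step's computation graph and grow memory.
    running_loss += loss.item()

    if torch.cuda.is_available():
        # Bytes held (cached) by the CUDA caching allocator.
        print(step, torch.cuda.memory_reserved())
```

If reserved memory still climbs with this pattern, the next suspects are lists of tensors (predictions, metrics) that are appended without .detach() or .cpu().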

Why training slows down even when torch.cuda.empty_cache() is called, and how to fix it
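torch.cuda.empty_cache() only returns cached, unused blocks to the driver; it does not free tensors that Python still references, and calling it every iteration forces later allocations through slow cudaMalloc calls, which itself slows training. A hedged diagnostic sketch (the helper name report_cuda_memory is hypothetical) that distinguishes live allocations from cache:

```python
import torch

def report_cuda_memory(tag=""):
    """Print allocator stats; hypothetical helper for diagnosis only."""
    if not torch.cuda.is_available():
        print(f"{tag}: CUDA not available")
        return
    # allocated = bytes in live tensors; reserved = allocated + cache.
    print(f"{tag}: allocated={torch.cuda.memory_allocated()} "
          f"reserved={torch.cuda.memory_reserved()}")
    # Use sparingly: releasing the cache makes the next allocations
    # pay the cudaMalloc cost again.
    torch.cuda.empty_cache()

report_cuda_memory("after step")
```

If memory_allocated() itself keeps growing, the leak is on the Python side (retained graphs or tensor lists) and empty_cache() cannot fix it.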
