
Breaking down GPU VRAM consumption

2024/6/29

Machine Learning Tech Brief By HackerNoon

Shownotes

This story was originally published on HackerNoon at: https://hackernoon.com/breaking-down-gpu-vram-consumption. What factors influence VRAM consumption? How does it vary with different model settings? I dug into the topic and ran my own measurements. Check more stories related to machine learning at: https://hackernoon.com/c/machine-learning. You can also check exclusive content about #llms, #vram, #machine-learning, #deep-learning, #gpus, #gpu-vram, #gpus-for-machine-learning, #gpu-optimization, and more.

This story was written by [@furiousteabag](https://hackernoon.com/u/furiousteabag). Learn more about this writer on [@furiousteabag's](https://hackernoon.com/about/furiousteabag) about page, and for more stories, please visit [hackernoon.com](https://hackernoon.com).
        
I’ve always been curious about the GPU VRAM required for training and fine-tuning transformer-based language models. What factors influence VRAM consumption? How does it vary with different model settings? I dug into the topic and ran my own measurements.
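Before measuring, it helps to have a baseline expectation. A widely cited rule of thumb (popularized by the ZeRO paper) is that mixed-precision Adam training needs roughly 16 bytes of VRAM per parameter, before accounting for activations. A minimal sketch of that estimate follows; the function name and the 16-byte default are my own illustrative assumptions, not taken from this episode:

```python
def estimate_train_vram_gib(n_params: float, bytes_per_param: int = 16) -> float:
    """Rough VRAM estimate for mixed-precision Adam training.

    The ~16 bytes/parameter rule of thumb breaks down as:
      fp16 weights (2) + fp16 gradients (2) + fp32 master weights (4)
      + Adam first moment (4) + Adam second moment (4).
    Activations are excluded: they scale with batch size and
    sequence length, not just parameter count.
    """
    return n_params * bytes_per_param / 1024**3


# A 7B-parameter model: ~16 bytes/param for training state alone,
# versus only ~2 bytes/param to merely hold fp16 weights for inference.
training_gib = estimate_train_vram_gib(7e9)                      # ≈ 104 GiB
weights_only_gib = estimate_train_vram_gib(7e9, bytes_per_param=2)  # ≈ 13 GiB
```

This gap between "weights fit on my GPU" and "training state fits on my GPU" is exactly why the factors explored in the episode (precision, optimizer choice, batch size) matter so much in practice.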