AI: post transformers

DyNN-Offload: Efficient Memory for Dynamic Neural Networks



This document introduces DyNN-Offload, a novel memory management system designed to overcome the GPU memory limitations encountered when training large dynamic neural networks (DyNNs). Unlike traditional methods that struggle with DyNNs' unpredictable memory access patterns, DyNN-Offload takes a learned approach, using a lightweight "pilot model" to predict tensor access orders. By encoding network operations in an idiom-based representation, the pilot model efficiently guides the migration of tensors between CPU and GPU memory, enabling the training of significantly larger DyNNs on a single GPU. The system outperforms existing solutions such as unified virtual memory (UVM) and dynamic tensor rematerialization (DTR) while introducing minimal overhead, and its transparent integration with existing deep learning frameworks makes it a practical path toward large-scale DyNN development.
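To make the mechanism concrete, here is a minimal PyTorch sketch of the general pattern described above: a lightweight pilot predictor estimates which tensors will be needed next, and a manager prefetches them from pinned CPU memory to the GPU and evicts them afterward. This is not the paper's code; every class, method, and heuristic below (PilotModel, OffloadManager, the recency-based prediction stub) is an assumption for illustration only.

```python
# Hypothetical sketch of the DyNN-Offload idea, not the authors' implementation:
# a lightweight "pilot" predictor guesses which tensors will be accessed next,
# and a manager moves them between pinned CPU memory and the GPU ahead of time.
import torch


class PilotModel:
    """Predicts the next-k tensor IDs from a compact 'idiom' encoding of
    recent operations (stub: a real pilot model would be a small learned
    network; here we assume temporal locality as a placeholder heuristic)."""
    def __init__(self, k=2):
        self.k = k
        self.history = []

    def observe(self, idiom_id, tensor_ids):
        # Record which tensors the last few operations touched.
        self.history.append((idiom_id, tensor_ids))

    def predict_next(self):
        # Return tensor IDs seen in the k most recent operations.
        return [tid for _, tids in self.history[-self.k:] for tid in tids]


class OffloadManager:
    """Keeps tensors in pinned CPU memory and prefetches the ones the
    pilot model expects to be used next onto the GPU."""
    def __init__(self, pilot):
        self.pilot = pilot
        self.cpu_store = {}   # tensor_id -> pinned CPU tensor
        self.gpu_cache = {}   # tensor_id -> GPU tensor
        self.stream = torch.cuda.Stream()

    def register(self, tensor_id, tensor):
        # Keep the master copy in pinned host memory for fast async transfer.
        self.cpu_store[tensor_id] = tensor.cpu().pin_memory()

    def prefetch(self):
        # Asynchronously copy predicted tensors to the GPU on a side stream.
        with torch.cuda.stream(self.stream):
            for tid in self.pilot.predict_next():
                if tid in self.cpu_store and tid not in self.gpu_cache:
                    self.gpu_cache[tid] = self.cpu_store[tid].to(
                        "cuda", non_blocking=True)

    def fetch(self, tensor_id):
        # Use the prefetched copy if available; otherwise copy on demand.
        torch.cuda.current_stream().wait_stream(self.stream)
        if tensor_id not in self.gpu_cache:
            self.gpu_cache[tensor_id] = self.cpu_store[tensor_id].to("cuda")
        return self.gpu_cache[tensor_id]

    def evict(self, tensor_id):
        # Drop the GPU copy to free memory; the pinned CPU copy remains.
        self.gpu_cache.pop(tensor_id, None)
```

In this sketch the pilot prediction is only a recency heuristic; the point is the division of labor: the predictor supplies an expected access order, and the manager overlaps CPU-to-GPU transfers with computation so tensors are resident before they are needed.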


Source: 2024 - https://web.cs.ucla.edu/~harryxu/papers/ren-hpca24.pdf - Enabling Large Dynamic Neural Network Training with Learning-based Memory Management


By mcgrof