> [!META]- Inline Metadata
> [status:: boat]
> [source:: ]
> [tags:: #note/evergreen #state/boat #concepts/programming/machine-learning/distributed-training ]
> [up:: [[Machine Learning MOC]]]

Model parallelism is a distributed training strategy that splits the model itself across different GPUs: each GPU holds only a partition of the model's layers or parameters, and every batch of data flows through all the partitions in turn. This is the inverse of data parallelism, which replicates the whole model on each GPU and splits the data instead.
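A minimal sketch of the idea, using NumPy and two hypothetical "devices" (the partition sizes and layer shapes here are illustrative, not from any particular framework):

```python
import numpy as np

rng = np.random.default_rng(0)

# Partition 0 (imagine "GPU 0"): holds the first layer's weights, 4 -> 8.
w0 = rng.standard_normal((4, 8))
# Partition 1 (imagine "GPU 1"): holds the second layer's weights, 8 -> 2.
w1 = rng.standard_normal((8, 2))

def forward(x):
    # Stage 1 computes on partition 0.
    h = np.maximum(x @ w0, 0.0)  # linear layer + ReLU
    # In real model parallelism, the activation tensor `h` would be
    # transferred from GPU 0 to GPU 1 at this point.
    # Stage 2 computes on partition 1.
    return h @ w1

# Every sample in the batch visits both partitions; no GPU ever
# holds the full set of weights.
batch = rng.standard_normal((16, 4))
out = forward(batch)
print(out.shape)  # (16, 2)
```

Note the handoff in the middle of `forward`: the inter-device activation transfer is the communication cost that model parallelism pays, in contrast to data parallelism's gradient all-reduce.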