> [!META]- Inline Metadata
> [status:: boat]
> [source:: ]
> [tags:: #note/evergreen #state/boat #concepts/programming/machine-learning/distributed-training ]
> [up:: [[Machine Learning MOC]]]
> [same:: [[Model Parallelism]]]
Data parallelism splits the training data across devices: each device holds a full replica of the model, computes gradients on its own shard of the batch, and the gradients are averaged (all-reduced) so every replica applies the same synchronized weight update. This differs from [[Model Parallelism]], in which the model itself is split across different GPUs.
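
A minimal NumPy sketch of the idea (the worker split and "all-reduce" here are simulated in one process, not a real distributed API): each worker keeps a full copy of the weights, computes a gradient on its shard, and the averaged gradient drives one shared update.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))          # full batch: 8 examples, 3 features
y = X @ np.array([1.0, -2.0, 0.5])   # targets from a known linear model

w = np.zeros(3)                      # model weights, replicated on every worker

def grad(w, X_shard, y_shard):
    """Mean-squared-error gradient on one worker's data shard."""
    err = X_shard @ w - y_shard
    return 2 * X_shard.T @ err / len(y_shard)

# Split the batch across 4 simulated workers; each computes a local gradient
# on its own shard using the identical weight copy.
shards = zip(np.array_split(X, 4), np.array_split(y, 4))
local_grads = [grad(w, Xs, ys) for Xs, ys in shards]

# Simulated "all-reduce": average per-worker gradients, then every replica
# applies the same update, so all model copies stay identical.
g = np.mean(local_grads, axis=0)
w -= 0.1 * g
```

With equal-sized shards, the averaged per-worker gradient is exactly the full-batch gradient, which is why data parallelism is (up to communication cost) equivalent to training on the whole batch on one device.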