Data vs Pipeline Parallelism in LLM Training
Oct 31, 2025 · 1 min read

Implements distributed training methods, including data parallelism and pipeline parallelism, across multiple GPUs.

Last updated on Oct 31, 2025

Tags: Distributed Training, LLM

Author: Wang Ming (Melvin), MSc in Information Engineering
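The two methods the project implements can be contrasted with a minimal sketch. This is a hypothetical, dependency-free toy (a two-"layer" scalar model, with workers and pipeline stages simulated in a loop, and the all-reduce replaced by an in-process mean), not the project's actual multi-GPU implementation:

```python
# Toy contrast of data parallelism vs pipeline parallelism on a
# two-layer scalar "model" y = w2 * (w1 * x), loss = 0.5 * (y - t)^2.
# Hypothetical sketch: workers/stages are simulated sequentially.

def grads(w1, w2, x, t):
    """Analytic gradients of the loss w.r.t. w1 and w2."""
    h = w1 * x          # layer 1 activation
    y = w2 * h          # layer 2 output
    dy = y - t          # dL/dy
    return dy * w2 * x, dy * h  # dL/dw1, dL/dw2

def data_parallel_step(w1, w2, batch, lr=0.1):
    """Data parallelism: every worker holds the FULL model and a shard
    of the batch; per-shard gradients are averaged (the all-reduce)."""
    g1 = g2 = 0.0
    for x, t in batch:                # one "worker" per sample here
        d1, d2 = grads(w1, w2, x, t)  # local backward pass
        g1 += d1
        g2 += d2
    g1 /= len(batch)                  # all-reduce (mean) stand-in
    g2 /= len(batch)
    return w1 - lr * g1, w2 - lr * g2

def pipeline_step(w1, w2, batch, lr=0.1):
    """Pipeline parallelism: layer 1 lives on stage 0, layer 2 on
    stage 1; activations flow forward, gradients flow back."""
    g1 = g2 = 0.0
    for x, t in batch:
        h = w1 * x                    # stage 0 forward, send h ->
        y = w2 * h                    # stage 1 forward
        dy = y - t                    # stage 1 backward
        g2 += dy * h                  # stage 1 weight gradient
        dh = dy * w2                  # <- send activation grad back
        g1 += dh * x                  # stage 0 weight gradient
    g1 /= len(batch)
    g2 /= len(batch)
    return w1 - lr * g1, w2 - lr * g2
```

Both schemes compute mathematically identical updates for a given batch; they differ in what is partitioned across devices (the batch vs the layers) and therefore in what must be communicated (gradients vs activations).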