deepspeedai

Popular repositories

  1. DeepSpeed

    DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. A minimal usage sketch follows this list.

    Python · 38.7k stars · 4.4k forks

  2. DeepSpeedExamples

    Example models using DeepSpeed.

    Python · 6.5k stars · 1.1k forks

  3. Megatron-DeepSpeed (forked from NVIDIA/Megatron-LM)

    Ongoing research on training transformer language models at scale, including BERT and GPT-2.

    Python · 2.1k stars · 353 forks

  4. DeepSpeed-MII

    MII makes low-latency and high-throughput inference possible, powered by DeepSpeed. A minimal usage sketch follows this list.

    Python · 2k stars · 184 forks

  5. DeepSpeed-Kernels

    C++ · 71 stars · 15 forks

  6. deepspeed-gpt-neox (forked from EleutherAI/gpt-neox)

    An implementation of model-parallel autoregressive transformers on GPUs, based on the DeepSpeed library.

    Python · 19 stars · 3 forks
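
To make the first entry more concrete, here is a minimal sketch of how a PyTorch model is typically wrapped with the DeepSpeed library for training. The toy model, batch size, and fp16/ZeRO settings below are illustrative assumptions, not taken from any repository above; scripts like this are normally launched with the `deepspeed` command-line launcher.

```python
# A minimal, illustrative sketch (not taken from the repositories above):
# wrapping a toy PyTorch model with DeepSpeed. The model size, batch size,
# and fp16/ZeRO-2 settings are assumptions chosen only for demonstration.
import torch
import deepspeed

model = torch.nn.Linear(1024, 1024)          # stand-in for a real network
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

ds_config = {
    "train_batch_size": 8,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 2},       # partition optimizer state + gradients
}

# deepspeed.initialize wraps the model and optimizer in an engine that manages
# mixed precision, loss scaling, and distributed data parallelism.
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    optimizer=optimizer,
    config=ds_config,
)

for _ in range(10):
    x = torch.randn(8, 1024, device=engine.device, dtype=torch.half)
    loss = engine(x).float().mean()          # toy "loss" for the sketch
    engine.backward(loss)                    # scaled backward pass
    engine.step()                            # optimizer step + zero_grad
```

The engine returned by deepspeed.initialize handles gradient accumulation, loss scaling, and optimizer-state partitioning internally, which is the convenience the library description above refers to.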
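
Likewise, a rough sketch of DeepSpeed-MII's pipeline interface for low-latency text generation, assuming a recent MII release; the model name, prompts, and generation settings are placeholder assumptions.

```python
# A rough sketch of DeepSpeed-MII's non-persistent pipeline API, assuming a
# recent MII release. The model name, prompts, and max_new_tokens value are
# placeholder assumptions, not taken from the repository description above.
import mii

pipe = mii.pipeline("mistralai/Mistral-7B-v0.1")     # any supported HF checkpoint
responses = pipe(["DeepSpeed is", "Seattle is"], max_new_tokens=64)
for r in responses:
    print(r.generated_text)
```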



