Framework Layer
Frameworks are where researchers and production teams spend most of their time. AGI progress depends on frameworks that expose power without hiding critical system trade-offs.
Framework Projects
1 of 4 projects complete
Nano Train
Done: Learning-first distributed LLM training framework built around Megatron-style parallelism, covering tensor (TP), pipeline (PP), expert (EP), and data (DP) parallelism, ZeRO-1/2 sharding, and mixed precision on a DeepSeek-style model stack.
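As a rough illustration of how the parallelism dimensions in a Megatron-style setup compose, here is a minimal sketch in Python. `ParallelConfig` and its field names are hypothetical and purely illustrative, not Nano Train's actual API; the only claim it encodes is that the parallel dimensions multiply into the required world size.

```python
from dataclasses import dataclass

@dataclass
class ParallelConfig:
    """Hypothetical config showing how Megatron-style parallel dims compose."""
    tensor_parallel: int = 2     # TP: shard individual weight matrices across GPUs
    pipeline_parallel: int = 4   # PP: split the layer stack into sequential stages
    data_parallel: int = 8       # DP: replicate the model, split the batch
    zero_stage: int = 1          # ZeRO-1/2: shard optimizer state (and gradients) over DP ranks

    def world_size(self) -> int:
        # Total GPUs needed: the parallel dimensions multiply.
        return self.tensor_parallel * self.pipeline_parallel * self.data_parallel

cfg = ParallelConfig()
print(cfg.world_size())  # 2 * 4 * 8 = 64 GPUs
```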
Nano Torch
In Progress: A deep learning framework with a PyTorch-like API.
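To give a sense of what "PyTorch-like API" implies, here is a short reference sketch written against PyTorch itself (Module subclasses, autograd, an explicit optimizer step); it is not Nano Torch code, only the API shape the project is targeting.

```python
import torch
import torch.nn as nn

# The API shape "PyTorch-like" implies: Module subclasses, autograd, explicit train step.
class TinyMLP(nn.Module):
    def __init__(self, d_in: int, d_hidden: int, d_out: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(d_in, d_hidden), nn.ReLU(), nn.Linear(d_hidden, d_out))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = TinyMLP(16, 32, 1)
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
x, y = torch.randn(8, 16), torch.randn(8, 1)

loss = nn.functional.mse_loss(model(x), y)
opt.zero_grad()
loss.backward()   # reverse-mode autodiff through the graph built by forward()
opt.step()
```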
Nano Serving
TBD: A model serving and inference optimization system.
Nano Agentic RL
TBD: An agentic RL post-training framework.