Stepping into open source: Step3.5 Flash pre-training weights and training framework fully released, racing to No. 2 on OpenClaw.

March 4, 2026
On March 4, Chinese large-model startup Jieyue Xingchen released the pre-training weights, fine-tuning weights, and the accompanying Steptron training framework for its open-source Step3.5 Flash model.

Step3.5 Flash uses a sparse Mixture-of-Experts (MoE) architecture with 196 billion total parameters, of which only about 11 billion are activated during inference; on single-request coding tasks, its peak inference speed reaches 350 tokens per second (TPS). The model is designed for intelligent-agent scenarios and performs well on complex reasoning and long-chain tasks; the company claims its reasoning depth can rival some top closed-source models.

To date, the model has been downloaded more than 300,000 times on Hugging Face and has ranked first on OpenRouter Trending; on the well-known open-source project OpenClaw, it has climbed to second place.
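To make the "sparse activation" point concrete, here is a minimal, hypothetical sketch of top-k MoE routing in Python. The expert count, top-k value, and hidden size are illustrative placeholders and do not reflect Step3.5 Flash's actual configuration; the point is only that each token runs through a small subset of experts, so only a fraction of the total parameters is active per forward pass.

```python
import numpy as np

# Toy sketch of sparse MoE routing (illustrative only, NOT the real
# Step3.5 Flash architecture). A router scores all experts per token,
# but only the top-k experts actually run.

rng = np.random.default_rng(0)

NUM_EXPERTS = 16   # hypothetical expert count
TOP_K = 2          # experts activated per token
D_MODEL = 8        # toy hidden size

# One tiny weight matrix per expert (these are "the parameters").
experts = [rng.standard_normal((D_MODEL, D_MODEL)) for _ in range(NUM_EXPERTS)]
router = rng.standard_normal((D_MODEL, NUM_EXPERTS))

def moe_forward(x):
    """Route a single token vector x through its top-k experts."""
    logits = x @ router                   # router score per expert
    top = np.argsort(logits)[-TOP_K:]     # indices of the top-k experts
    gates = np.exp(logits[top])
    gates /= gates.sum()                  # softmax over the selected experts
    # Weighted sum of only the selected experts' outputs.
    out = sum(g * (x @ experts[i]) for g, i in zip(gates, top))
    return out, top

token = rng.standard_normal(D_MODEL)
out, used = moe_forward(token)

# Fraction of experts (a proxy for parameters) active for this token.
active_frac = TOP_K / NUM_EXPERTS
print(f"experts used: {sorted(used.tolist())}, active fraction: {active_frac:.3f}")
```

In this toy setup 2 of 16 experts run per token, i.e. 12.5% of the expert parameters are active, which mirrors how a 196B-parameter MoE can activate only about 11B parameters per request.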