iFLYTEK releases Spark X1.5, becoming the first in China to crack efficient full-chain MoE model training on a domestic computing platform.
On November 6th, iFLYTEK released the Spark X1.5 deep reasoning large model, built on domestic computing power, becoming the first to overcome the efficiency challenges of full-chain MoE model training on a domestic computing platform, with end-to-end performance exceeding 93% of that of international counterparts. The model is reportedly comparable to mainstream international large models in language understanding, text generation, knowledge question answering, logical reasoning, mathematics, and coding. Spark's multilingual capabilities also continue to improve: it now supports more than 130 languages, with overall performance exceeding 95% of GPT-5's.

