Surpassing OpenAI's Medical Capabilities, Baichuan Releases Open-Source Large Model Baichuan-M2
Phoenix Technology News, August 11. Baichuan has officially released its open-source medical large model, Baichuan-M2. According to the official announcement, the model, at only 32B parameters, not only surpasses OpenAI's latest open-source model gpt-oss-120b on medical capabilities but also challenges other current world-leading open-source large models such as Qwen3-235B, DeepSeek-R1, and Kimi K2.
Furthermore, with user privacy in mind, Baichuan has optimized the model for the private-deployment needs of medical-sector users, applying aggressive lightweighting and quantization with near-lossless precision. The lightweight model can be deployed on a single RTX 4090 card, a 57-fold cost reduction compared to DeepSeek-R1's dual-node H20 deployment.