China's DeepSeek kicked off 2026 with a new AI training method that analysts say is a 'breakthrough' for scaling
DeepSeek developed Manifold-Constrained Hyper-Connections (mHC), a training method that lets a model's internal residual streams exchange richer information while preserving training stability and efficiency as models scale.
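To make the idea concrete, here is a minimal sketch of the hyper-connections pattern mHC builds on: the single residual stream is widened into several streams, and a mixing matrix constrained to be (approximately) doubly stochastic, via Sinkhorn normalization, lets the streams communicate without amplifying or attenuating the residual signal. This is an illustrative toy, not DeepSeek's implementation; the function names, the Sinkhorn projection as the manifold constraint, and the `tanh` stand-in layer are all assumptions.

```python
import numpy as np

def sinkhorn(logits, iters=50):
    """Project exp(logits) toward a doubly stochastic matrix
    (rows and columns each sum to 1) by alternating row and
    column normalization (Sinkhorn-Knopp)."""
    M = np.exp(logits)
    for _ in range(iters):
        M /= M.sum(axis=1, keepdims=True)
        M /= M.sum(axis=0, keepdims=True)
    return M

def hyper_connection_step(streams, mix_logits, layer):
    """One hypothetical block: mix the n residual streams with a
    doubly stochastic matrix (the manifold constraint), then apply
    the layer and add its output back to every stream."""
    H = sinkhorn(mix_logits)                  # (n, n), rows/cols sum to 1
    mixed = H @ streams                       # (n, d) cross-stream mixing
    return mixed + layer(mixed.mean(axis=0))  # residual update, broadcast

rng = np.random.default_rng(0)
n, d = 4, 8                              # 4 residual streams of width 8
streams = rng.normal(size=(n, d))
mix_logits = rng.normal(size=(n, n))     # learnable in a real model
layer = lambda x: np.tanh(x)             # stand-in for attention/MLP

H = sinkhorn(mix_logits)
out = hyper_connection_step(streams, mix_logits, layer)
```

Because each row of the doubly stochastic matrix sums to 1, every mixed stream is a convex combination of the input streams, which is one intuition for why such a constraint can keep deep training stable.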
OpenAI's gpt-oss LLMs use MXFP4: smaller, faster, cheaper
MXFP4 is a 4-bit floating-point data type from the Open Compute Project's Microscaling (MX) specification: each 32-element block stores 4-bit E2M1 values plus a shared 8-bit power-of-two scale, yielding large memory and compute savings over the 16-bit data types LLMs traditionally use.
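The block-scaling scheme above can be sketched in a few lines: quantize a 32-element block by picking a shared power-of-two scale, then rounding each element to the nearest value representable in E2M1 (1 sign, 2 exponent, 1 mantissa bit). This is an illustrative round-to-nearest sketch of the MX block layout, not a bit-exact implementation of the spec or of any production kernel.

```python
import numpy as np

# All non-negative values representable by FP4 E2M1
# (1 sign bit, 2 exponent bits, 1 mantissa bit).
FP4_GRID = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0])

def quantize_mxfp4_block(x):
    """Quantize one 32-element block: a shared power-of-two scale
    plus a 4-bit E2M1 code per element."""
    amax = np.abs(x).max()
    # Shared exponent: align the block max with E2M1's largest
    # binade (exponent 2, i.e. magnitudes up to 6.0).
    exp = 0 if amax == 0 else int(np.floor(np.log2(amax))) - 2
    scale = 2.0 ** exp
    scaled = x / scale
    # Round each element to the nearest representable FP4 value.
    signs = np.where(scaled < 0, -1.0, 1.0)
    idx = np.abs(np.abs(scaled)[:, None] - FP4_GRID).argmin(axis=1)
    codes = signs * FP4_GRID[idx]
    return scale, codes  # dequantize as scale * codes

x = np.linspace(-1.0, 1.0, 32)          # one 32-element block
scale, codes = quantize_mxfp4_block(x)
max_err = np.abs(scale * codes - x).max()
```

The savings come from storage and bandwidth: 32 elements cost 32 × 4 bits plus one 8-bit scale (about 4.25 bits per value), versus 16 bits per value for BF16 or FP16.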