Published: August 8, 2025

Try it Out!

  • Try it in the AI Studio playground (GLM-4.5)
  • Try it in the AI Studio playground (GLM-4.5-Air)
  • Try with API: glm4.5_1.ipynb

TL;DR

GLM-4.5 is Z.ai’s breakthrough open-source AI model, unifying reasoning, coding, and agentic capabilities in a single foundation model designed specifically for intelligent agents.
  • Released: July 2025
  • Two models:
    • GLM-4.5: 355 billion total parameters with 32 billion active parameters using Mixture-of-Experts (MoE) architecture
    • GLM-4.5-Air: More compact design with 106 billion total parameters and 12 billion active parameters
  • 128K context length with native function calling capability
  • Hybrid Reasoning System: “Thinking mode” for complex reasoning and tool usage, and “non-thinking mode” for instant responses
  • License: Released under MIT open-source license, allowing unlimited commercial use and secondary development
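The hybrid reasoning system is typically exposed as a per-request switch. As a minimal sketch of what a request payload might look like (assuming an OpenAI-compatible chat-completions schema and a `thinking` field for the mode toggle; both are assumptions for illustration, so check Z.ai's API docs for the exact parameter names):

```python
# Minimal sketch of a GLM-4.5 chat request payload.
# ASSUMPTIONS: an OpenAI-compatible chat-completions schema and a
# "thinking" field for toggling the hybrid reasoning mode -- these are
# illustrative, not confirmed API details.

def build_glm_request(prompt: str, thinking: bool = True) -> dict:
    """Build a chat-completions payload for GLM-4.5."""
    return {
        "model": "glm-4.5",
        "messages": [{"role": "user", "content": prompt}],
        # Toggle between "thinking mode" (complex reasoning, tool use)
        # and "non-thinking mode" (instant responses).
        "thinking": {"type": "enabled" if thinking else "disabled"},
    }

# Fast, non-thinking request for a simple generation task:
payload = build_glm_request("Summarize this release note", thinking=False)
print(payload["thinking"]["type"])  # disabled
```

Actually sending the payload is an ordinary authenticated HTTP POST to the provider's endpoint; only the payload shape is sketched here.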

Fun Facts

The “Slime” Secret Sauce: Z.ai built its own reinforcement learning infrastructure and called it “slime”. Engineered for exceptional flexibility, efficiency, and scalability, it is what helps GLM-4.5 learn complex agentic behaviors.

The Crown-Passing Championship: Within just a couple of weeks, the crown of “best open-source model” passed from Kimi-K2 to Qwen3, and now to GLM-4.5! - source

Performance and Benchmarks

Official benchmarks

glm4.5-bench-1.png See more here

Artificial Analysis Benchmark

glm4.5-bench-2.png AA analysis

Fun Benchmark: “Pelican riding a bicycle”

Inspired by Simon Willison’s fun experiment (see here), this benchmark is all about how well models generate quirky, imaginative responses. Prompt:
Generate an SVG of a pelican riding a bicycle
You can see our full pelican tests here. So how does GLM do? Let’s see.
GLM-4.5: GLM4.5-pelican-1.png
GLM-4.5-Air: GLM4.5-air-pelican-1.png

Comparing against other SOTA open source models

Qwen3-235B-2507: Screenshot 2025-08-25 at 18.34.14.png
DeepSeek-R1-0528: Screenshot 2025-08-25 at 18.34.19.png

References

  • Z.ai GLM