Try it Out!
- Try it in the AI Studio playground (120B model)
- Try it in the AI Studio playground (20B model)
- Try with API: gpt_oss_1.ipynb
TL;DR
GPT OSS is a hugely anticipated open-weights release by OpenAI, designed for powerful reasoning, agentic tasks, and versatile developer use cases.
- Released: Aug 2025
- Two models:
- a big one with 117B parameters: gpt-oss-120b
- a smaller one with 21B parameters: gpt-oss-20b
- Both are mixture-of-experts (MoE) models and use a 4-bit quantization scheme (MXFP4) for the MoE weights, enabling fast inference thanks to the small number of active parameters per token
- GPT‑OSS‑120B: 36 layers, ~117B total parameters, but only ~5.1B parameters activated per token. It uses 128 experts, with just 4 experts activated per token.
- GPT‑OSS‑20B: 24 layers, ~21B total parameters, ~3.6B active per token; 32 experts, 4 activated per token.
- Text-only reasoning models, with full chain-of-thought and adjustable reasoning effort levels (low, medium, high).
- Instruction following and tool use support.
- License: Apache 2.0, with a small complementary usage policy.
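To make "adjustable reasoning effort" and API access concrete, here is a minimal sketch of querying a locally served gpt-oss-20b through an OpenAI-compatible chat endpoint (in the spirit of the notebook linked above, not a copy of it). The base URL, API key, model id, and the convention of setting the effort level via a "Reasoning: high" line in the system message are assumptions for illustration; check your serving stack's documentation for the exact interface.

```python
# Minimal sketch (not the official notebook): calling gpt-oss-20b via an
# OpenAI-compatible endpoint. Assumes a local server (e.g. vLLM or Ollama)
# is already serving the model at the hypothetical URL below.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-locally")

response = client.chat.completions.create(
    model="gpt-oss-20b",  # hypothetical model id; use whatever your server registered
    messages=[
        # One documented way to set the effort level is in the system prompt: low | medium | high
        {"role": "system", "content": "Reasoning: high"},
        {"role": "user", "content": "Explain mixture-of-experts routing in two sentences."},
    ],
)

print(response.choices[0].message.content)
```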
Fun Facts
GPT-OSS is OpenAI’s first open-weights model release since GPT-2 (released in 2019). That’s over 6 years!
Benchmark beaters:
- GPT‑OSS‑120B rivals or outperforms OpenAI’s proprietary o4‑mini in coding (Codeforces) and general knowledge (MMLU, HLE), and even exceeds it on math (AIME) and health (HealthBench).
- GPT‑OSS‑20B, despite its smaller size, matches or surpasses o3‑mini on many benchmarks, and even shines on math and health tasks.
Performance and Benchmarks
Official benchmarks

Artificial Analysis Benchmark
Fun Benchmark: “Pelican riding a bicycle”
Inspired by Simon Willison’s fun experiment (see here), this benchmark is all about how well models generate quirky, imaginative responses.
Prompt: Generate an SVG of a pelican riding a bicycle
You can see our full pelican tests here. Below is a small sketch of how you might run the prompt yourself; after that, let’s see how gpt-oss does.
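This is a rough sketch assuming the same OpenAI-compatible setup as above; the model id and the regex used to pull the SVG out of the reply are illustrative assumptions, not part of the benchmark itself.

```python
# Rough sketch: send the pelican prompt and save the returned SVG for inspection.
import re
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-locally")

reply = client.chat.completions.create(
    model="gpt-oss-120b",  # hypothetical id; swap in gpt-oss-20b to compare outputs
    messages=[{"role": "user", "content": "Generate an SVG of a pelican riding a bicycle"}],
).choices[0].message.content

# Models often wrap the drawing in prose, so extract the first <svg>...</svg> block.
match = re.search(r"<svg.*?</svg>", reply, re.DOTALL)
if match:
    with open("pelican.svg", "w") as f:
        f.write(match.group(0))
    print("Saved pelican.svg - open it in a browser to judge the result.")
else:
    print("No SVG found in the response:\n", reply)
```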
| gpt-oss-20b | gpt-oss-120b |
|---|---|
| ![]() | ![]() |
Comparing against other SOTA open source models
| qwen3-235B-2507 | deepseek-r1-0528 |
|---|---|
| ![]() | ![]() |
References
- OpenAI open models page | GitHub: openai/gpt-oss
- Welcome GPT OSS, the new open-source model family from OpenAI!
- Model card for 120B param model: gpt-oss-120b
- Model card for smaller 20B param model: gpt-oss-20b