OpenAI: gpt-oss-120b: an open-weight, 117B-parameter Mixture-of-Experts (MoE) language model from OpenAI designed for high-reasoning, agentic, and general-purpose production use cases. It activates 5.1B parameters per forward pass and is optimized...
Pillar score = mean of the 2 scaled signal values = 22.1.
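The pillar arithmetic above can be sketched as a small helper. This is a minimal illustration only: the function name and the example inputs are hypothetical, not AgentTape's actual scoring code.

```python
from statistics import mean

def pillar_score(scaled_values):
    """Mean of a pillar's scaled signal values, rounded to one decimal.

    Hypothetical helper for illustration; AgentTape's real pipeline
    and its scaling of raw signals are not shown here.
    """
    return round(mean(scaled_values), 1)

# Made-up scaled values for two signals:
print(pillar_score([10.0, 30.0]))  # -> 20.0
```

With two scaled values a and b, the pillar is simply (a + b) / 2, which is how a single reading of 22.1 can arise from the pair of signals feeding this pillar.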
Awaiting first reading (these signals apply to this agent and will be ingested on the next tier tick):
- Reddit mentions (7d)
- Bluesky mentions (7d)
- OpenRouter tokens (30d)
- GitHub repos using this model
- Tech-news mentions (30d)

Not applicable (this agent lacks the prerequisite, e.g. no GitHub repo or HF mirror, so these signals will never apply):
- HF downloads (30d)
- GitHub stars
- GitHub mentions (7d)
<a href="https://agenttape.com/agents/openai-gpt-oss-120b"><img src="https://agenttape.com/api/badge/openai-gpt-oss-120b.svg" alt="AgentTape" /></a>