Meta: Llama 4 Maverick

Llama 4 Maverick 17B Instruct (128E) is a high-capacity multimodal language model from Meta, built on a mixture-of-experts (MoE) architecture with 128 experts and 17 billion active parameters per forward pass.
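To make "active parameters per forward pass" concrete, here is a minimal sketch of top-k expert routing, the mechanism behind MoE models like this one. The gating weights, dimensions, and `route` helper are illustrative assumptions, not Meta's implementation; Llama 4 Maverick's published config routes each token to one of 128 routed experts plus a shared expert.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 128  # routed experts, matching the "128E" in the model name
TOP_K = 1          # experts activated per token (assumption for illustration)
D_MODEL = 16       # toy hidden size, far smaller than the real model

def route(token: np.ndarray, gate_w: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Score all experts for one token and keep only the TOP_K best."""
    scores = token @ gate_w                 # shape (NUM_EXPERTS,)
    chosen = np.argsort(scores)[-TOP_K:]    # indices of the selected experts
    weights = np.exp(scores[chosen] - scores[chosen].max())
    return chosen, weights / weights.sum()  # softmax over the selected experts

gate_w = rng.standard_normal((D_MODEL, NUM_EXPERTS))
token = rng.standard_normal(D_MODEL)
experts, weights = route(token, gate_w)

# Only the chosen experts' feed-forward weights participate in this forward
# pass; the other 127 experts stay idle. That is why the model's "active"
# parameter count (17B) is much smaller than its total parameter count.
print(f"token routed to expert(s) {experts} with weight(s) {weights}")
```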
Pillar score = mean of the 2 scaled signal values with readings = 23.9 (see the sketch after the signal list below).
Awaiting first reading (these signals apply to this agent and will be ingested on the next tier tick): Reddit mentions (7d), Bluesky mentions (7d), OpenRouter tokens (30d)
Not applicable (this agent lacks the prerequisite, e.g. no GitHub repo or HF mirror, so these signals can never apply): HF downloads (30d), GitHub stars, GitHub mentions (7d)
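A minimal sketch of the pillar arithmetic described above, assuming each signal is first scaled to a common range and signals without a reading are skipped. The function name and the two input values (chosen so the mean comes out to 23.9) are hypothetical, not AgentTape's actual pipeline.

```python
# Hypothetical pillar computation: the mean of whichever scaled signal
# values currently have a reading. Signal names and values are illustrative.

def pillar_score(scaled_signals: dict[str, float | None]) -> float | None:
    """Return the mean of the available (non-None) scaled values."""
    readings = [v for v in scaled_signals.values() if v is not None]
    if not readings:
        return None  # nothing ingested yet, i.e. all signals awaiting first reading
    return round(sum(readings) / len(readings), 1)

# Two signals have readings, so the pillar is their mean: (31.4 + 16.4) / 2 = 23.9
print(pillar_score({
    "signal_a": 31.4,  # illustrative scaled value
    "signal_b": 16.4,  # illustrative scaled value
    "signal_c": None,  # awaiting first reading, excluded from the mean
}))
```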
[![AgentTape](https://agenttape.com/api/badge/meta-llama-4-maverick.svg)](https://agenttape.com/agents/meta-llama-4-maverick)
<a href="https://agenttape.com/agents/meta-llama-4-maverick"><img src="https://agenttape.com/api/badge/meta-llama-4-maverick.svg" alt="AgentTape" /></a>