Meta: Llama 4 Scout: Llama 4 Scout 17B Instruct (16E, i.e. 16 experts) is a mixture-of-experts (MoE) language model from Meta that activates 17 billion of its 109 billion total parameters per token. It supports native multimodal input...
Pillar score = mean of 2 scaled signal values = 23.9.
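The pillar score above is described as the arithmetic mean of two scaled signal values. A minimal sketch of that computation, assuming two hypothetical scaled values (the actual inputs are not shown on this page):

```python
# Hypothetical scaled signal values -- illustrative only; the real
# per-signal values behind the 23.9 pillar score are not listed here.
scaled = [20.0, 27.8]

# Pillar score = arithmetic mean of the scaled values.
pillar = sum(scaled) / len(scaled)

print(round(pillar, 1))  # → 23.9
```

Any pair of scaled values with the same mean would produce the same pillar score; the sketch only illustrates the averaging step.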
Awaiting first reading (these signals apply to this agent and will be ingested on the next tier tick): Reddit mentions (7d), Bluesky mentions (7d), OpenRouter tokens (30d), GitHub repos using this model, tech-news mentions (30d)
Not applicable (this agent lacks the prerequisite, e.g. no GitHub repo or HF mirror, so these signals can never apply): HF downloads (30d), GitHub stars, GitHub mentions (7d)
<a href="https://agenttape.com/agents/meta-llama-4-scout"><img src="https://agenttape.com/api/badge/meta-llama-4-scout.svg" alt="AgentTape" /></a>