Grok Model Parameters, Size & Identifier: Complete xAI Guide (2026)
How many parameters does Grok have? Complete guide to xAI's Grok model sizes, parameter counts, model identifiers, and API strings. Covers Grok-3, Grok-4.1, and the upcoming Grok-5 with an expected 6 trillion parameters.
TL;DR
Every Grok model at a glance — parameter counts, API identifier strings, and context windows:
| Model | Parameters | API Identifier | Context Window | Status |
|---|---|---|---|---|
| Grok-5 | 6 trillion | grok-5 (expected) | 512K+ (expected) | Coming Q1-Q2 2026 |
| Grok-4.1 | ~3 trillion (MoE) | grok-4.1 | 256K | Current flagship |
| Grok-4.1 Mini | ~400B (MoE) | grok-4.1-mini | 128K | Current fast model |
| Grok-3 | ~3 trillion (MoE) | grok-3 | 128K | Legacy |
| Grok-3 Mini | ~400B (MoE) | grok-3-mini | 128K | Legacy |
| Grok-2 | ~300B (estimated) | grok-2 | 128K | Deprecated |
| Grok-1 | 314B (MoE, 86B active) | grok-1 | 8K | Open-source, deprecated |
How Many Parameters Does Grok Have?
The answer depends on which Grok model you mean. xAI has released multiple generations, each with different parameter counts:
Grok-1 (Open Source, 2023)
Grok-1 was xAI's first public model and remains the only Grok model that is fully open-source. It uses a Mixture-of-Experts (MoE) architecture with 314 billion total parameters, of which 86 billion are active per inference pass. This means only ~27% of the model's weights fire for any given token.
- Total parameters: 314 billion
- Active parameters: 86 billion
- Architecture: MoE with 8 experts, 2 active
- Context window: 8,192 tokens
- License: Apache 2.0 (open-source)
- API string: `grok-1` (deprecated)
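The ~27% figure is simple arithmetic on the two confirmed numbers; a quick check:

```python
# Back-of-the-envelope check of Grok-1's confirmed MoE numbers.
TOTAL_PARAMS = 314e9   # 314B total parameters
ACTIVE_PARAMS = 86e9   # 86B active per forward pass

active_fraction = ACTIVE_PARAMS / TOTAL_PARAMS
print(f"{active_fraction:.1%} of weights are active per token")  # -> 27.4%
```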
Grok-2 (2024)
Grok-2 was a significant jump in capability. xAI never disclosed the exact parameter count, but external estimates place it around 300 billion parameters in a dense or sparse architecture. It introduced image understanding and improved reasoning.
- Total parameters: ~300B (estimated, not confirmed)
- Context window: 128K tokens
- API string: `grok-2` (deprecated)
Grok-3 (Late 2024 / Early 2025)
Grok-3 marked xAI's leap to trillion-scale models. Trained on the Colossus supercomputer in Memphis with 100,000+ NVIDIA H100 GPUs, Grok-3 uses a Mixture-of-Experts architecture with approximately 3 trillion total parameters.
- Total parameters: ~3 trillion (MoE)
- Active parameters: Not disclosed (estimated 300-600B active)
- Context window: 128K tokens
- API string: `grok-3`
Grok-4.1 (Current Flagship, 2025-2026)
Grok-4.1 is the current production model available through the xAI API and the Grok chatbot on X. It builds on Grok-3's architecture with additional training and refinements. The parameter count remains in the ~3 trillion range with improved training data and techniques.
- Total parameters: ~3 trillion (MoE)
- Context window: 256K tokens
- API string: `grok-4.1`
- Mini variant: `grok-4.1-mini` (~400B MoE, 128K context)
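If you are switching between the flagship and the mini variant, the context windows above are the practical constraint. A minimal sketch of choosing a model string by prompt size — the helper name and the round-number windows are illustrative, not part of the xAI API:

```python
# Hypothetical helper: pick a Grok model string by required context length.
# Windows are the approximate figures from this guide (256K flagship, 128K mini).
CONTEXT_WINDOWS = {
    "grok-4.1": 256_000,
    "grok-4.1-mini": 128_000,
}

def pick_model(prompt_tokens: int, prefer_cheap: bool = True) -> str:
    """Return the smallest (cheapest) model whose context window fits the prompt."""
    candidates = sorted(CONTEXT_WINDOWS.items(), key=lambda kv: kv[1])
    if not prefer_cheap:
        candidates.reverse()
    for model, window in candidates:
        if prompt_tokens <= window:
            return model
    raise ValueError(f"Prompt of {prompt_tokens} tokens exceeds all context windows")

print(pick_model(50_000))    # grok-4.1-mini (fits the smaller window)
print(pick_model(200_000))   # grok-4.1 (needs the flagship window)
```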
Grok-5 (Upcoming)
Grok-5 is expected to double the parameter count to 6 trillion parameters. It will feature native multimodal capabilities (text, image, video, audio) and real-time data access from Tesla's fleet and X.
- Total parameters: 6 trillion (expected)
- Context window: 512K+ (expected)
- API string: `grok-5` (expected)
- Release: Q1-Q2 2026
How to Find the Grok Model Identifier String
If you need the exact model identifier string for API calls, here is what to use:
xAI API (Direct)
The xAI API follows OpenAI-compatible formatting. Your API call looks like this:
```bash
curl https://api.x.ai/v1/chat/completions \
  -H "Authorization: Bearer $XAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "grok-4.1",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```
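The same request can be built with nothing but the Python standard library, since the payload is plain OpenAI-style JSON. `build_chat_request` is a hypothetical helper, not an official SDK function; the endpoint and body mirror the curl example above:

```python
import json
import os
import urllib.request

def build_chat_request(model: str, user_message: str) -> urllib.request.Request:
    """Build (but do not send) a chat completions request for the xAI API."""
    url = "https://api.x.ai/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {os.environ.get('XAI_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }).encode()
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

req = build_chat_request("grok-4.1", "Hello")
print(json.loads(req.data)["model"])  # grok-4.1
# response = urllib.request.urlopen(req)  # uncomment to actually send
```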
Available model strings as of March 2026:
| Model String | Description |
|---|---|
| `grok-4.1` | Current flagship, best quality |
| `grok-4.1-mini` | Fast, cost-effective |
| `grok-3` | Previous generation |
| `grok-3-mini` | Previous generation, fast variant |
Listing Available Models
You can query the API to get all currently available model identifiers:
```bash
curl https://api.x.ai/v1/models \
  -H "Authorization: Bearer $XAI_API_KEY"
```
This returns a JSON list of all model objects, each with an `id` field containing the model identifier string.
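Pulling the identifier strings out of that response is a one-liner. The sample payload below is illustrative of the OpenAI-compatible list shape, not captured from the live API:

```python
import json

# Illustrative /v1/models response body (OpenAI-compatible list shape).
sample_response = json.loads("""
{
  "object": "list",
  "data": [
    {"id": "grok-4.1", "object": "model"},
    {"id": "grok-4.1-mini", "object": "model"}
  ]
}
""")

# Each entry's "id" is the string you pass as "model" in chat requests.
model_ids = [m["id"] for m in sample_response["data"]]
print(model_ids)  # ['grok-4.1', 'grok-4.1-mini']
```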
Using Grok in Third-Party Tools
Many tools and frameworks support xAI's Grok models. The model identifier string is the same — you just need the xAI API base URL:
- Base URL: `https://api.x.ai/v1`
- Model ID: `grok-4.1` (or whichever variant you need)
- Authentication: Bearer token with your xAI API key
Grok vs Other Frontier Models (March 2026)
How does Grok stack up against the competition?
| Feature | Grok 4.1 | GPT-5.2 | Claude Opus 4.6 | Gemini 3.1 Pro |
|---|---|---|---|---|
| Parameters | ~3T (MoE) | ~2T (est.) | Not disclosed | Not disclosed |
| Context Window | 256K | 400K | 1M | 1M |
| SWE-bench | ~78% | 80.0% | 80.8% | 80.6% |
| GPQA Diamond | ~90% | 92.4% | 91.3% | 94.3% |
| ARC-AGI-2 | ~55% | 52.9% | 68.8% | 77.1% |
| API Pricing (in/out) | $5/$15 | $15/$60 | $15/$75 | $2/$12 |
| Open Source | Grok-1 only | No | No | No |
| Real-time Data | Yes (X, Tesla) | Limited (browsing) | No | Yes (Google Search) |
Key Takeaways
- Grok leads on real-time data — access to X and Tesla fleet data is a genuine differentiator no other lab can match.
- Grok trails on reasoning — the ARC-AGI-2 gap (55% vs 77.1% for Gemini) is significant. Grok-5 needs to close this.
- Grok is competitively priced — cheaper than GPT-5.2 and Claude Opus, more expensive than Gemini 3.1 Pro.
- Parameter count doesn't determine quality — Grok-4.1 has the most disclosed parameters but doesn't lead benchmarks. Architecture, training data, and RLHF matter more.
Understanding Mixture-of-Experts (MoE) Parameters
When xAI says Grok has "3 trillion parameters," that number deserves context. In a Mixture-of-Experts architecture, only a fraction of parameters are active during each forward pass.
How MoE works:
- The model has many "expert" sub-networks
- A router network decides which experts to activate for each token
- Typically 2-4 experts out of 8-16+ are active per token
- Total parameters are large, but compute cost is closer to a smaller dense model
Grok-1 is the only model where xAI confirmed the exact numbers: 314B total, 86B active (8 experts, 2 active). The later models have not had their expert configurations publicly confirmed.
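The routing step can be sketched as a toy example. The expert count (8) and top-k (2) below are Grok-1's confirmed configuration; the router itself is a generic top-k softmax illustration, not xAI's implementation:

```python
import math

NUM_EXPERTS = 8   # Grok-1's confirmed expert count
TOP_K = 2         # Grok-1 activates 2 experts per token

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def route(router_logits):
    """Pick the TOP_K experts with the highest router scores for one token."""
    probs = softmax(router_logits)
    ranked = sorted(range(NUM_EXPERTS), key=lambda i: probs[i], reverse=True)
    chosen = ranked[:TOP_K]
    # The token is processed by only these experts, weighted by router score --
    # which is why far fewer than the total parameters are "active" per token.
    weights = [probs[i] for i in chosen]
    return chosen, weights

# One token's (made-up) router logits, one score per expert:
logits = [0.1, 2.0, -1.0, 0.5, 0.0, 1.5, -0.5, 0.3]
experts, weights = route(logits)
print(experts)  # [1, 5] -> the two highest-scoring experts handle this token
```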
Frequently Asked Questions
How many parameters does the current Grok model have?
Grok-4.1, the current flagship model, has approximately 3 trillion parameters in a Mixture-of-Experts architecture. The active parameter count per inference is lower, likely 300-600 billion.
What is the model identifier string for Grok?
The current model identifier for the xAI API is grok-4.1 for the flagship model and grok-4.1-mini for the fast variant. Use these strings in your API calls.
Is Grok open source?
Only Grok-1 (314B parameters) is open-source under an Apache 2.0 license. All subsequent models (Grok-2, 3, 4.1, and the upcoming 5) are proprietary.
How does Grok's parameter count compare to GPT-5?
Grok-4.1 at ~3 trillion parameters is the largest disclosed parameter count among frontier models. GPT-5.2's parameter count has not been confirmed but is estimated around 2 trillion. However, parameter count alone does not determine model quality.
When will Grok-5 be released?
xAI has indicated a Q1 2026 release for Grok-5 with 6 trillion parameters. As of March 2026, no exact date has been announced. It could arrive any day or slip to Q2.
What context window does Grok support?
Grok-4.1 supports 256K tokens. Grok-4.1-mini supports 128K tokens. Grok-5 is expected to expand this to 512K or more.
Build With Any AI Model
The AI model landscape in 2026 is fragmented — Grok, GPT, Claude, Gemini all have different strengths. The smart move is building products that can swap between models as the landscape shifts.
Y Build gives you the full growth stack for AI-powered products: deployment, Demo Cut product videos, AI SEO, and analytics. Works with any model, any framework. Start free →