Choose How You Code with AI
Run it locally, bring your own keys, or let us handle everything. One extension, three ways to power your development.
Local AI
Run AI on your machine
LobsAI detects your hardware specs, recommends the best AI model, and auto-installs it via Ollama. Your data never leaves your machine.
Get Started

What's included
- Hardware auto-detection (RAM, GPU, CPU)
- AI model recommendation based on your specs
- One-click Ollama installation
- 100% offline — data stays on your machine
- Models: Llama 3, CodeLlama, Mistral, Phi, etc.
- Unlimited usage — no token limits
- Full code context with local processing
- Community support via Discord
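The detect-then-recommend step can be sketched in a few lines of Python. The RAM thresholds and Ollama model tags below are illustrative assumptions, not LobsAI's actual selection logic:

```python
import os

def detect_ram_gb():
    """Best-effort total-RAM detection on POSIX systems (None elsewhere)."""
    try:
        return os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / (1024 ** 3)
    except (ValueError, OSError, AttributeError):
        return None

def recommend_model(ram_gb, has_gpu=False):
    """Map hardware specs to an Ollama model tag. Thresholds are assumptions."""
    if ram_gb is None or ram_gb < 8:
        return "phi3"          # small model for low-RAM machines
    if ram_gb < 16:
        return "mistral"       # 7B-class model fits comfortably
    return "llama3:70b" if has_gpu and ram_gb >= 64 else "llama3"
```

Once a tag is chosen, installation is a single `ollama pull <tag>`, which is the step the extension automates.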
Bring Your Key
Use your own API keys
Connect your existing API keys from any major provider. You control which models you use and pay providers directly for API usage.
Get Started

What's included
- OpenAI (GPT-4o, GPT-4, o1, o3)
- Anthropic (Claude 3.5, Claude 4)
- Google Gemini (Pro, Ultra, Flash)
- Groq (ultra-fast inference)
- AWS Bedrock & Azure OpenAI
- OpenRouter (100+ models)
- DeepSeek, Mistral, and more
- Priority email support
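"Pay providers directly" means each request goes straight to the vendor's endpoint, authenticated with your own key. A minimal sketch against OpenAI's public Chat Completions API (the endpoint and payload shape are OpenAI's documented format; the key and model here are placeholders):

```python
import json
import urllib.request

def build_chat_request(api_key, model, prompt):
    """Assemble a Chat Completions request for api.openai.com; nothing is sent here."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",  # your own provider key
            "Content-Type": "application/json",
        },
    )
```

Sending is then one `urllib.request.urlopen(req)` call; the response JSON carries the completion under `choices[0].message.content`, and the provider bills your account directly.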
Cloud Pro
All models, zero setup
Works like GitHub Copilot: access every AI model through our servers. A monthly token allocation is included; you pay only for extra usage.
Start Free Trial

What's included
- Access to ALL AI models (GPT-4o, Claude, Gemini, etc.)
- Monthly token allocation included
- No API keys needed — just sign in
- Automatic model routing for best results
- Usage dashboard & spending controls
- Extra tokens at competitive rates
- Priority support & onboarding
- Team management (coming soon)
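"Automatic model routing" generally means picking a model per request based on the kind of task. A hypothetical sketch — the routing table below is an illustrative assumption, not LobsAI's actual policy:

```python
# Hypothetical routing table: task kind -> model. Not LobsAI's real policy.
ROUTES = {
    "autocomplete": "gpt-4o-mini",   # cheap, low-latency
    "refactor": "claude-sonnet",     # multi-file edits
    "reasoning": "o1",               # slower but deliberate
}

def route(task_kind, default="gpt-4o"):
    """Pick a model for a request; fall back to a general-purpose default."""
    return ROUTES.get(task_kind, default)
```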
How Each Plan Works
Every plan gives you the full LobsAI Coder experience; the difference is where the AI runs.
Local AI
1. LobsAI scans your hardware (RAM, GPU, CPU)
2. Recommends the best AI model for your specs
3. Auto-installs the model via Ollama
4. Code 100% locally — nothing leaves your machine
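After step 3, completions come from Ollama's local HTTP API, which listens on `localhost:11434` by default and exposes a documented `/api/generate` endpoint. A minimal sketch (the model tag is just an example):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(prompt, model="llama3"):
    """Build a one-shot (non-streaming) request for the local Ollama server."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

def generate(prompt, model="llama3"):
    """Send the request; requires `ollama serve` to be running locally."""
    with urllib.request.urlopen(build_request(prompt, model)) as resp:
        return json.loads(resp.read())["response"]
```

Because the server is on localhost, the prompt and the completion never cross the network boundary.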
Bring Your Key
1. Enter your API key from any supported provider
2. LobsAI validates and securely stores it locally
3. Requests go directly to the provider (no middleman)
4. Full power of GPT-4o, Claude 4, Gemini, etc.
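One common way to "store it locally" on POSIX systems is a file readable only by the current user. A sketch under that assumption — the path and file layout here are hypothetical, and an editor extension may instead use the editor's own secret store:

```python
import os
from pathlib import Path

def save_key(provider, key, config_dir="~/.config/lobsai"):
    """Write a provider key to a user-only (mode 0600) file. Hypothetical layout."""
    d = Path(config_dir).expanduser()
    d.mkdir(parents=True, exist_ok=True)
    path = d / f"{provider}.key"
    path.write_text(key)
    os.chmod(path, 0o600)  # owner read/write only; other users cannot read the key
    return path
```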
Cloud Pro
1. Sign in — no API keys or setup needed
2. Access every model through LobsAI servers
3. Monthly token allocation auto-applied
4. Usage dashboard with spending controls
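A monthly allocation with spending controls boils down to metering: deduct tokens per request, bill overage past the allocation, and refuse requests once a user-set cap is hit. A toy sketch with illustrative numbers:

```python
class TokenMeter:
    """Track usage against a monthly allocation with a hard spending cap."""

    def __init__(self, monthly_allocation, overage_cap=0):
        self.allocation = monthly_allocation
        self.overage_cap = overage_cap  # extra tokens the user allows billing for
        self.used = 0

    def charge(self, tokens):
        """Record usage; refuse the request once allocation + cap is exhausted."""
        if self.used + tokens > self.allocation + self.overage_cap:
            raise RuntimeError("spending cap reached")
        self.used += tokens
        return max(0, self.used - self.allocation)  # billable overage so far
```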
Feature Comparison
See exactly what you get with each plan
| Feature | Local AI $5/mo | Bring Your Key $5/mo | Cloud Pro $20/mo |
|---|---|---|---|
| **AI Models** | | | |
| Local models (Llama, Mistral, Phi) | ✓ | — | — |
| OpenAI (GPT-4o, o1, o3) | — | ✓ | ✓ |
| Anthropic (Claude 3.5, Claude 4) | — | ✓ | ✓ |
| Google Gemini | — | ✓ | ✓ |
| Groq, DeepSeek, Mistral API | — | ✓ | ✓ |
| Automatic model selection | — | — | ✓ |
| **Features** | | | |
| File creation & editing | ✓ | ✓ | ✓ |
| Terminal command execution | ✓ | ✓ | ✓ |
| Browser automation | ✓ | ✓ | ✓ |
| MCP server support | ✓ | ✓ | ✓ |
| Hardware auto-detection | ✓ | — | — |
| Usage dashboard | — | — | ✓ |
| Spending controls | — | — | ✓ |
| **Support & Billing** | | | |
| Community Discord support | ✓ | ✓ | ✓ |
| Email support | — | ✓ | ✓ |
| Priority support | — | — | ✓ |
| Per-computer licensing | ✓ | ✓ | — |
| Per-user licensing | — | — | ✓ |
| Data stays on your machine | ✓ | — | — |
Frequently Asked Questions
Have questions? We've got answers.
Ready to get started?
Install LobsAI Coder in VS Code and choose your plan. Start coding with AI in under 2 minutes.