/providers
Overview
The /providers command opens the provider management interface where you can connect and configure AI providers for use with Apex. This is where you set up your API keys to enable AI-powered penetration testing.
Usage
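Run the command from the Apex prompt:

```
/providers
```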
Supported Providers
Apex supports the following AI providers:
- **Anthropic** (recommended) - Claude Pro/Max or API key. Best performance for penetration testing tasks.
- **OpenAI** - GPT-4 and other OpenAI models.
- **Google** - Gemini models via Google's Generative AI API.
- **OpenRouter** - Access multiple AI models through one API. Great for flexibility and model comparison.
- **AWS Bedrock** - Amazon Bedrock AI models for enterprise deployments.
- **Pensar Console** - Managed inference; connect via `/auth` instead, no API key needed.
Anthropic is recommended for optimal penetration testing performance. Claude models provide superior reasoning capabilities for security testing.
How It Works
Enter API Key
Input your API key for the selected provider. Apex verifies your key against the provider’s API before saving — if the key is invalid, you’ll see an inline error and can try again.
Selecting Pensar Console from the provider list routes you to the /auth device-flow login instead of the API key input screen.
Provider Setup
Anthropic
Best for: All penetration testing tasks
Get Your API Key
- Visit console.anthropic.com
- Sign up or log in to your account
- Navigate to API Keys section
- Create a new API key
Configure in Apex
- Run `/providers`
- Select "Anthropic"
- Paste your API key
- Press Enter to save
Claude Pro/Max subscribers can use their subscription. API keys provide more flexibility for heavy usage.
OpenAI
Best for: Users already invested in the OpenAI ecosystem
Get Your API Key
- Visit platform.openai.com
- Navigate to API Keys
- Create a new secret key
Configure in Apex
- Run `/providers`
- Select "OpenAI"
- Paste your API key
- Press Enter to save
Google
Best for: Users wanting access to Gemini models
Get Your API Key
- Visit aistudio.google.com
- Sign in with your Google account
- Generate an API key
Configure in Apex
- Run `/providers`
- Select "Google"
- Paste your API key
- Press Enter to save
OpenRouter
Best for: Access to multiple models through a single API
Get Your API Key
- Visit openrouter.ai
- Create an account
- Navigate to Keys section
- Generate a new API key
Configure in Apex
- Run `/providers`
- Select "OpenRouter"
- Paste your API key
- Press Enter to save
OpenRouter provides access to models from multiple providers including Anthropic, OpenAI, Google, Meta, and more through a unified API.
AWS Bedrock
Best for: Enterprise deployments with existing AWS infrastructure
Prerequisites
- AWS account with Bedrock access
- IAM permissions for Bedrock models
- Model access enabled in your region
Get Your Credentials
- Log in to AWS Console
- Navigate to IAM
- Create or use existing access keys
- Ensure Bedrock permissions are attached
Configure in Apex
- Run `/providers`
- Select "AWS Bedrock"
- Enter your AWS credentials
- Press Enter to save
Ensure your IAM role has permissions to invoke Bedrock models and that model access is enabled in your AWS region.
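As a starting point, an IAM policy along these lines grants the invoke permissions Bedrock model calls require. The wildcard resource ARN is illustrative; in practice, scope it to the specific foundation models you use:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:*::foundation-model/*"
    }
  ]
}
```

Attach this policy to the IAM user or role whose access keys you enter in Apex. Model access must additionally be enabled per model in the Bedrock console for your region.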
Pensar Console
Best for: Getting started quickly without managing API keys
Pensar Console provides managed inference so you don’t need to bring your own API key.
Connect
- Run `/auth` in Apex (not `/providers`)
- Follow the authentication flow to link your Pensar Console account
- Your models will appear automatically in `/models`
Visit console.pensar.dev to create an account and manage credits.
Keyboard Navigation
Provider Status
In the provider selection screen, configured providers display a green checkmark (✓) indicating they’re ready to use.
After Configuration
Once you’ve configured a provider:
- You'll be automatically redirected to the `/models` command
- Select a model from your configured provider
- Start testing with `/pentest`
Quick Access
You can also access providers from the models screen:
- Press `Ctrl+P` while in `/models` to open provider management
Security Best Practices
API Key Security:
- Never share your API keys
- Rotate keys periodically
- Use separate keys for different environments
- Monitor usage for unexpected activity
Troubleshooting
API key not working
Issue: Provider shows configured but models won’t load
Solutions:
- Apex verifies keys on entry — if you see a verification error, double-check the key
- Verify the API key is correct (no extra spaces)
- Check that your account has credits/access
- Ensure the key has appropriate permissions
- Try regenerating the key
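Before regenerating a key, it can help to rule out copy/paste problems locally. The sketch below is illustrative only: the prefix map is an assumption based on each provider's commonly documented key formats, not something Apex itself enforces.

```python
# Quick local sanity check for a pasted API key.
# NOTE: the prefixes below are assumptions based on common provider
# key formats; Apex does not perform this check itself.
KNOWN_PREFIXES = {
    "anthropic": "sk-ant-",
    "openai": "sk-",
    "openrouter": "sk-or-",
}

def check_key(provider: str, key: str) -> list[str]:
    """Return a list of likely problems with a pasted key."""
    problems = []
    if key != key.strip():
        problems.append("leading/trailing whitespace (common paste error)")
    cleaned = key.strip()
    if any(ch.isspace() for ch in cleaned):
        problems.append("embedded whitespace or newline inside the key")
    prefix = KNOWN_PREFIXES.get(provider)
    if prefix and not cleaned.startswith(prefix):
        problems.append(f"key does not start with the expected '{prefix}' prefix")
    return problems

print(check_key("anthropic", " sk-ant-abc123 "))
# prints: ['leading/trailing whitespace (common paste error)']
```

If the check comes back clean and the key still fails verification, the issue is more likely account-side (credits, permissions, or a revoked key).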
Provider not showing
Issue: Expected provider not in the list
Solutions:
- Update Apex to the latest version (`pensar upgrade`)
- Restart Apex
- Check if provider requires additional setup
AWS Bedrock errors
Issue: Cannot connect to AWS Bedrock
Solutions:
- Verify AWS credentials are correct
- Check IAM permissions include Bedrock access
- Ensure model access is enabled in your region
- Verify your account has Bedrock quota