This guide explains how to configure ship to use Anthropic's Claude models instead of local Ollama.
You'll need:

- An Anthropic API key (get one at https://console.anthropic.com/)
Edit your `.shiprc.json` file and change the provider to `anthropic`:

```json
{
  "llm": {
    "provider": "anthropic"
  },
  "anthropic": {
    "model": "claude-3-5-sonnet-20241022",
    "maxTokens": 4096
  }
}
```

Option A: Using a `.env` file (Recommended)
Create a `.env` file in your project directory:

```
# .env
ANTHROPIC_API_KEY=sk-ant-api03-your-key-here
```

Option B: Using an environment variable
Export the variable in your shell:
```shell
export ANTHROPIC_API_KEY=sk-ant-api03-your-key-here
```

Available models:

- `claude-3-5-sonnet-20241022` (default) - Best balance of intelligence, speed, and cost
- `claude-3-opus-20240229` - Most capable model, best for complex tasks
- `claude-3-sonnet-20240229` - Good balance for most tasks
- `claude-3-haiku-20240307` - Fastest and most cost-effective
Once configured, use ship normally:
```shell
# Generate posts
ship work

# Override model
ship work --model claude-3-opus-20240229

# Check available strategies
ship work --list-strategies
```

In `.shiprc.json`:
```json
{
  "llm": {
    "provider": "anthropic"
  },
  "anthropic": {
    "model": "claude-3-5-sonnet-20241022",
    "maxTokens": 4096
  },
  "generation": {
    "temperature": 0.7,
    "postsPerTranscript": 8
  }
}
```

- `model`: Which Claude model to use
- `maxTokens`: Maximum tokens in the response (default: 4096)
- `temperature`: Creativity level, 0-1 (default: 0.7)
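If you prefer editing from the command line, the generation settings above can be adjusted with `jq` (a sketch, assuming `jq` is installed; the file path and key names follow the example config):

```shell
# Lower the temperature in .shiprc.json without opening an editor.
# jq cannot edit in place, so write to a temp file and move it back.
tmp=$(mktemp)
jq '.generation.temperature = 0.5' .shiprc.json > "$tmp" && mv "$tmp" .shiprc.json
```

The temp-file dance preserves the rest of the config untouched; any key shown in the example (`maxTokens`, `postsPerTranscript`, …) can be changed the same way.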
Edit `.shiprc.json` and change the provider back:
```json
{
  "llm": {
    "provider": "ollama"
  }
}
```

Make sure you've either:
- Created a `.env` file with `ANTHROPIC_API_KEY=...`
- Exported `ANTHROPIC_API_KEY` in your shell
Check:
- Your API key is valid
- You have internet connectivity
- Your Anthropic account has credits
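To separate key or connectivity problems from `ship` itself, you can call the Anthropic Messages API directly with `curl` (endpoint and headers per Anthropic's public HTTP API; the tiny request below is a diagnostic sketch, not part of `ship`):

```shell
# A minimal one-message request. A valid key returns JSON with the
# model's reply; a bad key returns an authentication_error object.
curl -s https://api.anthropic.com/v1/messages \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model":"claude-3-haiku-20240307","max_tokens":16,"messages":[{"role":"user","content":"ping"}]}'
```

If this succeeds but `ship` still fails, the problem is in your `.shiprc.json` or environment, not your account.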
Anthropic charges per token. Monitor your usage at https://console.anthropic.com/.
Approximate costs (as of 2024):
- Claude 3.5 Sonnet: $3/$15 per million tokens (input/output)
- Claude 3 Opus: $15/$75 per million tokens
- Claude 3 Haiku: $0.25/$1.25 per million tokens
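As a rough worked example using the Claude 3.5 Sonnet rates above (the token counts are made up for illustration), processing 200,000 input tokens and 50,000 output tokens costs about 0.2 × $3 + 0.05 × $15:

```shell
# Estimate a run's cost from token counts and per-million-token rates.
input_tokens=200000
output_tokens=50000
awk -v in_t="$input_tokens" -v out_t="$output_tokens" \
  'BEGIN { printf "$%.2f\n", in_t / 1e6 * 3 + out_t / 1e6 * 15 }'
# Prints: $1.35
```

Swap in the Opus or Haiku rates from the list above to compare models before a large batch.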