An advanced GUI for AI interactions. Chat with local Ollama models for complete privacy, or with powerful cloud models for cutting-edge capability and seamless conversations.
Everything you need for advanced AI interactions
Run AI models locally with Ollama. Complete privacy and control over your data.
Access cutting-edge cloud AI models with seamless integration.
Beautiful, responsive chat interface with real-time streaming.
Efficient workflows with comprehensive keyboard shortcuts.
Built-in tools for developers and programmers.
Real-time performance monitoring and optimization.
Choose between local privacy and cloud power
Natural conversation and general assistance. Perfect for everyday interactions.
Advanced reasoning and analytical thinking. Great for problem-solving.
Fast code generation and debugging. Optimized for quick solutions.
Complex programming tasks with detailed explanations and best practices.
OpenAI's most advanced model with superior reasoning and creativity.
Google's lightning-fast multimodal AI with excellent performance.
Cloud version of DeepSeek R1 with enhanced reasoning capabilities.
Deep research capabilities with web search integration.
Two powerful ways to use Companion: pick whichever works best for you
Use Companion directly in your browser with powerful cloud AI models
Download the full Companion app for maximum power and privacy
| Feature | Web Mode | Desktop Mode |
|---|---|---|
| Cloud AI Models | Yes | Yes |
| Local AI Models | No | Yes |
| Offline Usage | No | Yes |
| Installation Required | None | Required |
| Privacy Level | Standard | Enhanced |
| Performance | Good | Excellent |
Experience the power of AI conversation
Integrate Companion's AI capabilities into your projects
Simple HTTP endpoints for conversation management and AI interactions
Access to multiple advanced AI models including GPT-4o, Claude 3.5, DeepSeek R1, Gemini 2.5, Perplexity, and Llama 3.2 through a single unified API.
Generate your API key to start integrating Companion AI into your applications
```bash
# Create a new conversation
curl -X POST http://localhost:5000/api/conversations \
  -H "Content-Type: application/json" \
  -d '{"title": "My AI Chat"}'

# Send a message
curl -X POST http://localhost:5000/api/chat \
  -H "Content-Type: application/json" \
  -d '{
        "message": "Explain quantum computing",
        "conversation_id": "uuid-string",
        "tools": ["web", "think"]
      }'
```
```python
import requests

# Create conversation
response = requests.post(
    'http://localhost:5000/api/conversations',
    json={'title': 'My AI Chat'}
)
conversation = response.json()

# Send message with AI tools
response = requests.post(
    'http://localhost:5000/api/chat',
    json={
        'message': 'Latest iPhone 15 price in India',
        'conversation_id': conversation['id'],
        'tools': ['web', 'research']
    }
)
result = response.json()
print(result['response'])
```
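The examples on this page do not show where the API key from the setup step goes. Below is a minimal sketch, assuming the key is sent as a Bearer token in an `Authorization` header; the header name and scheme are assumptions, so verify them against the actual API reference.

```python
import requests

API_KEY = 'your-api-key'  # placeholder: generate a key first
# Assumed auth scheme: Bearer token in the Authorization header (not confirmed by these docs).
headers = {'Authorization': f'Bearer {API_KEY}'}

# Create a conversation, passing the assumed auth header
conversation = requests.post(
    'http://localhost:5000/api/conversations',
    json={'title': 'My AI Chat'},
    headers=headers,
).json()

# Send a message in that conversation using the documented tools field
result = requests.post(
    'http://localhost:5000/api/chat',
    json={
        'message': 'Explain quantum computing',
        'conversation_id': conversation['id'],
        'tools': ['web', 'think'],
    },
    headers=headers,
).json()
print(result['response'])
```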
```javascript
// Create conversation
const conversation = await fetch('/api/conversations', {
  method: 'POST',
  headers: {'Content-Type': 'application/json'},
  body: JSON.stringify({title: 'My AI Chat'})
}).then(r => r.json());

// Send message with intelligent caching
const response = await fetch('/api/chat', {
  method: 'POST',
  headers: {'Content-Type': 'application/json'},
  body: JSON.stringify({
    message: 'Current GST rates for electronics',
    conversation_id: conversation.id,
    tools: ['web', 'think']
  })
});
const result = await response.json();
```
```php
<?php
// Create conversation
$conversation = json_decode(file_get_contents(
    'http://localhost:5000/api/conversations',
    false,
    stream_context_create([
        'http' => [
            'method' => 'POST',
            'header' => 'Content-Type: application/json',
            'content' => json_encode(['title' => 'My AI Chat'])
        ]
    ])
), true);

// Send message
$response = json_decode(file_get_contents(
    'http://localhost:5000/api/chat',
    false,
    stream_context_create([
        'http' => [
            'method' => 'POST',
            'header' => 'Content-Type: application/json',
            'content' => json_encode([
                'message' => 'Write a PHP function to calculate tax',
                'conversation_id' => $conversation['id'],
                'tools' => ['code', 'think']
            ])
        ]
    ])
), true);
?>
```
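None of the snippets above select a specific model, even though the unified API exposes GPT-4o, Claude 3.5, DeepSeek R1, Gemini 2.5, Perplexity, and Llama 3.2. Here is a minimal sketch, assuming the chat endpoint accepts a `model` field and using illustrative model identifiers; both the field name and the identifier strings are assumptions to check against the actual API reference.

```python
import requests

# Create one conversation to reuse across models
conversation = requests.post(
    'http://localhost:5000/api/conversations',
    json={'title': 'Model comparison'},
).json()

# 'model' and the identifier strings below are hypothetical; substitute the
# names your Companion instance actually reports.
for model in ['gpt-4o', 'deepseek-r1', 'llama3.2']:
    result = requests.post(
        'http://localhost:5000/api/chat',
        json={
            'message': 'Summarize the trade-offs of local vs cloud inference.',
            'conversation_id': conversation['id'],
            'model': model,
            'tools': ['think'],
        },
    ).json()
    print(f"{model}: {result['response'][:120]}")
```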
Get started with an advanced, free, and open-source AI chat interface.