Get help with Companion, find answers to common questions, and learn how to use all of its features.
How do I install Companion?
Download the repository, install the Python dependencies with pip install -r requirements.txt, and run python main.py. Make sure you have Ollama installed if you want to use local models. A typical sequence is sketched below.
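For reference, here is a minimal install sequence. The repository URL is a placeholder, not the actual address; substitute the real Companion repository. The pip and python commands come straight from the instructions above.

    # Clone the repository (placeholder URL -- use the actual Companion repo)
    git clone https://github.com/<owner>/companion.git
    cd companion

    # Install the Python dependencies
    pip install -r requirements.txt

    # Start Companion
    python main.py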
Which models does Companion support?
Companion supports all Ollama models locally (DeepSeek, Llama, Codestral, etc.) and cloud models (GPT-4, Claude, Gemini, etc.).
Is my data private?
Yes! When using local Ollama models, all conversations stay on your machine. No data is sent to external servers.
How do I add new models?
For local models, run ollama pull model-name. Cloud models are automatically available in Web Mode.
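For example, to pull one of the supported local models (the model tag here is just an example; check the Ollama library for exact names):

    # Download a model to your machine
    ollama pull llama3

    # Verify it is available locally
    ollama list

Once the model appears in ollama list, it can be selected in Companion like any other local model.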
Does Companion work offline?
Yes! The desktop app works completely offline when using local Ollama models. Only cloud models require an internet connection.
How do I report a bug?
Please create an issue on our GitHub repository with details about the problem and steps to reproduce it.
Can't find what you're looking for? Here are more ways to get help:
Phone support
Contact: Rajyaguru Aryan
Phone: +91 76002 30560
Available Mon-Fri, 9 AM - 6 PM IST

Email support
Email: aryanrajyaguru2007@gmail.com
We'll respond within 24 hours

Bug reports
Email: aryanrajyaguru2007@gmail.com
Please include system details and steps to reproduce

Feature requests
Email: aryanrajyaguru2007@gmail.com
Share your ideas to make Companion better