Help Center

Get help with Companion, find answers to common questions, and learn how to use all features.

Frequently Asked Questions

How do I install Companion?

Clone the repository, install the Python dependencies with pip install -r requirements.txt, and run python main.py. Make sure Ollama is installed if you want to use local models.
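Assuming a Unix-like shell and that the repository URL below is a placeholder (substitute the project's actual URL), the install steps might look like this:

```shell
# Clone the repository (placeholder URL -- replace with the real one)
git clone https://github.com/<your-username>/companion.git
cd companion

# Install the Python dependencies
pip install -r requirements.txt

# Start Companion
python main.py
```

If you plan to use local models, install Ollama separately and make sure it is running before starting Companion.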

Which AI models are supported?

Companion supports all Ollama models locally (DeepSeek, Llama, Codestral, etc.) and cloud models (GPT-4, Claude, Gemini, etc.).

Is my data private when using local models?

Yes! When using local Ollama models, all conversations stay on your machine. No data is sent to external servers.

How do I add more models?

For local models, run ollama pull model-name to download a model. Cloud models are automatically available in Web Mode and need no extra setup.
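For example (the model name here is illustrative; any model from the Ollama library works the same way):

```shell
# Download a model by name, e.g. Llama 3
ollama pull llama3

# Verify which models are installed locally
ollama list
```

Once pulled, the model appears in Companion's local model list.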

Can I use Companion without an internet connection?

Yes! The desktop app works completely offline when using local Ollama models. Only cloud models require an internet connection.

How do I report a bug?

Please create an issue on our GitHub repository with details about the problem and steps to reproduce it.

Still need help?

Can't find what you're looking for? Here are more ways to get help:

Contact Support

Phone: +91 76002 30560

Contact: Rajyaguru Aryan

Available Mon-Fri, 9 AM - 6 PM IST

Contact Us

Phone: +91 76002 30560

Email: aryanrajyaguru2007@gmail.com

We'll respond within 24 hours

Bug Reports

Email: aryanrajyaguru2007@gmail.com

Please include system details and steps to reproduce

Feature Requests

Email: aryanrajyaguru2007@gmail.com

Share your ideas to make Companion better