Ollama Client by SleekAddons
Chat with your local Ollama AI models right from the browser
14 users
About this extension
Chat with your local Ollama AI models right from the browser. Connect to your Ollama server and talk to any model you have installed. No cloud services, no subscriptions, no data leaving your machine.
How it works
Open the popup or side panel, pick a model, and start typing. The extension connects to your local Ollama server and sends your messages directly to it. You can also send the content of the current page to the AI for summaries or explanations.
Features
- Select from all models on your server; they show up automatically
- Send the current page to the AI to summarize, explain, or answer questions about it
- Attach images, code files, CSVs, Markdown, and other text formats
- Built-in commands like Summarize and Explain, plus custom commands that act as system prompts
- Connect to multiple Ollama instances, each with its own URL, name, and optional auth token
- Context usage gauge shows how much of the model context window you have used
- Conversation history saved locally so you can browse past chats or continue where you left off
- Side panel support so you can chat while browsing with page context updating as you switch tabs
- No accounts or API keys required
Who is this for
Anyone who runs Ollama locally and wants a quick way to chat with models without leaving the browser. Developers, writers, researchers, and anyone who prefers local AI over cloud services.
Why install it
Lightweight, fully private, and works without any accounts or subscriptions. Everything runs locally and no data ever leaves your device.
Open Source Software
This extension is fully open source. The complete source code is publicly available on GitHub, so you can review exactly how it works, verify that your data stays private, audit it for security, or build it yourself from source.
Source code: https://github.com/SleekAddons/browser-extensions
Contributions, bug reports, and feature requests are welcome!
Rated 0 by 0 reviewers
Permissions and data
Required permissions:
- Access your data for all websites
Data collection:
- The developer says this extension doesn't require any data collection.
More information
- Version: 1.0.7
- Size: 2.09 MB
- Last updated: 2 months ago (Mar 5, 2026)
- License: MIT License