Ollama Client by Shishir Chaurasiya
Privacy-first Ollama Chrome extension to chat with local AI models like LLaMA, Mistral, and Gemma, fully offline.
111 users
Extension metadata
About this extension
Ollama Client - Local LLM Chat in Your Browser (Multi-Provider)
A privacy-first, offline AI chat experience for local LLMs with multi-provider support.
No cloud inference. No data leaving your machine.
What It Is
Ollama Client is a browser-based frontend for local LLM servers: it connects to your self-hosted LLM backend and lets you chat inside your browser. It supports Ollama, LM Studio, and llama.cpp servers.
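To make "connects to your self-hosted backend" concrete, here is a minimal sketch (not the extension's actual code) of the kind of request any such frontend makes against Ollama's native /api/chat endpoint on its default port 11434. The model name is an assumption; substitute any model you have pulled locally (e.g. with `ollama pull llama3`):

```ts
// Minimal sketch: ask a locally running Ollama server a question.
// Nothing leaves localhost; the server does all inference.
async function askLocal(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3", // assumed model name; use whatever you have pulled
      messages: [{ role: "user", content: prompt }],
      stream: false, // ask for one complete JSON response
    }),
  });
  const data = await res.json();
  return data.message.content; // Ollama replies with { message: { role, content }, ... }
}

askLocal("Why is the sky blue?").then(console.log);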
Key Features
- Provider & model management: connect multiple local servers, switch models, view provider status
- Chat & session management: streaming responses, stop/regenerate, session history (see the streaming sketch after this list)
- File & webpage context: local file attachments and optional page context for better answers
- Customisation & performance: prompt templates, model parameters, responsive UI
- Privacy & local storage: data stored locally; no external transfer required
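As a hedged illustration of how streaming plus a Stop button can work (a sketch under the same assumed model name, not the extension's internals): with stream: true, Ollama's /api/chat emits NDJSON, one JSON object per line, and aborting the fetch cancels generation mid-response.

```ts
// Sketch: stream tokens from Ollama and allow cancellation via AbortSignal.
async function streamChat(prompt: string, signal: AbortSignal): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    signal, // wiring in an AbortSignal is what makes a Stop button possible
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3", // assumption: any locally pulled model works
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  let answer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop()!; // keep a partial trailing line for the next chunk
    for (const line of lines) {
      if (line.trim()) answer += JSON.parse(line).message.content; // token delta per line
    }
  }
  return answer;
}

// Usage: calling ctrl.abort() stops generation, like the extension's Stop button.
const ctrl = new AbortController();
streamChat("Explain CORS in one paragraph.", ctrl.signal).then(console.log);
```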
Supported Providers
- Ollama (Ollama UI)
- LM Studio (LM Studio client)
- llama.cpp servers (OpenAI-compatible local endpoints / llama.cpp UI)
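Part of what makes one client practical across providers is that LM Studio and llama.cpp's llama-server both expose the OpenAI-compatible /v1/chat/completions route. A minimal sketch, assuming the common default ports (1234 for LM Studio, 8080 for llama.cpp); adjust to your setup:

```ts
// Sketch of the shared OpenAI-compatible route; only the base URL changes
// between providers. Ports below are assumed defaults.
const BASE = "http://localhost:1234/v1"; // llama.cpp: "http://localhost:8080/v1"

async function chatCompletion(prompt: string): Promise<string> {
  const res = await fetch(`${BASE}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "local-model", // many local servers accept any placeholder name
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content; // standard OpenAI response shape
}
```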
Privacy & Local-Only Guarantee
- No cloud inference
- No external data transfer
- All data stays on your machine and local network
Who It's For
- Developers working with local AI models
- Researchers evaluating selfâhosted LLMs
- Students learning with offline AI chat
- Privacy-conscious users who avoid cloud services
Setup Summary
1) Install the extension
2) Run a supported local LLM server
3) Connect via localhost or your LAN IP (see the connectivity check below)
4) Start chatting
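A quick way to confirm step 3 before chatting is to hit the server directly; for Ollama, the /api/tags endpoint lists the models you have pulled. The host below is an assumption: swap in your LAN IP if the server runs on another machine, and note that browser-initiated requests may need the server's CORS settings (for Ollama, the OLLAMA_ORIGINS environment variable) to allow the extension's origin.

```ts
// Assumed host and port (Ollama's default). For a LAN setup, use something
// like "http://192.168.1.20:11434" instead.
const HOST = "http://localhost:11434";

// /api/tags lists locally pulled models; a successful response means the
// server is reachable and step 3 is done.
fetch(`${HOST}/api/tags`)
  .then((res) => res.json())
  .then((data) => data.models.forEach((m: { name: string }) => console.log(m.name)))
  .catch(() => console.error(`No LLM server reachable at ${HOST}`));
```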
Disclaimer
- Performance depends on your hardware and the backend server
- The extension does not include models or run inference itself
Useful Links
Chrome Web Store: https://chromewebstore.google.com/detail/ollama-client/bfaoaaogfcgomkjfbmfepbiijmciinjl
Setup Guide: https://ollama-client.shishirchaurasiya.in/ollama-setup-guide
Landing Page: https://ollama-client.shishirchaurasiya.in/
Privacy Policy: https://ollama-client.shishirchaurasiya.in/privacy-policy
GitHub: https://github.com/Shishir435/ollama-client
Bug reports: https://github.com/Shishir435/ollama-client/issues
Start chatting in seconds: private, fast, and fully local AI conversations on your own machine.
Built for developers, researchers, and anyone who values speed, privacy, and offline AI control.
#ollama #privacy #ollama-client #opensource #offline #ollama-ui #ollamachat #gpt-oss #lm-studio #llama.cpp
Rated 5 by 1 person
Permissions and data
Required permissions:
- Block content on any page
- Access browser tabs
- Access your data for all websites
More information
- Version: 0.6.0
- Size: 2.55 MB
- Last updated: 3 days ago (Feb 8, 2026)
- License: MIT License
- Privacy policy: read the privacy policy for this add-on
- Version history