
Ollama Client by Shishir Chaurasiya

Privacy-first Ollama Chrome extension to chat with local AI models like LLaMA, Mistral, Gemma — fully offline.

0 (0 reviews)
30 users

About this extension
🧠 Ollama Client – Chat with Local LLMs Inside Your Browser

Ollama Client is a lightweight, privacy-first Ollama Chrome extension that brings the power of local AI models and offline AI chat directly to your browser. No cloud dependencies. No API keys. No data sent externally.

Just fast, secure, offline AI chat powered by open-source models like LLaMA 3, GPT-OSS, Mistral, Gemma, and CodeLLaMA, all running on your own machine through the Ollama backend.

✨ Works on all Chromium-based browsers (Chrome, Edge, Brave) and Firefox (with additional setup). 100% open-source.
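
Under the hood the principle is simple: the extension is a frontend that talks to the Ollama HTTP API on your own machine. As a rough illustration, and not the extension's actual code, a chat request against a default local Ollama install (port 11434, with llama3:8b as an example model) could look like this:

```typescript
// Sketch of a chat request to a local Ollama server: no API keys, no
// external hosts. Assumes Ollama is running on the default port 11434
// and that llama3:8b has already been pulled.
async function chatLocally(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3:8b",                             // example model
      messages: [{ role: "user", content: prompt }],
      stream: false,                                  // single JSON response
    }),
  });
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  return data.message.content;                        // assistant reply
}

chatLocally("Summarize this page in two sentences.").then(console.log);
```

Nothing in that request leaves localhost, which is the whole point of the extension's privacy model.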

🚀 Key Features
🔌 Local Ollama Integration – Connect to a local Ollama server (no API keys)
💬 In-Browser Chat UI – Lightweight, minimal, fast (Ollama-ui alternative)
🛡️ 100% Local and Private – All storage and inference happen on your device (frontend interface for Ollama)
⚙️ Custom Settings – Control model parameters, themes, prompt templates
🔄 Model Switcher – Switch between models in real time
🔍 Model Search & Pull – Pull models directly in the UI (with progress indicator)
🗑️ Model Deletion with Confirmation – Clean up unused models from the UI
🧳 Load/Unload Models – Manage Ollama memory footprint efficiently
🎛️ Tune Parameters – Temperature, top_k, top_p, repeat penalty, stop sequences (see the sketch after this list)
🧠 Transcript & Page Summarization – Works with YouTube, Udemy, Coursera & web articles
🔊 TTS – Built-in Text-to-Speech via Web Speech API
🗂️ Multi-Chat Sessions – Save/load/delete local chats
🧯 Declarative Net Request (DNR) – Automatic CORS handling (v0.1.3)
📋 Copy & Regenerate – Quickly rerun or copy AI responses
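
The tunable parameters in the list above map onto Ollama's standard request options. Purely as an illustration of what those knobs control, and not the extension's own code, a raw request with explicit options might look like this:

```typescript
// Illustration of the exposed sampling parameters as raw Ollama options.
// Model name and values are examples only.
async function generateWithOptions(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "mistral:7b",
      prompt: "Explain CORS in one paragraph.",
      stream: false,
      options: {
        temperature: 0.7,    // randomness of sampling
        top_k: 40,           // sample only from the 40 most likely tokens
        top_p: 0.9,          // nucleus sampling threshold
        repeat_penalty: 1.1, // discourage repetition
        stop: ["\n\n"],      // stop sequences
      },
    }),
  });
  const data = await res.json();
  console.log(data.response); // generated text
}

generateWithOptions().catch(console.error);
```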

🧭 Tab Access (Optional)

Want your LLM to understand the content of a page you're viewing? Enable Tab Access in the settings to fetch page content or transcripts for better contextual answers.

✔️ Fully opt-in
✔️ You choose which tabs to share
✔️ Customizable exclude list (regex supported; see the sketch after this list)
✔️ No tab data ever leaves your device
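
The extension's real settings format is not shown here, but conceptually the exclude list is a set of regular expressions checked locally before any tab content is read. A minimal sketch with made-up patterns:

```typescript
// Hypothetical sketch only: pattern syntax and naming are assumptions,
// not the extension's documented settings format.
const excludePatterns: RegExp[] = [
  /^https:\/\/mail\.google\.com\//, // never share webmail tabs
  /banking/i,                       // never share anything banking-related
];

function isTabShareable(url: string): boolean {
  return !excludePatterns.some((pattern) => pattern.test(url));
}

console.log(isTabShareable("https://mail.google.com/mail/u/0/")); // false
console.log(isTabShareable("https://example.com/article"));       // true
```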

⚙️ Installation & Setup

1️⃣ Install Ollama Client from the Chrome Web Store
2️⃣ Install Ollama on your machine from https://ollama.com and run ollama serve
3️⃣ Pull your favorite models (e.g., ollama pull llama3:8b, gemma:2b) and start chatting! A quick way to verify the local server is sketched below.

Advanced users can customize themes, model parameters, prompt templates, and excluded URLs from the Options page.
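
Before opening the chat UI, it can help to confirm that the server from step 2 is reachable and that the models from step 3 are installed. A minimal check against Ollama's /api/tags endpoint, assuming the default port:

```typescript
// Sanity check: is the local Ollama server running, and which models
// has it pulled? Assumes the default port 11434.
async function listLocalModels(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) {
    throw new Error("Ollama is not reachable; is `ollama serve` running?");
  }
  const { models } = await res.json();
  for (const m of models) {
    console.log(m.name); // e.g. "llama3:8b", "gemma:2b"
  }
}

listLocalModels().catch(console.error);
```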

🎯 Who Should Use Ollama Client?

👩‍💻 Developers building with or debugging LLMs
📚 Researchers who want local, private LLM interfaces
🎓 Students using AI as study aids on local hardware
🔐 Privacy advocates avoiding cloud AI and APIs
🤖 AI tinkerers and open-source model enthusiasts

⚡ Performance & Hardware Recommendations

💻 8 GB RAM (no GPU): gemma:2b, mistral:7b-q4
💻 16 GB RAM (no GPU): gemma:3b-q4, gemma:2b
🚀 16 GB+ with GPU (6GB VRAM): llama3:8b-q4, gemma:3b
💥 32 GB+ or high-end GPU: llama3:8b, codellama:13b
🔥 RTX 3090+, Apple M3 Max: llama3:70b, mixtral

Note: The Ollama Client extension is a frontend interface only. All LLM generation happens through your local Ollama installation, so speed and output quality depend on your system.

🔗 Useful Links

🌐 Chrome Web Store: https://chromewebstore.google.com/detail/ollama-client/bfaoaaogfcgomkjfbmfepbiijmciinjl
📘 Setup Guide: https://ollama-client.shishirchaurasiya.in/ollama-setup-guide
💻 Landing Page: https://ollama-client.shishirchaurasiya.in/
🔒 Privacy Policy: https://ollama-client.shishirchaurasiya.in/privacy-policy
🧑‍💻 GitHub: https://github.com/Shishir435/ollama-client
🐞 Bug reports: https://github.com/Shishir435/ollama-client/issues

🚀 Start chatting in seconds — private, fast, and fully local AI conversations on your own machine.

Built for developers, researchers, and anyone who values speed, privacy, and offline AI control.

#ollama #privacy #ollama-client #opensource #offline #ollama-ui #ollamachat #gpt-oss
Permissions and data

Required permissions:

  • Block content on any page
  • Access browser tabs
  • Access your data for all websites
More information
Add-on links
  • Homepage
  • Support site
  • Support email
Version
0.2.3
Size
1.32 MB
Last updated
13 hours ago (Aug 25, 2025)
Related categories
  • Web Development
  • Privacy & Security
License
MIT License
Privacy policy
Read the privacy policy for this extension
Version history
  • View all versions
Tags
  • chat
  • youtube
Release notes for 0.2.3
🚀 New Features

Chat Import/Export
  • Export Chat Sessions
    • Export a single chat session as PDF or JSON.
    • Export all chat sessions at once as PDF or JSON.
  • Import Chat Sessions
    • Import chat sessions from single or multiple JSON files (a hypothetical sketch of the file shape follows the notes below).
    • Quickly restore or migrate chats from previous sessions.

🐞 Bug Fix
  • Fixed an issue where the welcome screen was incorrectly showing up on page reload.

Notes
  • Exported files can be saved locally for backup or sharing.
  • Imported sessions are added to your current session list and can be accessed immediately.
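
The exported JSON format is not documented in these notes. Purely as a hypothetical sketch, with field names that are assumptions rather than the extension's actual schema, a portable chat-session file might carry something like this:

```typescript
// Hypothetical shape of an exported chat session; every field name here
// is an assumption for illustration, not the extension's real schema.
interface ExportedChatSession {
  id: string;        // session identifier
  title: string;     // user-visible chat title
  model: string;     // e.g. "llama3:8b"
  createdAt: string; // ISO 8601 timestamp
  messages: {
    role: "user" | "assistant";
    content: string;
  }[];
}

const example: ExportedChatSession = {
  id: "session-001",
  title: "CORS questions",
  model: "llama3:8b",
  createdAt: "2025-08-25T10:00:00Z",
  messages: [
    { role: "user", content: "What is CORS?" },
    { role: "assistant", content: "CORS is a browser mechanism that..." },
  ],
};
```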

What's Changed
  • Feature/import export chat sessions by @Shishir435 in https://github.com/Shishir435/ollama-client/pull/27

Full Changelog: https://github.com/Shishir435/ollama-client/compare/0.2.1...0.2.3
