Ollama Client (Preview)

Ollama Client by Shishir Chaurasiya

Privacy-first Ollama Chrome extension to chat with local AI models like LLaMA, Mistral, Gemma — fully offline.

0 (0 reviews) · 28 users


About this extension
🧠 Ollama Client – Chat with Local LLMs Inside Your Browser

Ollama Client is a lightweight, privacy-first Ollama Chrome extension that brings the power of local AI models and offline AI chat directly to your browser. No cloud dependencies. No API keys. No data sent externally.

Just fast, secure, offline AI chat powered by open-source models like LLaMA 3, GPT-OSS, Mistral, Gemma, and CodeLLaMA, all running on your own machine through the Ollama backend.

✨ Works on all Chromium-based browsers (Chrome, Edge, Brave) and Firefox (with additional setup). 100% open-source.

🚀 Key Features
🔌 Local Ollama Integration – Connect to a local Ollama server (no API keys)
💬 In-Browser Chat UI – Lightweight, minimal, fast (Ollama-ui alternative)
🛡️ 100% Local and Private – All storage and inference happen on your device (frontend interface for Ollama)
⚙️ Custom Settings – Control model parameters, themes, prompt templates
🔄 Model Switcher – Switch between models in real time
🔍 Model Search & Pull – Pull models directly in the UI (with progress indicator)
🗑️ Model Deletion with Confirmation – Clean up unused models from the UI
🧳 Load/Unload Models – Manage Ollama memory footprint efficiently
🎛️ Tune Parameters – Temperature, top_k, top_p, repeat penalty, stop sequences (see the sketch after this list)
🧠 Transcript & Page Summarization – Works with YouTube, Udemy, Coursera & web articles
🔊 TTS – Built-in Text-to-Speech via Web Speech API
🗂️ Multi-Chat Sessions – Save/load/delete local chats
🧯 Declarative Net Request (DNR) – Automatic CORS handling (v0.1.3)
📋 Copy & Regenerate – Quickly rerun or copy AI responses
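
For a concrete sense of how a frontend like this typically drives the sampling parameters listed above, here is a minimal TypeScript sketch (illustrative only, not the extension's actual source). It assumes Ollama's standard REST API at http://localhost:11434 and the stock /api/chat endpoint, with the options object carrying temperature, top_k, top_p, repeat_penalty, and stop.

    // Minimal sketch: one chat turn against a local Ollama server.
    // Assumes the standard Ollama REST API at http://localhost:11434;
    // illustrative only, not the extension's actual code.
    interface ChatOptions {
      temperature?: number;
      top_k?: number;
      top_p?: number;
      repeat_penalty?: number;
      stop?: string[];
    }

    async function chatOnce(model: string, prompt: string, options: ChatOptions = {}): Promise<string> {
      const res = await fetch("http://localhost:11434/api/chat", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model,
          messages: [{ role: "user", content: prompt }],
          stream: false, // single JSON response instead of an NDJSON stream
          options,       // temperature, top_k, top_p, repeat_penalty, stop
        }),
      });
      if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
      const data = await res.json();
      return data.message?.content ?? "";
    }

    // Example: lower temperature for more deterministic answers.
    chatOnce("llama3:8b", "Explain CORS in two sentences.", { temperature: 0.2, top_p: 0.9 })
      .then(console.log);

Streaming works the same way with stream: true, in which case the server returns one JSON object per line.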

🧭 Tab Access (Optional)

Want your LLM to understand the content of a page you're viewing? Enable Tab Access in the settings to fetch page content or transcripts for better contextual answers.

✔️ Fully opt-in
✔️ You choose which tabs to share
✔️ Customizable exclude list (regex supported; see the sketch below)
✔️ No tab data ever leaves your device
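
To illustrate how an opt-in exclude list like this can work (the exact matching rules are defined by the extension, so treat this as a sketch), a tab's URL can be tested against the user's regex patterns before any page content is read:

    // Sketch of a regex-based exclude list (illustrative, not the extension's code):
    // a tab's content is only shared if no user-supplied pattern matches its URL.
    const excludePatterns: string[] = [
      "^https://mail\\.google\\.com/", // hypothetical example: never read webmail
      "bank|intranet",                 // hypothetical example: match anywhere in the URL
    ];

    function isTabShareable(url: string): boolean {
      return !excludePatterns.some((pattern) => new RegExp(pattern).test(url));
    }

    console.log(isTabShareable("https://en.wikipedia.org/wiki/Llama"));  // true
    console.log(isTabShareable("https://mail.google.com/mail/u/0/"));    // false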

⚙️ Installation & Setup

1️⃣ Install Ollama Client from the Chrome Web Store
2️⃣ Install Ollama on your machine from https://ollama.com and run ollama serve
3️⃣ Pull your favorite models (e.g., ollama pull llama3:8b, gemma:2b) and start chatting! (A quick connectivity check is sketched below.)
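
Once ollama serve is running, you can confirm the server is reachable and see which models have been pulled. A minimal sketch, assuming Ollama's default /api/tags endpoint on port 11434:

    // Sketch: check that the local Ollama server is up and list installed models.
    // Assumes the default endpoint http://localhost:11434/api/tags.
    async function listLocalModels(): Promise<string[]> {
      const res = await fetch("http://localhost:11434/api/tags");
      if (!res.ok) {
        throw new Error("Ollama server not reachable - is `ollama serve` running?");
      }
      const data = await res.json();
      return (data.models ?? []).map((m: { name: string }) => m.name);
    }

    listLocalModels().then((names) => console.log("Available models:", names));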

Advanced users can customize themes, model parameters, prompt templates, and excluded URLs from the Options page.

🎯 Who Should Use Ollama Client?

👩‍💻 Developers building with or debugging LLMs
📚 Researchers who want local, private LLM interfaces
🎓 Students using AI as study aids on local hardware
🔐 Privacy advocates avoiding cloud AI and APIs
🤖 AI tinkerers and open-source model enthusiasts

⚡ Performance & Hardware Recommendations

💻 8 GB RAM (no GPU): gemma:2b, mistral:7b-q4
💻 16 GB RAM (no GPU): gemma:3b-q4, gemma:2b
🚀 16 GB+ with GPU (6GB VRAM): llama3:8b-q4, gemma:3b
💥 32 GB+ or high-end GPU: llama3:8b, codellama:13b
🔥 RTX 3090+, Apple M3 Max: llama3:70b, mixtral

Note: The Ollama Client extension is a frontend interface only. All LLM generation happens via your local Ollama install, so speed and output quality depend on your system.

🔗 Useful Links

🌐 Chrome Web Store: https://chromewebstore.google.com/detail/ollama-client/bfaoaaogfcgomkjfbmfepbiijmciinjl
📘 Setup Guide: https://ollama-client.shishirchaurasiya.in/ollama-setup-guide
💻 Landing Page: https://ollama-client.shishirchaurasiya.in/
🔒 Privacy Policy: https://ollama-client.shishirchaurasiya.in/privacy-policy
🧑‍💻 GitHub: https://github.com/Shishir435/ollama-client
🐞 Bug: https://github.com/Shishir435/ollama-client/issues

🚀 Start chatting in seconds — private, fast, and fully local AI conversations on your own machine.

Built for developers, researchers, and anyone who values speed, privacy, and offline AI control.

#ollama #privacy #ollama-client #opensource #offline #ollama-ui #ollamachat #gpt-oss

Permissions and Data

Required permissions:

  • Block content on any page
  • Access browser tabs
  • Access your data for all websites

Add-on Links
  • Homepage
  • Support site
  • Support email

Version: 0.1.15
Size: 1.03 MB
Last updated: 7 days ago (August 15, 2025)
Related categories:
  • Web Development
  • Privacy & Security
License: MIT License
Privacy policy: Read the privacy policy for this add-on
Version history: See all versions
Tags:
  • chat
  • youtube