Ollama Client

by Shishir Chaurasiya

Privacy-first Ollama Chrome extension to chat with local AI models like LLaMA, Mistral, Gemma — fully offline.



About this extension
🧠 Ollama Client – Chat with Local LLMs Inside Your Browser

Ollama Client is a lightweight, privacy-first Ollama Chrome extension that brings the power of local AI models and offline AI chat directly to your browser. No cloud dependencies. No API keys. No data sent externally.

Just fast, secure offline AI chat powered by open-source models like LLaMA 3, GPT-OSS, Mistral, Gemma, and CodeLLaMA, all running on your own machine through the Ollama backend.

✨ Works on all Chromium-based browsers (Chrome, Edge, Brave) and Firefox (with additional setup). 100% open-source.

🚀 Key Features
🔌 Local Ollama Integration – Connect to a local Ollama server (no API keys); see the chat-call sketch after this list
💬 In-Browser Chat UI – Lightweight, minimal, fast (Ollama-ui alternative)
🛡️ 100% Local and Private – All storage and inference happen on your device (frontend interface for Ollama)
⚙️ Custom Settings – Control model parameters, themes, prompt templates
🔄 Model Switcher – Switch between models in real time
🔍 Model Search & Pull – Pull models directly in the UI (with progress indicator)
🗑️ Model Deletion with Confirmation – Clean up unused models from the UI
🧳 Load/Unload Models – Manage Ollama memory footprint efficiently
🎛️ Tune Parameters – Temperature, top_k, top_p, repeat penalty, stop sequences
🧠 Transcript & Page Summarization – Works with YouTube, Udemy, Coursera & web articles
🔊 TTS – Built-in Text-to-Speech via Web Speech API (see the sketch after this list)
🗂️ Multi-Chat Sessions – Save/load/delete local chats
🧯 Declarative Net Request (DNR) – Automatic CORS handling (v0.1.3)
📋 Copy & Regenerate – Quickly rerun or copy AI responses
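To make the local-integration and parameter-tuning items above concrete, here is a minimal sketch of how a frontend like this can call a local Ollama server. It assumes Ollama's standard HTTP API (POST /api/chat on the default port 11434) and a model that has already been pulled; it is an illustration, not the extension's actual source code.

```typescript
// Minimal sketch (not the extension's code): one non-streaming chat call to a
// local Ollama server using its standard HTTP API on the default port 11434.
interface ChatOptions {
  temperature?: number;
  top_k?: number;
  top_p?: number;
  repeat_penalty?: number;
  stop?: string[];
}

async function chatOnce(model: string, prompt: string, options: ChatOptions = {}): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,                                    // e.g. "llama3:8b" (must already be pulled)
      messages: [{ role: "user", content: prompt }],
      stream: false,                            // a single JSON reply instead of NDJSON chunks
      options,                                  // sampling parameters; all optional
    }),
  });
  if (!res.ok) throw new Error(`Ollama returned HTTP ${res.status}`);
  const data = await res.json();
  return data.message.content;                  // the assistant's reply text
}

// Example: lower temperature for more deterministic output.
chatOnce("llama3:8b", "Summarize this page in two sentences.", {
  temperature: 0.2,
  top_p: 0.9,
  repeat_penalty: 1.1,
}).then(console.log).catch(console.error);
```

Setting stream to true instead returns newline-delimited JSON chunks, which is what lets a chat UI render tokens as they arrive.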
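The TTS item relies on the browser's built-in Web Speech API. A minimal, extension-independent usage sketch:

```typescript
// Minimal sketch: read a finished AI response aloud with the Web Speech API.
// speechSynthesis is built into Chromium and Firefox and makes no network
// calls for locally installed voices.
function speak(text: string, lang = "en-US"): void {
  const utterance = new SpeechSynthesisUtterance(text);
  utterance.lang = lang;
  utterance.rate = 1.0;               // normal speaking speed
  window.speechSynthesis.cancel();    // stop anything that is already playing
  window.speechSynthesis.speak(utterance);
}

speak("All inference stayed on this machine.");
```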

🧭 Tab Access (Optional)

Want your LLM to understand the content of a page you're viewing? Enable Tab Access in the settings to fetch page content or transcripts for better contextual answers (a small exclude-list sketch follows the list below).

✔️ Fully opt-in
✔️ You choose which tabs to share
✔️ Customizable exclude list (regex supported)
✔️ No tab data ever leaves your device
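As a rough illustration of how a regex exclude list like this can work, the sketch below uses a hypothetical isTabShareable helper; the pattern list and the function name are illustrative, not the extension's actual API.

```typescript
// Hypothetical sketch of a regex-based exclude list for Tab Access.
// The patterns and helper name are illustrative only.
const excludePatterns: RegExp[] = [
  /^https:\/\/mail\.google\.com\//,   // never share webmail tabs
  /bank|paypal/i,                     // never share anything that looks financial
];

function isTabShareable(url: string): boolean {
  return !excludePatterns.some((pattern) => pattern.test(url));
}

// Only tabs that pass the filter would ever have their content read, and even
// then the text stays on the local machine.
console.log(isTabShareable("https://en.wikipedia.org/wiki/Llama"));  // true
console.log(isTabShareable("https://mail.google.com/mail/u/0/"));    // false
```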

⚙️ Installation & Setup

1️⃣ Install Ollama Client from the Chrome Web Store (or this Firefox Add-ons listing)
2️⃣ Install Ollama on your machine from https://ollama.com and run ollama serve
3️⃣ Pull your favorite models (e.g., ollama pull llama3:8b, gemma:2b) and start chatting!

Advanced users can customize themes, model parameters, prompt templates, and excluded URLs from the Options page; a quick connectivity check is sketched below.
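If you want to confirm that steps 2 and 3 worked before opening the chat UI, the local server can be queried directly. The sketch below uses Ollama's documented /api/tags and /api/pull endpoints on the default port; run it from a context that is allowed to reach localhost (for example Node 18+, or the extension itself, since plain web pages are subject to Ollama's CORS policy unless OLLAMA_ORIGINS is configured). It is a convenience check, not part of the extension.

```typescript
// Sketch: verify the local Ollama server, list pulled models, and pull a new
// model while printing progress. Uses Ollama's /api/tags and /api/pull endpoints.
const OLLAMA = "http://localhost:11434";

async function listLocalModels(): Promise<string[]> {
  const res = await fetch(`${OLLAMA}/api/tags`);
  if (!res.ok) throw new Error("No response from Ollama. Is `ollama serve` running?");
  const data = await res.json();
  return data.models.map((m: { name: string }) => m.name);
}

async function pullModel(model: string): Promise<void> {
  const res = await fetch(`${OLLAMA}/api/pull`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model }),          // e.g. "gemma:2b"
  });
  // The pull endpoint streams newline-delimited JSON status objects.
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";               // keep any partial line for the next chunk
    for (const line of lines.filter(Boolean)) {
      const status = JSON.parse(line);        // e.g. { status, digest?, total?, completed? }
      if (status.total && status.completed != null) {
        console.log(`${status.status}: ${Math.round((100 * status.completed) / status.total)}%`);
      } else {
        console.log(status.status);
      }
    }
  }
}

listLocalModels()
  .then((models) => console.log("Models already available:", models))
  .catch(console.error);
```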

🎯 Who Should Use Ollama Client?

👩‍💻 Developers building with or debugging LLMs
📚 Researchers who want local, private LLM interfaces
🎓 Students using AI as study aids on local hardware
🔐 Privacy advocates avoiding cloud AI and APIs
🤖 AI tinkerers and open-source model enthusiasts

⚡ Performance & Hardware Recommendations

💻 8 GB RAM (no GPU): gemma:2b, mistral:7b-q4
💻 16 GB RAM (no GPU): gemma:3b-q4, gemma:2b
🚀 16 GB+ with GPU (6 GB VRAM): llama3:8b-q4, gemma:3b
💥 32 GB+ or high-end GPU: llama3:8b, codellama:13b
🔥 RTX 3090+, Apple M3 Max: llama3:70b, mixtral

Note: the Ollama Client extension is a frontend interface only. All LLM generation happens in your local Ollama install, so speed and output quality depend on your hardware.

🔗 Useful Links

🌐 Chrome Web Store: https://chromewebstore.google.com/detail/ollama-client/bfaoaaogfcgomkjfbmfepbiijmciinjl
📘 Setup Guide: https://ollama-client.shishirchaurasiya.in/ollama-setup-guide
💻 Landing Page: https://ollama-client.shishirchaurasiya.in/
🔒 Privacy Policy: https://ollama-client.shishirchaurasiya.in/privacy-policy
🧑‍💻 GitHub: https://github.com/Shishir435/ollama-client
🐞 Bug Reports: https://github.com/Shishir435/ollama-client/issues

🚀 Start chatting in seconds — private, fast, and fully local AI conversations on your own machine.

Built for developers, researchers, and anyone who values speed, privacy, and offline AI control.

#ollama #privacy #ollama-client #opensource #offline #ollama-ui #ollamachat #gpt-oss
Permissions and data

Required permissions:

  • Block content on any page
  • Access browser tabs
  • Access your data for all websites

Add-on links
  • Homepage
  • Support site
  • Support email

Version: 0.1.15
Size: 1.03 MB
Last updated: 7 days ago (August 15, 2025)
Related categories:
  • Web development
  • Privacy & security
License: MIT License
Privacy policy: read this add-on's privacy policy
Version history: see all versions
Tags:
  • chat
  • youtube