Reviews for Page Assist - A Web UI for Local AI Models
Page Assist - A Web UI for Local AI Models by Muhammed Nazeem
Review by Firefox user 18939203
Rated 5 out of 5
by Firefox user 18939203, one year ago

73 reviews
- Rated 5 out of 5 by 飞舞的冰龙, 28 minutes ago: Why can't this great addon be updated? It is already at version 1.5.65 on GitHub.
- Rated 5 out of 5 by Firefox user 14457244, 4 days ago
- Rated 5 out of 5 by MarsLife, 24 days ago
- Rated 5 out of 5 by TA, one month ago: Very good extension, but the defaults are a little conservative. For example, the text of the page you are viewing is truncated to only 7 kilobytes (around 1500 tokens) before being sent to the LLM, so for bigger pages the LLM doesn't even know what's on the page. You can change this default under "Settings -> Pipeline settings -> Maximum Content Size for Full Context Mode"; the value is in bytes.
- Rated 5 out of 5 by Firefox user 12490047, one month ago
- Rated 4 out of 5 by jonuno, 2 months ago: Great, but how do I copy the text? It's not selectable, which takes away a lot of the functionality I need.
- Rated 5 out of 5 by James, 3 months ago: Very good browser extension for AI model use. It is programmed in a very clever way: lightweight and resource-saving, but with full functionality. Compared to all other AI frontends, it is the best one I know. Congratulations! Greetings from Germany :-)
- Rated 5 out of 5 by Zunami, 4 months ago
- Rated 5 out of 5 by Robin Filer, 4 months ago
- Rated 5 out of 5 by Michal Mikoláš, 4 months ago: Compatible not only with Ollama but with OpenRouter as well. Very customizable and just works!
- Rated 5 out of 5 by Firefox user 19622186, 5 months ago
- Rated 4 out of 5 by Firefox user 14478686, 5 months ago: I'm a novice, but I'm finding Page Assist a useful tool for finding my way with Ollama; if I had to work with the command line only, I'd have given up. This is great.
- Rated 5 out of 5 by Nik, 5 months ago: Save yourself from all the Open WebUI headache with this amazing addon.
- Rated 5 out of 5 by HumanistAtypik, 6 months ago
- Rated 5 out of 5 by codefossa, 7 months ago: This easily replaces what Firefox's options provide for use with cloud-based LLMs. I can finally use the shortcuts to summarize and rephrase things for work that I can't have going to a cloud. I already had Ollama running locally, and the extension detected it automatically; it was ready to use out of the box. It seems perfect!
- Rated 5 out of 5 by Vaz-Dev, 7 months ago: Absolutely perfect. The only thing I think could improve is the extension icon, which doesn't match modern browser UIs very well. Also, I did not understand whether the extension provides tools for Ollama-hosted LLMs to perform web searches, or whether it only uses the Ollama API.
- Rated 5 out of 5 by Firefox user 19258258, 9 months ago
- Rated 5 out of 5 by Firefox user 19244952, 9 months ago
- Rated 5 out of 5 by Vick, 10 months ago: This addon deserves 180K positive reviews (not just 18)! After struggling in vain with Open WebUI, frontend, backend, and bash scripts to make my local LLM work on Linux Mint, I downloaded this and it just worked in 10 seconds. That's it; it required nothing else! Thank you, Mr. Nazeem. Awesome work! Now I need to find out how to add other models from the UI, or of course I can add them from the terminal.
- Rated 5 out of 5 by Firefox user 19104715, 10 months ago
- Rated 5 out of 5 by Henrique, one year ago
- Rated 5 out of 5 by FFFire, one year ago
- Rated 5 out of 5 by sun-jiao, one year ago
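
One review above notes that page text is truncated at 7 kilobytes ("around 1500 tokens") before being sent to the model, and that the setting is expressed in bytes. A minimal sketch of that arithmetic, assuming an average of roughly 4.7 bytes per token for English prose (the ratio is an assumption; real tokenizers vary by model and language):

```python
BYTES_PER_TOKEN = 4.7  # assumed average for English text; varies by tokenizer

def approx_tokens(num_bytes: int) -> int:
    """Estimate how many tokens a text of the given byte size yields."""
    return round(num_bytes / BYTES_PER_TOKEN)

def fits_in_limit(page_text: str, limit_bytes: int = 7000) -> bool:
    """Check whether a page's UTF-8 text fits under a byte-based content limit."""
    return len(page_text.encode("utf-8")) <= limit_bytes

# The 7 kB default works out to roughly 1500 tokens:
print(approx_tokens(7000))  # 1489
```

Pages larger than the limit are silently cut off, which is why raising "Maximum Content Size for Full Context Mode" (a byte value) matters for long articles.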