Reviews for Page Assist - A Web UI for Local AI Models
Page Assist - A Web UI for Local AI Models by Muhammed Nazeem
Review by HumanistAtypik
Rated 5 out of 5
by HumanistAtypik, 6 months ago
- Rated 5 out of 5 by 飞舞的冰龙, a day ago: Why can't this great add-on be updated? It is already at version 1.5.65 on GitHub.
Developer reply
Posted a day ago: Hi, I'm uploading the latest version to the store, but the add-on has not been approved yet. It is still under review. It has been two months, and there is no timeline for when Mozilla Firefox will approve it. I'm sorry.
- Rated 5 out of 5 by Firefox user 14457244, 6 days ago
- Rated 5 out of 5 by MarsLife, 25 days ago
- Rated 5 out of 5 by TA, a month ago: Very good extension, but the defaults are a little conservative.
For example, the text of the page you are viewing is truncated to only 7 kilobytes (around 1,500 tokens) by default before it is sent to the LLM.
So for bigger pages the LLM doesn't even know what's on the page.
You can change this default under "Settings -> Pipeline settings -> Maximum Content Size for Full Context Mode". The value is in bytes.
- Rated 5 out of 5 by Firefox user 12490047, a month ago
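The 7 KB ≈ 1,500 tokens figure quoted in the review above can be sanity-checked with a minimal sketch. This assumes the common rough heuristic of a few bytes of English text per token; the exact ratio depends on the model's tokenizer, and the `bytes_per_token` value here is an assumption, not something Page Assist itself exposes.

```python
# Rough estimate of how many tokens fit in Page Assist's default
# 7 KB "Maximum Content Size for Full Context Mode" limit.
# bytes_per_token is a heuristic for English prose; real tokenizers vary.
def approx_tokens(size_bytes: int, bytes_per_token: float = 4.7) -> int:
    return int(size_bytes / bytes_per_token)

print(approx_tokens(7 * 1024))  # on the order of 1,500 tokens
```

Working backwards the same way, a page you want sent in full at, say, 8,000 tokens would need the setting raised to roughly 35-40 KB.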
- Rated 5 out of 5 by Robin Filer, 4 months ago
- Rated 5 out of 5 by Michal Mikoláš, 5 months ago: Compatible not only with Ollama but with OpenRouter as well. Very customizable and it just works!
- Rated 5 out of 5 by Firefox user 19622186, 5 months ago
- Rated 4 out of 5 by Firefox user 14478686, 5 months ago: I'm a novice, but I'm finding Page Assist a useful tool for finding my way with Ollama; if I had to work with the command line only, I'd have given up. This is great.
- This easily replaces what Firefox's own options provide for use with cloud-based LLMs. I can finally use the shortcuts to summarize and rephrase things for work that I can't have going to a cloud. I already had Ollama running locally, and the extension detected it automatically. It was ready to use out of the box. It seems perfect!
- Absolutely perfect. The only thing I think could improve is the extension icon, which doesn't match modern browser UIs very well.
Also, I did not understand whether the extension provides tools for Ollama-hosted LLMs to perform web searches, or whether it only uses the Ollama API.
- Rated 5 out of 5 by Firefox user 19258258, 9 months ago
- Rated 5 out of 5 by Firefox user 19244952, 9 months ago
- Rated 5 out of 5 by Vick, 10 months ago: This add-on deserves 180K positive reviews (not just 18)! After struggling in vain with Open WebUI, frontends, backends, and bash scripts to make my local LLM work on Linux Mint, I downloaded this and it just worked in 10 seconds. That's it. It required nothing else! Thank you, Mr. Nazeem. Awesome work! Now I need to find out how to add other models from the UI, or of course I can add them from the terminal.
- Rated 5 out of 5 by Firefox user 19104715, 10 months ago
- Rated 5 out of 5 by Henrique, a year ago
- Rated 5 out of 5 by FFFire, a year ago
- Rated 5 out of 5 by Firefox user 18939203, a year ago
- Rated 5 out of 5 by sun-jiao, a year ago