Reviews for Page Assist - A Web UI for Local AI Models
Page Assist - A Web UI for Local AI Models by Muhammed Nazeem
Review by jonuno
Rated 4 out of 5
by jonuno, 2 months ago
Great, but how do I copy the text? It's not selectable, so it takes away a lot of the functionality for what I need.
73 reviews
Developer response
posted 3 days ago
Hi, I'm uploading the latest version to the store, but the add-on has not been approved yet. It is still under review. It has been two months, and there is no timeline for when Mozilla Firefox will approve it. I'm sorry.
- Rated 5 out of 5 by Firefox user 14457244, 7 days ago
  Very good extension, but the defaults are a little conservative. For example, the text of the page you are viewing is truncated to only 7 kilobytes (around 1,500 tokens) when sent to the LLM, so for bigger pages the LLM doesn't even know what's on the page. You can change this default in Settings -> Pipeline Settings -> "Maximum Content Size for Full Context Mode". The value is in bytes.
- Rated 5 out of 5 by Firefox user 12490047, a month ago
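The reviewer's 7 KB ≈ 1,500 tokens figure implies roughly 4-5 bytes per token. A rough back-of-the-envelope sketch of that conversion (the bytes-per-token ratio is an assumption here; the actual ratio depends on the tokenizer and the language of the page):

```python
# Estimate how many tokens fit in a given content-size limit,
# assuming ~4.7 bytes per token (an approximation; real tokenizers vary).
def estimated_tokens(max_bytes: int, bytes_per_token: float = 4.7) -> int:
    return int(max_bytes / bytes_per_token)

# The 7 KB default works out to roughly 1,500 tokens:
print(estimated_tokens(7 * 1024))  # -> 1525
```

To budget for a larger page, invert the estimate: a page you want to send in full at ~10,000 tokens would need the limit raised to roughly 47,000 bytes under the same assumption.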
- Rated 5 out of 5 by James, 3 months ago
  Very good browser extension for AI model use. It is programmed in a very clever way: lightweight and resource-efficient, yet fully functional. Compared to all other AI frontends, it is the best one I know. Congratulations! Greetings from Germany :-)
- Rated 5 out of 5 by Zunami, 4 months ago
- Rated 5 out of 5 by Robin Filer, 4 months ago
- Rated 5 out of 5 by Michal Mikoláš, 5 months ago
  Compatible not only with Ollama but with OpenRouter as well. Very customizable, and it just works!
- Rated 5 out of 5 by Firefox user 19622186, 5 months ago
- Rated 4 out of 5 by Firefox user 14478686, 5 months ago
  I'm a novice, but I'm finding Page Assist a useful tool for finding my way with Ollama; if I had to work with the command line only, I'd have given up. This is great.
- Rated 5 out of 5 by Nik, 5 months ago
  Save yourself from all the Open WebUI headache with this amazing add-on.
- Rated 5 out of 5 by HumanistAtypik, 6 months ago
- Rated 5 out of 5 by codefossa, 7 months ago
  This easily replaces what Firefox's built-in options provide for use with cloud-based LLMs. I can finally use the shortcuts to summarize and rephrase things for work that I can't send to the cloud. I already had Ollama running locally, and the extension detected it automatically; it was ready to use out of the box. It seems perfect!
- Rated 5 out of 5 by Vaz-Dev, 7 months ago
  Absolutely perfect. The only thing I think could improve is the extension icon, which doesn't match modern browser UIs very well.
  Also, I did not understand whether the extension provides tools for Ollama-hosted LLMs to perform web searches, or whether it only uses the Ollama API.
- Rated 5 out of 5 by Firefox user 19258258, 9 months ago
- Rated 5 out of 5 by Firefox user 19244952, 9 months ago
- Rated 5 out of 5 by Vick, 10 months ago
  This add-on deserves 180K positive reviews (not just 18)! After struggling in vain with Open WebUI, frontend, backend, and bash scripts to make my local LLM work on Linux Mint, I downloaded this and it just worked in 10 seconds. That's it. It required nothing else! Thank you, Mr. Nazeem. Awesome work! Now I need to find out how to add other models from the UI, or of course I can add them from the terminal.
- Rated 5 out of 5 by Firefox user 19104715, 10 months ago
- Rated 5 out of 5 by Firefox user 18939203, a year ago