Reviews for Page Assist - A Web UI for Local AI Models
Page Assist - A Web UI for Local AI Models by Muhammed Nazeem
Review by Elliott
Rated 4 out of 5
by Elliott, a year ago
Very nice Ollama frontend. It automatically recognises the Ollama daemon already running on your PC if installed. It provides a nice way to interact with local LLMs, complete with web search integration which, having now seen it first-hand, really transforms the usefulness of the models; even those that appear stupid without web search can be good at summarising information, and they become actually helpful when they don't have to rely only on their built-in knowledge. This extension is probably the easiest way to get any graphical interface for Ollama running, particularly with integrated web search.
It does have a few bugs, though. Sometimes, if you close the window too soon after generating an answer, it won't be saved in your history and you will have to generate it again (usually if you do so before all of the statistics at the bottom become available). I have also seen clicking the regenerate button make existing answers suddenly disappear (I think after I switched models). Sometimes questions you asked disappear after a reload even though the answer remains. Another thing is that attaching images and asking vision models about them just results in an error.
I also tried it on my Android phone in Firefox, connecting to Ollama on my laptop, which the app recognises as running. However, on my phone it does not display the drop-down menu for selecting a model or prompt, so I cannot use it. It seems that it does not see any models as installed on Android. Do they have to be installed locally on the phone to work?
Overall, it has flaws, but it is still a fantastic tool, enabling you to put local models to use conveniently instead of just messing around with them in Ollama.
73 reviews
- Rated 5 out of 5 by 飞舞的冰龙, 2 days ago
Why can't this great add-on be updated? It is already at version 1.5.65 on GitHub.
Developer response
posted 2 days ago
Hi, I'm uploading the latest version to the store, but the add-on has not been approved yet. It is still under review. It has been two months, and there is no timeline for when Mozilla Firefox will approve it. I'm sorry.
- Rated 5 out of 5 by Firefox user 14457244, 6 days ago
- Rated 5 out of 5 by MarsLife, a month ago
- Rated 5 out of 5 by TA, a month ago
Very good extension, but the defaults are a little conservative. For example, the text sent to the LLM from the page you are viewing is truncated to only 7 kilobytes (around 1,500 tokens), so for bigger pages the LLM doesn't even know what's on the page. You can change this default under "Settings -> Pipeline settings -> Maximum Content Size for Full Context Mode". The value is in bytes.
- Rated 5 out of 5 by Firefox user 12490047, a month ago
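For context on the reviewer's numbers: the 7 KB-to-roughly-1,500-token conversion follows the common rule of thumb that English text averages around 4 bytes per token (the exact ratio depends on the model's tokenizer). A minimal sketch of that arithmetic; `estimate_tokens` is a hypothetical helper for illustration, not part of the extension:

```python
# Rough estimate of how many tokens fit in the extension's default
# 7 KB page-content limit, using the common ~4 bytes-per-token
# heuristic for English text (an approximation, not an exact count).
def estimate_tokens(num_bytes: int, bytes_per_token: float = 4.0) -> int:
    return int(num_bytes / bytes_per_token)

default_limit_bytes = 7 * 1024  # the 7-kilobyte default mentioned above
print(estimate_tokens(default_limit_bytes))  # prints 1792
```

Raising the byte value in the setting raises the token budget proportionally, subject to the model's own context window.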
- Rated 4 out of 5 by jonuno, 2 months ago
Great, but how do I copy the text? It's not selectable, which takes away a lot of the functionality for what I need.
- Rated 5 out of 5 by James, 3 months ago
Very good browser extension for AI model use. It is programmed in a very clever way: lightweight and resource-saving, yet fully functional. Compared to all other AI frontends, it is the best one I know. Congratulations! Greetings from Germany :-)
- Rated 5 out of 5 by Zunami, 4 months ago
- Rated 5 out of 5 by Robin Filer, 4 months ago
- Rated 5 out of 5 by Michal Mikoláš, 5 months ago
Compatible not only with Ollama, but with OpenRouter as well. Very customizable and it just works!
- Rated 5 out of 5 by Firefox user 19622186, 5 months ago
- Rated 4 out of 5 by Firefox user 14478686, 5 months ago
I'm a novice, but I'm finding Page Assist a useful tool for finding my way with Ollama; if I had to work with the command line only, I'd have given up. This is great.
- Rated 5 out of 5 by Nik, 5 months ago
Save yourself all the open-webui headache with this amazing add-on.
- Rated 5 out of 5 by HumanistAtypik, 6 months ago
- Rated 5 out of 5 by codefossa, 7 months ago
This easily replaces what Firefox's options provide for use with cloud-based LLMs. I can finally use the shortcuts to summarize and rephrase things for work that I can't send to the cloud. I already had Ollama running locally, and it automatically detected it. It was ready to use out of the box. It seems perfect!
- Rated 5 out of 5 by Vaz-Dev, 7 months ago
Absolutely perfect; the only thing I think could improve is the extension icon, which doesn't match modern browser UIs very well.
Also, I did not understand whether the extension provides tools for Ollama-hosted LLMs to perform web searches, or whether it only uses the Ollama API.
- Rated 5 out of 5 by Firefox user 19258258, 9 months ago
- Rated 5 out of 5 by Firefox user 19244952, 9 months ago
- Rated 5 out of 5 by Vick, 10 months ago
This add-on deserves 180K positive reviews (not just 18)! After struggling in vain with Open WebUI, frontends, backends, and bash scripts to make my local LLM work on Linux Mint, I downloaded this and it JUST worked in 10 seconds. That is it. It required nothing else! Thank you, Mr Nazeem. Awesome work! Now I need to find out how to add other models from the UI, or of course I can add them from the terminal.
- Rated 5 out of 5 by Firefox user 19104715, 10 months ago
- Rated 5 out of 5 by Henrique, a year ago
- Rated 5 out of 5 by FFFire, a year ago
- Rated 5 out of 5 by Firefox user 18939203, a year ago
- Rated 5 out of 5 by sun-jiao, a year ago