Reviews for Page Assist - A Web UI for Local AI Models
Page Assist - A Web UI for Local AI Models — Muhammed Nazeem
Review by Elliott
Rated 4 out of 5
— Elliott, 9 months ago
Very nice Ollama frontend. It automatically recognises an Ollama daemon already running on the PC, if installed. It provides a nice way to interact with local LLMs, complete with web search integration which, having now seen it firsthand, really transforms the usefulness of the models; even models that seem weak without web search can be good at summarising information, and become genuinely helpful when they don't have to rely only on their built-in knowledge. This extension is probably the easiest way to get a graphical interface for Ollama running, particularly one with integrated web search.
It does have a few bugs, though. Sometimes, if you close the window too soon after generating an answer, it won't be saved in your history and you will have to generate it again (usually if you close it before all of the statistics at the bottom become available). I have also seen clicking the regenerate button make existing answers suddenly disappear (I think after I switched models). Sometimes questions you asked disappear after a reload even though the answer remains. Another issue is that attaching images and asking vision models about them just results in an error.
I also tried it on my Android phone in Firefox, connecting to Ollama on my laptop, which the app recognises as running. However, on my phone it does not display the drop-down menu for selecting a model or prompt, so I cannot use it. It seems that it does not see any models as installed on Android. Do they have to be installed locally on the phone to work?
Overall, it has flaws, but it is still a fantastic tool, letting you put local models to use conveniently instead of just messing around with them in Ollama.
59 reviews
- Rated 5 out of 5 — HumanistAtypik, 5 days ago
- Rated 5 out of 5 — codefossa, 20 days ago
This easily replaces what Firefox's options provide for use with cloud-based LLMs. I can finally use the shortcuts to summarize and rephrase things for work that I can't send to a cloud. I already had Ollama running locally, and the extension automatically detected it. It was ready to use out of the box. It seems perfect!
- Rated 5 out of 5 — Vaz-Dev, one month ago
Absolutely perfect; the only thing I think could improve is the extension icon, which doesn't match well with modern browser UIs.
Also, I did not understand whether the extension provides tools for Ollama-hosted LLMs to perform web searches, or whether it only uses the Ollama API.
- Rated 5 out of 5 — Firefox user 19258258, 3 months ago
- Rated 5 out of 5 — Firefox user 19244952, 3 months ago
- Rated 5 out of 5 — Vick, 4 months ago
This addon deserves 180K positive reviews (not just 18)! After struggling in vain with Open WebUI, frontends, backends, and bash scripts to make my local LLM work on Linux Mint, I downloaded this and it JUST worked in 10 seconds. That is it. It required nothing else! Thank you, Mr Nazeem. Awesome work! Now I need to find out how to add other models from the UI; or, of course, I can add them from the terminal.
- Rated 5 out of 5 — Firefox user 19104715, 4 months ago
- Rated 5 out of 5 — Henrique, 4 months ago
- Rated 5 out of 5 — FFFire, 6 months ago
- Rated 5 out of 5 — Firefox user 18939203, 7 months ago
- Rated 5 out of 5 — sun-jiao, 7 months ago
- Rated 5 out of 5 — Jean Louis, 7 months ago
🌟 My Exciting Review of Page Assist! 🌟
Wow, Page Assist has completely transformed the way I work online! As someone who's all about maximizing efficiency and keeping my data private, I couldn't be happier with this extension. Let me gush over why it's now my go-to browser buddy! ✨
🖥️ Local AI Magic
First off, the local OpenAI-compatible endpoint support is just brilliant! Running models like llama.cpp locally means I get lightning-fast responses without worrying about privacy. It’s like having a super-smart, personal assistant right at my fingertips! 🔒💨
📚 Versatile Features Galore
- Sidebar for All the Fun: The sidebar is like a Swiss Army knife for productivity. Whether I’m jotting down notes or brainstorming ideas, it’s always there, ready to help. 📝
- Vision Model Wizardry: From analyzing images to extracting text with OCR, the vision models are nothing short of magical. It’s like having a mini-photographer and typist all in one! 📸✍️
- Chat with My Docs: Interacting with PDFs and other documents directly in the sidebar is a dream come true. It’s like having a conversation with my files—how cool is that? 📄💬
🎨 Minimal Web UI
The web UI is sleek and user-friendly. It’s so intuitive that even my grandma could use it without breaking a sweat! 🧙♂️👵
🌐 Internet Search & More
And let’s not forget the internet search capability. Combining local AI power with the vastness of the web is like having the best of both worlds. 🌍🔍
🎉 Conclusion
In short, Page Assist is an absolute gem! It’s made my digital life so much more efficient and enjoyable. If you’re on the fence, just go for it—you won’t regret it! 🚀
Jean Louis
- Rated 5 out of 5 — Iván Campaña, 7 months ago
- Rated 5 out of 5 — Firefox user 14643647, 8 months ago
- Rated 5 out of 5 — plter, 8 months ago
- Rated 5 out of 5 — Firefox user 18867716, 8 months ago
- Rated 5 out of 5 — paulcalebfarmer, 8 months ago
- Rated 5 out of 5 — Jujaga, 8 months ago
This addon is by far one of the best and easiest ways to dive into using local LLMs. It is a severely underrated web interface for Ollama and comes with many very useful features. It is a must-have in your toolkit if you work with local AI models.
- Rated 5 out of 5 — 酱油炒饭, 8 months ago
- Rated 5 out of 5 — aepex, 8 months ago
- Rated 5 out of 5 — 明一, 9 months ago
- Rated 5 out of 5 — Eduardo, 9 months ago
- Rated 5 out of 5 — 晓明, 9 months ago
- Rated 4 out of 5 — Firefox user 10396487, 9 months ago