Show HN: Aide – A customizable Android assistant (voice, choose your provider)
7 points
2 days ago
by yincrash
Comments
20 hours ago
newsdeskx
Does this work with purely local models through Ollama, or do you still need the Ollama server running on another machine? I've been looking for something that actually works offline for basic voice commands.
a day ago
yincrash
Still needs a server. You could run a server locally if you had a model your device could handle, then point Aide at the localhost URL.
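The local-server setup described above could look roughly like this. This is a sketch only: it assumes Ollama's standard CLI and its default port 11434, and the model name is just an example, not something Aide prescribes.

```shell
#!/bin/sh
# Pull a small model and start a local Ollama server, if Ollama is installed.
if command -v ollama >/dev/null 2>&1; then
  ollama pull gemma3:4b   # example model; pick one your hardware can handle
  ollama serve &          # serves an HTTP API on localhost:11434 by default
fi
# Then enter the local endpoint as the provider URL in the app, e.g.:
echo "http://localhost:11434"
```

If the server runs on another machine on the same network, the phone would point at that machine's LAN address instead of localhost.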
a day ago
subscribed
New phones can run Gemma 4 quants pretty nicely. It's a surprisingly good model. Google's Edge Gallery app also offers some models to try.
20 hours ago
subscribed
Missed the edit window: I agree that ideally I'd have a tiny local MoE-style model that could gauge the complexity of a request, route simple requests to the instantly available local agent, and route everything else outside (to one of several models).
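The routing idea above could be sketched very roughly as below. Everything here is hypothetical and illustrative, not part of Aide: a cheap local check decides whether a request stays on-device or goes out to a remote model.

```python
# Hypothetical complexity router: simple commands stay local,
# anything else goes to a remote provider. Names are illustrative.

SIMPLE_INTENTS = {"timer", "alarm", "call", "volume", "flashlight"}

def route(request: str) -> str:
    """Return 'local' for short, simple commands, 'remote' otherwise."""
    words = set(request.lower().split())
    if words & SIMPLE_INTENTS and len(words) <= 6:
        return "local"
    return "remote"

print(route("set a timer for five minutes"))  # local
print(route("summarize my unread emails"))    # remote
```

A real router would presumably use a small classifier model rather than a keyword set, but the control flow would be the same.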
16 hours ago
Looks cool, but I think your maths isn't mathing :)
It's the second day of the first week (as per Google Play), yet it already shows $9.99 (£8.99 in the Play Store).
I'm not saying it's expensive (feature-wise it's awesome), I'm saying it's inconsistent :)))
BTW, is there any chance of a trial key (even for one day)? My phone runs GrapheneOS and I'd need to check that everything I want works (or that I can make it work).
Maybe a beta programme?