
Local LLMs can be many GB each. One can have many of them.


Is that something you actually do with your phone?


Look at all the hype around AI assistants lately. Having those run locally is definitely much better from a privacy perspective.


Many models of many GB each on a phone with only 8GB of DRAM doesn't make sense. Even on Android phones that have more DRAM than that, there's not enough performance for large models to be useful, let alone multiple models consuming tens of GB of storage.


> Many models of many GB each on a phone with only 8GB of DRAM doesn't make sense.

> there's not enough performance for large models to be useful

For iPhone, the Private LLM app currently offers 44 models in its easy-install menu (on a 15 Pro Max), with an average model size of about 2.7 GB.

Installing them all for comparison takes ~120 GB in total.
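As a quick sketch, the storage math works out like this (model count and average size are the figures quoted above, not exact per-model sizes):

```python
# Rough storage estimate for installing every model at once.
# Figures from the Private LLM app menu cited above: 44 models,
# averaging about 2.7 GB each.
num_models = 44
avg_size_gb = 2.7

total_gb = num_models * avg_size_gb
print(f"~{total_gb:.0f} GB needed to hold all models")  # ~119 GB
```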

People could argue about what being useful means, but asking these different models the same question can give quite different (and indeed thought-provokingly different) results.

They all have either different foundations or different fine tuning or something. The same prompt gives rather different answers depending on what model you pick. It's fun that way.

So to compare their outputs, one would either need ~120 GB free to install them all at once, or do a very inefficient and weird dance of uninstalling and reinstalling n times, picking and choosing which ones to try for a given prompt.

And that's just for one offline "gen AI" app. There might be others.

This can be done offline, while on a walk, or whatever, without hitting any servers at all, in the palm of one's hand. Performance is good enough on iPhone for a bunch of tasks/prompts.

So, IDK. Does part of that not make sense? These models aren't all run at once, and they don't need to be in RAM at once. They're all stored on disk, to be used when wanted, and they initialize one at a time when in use. The large disk footprint is just for storing the models; the phone then loads whichever one is wanted.

(Granted, it's not the "same" as hitting huge versions of the models sitting on some beefy infra, but we're talking about mobile devices here and actual current uses for having a bunch of local storage available on them.)


So what I'm hearing is that you don't have a use case for having 44 LLMs on your phone simultaneously, aside from testing different LLMs for the sake of testing different LLMs. There's no way a user would want to keep all 44 LLMs around indefinitely; nobody could possibly remember which one they prefer for which type of query across that many models. I can believe there are uses for having a handful of models around and for trying out new ones when they become available, but I can't see many users bothering to thoroughly evaluate dozens of models at once.


We're getting away from the point. What does it matter what the specific use case is? Who are you to be the arbiter of a valid or invalid use case? This one guy didn't justify his use case to your arbitrary personal satisfaction; I guess that means nobody should even have the option of removable storage, and that we should all be okay with paying an extra few hundred dollars for five more gigabytes of space on our phones (or whatever the insane going rate is now).


> What does it matter what the specific use case is?

Recall where the thread started: https://news.ycombinator.com/item?id=41631782 A question about what use cases people have for large amounts of storage on phones, aside from photos and videos that are easily offloaded to the cloud. When someone replies that they need tons of storage for LLMs, it's quite reasonable to want to understand where that need comes from, how much storage is actually needed for that use case, and whether it's anything that could become popular enough to influence product design decisions or if it's something that will only ever be done by HN nerds. If the best use cases HN can come up with for more storage on phones are things that are unlikely to go mainstream, then we should be assuming that phone manufacturers aren't going to be feeling much pressure to change their storage strategies.

Complaints about product design choices rarely produce interesting discussion when they can be reduced to the complainer failing to distinguish between their personal usage patterns and those of the target market in general. LLMs are still a very new class of workloads for phones, and what an early-adopter HN user is doing with LLMs on a phone could be a preview of an emerging trend that may go mainstream—very on-topic for HN discussion. But in this case it looks more like a use case that will forever remain an outlier.

(It's totally okay to be an outlier and do things with your gadgets that hardly anyone else would choose to do or even think of doing.)


Intuitively, you shouldn't even need to ask this question. There are a billion and a half iPhone users out there. Numbers at this scale can be tricky to comprehend, but the takeaway is that even if just 10% of users want the option (an extremely conservative estimate), that's still 150 million people with a need. That's several times more people than HN gets viewers (not just users) per day, so this is absolutely not a need only floated by "HN nerds." Hell, even 1% of the iPhone userbase vastly outnumbers HN by every metric.

There is obviously demand, and I mean popular, non-nerd demand. Ask your friends or parents whether they would like more space on their iPhone, whether they have ever had to delete an app to make space for another, or whether they have ever been reluctant to install an app because it was too big. A very common one I see, especially with older folks, is that they take too many pictures and agonize over deleting some to free up space. Yes, they could use iCloud, but they're very old and hardly understand software, much less software as a subscription service. Much easier for their grandkid to pop in a 2 TB microSD and call it a day (not to mention several times cheaper, but let's get to that next!). This is not up for debate: demand absolutely exists at a massive scale.

You are right, however, that "manufacturers [specifically Apple] aren't going to be feeling much pressure" to change their design, because right now they are absolutely gouging the general public and making hundreds of millions of dollars off of it. An iPhone 16 Pro with 128 GB is $1000, and the 1 TB model is $1500: they are selling less than a terabyte of extra space for five hundred dollars, over five times the cost of a gold-standard 1 TB microSD card, which costs under $100. If you opt for iCloud instead, you pay a recurring subscription that, over the phone's lifetime, far exceeds the one-time cost of a microSD card. Of course there is no pressure to move away from their current model: they're getting away with highway robbery. Scale that to the 150 million or so people with a valid need for storage (again, an extremely conservative estimate) and you have numbers in the billions. They will ABSOLUTELY ignore a valid need and never fix the problem of not offering a reasonable storage option when their insane pricing brings in so much money.
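The per-gigabyte comparison can be sketched like this (the prices are the ones quoted above—iPhone 16 Pro at $1000 for 128 GB and $1500 for 1 TB, a 1 TB microSD at roughly $100—not current list prices):

```python
# Per-gigabyte cost comparison, using the prices quoted above.
# All figures are the commenter's claims, not verified list prices.
apple_upgrade_usd = 1500 - 1000   # cost of moving from 128 GB to 1 TB
apple_extra_gb = 1024 - 128       # extra storage gained: 896 GB
microsd_usd_per_gb = 100 / 1024   # 1 TB microSD at ~$100

apple_usd_per_gb = apple_upgrade_usd / apple_extra_gb
print(f"Apple upgrade: ${apple_usd_per_gb:.2f}/GB")    # ~$0.56/GB
print(f"microSD:       ${microsd_usd_per_gb:.2f}/GB")  # ~$0.10/GB
print(f"ratio: {apple_usd_per_gb / microsd_usd_per_gb:.1f}x")  # ~5.7x
```

Which lines up with the "over five times the cost" claim above.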


> but the takeaway should be that even if just 10% of users want the option (an extremely conservative estimate),

Do you mean 10% of iPhone users want to keep a few dozen LLMs on their phone (obvious bullshit), or that in aggregate at least 10% of iPhone users would buy more storage if it was cheaper (to say nothing of the customers who always buy the most expensive config as conspicuous consumption)?


I meant 10% of iPhone users have a valid need for additional storage space. That can mean LLMs or just having a lot of apps or photos. If you had read past the first sentence you would have known that...


Yes, and it's awesome.



