> Apple have done their homework and they will not release ANYTHING unless they think they'll either make money from it or at the very least, not LOSE money from it. It's all about money, nothing else.
> It's coming whether we like it or not now :(
Um, I think the public outcry showed them pretty clearly that they will lose customers and money if they go ahead with this. I think the people championing this inside Apple (some rumors say Cook himself) were expecting this to be some kind of feel-good, "we're thinking of the children" PR piece - not the shitstorm it turned out to be.
I have a feeling it already came in 15.2. They are scanning texts for "nudity" in images. Not that the two are the same thing, or being used in the same way. But I have a feeling CSAM scanning just needs its flag turned on to work.
Thank you for your “feelings”. Except your feelings have no basis in reality.
Why would you blame a company for something that they clearly haven’t done?
I see where you're coming from, but Apple brought this on themselves by announcing this invasive spyware garbage. This is what you get when you undermine your users' trust.
Well, they're gonna push it at some point. Do you think they'll tell you when they do, given the backlash?
Tell you what, put up 20k USD and I’ll go through the formal process of verifying binaries and whatnot. Otherwise, I think my “feeling” still contributes to the discussion. I don’t need to put in 200 hours to prove something I’m not misrepresenting as fact.
CSAM scanning will come. Apple probably won’t tell you.