This lack of clarity has shocked many iPhone users. “There were indications that it might have been fixed, but we don’t know,” STC’s Kate O’Flaherty warns in this week’s video. “Even the best security researchers can’t say. Everyone wants this to be fixed, but we don’t know if it’s fixed. And if it isn’t fixed, why isn’t it fixed? What’s going on?” The real problem is Apple’s complete lack of transparency.

First, there are the reports that Pegasus exploited iMessage, iCloud Photo Streaming and Apple Music, suggesting that the architecture stitching these in-house apps and services together has security holes. Remember, the attack likely came from an iCloud account. All of which is made much worse by Apple’s “black box,” which makes it difficult for third-party researchers and software to investigate and then block attacks. One could speculate that a reason Apple has been reluctant to publicly claim, or even hint, that it has blocked the Pegasus exploit is that NSO may simply pull another one from its shelf, threading another needle through Apple’s OS and causing embarrassment when it comes to light.

Second, this exposes an alarming risk with iMessage: this secure messenger provides a tunnel through to your iPhone, through which bad actors with an iCloud account can push malicious exploits that your device then silently processes. And that suggests Apple needs a serious architecture rethink.

And this brings us to the risk from “unknown senders.” The ubiquity of SMS means that, broadly speaking, you can text any phone from your own. WhatsApp, iMessage and others replicate that ubiquity, but they also open your phone to serious new risks.
When WhatsApp was hit with its own Pegasusgate in 2019, a company spokesperson told the media that “engineers had worked around the clock in San Francisco and London to close the vulnerability,” and that all users should upgrade to install the urgent fix “to protect against potential targeted exploits.” Contrast that with Apple this time: “Attacks like the ones described are highly sophisticated, cost millions of dollars to develop, often have a short shelf life, and target specific individuals. While that means they are not a threat to the overwhelming majority of our users, we continue to work tirelessly to defend all our customers.”

Pegasus raised two serious concerns: that Apple’s ecosystem, including iMessage, has dangerous vulnerabilities, and that Apple’s opaque communications and “black box” security made for a very unhealthy mix. Now we can add a third: what happens on your iPhone no longer stays on your iPhone.

Apple’s timing is dreadful. Fears that WhatsApp might share our data with Facebook were bad, albeit predictable. But Apple using technology to snoop on our private, seemingly encrypted iMessages? No-one saw that coming. Apple is the first to introduce client-side content analysis on a flagship, end-to-end encrypted messenger, and the consequences of this first move need to be understood.

We all want to see technology deployed to tackle abuse, and I have suggested that Facebook reverse its plans to encrypt Messenger for this reason, but breaking existing end-to-end encryption is simply that. Screening iCloud Photos on your iPhone is one thing; adding client-side screening of any kind to iMessage on your iPhone is quite another. And so, yet again, we are left with the sinking feeling that nothing is as it should be.
Apple is also launching an on-device screener for photos that users send to iCloud, hashing images to check against content flagged by law enforcement. “We want to help protect children from predators who use communication tools to recruit and exploit them,” Apple says, “and limit the spread of Child Sexual Abuse Material.” This is much less controversial: online photo services already screen content for CSAM.

“The initial potential concern is that this new technology could drive CSAM further underground,” warns ESET’s Jake Moore, “but at least it is likely to catch those at the early stages of their offending. The secondary concern, however, is that it highlights the power Apple holds, with the ability to read what is on devices and match any images to those known on a database. This intrusion is growing in intensity and is often packaged in a way that is for the greater good.”
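The hash-check described above can be illustrated in miniature: fingerprint each photo and test membership against a database of known flagged hashes. The sketch below is a loose illustration, not Apple’s implementation; it uses a plain SHA-256 byte hash and an in-memory set, whereas Apple’s actual system uses a perceptual hash designed to survive resizing and re-encoding, combined with cryptographic matching that keeps the database opaque to the device.

```python
import hashlib
from pathlib import Path


def file_hash(path: Path) -> str:
    """Fingerprint a file's raw bytes with SHA-256.

    Illustrative only: a real screener uses a perceptual hash,
    so that near-duplicate images still match.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


def screen_photos(photos: list[Path], flagged_hashes: set[str]) -> list[Path]:
    """Return the photos whose fingerprints appear in the flagged set."""
    return [p for p in photos if file_hash(p) in flagged_hashes]
```

An exact byte hash like this would miss even a trivially re-saved copy of a flagged image, which is why perceptual hashing matters in practice.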