It’s Almost Impossible to Tell if Your iPhone Has Been Hacked
Credit to Author: Lorenzo Franceschi-Bicchierai | Date: Tue, 14 May 2019 19:18:43 +0000
Hackers have been breaking into iPhones, allegedly using a powerful spy tool sold to governments and taking advantage of a previously unknown vulnerability in the popular messaging app WhatsApp.
The hacking tool, as well as the WhatsApp exploit, was made by the infamous Israeli hacking and surveillance tool vendor NSO Group, according to The Financial Times, which first reported the story on Monday. WhatsApp found out about the flaw—and eventually patched it—after a victim got in touch with the digital security research group Citizen Lab, which in turn warned the Facebook-owned company.
The incident called into question the much-vaunted security of the iPhone, which many consider the most secure consumer device on the planet. Some iOS security experts say this is yet another incident showing that iOS is so locked down it is hard, if not impossible, to figure out whether your own iPhone has been hacked.
“The simple reality is there are so many 0-day exploits for iOS,” Stefan Esser, a security researcher who specializes in iOS, wrote on Twitter. “And the only reason why just a few attacks have been caught in the wild is that iOS phones by design hinder defenders to inspect the phones.”
Have a tip about iPhone security? You can contact this reporter securely on Signal at +1 917 257 1382, OTR chat at lorenzofb@jabber.ccc.de, or email lorenzo@motherboard.tv
As of today, there is no specific tool that an iPhone user can download to analyze their phone and figure out if it has been compromised. In 2016, Apple took down an app made by Esser that was specifically designed to detect malicious jailbreaks. Moreover, iOS is so locked down that without hacking or jailbreaking it first, even a talented security researcher can do very little analysis on it. That is why security researchers crave expensive iPhone prototypes that have security features disabled, as a Motherboard investigation revealed earlier this year.
Claudio Guarnieri, a technologist at Amnesty International, who found that a colleague of his was targeted by NSO spyware last year, said that the “irony” is that there are better tools for attackers who want to do forensics on iOS—such as Cellebrite and GrayShift—than for defenders who want to help victims.
“These security controls have made mobile devices extremely difficult to inspect, especially remotely, and particularly for those of us working in human rights organizations lacking access to adequate forensics technology. Because of this, we are rarely able to confirm infections of those who we even already suspect being targeted,” Guarnieri wrote in a mailing list message. “Quite frankly, we are on the losing side of a disheartening asymmetry of capabilities that favors attackers over us, defenders.”
Apple did not respond to a request for comment.
Several iOS security researchers who spoke with Motherboard agree that the iPhone is too locked down for its own good. That makes it very hard for even experts to tell if a device has been compromised without jailbreaking it first, a feat that is not feasible for most users anymore.
“The bad guys will find a way in one way or another. Shouldn’t we enable the good guys to do their job?” said Zuk Avraham, a security researcher who studies iOS attacks, and who is the founder of ZecOps and Zimperium.
Avraham said that in the last few months he’s seen so many targeted attacks against iPhone users that it is “mind-blowing.” He declined to provide more evidence or details about the attacks, however.
Jonathan Levin, a researcher who has written books about iOS exploitation and provides training on iPhone security, said that in his opinion, so few iOS zero-days have been caught because they are worth a lot of money, and thus rarely used.
“To exacerbate the situation, payloads are often tested and perfected for weeks or more before deployment, thus ensuring a high chance of exploitation, and, inversely, a low chance of detection—especially in the case of ‘0 click’ attacks requiring no user interaction,” Levin said.
But unless Apple makes fundamental changes in how iOS is architected, “there is no practical way to tell an iPhone got ‘infected,’” according to a security researcher who goes by the alias Xerub and who organizes 0x41, an iOS-only conference.
A security researcher who has extensive experience developing exploits, who asked to remain anonymous because he didn’t want to openly criticize potential customers, said that the fundamental problem is that iOS is “a bug rich environment,” and that Apple’s strategy only works against “hobbyist attackers” but is “quite counterproductive against professional attackers.”
“Of all the mainstream operating systems kernels, you compare the Windows kernel to the Linux kernel to the OSX kernel and iOS kernel, iOS and OSX kernel is routinely the one with more disastrous bugs,” the security researcher said.
The result is that—for the vast majority of people—the iPhone is still a very secure device. But all software, be it a secure messaging app like WhatsApp or an operating system like iOS, has vulnerabilities. And when those vulnerabilities are exploited on an iPhone, there’s often no way of knowing.