TrueCrypt cryptographic audit turns up little to fear
Open-source projects, whether free software or under other licenses, supposedly have the advantage that anyone can examine the code for flaws or intentionally inserted back doors.
That's turned out not to be the case, but things are getting better.
Truly cryptic
TrueCrypt is open-source virtual-disk and full-disk encryption software, and it remains the only viable multiplatform option one can recommend that isn't tied to a company. The independent project was developed for a decade by anonymous programmers, who still haven't been identified. It works in Windows XP and later, many flavors of Linux, and Mac OS X.
In 2013, the nonprofit Open Crypto Audit Project (OCAP) was founded and raised over $70,000 to perform a thorough independent audit of TrueCrypt's codebase. The first phase, which examined the bootloader software used only in Windows for full-disk encryption (FDE), finished in April 2014 and found no back doors or "super critical" bugs. (TrueCrypt can't manage an OS X boot volume. Read more about FDE and OS X's FileVault 2 in a previous Private I column.)
Then, abruptly, the project shut down in May 2014 with the release of a new version (7.2) that could only decrypt existing virtual disks and real partitions and drives, not create new ones. The developers put a note at the top of a stripped-down webpage: "WARNING: Using TrueCrypt is not secure as it may contain unfixed security issues." They also implied that the end of official Microsoft support for XP was part of the reason; later versions of Windows can use Microsoft-supplied and third-party full-disk encryption.
Mac users can also create encrypted virtual disk images with Disk Utility and encrypt external volumes with a simple Control-click on a volume in the Finder. But these approaches have two drawbacks: first, they're not portable to other platforms; second, we have to rely on Apple's codebase, which isn't externally and independently audited. TrueCrypt brings portability and, because the code is available for inspection, the opportunity to confirm it isn't hiding secrets.
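For those comfortable with Terminal, OS X's bundled hdiutil tool can produce the same kind of encrypted image Disk Utility creates. The tiny C wrapper below is just a sketch to show the command; the flags come from hdiutil's man page, while the size, filesystem, volume name, and file name are arbitrary examples of my own.

    /* Sketch: create an encrypted disk image via OS X's bundled
       hdiutil tool (flags per `man hdiutil`); hdiutil prompts for
       the image's passphrase when it runs. The size, filesystem,
       and names here are arbitrary examples. */
    #include <stdlib.h>

    int main(void) {
        return system("hdiutil create -size 100m -fs HFS+ "
                      "-encryption AES-256 -volname Private Private.dmg");
    }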
This raised many questions, none of which have been answered. Did the team get tired of the work after a decade? Did they discover a flaw so severe they felt they couldn't fix it? Did a government (one or more) discover their identities and pressure them to install weaker encryption or a back door? It's simply unknown, and none of my security sources have a strong theory as to the reason.
After delays related to the project's shutdown, OCAP today released its long-awaited second audit phase, which looked more deeply at many aspects of TrueCrypt 7.1a, the penultimate release, from 2012, that many people still rely on and that was thought to be secure, even though that had never been proven. The audit also matters because two active projects rely on the TrueCrypt codebase.
CipherShed (in alpha release) and VeraCrypt are "forked" releases, which expand and change the TrueCrypt format. Both support OS X. There remains some concern that TrueCrypt's software license doesn't allow this sort of fork, but the projects are proceeding nonetheless. (The anonymous developers would conceivably have to either uncloak or obtain counsel in order to pursue a copyright violation, and it's not clear whether they would prevail.)
The OCAP report found a few problems, none of which appears to have been designed to allow unwanted access. The most severe is an issue only under Windows, and can be fixed relatively easily. The two descendant projects say they've already fixed some problems they found on their own, and this audit should improve them further.
The rest of the code
Without insinuating anything troubling about Apple, but understanding the nature of government intrusion and gag orders, and remembering the "goto fail" bug, it's valid to ask questions about the company's code.
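As a refresher, "goto fail" was the 2014 flaw in Apple's TLS code in which one accidentally duplicated line made a jump unconditional, so a later verification step was silently skipped. The snippet below is a self-contained analog, not Apple's actual code; the function names are invented for illustration.

    /* A self-contained analog of the "goto fail" bug; the names are
       invented, not Apple's. The second goto isn't governed by the
       if above it, so it always runs, skipping the later check while
       err still holds 0 ("success"). */
    #include <stdio.h>

    static int step_ok(void)    { return 0; }  /* succeeds */
    static int step_fails(void) { return 1; }  /* should abort */

    static int verify(void) {
        int err = 0;
        if ((err = step_ok()) != 0)
            goto fail;
            goto fail;                 /* duplicated line: always jumps */
        if ((err = step_fails()) != 0) /* never reached */
            goto fail;
    fail:
        return err;                    /* returns 0: bogus "success" */
    }

    int main(void) {
        printf("verify() = %d\n", verify()); /* prints 0 */
        return 0;
    }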
While Apple doesn't use the OpenSSL encryption library, we as iOS and OS X users constantly connect to servers and other software that does. Last year brought the discovery of the Heartbleed bug, a truly devastating security risk. Despite OpenSSL's extremely wide use and its collaborative, open-source approach, its code had become a poorly maintained mess over the years, even with a dedicated core of volunteers.
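Heartbleed itself boiled down to trusting a length field supplied by the other end of the connection. What follows is a simplified sketch of that class of bug, not OpenSSL's actual code; the function and parameter names are invented.

    /* Simplified sketch of the Heartbleed class of bug, not OpenSSL's
       actual code; names are invented. The reply is sized and filled
       using the length the peer claimed, not the length actually
       received, so adjacent heap memory leaks back to the attacker. */
    #include <stdlib.h>
    #include <string.h>

    unsigned char *heartbeat_reply(const unsigned char *payload,
                                   size_t received_len,
                                   size_t claimed_len) {
        unsigned char *reply = malloc(claimed_len);
        if (reply == NULL)
            return NULL;
        /* BUG: should first check claimed_len <= received_len and copy
           only received_len bytes; copying claimed_len reads past the
           real payload into neighboring memory. */
        memcpy(reply, payload, claimed_len);
        return reply;
    }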
After Heartbleed, tech companies and foundations poured money into the project, allowing it to hire staff and devote consistent programming time to improvements, and thousands of fixes have followed. Just a few days ago, the group sent out an advance alert about a potential high-severity problem; it turned out to be relatively obscure, but the team was able to find it, patch it, and release a fix in a timely fashion. This is the direction one hopes things continue to go.
More recently, after Julia Angwin of ProPublica wrote about Werner Koch, the developer and maintainer of GNU Privacy Guard (GPG), which I've previously written about, he received grants and funding to continue his efforts at a sustainable, higher level. One man, sometimes living on near-starvation wages, had been responsible for keeping a project of global utility going.
Apple could at some future point be legally unable to resist demands for changes to its software and hardware, or to comment on them publicly. And it doesn't write bug-free code; no one does, and no one can. Whatever procedures Apple has in-house, many eyes can improve code, though there are plenty of cases in other projects in which critical flaws were introduced unnoticed or remained in place for years or decades.
More important than worrying about Apple's competence, integrity, or ability to resist government requests (not just from the United States) is this: competition and alternatives spur improvement. And a little funding, whether from crowdfunding, grants, or individual donations, keeps these projects alive and audited.