Facebook and Google landed in hot water with Apple this week after two investigations by TechCrunch revealed the misuse of internal-only certificates — leading to their revocation, which led to a day of downtime at the two tech giants.
Confused about what happened? Here’s everything you need to know.
How did all this start, and what happened?
On Monday, we revealed that Facebook was misusing an Apple-issued enterprise certificate that is only meant for companies to use to distribute internal, employee-only apps without having to go through the Apple App Store. But the social media giant used that certificate to sign an app that Facebook distributed outside the company, violating Apple’s rules.
The app, known simply as “Research,” allowed Facebook unparalleled access to all of the data flowing out of a device. This included access to some of the users’ most sensitive network data. Facebook paid users — including teenagers — $20 per month to install the app. But it wasn’t clear exactly what kind of data was being vacuumed up, or for what reason.
It turned out that the app was a repackaged version of one that Apple had effectively banned from the App Store last year for collecting too much data on users.
Apple was angry that Facebook was misusing its special-issue enterprise certificate to push an app it had already banned, and revoked the certificate — rendering the app unable to open. But Facebook was using that same certificate to sign its other employee-only apps, effectively knocking them offline until Apple re-issued the certificate.
Then, it turned out Google was doing almost exactly the same thing with its Screenwise app, and Apple’s ban-hammer fell again.
What’s the controversy over these enterprise certificates and what can they do?
If you want to develop Apple apps, you have to abide by its rules — and Apple expressly makes companies agree to its terms.
A key rule is that Apple doesn’t allow app developers to bypass the App Store, where every app is vetted to ensure it’s as secure as it can be. It does, however, grant exceptions for enterprise developers, such as to companies that want to build apps that are only used internally by employees. Facebook and Google in this case signed up to be enterprise developers and agreed to Apple’s developer terms.
Each Apple-issued certificate grants a company permission to distribute apps it develops internally — including pre-release versions of the apps it makes, for testing purposes. But these certificates aren’t allowed to be used to distribute apps to ordinary consumers, who must download apps through the App Store.
What’s a “root” certificate, and why is its access a big deal?
Because Facebook’s Research and Google’s Screenwise apps were distributed outside of Apple’s App Store, users had to install them manually — a process known as sideloading. That requires users to go through a convoluted series of steps: downloading the app itself, then opening and trusting either Facebook’s or Google’s enterprise developer code-signing certificate, which is what allows the app to run.
After the app was installed, both companies required users to agree to an additional configuration step — installing a VPN configuration profile — which funneled all of the data flowing out of the user’s phone through a special tunnel to either Facebook or Google, depending on which app was installed.
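On iOS, a VPN configuration profile is just a signed XML property list that, once trusted, tells the device to route its traffic through the operator’s server. Here is a heavily simplified, hypothetical sketch of what such a payload can look like — the display name and endpoint below are illustrative, not taken from the actual Facebook or Google profiles:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>PayloadType</key>
  <string>Configuration</string>
  <key>PayloadDisplayName</key>
  <string>Example Research VPN</string> <!-- illustrative name -->
  <key>PayloadContent</key>
  <array>
    <dict>
      <!-- A com.apple.vpn.managed payload configures the device
           to send its traffic through the operator's server. -->
      <key>PayloadType</key>
      <string>com.apple.vpn.managed</string>
      <key>VPNType</key>
      <string>VPN</string>
      <key>VPN</key>
      <dict>
        <key>RemoteAddress</key>
        <string>vpn.example.com</string> <!-- hypothetical endpoint -->
      </dict>
    </dict>
  </array>
</dict>
</plist>
```

Once a profile like this is installed and trusted, the device’s traffic can be routed through the `RemoteAddress` endpoint — which is what gave both companies their vantage point over users’ network data.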
This is where the Facebook and Google cases differ.
Google’s app collected data and sent it off to Google for research purposes, but it couldn’t access encrypted data — such as the content of any network traffic protected by HTTPS, as most App Store apps and websites are.
Facebook, however, went much further. Its users were asked to go through an additional step: trusting a second type of certificate at the “root” level of the phone. Trusting this Facebook Research root certificate authority allowed the social media giant to look at all of the encrypted traffic flowing out of the device — essentially what we call a “man-in-the-middle” attack. That allowed Facebook to sift through your messages, your emails and any other bit of data that leaves your phone. Only apps that use certificate pinning — rejecting any certificate that isn’t their own — were protected, such as iMessage, Signal and other end-to-end encrypted apps.
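Certificate pinning is what kept apps like iMessage and Signal out of reach. The idea can be sketched in a few lines of Python — the fingerprint and certificate bytes below are stand-ins, not real certificates:

```python
import hashlib

# A pinning app ships with the SHA-256 fingerprint of the exact server
# certificate it expects. It rejects anything else -- even a certificate
# the operating system considers "trusted" because the user installed a
# new root CA (as Facebook's Research profile asked users to do).
PINNED_FINGERPRINT = hashlib.sha256(b"legitimate-server-cert-der").hexdigest()

def certificate_is_pinned(cert_der: bytes) -> bool:
    """Return True only if the presented certificate matches the pin."""
    return hashlib.sha256(cert_der).hexdigest() == PINNED_FINGERPRINT

# The genuine server certificate passes the check...
assert certificate_is_pinned(b"legitimate-server-cert-der")
# ...but a man-in-the-middle proxy's certificate does not, no matter
# which root CA signed it, so the connection is refused.
assert not certificate_is_pinned(b"mitm-proxy-cert-der")
```

This is why adding a root certificate exposed traffic from most apps but not from pinning apps: the OS-level trust store is irrelevant when the app compares the certificate against its own hard-coded fingerprint.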
Google’s app might not have been able to look at encrypted traffic, but the company still flouted the rules — and had its separate enterprise developer code-signing certificate revoked anyway.
What data did Facebook have access to on iOS?
It’s hard to know for sure, but it definitely had access to more data than Google.
Facebook said its app was to help it “understand how people use their mobile devices.” In reality, at the root traffic level, Facebook could have accessed any kind of data that left your phone.
Will Strafach, a security expert with whom we spoke for our story, said: “If Facebook makes full use of the level of access they are given by asking users to install the certificate, they will have the ability to continuously collect the following types of data: private messages in social media apps, chats from instant messaging apps – including photos/videos sent to others, emails, web searches, web browsing activity, and even ongoing location information by tapping into the feeds of any location tracking apps you may have installed.”
Remember: this isn’t “root” access to your phone, like jailbreaking, but root access to the network traffic.
How does this compare to the technical ways other market research programs work?
In fairness, market research apps like these aren’t unique to Facebook or Google. Several other companies, like Nielsen and comScore, run similar programs, but neither asks users to install a VPN or provide root access to the network.
In any case, Facebook already has a lot of your data — as does Google. Even if the companies only wanted to look at your data in aggregate with that of other people, they could still home in on who you talk to, when, for how long and, in some cases, what about. It might not have been such an explosive scandal had Facebook not spent the last year cleaning up after several security and privacy breaches.