Google's intentions are good, but its implementation leaves MORE users vulnerable to hacking than before
In 2010 I wrote an article about a flaw that Google discovered and for which it published working exploit code when no fix or mitigation existed. This allowed attackers to immediately start using the flaw to hack Google's own users (in this case, the world). Since then Google has announced a new program, 'Project Zero', whose project page states: "Our objective is to significantly reduce the number of people harmed by targeted attacks." I think this is a really great idea, and a company sponsoring such an initiative is great stuff. Unfortunately, the way this is being handled in practice is harming users: exploits are being published without mitigations or patches available from the vendor (or even from Google).
Microsoft
This month Google announced a vulnerability in Windows and provided exploit code without mitigation instructions and before the vendor had a patch. This obviously upset Microsoft and in the end left more users vulnerable. The bug report for the issue shows MANY people upset that this was published without a patch or mitigation advice. It's important to note that Microsoft had a patch scheduled for 92 days after discovery instead of 90 days. Google was unwilling to wait the 2 days and instead provided attackers with the tools to hack anyone running the vulnerable software, with impunity, for those 2 days. Many people were unhappy with this, including respected security researchers who also presented great arguments on both sides.
Next up, Apple
Google also dropped three 0days on Mac OS X this month, following the same 90-day exploit publication policy. As of this writing no patch is available, and it's unclear whether these flaws were being exploited before, or are currently being exploited. Unfortunately, without a mitigation or patch available for each issue, we users are the ones negatively impacted while Google plays out some vulnerability crusade against vendors. Not cool.
Arguments
Below are the general arguments I've been hearing about this topic, along with some realities that aren't being talked about, or are being intentionally ignored.
1. "Bad guys are already exploiting this, and they're forcing action by the vendor by releasing an exploit"
- Reality: Some people may have known about this unpatched/unmitigated flaw; now many more know about it, and abuse will only increase. Google's stance has already resulted in more hacked users. It's an idealistic perspective to think you're only making people safer, but it's also the equivalent of sticking your head in the sand.
2. "90 days should be plenty of time to fix something
- Reality: Not every flaw can be fixed in 90 days. This is due to product lifecycles and, more often than not, the need to properly QA a fix across MANY different variants of a product. Companies that make software that's supported for a decade or more may have to test a hundred different languages/builds/OS versions. If they fuck it up for one version they'll get shit on, so they have to ensure every impacted version works properly. This can take time. Some companies require a full OS upgrade every few years (often for a price) and simply stop supporting software/security fixes once a release is a few years old. It's also naive to say that every company should just ditch older OS versions when clearly they are making $ with them (otherwise they wouldn't be investing the $ into them). Patch cycles are often a consequence of many concurrent builds/releases, which is why not everything is a simple n-day fix.
3. "You must not support full disclosure"
- My response: I do, actually. There are differences between publishing as an individual and publishing as a company. A company has an ethical duty/responsibility not to expose its own users to unnecessary risk, something an individual doesn't have to deal with. This is clearly a conflict of interest. Releasing an advisory without an exploit (at least until a patch, or some form of mitigation advice for sysadmins, was available) would have accomplished the same effect. More details on better ways to approach disclosure are below.
- Additionally: For individuals the motivations will differ widely. Individuals, unlike companies, have no users and no legal obligations to customers.
4. "If they didn't release this information the vendor wouldn't have fixed it!"
- Reality: Microsoft has been doing a really good job fixing issues for the past several years. They learned the hard way and have spent considerable resources to correct this. Microsoft had a patch planned within days; releasing exploit code without an end-user solution didn't benefit users.
- Reality: Having personally reported a vulnerability to Apple, I have found that, like Microsoft, they are responsive to vulnerability reports. Even assuming they couldn't fix an issue within 120 days, there's no harm in releasing an advisory (without exploit code) at the 90-day mark to apply some public pressure. This has proven very effective against major companies in the past.
Constructive criticism
Rather than being yet another person on the internet bitching without providing constructive advice, here's advice that would allow Google (or anyone else considering this approach at the company level) to continue their great program in a more responsible manner.
- When vendors ignore the vuln report: I've been in this situation myself, and it's beyond frustrating. When this occurs you have to publish an advisory; it's the only way to get a vendor to fix the issue. 90 days to publish an advisory (note: not an exploit) is more than reasonable. If, say, 30 days after publishing the advisory the vendor still has not scheduled a fix or commented, publishing an exploit WITH mitigation advice may be reasonable. Without providing fix advice you're not helping the end user. Rarely will people go through the trouble of writing their own patches and publishing them for the world to see. This does happen, but it isn't as consistent as the vuln hippies make it out to be.
- When vendors are working on a resolution: If the resolution is beyond 90 days, publish an advisory without exploit code instead. Even better, provide or point to a solution that sysadmins and users can use. Without a solution, end users aren't being helped.
- Exploit release: I am for full disclosure of vuln details, and I'm not saying an exploit should never be released. But when a company is publishing details that could hurt its own users, some sort of escalation process should occur first, and the exploit should be released only after a mitigation/patch of SOME SORT is available. Publishing will still ultimately give attackers the means to hack people they wouldn't normally be able to reach, but giving people a chance to protect themselves first is essential.
Conclusion
Google is full of extremely smart and innovative people, but the actions of this program show a lack of respect for end users. Handling vuln disclosure in this way does more harm than good and leads many people (more than a dozen I've spoken with) to believe that the company has some other agenda here, or that the process could at least be improved. Infosec people love that Google is pushing the envelope and funding such an initiative; we just hope they rethink how it's being handled and, at the end of the day, make their users' safety a higher priority.
This post represents my views, and not those of my employer. If you want to debate, I can be found on Twitter @robertauger.
I understand that end users will be frustrated by Project Zero's way of handling exploits; however, I suppose that Google is trying to improve the responsiveness of companies that have vulnerabilities in their software products. Fully exposing these problems to the public shows that software security is an important issue and will force companies to release their fixes much faster.
Posted by: Mark | Mar 13, 2015 1:42:02 AM
Personally, I take your view (aka industry view) of exploit being posted with mitigation info as being professionally responsible. Goog, otoh, deserves a nod for displaying how lethargic patching can be. Not that Goog represents moral high ground, not at all. Goog being the 1200 pound gorilla in the cage with other software Goliaths means lay users are no longer going to be told we don't understand the complexities of the modern leased operating system. This simply pushes the issue into the light, just a little bit harder than before. Brazen and inconsiderate to all parties.
Posted by: Anonymous | Feb 2, 2017 3:58:32 PM