Hacked Opinions: Vulnerability disclosure – Robert Hansen
Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focuses on disclosure and how pending regulation could impact it. In addition, we asked about marketed vulnerabilities, such as Heartbleed, and whether bounty programs make sense.
CSO encourages everyone to take part in the Hacked Opinions series. If you would like to participate, email Steve Ragan with your answers to the questions presented in this Q&A, or feel free to suggest topics for future consideration.
Where do you stand: Full Disclosure, Responsible Disclosure, or somewhere in the middle?
Robert Hansen (RH), Vice President of WhiteHat Labs, WhiteHat Security:
I'm definitely in the middle. There are [types of] vulnerabilities that would cause far more harm than good if they got out. If I know that the vendor in question will act responsibly and close the vulnerability as fast as possible, I'm far more likely to tell them. If I know the vendor in question won't fix the vulnerability quickly, or if I know that the vendor is unethical, I'm more likely to go full disclosure. But most companies do their best and act correctly, so I use responsible disclosure.
If a researcher chooses to follow responsible / coordinated disclosure and the vendor goes silent -- or CERT stops responding to them -- is Full Disclosure proper at this point? If not, why not?
RH: Absolutely. At some point, companies have to learn that their customers' security is important and deal with it correctly. And the researcher will ultimately side with the customer, because they want the vulnerability closed. If the company refuses to close the vulnerability or respond, what other option do they really have? Let the customers stay vulnerable? That feels ethically questionable.
As an example, I've got a long history of going full disclosure against advertising companies. If they're stealing people's privacy, I have less interest in protecting them.
Bug Bounty programs are becoming more common, but sometimes the reward being offered is far less than the perceived value of the bug / exploit. What do you think can be done to make it worth the researcher's time and effort to work with a vendor directly?
RH: Ultimately, this is a supply and demand question: who is going to pay the researcher whatever they want to get paid? Some researchers have ethics that prohibit them from disclosing to anyone other than the company, but if they aren't getting paid enough, they'll probably just stop doing the research altogether.
There are many vulnerabilities that are worth a lot to adversaries, and if the company isn't willing to pay fair market value, or even close, it's not a stretch to say that researchers with pure profit motives are going to look to more questionable markets.
Do you think vulnerability disclosures with a clear marketing campaign and PR process, such as Heartbleed, POODLE, or Shellshock, have value?
RH: Somewhat. I think naming vulnerabilities is largely a holdover from virus research, which has a long history of naming viruses. Later, the practice extended to naming vulnerability types and classes as well. It does make things easier to distinguish and therefore more convenient to talk about. But the hype makes it a bit annoying for researchers who have to deal with the aftermath.
If the proposed changes pass, how do you think Wassenaar will impact the disclosure process? Will it kill full disclosure with proof-of-concept code, or move researchers away from the public entirely, preventing serious issues from seeing the light of day? Or, perhaps, could it see a boom in responsible disclosure out of fear of being on the wrong side of the law?
RH: In most situations it probably won't matter much, but it will impact a handful of companies that do trade in 0days. There is some grey area, though: companies like WhiteHat find 0days in companies' websites on a regular basis, and it's unclear how the rules would affect us and similar companies. Also, there is no accounting for the chilling effect that this type of regulation will have on the industry as a whole.
It won't kill full disclosure. If a researcher wants to go full disclosure, they will certainly find a way. But it may reduce it in cases where individuals don't stand to profit and don't want to risk running into legal issues in the process. The largest effect will be on end consumers and companies, who will remain vulnerable longer than they need to be, due to that chilling effect.
The question people really need to be asking themselves is: in what way does this regulation actually thwart black markets or real adversaries? It's a fairly small subset of people who will care about this regulation who wouldn't also fall under existing regulation. This is more or less a witch hunt that provides very little real value to the companies who use vulnerable vendors or to the consumers who rely on companies to do the right thing.
It's extremely unlikely that someone who is concerned about the law will do anything other than hoard their knowledge or go full disclosure over the dark-net. Why risk going public when the legal system appears to want to punish them at every chance it gets? This is just another dangerous example of poorly thought-out cyber-security legislation that will almost certainly cause more harm than good in an already complex ecosystem. Most well-intentioned cyber-security legislation doesn't stand up against the scrutiny of actual security research needs.