Hacked Opinions: The legalities of hacking – Casey Ellis
Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focused on disclosure and how pending regulation could impact it. This week CSO is posting the final submissions for the second set of discussions examining security research, security legislation, and the difficult decision of taking researchers to court.
CSO encourages everyone to take part in the Hacked Opinions series. If you have thoughts or suggestions for the third series of Hacked Opinions topics, or want to be included as a participant, feel free to email Steve Ragan directly.
What do you think is the biggest misconception lawmakers have when it comes to cybersecurity?
Casey Ellis, CEO, Bugcrowd (CE): The biggest misconception lawmakers have is that cybersecurity legislation can be approached the same way it is in the physical realm. Typical legislation is built with certain prerequisites for effectiveness, most of which require a physical presence and a physical jurisdiction. The Internet blurs those to the point where they should be treated as if they don't exist.
What advice would you give to lawmakers considering legislation that would impact security research or development?
CE: Measure twice, cut once. Listen to the community of experts who are already at the table to help, and put real effort into understanding how this community interacts with your constituents and your economy to make both safer. We may present as scary hacker people, but we want the same things you do when it comes to the safety of the Internet and its users; we're on the same side here.
If you could add one line to existing or pending legislation, with a focus on research, hacking, or other related security topics, what would it be?
CE: I was asked this question before the changes to the DMCA. I was going to say I'd like to see exemptions to copyright and reverse-engineering laws made available for good-faith security research. I guess the Copyright Office's ears were burning.
Now, given what you've said, why is this one line so important to you?
CE: I fear ambiguity between hacking law and copyright law creating a condition where the legality or illegality of a researcher's work can be determined by the vendor before the fact. That has great potential to drive the industry underground, which is not a good outcome.
Do you think a company should resort to legal threats or intimidation to prevent a researcher from giving a talk or publishing their work? Why, or why not?
CE: It depends. If the bug is fixed and the terms of engagement were responsible disclosure, then definitely not. If the researcher was engaged under a non-disclosure agreement before the fact, then definitely yes. Those are the easy ones. Then there's the extremely blurry bit in the middle, where legal threats ultimately should never be necessary.
What types of data (attack data, threat intelligence, etc.) should organizations be sharing with the government? What should the government be sharing with the rest of us?
CE: Corporate-to-government data should include known bad actors (identity, IP, netblock, etc.) and threat intelligence in the form of bad-actor activity. Vulnerabilities are a tricky one, because corporations have to trust the government to do the right thing with them, but ultimately patterns that suggest 0-days or 0-day behavior should be shared as well.
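To make the kinds of fields Ellis lists concrete, here is a minimal, hypothetical sketch of what a shared bad-actor indicator record might look like. The record type, field names, and values are illustrative assumptions, not any actual sharing standard; real-world programs typically exchange this kind of data in structured formats such as STIX.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical record covering the fields Ellis mentions: bad-actor
# identity, IP, netblock, and observed activity. Field names are
# illustrative, not drawn from any real sharing standard.
@dataclass
class BadActorIndicator:
    identity: str    # handle or attributed name, if known
    ip_address: str  # e.g. "203.0.113.42"
    netblock: str    # e.g. "203.0.113.0/24"
    activity: str    # short description of observed behavior
    observed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Example: a record a corporation might pass to a government clearinghouse.
indicator = BadActorIndicator(
    identity="unknown",
    ip_address="203.0.113.42",
    netblock="203.0.113.0/24",
    activity="credential-stuffing attempts against login endpoints",
)
print(indicator)
```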
I believe everything the government aggregates from the corporate world should be shared back, but the unfortunate tendency with data like this is that someone comes in and deems it classified once the data set reaches a critical mass. This is a known issue in government circles, and I'm hopeful that it will change at some point, but I don't expect anything soon.