Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focused on disclosure and how pending regulation could impact it. Now, this second set of discussions will examine security research, security legislation, and the difficult decision of taking researchers to court.
CSO encourages everyone to take part in the Hacked Opinions series. If you would like to participate, email Steve Ragan with your answers to the questions presented in this Q&A. The deadline is October 31, 2015. In addition, feel free to suggest topics for future consideration.
What do you think is the biggest misconception lawmakers have when it comes to cybersecurity?
Justin Harvey, CSO at Fidelis Cybersecurity (JH): The biggest misconception is that today’s cybersecurity problems can be solved with technology or threat intelligence [sharing]. The problem today is that there is a shortage of information security skills, and user security awareness is still quite low. Security is a people problem. Lawmakers need to understand this.
What advice would you give to lawmakers considering legislation that would impact security research or development?
JH: Start ‘em young. We need to invest in the future of the nation's cybersecurity talent; there is no technical solution to help us right now. Programs like 1nterrupt out of Boston are teaching middle and high school students about cybersecurity and how to detect and respond to cyber threats.
Programs like this are going to turn the tide in the years to come and build up our collective cybersecurity knowledge. Imagine if there were a shortage of police investigators. You can’t simply hire detectives off the street and hope they develop the necessary skills; they undergo years of specialized training and scenario work, and the job requires a certain aptitude. The same is true for cyber-sleuths.
If you could add one line to existing or pending legislation, with a focus on research, hacking, or other related security topics, what would it be?
JH: President Obama lost a big opportunity with the latest agreement made with President Xi. The agreement was for China and the US not to spy on each other where the aim was commercial or competitive gain.
The problem with this approach is that it basically left all other cyber espionage in play. Had the agreement simply covered all cyber espionage, it would have at least given the US grounds to escalate or take action the next time China committed a cyber attack.
On a related note, I don’t think there’s anything that can save CISA at this point, regardless of how many lines I could add. The government should be focusing on how we speed detection and response capabilities within the private and public sectors.
Threat intelligence [sharing] only speeds up detection when the attack has been seen before, curated, and distributed. Many of the worst [targeted] attacks are known as “zero days,” meaning they take advantage of vulnerabilities in software that the developers don’t even know exist.
Now, given what you've said, why is this one line so important to you?
JH: It’s important because billions of dollars of intellectual property have been sucked out of this country. Gen. Keith Alexander, former director of the NSA, called it the “greatest transfer of wealth in history.”
This has become a national epidemic and there is no sign of it stopping. It is important to me that this stop, because if left unchecked, China will become an economic, political, and military superpower that rivals the United States. It is unknown what could happen in the future.
Do you think a company should resort to legal threats or intimidation to prevent a researcher from giving a talk or publishing their work? Why, or why not?
JH: It depends: is the researcher producing work that is considered intellectual property of the company? Does their employment agreement outline these expectations? If not, then it is a toss-up. I could see cases where the researcher is working on a project that exposes a company’s intellectual property or compromises an upcoming patent.
Again, this all comes down to the employment agreement between the researcher and the company. How about I do you one better? Do you think a government should resort to legal threats or intimidation to prevent a researcher from giving a talk or publishing their work?
What types of data (attack data, threat intelligence, etc.) should organizations be sharing with the government? What should the government be sharing with the rest of us?
JH: Companies should feel free to share threat intelligence and information on attacks with the government, [but] it’s completely optional. This goes for IPs, domains, malware samples, hashes, tactics/techniques/procedures, etc. However, they shouldn’t be mandated to disclose this information if they don’t want to, unless by court order, and even then, there had better be a pressing national security issue at stake.
The government should absolutely be sharing the threat intelligence that the NSA, FBI, and CIA are collecting with companies in a secure, controlled manner. I recognize that there are cases where the government can’t share the intelligence because of national security concerns or an ongoing investigation, but they can do a lot better.