For too long the industry has cultivated a “dark arts” reputation that discourages diversity and undermines effective communication. A cybersecurity CEO offers a three-step course correction.
Culture of Cybersecurity

Last year saw some of the largest security breakdowns of all time. Half a billion Marriott Starwood customers had their personal data compromised. More than 100 U.S. university research programs had valuable intellectual property stolen. Ransomware attacks disrupted municipal services in Atlanta and Baltimore. Yet amid these spectacular failures, cybersecurity spending exceeded $80 billion in 2018, more than 2,000 security vendors operate in the U.S. alone, and corporate executives and boards are devoting more time than ever to security risks. In what other arena is widespread success so elusive, and why does this anomaly persist in security?
Yes, cybersecurity is hard and always will be. Attackers will continue to innovate with dynamic new techniques, and opportunities to sow mayhem will proliferate as we bring more of our everyday lives into the digital realm.
But we can do better.
The big problem in security isn’t people, process, or technology. While imperfect, the industry is filled with hardworking and talented people, security awareness and processes are improving rapidly in most organizations, and there is no shortage of good technology.
The big problem is cultural, and it is at the root of all these other shortcomings.
Security too often wraps itself in an immature, dark-arts culture that hurts the people, the process, and the tech. This culture entrenches a lack of diversity in the talent base and deters new entrants; it undermines effective communication with security's real constituents, the corporate and government leaders on whom citizens depend for security in their daily digital lives; and it breeds tolerance for arcane, overly complex, hard-to-use tools.
That this culture exists should not surprise us. There is a mystique to the cyber world. Many security researchers and engineers take pride in having earned their skills after countless energy-drink-fueled, late-night hours, often in classified arenas or in roles otherwise subject to confidentiality. The teams, the approaches, and the tools born of this mind-set and favored across the industry today are therefore mostly self-referential, aimed at experts, and indecipherable, by design, to outsiders. But there are not enough of these hyper-skilled defenders to fill the ranks of organizations facing increasingly sophisticated attacks, and the members of the security community are not the people the industry exists to serve.