SEC Charges CISO - What It Means

On October 30, 2023, the SEC announced charges against SolarWinds and its CISO, alleging that they provided misleading information about the company's cybersecurity practices and known risks in the period surrounding the SUNBURST attack. I was surprised by the release. This post digs into the details: what the charges mean in light of evolving SEC requirements, and what they mean for CISOs. TL;DR - the job is getting even harder, but there is also an opportunity to turn this into an empowering development, a forcing function for appropriate funding.

The Charges

Basically everyone in the security industry knows that SolarWinds was the victim of a long-term campaign that targeted its clients through its platform. It has become a poster child (in a bad way) for the problems of supply chain security.

The crux of the complaint is that SolarWinds' public statements about its security were not aligned with what the company knew internally.

The detail seems to be that:

  • Engineers had raised concerns that were not addressed
  • Engineers couldn't keep up with reported incidents and the company knew this
  • The company provided an incomplete disclosure in their Form 8-K filing in December 2020
  • The security function at SolarWinds was aware of security deficiencies but did not report them

This all sounds really bad, right?

Where Is The Bar?

In a way it does sound terrible. But I would counter that every large company has security deficiencies it isn't telling us about. Every company has engineers raising issues that can't be adequately addressed. All of the incentives (stock price, compensation) align toward minimizing the real and perceived impact of issues. Most companies gloss over incidents; their stock prices recover and the markets move on.

But perhaps more importantly, there is no clearly defined set of things we can do to secure a company. There are lots of different standards, but none of them specifies at a fine-grained level of detail what a company does or does not have to do. There is always room for interpretation. This is, in my opinion, in large part due to the inherent complexity of computing systems. It is also why I think (and blogged) that insurance is probably not the right mechanism for this. Ultimately, we can't regulate or insure or sue our way into security if we can't define the difference between a secure system and an insecure system!

This is what I wrote in 2020 when I was advocating for helping markets push for better security:

When it comes to software, or IT in general, rather than regulate or insure, I would argue that we need to find ways to make the difference in security visible so that people buying can reason about it.

In some ways, the SEC is trying to make it so that security information is visible and markets can respond. In that sense, I like the direction.

The Newer SEC Rules

One has to believe that the new SEC rules around disclosure (adopted July 2023) are intended to clarify the responsibilities of companies like SolarWinds. Of course, they can't apply retroactively to the December 2020 Form 8-K, but perhaps SolarWinds even inspired these new rules.

The new rules talk about a lot of things but maybe they boil down to:

  • Disclosing "material" cybersecurity incidents within four days
  • Describing processes around risk, including the risk assessment program
  • Board oversight of cybersecurity

The press release talks about making disclosures and formats consistent so that investors can compare them. I think that is a good idea, but I didn't see any standard format proposed. As such, comparing company responses will be really hard - so hard as to be practically impossible for most investors. It will also be hard for companies to know what to provide.

For all of these reasons, I believe the newer SEC rules will continue to evolve significantly. Many of the comments on previous revisions pointed out that the rules may have scary cybersecurity consequences as well. For example, they may force companies to build programs aligned to SEC reporting rather than to their actual risks, or to disclose information that itself creates cybersecurity risk. If only there were a standard way to know about the security of a company...

The Myth of the Security Standard

In the background, there is an idea that I think sits behind insurance, regulation, and other proposals for more strictly controlling cybersecurity exposure: the belief that it is possible to define a security standard that a company can follow and that directly translates to security.

There are thousands of best practices that need to be employed to secure even a small business. The details of configuration are unique to each company and business domain. Standards like NIST 800-53, ISO 27001, and PCI-DSS provide some level of guidance, but they are not definitive in most cases. For any control in any of these standards, there are multiple ways of thinking about it, and within any reasonably sized company there are many cases to consider, document, and audit against. I have never seen a complete security plan.

SOC 2 carries a very low level of definition about what companies must do. It also suffers from huge variation in implementation detail from auditor to auditor. I basically think SOC 2 is hurting the industry now, but that is a side story here; it certainly has no place in a discussion where investors make decisions about security based on something reported to the SEC.

We can also look at Supply Chain Security or Vendor Management or Third Party Risk as an interesting point of reference. There is no standard way to evaluate a third party's security posture. There are dozens of template standards and questionnaires, every one of which is limited and flawed. Companies with great audit and "risk score" coverage get hacked. If there were a standard that really worked here, people would probably unify around it, but there is not. It is a chaos of dysfunction in my opinion. All of these things are only weak proxies for a sense of an organization's real security.

The bottom line is that while some of these standards are super useful for promoting appropriate coverage of a security posture (and we use them widely, e.g., in securityprogram.io), they are not something you can simply implement and then feel certain you will be secure, or even safe from future SEC scrutiny.

What This All Means for CISOs

If I were a CISO, I would be really nervous right now. What are you going to report in your next public filing with the SEC? I'm having that conversation with several clients right now.

Is the SEC really expecting companies to disclose material risks arising from internal control gaps? CISOs already have a very hard job just trying to avoid breaches. Their jobs just got harder and the stakes got higher, because they can now be personally targeted by regulatory enforcement actions.

I have no doubt that there are CISOs who obscure the quality of their organization's security. That is clearly not a good thing for the company or investors - but as mentioned before, it might align with their incentives. Some percentage of those CISOs are acting in alignment with their organization's leadership. This is another challenge: how can an outside lawsuit determine when to hold a CISO responsible vs. the board vs. other executives? In cases where the pressure is very high or alignment is poor, it is going to be very difficult for organizations to make and justify their security spending decisions and the risks they accept.

There are also "good" CISOs, doing their best and fighting for resources across a business, who are not doing everything investors would like them to do. Making this visible might make executives and boards more likely to invest in security and ultimately close more security holes. To the extent this is true, companies may start to better align their security investment with their risks. It might improve security and safeguard investors. That is the good case, or best-case scenario.

If you look at CISO job tenure data, I suspect a big reason for high turnover (in addition to stress) is a lack of organizational alignment with the folks funding the security program. When CISOs turn over every few years but tech debt lasts a decade or two, how can we really hold CISOs responsible for these problems? Clearly, in the SolarWinds complaint, the deficiencies in the product were woven in well before the CISO took the position. Ultimately, it is the whole leadership team and the board that should be responsible for chronically underfunding security. The CISO is just one part of that machine.

Matt Konda

Founder and CEO of Jemurai
