Cultural issues: IT and OT, security and operations.
By The CyberWire Staff
Nov 1, 2018


There may be a growing awareness among corporate board members of the cyber risks to industrial control systems. That's one of the relatively positive outcomes of the pain inflicted by last year's NotPetya infestations. But there are still other cultures whose members need more work on developing mutual understanding and mutual trust. Several speakers addressed these rifts during SecurityWeek's 2018 ICS Cyber Security Conference.

IT and OT remain different worlds.

While information technology (IT) and operational technology (OT) may be converging, the two worlds remain farther apart, culturally and technically, than one might wish. Indegy's Barak Perelman emphasized the informal modes of information transmission still found in OT (that system was inherited, there were lots of changes made along the way, it's been around for years, and there's no documentation) and a lack of IT appreciation for the realities of industrial systems ("No, I can't just restart the turbine").

Consequence-driven risk management.

LEO Cyber Security's Clint Bodungen stated a first principle: "we do cybersecurity because cyber threats pose a risk to the business." This should be obvious, but the extent to which it seems to be overlooked in practice makes it worth repeating. He argued that cyber risks should be viewed as process hazards. Identifying consequences helps determine safety controls and define the possible impact of events. (He also offered a skeptical take on the familiar risk equation, which depends upon speculative numbers and lends a specious appearance of rigor to what is in fact a questionable and subjective process. In very simplified form that equation is Risk = (Impact × Probability) / Cost. The values are all too often impressionistic at best, a little like those applied to the variables of the famous Drake Equation for estimating the number of communicative extraterrestrial civilizations.)
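
To see why the numbers lend only a specious appearance of rigor, consider a minimal sketch in Python of the simplified equation quoted above. The figures are entirely hypothetical, not drawn from the talk; the point is that equally defensible guesses produce wildly different, yet precise-looking, scores.

```python
# A minimal sketch of the simplified risk equation discussed above:
# Risk = (Impact x Probability) / Cost. All inputs are hypothetical
# guesses, which is exactly the problem: the output looks precise,
# but it is only as good as the estimates feeding it.

def risk_score(impact: float, probability: float, cost: float) -> float:
    """Return a risk score from estimated impact, annual probability, and cost of controls."""
    return (impact * probability) / cost

# Impressionistic inputs: impact and mitigation cost in dollars,
# probability as an annual likelihood.
optimistic = risk_score(impact=1_000_000, probability=0.05, cost=200_000)
pessimistic = risk_score(impact=5_000_000, probability=0.30, cost=200_000)

print(f"Optimistic estimate:  {optimistic:.2f}")   # 0.25
print(f"Pessimistic estimate: {pessimistic:.2f}")  # 7.50
# A 30x spread from equally plausible guesses: the appearance of rigor
# without the substance.
```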

Bodungen argued that risk assessment should be cast in the form of process hazard analysis, with "cyber" viewed as another process hazard. He recommended starting from the engineering and working out from there: determine criticality, identify vulnerabilities, identify communication paths, and manage risk starting with the most critical and most exposed assets.
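
One rough way to picture that working-out from the engineering is to rank assets by criticality and exposure before spending effort anywhere else. The sketch below uses invented asset names, scores, and a simple ordering heuristic of our own; it is an illustration of the idea, not a method presented in the talk.

```python
# A rough illustration (not from the talk): rank assets so that
# mitigation effort starts with the most critical, most exposed
# equipment. Asset names and scores are invented.

from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    criticality: int   # 1 (low) to 5 (process-safety critical)
    exposure: int      # 1 (isolated) to 5 (reachable from the IT network)
    known_vulns: int   # count of identified vulnerabilities

    @property
    def priority(self) -> int:
        # Simple ordering heuristic: criticality and exposure dominate,
        # known vulnerabilities break ties.
        return self.criticality * self.exposure * 10 + self.known_vulns

assets = [
    Asset("historian server", criticality=2, exposure=5, known_vulns=4),
    Asset("turbine controller", criticality=5, exposure=3, known_vulns=1),
    Asset("engineering workstation", criticality=4, exposure=4, known_vulns=2),
]

# Work the list from the top down: most critical and most exposed first.
for asset in sorted(assets, key=lambda a: a.priority, reverse=True):
    print(f"{asset.priority:4d}  {asset.name}")
```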

Attend to realities, and don't be carried away by the arrogance of understanding.

Two security leaders from Sony, Kristin Demoranville and Stuart King, described the realities of assessing security in factories. A security assessment is neither a tour nor a policy enforcement drill. Their argument was that security comes down to people and process, which is neither surprising nor controversial, but the lessons they drew were instructive. It is essential to recognize, they said, that "anything will break production." That is, events that you, the security officer, would not expect to be a problem can in fact disrupt industrial processes. It's important to discover the factory and understand how it works, and it's important to establish trust with the people who work there. "Hanging out on the line and in the break rooms" will give you a realistic appreciation of a facility's risk. You will find, Demoranville and King said, that not everything that looks like a risk is in fact a risk, and that many things that look benign actually do present hazards.

Among the surprises security inspectors find in some production facilities is that bags of sugar and flour tend to block WiFi. This is the sort of detail one realizes only when one seriously engages the physical plant. 

Finally, they said that "a factory is a family." People work there for years, and they know one another, and they don't generally know you, the security assessor. It's important to work to gain their trust.

Responsible disclosure: a case study of establishing trust between ICS vendors and cyber firms.

Yesterday, a joint presentation by CyberX's Phil Neray and Emerson Automation Solutions' Neil Peterson described how responsible disclosure works in the ICS space. Without naming the particular vulnerability CyberX flagged for Emerson and enabled the company to close, they described the process the two firms followed to improve system security:

  1. CyberX researchers first informed ICS-CERT of their discovery.
  2. They contacted Emerson through ICS-CERT, essentially establishing trust and credibility through this mutually trusted third party.
  3. CyberX demonstrated the exploit to Emerson.
  4. Emerson convened its product incident response team.
  5. Emerson made and verified patches.
  6. Emerson privately pushed the patches to its own customers.
  7. Emerson publicly disclosed the issue to warn the community at large.
  8. And, finally, they acknowledged the researchers and their work.

The importance of the trusted intermediary, in this case ICS-CERT, is worth noting. It's also worth noting that there are trust issues between security firms and ICS vendors as well as between security personnel and factory personnel. In both cases it's possible to establish trust, and there are well-understood approaches to doing so.