It was before the Stuxnet worm invaded the world of control systems, before Andy Greenberg’s book Sandworm opened the world’s eyes to the danger of cyber-attacks on operational technology (OT). It was a couple of degrees below zero on a winter’s morning in the South African Highveld. We were in the site office of a top-five distributed control systems (DCS) vendor, right in the middle of the world’s largest coal-to-liquids facility. To this day, I visualise my commute to that office whenever I hear the cyber term “defense in depth”. To enter the “secondary area”, I showed my ID card to an armed guard rugged up against the elements in a thick coat. I then drove through the game reserve surrounding the plant, where two herds of impala stood escaping the weather in the radiant heat under one of the massive safety flares.
I reached the imposing fence where the real physical security began. The fence was high enough to stop hand grenades being lobbed over it, with military-grade guard turrets spaced every two hundred metres, a relic from the days when the country was on the brink of civil war. To get through the gate to the “primary area”, I swiped my access card while a guard inspected my vehicle. My passengers had to get out and enter through cage-like turnstiles.
I was early for the meeting, which allowed me to observe the pre-meeting activities. The typical early-2000s big-site boardroom was dominated by a huge table in the middle and a large projector screen at the front. One of the front seats had a mouse and keyboard connected to the room’s PC below the desk, along with a cable for the overhead projector.
Atypically, this boardroom had two PCs connected to two networks: one to the business network using a normal Ethernet cable, and one to the process control network (PCN) via a cable marked with a bright red connector.
The presenter, who was upstanding and trusted enough to make it through all the physical barriers, was getting ready to show us something on his laptop. People started to trickle into the room. The smartest guy in the room took a seat opposite the presenter, two seats down, while the rest filed in. Being a junior, I literally took a back seat.
Then it happened.
The presenter needed an internet connection for his demonstration and started to plug the red-capped PCN cable into his laptop. In a split second, the smartest guy in the room launched himself across the table. He didn’t stop to think about personal safety or dignity. He grabbed the PCN cable and flung it as far away from the laptop as possible.
"That's the PCN cable, you need to use the other one," he said dryly as he retreated back to his seat.
The meeting continued without interruption. But it wasn’t the content of the presentation that left the greatest impression; it was the guy who leapt across the table.
In the fifteen years since, I have often thought about that jump. Why was the smartest guy in the room not willing to endure even a millisecond of connectivity between the presenter’s laptop and the PCN?
Was it because business IT was so infected with viruses that we could not trust even an upstanding person’s laptop?
Was the PCN so defenceless that it couldn’t withstand half a minute of contact with the outside world?
I came to realise the jump was not motivated by the difference in the probability of a cyber-attack coming through the two cables, but rather by the impact of an attack coming through the cable with the red cap. In business IT, the stakes are only ones and zeros; in OT systems, ones and zeros have immediate, and potentially catastrophic, consequences in the physical world.
I have seen OT ones and zeros close the hydrogen feed into a furnace, raising the temperature high enough, for long enough, to change the metallurgy of the internal pipes. The now-brittle pipes, with pure hydrogen inside and a fiery inferno outside, had essentially been converted into a bomb. Thankfully, it was not a malicious, cover-your-tracks cyber-attack, only a very costly error that was detected and promptly dealt with.
In the last fifteen years, OT cyber-security has come a long way, but there are still lessons I take from that jump:
IT and OT should be physically and logically segregated: Without the red cap on the Ethernet cable, there would have been no jump and more risk. If the PCN computer had been in a dedicated room, there would have been no jump and no risk. In today’s world, this means separate networks, separate hosting environments, and separate authentication, a boundary you can actually test, as in the sketch below.
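If I were auditing that segregation today, a first-pass check might look something like this minimal Python sketch, run from a host on the OT side. The business-network addresses and ports are hypothetical placeholders, not details from the plant in my story; any successful connection means the boundary has a hole in it.

```python
# Minimal IT/OT segregation audit, run from a host on the OT side: every
# business-network service below should be UNREACHABLE. The addresses
# and ports are hypothetical placeholders.
import socket

BUSINESS_TARGETS = ["10.1.0.10", "10.1.0.11"]  # placeholder business-LAN hosts
PORTS = [80, 443, 445]                         # common business services

def reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for host in BUSINESS_TARGETS:
    for port in PORTS:
        if reachable(host, port):
            print(f"BOUNDARY HOLE: {host}:{port} is reachable from the OT network")
```

A real audit would of course work in both directions and cover far more than a handful of ports, but the principle is the same: the two worlds should not be able to reach each other.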
The right way should be the easy way: Engineers need a space to conduct reviews on the PCN as a group. Failure to provide a secure, easy place to do so left engineers to “make a plan”, which is how a PCN cable ended up within easy reach in the boardroom. Today, this means straightforward processes for obtaining OT credentials and secure jump boxes for remote access, along the lines of the sketch below.
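To make the jump-box idea concrete, here is a minimal sketch of brokered PCN access using the Python paramiko library. The hostnames and username are hypothetical, and a real deployment would add multi-factor authentication and session recording; the point is that the engineer’s laptop only ever talks to the hardened bastion, never directly to the PCN.

```python
# A minimal sketch of brokered PCN access through a jump box, using the
# paramiko SSH library. Hostnames and the username are hypothetical.
import paramiko

JUMP_HOST = "ot-jump.example.com"  # hardened bastion on the IT/OT boundary
OT_TARGET = "pcn-eng-ws-01"        # engineering workstation inside the PCN

# Connect to the bastion first; host keys must already be known and trusted.
jump = paramiko.SSHClient()
jump.load_system_host_keys()
jump.set_missing_host_key_policy(paramiko.RejectPolicy())
jump.connect(JUMP_HOST, username="engineer")  # key-based auth assumed

# Open a channel through the bastion to the OT target, so the laptop
# itself never holds a direct route into the PCN.
channel = jump.get_transport().open_channel(
    "direct-tcpip", (OT_TARGET, 22), ("127.0.0.1", 0)
)

target = paramiko.SSHClient()
target.load_system_host_keys()
target.set_missing_host_key_policy(paramiko.RejectPolicy())
target.connect(OT_TARGET, username="engineer", sock=channel)

_, stdout, _ = target.exec_command("hostname")
print(stdout.read().decode())

target.close()
jump.close()
```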
Most important of all, everyone should be fanatical about following OT access processes (and actions speak louder than words). My story of a senior engineer sprawled across the table should serve as a timely reminder of the importance of protecting the OT network. Fortunately, with the cyber-security monitoring available today, we no longer have to rely on people jumping over office furniture. High-fidelity alerts, followed by decisive action when breaches are detected, will do just as well; the final sketch below gives a flavour of what that looks like.
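As an illustration, here is what the automated equivalent of that jump might look like: a small Python sketch, using the scapy packet library (which needs capture privileges), that raises an alert the moment an unknown device speaks on the PCN segment. The MAC allow-list and the alert action are placeholders; a real system would feed a SIEM rather than print to a console.

```python
# The automated equivalent of the jump: flag any unknown device the
# moment it speaks on the PCN segment. The MAC allow-list and the alert
# action are placeholders; needs scapy and packet-capture privileges.
from scapy.all import ARP, sniff

KNOWN_PCN_MACS = {
    "00:1b:1b:aa:bb:01",  # DCS controller (placeholder)
    "00:1b:1b:aa:bb:02",  # engineering workstation (placeholder)
}

def on_arp(pkt) -> None:
    """Raise a high-fidelity alert for any MAC not on the allow-list."""
    src = pkt[ARP].hwsrc
    if src not in KNOWN_PCN_MACS:
        # Decisive action goes here: page the on-call engineer, feed the
        # SIEM, or shut the offending switch port via the switch's API.
        print(f"ALERT: unknown device {src} claims {pkt[ARP].psrc} on the PCN")

sniff(filter="arp", prn=on_arp, store=False)
```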