The enemy within your PLC
26 October 2010
Recent research* has shown that while people are aware of computer viruses and the need for caution when using the Internet, they still see work-related security as someone else's responsibility. Attitudes like this can have major implications for the security of control systems when a virus or other malware (perhaps one written specifically for the purpose) breaches perimeter security systems, warns David Robinson
The high-profile IT security breach of the Iranian Bushehr nuclear power plant demonstrates just how easy it is for an employee to introduce malware into an infrastructure provider. Fortunately, the Stuxnet virus that infected Bushehr only got as far as staff PCs and did not gain access to the large number of PLCs operating in the main plant. Siemens wasn't so lucky; the Stuxnet virus reportedly targeted the company's industrial control software. Once it has hijacked that software, the Stuxnet code can re-programme the PLC, which then sends new instructions to the machine it controls, with potentially disastrous results.
In 2009, the US government admitted that software had been found that could shut down the nation's power grid. This new type of virus has a boot file built in, which activates as soon as the memory stick is powered up on insertion into a USB port. But it's not just memory sticks that are putting these systems at risk: these days, anyone with a laptop or other device that connects remotely to a wireless network inside a company's firewall is putting that company at risk. It is only a matter of time before Stuxnet evolves to wreak havoc on control systems, and on any other system that an infected laptop or portable device is connected to.
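The autorun mechanism described above relies on Windows reading a small configuration file, autorun.inf, from the root of a newly inserted volume and launching whatever program it names. As a minimal illustration (not a substitute for proper endpoint protection), a first screen for removable media can simply check whether a mounted volume carries such a file before anyone opens it. The function name and policy here are assumptions for the sake of the sketch:

```python
import os

def has_autorun(mount_point):
    """Return True if the mounted volume carries an autorun file.

    A crude first screen for the mechanism described above: older
    Windows versions would execute the program named in autorun.inf
    as soon as a memory stick was inserted. Flagging the file's
    presence lets an operator quarantine the stick before use.
    """
    # File names are case-insensitive on Windows-formatted volumes.
    autorun_names = {"autorun.inf", "autorun.exe"}
    try:
        entries = {name.lower() for name in os.listdir(mount_point)}
    except OSError:
        # Unreadable volume: report nothing found rather than crash.
        return False
    return not autorun_names.isdisjoint(entries)
```

In practice, of course, disabling AutoRun centrally via group policy and scanning all removable media at a kiosk are far more robust controls than any ad-hoc check.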
My company has commissioned research among ordinary workers and found that more than one in two of those surveyed are more cautious with security issues when using their own PC or laptop than they are with their work one, a worrying statistic for any business. And more than three in four workers would expect a pop-up to appear on their screen to alert them to a breach of security, which, of course, is not always going to happen.
In most cases, SCADA systems are independent of the Internet, which is how this very clever virus works: it targets the systems from within; once it is past the perimeter fence, the main vein is easily infected. This has implications not only for processes and controls, but also for the health and safety of plant personnel. The health and safety implications of a computer security breach are generally indirect. If a piece of malware infects and compromises a system, that system can behave abnormally. And when the systems, machines and processes on a network behave abnormally, people and premises are put at risk.
You only have to think of disrupted waste processes, where chemicals or other waste products can get into rivers, quite possibly contaminating water supplies to homes and businesses. That's a point well illustrated by the story of a disgruntled Australian engineer who, some ten years or so ago, breached his company's computer system to dump hundreds of thousands of gallons of sewage into rivers and parks in Australia.
Such examples are rare, but the potential is definitely there. At the other end of the scale, systems corrupted by a simple virus infection or by more sinister malware can cause havoc with relatively simple processes. A breakdown in communications between machines, or a disrupted control system: if there is a delay when someone presses a button, the ramifications can be far worse than just failing to print something out. Everything can have a knock-on effect that might eventually compromise health and safety.
Another case in point is the Slammer worm, which was the culprit a few years ago when it infected the safety monitoring system of the Davis-Besse nuclear power plant in the USA. The worm came in via a contractor's laptop and bypassed the plant's firewall. It then spread from the business network to the plant network, where it found an unpatched Windows server and crashed the plant network. You might not run a nuclear plant, but just imagine an ostensibly innocent contractor coming in and crashing the system you run to keep many thousands of people supplied with communications, water, power or energy.
Much of the problem is a lack of understanding of the risks associated with increased connectivity between former ‘islands of automation’ such as process plants, manufacturing sites, distribution centres and so on, and the business systems operated in companies’ head offices.
Large organisations and individual workers alike often work with a false sense of security. The corporate view might be that so long as there is an IT department, then its staff will have made sure that all the necessary precautions are taken at every point and that no one else needs to worry. The individual worker believes that his or her computer will always pop up a warning if a virus is present, and that necessary action can then be safely taken.
Both are labouring under the misapprehension that all departments are in constant contact with the IT people, that everything is always up to date, and that the authors of malware are not one step ahead of less-than-robust procedures.
David Robinson, UK country manager at Norman Data Defense, has fifteen years' experience of working with companies such as Mitsubishi, Rockwell and Intellution on SCADA and plant intelligence software
* The survey by Populus of more than 1,000 working adults across the UK was commissioned by Norman Data Defense. It showed that eight out of ten people expected their computer to tell them when there was a virus present. Half of them said they were more cautious with their personal computers than with their work ones, and more than two thirds would happily open attachments from a friend.