Security Risks
http://ecmweb.com/print/computers-amp-software/security-risks
Until a few years ago, breaches in cybersecurity, largely consisting of data theft and invasions of privacy for
financial gain, were strictly the domain of information technology (IT). However, in March 2007, researchers at
the Idaho National Laboratory, Idaho Falls, Idaho, at the behest of the U.S. Department of Homeland Security
(DHS), demonstrated the vulnerability of a diesel generator by hacking in from a remote access point and
causing it to self-destruct. Although the official report on the experiment, named Project Aurora, is classified, a
video showing the generator malfunctioning with thick smoke coming out of it captivated viewers on the
Internet. Project Aurora demonstrated the potential of malicious cyber attacks to compromise critical
infrastructure through manipulation of industrial control systems (ICSs). However, at the time, this type of
attack seemed only a remote possibility.
Although Stuxnet used Windows Explorer to covertly place an encrypted file from the USB stick onto computers, German ICS security expert Ralph Langner eventually figured out that its true aim was Simatic WinCC Step7 software, an ICS product from German electronics and electrical engineering company Siemens, which is installed to run the PLCs that control motors, valves, and switches in industries worldwide.
Langner's conclusion was a surprising revelation. Previously, ICSs and automation equipment were proprietary and ran on isolated, stand-alone networks. Because of this, they were believed to be off the radar of cyber attackers, or simply unprofitable for hackers mostly seeking financial gain. An alarming lesson of Stuxnet, therefore, was its ability to infect machines that weren't connected to a computer network or the Internet. Furthermore, in recent years, ICSs have become more interoperable and are more often connected to the Internet. Creating more reliable and usable systems through automation and remote access can actually make them more vulnerable.
"Arguably, Stuxnet was the first case of a sophisticated attack that was meant to damage equipment," says Joe Weiss, managing partner of San Francisco Bay Area-based Applied Control Solutions, which provides consulting services for the optimization and security of ICSs. Weiss, along with Langner, is recognized as one of the few experts in ICS security. A 35-year veteran of industrial instrumentation, controls, and automation, with more than 10 years in ICS cybersecurity, he serves on numerous ICS security standards organizations and is the author of the book Protecting Industrial Control Systems from Electronic Threats.
"Before, this was pretty simple," Weiss continues. "If you had a problem, it was because a valve had a bad design, and that was easy enough to fix. Or you had a problem with boiler control software. When you fixed the boiler control software, it was done. Now this is ongoing. Every time you make a change, the question is: Have you done something to create a cyber vulnerability that wasn't there before?"
Law of Opposite
Unlike Stuxnet, not all cyber incidents involving control systems are intentional. For example, the natural gas pipeline failure in San Bruno, Calif., which killed eight people in September 2010, was most likely caused by work being done on an uninterruptible power supply (UPS) system located about 39.33 miles southeast of the accident site, according to the National Transportation Safety Board's report. During the course of this work, the power supply from the UPS system to the supervisory control and data acquisition (SCADA) system malfunctioned so that instead of supplying a predetermined output of 24V of direct current (DC), the UPS system supplied approximately 7VDC or less to the SCADA system. Because of this anomaly, the electronic signal to the regulating valve for Line 132 was lost; the loss of the electrical signal resulted in the regulating valve moving from partially open to the fully open position, as designed.
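The failure mode described above, a SCADA input acting on a wildly out-of-range 24VDC supply, can be illustrated with a simple plausibility check. The sketch below is hypothetical: the function names, tolerance band, and fail-safe behavior are assumptions for illustration, not details of the actual San Bruno system.

```python
# Hypothetical sketch of a supply-voltage sanity check for a SCADA input.
# Names and thresholds are illustrative assumptions, not from any real system.

NOMINAL_VDC = 24.0   # the predetermined UPS output described above
TOLERANCE = 0.2      # accept +/-20% of nominal (assumed band)

def supply_ok(measured_vdc: float) -> bool:
    """Return True if the measured DC supply is within the assumed tolerance."""
    low = NOMINAL_VDC * (1 - TOLERANCE)
    high = NOMINAL_VDC * (1 + TOLERANCE)
    return low <= measured_vdc <= high

def drive_valve(measured_vdc: float, requested_position: float) -> float:
    """Refuse to act on an implausible supply instead of failing open."""
    if not supply_ok(measured_vdc):
        raise RuntimeError("supply voltage out of range; holding last valve position")
    return requested_position

print(supply_ok(24.0))  # nominal case -> True
print(supply_ok(7.0))   # the ~7VDC anomaly -> False
```

In the actual incident the valve opened fully "as designed" when the signal was lost; a plausibility check like this is only one possible mitigation, and real systems would implement loss-of-signal handling in hardware and safety logic rather than application code.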
100 Days
Because of its stealthy nature, the Stuxnet worm went undiscovered for more than a year, as it did not directly affect the performance and safety of its targeted nuclear plant. Moreover, because the vulnerability it exploited was a design flaw and not patchable, DHS didn't even call it a vulnerability, says Weiss. "It wasn't a design deficiency for reliability or safety, only for security," he says.
However, since the discovery of Stuxnet and the vulnerabilities it exploited in the Siemens controllers, there's been a move toward finding similar vulnerabilities in other controllers, particularly hard-coded default passwords. "That's common in controllers from many of our vendors," says Weiss. "They've always been there, but now people are starting to find them."
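One way end-users can act on findings like these is to audit their own devices against published lists of default or hard-coded credentials. A minimal sketch, with entirely hypothetical device names and credential pairs:

```python
# Illustrative sketch: auditing an inventory against known default credentials.
# The credential list and device records here are hypothetical examples.

KNOWN_DEFAULTS = {
    ("admin", "admin"),
    ("admin", "password"),
    ("operator", "operator"),
}

def uses_default_credentials(username: str, password: str) -> bool:
    """Flag credential pairs that match a published default list."""
    return (username, password) in KNOWN_DEFAULTS

inventory = [
    {"device": "plc-line-1", "user": "admin", "password": "admin"},
    {"device": "hmi-station", "user": "eng", "password": "S7rong-pass"},
]

flagged = [d["device"] for d in inventory
           if uses_default_credentials(d["user"], d["password"])]
print(flagged)  # ['plc-line-1']
```

Note that, as Weiss cautions later in this article, a hard-coded password often cannot simply be changed on the device itself; flagging it is a prompt for compensating controls, not necessarily for a password change.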
Currently, researchers Billy Rios and Terry McCorkle share their research only with ICS-CERT. However, they plan to go public with a database of their findings. "At some point, we're going to make these vulnerabilities public," says Rios. "That way, people implementing the software will have some idea that they have these exposures, and they can take other mitigating actions to prevent exploitation of these vulnerabilities."
In the meantime, Rios discusses some of the research on his blog at http://xs-sniper.com. When the final project is announced, it will be published there.
Until the vendors take notice, however, both they and the integrators are in a bind, according to Weiss. "The vendors aren't going to build a new secure system that costs more money if the end-users aren't willing to pay more money for a secure system," he says. "And the end-users aren't going to specify a more secure system that costs more money and may or may not be as reliable as an older one without security if they're not forced to do so. It's a catch-22."
Despite the potentially enormous impact the revelations about Stuxnet could have on the industry, very little has been done to improve security. "So even though this was a big thing, its impact on changing people's behavior and getting them to take security more seriously has been limited," says Weiss. "We expected a lot more. It's not clear to me that we have made much progress since 2000."
Steps can be taken to beef up security without sacrificing reliability or safety. Traditional security strategies, such as adding a firewall or changing default passwords, don't necessarily work in a control system environment. In fact, changing the default passwords in a PLC could effectively shut it down. "What is important is to learn how to secure them while allowing them to continue to do their jobs," says Weiss, who advises that because many control systems (especially field devices) have no security and may not be patchable, it is critical that they be secured by policies and procedures designed for control systems.
But very few facilities have written ICS cybersecurity policies. "Everybody has IT policies, but IT policies are not adequate for control systems," continues Weiss. "This is why the International Society of Automation (ISA) initiated ISA99, Industrial Automation and Control Systems Security. You can't just give this to IT and walk away."
That's not to say operators shouldn't take advantage of the security measures that come with the Windows-based human-machine interface (HMI) component of the ICS. "That not only can but also should have all kinds of available security," says Weiss. "It's a Windows system. If you're not using the security, then shame on you." However, security is more difficult as you move into the hardware components of control systems, such as PLCs, chemical analyzers, variable-frequency drives (VFDs), and smart transmitters. "If you start trying to change them, with all sorts of crazy modifications, you can hurt reliability," Weiss says.
Asset management also plays an important part in security. It's important to know what equipment is installed in the field and how it's connected. "Very few organizations know what they have installed in the field that can be affected by cyber threats," says Weiss.
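Weiss's point about knowing what is installed can be made concrete with a minimal asset inventory. The record fields and device names below are assumptions for illustration only; real inventories track far more (firmware versions, protocols, network zones, vendors).

```python
# Minimal sketch of a cyber asset inventory for field devices.
# Field names, categories, and devices are hypothetical examples.

from dataclasses import dataclass

@dataclass
class FieldDevice:
    name: str
    device_type: str   # e.g. "PLC", "VFD", "smart transmitter"
    network: str       # e.g. "isolated", "plant LAN", "internet-reachable"
    patchable: bool

inventory = [
    FieldDevice("plc-boiler-1", "PLC", "plant LAN", False),
    FieldDevice("vfd-pump-3", "VFD", "isolated", False),
    FieldDevice("plc-gas-2", "PLC", "internet-reachable", False),
    FieldDevice("hmi-ops", "HMI", "internet-reachable", True),
]

# Devices that are both externally reachable and unpatchable are the ones
# most in need of the compensating policies and procedures Weiss describes.
exposed_unpatchable = [d.name for d in inventory
                       if d.network == "internet-reachable" and not d.patchable]
print(exposed_unpatchable)  # ['plc-gas-2']
```

Even a list this simple answers the question most organizations reportedly cannot: which installed equipment can be affected by cyber threats.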
Furthermore, Weiss says much can be learned from past and recent incidents. He keeps a database of more than 200 actual control system cyber incidents.

"Security and compliance are not the same. You can be secure but not compliant, which is a paperwork problem, or you can be compliant but not secure, which is a real problem," says Weiss.
There have been cases where two major electric utilities mandated the use of anti-malware for protective relays
even though the anti-malware could have prevented the relays from operating.
The question of responsibility for the security of the systems controlling infrastructure is complex. Vendors, owners, and operators, along with researchers and regulators, all bear a portion. "It's everybody's and nobody's," says Weiss. "It's the octopus with tentacles everywhere. It's a genuinely big problem, because it is communication and it affects everyone."
The point of advanced automation technologies is to improve reliability and safety by using remote access as a substitute for having experts at all of these locations. The irony is that we are making these systems more vulnerable by trying to make them more reliable and usable.
Source URL: http://ecmweb.com/computers-amp-software/security-risks
07-03-15 5:39 PM