Security Risks

Electrical Construction and Maintenance

Beck Ireland, Staff Writer

Sun, 2012-01-01 09:00

Until a few years ago, breaches in cybersecurity, largely consisting of data theft and invasions of privacy for
financial gain, were strictly the domain of information technology (IT). However, in March 2007, researchers at
the Idaho National Laboratory, Idaho Falls, Idaho, at the behest of the U.S. Department of Homeland Security
(DHS), demonstrated the vulnerability of a diesel generator by hacking in from a remote access point and
causing it to self-destruct. Although the official report on the experiment, named Project Aurora, is classified, a
video showing the generator malfunctioning with thick smoke coming out of it captivated viewers on the
Internet. Project Aurora demonstrated the potential of malicious cyber attacks to compromise critical
infrastructure through manipulation of industrial control systems (ICSs). However, at the time, this type of
attack seemed only a remote possibility.

[Photo caption: The Stuxnet virus brought awareness to security issues surrounding control systems at facilities such as nuclear power plants.]
Then, in July 2010, it was revealed that a virus had targeted the programmable logic controllers (PLCs) controlling the centrifuges at an Iranian nuclear facility in Natanz, in central Iran. After inspectors with the International Atomic Energy Agency noticed an unusual number of decommissioned, damaged centrifuges at the plant, it was eventually discovered that in June 2009 the malicious software, or malware, had made its way into computers in the plant, most likely through infected USB flash drives.
The virus, named Stuxnet, used a number of zero-day exploits, meaning the worm was created to exploit vulnerabilities in software for which no patches existed. Although the malware first exploited the LNK file handling of

Windows Explorer to covertly place an encrypted file from the USB stick onto computers, German ICS security expert Ralph Langner eventually figured out that its true aim was Simatic WinCC/Step 7, control system software made by German electronics and electrical engineering manufacturer Siemens that is installed to run the PLCs controlling motors, valves, and switches in industries worldwide.
Langner's conclusion was a surprising revelation. Previously, ICS and automation equipment were proprietary and ran on isolated, stand-alone networks. Because of this, they were believed to be off the radar for cyber attacks, or at least not profitable to hackers, who were mostly seeking financial gain. Therefore, an alarming lesson learned from Stuxnet was its ability to infect machines that weren't connected to a computer network or the Internet. Furthermore, in recent years, ICSs have become more interoperable and are more often connected to the Internet. Creating more reliable and usable systems through automation and remote access can actually make them more vulnerable.
"Arguably, Stuxnet was the first case of a sophisticated attack that was meant to damage equipment," says Joe Weiss, managing partner of the San Francisco Bay Area-based Applied Control Solutions, which provides consulting services relating to the optimization and security of ICSs. Weiss, in addition to Langner, is recognized as one of the few experts in ICS security. A 35-year veteran in the field of industrial instrumentation, controls, and automation, with more than 10 years in ICS cybersecurity, he serves on numerous ICS security standards organizations and is the author of the book "Protecting Industrial Control Systems from Electronic Threats."
"Before, this was pretty simple," Weiss continues. "If you had a problem, it was because a valve had a bad design, and that was easy enough to fix. Or you had a problem with boiler control software. When you fixed the boiler control software, it was done. Now this is ongoing. Every time you make a change, the question is: Have you done something to create a cyber vulnerability that wasn't there before?"

Law of Opposites
Not all cyber incidents involving control systems are intentional like Stuxnet. For example, the natural gas pipeline failure in San Bruno, Calif., which killed eight people in September 2010, was most likely caused by work being done on an uninterruptible power supply (UPS) system located about 39.33 miles southeast of the accident site, according to the National Transportation Safety Board's report. During the course of this work, the power supply from the UPS system to the supervisory control and data acquisition (SCADA) system malfunctioned so that instead of supplying a predetermined output of 24V of direct current (DC), the UPS system supplied approximately 7VDC or less to the SCADA system. Because of this anomaly, the electronic signal to the regulating valve for Line 132 was lost; the loss of the electrical signal resulted in the regulating valve moving from partially open to the fully open position, as designed.
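The failure mode described above (a 24VDC supply sagging to roughly 7VDC and silently taking the control signal with it) is the kind of condition a simple out-of-range check can catch before a valve acts on a dead signal. The sketch below is purely illustrative; the alarm threshold and function name are assumptions, not values from the NTSB report.

```python
# Illustrative sketch: classify a DC supply reading against the nominal 24VDC
# output cited in the NTSB report. The 20V alarm threshold is an assumed
# value for demonstration, not a figure from the report.

NOMINAL_VDC = 24.0
LOW_ALARM_VDC = 20.0  # assumed undervoltage alarm point

def supply_status(measured_vdc: float) -> str:
    """Return "OK" for a healthy reading, or an alarm string for undervoltage."""
    if measured_vdc < LOW_ALARM_VDC:
        return "ALARM: undervoltage ({:.1f}VDC)".format(measured_vdc)
    return "OK"

print(supply_status(24.1))  # OK
print(supply_status(7.0))   # ALARM: undervoltage (7.0VDC)
```

A real SCADA installation would wire such a check into its alarm system rather than print it, but the logic (compare against a commissioned nominal value, alarm on deviation) is the same.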

[Photo caption: In 2007, a DHS demo showed that emergency generators can be vulnerable to cyber attacks.]
Whether unintentional or malicious, Weiss uses the NIST definition of a cyber incident, which is electronic communication between systems that affects confidentiality, integrity, and/or availability (the CIA triad). For control systems, availability has the biggest impact. "We're talking about the availability of a system or a process," says Weiss. "That's where this gets to be very different from IT, which is about the loss of communications to an IT computer."
As an example of loss of communication in control systems, Weiss cites a broadcast storm (too much data on the network) at the Browns Ferry Nuclear Plant, which resulted in the loss of two 10,000-horsepower main coolant pumps.
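A broadcast storm is simply traffic crossing a level at which devices on the network can no longer keep up, so a first-pass monitor is a threshold on broadcast frames per interval. The sketch below is a hypothetical illustration; the threshold and the simulated readings are invented, and a real monitor would pull counters from switches or a capture tool rather than a hard-coded list.

```python
# Hypothetical sketch: flag intervals whose broadcast-frame count exceeds a
# storm threshold. Threshold and readings are invented for illustration.

STORM_THRESHOLD = 5000  # broadcast frames per second; assumed value

def is_broadcast_storm(frames_per_second: int) -> bool:
    """True when the per-second broadcast count exceeds the storm threshold."""
    return frames_per_second > STORM_THRESHOLD

readings = [120, 150, 9000, 14000]  # simulated per-second broadcast counts
alarms = [fps for fps in readings if is_broadcast_storm(fps)]
print(alarms)  # [9000, 14000]
```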
Until recently, the design requirements for control systems prioritized performance, reliability, and safety, not security. "Security is not only a new constraint, but it also often goes in the opposite direction of reliability and safety," says Weiss, who explains that securing a system requires retrofitting new security requirements where none previously existed. "You're not building a new system from scratch. You're putting requirements into a system that go exactly the opposite of why the system works in the first place. You're making it more complicated."
According to Weiss, Stuxnet is a great example of the differences between security for control systems and IT. "Many people focused on the Windows zero-days, but they were simply a delivery vehicle," Weiss explains. "The warhead affected the controller by changing the controller logic. This was an unexpected ICS attack for which no IT security solution applied then or applies now."
Therefore, security for control systems should not be approached in the context of IT. As Weiss puts it, the security requirements are not based on what it takes to secure a control system against control system threats, but on what it takes to secure the IT systems (Windows servers and PCs) used in control system applications against IT threats. Unfairly, then, ICS devices, which have proven safe and reliable for years, have been accused of failures in security. "The devices are really good, well-designed, reliable, and safe systems," says Weiss. "They were never meant to be security systems, but now they're being accused of not doing what they were not designed to do. How's that for a double negative?"

100 Days

Because of its stealthy nature, the Stuxnet worm was not discovered for more than a year, as it did not directly affect the performance and safety of its targeted nuclear plant. Moreover, because the vulnerability it exploited was a design flaw and not patchable, DHS didn't even call it a vulnerability, says Weiss. "It wasn't a design deficiency for reliability or safety; only for security," he says.
However, since the discovery of Stuxnet and the vulnerabilities it exploited in the Siemens controllers, there's been a move toward finding similar vulnerabilities in other controllers, particularly the hard-coded default passwords. "That's common in controllers from many of our vendors," says Weiss. "They've always been there, but now people are starting to find them."
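One low-tech way a facility can act on this finding is to audit its device inventory against a list of known factory-default credentials; hard-coded passwords can't be changed, but at least the exposure gets documented. The sketch below is hypothetical: the credential pairs, device names, and passwords are invented placeholders, not real vendor defaults.

```python
# Hypothetical audit sketch: flag devices whose credentials match a list of
# known factory defaults. All names and passwords here are invented.

KNOWN_DEFAULTS = {("admin", "admin"), ("admin", "password"), ("root", "root")}

def uses_default_credentials(username: str, password: str) -> bool:
    """True when the (username, password) pair appears in the default list."""
    return (username, password) in KNOWN_DEFAULTS

inventory = [
    {"device": "PLC-01", "user": "admin", "password": "admin"},
    {"device": "HMI-02", "user": "ops", "password": "s8kPlant2"},
]
flagged = [d["device"] for d in inventory
           if uses_default_credentials(d["user"], d["password"])]
print(flagged)  # ['PLC-01']
```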

[Photo caption: In the past, availability has trumped security for facilities such as water treatment plants.]
For instance, for the last six months, Billy (BK) Rios, a security expert at Google and former security engineer for Microsoft, and Terry McCorkle, an information security red team member at Boeing, have undertaken an independent ICS security project. The two set out to find 100 security flaws in 100 days so that they could then present their findings at a conference. "We wanted to take a look at the state of security for other control systems in general," says Rios. "We wanted to approach it from the perspective of what can two regular guys do, without a charter and without a lot of money, when it comes to control systems and their security."
The pair began requesting free trial software and auditing it in their spare time on nights and weekends. Instead of 100 bugs in 100 days, they ended up finding more than 600 in 100 days. "When I look at what a modern, secure machine and robust software look like, control systems software does not fall into that category," says Rios. "The software robustness and security of Apple's iTunes is probably better than 99% of the control systems I've seen out there, both at the hardware and the software level. That's pretty sad."
Rios and McCorkle reported their findings to the Industrial Control Systems Cyber Emergency Response Team (ICS-CERT), overseen by DHS, which has since issued some public advisories based on their research. Recently, ICS-CERT published an alert regarding Internet-facing control systems. The warning lists five reports so far in 2011 of SCADA and ICS systems exposed and found using online scanners, and it warns asset holders to secure their systems. "The thing we found to be the most disappointing is that implementers usually don't know about this kind of stuff," says Rios. "They buy the software from whatever vendor, and they implement it into their own environments, on their own networks, and they're just completely vulnerable to all sorts of crazy stuff."
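The exposure ICS-CERT describes can be checked from the asset holder's side with nothing more than a TCP connect test against the well-known ICS service ports. The sketch below assumes the standard port assignments (Siemens S7 on 102, Modbus/TCP on 502, DNP3 on 20000) and should only be run against hosts you are authorized to test.

```python
# Illustrative sketch: test whether common ICS service ports accept a TCP
# connection on a given host. Port numbers are the standard assignments;
# run this only against equipment you are authorized to test.
import socket

ICS_PORTS = {102: "Siemens S7", 502: "Modbus/TCP", 20000: "DNP3"}

def open_ics_ports(host: str, timeout: float = 0.5) -> list:
    """Return the ICS ports on `host` that accept a TCP connection."""
    found = []
    for port in ICS_PORTS:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the port answered
                found.append(port)
    return found

# On a typical workstation, no ICS services listen locally:
print(open_ics_ports("127.0.0.1"))
```

A connect test only shows reachability, not vulnerability, but for devices that were never meant to face the Internet, reachability alone is the finding.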
In addition, Rios and McCorkle work with ICS-CERT to notify software vendors of their vulnerabilities. "We've been working with those guys to basically contact vendors and have them fix their software," says Rios. "It's a very sensitive subject. They don't want to go to their customers and say, 'Yeah, our software quality from a security standpoint isn't very good.' They don't want to say that to people, for obvious reasons."

Currently, Rios and McCorkle only share their research with ICS-CERT. However, they have plans to go public with a database of their findings. "At some point, we're going to make these vulnerabilities public," says Rios. "That way, people implementing the software will have some idea that they have these exposures, and they can take other mitigating actions to prevent exploitation of these vulnerabilities."
In the meantime, Rios discusses some of the research on his blog at http://xs-sniper.com. When the announcement of the final project comes, it will be published there.
Until the vendors take notice, however, both they and the integrators are in a bind, according to Weiss. "The vendors aren't going to build a new secure system that costs more money if the end-users aren't willing to pay more money for a secure system," he says. "And the end-users aren't going to specify a more secure system that costs more money and may or may not be as reliable as an older one without security if they're not forced to do so. It's a catch-22."

The Official Octopus


Eventually, Langner came to realize that Stuxnet wasn't just targeting the Siemens controller; its code contained information about the specific technical configuration of the nuclear facility. It was targeting only that particular facility. As such, Stuxnet was not treated as a wide threat to critical infrastructure that relies on automated systems, such as the electrical grid, transit systems, sewage treatment plants, and dams, or even to non-critical systems, such as automation for manufacturing facilities and electrical and mechanical systems for offices, schools, and hospitals. Also, because the attack slowed down the Iranian nuclear program, it was reported in a positive light. "Because it happened to Iran, and it was done to a centrifuge, people looked at it as a good thing," says Weiss. "People are going, 'Well, I don't have a centrifuge; therefore, it can't affect me.' So, unfortunately, the security guys went nuts with it, but most of the operations people basically went, 'Well, that's interesting, but it doesn't affect me.' It hasn't had nearly as much of an impact as we thought or hoped it would in terms of getting people to do the right thing and secure their systems."
Still, there have been some efforts made to improve ICS security. In the last year, the Obama administration launched a cyber command at the U.S. Department of Defense, which has improved coordination between the Pentagon's efforts and the DHS initiative on the civilian response to cyber threats. Yet, although some regulatory agencies have laid out best practices for guarding against exploits, compliance is voluntary, and, according to Weiss, it is often missing input from ICS experts. "Arguably, there are only a limited number of people who are actually control system cybersecurity experts," says Weiss. "However, those people are generally not consulted when the subject of control system security is raised. The Enduring Security Framework (ESF) Operations Group not only has no control system experts, but it also hasn't even included control system suppliers in the mix."
In addition, Weiss finds the recent U.S. Department of Energy (DOE) and DHS road maps vague; they do not address the control system cybersecurity issues actually being faced. According to Weiss, DOE's draft "Electricity Sector Cybersecurity Risk Management Process Guideline" does not distinguish between IT and control systems. NIST's National Initiative for Cybersecurity Education (NICE) does not address control systems. The recent MIT report on the future of the electric grid does not adequately address cybersecurity of control systems. Furthermore, Weiss points to the omission of Stuxnet from the critical infrastructure protection standards (CIPs) put out by the North American Electric Reliability Corp. (NERC), which upholds the mission of ensuring the reliability of the North American bulk power system. NERC Standards CIP-002 through CIP-009 provide a cybersecurity framework for the identification and protection of Critical Cyber Assets to support reliable operation of the bulk electric system. "Not only do the NERC CIPs essentially exclude Stuxnet, but they also exclude almost 70% of the power generation in North America from being examined for cybersecurity threats," Weiss says. "That doesn't make sense, does it?"

Despite the potentially enormous impact the revelations about Stuxnet could have on the industry, very little has been done to improve security. "So even though this was a big thing, its impact on changing people's behavior and taking security more seriously has been limited," says Weiss. "We expected a lot more. It's not clear to me that we have made much progress since 2000."
Steps can be taken to beef up security without sacrificing reliability or safety. Traditional security strategies, such as adding a firewall or changing default passwords, don't necessarily work in a control system environment. In fact, changing the default passwords in a PLC could effectively shut down the PLC. "What is important is to learn how to secure them while allowing them to continue to do their jobs," says Weiss, who advises that because many control systems (especially field devices) have no security and may not be patchable, it is critical that they be secured by policies and procedures meant for control systems.
But very few facilities have written ICS cybersecurity policies. "Everybody has IT policies, but IT policies are not adequate for control systems," continues Weiss. "This is why the International Society of Automation (ISA) initiated ISA99, Security for Industrial Automation and Control Systems. You can't just give this to IT and walk away."
That's not to say operators shouldn't take advantage of the security measures that come with the Windows human-machine interface (HMI) component of the ICS. "That not only can but also should have all kinds of available security," says Weiss. "It's a Windows system. If you're not using the security, then shame on you." However, security is more difficult as you move into the hardware component of control systems, such as PLCs, chemical analyzers, variable-frequency drives (VFDs), and smart transmitters. "If you start trying to change them, with all sorts of crazy modifications, you can hurt reliability," Weiss says.
Asset management also plays an important part in security. It's important to know what equipment is installed in the field and how it's connected. "Very few organizations know what they have installed in the field that can be affected by cyber threats," says Weiss.
Furthermore, Weiss says much can be learned from past and recent incidents. He keeps a database of more than 200 actual control system cyber incidents.
Security and compliance are not the same. "You can be secure but not compliant, which is a paperwork problem, or you can be compliant but not secure, which is a real problem," says Weiss.
There have been cases where two major electric utilities mandated the use of anti-malware for protective relays even though the anti-malware could have prevented the relays from operating.
The question of responsibility for the security of the systems controlling infrastructure is complex. Vendors, owners, and operators, along with researchers and regulators, all bear a portion. "It's everybody's and nobody's," says Weiss. "It's the octopus with tentacles everywhere. It's a genuinely big problem, because it is communication and it affects everyone."
The point of advanced automation technologies is to improve reliability and safety by using remote access as a substitute for having experts at all of these locations. The irony is that we are making these systems more vulnerable by trying to make them more reliable and usable.
Source URL: http://ecmweb.com/computers-amp-software/security-risks
