As fleets get newer and offer more advanced technology, the portion of these machines that is made of, and operated by, computers is growing rapidly. That growth makes every machine susceptible to infiltration. How can mines make sense of it all – and protect themselves?
by Andrew Ginter
Mining machinery is increasingly automated; simple pick-up trucks contain over 300 CPUs each, and more critical equipment is heavily instrumented. The software in these CPUs is the root of the cybersecurity problem: all software contains defects, and some defects are security vulnerabilities.
Some of these vulnerabilities we know about and are working to fix; others, our enemies are using against us without our knowledge. Worse, an ever-increasing number of safety-critical functions is implemented in software, because software safeties are often cheaper and more accurate than electro-mechanical safeties.
Historically, mining enterprises have made IT teams responsible for cybersecurity, but this is a poor fit for modern cyber-sabotage attacks. Engineers have a responsibility to be involved in addressing all risks that result from their automation system designs – risks to worker safety, equipment integrity and efficient operations, including risks due to cyber-sabotage.
Attack patterns
Addressing cyber risk starts with understanding cyberattacks, in order to evaluate the risks they pose to operations and to design effective defenses. While attack details are very technical, there are only two kinds of attacks: cyber-espionage (stealing information) and cyber-sabotage (misoperating computers and automation).
Modern ransomware uses both. Criminals steal information (espionage) and threaten to publish it unless a ransom is paid. They also encrypt critical systems (sabotage) and provide decryption keys only if they are paid.
Defending against cyber-espionage is the responsibility of IT/enterprise security teams, whose job is to protect information as an asset. In operations, however, the physical operation is the asset, and all information is a potential threat. The only way to change an automation system from a normal state to a compromised state is for cyber-sabotage attack information to enter the system, and every way to move information is a way to move attacks:
- Online – attack information is sent across networks.
- Offline – attack information is physically carried into protected automation systems on USB keys, in laptops, in brand new (heavily automated) equipment, or even in the head of a disgruntled employee.
Online attacks tend to be operated by remote control. Attackers take over a machine by persuading an employee or contractor to click on a malicious attachment or link. The attackers can then gain access to operate compromised machines by remote control, across the internet.
Once inside, attackers can “pivot” from one compromised machine to attack other nearby machines. Attackers routinely pivot through firewalls by stealing the firewall password, exploiting defects and vulnerabilities in firewall software, stealing passwords for systems that we log into through the firewalls, or in other ways.
Offline attacks can also be by remote control – when the attack software activates in the target network, the software connects directly or indirectly to an internet-based command-and-control center, asking for instructions. Offline attacks can also be autonomous – spreading like a virus without human intervention or “time-bombed” to erase hard drives or cause other more subtle damage.
Consequences
If a cyber-sabotage attack succeeds, consequences can include:
- Safety issues, such as misoperating elevators or ventilation equipment;
- Equipment damage, such as destroying large electric motors by rapid disconnection and reconnection to AC power sources;
- “Bricked” automation – erasing automation firmware so thoroughly that entire circuit boards must be replaced;
- Simple production outages of varying lengths; or
- More subtle problems, such as premature equipment aging, inefficient operations, or other ways to make a mine much less profitable.
Engineering teams must evaluate the potential consequences for a given site or subsystem, because it is engineers who understand what mechanical safeties (e.g., elevator emergency brakes) or equipment protection (such as protective relays) are in place to prevent those consequences. Engineering teams generally need help from IT attack experts to evaluate the credibility of attacks and consequences. Most protective relays, for example, are software-based and, as mentioned, all software has vulnerabilities.
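The screening described above can be sketched as a simple table that pairs each subsystem's worst-case consequence with any non-cyber safeguard that bounds it. The sketch below is illustrative only; the subsystem names, consequences and safeguards are hypothetical examples, not a real assessment.

```python
# Illustrative sketch of consequence screening: flag subsystems whose
# worst-case cyber-sabotage consequence is not bounded by a non-cyber
# (mechanical or electro-mechanical) safeguard. All entries are hypothetical.
from dataclasses import dataclass

@dataclass
class Subsystem:
    name: str
    worst_case: str           # worst-case physical consequence of compromise
    non_cyber_safeguard: str  # "" if no mechanical/electro-mechanical safeguard exists

SUBSYSTEMS = [
    Subsystem("mine elevator", "uncontrolled car movement", "spring-loaded emergency brake"),
    Subsystem("ventilation fans", "loss of airflow underground", "manual override switches"),
    Subsystem("mill motor drives", "motor destroyed by rapid reconnection", ""),
]

def needs_engineering_review(s: Subsystem) -> bool:
    """A subsystem needs design attention if no non-cyber safeguard bounds its worst case."""
    return s.non_cyber_safeguard == ""

flagged = [s.name for s in SUBSYSTEMS if needs_engineering_review(s)]
print(flagged)  # ['mill motor drives']
```

In this example, only the mill drives lack a physical safeguard, so they would be the first candidates for the engineering-grade mitigations discussed next.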
Engineering-grade mitigations
When the worst-case physical consequences of credible cyber-sabotage attacks are unacceptable, engineering teams must adapt their designs to prevent those consequences. The most cyber-resistant mitigations are non-cyber: spring-loaded elevator safety brakes, electro-mechanical timers forcing motors to “spin down” before re-engaging with AC power, and manual overrides to operate ventilation equipment and other vital safety gear manually through cyber emergencies. These mitigations are deployed as a last line of defense, after layers of more sophisticated cyber safety systems. This is the domain of the Security PHA Review methodology (for more on that, check out http://www.isa.org/products/security-pha-review-for-consequence-based-cybe-1).
Then, to prevent production shutdowns and impaired mining efficiencies from online nation-state and ransomware criminal attacks, we need network engineering: designs that deterministically prevent online attacks from pivoting across consequence boundaries – connections between networks with dramatically different worst-case consequences of compromise. Examples include monitor-only industrial internet of things (IIoT) networks, analog signalling, unidirectional gateways, and hardware-enforced remote access (HERA).
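The rule that a unidirectional gateway enforces at a consequence boundary can be modelled in a few lines. This is a conceptual sketch only, assuming made-up zone names and a numeric consequence ranking; real unidirectional gateways enforce this in hardware, not in software checks.

```python
# Conceptual sketch of a consequence-boundary rule, not a product configuration.
# Higher rank = worse worst-case consequence of compromise. Zone names are hypothetical.
CONSEQUENCE_RANK = {"enterprise IT": 1, "plant DMZ": 2, "control network": 3}

def flow_allowed(src_zone: str, dst_zone: str) -> bool:
    """Permit only flows from higher-consequence zones toward lower-consequence ones.

    This mirrors a monitor-only / unidirectional design: operational data can
    leave the control network, but no messages (and so no attacks) can enter it.
    """
    return CONSEQUENCE_RANK[src_zone] > CONSEQUENCE_RANK[dst_zone]

assert flow_allowed("control network", "enterprise IT")      # telemetry out: allowed
assert not flow_allowed("enterprise IT", "control network")  # inbound traffic: blocked
```

The point of the sketch is that the decision is deterministic: it depends only on the relative consequence of the two zones, never on the content of the traffic, which is why attacks cannot pivot inward across the boundary.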
IT-grade mitigations
While many offline threat vectors can be addressed physically, for example by removing CD drives from computers that do not need them and plugging unused USB ports, it is generally not possible to deploy physical mitigations for all offline communications paths. Anti-virus updates and new or upgraded software versions must enter our systems sooner or later, usually through offline means. This means we must deploy conventional cybersecurity measures as well, and deploy those measures aggressively on the machines most exposed to routine offline information transfers.
Essentially all mines must implement the detect, respond and recover pillars of the NIST Cybersecurity Framework (NIST CSF). Even if we manage to mitigate the most serious cyber threats with engineering-grade defenses, there is always residual risk. We need to monitor our automation computers, embedded devices and networks for anomalous activity; we need cyber incident response plans and teams; and we need secure backups and a practiced recovery plan, so that when we discover cyber-sabotage we can restore normal operations quickly from known-good media.
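One small but concrete piece of the "recover from known-good media" discipline is verifying recovery images against a trusted hash manifest before restoring, so a recovery does not re-install attacker-modified files. The sketch below is a minimal illustration, assuming hypothetical file names and an in-memory manifest; a real scheme would also protect the manifest itself (for example with signatures on offline media).

```python
# Illustrative sketch: verify recovery media against a manifest of known-good
# SHA-256 hashes before restoring. File names and contents are hypothetical.
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def verify_media(files: dict[str, bytes], manifest: dict[str, str]) -> list[str]:
    """Return the names of files whose hashes do not match the trusted manifest."""
    return [name for name, data in files.items()
            if manifest.get(name) != sha256_of(data)]

# Example: one image matches the manifest, one has been tampered with.
good_plc = b"plc-firmware-v2.1"
manifest = {"plc.img": sha256_of(good_plc), "hmi.img": sha256_of(b"hmi-gold-image")}
media = {"plc.img": good_plc, "hmi.img": b"hmi-gold-image-TAMPERED"}
print(verify_media(media, manifest))  # ['hmi.img']
```

Anything the check flags is restored from a different, verified copy rather than trusted as-is.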
Cyber-informed engineering
All the topics in this article are part of Cyber-Informed Engineering (CIE) (https://inl.gov/national-security/cie), a new way of looking at Operational Technology (OT) security. CIE is an “umbrella” framework that sees OT security as “a coin with two sides.” One side is cybersecurity – teaching engineering teams about cyber threats, attacks, cyber mitigations and their limitations. The other side is engineering – powerful engineering tools and approaches for managing physical risk. These latter tools exist only in the OT space and have no IT analogues, including tools from safety engineering, protection engineering, automation engineering and network engineering.
In a sense, CIE has arrived just in time. Unlike conventional threats to safe, reliable, and efficient operations, cyber-sabotage threats are increasing steadily. Mining is becoming steadily cyber-automated with more targets for cyber attacks, automation is becoming steadily connected with more opportunities for attacks to pivot, and our enemies are becoming more capable in their attacks. We cannot afford to be put out of business by these threats in another year or three.
CIE points out that engineering teams have a vital role in addressing cyber-sabotage threats to safety and critical infrastructure. Cyber-sabotage risks are increasing, yes, but so is our understanding of how to address those risks, by combining engineering and cybersecurity approaches, under the umbrella of Cyber-Informed Engineering.
About the author:
Andrew Ginter is the vice president of Industrial Security for Waterfall Security Solutions, former co-host and chief security officer of Industrial Security Podcast, and the author of three books on OT security including his latest, Engineering-grade OT Security – A Manager’s Guide.
Ginter, who leads a team responsible for industrial cybersecurity research and contributions to industrial cybersecurity standards and regulations, has 25 years of experience managing the development of products for computer networking, industrial control systems and industrial cybersecurity for leading vendors including Hewlett-Packard, Agilent Technologies and Industrial Defender.
He is a cybersecurity expert who offers valuable insights to a whole spectrum of solutions from mechanical pressure relief valves to network engineering to conventional cybersecurity. He holds a B.Sc in applied mathematics and an M.Sc in computer science from University of Calgary.