The future of warfare is being debated in Washington next week.
Next week is ‘Cyber Week.’ Washington is debating four cyber security bills. Lots of noise. Lots of discussion. Lots of differences.
Cybersecurity is going ERM.
The US Department of Energy (DOE) released for public comment the Electricity Subsector Cybersecurity Risk Management Process. You can download it at:
http://energy.gov/sites/prod/files/RMP%20Guideline%20Second%20Draft%20for%20Public%20Comment%20-%20March%202012.pdf
It may be a game changer in risk frameworks. Most risk frameworks are linear risk assessment processes.
The DOE standard is ERM-process based (inputs => activities => outputs), hierarchical (tiered), and follows a novel cycle.
Let’s discuss a few of these:
The RM model is tiered: Tier 1, Organization; Tier 2, Mission and Business Processes; and Tier 3, IT and Industrial Control Systems.
The RM model has a cycle of: Frame => Assess => Respond => Monitor.
Each tier follows a defined process, much like the Project Management Institute’s Project Management Body of Knowledge (PMBOK).
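As a rough illustration of that structure, here is a minimal Python sketch of the tiered Frame => Assess => Respond => Monitor cycle. The class names, fields, and example inputs are my own assumptions for illustration, not anything specified in the DOE guideline.

```python
from dataclasses import dataclass, field
from enum import Enum


class Tier(Enum):
    # The three tiers of the DOE risk management model.
    ORGANIZATION = 1
    MISSION_AND_BUSINESS_PROCESSES = 2
    IT_AND_INDUSTRIAL_CONTROL_SYSTEMS = 3


@dataclass
class RiskCycle:
    # One pass through the Frame => Assess => Respond => Monitor cycle at a given tier.
    tier: Tier
    inputs: list = field(default_factory=list)   # e.g., threat data, business objectives
    outputs: list = field(default_factory=list)  # artifacts produced by the activities

    def frame(self):
        # Establish risk assumptions, constraints, and tolerance for this tier.
        self.outputs.append("risk frame for " + self.tier.name)

    def assess(self):
        # Identify threats, vulnerabilities, and likely impact.
        self.outputs.append("risk assessment for " + self.tier.name)

    def respond(self):
        # Select and implement a course of action (accept, avoid, mitigate, transfer).
        self.outputs.append("risk response for " + self.tier.name)

    def monitor(self):
        # Verify implementation and watch for changes that restart the cycle.
        self.outputs.append("monitoring plan for " + self.tier.name)

    def run(self):
        # Inputs => activities => outputs, in the order the guideline describes.
        for activity in (self.frame, self.assess, self.respond, self.monitor):
            activity()
        return self.outputs


if __name__ == "__main__":
    for tier in Tier:
        print(RiskCycle(tier, inputs=["mission objectives"]).run())
```

Running it simply prints the artifacts each tier would produce on one pass through the cycle; in the guideline the monitoring step feeds back into framing, so in practice this loops rather than terminates.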
Different RM model. ERM based. Interesting. Novel. Check it out.
Cybersecurity needs are not hypothetical, as the recent DHS warning of a cyberattack on the US natural gas industry shows. Why then was a post-9/11 initiative to secure US utilities dropped?
By Mark Clayton, Staff writer / May 17, 2012
A natural gas pipeline is seen under construction near East Smithfield in Bradford County, Pa., in this January 7 file photo. (Les Stone/Reuters/File)
With America now trying to thwart a cyberattack on its natural gas industry, it is helpful to recall the hectic days after 9/11, when industry scientists raced to shield from potential terrorist cyberattacks hundreds of thousands of vulnerable devices that control vital valves and switches on America’s gas pipelines, water plants, and power grid.
It was a race that seemed winnable. After five years of intense effort, a 35-member team of industrial-control-system wizards from the gas, water, and electric utilities industries had created a powerful new encryption system to shield substations, pipeline compressors, and other key infrastructure from cyberattack.
But just weeks before it was to be finalized in 2006, the funding plug was pulled on the encryption system, called AGA-12, by the American Gas Association and its partners in the electric power and water utility industries, recall some of those who worked on the project.
To this day, the cancelation of the project has called into question whether US utilities will, on their own, invest in measures necessary to protect their networks.
Tested at a Los Angeles water treatment plant, a gas utility in Chicago, and other locations, AGA-12 worked well. National labs verified it. Experts said it was good to go. Yet with 9/11 receding in memory, utility industry executives had begun worrying anew about the cost of deploying the system, former project participants say.
Today, six years after AGA-12 was aborted and 11 years after the World Trade Center attacks, the US natural gas industry is trying to thwart a real cyberattack campaign, according to the US Department of Homeland Security (DHS). Congress, meanwhile, is still debating whether voluntary or mandatory security standards are the best way to secure America’s critical infrastructure.
All of which leaves researchers who helped develop AGA-12 frustrated and a little wistful about the digital shield that they say would have provided a badly needed layer of security – especially in light of a trend toward cyberattacks on critical infrastructure companies.
“Technically it was an excellent standard and we were almost done with it when the project was terminated,” says William Rush, a now-retired scientist formerly with the Gas Technology Institute, who chaired the effort to create the AGA-12 standard. “One of the things I wake up in the middle of the night and worry about is what to do if we’ve just been attacked. That’s not the time to worry about it – now’s the time.”
AGA-12, he says, was designed to secure older industrial control system devices out in the field, many of which still communicate by modem and phone line, radio, or even wireless signal, but were never designed with cybersecurity in mind and remain highly vulnerable today.
It’s not clear that AGA-12 could have stopped the “spear-phishing” type of cyberattack now under way against the natural gas industry, experts say. But it could stop at least one kind: attacks launched directly against systems in the field, the kind DHS has highlighted in numerous studies and reports.
Installed in front of each vulnerable device would have been an AGA-12 gatekeeper, a sealed black box with a processor and cryptographic software inside, he explains. That “bump in the wire” would sift and decipher commands coming in from legitimate operators, but shield the vulnerable industrial control systems behind them from any false signals that might allow a hacker to take over.
“It was never intended to be a silver bullet,” Dr. Rush says. “But it would definitely have provided quite a lot more protection for critical infrastructure like gas pipelines and the power grid than we have right now.”
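For readers curious what a “bump in the wire” does at the protocol level, the sketch below is a highly simplified, hypothetical Python illustration of the idea (a gatekeeper that forwards only commands carrying a valid message authentication code). It is not the actual AGA-12 design; the shared key, command format, and function names are invented for the example.

```python
import hmac
import hashlib
from typing import Optional

# Hypothetical pre-shared key provisioned into both the gatekeeper and the
# operator's control center; AGA-12 itself defines its own key management,
# framing, and cryptography, which this sketch does not reproduce.
SHARED_KEY = b"example-key-not-from-the-standard"


def authenticate(command: bytes, tag: bytes) -> bool:
    # True only if the command carries a valid HMAC from a legitimate operator.
    expected = hmac.new(SHARED_KEY, command, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)


def gatekeeper(command: bytes, tag: bytes) -> Optional[bytes]:
    # The "bump in the wire": forward authentic commands, drop forged ones
    # before they ever reach the field device sitting behind the box.
    if authenticate(command, tag):
        return command
    return None


if __name__ == "__main__":
    cmd = b"OPEN VALVE 7"
    good_tag = hmac.new(SHARED_KEY, cmd, hashlib.sha256).digest()
    print(gatekeeper(cmd, good_tag))      # forwarded: b'OPEN VALVE 7'
    print(gatekeeper(cmd, b"\x00" * 32))  # rejected: None
```

The design point is that the legacy device behind the box never has to change: only traffic that proves it came from a legitimate operator is passed through.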
The reality of the cyberthreat was driven home in late March, when DHS issued the first of four confidential “alerts” warning of a cyberattack campaign against US natural gas pipeline companies’ computer networks. Some researchers have linked the attack to a 2011 attack for which US officials blame China.
Those recent attacks follow a trend in which corporate and industrial networks belonging to critical infrastructure companies are seen to be a growing target. In April, the cybersecurity company McAfee and the Center for Strategic and International Studies (CSIS), a Washington think tank, found that 40 percent of electric utility company officials in 14 countries said their networks were under attack and more vulnerable than ever.
Meanwhile, in an election year, Congress and the Obama administration are wrangling over new cybersecurity standards for critical infrastructure companies – primarily whether they should be based on a voluntary or mandatory approach.
“The issue isn’t a lack of standards,” says James Lewis, director of the Technology and Public Policy Program at CSIS. “It’s the lack of a business case for individual companies to spend for public safety. This [AGA-12 case] just confirms it. They know what to do to make things secure and have chosen not to do it for sound business reasons. A voluntary approach doesn’t work.”
At least six energy industry organizations that have developed voluntary cybersecurity standards for their industrial control systems would disagree. They include the North American Electric Reliability Corporation (NERC), International Electrotechnical Commission, American Petroleum Institute, and the AGA. But because the standards are voluntary or are “guidelines,” it’s unclear how widely they have been acted upon.
Asked if field devices have received added protections that supplanted the need for AGA-12, Jake Rubin, an AGA spokesman, says the AGA, federal government, and industry groups “have put cybersecurity guidelines in place that independent operators are using currently in the field.” However, he adds, “The ‘bump in the wire’ concept cannot be applied to all existing systems.”
“AGA members are committed to the safe and reliable delivery of clean natural gas to their customers at affordable and stable prices,” Mr. Rubin says in an e-mail response. “They must make decisions that balance these factors, with safety always being the top priority for America’s natural gas utilities.”
But other observers say that while some newer equipment with better security has been adopted in recent years, many of the same vulnerabilities remain because long-lived industrial control systems are rarely replaced if still functioning. Without a mandate, few companies will incur the cost to deploy enhanced security systems, they say.
“We found that the adoption of security measures in important civilian industries badly trailed the increase in threats over the last year,” Stewart Baker, a former DHS official who led the CSIS and McAfee study, said in a statement in April.
An amateur enthusiast has found evidence that hackers could exploit a security vulnerability in the systems of a company that serves power plants and military installations.
By Mark Clayton, Staff writer / April 25, 2012
An amateur cybersecurity researcher who bought industrial computer networking equipment on eBay for fun has discovered a critical weakness in equipment that helps run railroads, power grids, and even military installations nationwide.
The American Electric Power corporate headquarters in Columbus, Ohio. AEP is a customer of RuggedCom. (Paul Vernon/AP/File)
The vulnerability means that hackers or other nations could potentially take control of elements within crucial American infrastructure – from refineries to power plants to missile systems – sabotaging their ability to operate from within.
Analysts say the problem is likely fixable, but the enthusiast says he has gone public only because the company that manufactures the equipment, RuggedCom of Concord, Ontario, has declined to address the issue since he made it known to them a year ago.
“It’s clearly a huge risk,” says Dale Peterson, CEO of Digital Bond, a control systems security firm in Sunrise, Fla. “Anytime someone can take down your network infrastructure, essentially cause a loss of control of the process – or your ability to monitor it, very dangerous things can happen.”
The vulnerability has to do with what is known as a digital “back door.” The back door is a secret login that allows the manufacturer to get into the equipment’s control systems without anyone knowing about it – even the purchaser. In theory, manufacturers could use their back doors to send updates to the equipment, but since they are secret, their use is not well known.
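To see why such an account undermines even a carefully configured device, here is a deliberately simplified, hypothetical login check in Python. The account names, passwords, and logic are invented for illustration and are not taken from RuggedCom’s firmware or any real product.

```python
# Hypothetical vendor back door: every value here is invented for illustration.
OPERATOR_ACCOUNTS = {"admin": "password-chosen-by-the-purchaser"}

FACTORY_ACCOUNT = "factory"          # hardcoded at the factory, undocumented
FACTORY_PASSWORD = "static-secret"   # the same (or easily derived) on every unit


def login(user: str, password: str) -> bool:
    # The hidden factory credential is accepted alongside whatever the
    # purchaser configures, so rotating the documented passwords never
    # closes the hole.
    if user == FACTORY_ACCOUNT and password == FACTORY_PASSWORD:
        return True
    return OPERATOR_ACCOUNTS.get(user) == password


# The operator can change every documented credential and the back door
# still works, which is why a leaked or guessable factory password exposes
# every deployed device at once.
assert login("factory", "static-secret")
assert not login("admin", "wrong-guess")
```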
The discovery of back doors built into digital industrial control systems is not unprecedented. In fact, RuggedCom was recently acquired by a subsidiary of Siemens AG, the giant German industrial engineering company that has been criticized for using hidden, yet vulnerable, back doors in its control systems.
What is unusual is that RuggedCom’s equipment is often used as a digital fortress, protecting from hackers far more vulnerable systems that throw mechanical switches or close and open valves. Also surprising, experts say, is that the password needed to enter through this back door appears to be relatively easy to hack.
If hackers can get through the back door of RuggedCom’s routers and digital switches, the entire system that they are a part of becomes vulnerable. For example, Stuxnet, the world’s first publicly identified cyber superweapon, in 2009 wreaked havoc on Iran’s nuclear centrifuge refining system by exploiting a password hidden inside a Siemens operating system.
“It is a very serious threat,” says Robert Radvanovsky, a cybersecurity researcher and cofounder of Infracritical, a think tank focused on shoring up cyber weaknesses in critical infrastructure. “The big concern is that these devices are what connect to the control systems that run the substations where power gets routed.”
RuggedCom sells “hardened” equipment designed to run around the clock in any temperature or weather condition. So it has a variety of clients seeking such robust machinery. Defense-industry customers mentioned on the RuggedCom website include big names like Boeing and Lockheed Martin, while power-industry customers include several of the nation’s largest utilities – American Electric Power, National Grid, Pepco, and others. The systems are also used by transportation authorities in the cities of Houston, Lakeland, Fla., and in Washington State and Wisconsin.
Pipelines, refineries, traffic lights, trains, military systems – all are at greater risk, especially to adept hackers belonging to nation-state intelligence agencies. The “good news,” Peterson says, is that even though the vulnerable systems are widespread, the problem is likely fixable, unless the RuggedCom operating system is too reliant on the back door login and its weak password-encryption system.
A RuggedCom spokesman, responding to an e-mail query, wrote that the company would be unable to respond Wednesday to Monitor queries about the vulnerability.
Feeling the company was dragging its heels and might never fix the problem was a key motivator for Justin W. Clarke, the San Francisco-based researcher who finally decided to reveal the threat a year after he first informed RuggedCom managers about it. RuggedCom said in mid-April that it would need three more weeks to notify customers but did not say whether it planned to fix the back door access with a firmware upgrade, Mr. Clarke says.
“I didn’t do this for money – I didn’t get paid for this,” he says. “I just wanted the problem fixed and nothing I heard from the company ever indicated that would happen.”
Everywhere he went during his day, he says, he saw the systems he knew how to hack sitting there vulnerable – from traffic light control boxes to power substations.
He learned about the vulnerabilities after buying the company’s devices off eBay “when they showed up cheap,” says Clarke in an interview. “This is something I do in my spare time with my own money. I’m just this guy on the street who knows how to do very bad things to important equipment, and I couldn’t stand the feeling that so many systems – even in our military – were so vulnerable.”
He hopes a fix will come out now that the US Computer Emergency Readiness Team (US-CERT), a federal cyberwatchdog, issued a vulnerability warning Tuesday, and its sister agency focused on industrial computerized control systems put out its own warning Wednesday.
Testimonials on the RuggedCom website show how deeply embedded its equipment is inside some of the most important US systems. Located at the end of Alaska’s Aleutian island chain, about 300 miles from the coast of Siberia, the Shemya Island power plant provides power to National Missile Defense Authority facilities on the island.
“Ruggedcom switches were selected for use in the US Air Force Shemya Power Plant,” wrote Ted Creedon, chief engineer for Creedon Engineering in one testimonial for the company. “All electronics provided to the USAF were disassembled, quality inspected and burned in at the Chief Engineers office in Anchorage. Reliability was not an option.”