Tuesday, September 22, 2009

10 Critical Trends for Cybersecurity

The Internet, private networks, VPNs, and a host of other technologies are quickly weaving the planet into a single, massively complex "infosphere." These connections cannot be severed without overwhelming damage to companies and even economies. Yet, they represent unprecedented vulnerabilities to espionage and covert attack.


"Cybersecurity is the soft underbelly of this country," outgoing U.S. National Intelligence Director Mike McConnell declared in a valedictory address to reporters in mid-January. He rated this problem equal in significance to the potential development of atomic weapons by Iran.
McConnell does not worry so much that hackers or spies will steal classified information from computers owned by the government, the military, or contractors working for them on secret projects. He is afraid they will erase it and thereby deprive the United States of critical data. "It could have a debilitating effect on the country," he said.

With this concern in mind, Forecasting International undertook a study of factors likely to influence the future development of information warfare.

Real-world attacks over the Internet are also possible. In March 2007, the Department of Energy's Idaho National Laboratory conducted an experiment to determine whether a power plant could be compromised by hacking alone. The result was a diesel generator left smoking and on fire, wrecked by malicious data that could easily have been sent to it over the Internet from anywhere in the world. In January 2008, a CIA analyst told American utilities that hackers had infiltrated electric companies in several locations outside the United States. In at least one case, they had managed to shut off power to multiple cities.

We conclude that information warfare will be a significant component in most future conflicts. This position is in line with both U.S. military doctrine and white papers published by the Chinese People's Liberation Army. One study estimates that as many as 120 governments already are pursuing information warfare programs.

Repeated reports that Chinese computer specialists have hacked into government networks in Germany, the United States, and other countries show that the threat is not limited to relatively unsophisticated lands. A 2007 estimate suggested that hackers sponsored by the Chinese government had downloaded more than 3.5 terabytes of information from NIPRNet, a U.S. government network that handles mostly unclassified material. More disturbingly, The Joint Operating Environment 2008: Challenges and Implications for the Future Joint Force (the JOE) comments that "our adversaries have often taken advantage of computer networks and the power of information technology not only to directly influence the perceptions and will of the United States, its decision-makers, and population, but also to plan and execute savage acts of terrorism."

Many factors guarantee that the role of information warfare in military planning and operations will expand greatly in the next two to three decades. These include the spread of new information technologies such as Internet telephony, wireless broadband, and radio-frequency identification (RFID); the cost and negative publicity of real-world warfare; and the possibility that many information operations can be carried out in secret, allowing successful hackers to stage repeated intrusions into adversaries' computer networks.


10 Critical Trends for Cyberwar

Forecasting International rates the following as the 10 most significant trends that will shape the future of information warfare. This ranking is based largely on the responses of our expert panelists, but also on our own judgment, developed over 50 years of trend analysis and extrapolation in military and national-security contexts. In nearly all cases, these two inputs agreed.

1. Technology Increasingly Dominates Both the Economy and Society

New technologies are surpassing the previous state of the art in all fields. Laptop computers and Internet-equipped cell phones provide 24/7 access to e-mail and Web sites.

New materials are bringing stronger, lighter structures that can monitor their own wear. By 2015, artificial intelligence (AI), data mining, and virtual reality will help most organizations to assimilate data and solve problems beyond the range of today's computers. The promise of nanotechnology is just beginning to emerge.

Ultimately, speculations may prove correct that we are approaching the "Singularity's event horizon." At that time, our artifacts will be so intelligent that they can design themselves, and we will not understand how they work. Humanity will be largely a passenger in its own evolution as a technological species.

Implications for Information Warfare and Operations: The growing domination of technology is the ultimate foundation for cyberwar. Complex, often delicate technologies make the world a richer, more efficient place. However, they also make it relatively fragile: it becomes difficult to keep industries and support systems functioning when something disrupts their computer controls and monitors, and the opportunities for such disruption are proliferating rapidly.

A frequently overlooked scenario is the use of infotech by organized crime, according to consulting futurist Joseph F. Coates. "It is 2015, and the Mafia electronically wipes out the records of a modest-sized bank in Texas or Nebraska, and then quietly visits a small group of large financial services organizations with a simple message: 'We did it; you could be next. This is what we want, to protect you.'"

Futures-studies scholar Stephen F. Steele notes, "Cyber systems are not simply information, but cyber cultures. Coordinated cyberattacks at multiple levels will be capable of knocking out the macro (national defense systems), meso (local power grids), and micro (starting an automobile) simultaneously."

2. Advanced Communications Technologies Are Changing the Way We Work and Live

Telecommuting is growing rapidly, thanks largely to e-mail and other high-tech forms of communication. However, the millennial generation has already abandoned e-mail for most purposes, preferring to use instant messaging and social-networking Web sites to communicate with their peers. These and other new technologies are building communities nearly as complex and involved as those existing wholly in the real world.

Implications for Information Warfare and Operations: This is one of the two or three critical trends that give information warfare and operations their significance.

As our institutions integrate their operations, their connectivity makes them more vulnerable to unauthorized access. As they redesign their operations to take advantage of the efficiencies that computers offer, they also open them to disruption by technologically sophisticated adversaries.

Disruption may not be overt or easily detected. With manufacturing systems increasingly open to direct input from customers, it might be possible to reprogram computer-controlled machine tools to deliver parts that were subtly out of spec, and to rework the specifications themselves so that the discrepancies would never be noticed. If the tampering were carried out with sufficient imagination and care on well-selected targets, the products might conceivably pass inspection, yet fail in the field. This could have significant military implications.
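To make the scenario concrete, here is a minimal sketch of the kind of integrity check that might catch such tampering: before parts are gauged, the machining specification is verified against an authentication tag recorded when the spec was approved, so a silently reworked spec would be detected. The key handling, file format, and sample readings are illustrative assumptions, not a description of any real production system.

```python
import hashlib
import hmac
import json

# Illustrative only: the key, spec format, and gauge readings are assumptions.
SPEC_KEY = b"integrity-key-held-off-the-production-network"

def spec_is_authentic(spec_bytes: bytes, recorded_tag: str) -> bool:
    """Recompute the spec's HMAC and compare it with the tag recorded at approval time."""
    tag = hmac.new(SPEC_KEY, spec_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, recorded_tag)

def out_of_spec(measured_mm: float, nominal_mm: float, tolerance_mm: float) -> bool:
    """Flag a part whose measured dimension drifts beyond the allowed tolerance."""
    return abs(measured_mm - nominal_mm) > tolerance_mm

if __name__ == "__main__":
    spec_bytes = json.dumps({"nominal_mm": 25.00, "tolerance_mm": 0.05}).encode()
    recorded_tag = hmac.new(SPEC_KEY, spec_bytes, hashlib.sha256).hexdigest()  # stored when the spec was approved

    if not spec_is_authentic(spec_bytes, recorded_tag):
        print("Specification has been altered; halt production and investigate.")
    else:
        spec = json.loads(spec_bytes)
        for reading in (25.02, 25.09):  # sample gauge readings in millimeters
            status = "out of spec" if out_of_spec(reading, spec["nominal_mm"], spec["tolerance_mm"]) else "ok"
            print(reading, status)
```

The point is not this particular mechanism, but that any such check must anchor trust somewhere outside the systems an attacker is able to rewrite.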

"The Internet is a mess, open to all kinds of uses, misuses, antisocial material, irksome intrusions from ads, identity theft, international swindles, and on and on," observes Coates. "For these reasons, as well as the potential for national-security interventions and general hell raising, it is time to plan, design, and execute over the next five to seven years a replacement for the Internet."

Infotech and business management consultant Lawrence W. Vogel calls attention to the impacts of cloud computing (third-party data hosting and service-oriented computing) and Web 2.0 applications (social networking and interactivity). "The cybersecurity implications associated with cloud computing, whether a public or private cloud, are significant," he says. "As more companies and the government adopt cloud computing, they become more vulnerable to disruption and cyberattacks. This could result in disruption in services and the ability to rapidly access critical software applications. And with the widespread use of Facebook, blogs, and other social-networking applications in our personal lives, government organizations are seeking similar capabilities for communicating and interacting with their stakeholders. Once the government permits interactive, two-way communications over government networks, the chance for cyberattacks dramatically increases."


3. The Global Economy Is Growing More Integrated

Critical factors here include the rise of multinational corporations, the relaxation of national distinctions (e.g., within the European Union), the growth of the Internet, and computerized outsourcing of jobs to low-wage countries.

Implications for Information Warfare and Operations: The Internet, private networks, virtual private networks, and a host of other technologies are quickly weaving the planet into a single, massively complex "infosphere." These nearly infinite connections cannot be severed without overwhelming damage to companies and even to national economies. Yet, they represent unprecedented vulnerabilities to espionage and covert attack. This is another major trend for information warfare and operations.

"Another thing to think about here [is that] the sheer volume of information racing through the 'infosphere' enhances the opportunity for cyberwar operators to embed encrypted information within routine data flows," says law enforcement strategic planner John Kapinos. "This could take the form of system-disabling viruses, or secret message traffic concealed within an ocean of regularly transmitted, legitimate data. Sophisticated data-monitoring programs designed to detect unusual patterns would be needed to counteract such a scheme."

Futurologist Ian D. Pearson adds that, as interactions become more complex, "it will be harder to spot points of vulnerability. Fraud and cyberterrorism will increase."

The actors in cyberwarfare now include non-state entities, points out strategic-planning consultant Frank Sowa of the Xavier Group Ltd. "Corporations in the twenty-first century are borderless and are not geopolitical," he argues. "The key to actively thwarting cyberwarfare is to recognize corporations and organized religions on the same, or even higher, protocol than geopolitical governments and borderless, non-geopolitical terror and extremist operations."

4. Research and Development Play a Growing Role in the World Economy

Total U.S. outlays on R&D have grown steadily in the past three decades. Similar trends are seen in China, Japan, the European Union, and Russia.

Implications for Information Warfare and Operations: This trend is responsible for the accelerating technological advances seen in recent decades. It is another critical factor in the development of information warfare.


The chief product of R&D is not clever new merchandise or technologies, but information. Even the most sensitive research results are routinely stored in computers, shipped through company intranets, and often transmitted over the Internet. This accessibility makes them a prime target for espionage, whether industrial or military. The problem has been growing nearly as quickly as the mass of information available to prying eyes. It will be a still greater concern for security specialists in the years ahead.

Many R&D programs promote the dissemination of research results, observes public-policy specialist Mark Callanan of the Institute for Public Administration in Dublin. "While this is of course entirely sensible for the vast majority of research, the emphasis on getting as much information out there [as possible] may pose additional security dilemmas in terms of cybercrime," he argues.

Pearson adds that "the downside is that R&D also occurs in weapons tech, so there is always a background arms race. High-capability technologies will present enormous threats to mankind in the second half of this century."

5. The Pace of Technological Change Accelerates with Each New Generation of Discoveries and Applications

In fast-moving engineering disciplines, half of the cutting-edge knowledge learned by college students in their freshman year is obsolete by the time they graduate. The design and marketing cycle (idea, invention, innovation, imitation) is shrinking steadily. As late as the 1940s, the product cycle stretched to 30 or 40 years. Today, it seldom lasts 30 or 40 weeks.

The reason is simple: Some 80% of the scientists, engineers, technicians, and physicians who ever lived are alive today, and they are exchanging ideas in real time on the Internet.

Implications for Information Warfare and Operations: As new technologies arrive, industry will be forced to hire more technology specialists and to train other employees to cope with new demands. Some support functions may be moved offshore, where technically knowledgeable adversaries might have greater access to them, opening the way to disruption.

"It is important in the discussion not to neglect the large amount of information technology now obsolescent or obsolete, but [still] in place," observes Joe Coates.

The advance of machine intelligence will also have confounding implications for cybersecurity. According to knowledge theorist and futurist Bruce LaDuke, "Knowledge creation is a repeatable process that is performed by humans and could be performed by machines exclusively or in systems built to interact with humans ('man-in-the-loop' systems). Artificial knowledge creation will usher in [the] Singularity, not artificial intelligence or artificial general intelligence (or technology advancing itself). Artificial intelligence has already been achieved by any computer, because intelligence is appropriately defined as knowledge stored that can be retrieved (by human or computer). The first arriver to [artificial knowledge creation] technology will drive the entire paradigm shift."


6. The United States Is Ceding Its Scientific and Technical Leadership to Other Countries

In June 2009, a U.S. National Security Agency-backed "hacking" competition pitted 4,200 programmers from all over the world against one another in algorithm coding and other contests; of the finalists, 20 were from China, 10 were from Russia, and only two were from the United States, reports Computerworld. "We do the same thing with athletics here that they do with mathematics and science there," says Rob Hughes, president of TopCoder, the software development company that operates the annual competition. Hughes argues that the United States needs to put more emphasis, and earlier, on math and science education.

"The scientific and technical building blocks of our economic leadership are eroding at a time when many other nations are gathering strength," the National Academy of Sciences warns. "Although many people assume that the United States will always be a world leader in science and technology, this may not continue to be the case inasmuch as great minds and ideas exist throughout the world."

R&D spending is growing in raw-dollar terms, but, when measured as a percentage of the total federal budget or as a fraction of the U.S. GDP, research funding has been shrinking for the last 15 years. Only half of U.S. patents are granted to Americans, a proportion that has been declining for decades.

More than half of U.S. scientists and engineers are nearing retirement. At the rate that U.S. students are entering these fields, the retirees cannot be replaced except by recruiting foreign scientists.

Implications for Information Warfare and Operations: To whatever extent the United States loses its leadership in science and technology, it falls behind other countries in the intellectual and personnel base required for information warfare and operations. If this trend is not reversed, the United States could find itself at a significant disadvantage in this strategically and tactically important area.

"The strength of the United States is in knowledge creation under the auspices of innovation and invention that has been applied in all kinds of technologies," argues LaDuke. "Ceding existing technology as technology converges and rises exponentially is not as significant as not creating the knowledge that is empowering future advances in technology."

Pearson adds, "The increased power of smart individuals is more of a problem, especially in NBIC [nanotech, biotech, infotech, and cognitive science] areas. Unabomberstyle activity from inconspicuous people within a community is more of a danger than hostile states or terrorist groups."

Steele warns that, "not only is the United States ceding the 'left brain' sciences, but the continuation of a linear, industrial model for education has [it] ceding a growing need for 'right brain'-creative and synergistic -- thinking."

7. Technology Is Creating a Knowledge-Dependent Global Society

More and more businesses-and entire industries-are based on the production and exchange of information and ideas rather than exclusively on manufactured goods or other tangible products. At the same time, manufacturers and sellers of physical products are able to capture and analyze much more information about buyers' needs and preferences, making the selling process more efficient and effective.

Implications for Information Warfare and Operations: Increasing dependence on technology effectively translates to growing fragility. Disrupt essential information or communications systems, and a company, government agency, or military unit could be dead in the water, or at least cut off from oversight and coordination with its partners. Telecommuting systems, for example, offer several obvious opportunities to disrupt the operations of the company or agency that depends on them.

"The 'bunker-buster' ammunition that could be brought to bear within the context of cyberwar has not yet been deployed (or at least apparently not yet in a manner that has worked well)," says Cynthia E. Ayers, a security specialist and visiting professor at the U.S. Army War College's Center for Strategic Leadership. "How knowledge-dependent populations react-or how 'new media' societies are capable of reacting-when such weapons are deployed may ultimately determine their fate. The chaos that could be caused either under a limited (homemade) EMP [electromagnetic pulse] scenario or as a result of one or more high-altitude nuclear blasts would be devastating to a Western population in many ways. The losses incurred would make the current economic downturn seem like a mere irritant."

Lt. Col. Kevin Gary Rowlatt of the Australian Army observes that "countermeasures to cyberthreats developed by us will impede our ability to work effectively, let alone efficiently. Firewalls, authentication, and encryption programs have the potential to slow the flow of information. An enemy would love to slow down some decision cycles. This approach would allow them to achieve the aim simply by presenting a threat, be it credible or virtual. We become distrustful of information contained or processed within cyber networks."


8. Militant Islam Continues to Spread and Gain Power

It has been clear for years that the Muslim lands face severe problems with religious extremists dedicated to advancing their political, social, and doctrinal views by any means necessary. The overthrow of Saddam Hussein and the American occupation of Iraq have inspired a new generation of jihadists, who have been trained and battle-hardened in the growing insurgency.

Implications for Information Warfare and Operations: Attacks on information systems are another category of assault that Muslim radicals could mount against their chosen enemies in the West. One likely source of such an attack would be India, a land with a substantial Muslim minority (about 150 million people) and strong computer and communications industries.

As Ayers observes, "It has long been noted that radical Islamists have been using the Internet to preach, recruit, glorify suicide-bombers, and perform training on a global basis. The 'e-possibilities' for Islamic militants are obviously limited only [by] the imagination, just as they are for more harmonious or legitimate activities. The cyberworld offers a wealth of opportunity to engage in the spread of Islam, followed by, or in conjunction with, a cyberwar that would be seen as just in the Islamic tradition."

9. International Exposure Includes a Growing Risk of Terrorist Attack

Terrorism has continued to grow around the world as the wars in Iraq and Afghanistan proceed, even as the rate of violence in Iraq itself has declined. Nothing will prevent small, local political organizations and special-interest groups from using terror to promote their causes.

On balance, the amount of terrorist activity in the world will continue to rise, not decline, in the next 10 years. In fact, terrorist attacks have risen sharply since the invasion of Iraq, both in number and in severity.

Implications for Information Warfare and Operations: Until the terrorist problem is brought under control, which will probably not happen for at least a generation, we will face a growing threat that Muslim extremists will master computer and Internet technologies and use their skills to disrupt essential communications and data. The impact will be felt in U.S. corporations, research laboratories, universities, utility companies, and manufacturing. Cyber operations will be at best second choices for many terrorists, who prefer the newsworthy gore of attacks with bombs and firearms. However, their potential for maximum economic impact with minimum risk eventually will make them irresistible to forward-looking extremists.

"National security needs to address the freedom that big business has in moving its IT services off shore," says Rowlatt. "If a business is a major contributor to a nation's GDP, then what right does it have to expose its 'cyber underbelly' to a foreign power, which in turn, exposes the nation to unnecessary cyber risks? Look at how terrorists targeted Mumbai, the cyber center for India, which serviced many international organizations' IT needs."

10. The World's Population Will Grow to 9.2 Billion by 2050

The highest fertility rates are found in the countries least able to support their existing people: the Palestinian Territories, Yemen, Angola, the Democratic Republic of Congo, and Uganda. In contrast, populations in most developed countries are stable or declining. The United States is a prominent exception.

Implications for Information Warfare and Operations: The world population's growth in itself is less significant than where that growth is concentrated. "India already has the largest supply of English-speaking [people]," observes Francis G. Hoffman, research fellow at the Marine Corps Center for Threats and Opportunities. "The educational systems in the latter will not support the advancement of knowledge workers to any degree, and could be swamped by poor governance, lack of services, and chronic disorder. Many places in Asia will experience some of the same downsides of large population growth without adequate governance, services, and education."

Steele adds that the disparities in population growth will widen the gap between developed and developing worlds, producing "environments of anomie and alienation as a breeding ground for terrorist ideology." Moreover, increased education and technological sophistication in the developing world could compound these problems. Steele argues, "A growing proportion of the world's population (including the developing world) is gaining primary and secondary-school-equivalent education. The diffusion of cyber systems in the developing world increases opportunity for global cyberwar."

Conclusion: Lessons for Avoiding Cyberwar

Our major concern is no longer weapons of mass destruction, but weapons of mass disruption. The cost of "going nuclear" is simply too high for atomic weapons to be used by any but a rogue state unconcerned with its own survival. Cyberweapons may kill fewer people, but they can have enormous economic impact. A particularly clever opponent might even carry out a devastating attack without ever being identified or facing retribution. Information has become the battlefield of choice. It will remain so well into the future.


Lesson number one: As the world becomes more dependent on information technology, it becomes more fragile. It is possible to make any specific site or network more secure, but not the "system" as a whole. As network connections proliferate, as electronic controls (for example, of petroleum refineries, chemical plants, or electrical grids) become more complex and interlinked, and as the number of users grows, the opportunities to interfere with operations expand exponentially. There is a growing possibility that even accidental missteps could cause significant harm. This damage would not necessarily be limited to data but could strike at real-world infrastructure, with potentially devastating effects. Economic losses could be severe, and loss of life is possible.

Lesson number two: Cybercrime could be as significant as cyberwar. Four members of our panel cited profit-motive information crimes as a problem of potential importance. An information "protection racket" aimed at financial institutions could entail serious economic risks, and perhaps security risks as well. These crimes might use many of the same techniques as information warfare and could be difficult to distinguish from it. Indeed, in a world where rogue governments have supported themselves in part through counterfeiting major currencies, there may be no useful distinction. However, it is not clear that cyberwar and cybercrime will be amenable to the same countermeasures.

Lesson number three: The rise of artificial intelligence will change the nature of cyberwar. As computer systems "learn" to imitate human reasoning and skills, the nature of cyberwar will change. Instead of relying on human hackers to carry out their attacks, antagonists will automate their information warfare, relying on AI systems to probe opposing defenses, carry out attacks, and defend against enemy AI. This competition will quickly outstrip human control, or even monitoring. This is one aspect of the hypothetical "Singularity," the time when artificial intelligence exceeds our own and it becomes impossible even in theory to predict what will happen in the further future.

Lesson number four: The United States is losing its leadership in critical technologies. As other countries build up their technological capacity, the United States is allowing its own to deteriorate. As China and India turn out more scientists, engineers, doctors, and technicians, the United States has been producing fewer. As other lands spend more on research and development, the United States has been spending less. And as other countries devote more of their research budgets to fundamental science, where breakthroughs happen, the United States has focused increasingly on short-term applications. All this may put America at a serious disadvantage in future cyberwars.


In the spring of 2009, the U.S. government undertook new steps to meet this threat. The Pentagon is in the process of creating a Cyber Command center with the aim of protecting the Department of Defense's 17,000 networks and 7 million computers from attack. President Obama also announced a new "cyber czar" position within the administration. Scott Charney, former head of security at Microsoft, is said to be at the top of the shortlist. The question of how effective any one cabinet official can be against a cyberattack remains unanswered.

"They're still trying to fight this problem within the traditional command and control structure," says Patrick Tucker, senior editor of THE FUTURIST. "How does a czar take down an international, unaffiliated network of anonymous attackers? It's like using a hammer against killer bees."

Many questions about information warfare remain to be answered. What are the most likely targets? How would they be attacked? What are the probability and potential impact of each attack? What would the consequences be in terms of human lives, economic cost, and continuing disruption? How could we tell such an attack was coming? And most importantly, what could we do to stop it? These future-critical questions urgently need further study.