It is night over Germany, October 1943. Two BMW radial engines
are pounding through your helmet and headset. The cockpit of
the Junkers Ju 88 night fighter is drafty and freezing cold, and
an enemy RAF Lancaster bomber is as easy to find as a rat in
the Berlin Opera House with all the lights out. You have the
help of a ground interception controller, which is good. You
have two controllers, which is bad. One of them is a fake, a
German-speaking RAF officer. The two controllers are trying
to shout each other down. One of them lets fly with a volley
of foul language fresh from the Hamburg waterfront. The other
responds with forced calm: "The Englishman is now swearing."
"It is not the Englishman who is swearing," screams
the other voice. "It is me!"
This early venture into information warfare spawned imitations.
In 1944, the Germans copied the British use of a "ghost"
voice controller to confuse American pilots flying close-air-support
missions over the Western Front. In one incident, a suspicious
and quick-witted U.S. Army Air Forces fighter pilot unmasked the German
controller by asking him to sing "Mairzy Doats."
In the intervening years, the goals of information warfare
(IW) have not changed. An attacker's goal is still
to put the adversary's resources in the wrong place, so
that they miss their targets or blunder into the attacker's
prepared defenses. An attacker wants to disrupt his adversary's
support operations, so that their forces do not get the spares,
fuel or ammunition that they need. The attack also works on
a second level: every successful IW strike makes the adversary
less and less confident in his own systems.
While such goals of information warfare have remained constant,
the tools of IW have changed beyond recognition. Information
exchanged over computer networks is now as vital to almost every
aspect of military operations as it is to business. From mission
plans to targeting and threat data, the military relies on information
to such a degree that the loss of data integrity would be disastrous.
To cite a simple example: if a hacker were to change one digit
of one time designation in an air tasking order (ATO), he could
cause a formation of F-16s to reach their refueling rendezvous
and find no tanker waiting for them. At best, the F-16s'
mission would be aborted; at worst, they could be unable to
recover safely to their bases.
But you can't ask a computer to sing "Mairzy Doats."
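The ATO example also hints at the countermeasure: a cryptographic integrity check makes even a one-digit change detectable. The key and message format below are hypothetical, and this is only a sketch of the general technique, not anything drawn from an actual tasking system.

```python
import hashlib
import hmac

SECRET_KEY = b"shared-mission-key"  # hypothetical pre-shared key

def sign(ato_line: str) -> str:
    """Attach an HMAC tag so any later change to the line is detectable."""
    return hmac.new(SECRET_KEY, ato_line.encode(), hashlib.sha256).hexdigest()

def verify(ato_line: str, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign(ato_line), tag)

original = "REFUEL F-16 FLIGHT AT TRACK ALPHA 0430Z"
tag = sign(original)

tampered = "REFUEL F-16 FLIGHT AT TRACK ALPHA 0450Z"  # one digit changed
print(verify(original, tag))   # True
print(verify(tampered, tag))   # False
```

Without the shared key, an intruder who changes the rendezvous time cannot produce a matching tag, so the alteration is caught before the formation launches.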
A New Angle on Defensive IW
The traditional approach to defensive IW has been security.
Locked doors protect the computers that access a classified
network. Firewalls, passwords and cards protect the data in
the computers. And the data itself may be routed along separate
lines. In the ultimate incarnation of security-the Department
of Defense's TEMPEST standards-computers, peripherals and
the rooms where they were located were elaborately sealed to
prevent the leakage of stray electrons.
For several years, a small and cohesive team of Northrop Grumman
researchers has been supporting Pentagon customers in the development
of a new and fundamentally different approach to defensive IW.
"Our strategy has been to keep the adversary outside our
networks-just like the Maginot Line," says Dr. Stephen
Taylor of the Air Force's Rome Laboratory. "But what
happens when there is no longer an outside-when
we are just inhabitants of cyberspace along with our adversaries?"
That cohabitation is increasingly the case. The military's
need for information is outrunning the capacity of its private
networks. Unique hardware is too expensive to cover all of the
Pentagon's needs. Coalition warfare links U.S. military
networks to portals that the United States does not own.
Moreover, says Taylor, the future adversary will be smart and
sophisticated, "not the lone hacker or terrorist but a
nation-state capable of coordinated attacks. They'll use
speed, mobility, and deception. They'll try to occupy the
space behind our firewalls and subvert our sensors. They'll
be creative and will find ways into the system. And they can
afford to pay insiders." In short, Taylor believes, there
is no practical, 100 percent effective way of keeping intruders
out of the system.
Instead, Taylor has coined the term "information resiliency"
to describe another level of defense, one that assumes that
intruders will break in, but which works in several ways to
limit the damage that they cause.
Forecasting and Recovering From an Attack
Northrop Grumman's work in this area started in 1988 when
the company began to address IW survivability issues, according
to Greg Swain, director of Information Assurance at Logicon,
the company's information technology subsidiary. "We
then began to look at how susceptible our products might be."
The Northrop Grumman team, which includes people from its Electronic
Sensors and Systems Sector, has based its work on the timeline
of a cyberattack. As in any military operation, the hypothetical
cyberattack will begin with reconnaissance. This allows the
attacker to locate points where the system might be accessed.
Then, the attacker will attempt to enter the system, starting
with a basic access level and then working to establish as much
access as possible. For example, an attacker who can emulate
the system manager can establish multiple points of access and
eliminate records of his own access to the system. The deeper
the access, the more options an attacker enjoys when the attack
is mounted. After an attack, the intruder will cover his tracks
and the defender has to restore the integrity of his data.
Commercially available products can detect the actual attack
and can do a reasonably good job of safeguarding data in
the long term. The Northrop Grumman team has focused on two
unique areas-forecasting an attack and responding in real time
immediately after the attack-which allow the user to react rapidly
to an attack, or even forestall it, and let the defender re-establish
the system's operation as soon as possible. Forecasting
and recovery are linked, because recovery is easier if the attack
is not entirely unexpected.
Investigative NEWS Reports
That technology is now being championed by the Network Early
Warning System (NEWS) program, which has been developed by the
Air Force's IW Battlelab. "What's really notable
about that system," comments Paul Zavidniak, a senior technical
staff member at Logicon, "is that it sits above all the
other systems." NEWS can be set up to forecast an attack
in almost any network-based system and can use data from all
of them to generate a comprehensive picture of an impending attack.
"There are no rules in IW," says Dennis McCallam,
a Logicon senior technical staff member. "To say, 'They're
not going to do that,' is a bad assumption. But there always
have to be precursor events." Comments Zavidniak, "Reconnaissance
and analysis always precede an attack. That attack may be months
ahead, but someone first has to know where you are situated."
NEWS looks for 26 different attributes that may indicate that
a network event-a ping, a contact or a log-on session-could
be a precursor to an attack. They include the time of day, the
apparent source of the event and even the length of time between
keystrokes. NEWS may look for internal events such as unusual
hard drive activity. NEWS could be connected to the security
system of the premises where the system was accessed; if the
individual whose code or workstation was used to access the
system was not signed into the building, there could be a problem.
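The article names only a few of the 26 attributes (time of day, apparent source, inter-keystroke timing, building badge records); the full set is not public. A rule-based sketch of this kind of precursor flagging, with invented attribute names and thresholds, might look like:

```python
from dataclasses import dataclass

@dataclass
class NetworkEvent:
    hour: int                # local hour of the event (0-23)
    source: str              # apparent origin address
    keystroke_gap_ms: float  # mean time between keystrokes
    badge_present: bool      # is the account owner signed into the building?

def precursor_flags(ev: NetworkEvent, known_sources: set) -> list:
    """Return human-readable warnings; hypothetical checks, not NEWS itself."""
    flags = []
    if ev.hour < 6 or ev.hour > 22:
        flags.append("off-hours access")
    if ev.source not in known_sources:
        flags.append("unrecognized source")
    if ev.keystroke_gap_ms < 20:  # faster than human typing suggests a script
        flags.append("automated keystroke timing")
    if not ev.badge_present:
        flags.append("user not badged into facility")
    return flags

ev = NetworkEvent(hour=3, source="10.9.8.7",
                  keystroke_gap_ms=5.0, badge_present=False)
flags = precursor_flags(ev, known_sources={"10.0.0.1"})
print(flags)
```

A 3 a.m. log-on from an unknown address, typed faster than any human and from a user who never entered the building, trips all four checks at once; no single attribute is conclusive, which is why the real system weighs many of them together.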
NEWS draws on probabilistic forecasting technology of the kind
that has been developed to predict failures of critical mechanical
systems, such as helicopter transmissions. "It's not
just a signature," says Zavidniak. "It infers and
forecasts from a broad perspective."
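Zavidniak's point that the system "infers and forecasts from a broad perspective" rather than matching a single signature suggests combining many weak indicators probabilistically. One textbook way to do that (purely illustrative; the article gives no detail on NEWS's actual method) is a naive-Bayes log-odds update with likelihood ratios, all numbers below being made up:

```python
import math

# Illustrative likelihood ratios: P(indicator | attack) / P(indicator | benign).
# A real system would estimate these from historical incident data.
LIKELIHOOD_RATIOS = {
    "off-hours access": 3.0,
    "unrecognized source": 5.0,
    "automated keystroke timing": 8.0,
}

def attack_probability(indicators: list, prior: float = 0.01) -> float:
    """Multiply prior odds by each observed indicator's likelihood ratio."""
    log_odds = math.log(prior / (1.0 - prior))
    for ind in indicators:
        log_odds += math.log(LIKELIHOOD_RATIOS.get(ind, 1.0))
    odds = math.exp(log_odds)
    return odds / (1.0 + odds)

# Two mild indicators lift a 1% prior to roughly a 13% attack probability.
p = attack_probability(["off-hours access", "unrecognized source"])
print(round(p, 3))
```

The output is a probability for a human analyst to weigh, not an automatic verdict, which matches the program's stated intent of keeping the analyst in the loop.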
But "prediction" is a strong word, cautions
Zavidniak. "This is engineered to be a decision aid to
the analyst," he says. "We don't want to take
the analyst out of the loop." Warned of an attack, the
defender then has a number of options. Taking instant action
to deny access to an intruder may not be the best choice. "Hiding
behind a wall trying to protect your home is not giving you
the clues that you need," Zavidniak says. Instead, the
analyst may operate at a higher level of awareness, preparing
back-up data and letting the intruder think he is undetected.