IWS - The Information Warfare Site

Ethics in Military and Civilian Software Development


© 1999 Sam Nitzberg http://www.iamsam.com sam@iamsam.com


Abstract: The modern world is a world driven by information. One sensible way to divide the computing disciplines is into aspects dominated by civilian industry and aspects dominated by military-based industry. In either realm, there are increasingly demanding requirements for computing systems to meet complex, highly information-dependent processing objectives. The quality with which systems are developed, for either civilian or military purposes, has systemic, infrastructure-wide consequences. A number of the ethical considerations present in developing military and civilian software are similar, and are examined in this paper.


I Introduction


There are a number of very close similarities between "Civilian" industry and "military-based" industry. There are some differences; while military budgets are generally shrinking, commercial industry is enjoying a very prosperous time. The author does not make any claims of being a "dove," and any discussion of whether this funding trend is fundamentally good or bad is beyond the scope of this paper.


Some issues pertinent to both types of industry are illustrated in the following table:



Civilian Industry vs. Military-Based Industry

Secrets must be maintained:
  Civilian - Trade secrets are protected by trade-secret laws, as well as by other laws which specifically protect corporations from industrial espionage.
  Military - Military and state secrets are protected by assorted governmental (e.g. federal) and organizational (e.g. Army or DOD - Department of Defense) laws and regulations.

Adversaries exist:
  Civilian - Industrial espionage is a modern-day fact, especially in high-technology enterprises. The global market exacerbates competitive pressures and the threats due to industrial espionage. Industrial espionage may be launched either by private corporations, or by corporations acting on behalf of and in concert with a foreign power. [Winkler]
  Military - High-technology military / intelligence information may be sought by agents of foreign powers, or by organizations (including corporations) working on their behalf.

The world is a dangerous place:
  Civilian - Executives in civilian industry have been targeted for kidnapping by organized and unorganized crime enterprises.
  Military - Civilians and Defense / Intelligence industry employees may be targeted by foreign intelligence services.

Internal threats:
  Civilian - Employees may sell corporate secrets.
  Military - Defense industry workers may abuse their access to material in order to make it available to intelligence services.

It is interesting to note that computer models both reflect and impact the real world. Simulations are used to model traffic flow, stock market and business situations, and the effects of nuclear devices being designed. As the results of these simulations are realized and acted upon, their impacts become very clear and real.


II Nature of the Beast


Software product developers must face certain challenges. Although a great many applications currently under development have theoretical foundations which date back decades, poor implementations and practices are very often used, resulting in buggy, if not dangerously flawed, software products. The nature of military and civilian computing systems and platforms has historically been quite distinct, but modern weapons systems are increasingly being built upon conventional software products, ill suited in many ways to the demands of the modern warfighting environment [Nitzberg].


The Patriot missile system which failed to successfully track the Iraqi Scud missile that killed twenty-eight American soldiers during the Gulf conflict may have failed to perform as desired due to a software problem [Littlewood]. A very real example of the real-life consequences of poor software development practices in civilian systems may be explored through the example of a computer-controlled X-ray medical diagnostic machine: two cancer patients were killed as a direct result of software errors in computer-controlled X-ray machines [Gotterbarn1].
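One widely reported analysis of that Patriot failure attributes it to rounding error accumulating in a fixed-point representation of the system's 0.1-second clock tick. The sketch below is an illustration of that analysis only, not actual system code, and the choice of 23 fractional bits is an assumption drawn from published accounts: an error of under a ten-millionth of a second per tick grows into a targeting-relevant drift over a hundred hours of uptime.

```python
# Illustration of accumulated fixed-point rounding error, modeled on
# published accounts of the Patriot clock drift (not the fielded code).
# 1/10 has no finite binary expansion; truncating it to a fixed-point
# value with 23 fractional bits leaves ~9.5e-8 s of error on every tick.
import math

FRACTIONAL_BITS = 23
TRUE_TICK = 0.1                # seconds per clock tick
stored_tick = math.floor(TRUE_TICK * 2**FRACTIONAL_BITS) / 2**FRACTIONAL_BITS

per_tick_error = TRUE_TICK - stored_tick        # ~9.54e-8 s per tick
ticks_per_hour = 10 * 3600                      # ten ticks per second
uptime_hours = 100                              # the reported battery uptime
drift = per_tick_error * ticks_per_hour * uptime_hours

print(f"per-tick error: {per_tick_error:.2e} s")
print(f"clock drift after {uptime_hours} h: {drift:.3f} s")
# A drift of roughly a third of a second misplaces the range gate by
# hundreds of meters against a missile closing at well over 1,000 m/s.
```

The defect is not a "bug" in the usual sense: every individual operation behaves exactly as specified, and only the interaction between representation, tick rate, and an unanticipated uptime produces the failure.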


Errors due to poor or erroneously documented requirements or specifications could allow such a system to dispense lethal doses of radiation with no actual malfunction of the unit itself; errors in the program code could likewise compel the machine to apply lethal doses. Curiously, errors in the requirements, program specifications, or code could coincidentally still permit proper operation, although the likelihood of this occurring in fielded systems is quite low. Similar issues exist in military systems. Flaws in system requirements, specifications, or program code can have very severe effects, including mission failures; erroneous requirements or specifications can result in a wide range of failures, even fratricidal engagements, as can errors in the program code itself. Traditionally, software complexity has been viewed as the source of errors in systems. At least in theory, defect-free software can be produced [Littlewood].


III DUAL-USE Technologies and the Millennium Bug


Dual-use technologies are hardware and software appliances (strictly speaking, the category also includes many biological and chemical agents) which may be put to ostensibly traditional commercial or "civilian" use, but which may also be used to improve a nation's warfighting or intelligence capability. Many technologies fall into this category. Advanced computer systems may be used for pharmaceutical research, or applied to the development of nuclear weapons; GPS (Global Positioning System) technologies may be used to assist mountain climbers, or to land warheads on target. The list of dual-use technologies is a long one. The Internet itself, a long-term descendant of work performed under the auspices of DARPA (the United States Defense Advanced Research Projects Agency), may be viewed as a dual-use technology. GPS was developed by the US military in order to better effect its many missions: it can effectively guide and provide navigational support to military vessels, combat aircraft, covert action teams, and missile systems. What was not predicted when GPS technologies were being developed was how GPS would eventually be incorporated into civilian aircraft systems, automobiles, personal GPS receivers which plug into computers equipped with mapping software, and handheld GPS units for hikers and other recreational users. It is not always possible to work on a technology and understand what its ultimate use will be. After all, the birth of computing was nestled in military technologies [NitzbergX].


The "Year 2,000 Problem," also commonly referred to as the Y2K problem actually refers to a great number of problems which will affect computers, most typically when their clocks roll-over from the last moment of 1999 to their first moment of the year 2,000, causing unpredictable system behavior. Actually, the problem is caused by methods used to efficiently represent dates and file markers on computing and embedded systems [Jager, Boutin, Comerford, Lefkon]. More dates than just Jan 1, 2000 are criticalities; one example is the arrival of the year 10,000 for systems using four digits to represent the date. A great number of systems which were built to support both mission-critical military and civilian purpose systems suffer from the Y2K problem [Boutin].


The civilian information infrastructure is presently being analyzed and corrected to address the Year 2000 problem. As the time remaining decreases, the problem actually becomes more difficult to fix: adding personnel does not necessarily render the fundamental problem easier to solve. According to Ed Yourdon (a noted software engineer), these problems do not scale well as time shrinks, and simply adding programmers as the Year 2000 approaches is analogous to trying to make a baby more quickly by impregnating nine women and expecting a baby in one month. Naturally, any discussion of the practicality, or indeed the morality, of actually making manifest such conjecture is beyond the scope of this paper.
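Yourdon's point is essentially Brooks's classic observation from software project management: as staff is added, pairwise coordination channels grow quadratically, so effort added late in a schedule yields little relief. A toy illustration of the arithmetic (the model, not a prediction about any real project):

```python
def communication_paths(n):
    # Each pair of team members is a potential coordination channel,
    # so n people have n * (n - 1) / 2 pairwise paths.
    return n * (n - 1) // 2

for team_size in (5, 10, 50, 100):
    print(team_size, communication_paths(team_size))
# Doubling a 50-person team to 100 quadruples the pairwise coordination
# channels (1,225 -> 4,950), so schedules do not shrink in proportion
# to the staff added.
```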


Some perceive the costs associated with the "Millennium Bug" as quite dire - "If you knew what the experts know, you'd be buying guns too" - and see the potential for a near-total collapse of civilization [Wired]. The author would refer to these proposed situations of failing power grids, communication systems, air traffic systems, and virtually all necessary computing functions as the "new nightfall" scenario. As if fears of an impending social collapse were not a serious enough concern for what is ostensibly a simple programming problem with a simple cause, the military has similar problems. According to an NSA (National Security Agency) representative, 'The DOD's Y2K conversion effort is a national security interest… All information detailing these information systems and the progress being made toward their conversions is considered to be highly sensitive.' [FCW]


Due to the security issues which have arisen, the DIST (Defense Integrated Support Tool) database, containing information on Y2K fixes, has been placed under security restrictions, further limiting who may access this information. On behalf of the ASD/C3I (Office of the Assistant Secretary of Defense for Command, Control, Communications, and Intelligence), the NSA performed a security analysis of the DIST database and determined that the security measures in place were insufficient to protect the data in its purview.


The DOD has been cited as falling behind other governmental organizations in fixing its Year 2000 problems, and at the time of this writing, it seems to lack a seamless method by which personnel requiring access to Y2K information may obtain it. [FCW] One DOD official has characterized the Pentagon's handling of Y2K issues as grossly mismanaged.


A presumably not atypical timeline for a nation investigating and repairing its Y2K-vulnerable systems follows [FCW]:




September 1994 - ASD/C3I identifies the need to collect data on legacy/migration systems.

August 1996 - DOD begins using DIST to collect data on DOD Year 2000 conversion efforts.

November 1996 - ASD/C3I memo tells DOD heads that "registration of information systems in DIST is mandatory."

December 1997 - ASD/C3I identifies DIST as the "central, authoritative database for tracking the resolution of Y2K-related problems."

February 1998 - ASD/C3I states that the aggregate of DIST data "could result in serious damage to national security."


In the absence of information to the contrary, it seems fair to presume that this problem is not isolated to the United States or to American military computing platforms. Many countries and organizations, including NATO, utilize sophisticated computing tools and platforms to accomplish their missions, and they must wrestle with similar problems. The effectiveness with which these issues are addressed may significantly impact nations' security postures.


John Hamre, the United States Deputy Secretary of Defense, informed the United States Armed Services Committee that the Pentagon does not believe the Y2K problem will directly result in an accidental nuclear exchange. In order to mitigate the risk of accidental nuclear exchanges related to Year 2000 issues, the Pentagon will share traditionally restricted information regarding the nature of American nuclear and missile early warning systems with other, less Y2K-prepared countries [FCW2]. Mr. Hamre has also indicated that hackers are expected to take advantage of any confusion caused by the Y2K crisis to further their ends of infiltrating both military and civilian systems [FCW2]. Fortunately, Mr. Hamre has also indicated that the Defense Department 'went into hyperdrive' with its Y2K work, and that all computer systems necessary for national defense will be ready for the millennial roll-over [APP2].


Some amazing aspects of the scope and breadth of the Year 2000 problem are that, on the small scale, the developers of Y2K-affected systems understood the temporal constraints under which their systems would function; that customers of complex systems were generally informed of the consequences and of their systems' specifications; and that, in the large picture, with multitudes of stand-alone and interacting systems, no one really knows precisely what will happen when the clocks do in fact roll over. Considering how simply the Y2K problem could have been remedied on a system-by-system basis at its source, it represents a fundamental failing in the software "sciences."


IV Professionalism


Whether persons in the software development industries are members of a profession or are merely engaged in an occupation is a common subject of debate. The licensing of software professionals is considered by some to be a possible remedy for poor practice and incompetence in the industry, and has been cited as a mechanism to answer the call to "protect us from their incompetence." [Gotterbarn2] Certainly, without some standard measures or guidelines, no approved standard for competence can exist, and therefore no malpractice or negligence can exist, either. Whether software development is seen as a profession or an occupation, certainly, more professionalism is needed.


A brief survey of professions, occupations, and associated licensing requirements follows:




Barbers / Hairdressers
  Duties include: cutting and styling of hair; applying dyes to hair; use of proper hygiene.
  Licensing: Generally required.

Manicurists
  Duties include: cutting, trimming, and grooming of nails; maintaining proper sanitization of equipment. Licenses (in New Jersey, USA) are awarded upon successfully passing a one-hundred-question test, retaken every two years to maintain licensure. Regulations are vital to minimize the risk of the spread of infection. [APP]
  Licensing: Generally required.

Civil Engineers
  Duties include: design and construction of bridges, roadways, industrial buildings and complexes, military complexes and transit systems, dams, etc.
  Licensing: Generally required.

"Personal Services"
  Duties include: regular medical testing; "safe" practices.
  Licensing: Mandatory (generally) where lawful, e.g. the Netherlands, or the United States (Nevada, outside the Las Vegas city limits).

Software Engineers
  Duties include: design and construction of medical diagnostic equipment; medical dosing systems (e.g. computer-controlled IV drips); air traffic control systems; strategic nuclear weapon control systems; command and control systems; software components of anti-aircraft / missile systems (e.g. the Patriot); aeronautic fly-by-wire systems, such as those used in the Space Shuttle, Stealth fighter and bomber aircraft, and civilian fly-by-wire aircraft; automotive computer control systems; banking and financial systems; and implementation of all of the items listed above.
  Licensing: No requirement.



The President’s Commission on Critical Infrastructure Protection has divided its work into five sectors based on commonalities amongst included industries. As defined, these sectors are:


  • Information and Communications
  • Banking and Finance
  • Energy, Including Electrical Power, Oil and Gas
  • Physical Distribution
  • Vital Human Services


The commission studied these sectors, their vulnerabilities, and approaches to the necessary solutions [PCCIP]. While these sectors, taken together, can be viewed as necessary to the general well-being of the United States (and, for that matter, any "first world" nation), they are outlined so broadly as to include almost any computerized contrivance in this domain.


One telling indicator of the quality of work is the degree to which a firm will stand behind and warrant its wares. The software industry is not well known for its warranties, but is much more famous for its legal disclaimers absolving software firms of any and all liability for their products. One such unfortunate and sweeping disclaimer follows:


Cosmotronic Software Unlimited Inc. does not warrant the functions contained in the program will meet your requirements or that the operation of the program will be uninterrupted or error-free.


However, Cosmotronic Software Unlimited Inc. warrants the diskette(s) on which the program is furnished to be of black color and square shape under normal use for a period of ninety (90) days from the date of purchase


Note: In no event will Cosmotronic Software Unlimited Inc. or its distributors and their dealers be liable to you for any damages, including any lost profit, lost savings, lost patience or other incidental or consequential damage.


We don't claim Interactive EasyFlow is good for anything - if you think it is, great, but it's up to you to decide. If Interactive EasyFlow doesn't work: tough. If you lose a million because Interactive EasyFlow messes up, it's you that's out of the million, not us. If you don't like this disclaimer: tough. We reserve the right to do the absolute minimum provided by law, up to and including nothing.


This is basically the same disclaimer that comes with all software packages, but ours is in plain English and theirs is in legalese.


We didn't really want to include a disclaimer at all, but our lawyers insisted. We tried to ignore them, but they threatened us with the shark attack at which point we relented.


Another extraordinary aspect of software marketing is that the user generally pays for software updates. In other words, even if the product is faulty or needs amendment, the user pays the software supplier to provide corrected versions.



One common myth in computing is that there are no standards for producing software code. Quite to the contrary, there are a number of standards and methods not only for producing high-quality software products, but for software testing methodologies as well [Roetzheim, Freedman, Musa]. The unfortunate reality, however, is that the de facto standard is the software disclaimer, and disclaimers tend to look very much like the specimen above.


V Quagmire


A common thread running through most of the issues which must be addressed to mitigate needless damage, in either civilian or military environs, is the simple need for organizations to maintain some degree of foresight in how they develop and deploy their systems. With due care, the proper use of software development methodologies, and a critical eye for detail, virtually every qualitative problem described above can be addressed.


Job descriptions for software professionals typically include lists of skills required to hire an individual for any given position, as well as a certain minimal number of years of experience with each skill. Naturally, such corporate or military job descriptions may also include required credentials; for example, the prospective holder of an available position might be required to possess a "Microsoft Certified Engineer" credential, or a Bachelor's, Master's, or Ph.D. degree. While such credentials may reflect a certain basic knowledge or sophistication, they do not necessarily demonstrate in any material way that a candidate will produce quality work on any safety- or mission-critical application.


"Ultimately, though, as professionals with particular roles and responsibilities, carrying out practical tasks the ramifications of which are often profoundly unclear, the sorts of guidance that many normative ethical theories provide us also depends on our social knowledge of what it is that we are doing, on our understandings of the possible impacts of the projects we undertake, and on our ability to integrate abstract ethical theories with the (apparently) more practical decisions of the workplace. Just as no code of ethics guarantees ethical behaviour, no normative ethics can compel assent or assure its own appropriate application." [Rooksby’]


Critically important elements, however, are often absent from software professionals’ resumes, and seldom appear either as prerequisites to performing needed software engineering work, or as vitally important areas in which job holders are to be trained. Such oft-neglected areas include computing security, software testing, and the consequences of "mission failures."


Virtually all commercial and military applications are built upon "closed" operating systems or applications. There are traditional methods of security analysis which allow security decisions to be based on weighing the likelihood of certain events against their potential costs [Amoroso]. The use of such "closed" systems defies this kind of analysis, and they are therefore used without any quantifiable understanding of the implicit risks.
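The traditional analysis referred to above is often expressed as an annualized loss expectancy: the cost of one occurrence of an event weighed against how often it is expected. A minimal sketch follows; all threat names and figures here are invented for illustration, and the point is that an opaque platform denies the analyst the rate estimate the computation requires.

```python
# Annualized Loss Expectancy (ALE), a standard quantitative risk measure:
# ALE = (cost of a single occurrence) x (expected occurrences per year).
# All threat names and figures below are invented for illustration.

def ale(cost, rate_per_year):
    """Return the annualized loss expectancy, or None when the event
    rate cannot be estimated (as with opaque, closed-source platforms)."""
    if rate_per_year is None:
        return None
    return cost * rate_per_year

threats = [
    ("disk failure",       20_000,  0.5),   # well-characterized hardware risk
    ("insider data theft", 500_000, 0.05),  # estimable from industry experience
    ("latent OS flaw",     150_000, None),  # closed system: rate unknowable
]

for name, cost, rate in threats:
    result = ale(cost, rate)
    if result is None:
        print(f"{name}: unquantifiable without visibility into the platform")
    else:
        print(f"{name}: ALE = ${result:,.0f} per year")
```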


Software testing is considered by many in the computing industry to be an annoyance and a hurdle to overcome in developing a product prior to its release. Professionals responsible for designing, implementing, and performing system tests may not have a background in, or familiarity with, more advanced software testing methods, and the quality and accuracy of software tests may be seriously jeopardized as a result.


VI Conclusion


Most of the challenges facing the current civilian or military software developer are not entirely new. While the technologies and mechanisms specific to any given application may be unique, what is of increasing importance and consequence is that software developers have an underlying humanistic philosophy and context in which they perform their tasks. The fundamentally sticky problem in software development has been, and continues to be, that developers must understand both the nature of their work and the consequences of their potential failures, and take steps to ensure that their projects lead to successful long-term deployment.


VII References


Amoroso, Ed, Fundamentals of Computer Security Technology, PTR Prentice Hall, New Jersey, 1994, pp. 17-29.


APP, Asbury Park Press, "Ask The Experts: Trouble Shooter," November 6, 1998, p. B7.


APP2, Asbury Park Press, "System said safe for 2000," The Associated Press, Washington, 15 Jan 1995, p. A-15.


Boutin, Paul, "The Bugs in Your Future," Wired, January 1999, pp. 76-77.


Comerford, Richard, and Perry, Tekla, "Brooding on the Year 2000," IEEE Spectrum, June 1998, pp. 68-73.


Roetzheim, William H., Developing Software to Government Standards, Prentice-Hall, Inc., New Jersey, 1991.


Forester, Tom, and Morrison, Perry, Computer Ethics: Cautionary Tales and Ethical Dilemmas in Computing, 2nd Ed., The MIT Press, Cambridge, Massachusetts, and London, England, 1994.


FCW, Brewin, Bob, Harreld, Heather, and Verton, Daniel, "NSA concerns could hamper DOD Y2K fix," Federal Computer Week, May ## 1998, p. 1.


FCW2, Brewin, Bob, and Harreld, Heather, "U.S. to share Y2K Nuclear Data," Federal Computer Week, June 8, 1998, p. 1.


Freedman, Daniel P., and Weinberg, Gerald M., Handbook of Walkthroughs, Inspections, and Technical Reviews: Evaluating Programs, Projects, and Products, Dorset House Publishing Co., New York, 1990.


Gotterbarn1, Gotterbarn, Donald, "Informatics and Professional Responsibility," http://www-cs.etsu.edu/gotterbarn/ARtPP6.htm


Gotterbarn2, Gotterbarn, Donald, "Computer Practitioners: Professionals or Hired Guns," http://www-cs.etsu.edu/gotterbarn/ArtLP1.htm


Jager, Peter de, "Y2K: So Many Bugs … So Little Time," Scientific American, January 1999, pp. 88-93.


Lefkon, Dick, and Payne, Bill, "Making Embedded Systems Year 2000 Compliant," IEEE Spectrum, June 1998, pp. 74-79.


Littlewood, Bev, and Strigini, Lorenzo, "The Risks of Software," Scientific American, November 1992, p. 62.


Musa, John, Iannino, Anthony, and Okumoto, Kazuhira, Software Reliability: Measurement, Prediction, Application, McGraw-Hill, New York, 1987.


Nitzberg, Sam, "Improving Computing Security During the Development of DOD Computerized Weapons Platforms," National Information Systems Security Conference, Crystal City, VA, 1998.


NitzbergX, Nitzberg, Sam, "Information Warfare: Advancing the Art of War -- The Computer as Agent of Destruction," work in progress.


PCCIP, The President's Commission on Critical Infrastructure Protection, Report Summary.


Rooksby, Emma, posting to Computer-ethics@mailbase.ac.uk, October 29, 1998, 10:20:27 PM.


Winkler, Ira, "Corporate Espionage," Industrial Espionage, 1997.


Wired, Poulsen, Kevin, "The Y2K Solution: Run for Your Life!!," Wired, August 1998, tagline from mailer page.