
Malicious code: An ethical dilemma

Glenn Watt
Proceedings of the 12th National Computer Security Conference, Baltimore, October 12, 1989, pp. 542-546
October 1989


Abstract

In the early 1980s, the city of Cambridge, Massachusetts, voted to petition Harvard University to temporarily halt the construction of a very expensive laboratory for specialized genetics research. This action, initiated and supported by distinguished members of the faculty, recognized the potentially dangerous situation at hand. This example is typical of what professionals usually do when they encounter an immature technology. Information about the atomic bomb and other such devices was likewise tightly controlled by military professionals with an ethical standard that demanded control to assure the protection of the larger community. A technology equally dangerous to the national computer security community is malicious code. It is a problem that has crossed international borders and threatens the integrity of every type of system, from personal computers to supercomputers. In 1985, J.M. Carroll and H. Juergensen presented mathematical proofs showing that no current state-of-the-art time-sharing, multiprogramming environment could simultaneously support security and integrity without compromising protection, efficiency, or both [1]. The National Computer Security Center's (NCSC) Trusted Computer System Evaluation Criteria (TCSEC) and Trusted Network Interpretation (TNI) guidelines do not specifically address viruses. In fact, the Internet Virus of 1988 might have propagated on a B2 system, and perhaps even on an A1. Will technology alone solve the problem of malicious code? If not, how should we then compute?

