Author: Dunn, Thurman Stanley
Publisher: The University of Arizona
Item type: text; Dissertation-Reproduction (electronic)
Rights: Copyright © is ...
There are several issues, however, which should be addressed to bring the methodologies in this thesis into line with the technologies and methodologies for auditing or investigating computer systems.
Investigation vs. Audit

The first issue that will be addressed revolves around the selection of the term "The Investigation" as the title for this chapter rather than "The Audit". The reason for choosing "The Investigation" is its connotation of examining either past or current activities.
"The Audit", on the other hand, connotes an examination of past activities. This does not mean that audits never address current activities, but rather that the word "Audit" typically connotes review of past activities. These connotations are supported by the dictionary (The American Heritage Dictionary of the English Language 1976) where
the following definitions appear:
Audit - 1. An examination of records or accounts to check their accuracy. 2. An adjustment or correction of accounts.
3. An examined and verified account.
Investigate - To observe or inquire into in detail; examine systematically. Search into; to trace or track.
Notice that the general connotation of "Audit" from the above definition is past tense while the definition of "Investigate" suggests examination of either current activity or past activity. This latter type of examination is necessary for evaluating the broad spectrum of computer fraud threats. A simple review of the threats in Figure 37 should confirm this.
For example, assuming that comprehensive transaction registers are maintained for all transactions, an audit or internal review group should be able to evaluate the first threat, "Transactions Added by Data Entry/Terminal Operators". Further, this should be achievable after the fact by examining historical records, specifically, the transaction registers. However, this technique may not be adequate for program changes, file changes or improper computer operations where there may be no record of extraneous activities. It might also be inadequate for the above "transactions added" example if it is possible to add transactions without entry on the transaction register. It certainly would be inadequate for transactions if no register existed.
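The register-based examination described above can be sketched as a simple reconciliation, assuming each processed transaction carries an identifier that should also appear on the register. The data structures and transaction IDs below are hypothetical illustrations, not part of any particular system:

```python
# Sketch: "after the fact" audit of added transactions, assuming a
# comprehensive transaction register exists. Any transaction found in
# the processed file but absent from the register is an exception.
def audit_added_transactions(register_ids, processed_ids):
    """Return the set of transaction IDs with no register entry."""
    return set(processed_ids) - set(register_ids)

register = ["T001", "T002", "T003"]
processed = ["T001", "T002", "T003", "T999"]   # T999 was slipped in

exceptions = audit_added_transactions(register, processed)
print(sorted(exceptions))  # ['T999'] -- added without a register entry
```

As the text notes, this reconciliation is only as good as the register itself: if transactions can be added without a register entry being forced by the system, the exception set is incomplete.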
Where "after the fact" examination is not adequate, it is necessary to incorporate current activities as well as historical ones into the examination of computer system threats; thus, the term "Investigation" is appropriate.
In today's data processing environment it is not unusual for large systems to process hundreds of thousands of transactions per month, requiring thousands of file updates, program changes and computer operation cycles. In many cases there is simply not enough time between processes to examine every activity. Further, even if it were possible to apply enough resources to perform a total evaluation, the costs would probably be prohibitive.
The infeasibility of total evaluations is well established, certainly for large systems.
The second assumption is not too difficult to accept for activities easily evaluated through the use of traditional auditing techniques, such as transactions added where good transaction registers exist. The assumption is more difficult to accept in those instances where traditional auditing techniques are not as easily applied, such as improper computer operations. This latter category of activities will be the focus of much of the remainder of this chapter. Hopefully, the methodologies presented will diminish doubts regarding this assumption.
The third assumption, like the second, is quite believable for activities which are amenable to traditional audit techniques. Using the above example, it should be fairly easy to estimate the time required to examine each transaction in a system where good transaction registers exist. If evaluations have been made in the past, time estimates can probably be made from past experience.
If not, it should be possible to estimate reasonably by simply evaluating several transactions and recording time requirements. It may be necessary to adjust the initial estimates after a few iterations to account for increased proficiency, unusual transactions, etc.; however, it should not be too difficult to establish reliable estimates. For activities which are not amenable to traditional audit techniques the time estimates will be more difficult but should still be achievable. It is suggested that, once the methodologies for dealing with this type of activity are established, time estimates to accomplish them become feasible. Therefore, as with the second assumption, the methodologies discussed in the remainder of this chapter should alleviate many of the doubts regarding the third assumption.
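The sampling approach just described might be sketched as follows. The timings, transaction count, and adjustment factor are hypothetical illustrations only:

```python
# Sketch: estimating total examination time from a small sample of
# evaluated transactions, then extrapolating to the full volume.
def estimate_total_hours(sample_minutes, total_transactions,
                         adjustment=1.0):
    """Average the sampled timings, apply an adjustment factor (for
    increased proficiency, unusual transactions, etc.), extrapolate."""
    avg = sum(sample_minutes) / len(sample_minutes)
    return avg * adjustment * total_transactions / 60.0

sample = [4.0, 5.5, 3.5, 5.0]          # minutes per sampled transaction
hours = estimate_total_hours(sample, total_transactions=10_000,
                             adjustment=0.9)  # assume 10% speed-up
print(round(hours))  # 675
```

As the text suggests, the adjustment factor would be revised after a few iterations as actual timings accumulate.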
If, however, in the final analysis, it is determined that the capability simply does not exist to investigate a particular threat and it is not feasible to develop this capability, or if it is impossible to estimate the time required to investigate an activity, adjustments may have to be made. If, for example, the organization has no capability to examine computer operations by computer operators, zero resources could be applied to this threat. Although these resources could be applied to other threats, there is the potential that the resulting system's detection capability, as measured by the system "DQ", will be lower than if the capability had existed.
The above case of zero capability is not likely. Although it may be small, almost any investigative effort should produce some possibility of detection. Even a small detection capability might have significant abatement potential. This leads to the final observation, which revolves around the deterring effect of computer fraud detection.
Treating The Deterring Effect Of Computer Fraud Detection

Recall from the Computer Fraud Detection Model in Figure 8 that if deterrents to computer fraud are strong enough, prevention and detection are unnecessary. Thus, in an ideal situation, high morals or fear of imprisonment, for example, might totally eliminate the computer fraud threat. Such an ideal situation is, unfortunately, the exception rather than the rule. As the research showed, computer fraud is all too often not detected, and even when detected it is often not prosecuted.
Assuming that prevention and detection are required, there is, however, a relationship between these measures and deterrence. To clarify, assume that there is zero capability to detect fraudulent computer operations by computer operators. Actually, it is only necessary that the computer operators perceive a zero capability. Basically, given this total lack or perceived total lack of detection capability, it is fair to assume that those with a propensity and ability to commit fraud will do so, since would-be perpetrators will view it as a "nothing-to-lose, everything-to-gain" situation.
Now assume that, instead of a zero capability, the computer operators perceive a ten percent capability on the part of management to detect fraud. Does it follow that ten percent of would-be perpetrators will be discouraged by this perceived ten percent detection capability? How about a perceived twenty percent, forty percent, eighty percent, or ninety-nine percent capability?
If so, the relationship between perceived detection capability and deterrence would be as shown in Figure 38.
Observe in Figure 38 that there is a one-to-one positive relationship between the perceived detection capability and the percentage of would-be perpetrators deterred from committing fraud.
Does this relationship seem logical? Consider an analogy from fire insurance.

[Figure 38. Percentage Deterred (vertical axis, 0-100) versus Perceived Detection Capability: a one-to-one linear relationship.]

Given a perceived probability of zero that a fire will occur in a person's home, it is highly unlikely that any money would be spent on fire insurance. However, given a very small perceived probability of fire, just slightly greater than zero, a very large percentage of people will purchase insurance. Similarly, the relationship between perceived detection capability and deterrence might more closely resemble the curve in Figure 39 than the one in Figure 38.
The curve in Figure 39 indicates that a large percentage of would-be perpetrators will be discouraged from perpetrating fraud by even a small perceived detection capability. The curve then continues upward, but at a decreasing rate, until it reaches 100 percent on each axis, indicating that 100 percent of would-be perpetrators will be discouraged by a 100 percent perceived probability of getting caught, as in Figure 38. However, there is a significant difference in the relationship between the perceived detection capability and deterrence in the two curves. The relationship is, of course, quite complex, and it is not intended that either Figure 38 or Figure 39 be interpreted as an accurate illustration, but rather that they point out the diverse possibilities. Determining the exact nature of this relationship would require a significant research effort. In fact, it would probably make a good research topic for a dissertation in the behavioral sciences.
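The two hypothetical relationships can be contrasted by computing them side by side. The exponential form used here for the Figure 39 shape is purely an assumption chosen to illustrate "steep initial rise, diminishing returns, 100 percent at 100 percent"; it is not a researched model of deterrence:

```python
import math

# Figure 38: fraction deterred equals the perceived capability p.
def deterred_linear(p):
    return p

# Figure 39 (assumed shape): a concave curve in which a small perceived
# detection capability already deters a large share of would-be
# perpetrators; normalized so that p = 1 gives exactly 1.
def deterred_concave(p, k=20.0):
    return (1.0 - math.exp(-k * p)) / (1.0 - math.exp(-k))

for p in (0.0, 0.1, 0.5, 1.0):
    print(f"p={p:.1f}  linear={deterred_linear(p):.2f}  "
          f"concave={deterred_concave(p):.2f}")
```

With these assumed parameters, a perceived capability of only ten percent already deters well over eighty percent of would-be perpetrators on the concave curve, versus ten percent on the linear one, which is the qualitative difference the two figures are meant to convey.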
As indicated above and illustrated in the Computer Fraud Model in Chapter 3, a perceived detection capability becomes a deterrent.
The above unresolved questions indicate that the exact nature of this deterring effect is unknown. It is, therefore, impossible to totally quantify its effects. However, it is included as a topic in this chapter for two reasons: first, to suggest a lucrative area for research, and second, to discourage the superficial elimination of threats from resource allocation simply because detection capabilities are low.
There might be a strong temptation to avoid allocating resources to threat activities for which a low detection capability exists. This action might well be supported by the curve in Figure 38.
However, it would probably not be supported by the curve in Figure 39, where even a small perceived detection capability discourages a large percentage of would-be perpetrators. While the Resource Optimization Model in Chapter 7 does not deal directly with the deterring effects of perceived capabilities, it does not eliminate threats from resource allocation simply because of a low detection capability. If incremental assignment of resources to a threat with a low detection capability results in a larger positive gain to the system "DQ" value than incremental assignment to a threat with a higher detection capability, the resources are assigned to the threat with the lower detection capability. An analogy may be drawn from the example toward the end of Chapter 7, where the incremental assignment of resources to a threat with a small threat value resulted in a larger net gain to the system "DQ" value than the assignment of these same resources to a threat with a larger threat value.
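The incremental assignment rule just described can be sketched as a greedy allocation: each resource unit goes to whichever threat yields the largest marginal gain in the system "DQ" value. The diminishing-returns gain function and the threat figures below are hypothetical stand-ins for the Resource Optimization Model's actual computations:

```python
# Marginal gain in "DQ" from the next resource unit on a threat.
# Diminishing returns (halving per unit) is an assumed illustration.
def marginal_gain(threat_value, detection_capability, units_assigned):
    return threat_value * detection_capability * (0.5 ** units_assigned)

def allocate(threats, total_units):
    """threats: {name: (threat_value, detection_capability)}.
    Greedily assign each unit to the largest current marginal gain."""
    assigned = {name: 0 for name in threats}
    for _ in range(total_units):
        best = max(threats, key=lambda n: marginal_gain(
            threats[n][0], threats[n][1], assigned[n]))
        assigned[best] += 1
    return assigned

threats = {
    "transactions_added": (0.9, 0.8),   # high detection capability
    "computer_operations": (0.7, 0.2),  # low capability, still funded
}
print(allocate(threats, 6))
```

Note that the low-capability threat is not eliminated: once the high-capability threat's marginal gain has decayed sufficiently, further units flow to the threat with the lower detection capability, mirroring the behavior described above.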
Although the Resource Optimization Model does not deal directly with the quantitative effects of perceived detection capabilities, there is an indirect treatment. To illustrate, it is necessary to think beyond the first iteration of the Computer Fraud Detection Model in Figure 8 of Chapter 3. Assume, for example, that, for a given organization, computer fraud threats were identified for the first time a year ago, and that the Resource Optimization Model was run and investigations accomplished during the past year.
Assuming that the activities of the past year were at least marginally effective, would-be perpetrators' perceptions of the organization's ability to detect computer fraud should be greater than they were a year ago. Now, assume that the Computer Fraud Detection Program for the next year is in the initial stages of implementation.
When the system's threats for the second year are identified and evaluated, the deterring effect of computer fraud detection capabilities emanating from the previous year's program should be considered. Thus, the threat profile might change from one year to the next, thereby influencing the final allocation of resources. Over time, changing detection capabilities will affect this allocation; thus the deterring effects of these capabilities do have an indirect impact on resource allocation.
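This year-to-year adjustment might be sketched as follows. The fifteen percent deterrence factor is a hypothetical placeholder, since, as noted above, the actual deterring effect is unquantified:

```python
# Sketch: carrying the prior year's deterring effect into the next
# year's threat profile before resources are allocated again.
def adjust_threat_profile(threat_values, deterrence_factor=0.15):
    """Scale each threat value down by the deterrence attributed to
    the previous year's detection program."""
    return {t: v * (1.0 - deterrence_factor)
            for t, v in threat_values.items()}

year1 = {"transactions_added": 0.80, "program_changes": 0.60}
year2 = adjust_threat_profile(year1)
print(year2)  # each threat value reduced by the assumed 15%
```

A uniform factor is the simplest possible treatment; in practice the deterring effect would presumably differ by threat, which is part of what the suggested behavioral-science research would need to determine.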
Relationship Between Controls And The Investigation

It was demonstrated in Chapter 5 that Controls Analysis had a tempering effect on the threat matrix. Threat values of the threats initially identified in an unconstrained fashion were adjusted downward or eliminated after Controls Analysis. Controls thus have a direct impact on the investigation, since they determine, to some extent, which threats will be considered for examination and to what degree.
Existing controls will also partially determine the nature of the investigation, since certain controls will influence the need for specific techniques of examination. For example, less emphasis on terminal operator identification verification may be required in a system containing two levels of security protection (e.g., system identification and password) than in a system with only one level (e.g., password only). Finally, certain controls are essential to the investigation. Without these controls it becomes difficult, if not impossible, to perform an evaluation. For the most part, these types of controls are also essential to the basic operation of a system. For example, in a system with a high volume of transaction input it would be difficult to operate the system without a transaction register on which transactions added, altered or deleted are recorded.
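The one-level versus two-level distinction can be sketched as follows. The identifiers and credentials are hypothetical placeholders, not an actual access-control design:

```python
# Sketch: one-level (password only) vs. two-level (system
# identification plus password) terminal access verification.
def verify_terminal_operator(supplied_system_id, supplied_password,
                             system_id="SYS01", password="secret",
                             levels=2):
    """Return True if the supplied credentials pass the configured
    number of verification levels."""
    if levels == 2 and supplied_system_id != system_id:
        return False            # second level: system identification
    return supplied_password == password   # first level: password

print(verify_terminal_operator("SYS01", "secret"))          # True
print(verify_terminal_operator("BAD", "secret"))            # False
print(verify_terminal_operator("BAD", "secret", levels=1))  # True
```

The point of the text is that the two-level system already constrains who can reach the terminal function, so the investigation can place correspondingly less emphasis on independently verifying operator identity.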
Elements of an ADP System

Basic to an understanding of EDP systems is a familiarity with the elements involved. EDP systems generally include, in some form, the following components or elements:

Hardware
Software
Documentation
People
Data (Files)
Procedures

Additionally, more and more systems involve local or long-haul communications, or both.