
Item type: text; Dissertation-Reproduction (electronic). Author: Thurman Stanley Dunn. Publisher: The University of Arizona.

applying this technique it is possible to ensure, with a high level of confidence, a near optimum solution while only examining a fraction of the possible combinations, or alternative solutions. Recall that the basic procedure of this technique is to select a base "combination to beat", then iteratively take random samples of additional combinations attempting to beat the base combination. When a superior combination is found it becomes the new "combination to beat" and the process is repeated. This process continues until an entire sample is compared to the reigning "combination to beat" without discovering a superior combination. Based on the principles of discovery sampling, it can then be stated, with a given statistical confidence, that the reigning combination is within some specific top percentage of possible combinations. For example, if there are 200,000 possible combinations, a sample of 3,000 which does not produce a combination superior to a reigning "combination to beat" may be interpreted to mean that the reigning combination is within the top 0.2 percent of possible combinations with a confidence level of 99.8 percent.
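As a sketch, the iterative procedure can be expressed in a few lines of Python (the candidate pool, scoring function, and sample size below are placeholders for illustration, not the dissertation's actual model):

```python
import random

def iterative_discovery_sampling(candidates, score, sample_size, rng=random):
    """Return a combination that survives a full random sample unbeaten.

    `sample_size` would be taken from discovery sampling tables for the
    desired confidence level and rate of occurrence.
    """
    # Start with an arbitrary base "combination to beat".
    best = rng.choice(candidates)
    improved = True
    while improved:
        improved = False
        # Draw a full sample; if any member beats the base, it becomes
        # the new "combination to beat" and a fresh sample is drawn.
        for candidate in rng.sample(candidates, sample_size):
            if score(candidate) > score(best):
                best = candidate
                improved = True
                break
    # `best` has withstood an entire sample without being beaten.
    return best
```

With a sample of 3,000 against 200,000 combinations, the survivor can be claimed to lie in the top 0.2 percent with 99.8 percent confidence, exactly as described above.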

Two issues should be addressed at this point. The first is the size of the samples which will be required for phenomenally large numbers of combinations. In the above example a sample size of 3,000 will provide the given statistics for a population of 200,000. But how large a sample will be required for a much larger population, for example, the 10 million or 250 trillion possible combinations for the threat matrix in Figure 33? The second issue concerns the number of iterations which may be required before a full sample is evaluated without discovering a combination superior to the base combination, or "combination to beat".


One shortcut alternative is to solve the equation for increasing population sizes for given confidence levels and rates of occurrence, then determine the points at which the sample sizes level off and use them for larger populations. It may be noted from the tables, for example, that a sample of 3,000 provides a 99.8 percent probability of finding at least one occurrence, given an occurrence rate of 0.2 percent, for population sizes of 30,000; 40,000; 50,000; 100,000; 150,000; or 200,000. The sample size in this example has leveled off at 3,000 for populations of 30,000 and above. Thus, using the shortcut alternative, a sample size of 3,000 might be used when a 99.8 percent confidence of finding at least one occurrence, given an occurrence rate of 0.2 percent, is desired, without regard to how large the population is.

The disadvantage of the shortcut methods is that they are not scientifically defensible, and thus they may be hard to justify. From an intuitive standpoint, however, they should be relatively easy to accept. With random selection, a sample of 3,000 should provide approximately the same probability of discovering one occurrence whether the population is 30,000 or 30 billion, given the same rate of occurrence. For example, with a rate of occurrence of 0.01 percent there would be three such occurrences in a population of 30 thousand and three million occurrences in a population of 30 billion.

It should be fairly easy to accept, on an intuitive level, that a sample of 3,000 from either group should produce approximately the same probability of finding at least one occurrence, assuming the samples are strictly random.
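This intuition is easy to check numerically. The sketch below computes the exact (hypergeometric) probability that a strictly random sample of 3,000 contains at least one occurrence at a 0.2 percent occurrence rate, for several population sizes (the function name and loop values are illustrative only):

```python
from math import comb

def p_at_least_one(population, rate, sample):
    """Exact probability that a random sample of the given size contains
    at least one occurrence, when `rate` of the population are occurrences."""
    occurrences = round(population * rate)
    clean = population - occurrences
    # Hypergeometric probability that the sample draws zero occurrences.
    p_none = comb(clean, sample) / comb(population, sample)
    return 1 - p_none

# The same 3,000-item sample gives roughly the same confidence whether
# the population is 30,000 or 30 million.
for population in (30_000, 200_000, 30_000_000):
    print(population, round(p_at_least_one(population, 0.002, 3_000), 4))
```

All three probabilities come out near 99.8 percent, supporting the shortcut of reusing the leveled-off sample size for arbitrarily large populations.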

Iterations

The second issue which should be addressed is the number of iterations of samples which might have to be evaluated before a solution is found. In Chapter Six an Iterative Discovery Sampling technique was described which provides a solution to the "Combinatorial Dilemma". This technique converges to a near optimum by iteratively taking samples of possible alternatives. The proximity of the solution to the optimum can be stated with statistical precision. Thus, it may be stated with a given level of confidence that the solution is within some specific top percentage of possible solutions.

The first step in the process described in Chapter Six is to take a full sample and select the best combination. This combination then becomes the "combination to beat". For example, given a population of 200,000 and a desire to ensure, with a 99.8 percent confidence, a solution within the top 0.2 percent, a sample of 3,000 would be drawn. Sampling would then continue until a full sample of 3,000 fails to produce a combination superior to the existing "combination to beat". This reigning combination may then be interpreted as being within the top 0.2 percent of possible solutions with a 99.8 percent confidence.


Figure 36. Sample Iterations of Resource Optimization Model. The first columns list the investigative activities, such as transactions entered or jobs set up and run by computer operators for a given period.

The third column is one random selection, or combination, of sample sizes for the various schemes. For the first item, T1, the sample size is 1,000 for transactions added by data entry/terminal operators. The fourth column represents the estimated time in hours to examine one transaction, or activity, and the fifth column gives the total estimated hours required to examine the sample given in the third column.

The last four columns determine the detection quotient for each system threat (DQ = T x P x e) and for the system as a whole (System DQ = sum of the individual DQs).

The system DQ value measures the effectiveness of each combination of "P" and "e" values. For the combination in Figure 36 the system DQ value is 26.85. Using the Resource Optimization Model, DQ values would be compared for various randomly selected "P" and "e" combinations until a combination withstands comparison to a complete sample of combinations without being beaten.
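Reading each per-threat detection quotient as the product of the "T", "P", and "e" columns, the system DQ is a simple sum over threats; the (T, P, e) triples below are invented purely for illustration:

```python
def system_dq(threats):
    """System detection quotient: sum of per-threat quotients, where each
    per-threat DQ is taken to be T * P * e."""
    return sum(t * p * e for t, p, e in threats)

# Invented (T, P, e) triples for three system threats -- illustration only.
threats = [(10.0, 0.9, 0.5), (8.0, 0.6, 0.75), (20.0, 0.95, 0.9)]
print(round(system_dq(threats), 2))  # prints 25.2
```

A candidate allocation of "P" and "e" values is then scored by this single number and compared against other randomly selected allocations.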

As indicated in the mathematical statement of the problem, the constraint in the Resource Optimization Model is the total number of hours available to spend on investigation. Notice that the number of hours required to perform the investigation described in Figure 36 is 1,695.

For purposes of illustration, this will be assumed to be the number of hours available. In actuality, the "P" and "e" values for a given combination will probably have to be adjusted up or down on a random basis to conform to the available hours. For example, given 1,695 available hours, assume that the "P" and "e" combination in Figure 37 required 1,950 hours. It would be necessary to adjust "P" and "e" values downward on a strictly random basis until the total adjusted hours required falls within (is less than or equal to) 1,695.
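The random downward adjustment can be sketched as follows (the 10 percent trimming step and the scheme names are assumptions made for illustration; the text does not prescribe a step size):

```python
import random

def enforce_hours_budget(sample_sizes, hours_per_item, budget, rng=random):
    """Randomly trim sample sizes until the total investigative hours
    required falls within the available budget."""
    adjusted = dict(sample_sizes)
    def total_hours():
        return sum(adjusted[k] * hours_per_item[k] for k in adjusted)
    while total_hours() > budget:
        scheme = rng.choice(list(adjusted))             # pick a scheme at random
        adjusted[scheme] = int(adjusted[scheme] * 0.9)  # trim it by 10 percent
    return adjusted

# Hypothetical combination needing 1,950 hours against a 1,695-hour budget.
sizes = {"T1": 1_000, "T2": 2_000}
hours = {"T1": 0.6, "T2": 0.675}   # 600 + 1,350 = 1,950 hours required
trimmed = enforce_hours_budget(sizes, hours, 1_695)
```

Any scheme for the trimming would do, so long as the choice of which sample to shrink remains strictly random.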

(Figure 37 column headings, as in Figure 36: No. of Investigative Activities (Pop.); Sample Size; Investigative Time Required per Sample (hours); Investigative Time Required Total (hours); "T"; "P"; "e"; DQ Value.)


The DQ value for this combination would then be compared with that of the combination in Figure 36.

Following the methodology described above, assume the combinations in Figures 36 and 37 are the first two combinations in the first full sample. Using the example above, with a sample size of 3,000, the combination in Figure 36 would be retained as the "combination to beat" and 2,998 more combination comparisons made. The best of these combinations would become the base "combination to beat" and the next sample drawn. The process would continue until a combination withstands comparison to a full sample of 3,000 without being beaten.

At that point the reigning combination could be considered to be in the top 0.2 percent of possible combinations (in this case, allocations of available resources) with a confidence level of 99.8 percent.

The prospect of repeating the process illustrated in Figures 36 and 37 several thousand times should illustrate the benefits of automating the Resource Optimization Model. Although the process could be performed manually, it is very labor intensive and lends itself extremely well to automation.



The research for this dissertation indicates that much is yet to be done in detecting and reporting computer fraud. As illustrated in Chapter One, reported cases of computer fraud probably represent a very small fraction of actual cases. While it may be logical to assume that reported cases of computer fraud are representative of the population of actual cases, this assumption cannot be defended on any scientific basis.

The typology in Chapter Two is provided as "best available" evidence of the vulnerabilities of various types of computer systems. A measure of comparative vulnerabilities for various computer system types, developed for this dissertation, is provided which combines relative frequency of occurrence and dollar impact values for different computer fraud types using the formula Vulnerability (V) = Frequency (F) times Impact (I).
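As a worked instance of that formula (the figures are invented purely for illustration, not taken from the dissertation's case data):

```python
def vulnerability(frequency, impact):
    """Comparative vulnerability: V = F * I, combining relative frequency
    of occurrence with dollar impact."""
    return frequency * impact

# A fraud type reported 12 times averaging $50,000 outranks one reported
# 3 times averaging $150,000, because V weighs both factors at once.
print(vulnerability(12, 50_000))   # prints 600000
print(vulnerability(3, 150_000))   # prints 450000
```

This is precisely why combining the two measures can reorder rankings produced by frequency or impact alone.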

The primary distinction between the treatment of reported cases in this dissertation and other sources reviewed during the research was this formal combining of frequency and impact into one indicator.

Other sources deal with them separately, or deal with only one, to the exclusion of the other.

A methodology for threat assessment is developed in Chapter Three. This methodology is based on a matrix approach which combines comparative vulnerabilities and threat values for different computer fraud type/perpetrator type combinations, based on reported incidents of fraud. However, the research to this point indicates such a limited association between the number of reported cases and actual cases that the results cannot be extended to specific systems on any scientific basis. Although it would be convenient to simply concentrate on highly vulnerable systems and perpetrator types, as reported in Chapters Two and Three, when assigning computer fraud detection resources to specific systems, the research does not support this approach.

Based on this conclusion, a methodology was developed for evaluating threats and allocating resources to computer fraud detection for specific systems. This methodology consists of four components in addition to the threat assessment matrix approach developed in Chapter Three.

First is the detection quotient, presented in Chapter Four: a measure of the effectiveness of different allocations of available resources to computer fraud/perpetrator combinations for given systems. The detection quotient, developed specifically for this methodology, is based on the threat value discussed earlier in conjunction with attributes of the discovery sampling approach.

Second is a threat assessment methodology for specific systems, presented in Chapter Five. This methodology is based on the general threat assessment methodology from Chapter Three in conjunction with the Delphi Approach, the Churchman-Ackoff technique, and a Controls Analysis process developed for the methodology. This combination of techniques forms a unique approach to threat analysis aimed at maximizing the effectiveness of subjective threat assessment.

The third component was necessitated by the phenomenally large number of possible allocations of resources associated with large systems. This situation, entitled "The Combinatorial Dilemma" in Chapter Six, occurs when the large number of possible alternatives precludes comprehensive analysis. A general solution to the Combinatorial Dilemma is provided in Chapter Six which produces a statistically defensible near optimum alternative while only examining a small fraction of possible alternatives. Although the combinatorial problem is well known, the solution in Chapter Six is unique to this dissertation.

The fourth major component of the methodology for allocating resources to computer fraud detection in specific systems is the resource optimization model in Chapter Seven. This model converges, through an iterative process, to a near optimum allocation of computer fraud detection resources utilizing the concepts developed in Chapters Four through Six.

Finally, in Appendices A and B, investigative techniques and automated techniques associated with computer fraud detection are presented.


The limited success of generalized, "cookbook" approaches against intentional acts such as computer fraud discourages such an approach. Rather, a methodology tailored to the specific characteristics of given computer systems is suggested.


First, the specific threats to a given system must be identified and assessed for significance. Second, available detection resources must be assigned to these specific threats. This creates a resource allocation problem since only a fraction of the resources required to completely examine every threat will typically be available.

The methodology presented in this dissertation answers these two challenges in a way which should produce a near optimum allocation of resources to computer fraud detection for the threats which are relevant to a given system.

Stronger legislation on reporting computer fraud and continued research will hopefully increase the available data on computer fraud and provide more insight into the threats surrounding various types of computer systems. However, it is suggested that "cookbook" approaches, which might be effective in detecting unintentional acts such as errors and omissions in data, will probably continue to have only limited success against intentional acts such as fraud.

In conclusion, computer fraud threat assessment and resource allocation must be tailored to specific systems. The methodology in this dissertation provides such an approach. Hopefully, this dissertation will stimulate further research in the areas indicated in the preface, specifically: expansion of case data; the deterring effects of perceived detection capabilities on would-be perpetrators; and the "live monitoring" technique discussed in Appendix A.



There is a considerable amount of literature available on the subjects of Computer Auditing, EDP Auditing, Automated Auditing, etc.

Additionally, an audit or investigative ability probably either exists internally or is available externally to most organizations large enough to be seriously concerned with computer fraud.

For these reasons this thesis does not address audit or
