
Chapter 13 - Evaluating, Refining, and Integrating Applications

  1. Risk from Individual Applications
    Three approaches for new applications
    1. Customizing packages by selecting options
    2. Developing new functions using the macro facilities
    3. Developing complete custom applications

    application risk = probability(error) x cost(error)
    organizational risks
    1. faulty decisions made as a result of erroneous information
    2. additional time required to correct erroneous information
    3. productivity lost due to personal systems development

    individual risks
    1. errors introduced due to lack of error control
    2. data lost due to mistakes that destroy data files
    3. time lost due to use of poor tools
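
    The risk formula above is a simple expected-value calculation; it can be sketched as follows (the dollar figures are hypothetical, not from the text):

    ```python
    def application_risk(probability_of_error: float, cost_of_error: float) -> float:
        """Expected loss per use: application risk = probability(error) x cost(error)."""
        return probability_of_error * cost_of_error

    # Hypothetical example: a 2% chance of an error that costs $5,000 to correct
    # gives an expected risk of $100 per use of the application.
    risk = application_risk(0.02, 5000.0)
    ```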

  2. Application Evaluation
    1. Evaluating Fit with Real Requirements
      • identify unnecessary (although customary) requirements
      • identify innovative requirements that add value
      • use the full potential of tools
      • compensate for biases
        • recency bias - don't unduly focus on recent events
        • concreteness bias - physical counts "more important" than "soft data"
        • availability bias - user focuses on data readily available rather than generating new data
        • small sample bias - user may be unduly influenced by events that occur infrequently
      • evaluating completeness of application requirements

      • Are all requirements necessary?
      • Does the application reflect innovation regarding the function to be performed?
      • Do the requirements take into account the capabilities of the chosen development platform?
      • Do they take advantage of the strengths of the hardware and software they will operate on?
      • Does the application reflect biased behavior with respect to recency of events, concreteness of data, and availability of data?
      • Does the analysis and presentation of infrequently occurring data aid the user in overcoming small sample bias?
      • Does the application have requisite variety?
    2. Evaluating Robustness and Usability
      • robustness - behavior of the application when a mistake is made or a fault occurs (error detection and error handling)
        1. correct responses to all correct, complete inputs
        2. explanation if systems make decisions or assumptions
        3. rejection of incorrect or incomplete inputs/messages
        4. instructions to users
          • what to do to provide correct, complete inputs or correct errors
          • what to do to recover
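
      The robustness behaviors listed above (correct response to valid input, rejection of invalid input, instructions to the user) can be sketched for a single input field; the field name and limits here are hypothetical:

      ```python
      def process_order_quantity(raw: str):
          """Validate one input field, illustrating the robustness checklist.

          Returns (ok, value_or_instruction). The instruction messages tell the
          user how to provide correct, complete input -- item 4 in the list.
          """
          raw = raw.strip()
          if not raw:  # reject incomplete input
              return (False, "Quantity is required. Enter a whole number, e.g. 12.")
          if not raw.isdigit():  # reject incorrect input
              return (False, f"'{raw}' is not a number. Enter digits only, e.g. 12.")
          qty = int(raw)
          if qty == 0 or qty > 999:  # reject out-of-range input with instructions
              return (False, "Quantity must be between 1 and 999. Re-enter the value.")
          return (True, qty)  # correct response to correct, complete input
      ```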
      • usability - behavior of the application with respect to the user (user friendly)
        different users require different levels of robustness
        • Single developer/user and frequent use
        • Single developer/user and infrequent use
        • Colleague/department use
        • unknown user

        • Does the user understand the nature of the application, the inputs, and the use of the outputs?
        • Is documentation satisfactory for persons using the application?
        • Does the application present a clear and consistent image that is relatively easy to learn, use, and remember?
        • Will the user be able to understand the behavior of the application if mistakes are made in using it?
        • Given the intended use, will there be an appropriate and well-defined means of correcting for the effects of data and system errors?
    3. Evaluating Appropriateness and Suitability for the Task [task/technology fit]
      • technology employed
      • user interface
      • organizational culture
      • responsiveness to changing task needs

      • Does the application employ appropriate technology?
      • Is the interface suitable for all users of the application?
      • Does the application violate organizational norms or culture?
      • What changes can be anticipated and how feasible will it be to make them?
    4. Analyzing Risks [risk analysis - analysis of effects of errors in the application and its use]:
      1. financial loss from incorrect or incomplete data
      2. loss of data
      3. loss of time

      To evaluate financial risk, assess:
      1. potential loss with each use of the application
      2. expected number of uses per year
      3. total exposure per year
      4. estimated probability of error with use
      5. computed financial risk
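
      The five financial-risk steps above combine into one expected-value computation; a minimal sketch (the figures are hypothetical):

      ```python
      def financial_risk(loss_per_use: float, uses_per_year: int, p_error: float) -> float:
          """Steps 1-5 of the financial-risk evaluation.

          loss_per_use  -- step 1: potential loss with each use
          uses_per_year -- step 2: expected number of uses per year
          p_error       -- step 4: estimated probability of error with use
          """
          exposure = loss_per_use * uses_per_year  # step 3: total exposure per year
          return exposure * p_error                # step 5: computed financial risk

      # Hypothetical: $400 potential loss, 250 uses/year, 1% error probability
      # yields a computed financial risk of $1,000 per year.
      annual_risk = financial_risk(400.0, 250, 0.01)
      ```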

      To evaluate risk of data loss, assess:
      1. the value of the outputs to the decisions or actions they support
      2. the per period (yearly) exposure based on the potential loss per use and number of uses
      3. the probability of partial or complete loss of data
      4. the financial risk of data loss

      To evaluate risk of lost time, assess:
      1. the extra time required to repeat inputs or correct data and produce correct outputs
      2. the proportion of application use where time is lost due to design flaws
      3. lost time per period
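
      The data-loss and lost-time evaluations follow the same exposure-times-probability pattern as the financial-risk steps; a sketch with hypothetical figures:

      ```python
      def data_loss_risk(value_per_use: float, uses_per_year: int, p_loss: float) -> float:
          """Financial risk of data loss: per-period exposure times probability of loss."""
          exposure = value_per_use * uses_per_year  # step 2: yearly exposure
          return exposure * p_loss                  # step 4: financial risk of data loss

      def lost_time_per_year(extra_hours_per_incident: float,
                             uses_per_year: int,
                             flaw_rate: float) -> float:
          """Lost time per period: extra rework time times the proportion of uses affected."""
          return extra_hours_per_incident * uses_per_year * flaw_rate

      # Hypothetical: outputs worth $200/use, 500 uses/year, 0.5% chance of loss;
      # half an hour of rework on 2% of 1,000 yearly uses.
      yearly_data_risk = data_loss_risk(200.0, 500, 0.005)
      yearly_lost_hours = lost_time_per_year(0.5, 1000, 0.02)
      ```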
    5. Using the Results of Application Evaluation
      • prototype evaluation
        • discard prototype
        • use prototype as is
        • refine prototype
        • use prototype as specification, but discard as system
      • testing
        • if risk is relatively small, testing can be done by user/developer
        • if risk is moderate, testing should involve others in the workgroup or department
        • if the risk is relatively high, testing should be done by the "professional applications testing group"

  3. Refining the Application
    1. Refining the Input Interface [add input validation to detect and handle incorrect input]
      • if a stored record is referenced, display its values for visual validation that the record is correct.
      • display the meaning of codes when they are input
      • when there is a computation involving several factors, show the factors used.
      • display warning messages if a data item appears to be out of the normal range
      • display a count of input items entered and a control total for numeric data
      • provide data for reconciling input data with other control information
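
      Several of the input-interface refinements above (showing the meaning of codes, range warnings, counts and control totals) can be sketched as follows; the code table and range limits are hypothetical:

      ```python
      PRODUCT_CODES = {"A1": "Widget", "B2": "Gadget"}  # hypothetical code table

      def validate_entry(code: str, amount: float, low: float = 0.0, high: float = 10_000.0):
          """Return feedback messages for one input: code meaning and range warning."""
          messages = []
          # display the meaning of a code when it is input
          if code in PRODUCT_CODES:
              messages.append(f"Code {code} = {PRODUCT_CODES[code]}")
          else:
              messages.append(f"Warning: unknown code {code}")
          # warn if a data item appears to be out of the normal range
          if not (low <= amount <= high):
              messages.append(f"Warning: {amount} is outside the normal range {low}-{high}")
          return messages

      def control_totals(amounts):
          """Count of input items entered and a control total for numeric data."""
          return len(amounts), sum(amounts)
      ```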
    2. Refining the Output Interface - remove ambiguity
    3. Refining Data Integrity [referential integrity - need for referenced records to be present in the database (insertion/deletion anomalies)]
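
      A referential-integrity check (are all referenced records present?) can be sketched with plain dictionaries standing in for database records; the field names are hypothetical:

      ```python
      def referential_integrity_errors(orders, customers):
          """Return orders whose customer_id has no matching customer record."""
          customer_ids = {c["id"] for c in customers}
          return [o for o in orders if o["customer_id"] not in customer_ids]

      # An order referencing customer 9 is an integrity error if no such
      # customer record exists -- the insertion/deletion anomaly noted above.
      ```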
    4. Access Control and Backup
      • authentication techniques
      • backup (offsite?)
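
      One common authentication technique, salted password hashing, can be sketched with the standard library; this is an illustration of the idea, not a prescription from the text:

      ```python
      import hashlib
      import os

      def hash_password(password: str, salt: bytes = None):
          """Derive a salted hash; store (salt, digest) instead of the password."""
          salt = salt or os.urandom(16)
          digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
          return salt, digest

      def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
          """Re-derive the hash with the stored salt and compare."""
          return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000) == digest
      ```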

  4. Application Integration
    1. Standardization
      1. list all component names that must be understood to use the application in its normal use
      2. organize components according to external user related concepts.
      3. assign meaningful names to all listed objects in a consistent pattern.
      4. rename the objects and test to make sure functionality has not been lost or errors introduced
    2. Command Interface
      1. List the possible user actions and organize them to match the user functions provided by the application
      2. Design the user Command Interface using the outline of user actions and application functions
      3. Use the features of the software package to implement the specified menu or graphical command interface
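
      The three command-interface steps above (list user actions, design the interface around them, implement the menu) can be sketched as a minimal numbered text menu; the action labels are hypothetical:

      ```python
      def build_menu(actions):
          """Step 1-2: organize (label, function) pairs into a numbered menu."""
          return {str(i + 1): (label, fn) for i, (label, fn) in enumerate(actions)}

      def run_choice(menu, choice: str):
          """Step 3: dispatch a user's menu selection, rejecting invalid choices."""
          if choice not in menu:
              return "Invalid choice. Enter a number from the menu."
          label, fn = menu[choice]
          return fn()
      ```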
    3. Linking Other Applications
    4. Third-Party Add-ons
