Software Reviews & Inspections

Types identified by IEEE Standard 1028:

  1. Management Reviews

  2. Peer Reviews

    1. Walkthroughs

    2. Inspections

    3. Technical Reviews

  3. Audits

Technical reviews

Confirms that the product:

  • Conforms to specifications
  • Adheres to regulations, standards, guidelines, plans

  • Changes

    • are properly implemented

    • affect only those system areas identified by the change specification

  • Software artifacts subject to technical reviews

    • SRS: Software requirements specification

    • SDD: Software design description

    • STD: Software test documentation

    • Software user documentation

    • Installation procedure

    • Release notes

  • Outputs = documented evidence that identifies:
    • project under review
    • review team members
    • software product reviewed
    • inputs to the review
    • review objectives and status
      • list of resolved/unresolved defects
      • list of management issues
      • action item status
      • recommendations for unresolved issues
      • whether the software product meets its specification

Technical review meeting

  • Roles for the technical review

      • Decision maker

      • Review leader

      • Recorder

      • Technical staff

  • 3-5 people (reviewers)

    • including: producer, review leader, reviewers

    • <2 hr advance prep + <2 hr meeting

    • for a small part of the overall software

    • must have a follow-up procedure

      • ensures any corrections are completed
  • At end of review, attendees decide based on "criteria" to:

    • accept the work product w/o modification

    • reject the work product

      • corrected, then another review performed

    • accept the work product provisionally

      • minor errors to be corrected, no further review
  • Conducting a Review
    1. prep: evaluate product prior to review
    2. review the product (the ARTIFACT), not the producer
    3. mild tone: question, don't accuse
    4. stick to the review agenda
    5. raise issues, don't resolve issues
    6. avoid style discussions, stick to technical correctness
    7. schedule reviews as project tasks
    8. record/report review results

Auditing

independent reviews that assess compliance with specs, standards, and procedures

done before software release

Physical Audit: check that all items identified as part of configuration are present

Functional audit: check that unit, integration, and acceptance tests have been carried out and that records show their success or failure

Traceability

Forward: each input to a phase must be matched to an output of the same phase to show completeness

Backward: each output is traceable to an input of a phase
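The forward/backward checks above can be sketched as set operations over a link table; the requirement IDs (R1, ...) and design-element IDs (D1, ...) are hypothetical examples.

```python
# Sketch of forward/backward traceability checks (hypothetical IDs).
phase_inputs = {"R1", "R2", "R3"}             # inputs entering the phase
links = {"D1": "R1", "D2": "R2", "D3": "R2"}  # phase output -> source input

# Forward traceability: every input must be matched by some output.
unimplemented = phase_inputs - set(links.values())

# Backward traceability: every output must trace back to a known input.
untraceable = {d for d, r in links.items() if r not in phase_inputs}

print(sorted(unimplemented))  # inputs with no matching output
print(sorted(untraceable))    # outputs with no source input
```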

Software Quality Standards

Standards: an approved, documented, and available set of criteria to determine adequacy of an action (process standard) or object (product standard).

Guidelines: a well-defined/documented set of criteria that guides an activity or task (Dorfman & Thayer, 1990) - allows for judgement and flexibility

Categories of Standards and Assessment

  • Product Standards

  • Process Standards

  • Calibration and Measurement Standards

  • Quality Management Systems Standards

Testing Activities

Test Condition: description of circumstances that could be examined (an event or item). Categories: functionality, performance, stress, robustness. Derived using testing techniques, e.g. per the V-model

Test Case/Data: a pair (test data (value/input per variable), expected output). Execution of a test case against program P covers certain: requirements of P, parts of P's functionality, parts of P's internal logic

Test Set: collection of one or more test cases, e.g. a function test case for sort:

  • test data = <12 -29 32>, expected output = <-29 12 32>, actual output = ???, environment prereqs = file, net connection, ...
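The (test data, expected output) pair above can be written out directly as a minimal sketch; Python's built-in `sorted()` stands in for the program P under test, and the dictionary layout is just one illustrative representation.

```python
# A test case as a (test data, expected output) pair for a sort program;
# sorted() stands in for the program P under test.
test_case = {"data": [12, -29, 32], "expected": [-29, 12, 32]}

actual = sorted(test_case["data"])  # execute P on the test data
verdict = "pass" if actual == test_case["expected"] else "fail"
print(actual, verdict)
```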

Build Test Case (implement)

  • Implement the preconditions (set up environment)

  • Prepare test scripts (may use test automation tools)
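The two steps above can be sketched with the standard-library `unittest` framework, where `setUp()` implements the preconditions; the file name, contents, and expected output here are hypothetical examples.

```python
# A minimal test script: setUp() implements the precondition
# (a temporary input file must exist before the test runs).
import os
import tempfile
import unittest

class SortFileTest(unittest.TestCase):
    def setUp(self):
        # Precondition: create the input file the test reads.
        self.path = os.path.join(tempfile.mkdtemp(), "input.txt")
        with open(self.path, "w") as f:
            f.write("12 -29 32")

    def test_sorted_output(self):
        with open(self.path) as f:
            values = [int(x) for x in f.read().split()]
        self.assertEqual(sorted(values), [-29, 12, 32])

# Run without exiting the interpreter (unlike unittest.main()).
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(SortFileTest))
print(result.wasSuccessful())
```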

Program Behavior

  • specified by: plain natural language, state diagram, formal mathematical specification, etc.

    • state diagram specifies program states & how program changes state on input
  • Observation / Analysis: can be complex for large programs

      1. observe behavior
      2. analyze observed behavior (correct or not?)

      3. Oracle: system that checks correctness of observed behavior

        • Oracle programs (complex): automated oracles that require determination of the input-output relationship, e.g. a matrix multiplication program used to check whether a matrix inversion program produced correct output.
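The multiplication-as-oracle example can be sketched as follows: if B really is the inverse of A, then A x B must be (approximately) the identity matrix, so a multiplication routine can check an inversion program's output. The matrices and the tolerance are hypothetical choices.

```python
# Oracle sketch: use matrix multiplication to check a claimed
# matrix inverse. If B is the inverse of A, A x B ~= identity.

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def inverse_is_correct(a, b, tol=1e-9):
    # Check every entry of A x B against the identity matrix.
    n = len(a)
    product = matmul(a, b)
    return all(abs(product[i][j] - (1 if i == j else 0)) <= tol
               for i in range(n) for j in range(n))

a = [[4.0, 7.0], [2.0, 6.0]]
b = [[0.6, -0.7], [-0.2, 0.4]]  # claimed inverse of a
print(inverse_is_correct(a, b))
```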

Test Case Scripts

Scripts contain data\/instructions for testing:

  • comparison info

  • what screen data to capture

  • when/where to read input
  • control information

    • repeat set of inputs
    • make a decision based on output
  • testing concurrent activities

Comparison

  • compare (test outcomes, expected outcomes)

    • simple/complex. Types:

      • variable values (in memory)
      • disk based (textual, non-textual, database, binary, ...)
      • screen based (char, GUI, images)
      • Others (multimedia, communicating apps)
    • actual output =? expected output

      • yes = pass (assuming testcase was "instrumented")
      • no = fail (assuming no error in test case, preconditions)
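As one illustration of the disk-based (textual) comparison type above, the actual outcome can be diffed line by line against the expected ("golden") output using the standard-library `difflib`; the sample lines are hypothetical.

```python
# Disk-based textual comparison sketch: diff actual output lines
# against expected ("golden") lines; an empty diff means pass.
import difflib

def compare_text(actual_lines, expected_lines):
    diff = list(difflib.unified_diff(expected_lines, actual_lines,
                                     fromfile="expected", tofile="actual",
                                     lineterm=""))
    return ("pass" if not diff else "fail"), diff

verdict, _ = compare_text(["-29", "12", "32"], ["-29", "12", "32"])
print(verdict)
verdict2, diff2 = compare_text(["-29", "12", "32"], ["-29", "12", "31"])
print(verdict2)
```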

Dynamic Tests
