S1: Meeting 7
Copyright: S. Shekhar, C. S. Dept., University of Minnesota, Minneapolis, MN 55455.
S2: Testing Principles : Chapter 5 (pp. 109-136)
  • Testing is relevant to all phases of the life-cycle
    • Testing in earlier phases is more cost-effective (Fig. 1.5)
  • Validation vs. Verification
    • Validation: Product satisfies its specification
      • Are we building the right product?
    • Verification: Are we building the product right?
  • Two Types of Testing:
    • Execution based
    • Non-execution Based
  • Quality Issues
    • Quality = how well product satisfies its specification
    • Software Quality Assurance Group
    • Managerial Independence
S3: 5.2 Non-execution based: inspection, walkthrough
  • Walkthroughs
    • 2 Steps: preparation, walkthrough
    • Participant Driven: Reviewers point out potential ambiguities, errors
      • designer/coder tries to address those
    • Document Driven: One person walks others thru' code
      • explaining it, justifying assumptions etc.
      • Others can ask questions, comment etc.
  • Inspections
    • 5 Steps: Overview, Preparation, Inspection, Rework, Follow-up
    • Team of 4: moderator, designer, implementer, tester
    • Use checklist of fault-types, past-fault statistics
    • Metrics: fault density, detection rate, detection efficiency
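The inspection metrics above can be sketched as simple ratios. The definitions used here are one common reading (fault density per KLOC inspected, detection rate per inspection hour, detection efficiency per person-hour); the exact definitions vary by text, and the numbers in the example are made up.

```python
def fault_density(faults: int, loc: int) -> float:
    """Faults found per thousand lines of code (KLOC) inspected."""
    return faults / (loc / 1000)

def detection_rate(faults: int, hours: float) -> float:
    """Faults found per hour of inspection."""
    return faults / hours

def detection_efficiency(faults: int, person_hours: float) -> float:
    """Faults found per person-hour spent by the whole team."""
    return faults / person_hours

# A 4-person team inspects 2,000 LOC in 2 hours and finds 16 faults:
print(fault_density(16, 2000))        # 8.0 faults/KLOC
print(detection_rate(16, 2.0))        # 8.0 faults/hour
print(detection_efficiency(16, 8.0))  # 2.0 faults/person-hour
```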
S4: 5.3 Execution based Testing, 5.5 Correctness Proofs
  • Fault, Failure, Error
    • Fault: bug (mistake in the code or document)
    • Failure: incorrect behaviour resulting from a fault
    • Error: mistake made by the programmer
  • Analogy with symptoms, disease, and risky behaviour
  • Program testing is an effective way to
    • show the presence of bugs
    • but not their absence [Dijkstra 72]
  • Q? Is there an alternative to testing?
  • Q? How much testing ? When can testing stop?
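The fault/failure/error distinction and Dijkstra's point can be seen in a tiny hypothetical example (the `average` function below is invented for illustration): the error is the programmer's slip, the fault is the wrong line that results, and a failure is the incorrect behaviour observed on some runs but not others.

```python
def average(xs):
    """Spec: return the arithmetic mean of xs."""
    # Fault: divides by len(xs) + 1 instead of len(xs)
    # (the error was the programmer's mistake while writing this line).
    return sum(xs) / (len(xs) + 1)

# This test happens to pass, so no failure is observed and the fault
# stays hidden -- testing showed no bugs, but they are still present:
print(average([0]))     # 0.0, matches the spec
# This test exposes the fault as a failure:
print(average([2, 4]))  # 2.0, but the spec requires 3.0
```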
  • Correctness Proofs
    • Example of correctness proof (Fig. 5.4-5.6)
    • Experience: Naur's paper
    • Pros: Smaller programs can be shown to meet their specifications
      • Useful for safety-critical components
    • Cons: needs mathematical training, expense, difficulty
      • Can one trust the tools (e.g. theorem provers; see Fig. 5.7)?
    • Q? Can it be used to test specifications?
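A small flavour of a correctness argument, in the spirit of (but not copied from) the book's Fig. 5.4-5.6: annotate a loop with its invariant and pre/postconditions, and check them at run time with assertions. A real proof would discharge these obligations mathematically rather than executing them.

```python
def sum_first_n(n: int) -> int:
    """Return 0 + 1 + ... + (n - 1). Precondition: n >= 0."""
    assert n >= 0                          # precondition
    total, i = 0, 0
    while i < n:
        # Loop invariant: total == 0 + 1 + ... + (i - 1) == i*(i-1)/2
        assert total == i * (i - 1) // 2
        total += i
        i += 1
    # Postcondition: invariant with i == n gives the closed form
    assert total == n * (n - 1) // 2
    return total

print(sum_first_n(5))  # 10
```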
S5: 5.4, 5.6, 5.7 Testing goals, stopping criteria
  • When should testing Stop?
    • Testing is part of each phase of life-cycle.
    • Cannot stop testing until the product is retired
    • Q? When can testing stop within a given phase?
      • Metrics: fault density, detection rate, detection efficiency
      • Constraints: product complexity, etc.
  • What should be tested?
    • functional as well as non-functional specifications
    • Ex. reliability, performance, correctness, etc.
  • Should Programmers test/certify their modules?
    • Conflict of interest
    • Testing requires different thinking mode than building
    • Practice: Initial testing by programmer,
      • systematic testing by SQA team
S6: 12.14-12.23 Module Testing: Overview
  • Simple Approaches to Testing
    • 1. Haphazard Testing:
      • Tester runs program, types in program inputs
      • Pass test if program doesn't crash
      • Problem: miss boundary cases, redundancy
    • 2. Comprehensive testing with all possible cases
      • Problems: there could be too many possible cases
  • Criteria to select test cases for a module
    • Goal: small set of tests to cover an aspect
    • Method 1: Test to specs (black box)
      • cover equivalence regions, boundary conditions, functions
    • Method 2: Test to code (glass box)
      • cover nodes, edges, branches, or paths
S7: 12.14-12.23 Module Testing: Black-Box
  • Overview: Test cases based on specification, not code
    • Using input data, output data, functionality etc.
  • Equivalence testing and boundary value analysis
    • Partition set of possible input data into equivalence classes
    • Select test inputs from each equivalence class
    • Select test inputs from boundary between classes
    • Ex. 5 testcases for (16 < legal-driving-age < 120)
    • Ex. Triangle (See Sample Final Exam.)
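The driving-age example above can be sketched directly. The predicate below is an assumed reading of the slide's condition (16 < age < 120); the five cases cover each equivalence region plus both boundaries.

```python
def is_legal_driving_age(age: int) -> bool:
    # Assumed predicate from the slide: 16 < legal-driving-age < 120
    return 16 < age < 120

cases = [
    (16, False),   # lower boundary: just outside
    (17, True),    # just inside the lower boundary
    (40, True),    # representative of the legal equivalence class
    (119, True),   # just inside the upper boundary
    (120, False),  # upper boundary: just outside
]
for age, expected in cases:
    assert is_legal_driving_age(age) == expected
print("all 5 boundary cases pass")
```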
  • Functional Testing
    • List functionalities from the specification (e.g. DFD)
    • Select test cases for each functionality
    • Cons: What if functionality and modules have M:N relationship?
S8: 12.14-12.23 Module Testing: Glass Box
  • Overview: Test cases based on the code
    • Using control flow graphs, definition-use graphs, etc.
    • Cover an aspect of a graph: node, edge, path, ...
    • Measure complexity using graph properties
  • Structural Testing: Criteria to select a set of test cases
    • Statement Coverage
    • Branch Coverage
    • Path Coverage
    • Issues: unreachable statements, prioritizing paths (e.g. def-use)
    • Example: Triangle Program (Sample Final)
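Branch coverage can be illustrated with a minimal triangle classifier, in the spirit of the Sample Final's triangle program (this version is an assumption, not the exam's code). The test set is chosen so every branch is taken at least once, which here also gives statement coverage.

```python
def classify_triangle(a: int, b: int, c: int) -> str:
    if a + b <= c or b + c <= a or a + c <= b:
        return "not a triangle"        # branch 1: triangle inequality fails
    if a == b == c:
        return "equilateral"           # branch 2
    if a == b or b == c or a == c:
        return "isosceles"             # branch 3
    return "scalene"                   # branch 4

# One test case per branch achieves branch coverage:
assert classify_triangle(1, 2, 5) == "not a triangle"
assert classify_triangle(3, 3, 3) == "equilateral"
assert classify_triangle(3, 3, 5) == "isosceles"
assert classify_triangle(3, 4, 5) == "scalene"
print("branch coverage achieved with 4 tests")
```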
  • Complexity Measures: Predicting faults
    • Lines of code
    • Halstead's metrics: distinct operators and operands
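Halstead's basic measures can be sketched from given operator/operand counts. The standard definitions are vocabulary n = n1 + n2, length N = N1 + N2, and volume V = N * log2(n); the counts in the example are invented for illustration.

```python
import math

def halstead(n1: int, n2: int, N1: int, N2: int):
    """n1/n2: distinct operators/operands; N1/N2: total occurrences."""
    vocabulary = n1 + n2
    length = N1 + N2
    volume = length * math.log2(vocabulary)
    return vocabulary, length, volume

# Hypothetical module: 10 distinct operators, 6 distinct operands,
# 40 operator and 24 operand occurrences in total.
vocab, length, volume = halstead(n1=10, n2=6, N1=40, N2=24)
print(vocab, length, round(volume, 1))  # 16 64 256.0
```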
S9: 12.14-12.23 Module Testing: Walkthrough and Inspections
  • Overview (See Ch. 5.2, pp 112-117)
    • Non-Execution Based Reviews
    • Diversity: tester independent of designers & other testers
      • lateral thinking, complement each other's strengths
    • Increase chances of catching faults
  • Review Technique 1: Walkthroughs
    • 2 Steps: preparation, walkthrough
    • Participant Driven: Reviewers point out potential ambiguities, errors
      • designer/coder tries to address those
    • Document Driven: One person walks others thru' code
      • explaining it, justifying assumptions etc.
      • Others can ask questions, comment etc.
  • Review Technique 2: Inspections
    • 5 Steps: Overview, Preparation, Inspection, Rework, Follow-up
    • Team of 4: moderator, designer, implementer, tester
    • Use checklist of fault-types, past-fault statistics
    • Metrics: fault density, detection rate, detection efficiency
S10: 12.1-5 Language, Structured Programming, Coding Standards
  • Choosing Programming Languages
    • Match items from following two lists:
    • real-time systems, web/CGI script, GUI, system admin. scripts
    • Visual Basic, Perl, HTML, C, C++, Java, COBOL, SQL
    • Issue: Q? What if you know one better than the other?
    • Q? How can a language be more (or less) suitable ?
  • Structured Programming : A walk down the memory lane!
    • "Goto statement considered harmful" [Dijkstra 1968]
    • Avoid goto, break, continue, multiple returns in functions, ...
    • Except for: error-handling, improving code readability!
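The error-handling exception can be sketched in Python (which has no goto, but the same discipline applies to flag variables): exception handling gives a single, readable error path instead of scattering status checks through the loop. The function and its message are illustrative.

```python
def parse_ints(lines):
    """Parse each line as an int; report the first bad line."""
    values = []
    for lineno, line in enumerate(lines, start=1):
        try:
            values.append(int(line))
        except ValueError:
            # The permitted structured exit for error handling:
            raise ValueError(f"bad integer on line {lineno}: {line!r}")
    return values

print(parse_ints(["1", "2", "3"]))  # [1, 2, 3]
```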
  • Coding Standards: Other aspects of good programming!
    • Meaningful Names for Variable, Procedures, Modules
    • Comments: brief description for maintainers,
      • programmer id, date of changes, variable meanings,
      • known faults, input/outputs, files accessed/updated
    • Good indentation, highlight transitions
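A small sketch of these standards applied to one function: a meaningful name, a brief description for maintainers, and documented inputs/outputs (programmer id and change dates would go in a file header). The function itself is an invented example, a standard fixed-payment loan formula.

```python
def monthly_payment(principal: float, annual_rate: float, months: int) -> float:
    """Return the fixed monthly payment for an amortized loan.

    principal   -- amount borrowed, in dollars
    annual_rate -- nominal yearly interest rate, e.g. 0.06 for 6%
    months      -- number of monthly payments
    """
    monthly_rate = annual_rate / 12
    if monthly_rate == 0:
        return principal / months          # interest-free edge case
    factor = (1 + monthly_rate) ** months
    return principal * monthly_rate * factor / (factor - 1)

# $100,000 at 6% over 30 years:
print(round(monthly_payment(100_000, 0.06, 360), 2))  # 599.55
```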
S11: 12.6-9 Team Organization
  • Motivation
    • Common wisdom: Many hands make light work.
    • Q? Is completion time = work (man-hours) / team size?
    • "Mythical Man months" in software project:
      • Adding manpower to a late project makes it later
    • To use workers effectively, the group should be structured!
  • Chief programmer team [Fig. 12.10]
  • Matrix organization [Fig. 12.11]
  • Team issues: channels of communication
    • Democratic team has O(n^2) channels [Fig. 12.8, 12.9]
    • Chief programmer team has O(n) channels [Fig. 12.10]
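The channel counts can be checked with simple arithmetic: a fully connected (democratic) team of n members has n(n-1)/2 pairwise channels, while a chief programmer team routes communication through the chief, giving n-1 channels.

```python
def democratic_channels(n: int) -> int:
    """Pairwise channels in a fully connected team: O(n^2)."""
    return n * (n - 1) // 2

def chief_programmer_channels(n: int) -> int:
    """Channels through a single chief programmer: O(n)."""
    return n - 1

for n in (4, 8):
    print(n, democratic_channels(n), chief_programmer_channels(n))
# n=4: 6 vs 3 channels; n=8: 28 vs 7 channels
```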
S12: 12.10-12 Portability
  • Incompatibility Issues
    • Hardware - character sets, instruction sets, word-size
    • Operating Systems: size limits, system calls
    • Compilers deviate from language standards (J++ vs. Java)
  • Motivation for portability
    • Vendors: Save costs of conforming to multiple platforms
    • Users: Save cost of upgrading, support etc.
  • What should be portable?
    • Data, application software, system software
  • Techniques
    • Isolate platform-dependent pieces (Unix device independence)
    • Use levels of abstraction (e.g. the OSI network layers)
    • Use standardized languages, libraries (Standard C++)
    • Use text-based data files with a simple format
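Two of these techniques can be sketched together: confine the platform-dependent choice to one function so the rest of the program stays portable, and keep data in a simple text format. The function names and config-directory conventions here are illustrative assumptions.

```python
import os
import sys

def user_config_dir(app: str) -> str:
    """The one place where the platform difference lives."""
    if sys.platform.startswith("win"):
        base = os.environ.get("APPDATA", os.path.expanduser("~"))
    else:
        base = os.path.expanduser("~/.config")
    return os.path.join(base, app)

def dump_settings(settings: dict) -> str:
    """Portable text-based data file: one key=value pair per line."""
    return "".join(f"{k}={v}\n" for k, v in sorted(settings.items()))

print(repr(dump_settings({"lang": "en", "font": "mono"})))
```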
S13: 12.24-25 CASE Tools, MSG Case Study
  • CASE Tools (See Chapter 4.4)
    • Version management
    • Coding tools: structured editors (emacs), pretty printers,
      • interface checking (lint), source level debuggers
    • Online documentation, Team communication tools
  • 12.25 MSG Case Study (Fig. 12.22, pp. 427)
    • Q? Does Fig. 12.22 cover all inputs and functionalities?
    • Q? Provide 2 more equivalence classes and test cases
      • for "item name" and "item number" in "Investment" !
    • Q? Design black-box test cases for "mortgage"