S1: CSci 8701 - Week 2
Copyright: S. Shekhar, C. S. Dept., University of Minnesota, Minneapolis, MN 55455.
S2: Motivation (Why Benchmark?)
  • Goal: assist in answering
    • Which computer should I buy?
    • For my application domain
    • Choose cheapest system that does the job
  • Audience
    • Programmers: choose design alternatives
    • Product developers: compare w/ competitors
    • End users: range of performance expected for the systems they are buying


S3: Motivation (Why Benchmark?)
  • Emerging consensus on quantitative comparison
    • what to measure
    • how to measure it
  • Application workload: Benchmark
    • Generic - SPEC, TP1, TPC-A
    • Domain Specific- Sequoia 2000
  • System Metrics (see the sketch below)
    • Performance: throughput (work per second)
    • Performance: response time
    • Price: 5-year cost of ownership
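  A minimal sketch of how these metrics combine, in Python; every number
  below is a hypothetical placeholder, not a result from any audited run:

    # Hypothetical numbers for illustration only -- not audited results.
    completed_txns = 120_000          # transactions finished during the run
    run_seconds = 600                 # length of the measurement interval
    latencies_ms = [12, 18, 25, 40]   # sample per-transaction response times
    five_year_cost = 250_000.0        # 5-year cost of ownership, USD

    throughput_tps = completed_txns / run_seconds        # work per second
    avg_response_ms = sum(latencies_ms) / len(latencies_ms)
    price_performance = five_year_cost / throughput_tps  # $ per tps

    print(f"throughput : {throughput_tps:.1f} tps")
    print(f"response   : {avg_response_ms:.1f} ms (mean)")
    print(f"price/perf : ${price_performance:,.2f} per tps")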


S4: Evaluating Benchmarks
  • What is a useful benchmark?
    • Four criteria
    • Source: The Benchmark Handbook (J. Gray, ed.)
  • Relevance
    • measure peak performance and price/performance on typical tasks in the problem domain
  • Portability
    • across systems and architectures
  • Scalability
    • Applicable to small and large computers
    • Scales to larger systems and parallel computers
  • Simplicity
    • Easy to understand, credible


S5: Standard Bodies Defining Benchmarks
  • Transaction Processing Performance Council (TPC)
    • Founded in 1988 by Omri Serlin
    • Consortium of 35 hw/sw companies
    • Defines benchmarks for TP and DSS
    • End-to-end performance, from terminals to server and back (sketched below)
    • Defines cost/performance metrics and provides official audits
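  A sketch of the end-to-end measurement idea in Python: each request is
  timed from the terminal side, so network, queueing, and server work are
  all included. fake_server is a hypothetical stand-in for a real
  terminal-to-server round trip:

    import statistics
    import time

    def timed_request(send_and_wait, request):
        # Time one request end to end: from submission to reply.
        start = time.perf_counter()
        reply = send_and_wait(request)   # terminal -> server -> terminal
        return reply, time.perf_counter() - start

    def fake_server(request):
        # Placeholder: simulates server work plus network delay.
        time.sleep(0.01)
        return f"ok:{request}"

    samples = sorted(timed_request(fake_server, i)[1] for i in range(100))
    print(f"median response : {statistics.median(samples) * 1000:.1f} ms")
    print(f"90th percentile : {samples[89] * 1000:.1f} ms")

  TPC-style benchmarks typically bound a high percentile of response time
  (e.g., the 90th), not just the mean, which the last line illustrates.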


S7: DB Benchmarks
  • Gray's Benchmark Handbook
  • Transaction Processing Performance Council (TPC)
    • TPC-C: the TP benchmark
    • TPC-D: the decision-support benchmark
  • Other Generic
    • ANSI SQL Standard - AS3AP, NIST
    • Engineering, CAD: Cattell
    • Text Retrieval
    • SPEC
  • Domain Specific
    • Sequoia 2000: Earth science, object-relational databases
    • Geographic Info. Systems
    • E-commerce
    • Data Mining


S8: Limitations of Benchmarks
  • Self-evaluation by vendors
    • Can be manipulated to improve numbers
  • Benchmark wars among vendors
    • Products will never exceed the quoted performance
    • Benchmarketing: for each system there is a benchmark that rates that system best
  • EPA warning
    • "... actual mileage may vary according to road conditions and driving habits. Use ... for comparison purposes only."


S9: Exercise
  • Evaluate TP1 and Sequoia 2000
    • Workload
    • Measures
    • Relevance
    • Portability
    • Scalability
    • Simplicity

