© 2003, Carla Ellis. Self-Scaling Benchmarks. Peter Chen and David Patterson, "A New Approach to I/O Performance Evaluation – Self-Scaling I/O Benchmarks, Predicted I/O Performance," SIGMETRICS 1993.


Page 1:

© 2003, Carla Ellis

Self-Scaling Benchmarks

Peter Chen and David Patterson, "A New Approach to I/O Performance Evaluation – Self-Scaling I/O Benchmarks, Predicted I/O Performance," SIGMETRICS 1993.

Page 2:

[Diagram: taxonomy of workloads vs. experimental environments. Environments: real system, prototype, execution-driven simulation, trace-driven simulation, stochastic simulation. "Real" workloads: live workload, benchmark applications, micro-benchmark programs. Made-up workloads: synthetic benchmark programs, synthetic traces, traces, distributions & other statistics, datasets. Arrows labeled monitor, analysis, and generator connect them. "You are here": synthetic benchmark programs.]

Page 3:

Goals

• A benchmark that automatically scales across current and future systems
  – It dynamically adjusts to the system under test

• Predicted performance based on self-scaling evaluation results
  – Estimate performance for unmeasured workloads
  – Basis for comparing different systems

Page 4:

Characteristics of an Ideal I/O Benchmark

The benchmark should:
1. Help in understanding why – isolate the reasons for poor performance
2. Be I/O limited
3. Scale gracefully
4. Allow fair comparisons among machines
5. Be relevant to a wide range of applications
6. Be tightly specified, be reproducible, and explicitly state its assumptions

Current benchmarks fail these criteria.

Page 5:

Overview of Approach

• Step 1: Scaling – the benchmark automatically explores the workload space to find a relevant workload.
  – Because the workload depends on the system under test, the ability to compare systems on benchmark results is lost.

• Step 2: A predicted-performance scheme helps restore that capability.
  – The accuracy of the prediction must be assured.

Page 6:

Workload Parameters

• uniqueBytes – total size of data accessed
• sizeMean – average size of an I/O request
  – Individual request sizes are chosen from a normal distribution
• readFrac – fraction of reads; the fraction of writes is 1 − readFrac
• seqFrac – fraction of requests that are sequential
  – With multiple processes, each has its own sequential thread
• processNum – concurrency

The workload is a user-level program with these parameters set.
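The parameter list above can be sketched as a small user-level workload generator. This is a minimal illustration, not the paper's actual program: it assumes a single process (processNum = 1), and the file path, request count, and size_stddev knob are made up for the example.

```python
import os
import random

def run_workload(path, unique_bytes, size_mean, read_frac, seq_frac,
                 num_requests=1000, size_stddev=None):
    """Issue num_requests I/Os against a file of unique_bytes bytes,
    parameterized as on the slide (single-process sketch)."""
    if size_stddev is None:
        size_stddev = size_mean / 4
    # Create the data file once so reads have something to hit.
    with open(path, "wb") as f:
        f.write(os.urandom(unique_bytes))
    offset = 0
    with open(path, "r+b") as f:
        for _ in range(num_requests):
            # Request sizes drawn from a normal distribution (per the slide).
            size = max(1, int(random.gauss(size_mean, size_stddev)))
            # With probability seq_frac continue sequentially,
            # otherwise seek to a random offset in the file.
            if random.random() >= seq_frac:
                offset = random.randrange(max(1, unique_bytes - size))
            f.seek(offset)
            # readFrac of the requests are reads; the rest are writes.
            if random.random() < read_frac:
                f.read(size)
            else:
                f.write(os.urandom(size))
            offset = (offset + size) % unique_bytes
```

For processNum > 1, one would fork that many processes, each running this loop on its own stream.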

Page 7:

Representativeness

Does such a synthetic workload have the “right” set of parameters to capture a real application (characterized by its values for that set of parameters)?

Page 8:

Benchmarking Results

• A set of performance graphs, one per parameter, varying that parameter while all other parameters are held fixed at their focal-point values.
  – Focal points are 75% performance points, found by an iterative search process.

• More of the workload space is explored than with a single fixed workload.

• Does not capture dependencies among parameters.
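The iterative search for a focal point can be sketched as a bisection. Two assumptions here, made for illustration: the "75% performance point" is read as the parameter value where throughput falls to 75% of its peak, and measure() is a hypothetical callback that runs the benchmark at one parameter setting while the others stay fixed.

```python
def find_focal_point(measure, lo, hi, peak, target_frac=0.75, iters=20):
    """Find x in [lo, hi] where measure(x) ~= target_frac * peak,
    assuming measure() decreases monotonically from lo to hi
    (e.g. throughput vs. uniqueBytes as data spills out of the cache)."""
    target = target_frac * peak
    for _ in range(iters):
        mid = (lo + hi) / 2
        if measure(mid) > target:
            lo = mid   # still above the target: move toward hi
        else:
            hi = mid   # at or below the target: move toward lo
    return (lo + hi) / 2
```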

Page 9:

focal point = (21MB, 10KB, 0,1,0)

Page 10:

Families of Graphs

• General applicability – representative across the range of each parameter (the 75% rationale)

• Multiple performance regions – especially evident for uniqueBytes because of the storage hierarchy
  – Focal points on a region border are unstable; prefer mid-range focal points

Page 11:

[Graphs: throughput curves annotated with cache and disk performance regions. Annotations: larger requests are better; reads are better than writes; sequential access helps in some regions and has little effect in others.]

Page 12:

Predicted Performance

• Problem: the workload chosen by the self-scaling benchmark will differ between two systems, so they cannot be directly compared.

• Solution: estimate performance for unmeasured workloads, so that a common set of workloads can be used for comparison.

Page 13:

How to Predict

• Assume the shape of the performance curve for one parameter is independent of the values of the other parameters.

• Use the self-scaling benchmark to measure performance while varying one parameter, with all others fixed at their focal points.

Page 14:

• Solid lines: measured performance with sizeMean fixed at its focal value Sf (left graph), and with processNum fixed at its focal value Pf (right graph).

• To predict the throughput curve with sizeMean = S1, assume the ratio

  Throughput(processNum, Sf) / Throughput(processNum, S1)

  is constant in processNum; its value is known at processNum = Pf from the right-hand graph.
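Under that constant-ratio assumption, a predicted point is just a rescaling of the measured curves. A minimal sketch, with the two measured single-parameter curves represented as hypothetical dicts keyed by parameter value:

```python
def predict_throughput(tp_vs_procnum, tp_vs_sizemean, p, s, s_focal):
    """Predict Throughput(p, s) from two measured curves:
       tp_vs_procnum[p]  = Throughput(p, s_focal)  (sizeMean fixed at focal)
       tp_vs_sizemean[s] = Throughput(P_f, s)      (processNum fixed at focal)
    assuming the ratio Throughput(p, s_focal) / Throughput(p, s) is the
    same for every p, hence equal to its value measured at p = P_f."""
    ratio = tp_vs_sizemean[s_focal] / tp_vs_sizemean[s]
    return tp_vs_procnum[p] / ratio
```

If throughput really were separable, T(p, s) = f(p) * g(s), the prediction would be exact; the slide on accuracy shows how far real systems deviate from this.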

Page 15:

Accuracy of Predictions

For a SPARCstation with one disk, measured at random points in the parameter space. Error is correlated with uniqueBytes.

Page 16:

Comparisons

Page 17:

For Discussion Next Thursday (because of snow)

• Survey the types of workloads – especially the standard benchmarks – used in your proceedings (10 papers).

• www.cs.wisc.edu/~arch/www/tools.html is a great resource.


Page 18:

Continued discussion of reinterpreting an experimental paper into a strong-inference model.