Experiments in computer science Emmanuel Jeannot INRIA – LORIA Aleae Kick-off meeting April 1st 2009


Page 1

Experiments in computer science

Emmanuel Jeannot

INRIA – LORIA

Aleae Kick-off meeting

April 1st 2009

Page 2

Experimental validation Emmanuel Jeannot 2/26

The discipline of computing: an experimental science

The studied objects (hardware, programs, data, protocols, algorithms, networks) are becoming more and more complex.

Modern infrastructures:
• Processors have many advanced features: cache, hyperthreading, multi-core
• The operating system impacts performance (process scheduling, socket implementation, etc.)
• The runtime environment plays a role (MPICH ≠ OpenMPI)
• Middleware has an impact (Globus ≠ GridSolve)
• Various parallel architectures that can be: heterogeneous, hierarchical, distributed, dynamic
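A micro-benchmark sketch (illustrative, not from the slides) of why these hardware features make performance hard to predict analytically: the same summation over the same data, traversed sequentially vs. in shuffled order. On typical hardware the shuffled traversal is slower because it defeats the cache and prefetcher.

```python
import random
import time

# Illustrative sketch: identical work, different memory-access pattern.
N = 1_000_000
data = list(range(N))
seq_idx = list(range(N))
rnd_idx = seq_idx[:]
random.seed(0)
random.shuffle(rnd_idx)

def timed_sum(indices):
    """Sum data[i] for i in indices, returning (total, elapsed seconds)."""
    start = time.perf_counter()
    total = 0
    for i in indices:
        total += data[i]
    return total, time.perf_counter() - start

total_seq, t_seq = timed_sum(seq_idx)
total_rnd, t_rnd = timed_sum(rnd_idx)
assert total_seq == total_rnd  # identical computation, different access order
print(f"sequential: {t_seq:.3f}s  shuffled: {t_rnd:.3f}s")
```

The exact gap depends on the machine (cache sizes, prefetcher, interpreter overhead), which is precisely the point: the result has to be measured, not derived.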

Page 3


Analytic study

Purely analytical (math) models:
• Demonstration of properties (theorems)
• Models need to be tractable: over-simplification?
• Good for understanding the basics of the problem
• Most of the time one still performs experiments (at least for comparison)

For a practical impact: an analytic study is not always possible, or not sufficient.
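A toy illustration (not from the slides) of pairing an analytic model with an experiment: the model says comparison-based merge sort performs at most roughly n·log2(n) element comparisons; the experiment counts the comparisons actually made on random input and checks them against the model's bound.

```python
import math
import random

def mergesort(a, counter):
    """Sort a, incrementing counter[0] once per element comparison."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = mergesort(a[:mid], counter), mergesort(a[mid:], counter)
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1  # one comparison per merge step
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

random.seed(0)
n = 1024
counter = [0]
result = mergesort([random.random() for _ in range(n)], counter)
model_bound = n * math.log2(n)  # analytic prediction: 10240 for n = 1024
```

Even here the theorem alone is not the end of the story: the measured count tells you how far typical inputs sit below the worst-case bound.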

Page 4


The experimental culture is not comparable with that of other sciences. Different studies:

• In the 90's, between 40% and 50% of CS ACM papers requiring experimental validation had none (15% in optical engineering) [Lukowicz et al.]

• "Too many articles have no experimental validation" [Zelkowitz and Wallace 98]: 612 articles published by IEEE.

• Quantitatively, there are more experiments with time.

Computer science is not at the same level as some other sciences:
• Nobody redoes experiments (no funding).
• Lack of tools and methodologies.

M.V. Zelkowitz and D.R. Wallace. Experimental models for validating technology. Computer, 31(5):23–31, May 1998.

Page 5


Two types of experiments

• Test and compare:
1. Model validation (comparing models with reality)
2. Quantitative validation (measuring performance)

• Both can occur at the same time. Example: validating the implementation of an algorithm checks both that the underlying model is precise and that the design is correct.

[Diagram: Idea/need → Design → Implementation → Experimental validation; Observation → Model → Prediction → Experimental test]

Page 6

Experimental validation

A good alternative to analytical validation:
• Provides a comparison between algorithms or programs
• Provides a validation of the model, or helps to define the model's validity domain

Four methodologies:
• In-situ (real scale)
• Simulation
• Emulation
• Benchmarking
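A minimal sketch of what the simulation methodology buys you (hypothetical example, not SimGrid's API): a tiny discrete-event simulator in which time is virtual, so the "experiment" is fully reproducible and can be scaled without real hardware.

```python
import heapq

def simulate(tasks, host_speed):
    """Run tasks serially on one host in virtual time.

    tasks: list of (release_time, work) pairs; host_speed: work units
    per virtual second. Returns each task's completion time, in
    release order.
    """
    events = []  # min-heap ordered by release time
    for release, work in tasks:
        heapq.heappush(events, (release, work))
    clock = 0.0
    done = []
    while events:
        release, work = heapq.heappop(events)
        # The host starts a task when both the task is released
        # and the previous task has finished.
        clock = max(clock, release) + work / host_speed
        done.append(clock)
    return done

# Two tasks on a host processing 2 work units per virtual second.
print(simulate([(0.0, 4.0), (1.0, 2.0)], host_speed=2.0))
```

Because the clock is simulated, rerunning this "experiment" always gives identical results, unlike an in-situ run on a shared platform.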


Page 7

Four Methodologies for Experimentation

[Diagram: the four methodologies placed on two axes — real application ↔ model of the application, and real environment ↔ model of the environment:
• In-situ (real scale): Grid'5000, DAS-3, PlanetLab
• Benchmarking: Linpack, NAS, Montage workflow
• Emulation: Wrekavoc, MicroGRID, ModelNet
• Simulation: SimGrid, GridSim, P2PSim]

Page 8

Complementarity of the approaches


                  SimGrid (simulation)   Benchmarking   In-situ     Emulation
Real application  No                     Model          Yes         Yes
Abstraction       Very high              High           No          Low
Execution time    Speed-up               Preserved      Preserved   Preserved
Folding           Mandatory              No             No          No
Heterogeneity     Controllable           No             No          Controllable

Page 9

Simulation vs real scale


A good strategy:
1. Build benchmarks to assess the characteristics of the environments
2. Calibrate models with real-scale experiments
3. Test real applications against benchmarks
4. Perform large-scale and reproducible experiments with simulation
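Step 2 above can be sketched as follows (illustrative, with synthetic stand-ins for real benchmark runs): calibrating the classic alpha-beta communication model T(m) = alpha + beta·m (latency plus per-byte cost) by least squares, so the fitted parameters can later drive a simulator.

```python
def fit_alpha_beta(sizes, times):
    """Least-squares fit of times = alpha + beta * sizes."""
    n = len(sizes)
    mean_s = sum(sizes) / n
    mean_t = sum(times) / n
    cov = sum((s - mean_s) * (t - mean_t) for s, t in zip(sizes, times))
    var = sum((s - mean_s) ** 2 for s in sizes)
    beta = cov / var           # per-byte cost (slope)
    alpha = mean_t - beta * mean_s  # latency (intercept)
    return alpha, beta

# Synthetic "benchmark" measurements: 50 us latency, 1 ns/byte cost.
sizes = [1_000, 10_000, 100_000, 1_000_000]
times = [50e-6 + 1e-9 * s for s in sizes]

alpha, beta = fit_alpha_beta(sizes, times)
# The calibrated model can now predict unmeasured transfer sizes
# (and feed step 4, large-scale simulation).
predicted = alpha + beta * 500_000
```

With real data the fit will not be exact, and the residuals indicate where the model's validity domain ends.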

Page 10

Conclusion


• Experimentation is important to validate our approaches (models, solutions, etc.)

• But it has to be done carefully!

• This is an important part of the project

• Important: we are funded (partially) because we promised we would use Grid'5000