RoSta Benchmarking, GEM and Benchmarking WS, April 06, 2009, Leuven
Benchmarking
Recommendations for Benchmarking in Mobile Service Robotics
Content
• Introduction to RoSta in general
• Review of State of the Art – Benchmarking activities
• Review of Requirement Analysis
• Guidelines for benchmarking
• Outlook and next steps
Introduction to RoSta
Vision
• Initiative on the definition of formal standards and the establishment of “de facto” standards in the field of service robotics.
• Formulation of standards (action plans) in a few selected key topics with the highest possible impact.
• Form the root of a whole chain of standard-defining activities going far beyond the specific activities of RoSta.
http://www.robot-standards.eu
http://wiki.robot-standards.org
Introduction to RoSta

Three tasks
• Task 4.1 Compilation and evaluation of the state of the art in benchmarking service robots
• Task 4.2 Requirement analysis
• Task 4.3 Action plan and recommendation

4 Topics (“Action Lines”)
• Glossary/ontology for mobile manipulation and service robots
• Specification of a reference architecture
• Specification of a middleware
• Formulation of benchmarks
Review State of the Art - Benchmarking
Evaluation of current benchmarks by the following characteristics:
• Complexity
  - Technology
  - Component
  - System
• Type of benchmark
  - Simulation
  - Real-world test
  - Competition
• Used hardware
  - Standard hardware
  - Own/custom hardware
• Input
  - Database
  - Perception by sensors
• Single- or multi-robot benchmark
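The five classification axes above can be captured in a small descriptor; the following is a minimal sketch in Python (all names are illustrative, and the RoboCup example is only one plausible way to fill in the axes):

```python
from dataclasses import dataclass
from enum import Enum

class Complexity(Enum):
    TECHNOLOGY = "technology"
    COMPONENT = "component"
    SYSTEM = "system"

class BenchmarkType(Enum):
    SIMULATION = "simulation"
    REAL_WORLD = "real-world test"
    COMPETITION = "competition"

class Hardware(Enum):
    STANDARD = "standard hardware"
    CUSTOM = "own/custom hardware"

class InputSource(Enum):
    DATABASE = "database"
    SENSORS = "perception by sensors"

@dataclass
class BenchmarkDescriptor:
    """One entry in a survey of benchmarking activities."""
    name: str
    complexity: Complexity
    kind: BenchmarkType
    hardware: Hardware
    input_source: InputSource
    multi_robot: bool

# Example: a RoboCup-style competition classified along the five axes.
robocup = BenchmarkDescriptor(
    name="RoboCup soccer",
    complexity=Complexity.SYSTEM,
    kind=BenchmarkType.COMPETITION,
    hardware=Hardware.CUSTOM,
    input_source=InputSource.SENSORS,
    multi_robot=True,
)
```

Recording each surveyed benchmark in such a uniform descriptor is what makes the state-of-the-art comparison systematic rather than anecdotal.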
Review of Requirement Analysis
A benchmark useful for the community needs to consist of:
1. A clear and complete definition of
   - the robot,
   - the environment, and
   - the evaluation metric.
2. A way to authenticate the performance of a solution.
3. A way to promote the research performed on this benchmark.

Furthermore:
• Standardised open architectures and simulators
• Centralisation for a living benchmarking culture
• Community driven
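The three required parts of a benchmark definition can be sketched as a data schema; this is a hypothetical illustration (the navigation example, its specifications, and the attestation field are all assumptions, not part of RoSta):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class BenchmarkDefinition:
    """The three parts a community benchmark must define."""
    robot_spec: str                   # clear and complete robot definition
    environment_spec: str             # environment definition
    metric: Callable[[dict], float]   # evaluation metric over a run's raw measurements

@dataclass
class BenchmarkRun:
    benchmark: BenchmarkDefinition
    raw_measurements: dict
    attestation: str   # e.g. a referee statement or signed log, to authenticate the run

    def score(self) -> float:
        return self.benchmark.metric(self.raw_measurements)

# Example: a navigation benchmark scored by goal success rate.
nav = BenchmarkDefinition(
    robot_spec="any differential-drive base, max footprint 0.6 m",
    environment_spec="20 m x 20 m office floor with static obstacles",
    metric=lambda m: m["goals_reached"] / m["goals_attempted"],
)
run = BenchmarkRun(nav, {"goals_reached": 8, "goals_attempted": 10},
                   attestation="referee-signed log")
print(run.score())  # 0.8
```

Keeping the metric as an explicit function of the raw measurements separates what is measured from how it is scored, which is what makes results comparable across groups.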
Guidelines for benchmarking
Recommendations for:
• Benchmarking statement in project proposals
• Benchmarking process
• Definition of new benchmarks
• Modification of benchmarks
• Web portal for working on and with benchmarks
Guidelines for benchmarking
Guideline for the benchmarking statement in project proposals
• Functional decomposition
• Clarification against the call requirements
• Check against the indicator list
  - High indicator -> benchmark the functionality
  - Low indicator -> argument to the EC for not benchmarking the functionality
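The indicator check above amounts to a simple decision per decomposed function; a minimal sketch, assuming a numeric indicator score and an arbitrary hypothetical threshold of 0.7:

```python
# Hypothetical cut-off between "high" and "low" indicator; the slides do
# not specify a value, so this is an assumption for illustration only.
HIGH_INDICATOR = 0.7

def benchmarking_statement(indicators: dict) -> dict:
    """For each system function, decide: benchmark it, or justify to the EC."""
    plan = {}
    for function, score in indicators.items():
        if score >= HIGH_INDICATOR:
            plan[function] = "benchmark"
        else:
            plan[function] = "justify to EC why not benchmarked"
    return plan

# Example with made-up indicator scores for three decomposed functions.
plan = benchmarking_statement({"navigation": 0.9, "grasping": 0.8, "logging": 0.2})
```

The point of the low-indicator branch is that a proposal should never silently omit benchmarking; it either benchmarks a functionality or argues explicitly why not.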
Guidelines for benchmarking

Guideline for the benchmarking process
• Benchmark enquiry
• Use an existing benchmark
• Modify a benchmark
• Define a new benchmark

[Flowchart: itemise and structure the project objectives and the system functions, then run a benchmark enquiry against the benchmark database. If a mapping between functions and objectives exists, apply the existing benchmark; if it exists only partly, decide whether to modify an existing benchmark or establish a new one; if no mapping exists, create a new benchmark. Modified and new benchmarks are inserted into the database, together with the reasons, via an open, moderated web portal.]
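The enquiry step above can be sketched as control flow; the database is modelled as a plain list of benchmark names and the substring matching stands in for the real mapping between project functions and benchmarks (both are simplifying assumptions):

```python
def benchmark_enquiry(function: str, database: list) -> tuple:
    """Return (action, benchmark) for one system function.

    Actions mirror the process flowchart: apply an existing benchmark,
    modify a partially matching one, or create a new benchmark.
    """
    exact = [b for b in database if b == function]
    partial = [b for b in database if function in b or b in function]
    if exact:
        return ("apply", exact[0])      # mapping exists: use the benchmark as-is
    if partial:
        return ("modify", partial[0])   # partial mapping: adapt, then re-insert into the DB
    return ("create", function)         # no mapping: define a new benchmark

db = ["indoor navigation", "object grasping"]
print(benchmark_enquiry("indoor navigation", db))  # ('apply', 'indoor navigation')
print(benchmark_enquiry("navigation", db))         # ('modify', 'indoor navigation')
print(benchmark_enquiry("speech dialogue", db))    # ('create', 'speech dialogue')
```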
Guidelines for benchmarking

Guideline for defining new benchmarks
• Functional decomposition
• Variables per functionality
• Measuring unit per variable

[Flowchart: check whether benchmarks exist in related areas and reference them if so. Otherwise, identify the functionalities to be measured and explain the relevance of each for the project objectives. For each functionality, identify the variables to be measured and explain their relevance for the functionality. For each variable, define the variable and its measuring unit and assign a measuring specification/method; the method must be non-intrusive, repeatable and portable. Finally, document the benchmark and insert it into the database.]
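The functionality -> variable -> unit decomposition above maps directly onto a nested structure; a sketch with an invented navigation example (all concrete names, units and methods are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class Variable:
    name: str
    unit: str        # measuring unit per variable
    method: str      # assigned measuring specification/method
    relevance: str   # why this variable matters for the functionality

@dataclass
class Functionality:
    name: str
    relevance: str   # relevance for the project objectives
    variables: list  # the Variables measured for this functionality

# Example: functional decomposition of a hypothetical navigation benchmark.
navigation = Functionality(
    name="point-to-point navigation",
    relevance="core capability required by the project objectives",
    variables=[
        Variable("time to goal", "s", "external stopwatch (non-intrusive)",
                 "measures efficiency of the planner"),
        Variable("path length", "m", "overhead tracking system",
                 "measures optimality of the chosen route"),
    ],
)
```

Documenting a benchmark in this form forces every variable to carry its unit, its measuring method and its justification, which is exactly what the flow requires before insertion into the database.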
Outlook and next steps
Web portal for central work on and with benchmarks
• Database for benchmarks, results and documentation
• Platform for discussion, evaluation and comparison
• Promotion of work performed and results achieved
Outlook and next steps
[Cartoon: “Then a miracle occurs.” – “Very good work! But shouldn’t we be a little bit more specific here?”]

Guidelines are there to be:
-> Followed
-> Broken
-> Helpful
-> Changed