16th Sept’02 Nick Brook – University of Bristol 1
News from the EB & LCG
Nick Brook
University of Bristol
• EB News
• LCG News
• Structures
• Review of RTAGs
EB News
Roger Barlow re-elected as deputy chair of the EB
• will take over as chair in September ’03
Reporting structures in place
• measures manpower effort and deadlines, but also requests the experiments’ requirements for UK (Tier1/A) resources
• first reports appear for Q2 ’02 – available on the EB web page
• next submissions are due now
EB News
6 application submissions to Sheffield All-Hands conference
• Applications are beginning to deliver their projects
• New format for the GridPP experiment sessions
All but one application position is now filled (the last remaining vacant post will be filled at the beginning of October)
Successful joint ATLAS-LHCb workshop at Coseners
• http://www.phy.bris.ac.uk/research/pppages/LHCb/coseners/CosenersHouse.htm
Fundamental Goal of the LCG
To help the experiments’ computing projects get the best, most reliable and accurate physics results from the data coming from the detectors
Phase 1 – 2002-05: prepare and deploy the environment for LHC computing
Phase 2 – 2006-08: acquire, build and operate the LHC computing service
Phase 1 - High-level Goals
• development/support for applications – libraries, tools, frameworks, data management (inc. persistency), … common components
• develop/acquire the software for managing a distributed computing system on the scale required for LHC – the local computing fabric, integration of the fabrics into a global grid
• put in place a pilot service – “proof of concept” – the technology and the distributed analysis environment, a platform for learning how to manage and use the system
• provide a solid service for physics and computing data challenges
• produce a TDR describing the distributed LHC computing system for the first years of LHC running
• maintain opportunities for re-use of developments outside the LHC programme

To prepare and deploy the environment for LHC computing
The LHC Computing Grid Project Organisation

[Organisation chart: the Project Overview Board oversees the project; the LHCC reviews it and receives reports; the Common Computing RRB (funding agencies) provides resources; the SC2 passes requirements to, and monitors, the Project Execution Board.]
SC2 & PEB Roles

SC2 includes the four experiments and Tier 1 Regional Centres
• identifies common solutions and sets requirements for the project
• may use an RTAG – Requirements and Technical Assessment Group
  – limited scope, two-month lifetime with an intermediate report
  – one member per experiment + experts
• approves the work plan, monitors progress

PEB manages the implementation
• organising projects and work packages
• coordinating between the Regional Centres
• collaborating with Grid projects
• organising grid services
SC2 Monitors Progress of the Project
• Receives regular status reports from the PEB
• Written status report every 6 months
  – milestones, performance, resources
  – estimates of time and cost to complete
• Organises a peer review
  – about once a year
  – presentations by the different components of the project
  – review of documents
  – review of planning data
Project Execution Organisation
Four areas – each with area project manager
• Applications
• Grid Technology
• Fabrics
• Grid Deployment
RTAG status

– in application software area
  • data persistency – completed 5th April ’02
  • software support process – completed 6th May ’02
  • mathematical libraries – completed 2nd May ’02
  • detector geometry description – running
  • Monte Carlo generators – running
  • applications architectural blueprint – running
  • detector simulation – running
– in fabric area
  • mass storage requirements – completed 3rd May ’02
– in Grid technology and deployment area
  • Grid technology use cases – completed 7th June ’02
  • Regional Centre categorisation – completed 7th June ’02
Current status of RTAGs (and available reports) on www.cern.ch/lcg/sc2
Data Persistency (RTAG1)
Technology
• Streaming layer should be implemented using the ROOT framework’s I/O services
• Components with relational implementations should make no deep assumptions about the underlying technology
  – nothing intentionally proposed precludes an implementation using open-source products such as MySQL
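The backend-neutrality precept can be sketched in a few lines: a component written against Python's generic DB-API, exercised here with SQLite but pointable at MySQL or any other relational engine. All class and table names are illustrative, not the actual POOL API.

```python
import sqlite3

class RelationalCatalog:
    """Toy catalog component: stores logical-name -> physical-name
    mappings through a DB-API connection, making no deep assumptions
    about which relational engine sits underneath."""
    def __init__(self, conn):
        # conn may be any DB-API connection (sqlite3, MySQL driver, ...);
        # note real drivers differ in parameter style ('?' vs '%s').
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS files (lfn TEXT PRIMARY KEY, pfn TEXT)")

    def register(self, lfn, pfn):
        self.conn.execute("INSERT INTO files VALUES (?, ?)", (lfn, pfn))

    def lookup(self, lfn):
        row = self.conn.execute(
            "SELECT pfn FROM files WHERE lfn = ?", (lfn,)).fetchone()
        return row[0] if row else None

# An in-memory SQLite database stands in for any relational backend.
cat = RelationalCatalog(sqlite3.connect(":memory:"))
cat.register("lfn:run1.events", "/data/run1.root")
print(cat.lookup("lfn:run1.events"))  # -> /data/run1.root
```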
Data Persistency (RTAG1)
Implementation – POOL
Five work package areas:
• Storage Manager & refs
• File catalog & Grid integration
• Collections & Metadata
• Dictionary & Conversion
• Infrastructure, Integration & testing
http://lcgapp.cern.ch/projects/persist
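The “Storage Manager & refs” work package above can be illustrated with a minimal sketch: a persistent token addressing an object by file/container/entry, and a lazy reference that fetches the object only when dereferenced. The names (`StorageService`, `Token`, `Ref`) are invented for illustration, not POOL’s real classes.

```python
class Token:
    """Persistent address of an object: file / container / entry index."""
    def __init__(self, pfn, container, entry):
        self.pfn, self.container, self.entry = pfn, container, entry

class StorageService:
    """Toy storage manager: containers of objects addressed by index."""
    def __init__(self):
        self._files = {}  # pfn -> {container name: [objects]}

    def write(self, pfn, container, obj):
        entries = self._files.setdefault(pfn, {}).setdefault(container, [])
        entries.append(obj)
        return Token(pfn, container, len(entries) - 1)

    def read(self, token):
        return self._files[token.pfn][token.container][token.entry]

class Ref:
    """Lazy reference: the object is only fetched when get() is called."""
    def __init__(self, svc, token):
        self._svc, self._token = svc, token

    def get(self):
        return self._svc.read(self._token)

svc = StorageService()
tok = svc.write("/data/run1.pool", "Events", {"event": 42, "tracks": 3})
ref = Ref(svc, tok)
print(ref.get()["event"])  # -> 42
```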
Grid Use Cases (RTAG4)

79-page report & 43 use cases

Global summary of the EDG response:
• Use case is already implemented (release 1.2) – 19
  – mostly basic job submission and basic data management
  – for half of these, WP8 agrees that the functionality is implemented in 1.2, but the implementation is quite a bit more complex than that outlined in the use case (esp. data management); the release 2.0 implementations look simpler
• Planned for release 2 – 10
• Will be considered for release 3 – 4
• Use case not detailed enough – 4
  – VO-wide resource allocation to users – HEPCAL did not make strong requirements on security
  – “Job Splitting” and “Production Job” were purposely vague in HEPCAL, due to the lack of a clear vision of how massive productions will be run on the Grid: one job auto-split into thousands? Or thousands of jobs somehow logically grouped into one production?
• Not planned for any release – 7
  – software publishing
  – Virtual Datasets (reliant on GriPhyN)
Regional Centres (RTAG6)
A service-oriented view should be adopted for the categorisation of regional centres

It could be profitable to revisit the overall computing model in terms of services around 2004

The important aspects for categorising RCs are:
– commitment to guarantee data management at a high QoS for the lifetime of LHC
– commitment to guarantee state-of-the-art network bandwidth to ensure efficient inter-operation
– commitment to contribute to collaborative services
LCG Blueprint (RTAG8 – ongoing)

Precepts:
• Software structure: STL/utilities, “core” infrastructure, “specialised” infrastructure
• Component model: APIs (embedding frameworks, “own” plug-ins, end users), physical/logical module granularity, role of abstract interfaces, …
• Service model: uniform, flexible access to basic framework functionality
• Object models: dumb vs. smart, enforced policies with run-time checking, clear and bullet-proof ownership model
• Distributed operation
• Global objects
• Dependencies: minimisation between components, run-time rather than compile-time
• Interface to external components: generic adapters – version & variant identification
• Exception handling
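The component-model precept (clients depend on abstract interfaces; concrete plug-ins are bound at run time rather than compile time) can be sketched as follows. Everything here (`IHistogrammer`, the registry) is an illustrative name, not Blueprint terminology.

```python
from abc import ABC, abstractmethod

class IHistogrammer(ABC):
    """Abstract interface: clients code against this, never against a
    concrete implementation, so backends can be swapped at run time."""
    @abstractmethod
    def fill(self, value): ...
    @abstractmethod
    def entries(self): ...

class SimpleHistogrammer(IHistogrammer):
    """One plug-in implementation living behind the interface."""
    def __init__(self):
        self._values = []
    def fill(self, value):
        self._values.append(value)
    def entries(self):
        return len(self._values)

# Run-time rather than compile-time dependency: the concrete class is
# looked up in a registry by name, not imported by the client code.
registry = {"simple": SimpleHistogrammer}

def make_component(name):
    return registry[name]()

h = make_component("simple")
for v in (1.0, 2.0, 3.0):
    h.fill(v)
print(h.entries())  # -> 3
```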
LCG Blueprint (RTAG8 – ongoing)

• Scripting, interpreter (ROOT/CINT, Python)
• GUI toolkits (to build experiment-specific interfaces)
• Graphics (underlying general tools)
• Analysis tools (histogramming, fitting, graphical representation, …)
• Math libraries and statistics (already established)
• Job management
• Core services (platform-independent interface to system resources on LCG platforms – Linux (GNU & Intel compilers), Solaris & Windows)
• Foundation and utility libraries (essentially maths libs & core services)
• Grid middleware interfaces (already an agreed two-experiment “common” project – GANGA)
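The generic-adapter precept applied to the middleware-interface line above can be sketched like this: experiment code submits jobs through one interface while interchangeable adapters wrap different backends. Both backends here are invented stand-ins, not real middleware APIs.

```python
class JobAdapter:
    """Generic adapter interface for job-submission backends."""
    def submit(self, executable):
        raise NotImplementedError

class LocalAdapter(JobAdapter):
    """Hypothetical local backend: identifies the job by its executable."""
    def submit(self, executable):
        return f"local:{executable}"

class GridAdapter(JobAdapter):
    """Hypothetical grid backend: hands back sequential job ids."""
    def __init__(self):
        self._next_id = 0
    def submit(self, executable):
        self._next_id += 1
        return f"grid:{self._next_id}"

def run_production(adapter, executables):
    # Experiment code sees only the JobAdapter interface; which
    # middleware actually runs the jobs is decided by the caller.
    return [adapter.submit(exe) for exe in executables]

print(run_production(LocalAdapter(), ["reco.exe"]))       # -> ['local:reco.exe']
print(run_production(GridAdapter(), ["a.exe", "b.exe"]))  # -> ['grid:1', 'grid:2']
```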
LCG Blueprint (RTAG8 – ongoing)

• Object dictionary and object model (in the context of POOL)
• Persistency and data management (in the context of POOL)
• Event processing framework (possible long-term common project components)
• Event model
• Event generation (ancillary services & support)
• Detector simulation (ditto)
• Detector geometry and materials (standard tools for describing, storing & modelling detector geometry)
• Trigger/DAQ
• Event reconstruction
• Detector calibration
LCG Summary
GridPP input to both the overall management structure of the LCG and the RTAG activities

Activities are beginning to take off:
• Persistency (POOL)
• Software processing & infrastructure
• Grid Deployment Board – first meeting Oct 4th
• Interaction with middleware providers (not just EDG)

The RTAG procedure seems slow to take off:
• lack of consistency early on – addressed by the “Blueprint” RTAG
• time consuming, with frequent overlap of the necessary experts