E-Science and LCG-2: PPAP Summary
Tony Doyle, University of Glasgow
26 October 2004, PPAP

• Results from GridPP1/LCG1
• Value of the UK contribution to LCG?
• Aims of GridPP2/LCG2
• UK special contribution to LCG2?
• How much effort will be needed to continue activities during the LHC era?
Outline

1. What has been achieved in GridPP1? [7']
   • GridPP I (09/01-08/04): prototype complete
2. What is being attempted in GridPP2? [6']
   • GridPP II (09/04-08/07): production, short timescale
   • What is the value of a UK LCG Phase-2 contribution?
3. Resources needed in the medium-long term? [10']
   • (09/07-08/10): exploitation, medium term; focus on resources needed in 2008
   • (09/10-08/14): exploitation, long term
Executive Summary

• Introduction: the Grid is a reality
• Project management: a project was/is needed (under control via the Project Map)
• Resources: deployed according to planning
• CERN: Phase 1 .. Phase 2
• Middleware: prototype(s) made impact
• Applications: fully engaged (value added)
• Tier-1/A: Tier-1 in production mode
• Tier-2: resources now being utilised
• Dissemination: UK flagship project
• Exploitation: preliminary planning

Ref: http://www.gridpp.ac.uk/
GridPP Deployment Status

Three Grids on a global scale in HEP (with similar functionality):

                 Sites     CPUs
• LCG (GridPP)   82 (14)   7,300 (1,500)
• Grid3 [USA]    29        2,800
• NorduGrid      30        3,200

GridPP deployment is part of LCG (currently the largest Grid in the world). The future Grid in the UK is dependent upon LCG releases.
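As a rough arithmetic check on the "largest Grid in the world" claim, a minimal sketch using only the site and CPU counts quoted above (the variable names are mine):

```python
# Back-of-envelope share arithmetic for the three HEP Grids listed above.
grids = {"LCG": (82, 7300), "Grid3": (29, 2800), "NorduGrid": (30, 3200)}
gridpp_sites, gridpp_cpus = 14, 1500   # GridPP is the UK subset of LCG

total_sites = sum(sites for sites, _ in grids.values())
total_cpus = sum(cpus for _, cpus in grids.values())

lcg_sites, lcg_cpus = grids["LCG"]
print(f"LCG: {lcg_sites/total_sites:.0%} of sites, {lcg_cpus/total_cpus:.0%} of CPUs")
print(f"GridPP within LCG: {gridpp_sites/lcg_sites:.0%} of sites, "
      f"{gridpp_cpus/lcg_cpus:.0%} of CPUs")
```

This gives LCG roughly 58% of the sites and 55% of the CPUs across the three Grids, with GridPP supplying about a fifth of LCG's CPUs.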
Deployment Status (26/10/04)

• Incremental releases: significant improvements in reliability, performance and scalability
  – within the limits of the current architecture
  – scalability is much better than expected a year ago
• Many more nodes and processors than anticipated
  – installation problems of last year overcome
  – many small sites have contributed to MC productions
• Full-scale testing as part of this year's data challenges
• GridPP "The Grid becomes a reality": widely reported, including by the British Embassies in Russia and the USA

[Map: GridPP technology sites.]
Data Challenges

• Ongoing: Grid and non-Grid production, with the Grid share now significant
• CMS: 75 M events and 150 TB, the first of this year's Grid data challenges
• ALICE: 35 CPU years on LCG; Phase 1 done, Phase 2 ongoing

Entering Grid Production Phase..
Data Challenge

[Pie chart: ATLAS DC2 jobs on LCG in September, by site, across ~30 sites: at.uibk, ca.triumf, ca.ualberta, ca.umontreal, ca.utoronto, ch.cern, cz.golias, cz.skurut, de.fzk, es.ifae, es.ific, es.uam, fr.in2p3, it.infn.cnaf, it.infn.lnl, it.infn.mi, it.infn.na, it.infn.roma, it.infn.to, it.infn.lnf, jp.icepp, nl.nikhef, pl.zeus, ru.msu, tw.sinica, uk.bham, uk.ic, uk.lancs, uk.man, uk.rl. No single site dominates; the largest contributions are around 9-14%.]
ATLAS DC2 - CPU usage: LCG 41%, NorduGrid 30%, Grid3 29%

Total:
• ~1,350 kSI2k.months
• ~95,000 jobs
• ~7.7 million events fully simulated (Geant4)
• ~22 TB

• 7.7 M Geant4 events and 22 TB; UK ~20% of LCG
• Production ongoing across the three Grids
• ~150 CPU years so far
• Largest total computing requirement
• Still a small fraction of what ATLAS needs..

Entering Grid Production Phase..
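Some back-of-envelope per-event costs follow directly from the DC2 totals above; a sketch (the 30-day month used to convert kSI2k.months is my assumption, not a figure from the slides):

```python
# Per-event costs implied by the DC2 totals quoted above.
ksi2k_months = 1350
jobs = 95_000
events = 7.7e6
data_tb = 22

seconds_per_month = 30 * 24 * 3600   # assumed month length
print(f"events per job: {events / jobs:,.0f}")                # ~81
print(f"data per event: {data_tb * 1e6 / events:.1f} MB")     # ~2.9 MB
print(f"CPU per event:  {ksi2k_months * seconds_per_month / events:,.0f} kSI2k.s")  # ~450
```

That is, each full Geant4 simulation of an event costs of order 450 kSI2k-seconds and produces around 3 MB.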
LHCb Data Challenge

424 CPU years (4,000 kSI2k.months), 186 M events. The UK's input was significant (>1/4 of the total).

• LCG(UK) resource:
  – Tier-1: 7.7%
  – Tier-2 sites: London 3.9%, South 2.3%, North 1.4%
• DIRAC:
  – Imperial 2.0%, Liverpool 3.1%, Oxford 0.1%, ScotGrid 5.1%

[Plot: cumulative production of the 186 M events over time: DIRAC alone; LCG in action at ~1.8 × 10^6 events/day; LCG paused; Phase 1 completed; LCG restarted at ~3-5 × 10^6 events/day.]

Entering Grid Production Phase..
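The two CPU totals quoted for this challenge can be cross-checked against each other; a sketch (the implied average CPU power is an inference, not a figure from the slides):

```python
# Consistency check: 424 CPU.years vs 4,000 kSI2k.months.
cpu_years = 424
ksi2k_months = 4000

cpu_months = cpu_years * 12
print(f"CPU.months: {cpu_months:,}")                                        # 5,088
print(f"implied average CPU power: {ksi2k_months / cpu_months:.2f} kSI2k")  # ~0.8
```

The two figures are consistent if the average machine used delivered roughly 0.8 kSI2k, plausible for the era's commodity processors.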
Paradigm Shift: Transition to Grid…

Monthly production ratios (read here as non-Grid:Grid), with each month's fraction of DC'04 in brackets:
• May: 89%:11% (11% of DC'04)
• Jun: 80%:20% (25% of DC'04)
• Jul: 77%:23% (22% of DC'04)
• Aug: 27%:73% (42% of DC'04)

424 CPU · years
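Under that reading, the overall Grid share of the challenge can be estimated by weighting each month's Grid fraction by its share of DC'04; a sketch (an illustration of the arithmetic, not a number from the slides):

```python
# Overall Grid share of DC'04, assuming each ratio is non-Grid:Grid
# and each bracketed figure is that month's fraction of DC'04.
months = {            # month: (grid_share, fraction_of_dc04)
    "May": (0.11, 0.11),
    "Jun": (0.20, 0.25),
    "Jul": (0.23, 0.22),
    "Aug": (0.73, 0.42),
}

weighted = sum(grid * frac for grid, frac in months.values())
covered = sum(frac for _, frac in months.values())
print(f"Grid share over these months: {weighted / covered:.0%}")  # ~42%
```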
What was GridPP1?

• A team that built a working prototype grid of significant scale:
  – > 1,500 (7,300) CPUs
  – > 500 (6,500) TB of storage
  – > 1,000 (6,000) simultaneous jobs
  (bare numbers are the UK contribution; bracketed numbers are the full LCG totals)
• A complex project in which 82% of the 190 tasks for the first three years were completed
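Reading the bare numbers as the UK contribution and the bracketed ones as the full LCG totals (consistent with the deployment-status slide earlier), the UK share works out as follows; a sketch:

```python
# UK share of the prototype's scale, from the figures quoted above.
uk  = {"CPUs": 1_500, "storage (TB)": 500, "simultaneous jobs": 1_000}
lcg = {"CPUs": 7_300, "storage (TB)": 6_500, "simultaneous jobs": 6_000}

for key in uk:
    print(f"{key}: UK is {uk[key] / lcg[key]:.0%} of the LCG total")
# CPUs 21%, storage 8%, simultaneous jobs 17%
```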
[Figure: GridPP1 Project Map (status date 1-Jan-04). Seven numbered areas, each broken into numbered tasks with status flags (metric OK, metric not OK, task complete, task overdue, due within 60 days, task not due soon, not active, no task or metric): 1 CERN/LCG Creation (Applications, Fabric, Technology, Deployment); 2 DataGrid (workpackages WP1-WP8); 3 Applications (ATLAS, ATLAS/LHCb, CMS, BaBar, CDF/D0, UKQCD, Other); 4 Infrastructure (Tier-1/Tier-A, Tier-2, Testbed, Rollout, Data Challenges); 5 Interoperability (Int. Standards, Open Source, Worldwide Integration, UK Integration, Monitoring); 6 Dissemination (Presentation, Participation, Engagement, Developing); 7 Resources. GridPP goal: to develop and deploy a large scale science Grid in the UK for the use of the Particle Physics community.]
A Success: "the achievement of something desired, planned, or attempted".
Aims for GridPP2? From Prototype to Production

[Diagram: three snapshots of the computing landscape.
2001: separate experiments, resources and multiple accounts: BaBar, D0, CDF, ATLAS, CMS, LHCb and ALICE; 19 UK Institutes; the RAL Computer Centre; the CERN Computer Centre.
2004: prototype Grids: SAMGrid, BaBarGrid, LCG, EDG, GANGA, EGEE and ARDA; a UK prototype Tier-1/A Centre; a CERN prototype Tier-0 Centre; 4 UK prototype Tier-2 Centres.
2007: 'one' production Grid: LCG; the UK Tier-1/A Centre; the CERN Tier-0 Centre; 4 UK Tier-2 Centres.]
Planning: GridPP2 ProjectMap

[Figure: GridPP2 Project Map, "0. Production Grid", with six areas and their elements: Management (planning, knowledge transfer, dissemination, engagement); Grid Deployment and Operations (Tier-A/Tier-1/Tier-2 deployment, computing fabric, security); Grid Technology / M/S/N (workload, data & storage, network, information, management & monitoring, middleware support); LHC Apps (ATLAS, CMS, LHCb, Ganga, applications metadata, LHC deployment portal); Non-LHC Apps (BaBar, SAMGrid, D0, CDF, UKQCD, PhenoGrid); External (LCG, interoperability, experiment support). GridPP2 Goal: to develop and deploy a large scale production quality grid in the UK for the use of the Particle Physics community.]

Need to recognise future requirements in each area…
Tier 0 and LCG: Foundation Programme

• Aim: build upon Phase 1
• Ensure the development programmes are linked
• Project management: GridPP and LCG
• Shared expertise
• LCG establishes the global computing infrastructure, allowing all participating physicists to exploit LHC data
• Earmarked UK funding being reviewed

Required Foundation: LCG Deployment
F. LHC Computing Grid Project (LCG Phase 2) [review]
Tier 0 and LCG: RRB meeting today

• Jos Engelen's proposal to RRB members (Richard Wade [UK]) on how a 20 MCHF shortfall for LCG Phase 2 can be funded
• Funding from the UK (£1m), France, Germany and Italy for 5 staff. Others?
• Spain to fund ~2 staff. Others at this level?
• Now vitally important that the LCG effort established predominantly via UK funding (40%) is sustained at this level (~10%)
• URGENT

Value to the UK? Required Foundation: LCG Deployment
What lies ahead? Some mountain climbing..

• Annual data storage: 12-14 PetaBytes per year
• CPU: 100 Million SPECint2000, i.e. ~100,000 PCs (3 GHz Pentium 4)

[Scale graphic: Concorde (15 km) and a CD stack holding one year of LHC data (~20 km), against base camp at 1 km: "we are here".]

Quantitatively, we're ~7% of the way there in terms of CPU (7,000 of 100,000) and disk (4 PB of 12-14 PB/year over 3-4 years of running)… In production terms, we've made base camp.

The importance of step-by-step planning: pre-plan your trip, carry an ice axe and crampons, and arrange for a guide…
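The "~7%" figure follows from the numbers on this slide; a sketch (the 13 PB/year over 3.5 years is my mid-range reading of "12-14 PB/year" and "3-4 years"):

```python
# Reproducing the "~7% of the way there" estimate.
cpus_now, cpus_needed = 7_000, 100_000
disk_now_pb = 4
disk_needed_pb = 13 * 3.5   # assumed mid-range target

print(f"CPU:  {cpus_now / cpus_needed:.0%} of target")        # 7%
print(f"disk: {disk_now_pb / disk_needed_pb:.0%} of target")  # ~9%
```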
Grid and e-Science Support in 2008

What areas require support?

IV. Facilities and Fabrics:
• Running the Tier-1 Data Centre
• Annual hardware upgrade
• Contribution to Tier-2 Sysman effort ((non-PPARC) hardware)
• Frontend Tier-2 hardware
• Contribution to Tier-0 support

III. Grid Middleware:
• One M/S/N expert in each of the 6 areas
• Production manager and four Tier-2 coordinators

II. Application Middleware:
• Application/Grid experts (UK support)

I. Experiment Layer:
• ATLAS Computing MoU commitments and support
• CMS Computing MoU commitments and support
• LHCb core tasks and computing support
• ALICE computing support
• Future experiments adopt e-Infrastructure methods

No GridPP management is assumed (production mode established, with management devolved to Institutes).
PPARC Financial Input: GridPP1 Components (6/Feb/2004)

[Pie chart: £3.57m, £5.67m, £3.74m, £2.08m and £1.84m split across CERN, DataGrid, Tier-1/A, Applications and Operations.]

• CERN: LHC Computing Grid Project (LCG); applications, fabrics, technology and deployment
• DataGrid: European DataGrid (EDG); middleware development
• Tier-1/A: UK Tier-1/A Regional Centre; hardware and manpower
• Applications: Grid application development; LHC and US experiments + Lattice QCD
• Operations: management, travel etc
PPARC Financial Input: GridPP2 Components (May 2004)

[Pie chart: £0.75m, £2.62m, £3.02m, £0.88m, £0.69m, £2.75m, £2.79m, £1.00m and £2.40m split across Tier-1/A hardware, Tier-2 operations, Applications, M/S/N, LCG-2, management, travel, operations and Tier-1/A operations.]

A. Management, Travel, Operations
B. Middleware, Security and Network Development
C. Grid Application Development: LHC and US experiments + Lattice QCD + Phenomenology
D. Tier-2 Deployment: 4 Regional Centres; M/S/N support and system management
E. Tier-1/A Deployment: hardware, system management, experiment support
F. LHC Computing Grid Project (LCG Phase 2) [review]
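Summing the component budgets quoted on this slide and the GridPP1 slide before it shows the two projects were funded at essentially the same level; a sketch (amounts in £m, listed in slide order):

```python
# Totals of the component budgets from the two funding slides.
gridpp1 = [3.57, 5.67, 3.74, 2.08, 1.84]
gridpp2 = [0.75, 2.62, 3.02, 0.88, 0.69, 2.75, 2.79, 1.00, 2.40]

print(f"GridPP1 total: £{sum(gridpp1):.2f}m")  # £16.90m
print(f"GridPP2 total: £{sum(gridpp2):.2f}m")  # £16.90m
```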
IV. Hardware Support

UK Tier-1 in 2008: CPU total 4.2 MSI2k; disk total 3.8 PB; tape total 2.3 PB.
UK Tier-2 in 2008: CPU total 8.0 MSI2k; disk total 1.0 PB.

Total resources required and planned in all Tier-1 Centres (except CERN) for the first full year of data taking (2008); all data is preliminary:

Resource type   ALICE  ATLAS  CMS   LHCb  Required (1)  Planned (2)  Balance
CPU (MSI2k)     9.1    16.6   12.6  9.5   47.8          41.4         -13%
Disk (PBytes)   3.0    9.2    8.7   1.3   22.2          10.1         -55%
Tape (PBytes)   3.6    6.0    6.6   0.4   16.6          17.8         +7%

Notes:
1. Requirements will be reviewed by the LHCC in January 2005.
2. Current planning includes estimates of resources for which funding has not yet been secured.

Observations:
1. Global shortfall of Tier-1 CPU (-13%) and disk (-55%)
2. The UK Tier-1 input corresponds to ~40% of global disk and ~10% of global CPU
3. UK Tier-2 CPU and disk resources are significant
4. Rapid physics-analysis turnaround is a necessity
5. The priority is to ensure that ALL required software (experiment, middleware, OS) is routinely deployed on this hardware well before 2008
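The Balance column and the UK-share observations can be reproduced directly from the tables above; a sketch (all figures preliminary, per the slide):

```python
# Balance = (planned - required) / required; UK share = UK Tier-1 / planned.
required = {"CPU (MSI2k)": 47.8, "disk (PB)": 22.2, "tape (PB)": 16.6}
planned  = {"CPU (MSI2k)": 41.4, "disk (PB)": 10.1, "tape (PB)": 17.8}
uk_tier1 = {"CPU (MSI2k)": 4.2,  "disk (PB)": 3.8,  "tape (PB)": 2.3}

for res in required:
    balance = (planned[res] - required[res]) / required[res]
    uk_share = uk_tier1[res] / planned[res]
    print(f"{res}: balance {balance:+.0%}, UK Tier-1 share {uk_share:.0%}")
# CPU: -13%, UK 10%; disk: -55%, UK ~38%; tape: +7%, UK 13%
```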
III. Middleware, Security and Network

M/S/N builds upon UK strengths as part of international development:
• Configuration management
• Storage interfaces
• Network monitoring
• Security
• Information services
• Grid data management

Some support expertise is required in each of these areas in order to maintain the Grid.
II. Application Middleware

GANGA, SAMGrid, Lattice QCD, AliEn, CMS, BaBar, Phenomenology

[Diagram: the SAM (D0) data-handling architecture as an example. Client applications (web, command line, Python and Java codes, D0 framework C++ codes) sit above collective services (request formulator and planner; catalog protocols and catalog manager; significant-event logger; naming service; database manager; SAM resource management; batch systems: LSF, FBS, PBS, Condor; data mover; job services; storage, job, cache and request managers; "Dataset Editor", "Project Master", "Station Master", "File Storage Server", "Stager", "Optimiser"), a connectivity and resource layer (CORBA, UDP; file transfer protocols: ftp, bbftp, rcp, GridFTP; mass storage system protocols, e.g. encp, HPSS), authentication and security (GSI; SAM-specific user, group, node and station registration; bbftp 'cookie'), and the fabric (tape and disk storage elements, compute elements, LANs and WANs, resource and services catalog, replica catalog, metadata catalog, code repository). Names in quotes are SAM-given software component names; marked components will be replaced or enhanced using PPDG and Grid tools.]

Some support expertise is required in each of these areas in order to maintain the Grid applications. e-Infrastructure portals need to be developed for new experiments starting up in the exploitation era.
ATLAS UK e-science forward look (Roger Jones)

Current core and infrastructure activities:
• Run-time testing and validation framework; tracking and trigger instantiations
• Provision of ATLAS distributed analysis & production tools
• Production management
• GANGA development
• Metadata development
• ATLFast simulation
• ATLANTIS event display
• Physics software tools
~11 FTEs, mainly ATLAS e-science with some GridPP & HEFCE funding.

Current tracking and trigger e-science:
• Alignment effort: ~6 FTEs
• Core software: ~2.5 FTEs
• Tracking tools: ~6 FTEs
• Trigger: ~2 FTEs

Both areas will move from development to optimisation & maintenance. The current e-science funding will only take us (at best) to first data; expertise is required for the real-world problems and for maintenance. Note that for the HLT, installation and commissioning will continue into the running period because of staging.

Need ~15 FTE (beyond the existing rolling grant) in 2007-9: continued e-science/GridPP support.
CMS UK e-science forward look (Dave Newbold)

NB: these are 'first look' estimates; they will inevitably change as we approach running.

Work area (funding line)                    FTEs now  FTEs 2007-9                        FTEs 2009-
Computing system / support (e-science WP1)  2.0       3.0 (ramp up for running phase)    3.0 (steady state)
Monitoring / DQM (e-science WP3)            2.5       2.0 (initial running)              1.5 (support/maintenance)
Tracker software (e-science WP4)            2.0       1.5 (initial deployment/running)   1.0 (support/maintenance)
ECAL software (e-science WP5)               2.0       1.5 (initial running)              1.0 (support/maintenance)
Data management (GridPP2)                   1.5       1.5 (final deployment/support)     1.5 (support/maintenance)
Analysis system (GridPP2)                   1.5       1.0 (final deployment/support)     1.0 (support/maintenance)
Total                                       11.5      10.5                               9.0

Work area descriptions:
• Computing system / support: development and tuning of the computing model and system; management; user support for T1/T2 centres (globally); liaison with LCG operations
• Monitoring / DQM: online data gathering and 'expert systems' for the CMS tracker and trigger
• Tracker / ECAL software: installation and calibration support; low-level reconstruction codes
• Data management: PhEDEx system for bulk offline data movement and tracking; system-level metadata; movement of HLT farm data online (new area)
• Analysis system: CMS-specific parts of the distributed analysis system on LCG

Need ~9 FTE (beyond the existing rolling grant) in 2007-9: continued e-science/GridPP support.
LHCb UK e-science forward look (Nick Brook)

Current core activities:
• GANGA development
• Provision of DIRAC & production tools
• Development of the conditions DB
• The production bookkeeping DB
• Data management & metadata
• Tracking
• Data challenge production manager
~10 FTEs, mainly GridPP, e-science and studentships, with some HEFCE support.

This will move from development to a maintenance phase; the UK pro rata share of LHCb core computing activities is ~5 FTEs.

Current RICH & VELO e-science:
• RICH: the UK provides the bulk of the RICH software team, including the software coordinator; ~7 FTEs, about 50:50 e-science funding and rolling grant/HEFCE
• VELO: the UK provides the bulk of the VELO software team, including the software coordinator; ~4 FTEs, about 50:50 e-science funding and rolling grant/HEFCE
• ALL essential alignment activities for both detectors run through e-science funding
• These will move from development to maintenance and operational alignment: ~3 FTEs for alignment in 2007-9

Need ~9 FTE (core + alignment + UK support) in 2007-9: continued e-science support.
Grid and e-Science funding requirements

• Priorities in the context of a financial snapshot in 2008: Grid (£5.6m p.a.) and e-Science (£2.7m p.a.)
• Assumes no GridPP project management
• Savings?
  – EGEE Phase 2 (2006-08) may contribute
  – The UK e-Science context is: 1. NGS (National Grid Service); 2. OMII (Open Middleware Infrastructure Institute); 3. DCC (Digital Curation Centre)
• Timeline?

[Pie chart 1, Grid funding breakdown: LCG 9%; Tier-1 18% and 20%; Tier-2 22% and 2%; Applications 9%; M/S/N 8%; Operations 7% and 2%; Travel 3%.]
[Pie chart 2, e-Science funding breakdown: ATLAS 37%, CMS 19%, LHCb 19%, New Expts. 19%, ALICE 2%, Travel 4%.]

To be compared with the Road Map. Not a bid: preliminary input.
Grid and e-Science Exploitation Timeline?

• PPAP initial input: Oct 2004
• Science Committee initial input
• PPARC call assessment (2007-2010): 2005
• Science Committee outcome: Oct 2005
• PPARC call: Jan 2006
• PPARC close of call: May 2006
• Assessment: Jun-Dec 2006
• PPARC outcome: Dec 2006
• Institute recruitment/retention: Jan-Aug 2007
• Grid and e-Science exploitation: Sep 2007 - ….
• Note: if the assessment from PPARC internal planning differs significantly from this preliminary advice from PPAP and SC, then earlier planning is required.
Summary

1. What has been achieved in GridPP1?
   • Widely recognised as successful at many levels
2. What is being attempted in GridPP2?
   • Prototype to production: typically the most difficult phase
   • The UK should invest further in LCG Phase 2
3. What resources are needed for Grid and e-Science in the medium-long term?
   • Current Road Map: ~£6m p.a.
   • Resources needed in 2008 estimated at £8.3m
   • Timeline for decision-making outlined..