“GridPP – Project Elements”

Tony Doyle, University of Glasgow

UK e-Science All Hands Conference, Sheffield, 3 September 2002


Outline: GridPP – Project Elements

• Who are we?

• Motivation

• Overview

• Project Map

1. CERN

2. DataGrid

3. Applications

4. Infrastructure

5. Interoperability

6. Dissemination

7. Finances

• Achievements and Issues

• Summary


Who are we?

Nick White /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Nick White member
Roger Jones /O=Grid/O=UKHEP/OU=lancs.ac.uk/CN=Roger Jones member
Sabah Salih /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Sabah Salih member
Santanu Das /O=Grid/O=UKHEP/OU=hep.phy.cam.ac.uk/CN=Santanu Das member
Tony Cass /O=Grid/O=CERN/OU=cern.ch/CN=Tony Cass member
David Kelsey /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=David Kelsey member
Henry Nebrensky /O=Grid/O=UKHEP/OU=brunel.ac.uk/CN=Henry Nebrensky member
Paul Kyberd /O=Grid/O=UKHEP/OU=brunel.ac.uk/CN=Paul Kyberd member
Peter Hobson /O=Grid/O=UKHEP/OU=brunel.ac.uk/CN=Peter R Hobson member
Robin Middleton /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=Robin Middleton member
Alexander Holt /O=Grid/O=UKHEP/OU=ph.ed.ac.uk/CN=Alexander Holt member
Alasdair Earl /O=Grid/O=UKHEP/OU=ph.ed.ac.uk/CN=Alasdair Earl member
Akram Khan /O=Grid/O=UKHEP/OU=ph.ed.ac.uk/CN=Akram Khan member
Stephen Burke /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=Stephen Burke member
Paul Millar /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Paul Millar member
Andy Parker /O=Grid/O=UKHEP/OU=hep.phy.cam.ac.uk/CN=M.A.Parker member
Neville Harnew /O=Grid/O=UKHEP/OU=physics.ox.ac.uk/CN=Neville Harnew member
Pete Watkins /O=Grid/O=UKHEP/OU=ph.bham.ac.uk/CN=Peter Watkins member
Owen Maroney /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Owen Maroney member
Alex Finch /O=Grid/O=UKHEP/OU=lancs.ac.uk/CN=Alex Finch member
Antony Wilson /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=Antony Wilson member
Tim Folkes /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Tim Folkes member
Stan Thompson /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=A. Stan Thompson member
Mark Hayes /O=Grid/O=UKHEP/OU=amtp.cam.ac.uk/CN=Mark Hayes member
Todd Huffman /O=Grid/O=UKHEP/OU=physics.ox.ac.uk/CN=B. Todd Huffman member
Glenn Patrick /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=G N Patrick member
Pete Gronbech /O=Grid/O=UKHEP/OU=physics.ox.ac.uk/CN=Pete Gronbech member
Nick Brook /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Nick Brook member
Marc Kelly /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Marc Kelly member
Dave Newbold /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Dave Newbold member
Kate Mackay /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Catherine Mackay member
Girish Patel /O=Grid/O=UKHEP/OU=ph.liv.ac.uk/CN=Girish D. Patel member
David Martin /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=David J. Martin member
Peter Faulkner /O=Grid/O=UKHEP/OU=ph.bham.ac.uk/CN=Peter Faulkner member
David Smith /O=Grid/O=UKHEP/OU=ph.bham.ac.uk/CN=David Smith member
Steve Traylen /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Steve Traylen member
Ruth Dixon del Tufo /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Ruth Dixon del Tufo member
Linda Cornwall /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Linda Cornwall member
Yee-Ting Li /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Yee-Ting Li member
Paul D. Mealor /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Paul D Mealor member
Paul A Crosby /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Paul A Crosby member
David Waters /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=David Waters member
Bob Cranfield /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Bob Cranfield member
Ben West /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Ben West member
Rod Walker /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Rod Walker member
Philip Lewis /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Philip Lewis member
Dave Colling /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Dr D J Colling member
Alex Howard /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Alex Howard member
Roger Barlow /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Roger Barlow member
Joe Foster /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Joe Foster member
Alessandra Forti /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Alessandra Forti member
Peter Clarke /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Peter Clarke member
Andrew Sansum /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Andrew Sansum member
John Gordon /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=John Gordon member
Andrew McNab /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Andrew McNab member
Richard Hughes-Jones /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Richard Hughes-Jones member
Gavin McCance /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Gavin McCance member
Tony Doyle /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Tony Doyle admin
Alex Martin /O=Grid/O=UKHEP/OU=ph.qmw.ac.uk/CN=A.J.Martin member
Steve Lloyd /O=Grid/O=UKHEP/OU=ph.qmw.ac.uk/CN=S.L.Lloyd admin
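Each entry above is an X.509 distinguished name (DN) of the form /O=Grid/O=UKHEP/OU=&lt;site&gt;/CN=&lt;name&gt;. As an illustration of what such a membership list encodes, here is a minimal sketch (illustrative Python, not GridPP tooling) that parses DNs and groups members by site:

```python
# Sketch: parse X.509 DNs like those above and group members by site (OU).
# The three DNs are taken from the list above; the code is illustrative only.
from collections import defaultdict

def parse_dn(dn):
    """Split '/O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Tony Doyle' into fields."""
    fields = defaultdict(list)
    for part in dn.strip("/").split("/"):
        key, _, value = part.partition("=")
        fields[key].append(value)   # 'O' occurs twice, so keep lists
    return fields

members = [
    "/O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Tony Doyle",
    "/O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=David Kelsey",
    "/O=Grid/O=CERN/OU=cern.ch/CN=Tony Cass",
]

by_site = defaultdict(list)
for dn in members:
    fields = parse_dn(dn)
    by_site[fields["OU"][0]].append(fields["CN"][0])

for site, names in sorted(by_site.items()):
    print(site, "->", ", ".join(names))
```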


Origin of Mass – Rare Phenomenon

(Figure: the Higgs signal lies some 9 orders of magnitude below the rate of all interactions.)


Matter–Antimatter Asymmetry – Complex Interactions

(Figure labels: Design, Simulation, Complexity, Understanding.)


GridPP Overview

A £17m 3-year project funded by PPARC through the e-Science Programme.

EDG – UK contributions: architecture, Testbed-1, network monitoring, certificates & security, Storage Element, R-GMA, LCFG, MDS deployment, GridSite, SlashGrid, Spitfire…

Applications (start-up phase): BaBar, CDF/D0 (SAM), ATLAS/LHCb, CMS, (ALICE), UKQCD

CERN – LCG (start-up phase): funding for staff and hardware…

(Pie chart of the five components – CERN, DataGrid, Tier-1/A, Applications, Operations – with shares £5.67m, £3.78m, £3.66m, £1.99m, £1.88m.)

http://www.gridpp.ac.uk


GridPP Bridge

Provide architecture and middleware.

Future LHC experiments: use the Grid with simulated data.
Running US experiments: use the Grid with real data.

Build Tier-A/prototype Tier-1 and Tier-2 centres in the UK and join the worldwide effort to develop middleware for the experiments.


GridPP Vision

From Web to Grid – Building the next IT Revolution

Premise: The next IT revolution will be the Grid. The Grid is a practical solution to the data-intensive problems that must be overcome if the computing needs of many scientific communities and industry are to be fulfilled over the next decade.

Aim: The GridPP Collaboration aims to develop and deploy a large-scale science Grid in the UK for use by the worldwide particle physics community.

Many challenges… a shared, distributed infrastructure for all applications.


GridPP Organisation


GridPP Project Map – Elements


GridPP Project Map – Metrics and Tasks

Legend (shown against example item 1.1.1): task complete; metric OK; task due next quarter; metric not OK; task overdue; not yet defined; no task or metric. Navigate up / navigate down.

Available from the web pages; provides the structure for this talk.


LHC Computing Challenge

• One bunch crossing every 25 ns
• 100 triggers per second
• Each event is ~1 MByte

(Figure: the tiered computing model. Data leaves the detector at ~PBytes/sec into the Online System, which feeds the Offline Farm (~20 TIPS) at ~100 MBytes/sec; the CERN Computer Centre (>20 TIPS) is Tier 0. Data flows at ~Gbits/sec – or by air freight – to the Tier-1 regional centres (RAL, US, French and Italian), and on at ~Gbits/sec to Tier-2 centres of ~1 TIPS each, e.g. ScotGRID++. Institute servers (Tier 3, ~0.25 TIPS, each with a physics data cache) connect at 100–1000 Mbits/sec; physicists' workstations form Tier 4.)

Physicists work on analysis "channels". Each institute has ~10 physicists working on one or more channels, and the data for these channels should be cached by the institute server.

1 TIPS = 25,000 SpecInt95; a PC (1999) is ~15 SpecInt95.
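A quick sanity check of these numbers, using only the figures quoted on the slide (the yearly total assumes continuous running, so it is an upper bound):

```python
# Back-of-the-envelope check of the slide's numbers (all inputs from the slide).
trigger_rate_hz = 100          # 100 triggers per second
event_size_mb = 1.0            # each event is ~1 MByte
raw_rate_mb_s = trigger_rate_hz * event_size_mb
print(f"Offline data rate: ~{raw_rate_mb_s:.0f} MBytes/sec")   # matches the ~100 MBytes/sec link

tips_specint95 = 25_000        # 1 TIPS = 25,000 SpecInt95
pc_1999_specint95 = 15         # PC (1999) ~ 15 SpecInt95
print(f"1 TIPS is roughly {tips_specint95 / pc_1999_specint95:.0f} 1999-era PCs")  # ~1,667

seconds_per_year = 3600 * 24 * 365
yearly_pb = raw_rate_mb_s * seconds_per_year / 1e9   # 1 PB = 1e9 MB (decimal)
print(f"At that rate, ~{yearly_pb:.1f} PBytes/year of raw data (before duty cycle)")
```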

1. CERN


LHC Computing Grid: High-Level Planning

Timeline (2002–2005, by quarter), spanning applications through "Grid as a Service":
• Prototype of the hybrid event store (persistency framework)
• Hybrid event store available for general users
• Distributed production using grid services
• First global grid service (LCG-1) available
• Distributed end-user interactive analysis
• Full persistency framework
• LCG-1 reliability and performance targets
• "50% prototype" (LCG-3) available
• LHC Global Grid TDR

1. CERN


DataGrid Middleware Work Packages

• Collect requirements for middleware, taking into account the requirements of the application groups
• Survey current technology, for all middleware
• Core services testbed – Testbed 0: Globus (no EDG middleware)
• Grid testbed releases – Testbed 1.2 is the current release

• WP1: workload – job resource specification & scheduling (a toy matching sketch follows after this list)
• WP2: data management – data access, migration & replication
• WP3: grid monitoring services – monitoring infrastructure, directories & presentation tools
• WP4: fabric management – framework for fabric configuration management & automatic software installation
• WP5: mass storage management – common interface for mass storage
• WP7: network services – network services and monitoring

Talk: “GridPP – Developing an Operational Grid”, Dave Colling
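WP1's job resource specification and scheduling amounts to matching a job's stated requirements against the resources sites publish, then ranking the matches. The following toy sketch illustrates the idea only; it is not the EDG resource broker or its JDL, and all names and field values in it are invented:

```python
# Toy sketch of WP1-style matchmaking: select a compute element that satisfies
# a job's requirements, then rank the candidates. Not EDG code.
compute_elements = [
    {"site": "RAL",     "free_cpus": 120, "os": "RH6.2", "max_wall_hours": 48},
    {"site": "Glasgow", "free_cpus": 10,  "os": "RH6.2", "max_wall_hours": 24},
    {"site": "CERN",    "free_cpus": 300, "os": "RH7.2", "max_wall_hours": 72},
]

job = {"os": "RH6.2", "cpus": 20, "wall_hours": 12}   # hypothetical requirements

def matches(ce, job):
    """Requirement check: OS matches, enough free CPUs, long enough wall time."""
    return (ce["os"] == job["os"]
            and ce["free_cpus"] >= job["cpus"]
            and ce["max_wall_hours"] >= job["wall_hours"])

candidates = [ce for ce in compute_elements if matches(ce, job)]
# Rank: prefer the most free CPUs, a stand-in for the broker's rank expression.
best = max(candidates, key=lambda ce: ce["free_cpus"], default=None)
print("Submit to:", best["site"] if best else "no matching resource")
```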

2. DataGrid


EDG TestBed 1 Status – 30 Aug 2002 17:38

Web interface showing status of (~400) servers at testbed 1 sites

Production Centres


GridPP Sites in Testbed: Status 30 Aug 2002 17:38

(Project Map: software releases at each site.)


A CMS Data Grid Job – the 2003 CMS data grid system vision

Demo: current status of the system vision

3. Applications


ATLAS/LHCb Architecture

(Figure: the Gaudi component architecture. An Application Manager coordinates Algorithms; Algorithms work on a Transient Event Store, a Transient Detector Store and a Transient Histogram Store, served by the Event Data Service, Detector Data Service and Histogram Service respectively. Each transient store is backed by a Persistency Service whose Converters move data to and from Data Files. Supporting services: Message Service, JobOptions Service, Particle Properties Service and other services.)

The Gaudi framework – developed by LHCb, adopted by ATLAS (Athena)
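The essence of the pattern is that algorithms see only transient stores and services, never the persistent files; converters and the persistency service stand between. A toy illustration of that separation (illustrative Python; Gaudi itself is a C++ framework and these class names are simplifications):

```python
# Toy sketch of the Gaudi pattern described above: algorithms never touch
# persistent storage directly; a persistency service (standing in for the
# converter + data-file machinery) populates a transient store. Names invented.
class TransientEventStore:
    def __init__(self):
        self._objects = {}
    def put(self, key, obj):
        self._objects[key] = obj
    def get(self, key):
        return self._objects[key]

class PersistencyService:
    """Stands in for converters reading data files into the transient store."""
    def load(self, store, key):
        store.put(key, {"tracks": 42})   # pretend this was converted from a file

class TrackCounter:
    """An 'Algorithm': sees only the transient store, never files."""
    def execute(self, store):
        event = store.get("/Event/Rec")
        print("tracks in event:", event["tracks"])

store = TransientEventStore()
PersistencyService().load(store, "/Event/Rec")
TrackCounter().execute(store)
```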

3. Applications


GANGA: Gaudi ANd Grid Alliance

(Figure: the GANGA GUI sits between the GAUDI program – job options and algorithms – and the collective & resource Grid services; histograms, monitoring and results flow back to the user.)

Making the Grid work for the experiments

3. Applications


Overview of SAM

(Figure: SAM station architecture. Globally shared: database server(s) holding the central database, a name server, global resource manager(s) and a log server. Locally shared: mass storage system(s). Each of Station 1…n runs its own station servers. Arrows indicate control and data flow.)

3. Applications


Overview of SAM

(Figure: the SAM layered architecture.)
• Fabric: tape storage elements, disk storage elements, compute elements, LANs and WANs, a code repository and client applications, with a request formulator and planner
• Connectivity and resource layer: CORBA, UDP, file-transfer protocols (ftp, bbftp, rcp, GridFTP), mass-storage-system protocols (e.g. encp, HPSS)
• Authentication and security: GSI, SAM-specific user/group/node/station registration, bbftp "cookie"
• Collective services: catalog protocols, significant-event logger, naming service, database manager, catalog manager; resource and services catalog, replica catalog, metadata catalog
• SAM resource management: batch systems (LSF, FBS, PBS, Condor), data mover, job services; storage manager, job manager, cache manager, request manager
• User interfaces: web, Python and Java codes, command line, D0 framework C++ codes
• Names in quotes are SAM-given software component names: "Dataset Editor", "File Storage Server", "Project Master", "Station Master", "Stager", "Optimiser"
• Shading indicates components that will be replaced, or added/enhanced using PPDG and Grid tools

SAM and DataGrid share common (lower) middleware.

3. Applications


BaBar and the Grid

• A running experiment at SLAC (San Francisco), producing many terabytes of useful data (a 500 TByte Objectivity database)
• Computationally intense analysis
• 500+ physicists spread over 72 institutes in 9 countries; 50+ in the UK
• Scale forces a move from central to distributed computing
• A 1/3-size prototype for the LHC experiments: in place, so it must respect existing practice; running, and needing solutions today

3. Applications


Experiment Deployment


Advertisement

Talk: “The Quantum ChromoDynamics Grid”, James Perry

3. Applications


Tier-0 – CERN

Commodity processors + IBM (mirrored) EIDE disks…

Node types: Compute Element (CE), Storage Element (SE), User Interface (UI), Information Node (IN)

Storage systems…

2004 scale: ~1,000 CPUs, ~0.5 PBytes

4. Infrastructure


UK Tier-1 – RAL

New computing farm: 4 racks holding 156 dual 1.4 GHz Pentium III CPUs; each box has 1 GB of memory, a 40 GB internal disk and 100 Mb ethernet.

50 TByte disk-based mass-storage unit (after RAID 5 overhead); PCs are clustered on network switches with up to 8 × 1000 Mb ethernet out of each rack.

Tape robot: upgraded last year; uses 60 GB STK 9940 tapes; 45 TB current capacity; could hold 330 TB.

2004 scale: 1,000 CPUs, 0.5 PBytes
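Two consequences of these figures, derived with decimal units (the CPU and tape counts are inferred from the quoted numbers, not quoted themselves):

```python
# Arithmetic on the slide's Tier-1 figures (inputs from the slide).
boxes = 156                      # dual-CPU boxes across 4 racks
print(f"CPU farm: {boxes * 2} processors")                        # 312 CPUs

tape_gb = 60                     # STK 9940 tape capacity
robot_now_tb, robot_max_tb = 45, 330
print(f"Tapes mounted now:   ~{robot_now_tb * 1000 // tape_gb}")  # ~750 tapes
print(f"Tapes at full robot: ~{robot_max_tb * 1000 // tape_gb}")  # ~5,500 tapes
```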

4. Infrastructure


Network

• Internal networking is currently a hybrid of: 100 Mbps to the nodes of the CPU farms, 1 Gb to disk servers, 1 Gb to tape servers
• UK: the academic network SuperJANET4 – 2.5 Gb backbone, upgrading to 20 Gb in 2003
• EU: SJ4 has a 2.5 Gb interconnect to Géant
• US: a new 2.5 Gb link to ESnet and Abilene for researchers
• UK involved in networking development: internally with Cisco on QoS, externally with DataTAG
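For a feel of what these link speeds mean for physics data, a rough estimate of transfer time per terabyte (a sketch only; it ignores protocol overhead and contention):

```python
# Rough transfer-time estimate over the link speeds quoted on the slide.
def transfer_hours(data_tb, link_gbps):
    bits = data_tb * 8e12                  # 1 TByte = 8e12 bits (decimal units)
    return bits / (link_gbps * 1e9) / 3600

for gbps in (1.0, 2.5, 20.0):              # disk-server link, SJ4 now, SJ4 after 2003
    print(f"{gbps:4.1f} Gb/s: ~{transfer_hours(1.0, gbps):4.2f} hours per TByte")
```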

4. Infrastructure


UK Tier-2 – ScotGRID

• ScotGrid processing nodes at Glasgow: 59 IBM xSeries 330 (dual 1 GHz Pentium III, 2 GB memory); 2 IBM xSeries 340 (dual 1 GHz Pentium III, 2 GB memory, dual ethernet); 3 IBM xSeries 340 (dual 1 GHz Pentium III, 2 GB memory, 100 + 1000 Mbit/s ethernet); 1 TB disk; LTO/Ultrium tape library; Cisco ethernet switches
• ScotGrid storage at Edinburgh: IBM xSeries 370 (PIII Xeon, 32 × 512 MB RAM); 70 × 73.4 GB IBM FC hot-swap HDDs
• CDF equipment at Glasgow: 8 × 700 MHz Xeon IBM xSeries 370, 4 GB memory, 1 TB disk
• Griddev test rig at Glasgow: 4 × 233 MHz Pentium II
• BaBar UltraGrid system at Edinburgh: 4 UltraSPARC 80 machines in a rack, 450 MHz CPUs each with 4 MB cache, 1 GB memory; Fast Ethernet and Myrinet switching

2004 scale: 300 CPUs, 0.1 PBytes

4. Infrastructure


GridPP Context

5. Interoperability


Grid Issues – Coordination

• The technical part is not the only problem: resource sharing raises sociological problems – a short-term productivity loss for a long-term gain
• The key is communication and coordination between people, centres and countries – worldwide close coordination across multinational collaborations on this scale has never been done before
• We need mechanisms to ensure that all centres are part of a global planning process, in spite of differing conditions of funding, internal planning, timescales etc.
• The Grid organisation mechanisms should be complementary to, not parallel to or in conflict with, the existing experiment organisation: LCG – DataGrid – eSC – GridPP; BaBar, CDF, D0, ALICE, ATLAS, CMS, LHCb, UKQCD
• Local perspective: build upon the existing strong PP links in the UK to build a single Grid for all experiments


Authentication/Authorization

• Authentication (CA Working Group): 11 national certification authorities; policies & procedures → mutual trust; users identified by their CAs' certificates
• Authorization (Authorization Working Group): based on Virtual Organizations (VOs); management tools for LDAP-based membership lists; 6+1 Virtual Organizations

VOs: ALICE, ATLAS, CMS, LHCb, Earth Obs., Biomedical (+ Guidelines)

CAs: CERN, CESNET, CNRS, DataGrid-ES, GridPP, Grid-Ireland, INFN, LIP, NIKHEF, NorduGrid, Russian DataGrid

2. DataGrid / 5. Interoperability – Built In
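In practice, authorization of this kind reduces to checking the DN from a user's certificate against a VO membership list; sites conventionally materialise such lists as a grid-mapfile mapping DNs to local accounts. A minimal sketch (the two DNs are taken from the membership list earlier in this talk; the account names are hypothetical and the code is illustrative, not EDG tooling):

```python
# Minimal sketch: authorize a certificate DN against a grid-mapfile-style
# membership list. The '"<DN>" <local account>' format is the conventional
# one; the account names here are invented.
import re

GRID_MAPFILE = '''\
"/O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Tony Doyle" atlas001
"/O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=David Kelsey" lhcb001
'''

def load_mapfile(text):
    mapping = {}
    for line in text.splitlines():
        m = re.match(r'"([^"]+)"\s+(\S+)', line)
        if m:
            mapping[m.group(1)] = m.group(2)
    return mapping

def authorize(dn, mapping):
    account = mapping.get(dn)
    if account is None:
        raise PermissionError(f"DN not in VO membership list: {dn}")
    return account

users = load_mapfile(GRID_MAPFILE)
print(authorize("/O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Tony Doyle", users))
```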


Current User Base – Grid Support Centre

• The GridPP (UKHEP) CA uses primitive technology: it works, but takes effort; 201 personal certs issued, 119 other certs issued
• The GSC will run a CA for the UK e-Science CA, using OpenCA; the Registration Authority uses the web
• We plan to use it: the namespace identifies the RA, not the project; it covers authentication, not authorisation
• Through the GSC we have access to the skills of the CLRC eSC
• Use the helpdesk to formalise support later in the rollout

UK e-Science Certification Authority

5. Interoperability


Trust Relationships

5. Interoperability


Dissemination: Gödel's Theorem?

6. Dissemination – Project Map elements: 6.1 Presentation of GridPP; 6.2 Participation in related areas; 6.3 Engagement of UK groups. (All items below are due ongoing, with status In Progress as of 17-Jul-02.)

6.1 Presentation of GridPP

• 6.1.1 GridPP Presentations – dissemination of GridPP work via presentations to other physicists, other scientists, industry, or the general public. Metric: number of presentations; threshold: monthly; current total: 12.
• 6.1.2 GridPP Posters – dissemination of GridPP work via posters. Metric: number of posters; threshold: every two months; current total: 6.
• 6.1.3 GridPP Demonstrations – dissemination of GridPP work via demonstrations. Metric: number of demonstrations; threshold: four per year; current total: 4.
• 6.1.4 GridPP Workshops – dissemination of GridPP work via technical workshops. Metric: number of workshops; threshold: four per year; current total: 5.
• 6.1.5 GridPP Publications – dissemination of GridPP work by publications. Metric: number of publications; threshold: two per year; current total: 3.

6.2 Participation in related areas

• 6.2.1 GridPP Representation at Related Events – to promote communication, GridPP members will attend related events. Metric: number of attendances; threshold: forty per year; current total: 44.
• 6.2.2 GridPP Involvement with e-Science Centres – to promote communication, GridPP will endeavour to keep the e-Science centres involved by actively seeking contact. Metric: number of interactions; threshold: each centre every 6 months; current total: 13.
• 6.2.3 GridPP Membership of Related Projects – to promote the work of GridPP, members will belong to related projects and will keep these projects up to date with the work of GridPP. Metric: number of members of related projects; threshold: 12 (depends on the number of related projects); current total: 19.

6.3 Engagement of UK groups

• 6.3.1 External Use of GridPP Products – engagement of the community can be measured by the number of non-GridPP-funded groups who make use of GridPP middleware or technical assistance. Metric: number of external groups who use GridPP middleware; threshold: 1; current total: 1.
• 6.3.2 External Use of GridPP Infrastructure – engagement of the community can be measured by the number of non-GridPP-funded groups who make use of the GridPP infrastructure. Metric: number of external groups who use GridPP hardware; threshold: 1; current total: 1.
• 6.3.3 Press Articles – the engagement of outside interest may be measured by the number of articles about GridPP written by non-members. Metric: number of articles in the scientific or general press; threshold: one per year; current total: 0.
• 6.3.4 GridPP Web-site Hits – the external interest in GridPP can be measured by counting the number of hits on the GridPP website from external sources. Metric: website hit index, defined as number of hits/100; threshold: two (hundred hits?) per month; current total: 23.

6. Dissemination – Project Map Elements


From Grid to Web… using GridSite

(Screenshots at t0 and t1.)

6. Dissemination


£17m++ 3-Year Project

(Pie chart of the five components – CERN, DataGrid, Tier-1/A, Applications, Operations – with shares £5.67m, £3.78m, £3.66m, £1.99m, £1.88m.)

• Five components:
– Tier-1/A = hardware + CLRC ITD support staff
– DataGrid = DataGrid posts + CLRC PPD staff
– Applications = experiment posts
– Operations = travel + management + early investment
– CERN = LCG posts + Tier-0 + LTA

7. Finances


GridPP – Achievements and Issues

1st-year achievements:
• Complete Project Map – applications : middleware : hardware
• Fully integrated with the EU DataGrid and LCG projects
• Rapid middleware deployment and testing
• Integrated US–EU applications development, e.g. BaBar + EDG
• Roll-out document for all sites in the UK (core sites, friendly testers, user only)
• Testbed up and running at 15 sites in the UK
• Tier-1 deployment
• 200 GridPP certificates issued
• First significant use of the Grid by an external user (LISA simulations) in May 2002
• Web page development (GridSite)

Issues for Year 2 (status: 30 Aug 2002 17:38 GMT):
• Monitor and improve testbed deployment efficiency from Sep 1…
• Importance of EU-wide development of middleware
• An integrated testbed for use and testing by all applications
• A common “integration” layer between middleware and application software
• Integrated US–EU applications development
• Tier-1 Grid production mode
• Tier-2 definitions and deployment
• Integrated Tier-1 + Tier-2 testbed
• Transfer to the UK e-Science CA
• Integration with other UK projects, e.g. AstroGrid, MyGrid…


Summary

• Grid success is fundamental for PP.

1. CERN = LCG, Grid as a Service.
2. DataGrid = middleware built upon Globus and Condor-G; Testbed 1 deployed.
3. Applications are complex and need to interface to the middleware. LHC analyses: ongoing feedback and development; other analyses have immediate requirements, integrated using Globus, Condor and EDG tools.
4. Infrastructure = tiered computing down to the physicist's desktop. Scale in the UK: 1 PByte and 2,000 distributed CPUs for GridPP in Sept 2004.
5. Integration = ongoing…
6. Dissemination: co-operation required with other disciplines and industry.
7. Finances: under control.

• Year 1 was a good starting point; the first Grid jobs have been submitted.
• Looking forward to Year 2. Web services ahead…


Holistic View: Multi-layered Issues