
LCG Applications Area Status

Torre Wenaus, BNL/CERN

LCG Applications Area Manager

http://cern.ch/lcg/peb/applications

LHCC Meeting

January 27, 2003


Outline

- Applications area scope and organization
- Architecture
- Personnel and planning
  - Little on planning, since I talked to most of you about it in November
- Status of applications area projects
  - SPI, POOL, SEAL, PI, Math libraries, Simulation
- Relationship to other LCG activity areas
- Conclusion


The LHC Computing Grid Project Structure

[Organization chart: the Project Overview Board oversees the Project Leader. The Software and Computing Committee (SC2) sets requirements and the work plan and monitors progress, aided by RTAGs; the Project Execution Board (PEB) directs the project work packages (WPs) and liaises with the grid projects.]


LCG Areas of Work

- Fabric (Computing System): Physics Data Management, Fabric Management, Physics Data Storage, LAN Management, Wide-area Networking, Security, Internet Services
- Grid Technology: Grid middleware, standard application services layer, inter-project coherence/compatibility
- Physics Applications Software: application software infrastructure (libraries, tools); object persistency and data management tools; common frameworks (Simulation, Analysis, ...); adaptation of physics applications to the Grid environment; Grid tools, portals
- Grid Deployment: Data Challenges, Grid Operations, Network Planning, Regional Centre Coordination, Security & access policy


Applications Area Organization

[Organization diagram: the Apps Area Leader provides overall management, coordination and architecture; Project Leaders run the projects, each composed of work packages run by Work Package Leaders; the Architects Forum connects the experiments to the area. Direct technical collaboration between experiment participants and IT, EP, ROOT and LCG personnel.]


Focus on Experiment Need

- Project structured and managed to ensure a focus on real experiment needs
- SC2/RTAG process to identify, define (need-driven requirements), initiate and monitor common project activities in a way guided by the experiments themselves
- Architects Forum to involve experiment architects in day-to-day project management and execution
- Open information flow and decision making
- Direct participation of experiment developers in the projects
- Tight iterative feedback loop to gather user feedback from frequent releases
- Early deployment and evaluation of LCG software in experiment contexts
- Success defined by experiment adoption and production deployment


Applications Area Projects

- Software Process and Infrastructure (SPI) (operating – A. Aimar): librarian, QA, testing, developer tools, documentation, training, …
- Persistency Framework (POOL) (operating – D. Duellmann): POOL hybrid ROOT/relational data store
- Mathematical libraries (operating – F. James): math and statistics libraries; GSL etc. as NAG-C replacement; a group in India will work on this (workplan in development)
- Core Tools and Services (SEAL) (operating – P. Mato): foundation and utility libraries, basic framework services, system services, object dictionary and whiteboard, grid-enabled services
- Physics Interfaces (PI) (launched – V. Innocente): interfaces and tools by which physicists directly use the software; interactive (distributed) analysis, visualization, grid portals
- Simulation (launch planning in progress): Geant4, FLUKA, simulation framework, geometry model, …
- Generator Services (launch as part of simulation): generator librarian, support, tool development

Bold: Recent developments (last 3 months)


Project Relationships

[Diagram: the LCG Applications Area comprises Software Process & Infrastructure (SPI), Core Libraries & Services (SEAL), Persistency (POOL), Physicists Interface (PI), Math Libraries, and more; SPI underpins the other projects, which serve the LHC experiments and connect to other LCG projects in other areas.]


[Table: candidate RTAG timeline from March, one row per candidate RTAG with a target quarter between 02Q1 and 04Q2: simulation tools; detector description & model; conditions database; data dictionary; interactive frameworks; statistical analysis; detector & event visualization; physics packages; framework services; C++ class libraries; event processing framework; distributed analysis interfaces; distributed production systems; small-scale persistency; software testing; software distribution; OO language usage; LCG benchmarking suite; online notebooks. Blue: RTAG/activity launched or (light blue) imminent.]


LCG Applications Area Timeline Highlights

[Timeline, 2002–2005, by quarter. LCG milestones: LCG launch week; first Global Grid Service (LCG-1) available; LCG-1 reliability and performance targets; "50% prototype" (LCG-3); LCG TDR. Applications milestones: architectural blueprint complete; POOL V0.1 internal release; POOL first production release; distributed production using grid services; full Persistency Framework; distributed end-user interactive analysis.]


Architecture Blueprint

Report contents:
- Executive summary
- Response of the RTAG to the mandate
- Blueprint scope
- Requirements
- Use of ROOT
- Blueprint architecture design precepts: high-level architectural issues and approaches
- Blueprint architectural elements: specific architectural elements, suggested patterns, examples
- Domain decomposition
- Schedule and resources
- Recommendations

RTAG established in June, with experiment architects and other experts. After 14 meetings and much email: a 36-page final report, accepted by SC2 on October 11.

http://lcgapp.cern.ch/project/blueprint/


Principal architecture requirements

- Long lifetime: support technology evolution
- C++ today; support language evolution
- Seamless distributed operation
- Usability off-network
- Component modularity, public interfaces
- Interchangeability of implementations
- Integration into coherent framework and experiment software
- Design for the end-user's convenience more than the developer's
- Re-use existing implementations
- Software quality at least as good as any LHC experiment
- Meet performance and quality requirements of trigger/DAQ software
- Platforms: Linux/gcc, Linux/icc, Solaris, Windows


Component Model

- Communication via public interfaces
- APIs targeted to end-users, embedding frameworks, internal plug-ins
- Plug-ins (sketched below): logical modules encapsulating a service that can be loaded, activated and unloaded at run time
- Granularity driven by: component replacement criteria; dependency minimization; development team organization
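To make the component/plug-in idea concrete, here is a minimal self-contained C++ sketch of the pattern the blueprint describes. All names (IMessageService, the registry, ConsoleMessageService) are hypothetical illustrations, not LCG APIs; a real plug-in manager would populate the registry by loading shared libraries at run time.

    #include <functional>
    #include <iostream>
    #include <map>
    #include <memory>
    #include <string>

    // Public interface: all communication between components goes through it.
    class IMessageService {
    public:
      virtual ~IMessageService() = default;
      virtual void report(const std::string& msg) = 0;
    };

    // Registry mapping plug-in names to factories.
    using Factory = std::function<std::unique_ptr<IMessageService>()>;
    std::map<std::string, Factory>& registry() {
      static std::map<std::string, Factory> r;
      return r;
    }

    class ConsoleMessageService : public IMessageService {
    public:
      void report(const std::string& msg) override {
        std::cout << "[console] " << msg << "\n";
      }
    };

    int main() {
      // "Loading" the plug-in: register a factory under a name.
      registry()["console"] = [] { return std::make_unique<ConsoleMessageService>(); };

      // Clients select an implementation by name only; replacing the plug-in
      // (e.g. with a file-based reporter) requires no client changes.
      auto svc = registry()["console"]();
      svc->report("component model sketch");
    }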


Software Structure

[Diagram: layered software structure. Applications are built on specialized frameworks (simulation framework, reconstruction framework, visualization framework, other frameworks, …), which rest on the basic framework providing implementation-neutral services. The basic framework uses foundation libraries (STL, ROOT libs, CLHEP, Boost, …) and optional libraries (grid middleware, ROOT, Qt, …).]


Distributed Operation

- Architecture should enable but not require the use of distributed resources via the Grid
- Configuration and control of Grid-based operation via dedicated services
  - Making use of optional grid middleware services at the foundation level of the software structure
  - Insulating higher-level software from the middleware
  - Supporting replaceability
- Apart from these services, Grid-based operation should be largely transparent
- Services should gracefully adapt to 'unplugged' environments
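A minimal sketch of what this insulation could look like, assuming a hypothetical IFileCatalog interface (none of these names are the actual LCG or EDG APIs): higher-level code sees only the abstract catalog, and a local implementation stands in when the grid middleware is unavailable.

    #include <iostream>
    #include <map>
    #include <memory>
    #include <string>

    // Hypothetical abstraction insulating clients from grid middleware.
    class IFileCatalog {
    public:
      virtual ~IFileCatalog() = default;
      // Returns the physical file name for a logical one, or "" if unknown.
      virtual std::string lookup(const std::string& lfn) = 0;
    };

    // Local catalog used in 'unplugged' environments.
    class LocalFileCatalog : public IFileCatalog {
    public:
      std::string lookup(const std::string& lfn) override {
        auto it = entries_.find(lfn);
        return it == entries_.end() ? std::string() : it->second;
      }
    private:
      std::map<std::string, std::string> entries_{
          {"lfn:event-data", "file:/data/local/events.root"}};
    };

    // A grid-backed catalog (e.g. wrapping a replica catalog) would implement
    // the same interface; clients never see which one they got.
    std::unique_ptr<IFileCatalog> makeCatalog(bool gridAvailable) {
      // if (gridAvailable) return std::make_unique<GridFileCatalog>();
      (void)gridAvailable;  // off-network in this sketch
      return std::make_unique<LocalFileCatalog>();
    }

    int main() {
      auto catalog = makeCatalog(false);
      std::cout << catalog->lookup("lfn:event-data") << "\n";
    }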


Managing Objects

- Object Dictionary (sketched below)
  - To query a class about its internal structure
  - Essential for persistency, data browsing, etc.
  - The ROOT team and LCG plan to develop and converge on a common dictionary (common interface and implementation) with an interface anticipating a C++ standard (XTI) (timescale ~1 yr?)
- Object Whiteboard
  - Uniform access to application-defined transient objects, including in the ROOT environment
- Object definition based on C++ header files
  - Used by ATLAS and CMS
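What "query a class about its internal structure" means in practice can be sketched as follows. The API is purely illustrative, not the LCG or ROOT dictionary interface; a real dictionary would be populated automatically from C++ headers rather than by hand.

    #include <cstddef>
    #include <iostream>
    #include <string>
    #include <vector>

    // Hypothetical reflection records: one entry per data member.
    struct Member { std::string name, type; std::size_t offset; };

    struct ClassInfo {
      std::string name;
      std::vector<Member> members;
    };

    // Stand-in for the dictionary lookup; one entry filled in by hand.
    const ClassInfo* lookup(const std::string& className) {
      static const ClassInfo track{
          "Track",
          {{"px", "double", 0}, {"py", "double", 8}, {"charge", "int", 16}}};
      return className == "Track" ? &track : nullptr;
    }

    int main() {
      // Given only a class name, discover its members, as a persistency
      // service or data browser would.
      if (const ClassInfo* ci = lookup("Track")) {
        std::cout << ci->name << " has " << ci->members.size() << " members:\n";
        for (const auto& m : ci->members)
          std::cout << "  " << m.type << " " << m.name
                    << " @offset " << m.offset << "\n";
      }
    }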


Other Architectural Elements

- Python-based Component Bus
  - Plug-in integration of components providing a wide variety of functionality
  - Component interfaces to the bus derived from their C++ interfaces
- Scripting Languages
  - Python and CINT (ROOT) to both be available
  - Access to objects via the object whiteboard in these environments
- Interface to the Grid
  - Must support convenient, efficient configuration of computing elements with all needed components


Domain Decomposition

[Diagram: domain decomposition of LHC application software. Domains shown include event generation (EvtGen), detector simulation (engine), reconstruction (algorithms), calibration, analysis, GUI; interactive services (modeler, fitter, NTuple, scripting, monitor); core services (dictionary, whiteboard, plug-in manager, scheduler, monitor); persistency (store manager, file catalog); grid services; geometry and event model; all above foundation and utility libraries. External products named as examples: ROOT, GEANT4, FLUKA, DataGrid, Python, Qt, MySQL, …]

Products mentioned are examples, not a comprehensive list.

Grey: not in common project scope (also event processing framework, TDAQ).


Use of ROOT in LCG Software

- Among the LHC experiments, ALICE has based its applications directly on ROOT
- The 3 others base their applications on components with implementation-independent interfaces, and look for software that can be encapsulated into these components
- All experiments agree that ROOT is an important element of LHC software: leverage existing software effectively and do not unnecessarily reinvent wheels
- Therefore the blueprint establishes a user/provider relationship between the LCG applications area and ROOT: LCG AA software will make use of ROOT as an external product
- Draws on a great ROOT strength: users are listened to very carefully!
- So far so good: the ROOT team has been very responsive to needs for new and extended functionality coming from POOL


Blueprint RTAG Outcomes

SC2 decided in October:
- The blueprint is accepted
- RTAG recommendations accepted:
  - Start common project on core tools and services
  - Start common project on physics interfaces


Applications Area Personnel Status

- 18 LCG apps hires in place and working; +1 last week, +1 in February
- Manpower ramp is on target (expected to reach 20-23)
- Contributions from UK, Spain, Switzerland, Germany, Sweden, Israel, Portugal, US, India, and Russia
- ~12 FTEs from IT (DB and API groups)
- ~12 FTEs from CERN EP/SFT, experiments
- CERN established a new software group as the EP home of the LCG applications area (EP/SFT)
  - Led by John Harvey
  - Taking shape well; localized in B.32
  - Soon to be augmented by the IT/API staff working in the applications area, who will move to EP/SFT
  - Will improve cohesion, sense of project participation, and technical management effectiveness


Applications Area Personnel Summary

                                                  People   FTEs
    Total LCG hires working                           19   19.0
      Working directly for apps area projects         15   15.0
      ROOT                                             2    2.0
      Grid integration work with experiments           2    2.0
    Contributions from:
      IT/DB                                            3    2.1
      IT/API                                          11    9.7
      EP/SFT + experiments, total                     22   11.9
        Working directly for apps area projects       19    9.9
        Architecture, management                       5    2.0
    Total directly working on apps area projects      48   36.7
    Overall applications area total                   55   42.7

Details at http://lcgapp.cern.ch/project/mgmt/AppManpower.xls


Current Personnel Distribution

[Pie chart: current FTE assignments across POOL, SPI, SEAL, PI, Simu, Gen, Math, ROOT, Grid, Arch and Mgmt; the slice values extracted from the chart are 10.50, 9.90, 3.70, 1.10, 9.50, 0.00, 2.00, 2.00, 2.00, 0.40 and 1.55 FTEs.]

Summary of LCG-funded Resources Used - Estimate to end 2002 (experience-weighted FTE-years)

    CERN special funding:
      Applications                            8.9
        Persistency                           1.8
        Software Process Support              2.0
        Simulation                            1.8
        Root                                  1.4
        Architecture                          0.0
        Grid Interfacing                      1.1
        Training                              0.7
      Fabric                                  5.4
        System Management & Operations        1.6
        Development (e.g. Monitoring)         1.8
        Data Storage Management               0.8
        Grid Security                         1.1
      Grid Technology                         3.0
        Data Management                       2.3
        Grid Gatekeeper                       0.7
      Grid Deployment                         4.6
        Int                                   2.1
        OPS                                   2.5
      Management                              2.1
        LCG                                   2.1
    EU funding:
      DataGrid                                6.9

    Total weighted FTEs in calendar 2002     30.8


Personnel Resources – Required and Available

[Chart: estimate of required effort, in FTEs per quarter from Sep-02 to Mar-05 (y-axis 0 to 60), stacked by project: SPI, Math libraries, Physicist interface, Generator services, Simulation, SEAL, POOL. Blue = available effort. Future estimate based on 20 LCG, 12 IT, 28 EP + experiments. A marker indicates "Now".]


Schedule and Resource Tracking (example)


MS Project Integration – POOL Milestones


Apps area planning materials

- Planning page linked from the applications area page
- Applications area plan spreadsheet: overall project plan
  - http://lcgapp.cern.ch/project/mgmt/AppPlan.xls
  - High-level schedule, personnel resource requirements
- Applications area plan document: overall project plan
  - http://lcgapp.cern.ch/project/mgmt/AppPlan.doc
  - Incomplete draft
- Personnel spreadsheet
  - http://lcgapp.cern.ch/project/mgmt/AppManpower.xls
  - Currently active/foreseen apps area personnel, activities
- WBS, milestones, assigned personnel resources
  - http://atlassw1.phy.bnl.gov/Planning/lcgPlanning.html


L1 Milestones (1)


L1 Milestones (2)


Software Process and Infrastructure Project

General services for software projects:
a. Provide general services needed by each project: CVS repository, web site, software library; mailing lists, bug reports, collaborative facilities
b. Provide components specific to the software phases: tools, templates, training, examples, etc.

[Diagram: software development support across the lifecycle phases: planning, specifications, analysis and design, development, debugging and testing, release, deployment and installation, …]

Alberto Aimar, CERN IT/API


SPI Services

- CVS repositories
  - One repository per project
  - Standard repository structure and #include conventions
  - Will eventually move to the IT CVS service when it is proven
- AFS delivery area, software library
  - /afs/cern.ch/sw/lcg
  - Installations of LCG-developed and external software
  - LCG Software Library 'toolsmith' started in December
- Build servers
  - Machines with the needed Linux and Solaris configurations
- Project portal (similar to SourceForge): http://lcgappdev.cern.ch
  - Very nice new system using Savannah (savannah.gnu.org)
  - Used by POOL, SEAL, SPI, CMS, …
  - Bug tracking, project news, FAQ, mailing lists, download area, CVS access, …


SPI Components

- Code documentation, browsing: Doxygen, LXR, ViewCVS
- Testing frameworks: CppUnit (sketched below), Oval
- Memory leaks: Valgrind
- Automatic builds: NICOS (ATLAS)
- Coding and design guidelines: RuleChecker
- Standard CVS organization
- Configuration management: SCRAM
  - Acceptance of the SCRAM decision shows 'the system works'
- Project workbook

All components and services should be in place mid-February. Missing at this point:
- Nightly build system (being integrated with POOL for testing)
- Software library (prototype being set up now)
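For context, the CppUnit framework adopted here is used along these lines (a minimal sketch; the fixture and test names are invented for illustration):

    #include <cppunit/extensions/HelperMacros.h>
    #include <cppunit/extensions/TestFactoryRegistry.h>
    #include <cppunit/ui/text/TestRunner.h>
    #include <vector>

    // Illustrative fixture; VectorTest and testPushBack are invented names.
    class VectorTest : public CppUnit::TestFixture {
      CPPUNIT_TEST_SUITE(VectorTest);
      CPPUNIT_TEST(testPushBack);
      CPPUNIT_TEST_SUITE_END();
    public:
      void testPushBack() {
        std::vector<int> v;
        v.push_back(42);
        CPPUNIT_ASSERT_EQUAL(std::vector<int>::size_type(1), v.size());
        CPPUNIT_ASSERT_EQUAL(42, v[0]);
      }
    };
    CPPUNIT_TEST_SUITE_REGISTRATION(VectorTest);

    int main() {
      // Collect all registered suites and run them, reporting to stdout.
      CppUnit::TextUi::TestRunner runner;
      runner.addTest(CppUnit::TestFactoryRegistry::getRegistry().makeTest());
      return runner.run() ? 0 : 1;
    }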


POOL Project

- Pool of persistent objects for LHC, targeted at event data but not excluding other data
- Hybrid technology approach
  - Object-level data storage using a file-based object store (ROOT)
  - RDBMS for metadata: file catalogs, object collections, etc. (MySQL)
- Leverages existing ROOT I/O technology and adds value
  - Transparent cross-file and cross-technology object navigation (sketched below)
  - RDBMS integration
  - Integration with Grid technology (e.g. EDG/Globus replica catalog)
  - Network- and grid-decoupled working modes
- Follows and exemplifies the LCG blueprint approach
  - Components with well-defined responsibilities
  - Communicating via public component interfaces
  - Implementation-technology neutral

Dirk Duellmann, CERN IT/DB
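The flavor of "transparent navigation" can be sketched as follows. This is a toy illustration, not POOL's actual API: the reference type, token format and in-memory "store" are all invented; in POOL the token would be resolved through the file catalog and ROOT I/O.

    #include <iostream>
    #include <map>
    #include <memory>
    #include <string>

    struct Track { double pt; };

    // Stand-in for the file catalog plus storage manager: maps technology-
    // neutral tokens to stored objects (only Track objects in this sketch).
    std::map<std::string, Track>& store() {
      static std::map<std::string, Track> s{{"db:file1#track42", Track{21.5}}};
      return s;
    }

    // Smart reference: holds a token and loads the object lazily on first
    // dereference, so navigation across files is invisible to the caller.
    template <typename T>
    class Ref {
    public:
      explicit Ref(std::string token) : token_(std::move(token)) {}
      T* operator->() {
        if (!obj_) obj_ = std::make_unique<T>(store().at(token_));  // lazy load
        return obj_.get();
      }
    private:
      std::string token_;
      std::unique_ptr<T> obj_;
    };

    int main() {
      Ref<Track> track("db:file1#track42");
      std::cout << "pt = " << track->pt << "\n";  // object loaded here
    }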


POOL Release Schedule

- End September – V0.1 (released Oct 2)
  - All core components for navigation exist and interoperate
  - Assumes ROOT object (TObject) on read and write
- End October – V0.2 (released Nov 15)
  - First collection implementation
- End November – V0.3 (released Dec 18)
  - First public release
  - EDG/Globus FileCatalog integrated
  - Persistency for general C++ classes (not instrumented by ROOT), but very limited: elementary types only
  - Event metadata annotation and query
- End February – V0.4
  - Persistency of more complex objects, e.g. with STL containers
  - Support object descriptions from C++ header files (gcc-xml)
- June 2003 – Production release

The principal apps area milestone defined in the March LCG launch: hybrid prototype by year end.


Dictionary: Reflection / Population / Conversion

[Diagram: dictionary data flow. C++ headers (.h) are fed to ROOTCINT, producing CINT-generated dictionary code, and, via GCC-XML (in progress), to XML descriptions (.xml, also connected to GOD and other clients) that populate the LCG dictionary with Dict-generated code. The dictionary supports reflection, population and conversion; an LCG-to-CINT dictionary gateway links it to CINT, and a DictStreamer couples it to ROOT I/O. New in POOL 0.3.]


POOL Milestones


Core Libraries and Services (SEAL) Project

- Launched in October
- 6-member (~3 FTE) team initially; build up to 8 FTEs by the summer, with growth mainly from experiment contributions
- Scope:
  - Foundation, utility libraries
  - Basic framework services
  - Object dictionary (taken over from POOL)
  - Grid-enabled services
- Purpose:
  - Provide a coherent and complete set of core classes and services in conformance with the overall architectural vision (Blueprint RTAG)
  - Facilitate the integration of LCG and non-LCG software to build coherent applications
  - Avoid duplication of software; use community standards
- Areas of immediate relevance to POOL given priority
- Users are software developers in other projects and the experiments

Pere Mato, CERN EP/SFT/LHCb


SEAL Work Packages

- Foundation and utility libraries
  - Boost, CLHEP, experiment code, complementary in-house development
  - Participation in the CLHEP workshop this week
- Component model and plug-in manager
  - The core expression in code of the component architecture described in the blueprint; mainly in-house development
- LCG object dictionary
  - C++ class reflection, dictionary population from C++ headers, ROOT gateways, Python binding
- Basic framework services
  - Object whiteboard, message reporting, component configuration, 'event' management
- Scripting services
  - Python bindings for LCG services, ROOT
- Grid services
  - Common interface to middleware
- Education and documentation
  - Assisting experiments with integration


SEAL Schedule

- Jan 2003 – Initial work plan delivered to SC2 on Jan 10th and approved, including the contents of version v1 alpha
- March 2003 – V1 alpha
  - Essential elements with sufficient functionality for the other existing LCG projects (POOL, …)
  - Frequent internal releases (monthly?)
- June 2003 – V1 beta
  - Complete list of the currently proposed elements, implemented with sufficient functionality to be adopted by experiments
- June 2003 – Grid-enabled services defined
  - The SEAL services which must be grid-enabled are defined and their implementation prioritized


Estimation of Needed Resources for SEAL

    WBS  Work package                          FTE (available / required)
    1    Foundation and utility libraries      0.5 / 1.0
    2    Component model and plug-in manager   0.5 / 0.5
    3    LCG object dictionary                 0.5 / 1.5
    4    Basic framework services              0.5 / 1.0
    5    Scripting services                    0.5 / 1.0
    6    Grid services                         0.0 / 1.5
    7    Education and documentation           0.5 / 1.5
         Total                                 3.0 / 8.0

Current resources should be sufficient for v1 alpha (March).


Math Libraries Project

- Many different libraries in use
  - General purpose (NAG-C, GSL, …)
  - HEP-specific (CLHEP, ZOOM, ROOT)
  - Modern libraries dealing with matrices and vectors (Blitz++, Boost, …)
- Financial considerations: can we replace NAG with open source? RTAG: yes
- Do a comparative evaluation of NAG-C and GSL (see the sketch below)
  - Collect information on what is used/needed
  - Evaluate functionality and performance
- HEP-specific libraries expected to evolve to meet future needs

Fred James, CERN EP/SFT
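To illustrate the sort of NAG-C functionality GSL would have to cover, here is a small sketch using real GSL calls (the particular functions and integrand are chosen arbitrarily for illustration):

    #include <cmath>
    #include <cstdio>
    #include <gsl/gsl_integration.h>
    #include <gsl/gsl_sf_bessel.h>

    // An arbitrary integrand for the quadrature example below.
    double gaussian(double x, void*) { return std::exp(-x * x); }

    int main() {
      // Special functions: Bessel J0(5), the kind of call NAG-C provides.
      std::printf("J0(5) = %.10f\n", gsl_sf_bessel_J0(5.0));

      // Adaptive quadrature of exp(-x^2) over [0, 1].
      gsl_integration_workspace* w = gsl_integration_workspace_alloc(1000);
      gsl_function f;
      f.function = &gaussian;
      f.params = nullptr;
      double result = 0.0, abserr = 0.0;
      gsl_integration_qags(&f, 0.0, 1.0, 1e-10, 1e-10, 1000, w, &result, &abserr);
      gsl_integration_workspace_free(w);
      std::printf("integral = %.10f (+/- %.1e)\n", result, abserr);
      return 0;
    }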


Math library recommendations & status

- Establish a support group to provide advice and information about existing libraries, and to identify and develop new functionality
  - Group established in October
- Which libraries and modules are in use by the experiments?
  - Little response from the experiments to requests for information; the group in India is scanning experiment code to determine usage
- A detailed study should be undertaken to assess needed functionality and how to provide it, particularly via free libraries such as GSL
  - The group in India is undertaking this study
  - Initial plan of work developed with Fred James in December
  - Targets completion of the first round of GSL enhancements for April, based on a priority needs assessment
- Work plan needs to be presented to the SC2 soon
  - Scheduled tomorrow, but will be late


Physicist Interface (PI) Project

- Interfaces and tools by which physicists will directly use the software
- Planned scope:
  - Interactive environment: the physicist's desktop
  - Analysis tools
  - Visualization
  - Distributed analysis, grid portals
- Currently developing plans and trying to clarify the grid area
  - Completed a survey of the experiments on their needs and interests
  - Talking also to grid projects, other apps area projects, ROOT, …
- Initial workplan proposal will be presented to the PEB and SC2 this week
- Will plan inception workshops for the identified work areas
  - Identify contributors, partners, users; deliverables and schedules; personnel assignments

Vincenzo Innocente, CERN EP/SFT/CMS


Proposed Near Term Work Items for PI

- Abstract interface to analysis services (see the sketch after this list)
  - GEANT4 and some experiments do not wish to depend on a specific implementation
  - One implementation must be ROOT
  - Request for a coherent LCG analysis tool-set
- Interactive analysis environment
  - Access to experiment objects (event data, algorithms, etc.)
  - Access to high-level POOL services (collections, metadata)
  - Transparent use of LCG and grid services, with the possibility to 'expose' them for debugging and detailed monitoring
  - GUI (point & click) and scripting interface
- Interactive event and detector visualization
  - Integrated with the analysis environment
  - Offering a large palette of 2D and 3D rendering
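A minimal sketch of what an abstract analysis-service interface might look like, with a trivial in-memory backend standing in for a ROOT-based one. All names here (IHistogram1D, IHistogramFactory, and the Simple* classes) are hypothetical, not an agreed LCG interface:

    #include <iostream>
    #include <memory>
    #include <string>

    // Clients code against these interfaces only, so the concrete backend
    // (ROOT or another tool) can be swapped without touching user code.
    class IHistogram1D {
    public:
      virtual ~IHistogram1D() = default;
      virtual void fill(double x, double weight = 1.0) = 0;
      virtual double mean() const = 0;
    };

    class IHistogramFactory {
    public:
      virtual ~IHistogramFactory() = default;
      virtual std::unique_ptr<IHistogram1D>
      create1D(const std::string& title, int bins, double lo, double hi) = 0;
    };

    // Trivial in-memory backend; a ROOT-based one would delegate to TH1D
    // behind the same interface.
    class SimpleHistogram : public IHistogram1D {
    public:
      void fill(double x, double w = 1.0) override { sum_ += w * x; sumw_ += w; }
      double mean() const override { return sumw_ > 0 ? sum_ / sumw_ : 0.0; }
    private:
      double sum_ = 0.0, sumw_ = 0.0;
    };

    class SimpleFactory : public IHistogramFactory {
    public:
      std::unique_ptr<IHistogram1D>
      create1D(const std::string&, int, double, double) override {
        return std::make_unique<SimpleHistogram>();
      }
    };

    int main() {
      SimpleFactory factory;
      auto h = factory.create1D("pt", 100, 0.0, 50.0);
      h->fill(10.0);
      h->fill(20.0);
      std::cout << "mean = " << h->mean() << "\n";  // prints 15
    }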


Simulation Project

- Mandated by SC2 in December to initiate the simulation project, following the RTAG recommendations
- Project being organized now
  - Discussions with the experiments, G4, FLUKA, ROOT, … on organization and roles in progress
  - Probable that I will lead the overall project during a startup period, working with a slate of strong subproject leaders
- Scope (tentative subprojects):
  - Generic simulation framework (see the sketch below): multiple simulation engine support, geometry model, generator interface, MC truth, user actions, user interfaces, average 'GEANE' tracking, utilities; ALICE virtual MC as starting point if it meets requirements
  - Geant4 development and integration
  - FLUKA integration
  - Physics validation: simulation test and benchmark suite
  - Fast (shower) parameterisation
  - Generator services
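The "multiple simulation engine support" idea can be sketched as follows; the interface is invented for illustration (the ALICE virtual MC plays this role in practice, with a much richer API):

    #include <iostream>
    #include <memory>
    #include <string>

    // Hypothetical engine-neutral interface: geometry, MC truth and user
    // actions are written once against it; engine adapters plug in below.
    class IVirtualEngine {
    public:
      virtual ~IVirtualEngine() = default;
      virtual void processEvent(int eventNumber) = 0;
      virtual std::string name() const = 0;
    };

    class Geant4Engine : public IVirtualEngine {  // would wrap Geant4
    public:
      void processEvent(int n) override {
        std::cout << "Geant4 transporting event " << n << "\n";
      }
      std::string name() const override { return "Geant4"; }
    };

    // A FlukaEngine would implement the same interface; the framework and
    // the experiment code stay engine-neutral.
    int main() {
      std::unique_ptr<IVirtualEngine> engine = std::make_unique<Geant4Engine>();
      engine->processEvent(1);
    }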


Collaborations

- The LCG apps area needs to collaborate well with projects broader than the LHC
  - See that LHC requirements are met, and provide help and support targeted at LHC priorities, while being good collaborators (e.g. not assimilators)
  - e.g. CLHEP: discussions at the workshop this week
  - e.g. Geant4 in the context of the simulation project
- We also welcome collaborators on LCG projects
  - e.g. (renewed?) interest from BaBar in POOL
- We also depend on the other LHC activity areas
  - Grid Technology: ensuring that the needed middleware is/will be there, tested, selected and of production grade; AA distributed software will be robust and usable only if the grid middleware it uses is so
  - Grid Deployment: AA software deployment
  - Fabrics: CERN infrastructure, AA development and test facilities


A note on my own time

- Nearing the one-year mark of doing the apps area leader job with 75% of my time, resident at CERN
- For the other 25% I am an ATLAS/BNL person in the US (~1 week/month)
- Working well, and sustainable (the LCG job, I mean); from my perspective at least!
- I will be lightening my ATLAS/US load, which will be welcome
  - Expect to hand over the US ATLAS Software Manager job to a highly capable person in a few days or weeks
  - Expect to hand over the ATLAS Planning Officer job shortly to someone else
  - Remaining formal US responsibility is BNL Physics Applications Software Group Leader, for which I have a strong deputy (David Adams)


Concluding Remarks

- Expected area scope essentially covered by the projects now defined
- Manpower is in quite good shape
- Buy-in by the experiments is good: direct participation in leadership, development, prompt testing and evaluation, RTAGs
- EP/SFT group taking shape well as a CERN hub
- Participants remote from CERN are contributing, but it isn't always easy
- POOL and SPI are delivering, and the other projects are ramping up
- Persistency prototype released in 2002, as targeted in March
- Important benchmark to come: delivering production-capable POOL, scheduled for June