OGSA Test Grid
Dave Berry, Research Manager
NeSC Review, 18th March 2003
Project Aims
To investigate deployment of OGSA Service-based Grids
Initially GT3
Moving to GT4 technology preview
Focus on generic issues
Deploy sample applications
Produce information for OMII/GOC/ETF
£20,000 value to NeSC
0.5 FTE for one year
Project Members
Wolfgang Emmerich (UCL) – PI
Paul Brebner – Project Manager
Dave Berry (NeSC)
Oliver Malham – RA
Steven Newhouse (LeSC)
David McBride – RA
Paul Watson (NEReSC)
Savas Parastatidis – Local Manager
Jake Wu – RA
Sister Project (UK)
One of two OGSA Test Grids
Sister project differs slightly in focus:
Slower move to WSRF
More emphasis on interoperability testing
Led by Mark Baker (Portsmouth)
Two projects keep in touch
Sister Project (NeSC)
IBM Early Evaluation Project
Test IBM’s OGSI/WSRF releases
Feedback to IBM
Also test Globus releases
Feedback to IBM and Globus
Overlap with OGSA Test Grid
Common testing
Aim to produce test application(s)
£100,000 value to NeSC
1.0 FTE for two years
OGSA Test Grid Status
Project started Dec. 15th
GT3 installed at all sites
Newcastle and UCL had problems with GT3.2 alpha; fixed after upgrade to GT3.2 beta
Accounts and certificates installed
Testing basic connectivity
Problems with firewalls, port configurations, etc.
Reverting to remote logins to resolve networking issues
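The firewall and port problems above are typically diagnosed with a simple TCP probe against the ports a Grid service needs. The sketch below is illustrative, not the project's actual test harness; the hosts and ports are examples (8080 was the default GT3 container port, 2119 the GRAM gatekeeper port).

```python
# Minimal connectivity probe for diagnosing firewall/port problems.
# Hosts and ports here are illustrative assumptions, not the project's
# real site addresses.
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, unreachable, DNS failure
        return False

# Probe the local host for the ports a GT3 container (8080) and
# GRAM gatekeeper (2119) would typically use.
for host, port in [("127.0.0.1", 8080), ("127.0.0.1", 2119)]:
    status = "open" if port_open(host, port) else "blocked/unreachable"
    print(f"{host}:{port} -> {status}")
```

A real inter-site check would run this from each site against every other site's service ports and compare the results against the expected firewall configuration.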
Testing plans
Scripts based on GITS from ETF Grid
New performance tests
Central database for results
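A central results database for test scripts can be as simple as one table keyed by site and test name. The sketch below assumes a minimal schema (timestamp, site, test, pass/fail, duration) and uses SQLite as a stand-in for whatever central database the project actually deployed.

```python
# Sketch of a central store for test-script results. The schema and the
# test/site names are illustrative assumptions.
import sqlite3
from datetime import datetime, timezone

def init_db(conn):
    conn.execute("""CREATE TABLE IF NOT EXISTS results (
        ts TEXT, site TEXT, test TEXT, passed INTEGER, seconds REAL)""")

def record(conn, site, test, passed, seconds):
    """Append one test outcome, timestamped in UTC."""
    conn.execute("INSERT INTO results VALUES (?, ?, ?, ?, ?)",
                 (datetime.now(timezone.utc).isoformat(),
                  site, test, int(passed), seconds))
    conn.commit()

conn = sqlite3.connect(":memory:")
init_db(conn)
record(conn, "NeSC", "gits-basic-connectivity", True, 1.7)
record(conn, "UCL", "gits-basic-connectivity", False, 30.0)
failures = conn.execute(
    "SELECT site, test FROM results WHERE passed = 0").fetchall()
print(failures)
```

Keeping every run in one place lets the project query for regressions per site after each GT3 upgrade rather than re-reading scattered log files.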
Application: e-Materials
Simulate growth of crystals
Two services co-ordinated by BPEL
Java and Fortran
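The BPEL process co-ordinating the two e-Materials services amounts to a sequence: invoke the first service, then feed its output into the second. The Python sketch below mirrors only that orchestration pattern; the service names and interfaces are invented for illustration and are not the project's real ones.

```python
# Illustration of the BPEL-style two-service sequence, not actual BPEL.
# Both "services" are hypothetical stand-ins.
def structure_service(seed):
    """Stand-in for the first service: build a crystal model."""
    return {"seed": seed, "lattice": "fcc"}

def growth_service(model, steps):
    """Stand-in for the second service: run the growth simulation."""
    return {"model": model, "grown_steps": steps}

def workflow(seed, steps):
    """BPEL-style sequence: invoke service 1, pass its output to service 2."""
    model = structure_service(seed)
    return growth_service(model, steps)

result = workflow("NaCl", steps=100)
print(result["grown_steps"])
```

In the real deployment the two invocations would be web-service calls described in WSDL, with BPEL handling the data flow between them.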
Deployment services
How do we deploy components on remote machines?
Investigate remote deployment mechanisms
Working with HP (SmartFrog)
Three possible scenarios…
Scenario 1: Manual Installation
Library consistency checked manually
Deployment tested manually
Security checked manually
1. Administrators install application on server
2. Client runs remote application
Scenario 2: Separate Services
Library consistency automatically enforced
Automatic test scripts to check deployment
Security by trusting installer
1. Deployment service installs application from Repository
2. Client runs remote application
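The automatic consistency check in this scenario can be sketched as a comparison of the application's required library versions (from a repository manifest) against what the target host actually provides. The manifest format and version strings below are illustrative assumptions, not the project's real deployment metadata.

```python
# Sketch of scenario 2's automated library-consistency check.
# Library names and versions are hypothetical examples.
def check_libraries(required: dict, installed: dict) -> list:
    """Return human-readable conflicts; an empty list means consistent."""
    conflicts = []
    for lib, version in required.items():
        have = installed.get(lib)
        if have is None:
            conflicts.append(f"{lib}: missing (need {version})")
        elif have != version:
            conflicts.append(f"{lib}: have {have}, need {version}")
    return conflicts

required = {"gt3-core": "3.2-beta", "axis": "1.1"}
installed = {"gt3-core": "3.2-alpha", "axis": "1.1"}
for conflict in check_libraries(required, installed):
    print(conflict)
```

A deployment service would refuse to install (or trigger an upgrade) whenever this check returns a non-empty list, which is exactly the class of GT3.2 alpha/beta mismatch the status slide reports.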
Scenario 3: Remote execution service
Library consistency automatically enforced
Automatic test scripts check entire process
Security by trusting client
1. Client sends application to remote server
2. Server runs application and returns result
Stronger security required (e.g. proof-carrying code?)
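Scenario 3 in miniature: the client ships source code, the server executes it and returns the output. The toy sketch below runs everything on one machine via a subprocess; a real remote execution service would add transport, sandboxing, and the stronger security measures the slide calls for, since this pattern trusts the client completely.

```python
# Toy sketch of scenario 3 (remote execution). No sandboxing: this is
# exactly why the slide asks for stronger security before trusting clients.
import os
import subprocess
import sys
import tempfile

def run_remote(source: str) -> str:
    """'Server' side: write submitted code to disk, run it, return stdout."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(source)
        path = f.name
    try:
        out = subprocess.run([sys.executable, path], capture_output=True,
                             text=True, timeout=30, check=True)
        return out.stdout
    finally:
        os.unlink(path)

# 'Client' side: send a trivial application and collect the result.
result = run_remote("print(6 * 7)")
print(result.strip())  # prints "42"
```

Techniques like proof-carrying code, mentioned on the slide, would let the server verify safety properties of the submitted code before executing it, rather than trusting the sender.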
IBM Early Evaluation Status
Project started July
Initial learning period
Updated GT3 training materials
Sample application: SAT-Trac
C++ application
First port to Grid Services
Various problems encountered and reported at 6-month point
Second attempt more successful
Some residual security problems
Questions?