
Campus grids: e-Infrastructure within a University
Mike Mineter, National e-Science Centre, mjm@nesc.ac.uk



Page 1

Campus grids: e-Infrastructure within a University

Mike Mineter, National e-Science Centre, mjm@nesc.ac.uk

Page 2

Thanks to:

• Mark Calleja, Cambridge
• David McBride, Imperial College
• David Wallom, Oxford
• Jonathan Giddy, Welsh e-Science Centre
• John Brodholt, UCL

for descriptions of their campus initiatives.

Page 3

Overview

• Goals
• Methods & Examples
• Some opportunities and implications

Page 4

Goals of campus grids

• Resource utilisation
  – Computers: in each university, several thousand PCs are little used
  – Research data
• Collaboration
  – Access Grids – meetings across sites
  – Sharing visualisation, data, programs
• Infrastructure for research
  – From resource sharing in a university to international collaboration with one middleware stack

Page 5

Overview

• Goals
• Methods & Examples
  – Harvesting CPU cycles
  – Crossing administrative domains
    • Globus
    • Storage Resource Broker
• Some opportunities and implications

Page 6

Harvesting CPU time

[Diagram: teaching labs and researchers side by side – the labs have often-idle processors, while researchers' analyses are constrained by CPU time.]

Page 7

Harvesting CPU time

• Teaching lab machines lie idle for most of the time
• Harvest spare compute cycles to create a low-cost “high throughput computing” (HTC) platform
  – Goal: run many tasks in a week, month, …
  – Typically: many similar tasks invoked from a workflow or a script
    • Monte-Carlo
    • Simulation – parameter sweeps
• Pool processors as a batch processing resource
• Submit jobs that run when a machine is free
• Condor is the most common approach (see the submit-file sketch below)
  – http://www.cs.wisc.edu/condor/
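As a rough illustration of the “many similar tasks” pattern, a Condor submit description file can queue a whole parameter sweep in one step. The keywords below (universe, executable, arguments, queue, …) are standard Condor submit-file syntax; the executable and file names are hypothetical.

    # sweep.sub – hypothetical sweep of 100 runs of one simulation program
    universe   = vanilla
    executable = simulate.exe            # Windows pools need a Windows executable
    arguments  = --param $(Process)      # $(Process) runs 0..99, one value per job
    output     = out/run_$(Process).out
    error      = out/run_$(Process).err
    log        = sweep.log
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT
    queue 100

Submitting this with condor_submit sweep.sub places 100 jobs in the queue; condor_q then shows them waiting to run as lab machines fall idle.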

Page 8

The UCL Condor Pool (John Brodholt, UCL)

[Diagram: submission machines and a Linux central manager in front of the pool; different projects can have their own submission machine.]

• UCL pool: > 1,000 CPUs
• Platforms: Windows XP and 2000
• Capacity: 256 MB – 1 GB RAM; Intel 3 (1 GHz) – Intel 4 (3.2 GHz)
• Note: jobs need a Windows executable (.exe file)

Page 9

Mineral Surfaces: {001} surfaces of CaTiO3

Calculations:
• investigate 10–20 surfaces
• 2 to 5 surface terminations
• 4 to 16 impurity positions
• > 4 concentrations

Total number of calculations per impurity: 120–2440

[Figure: CaO-terminated and TiO2-terminated {001} surfaces of CaTiO3]

M. Alfredsson, J.P. Brodholt, G.D. Price et al., submitted to Nature Materials

Page 10

Use these surface energies to predict crystal forms as a function of dopant type AND concentration:

Maria Alfredsson, UCL

Page 11

Crystal Morphologies of Ni-doped CaTiO3

[Figure: SEM picture of Ni-doped CaTiO3 alongside the predicted morphology of Ni-doped CaTiO3]

Maria Alfredsson, UCL (submitted to Nature Materials)

Page 12

Campus grids

• Resources in many administrative domains
• Need basic services that provide:
  – Authentication and authorisation mechanisms
    • Based on certificates
    • Single sign-on to access many resources (example below)
    • Control of who can do what
  – Job submission services
    • Submit jobs to batch queues on clusters or Condor pools
  – Information systems
    • So you know what can be used
  – Ability to share data
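For instance, with the Globus Grid Security Infrastructure the single sign-on step is creating a short-lived proxy certificate from the user's X.509 certificate; grid tools then authenticate with the proxy instead of asking for the passphrase on every operation. A minimal sketch of the command sequence:

    grid-proxy-init       # prompts once for the certificate passphrase and issues a short-lived proxy
    grid-proxy-info       # show the proxy's subject and remaining lifetime
    # ... run globus-job-submit, SRB commands, etc., authenticated by the proxy ...
    grid-proxy-destroy    # discard the proxy when finished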

Page 13

Middleware for campus grids

• Globus Toolkit http://www.globus.org/
  – Tools built on the Grid Security Infrastructure include:
    • Job submission: run a job on a remote computer
    • Information services: so I know which computer to use
• Storage Resource Broker http://www.sdsc.edu/srb/
  – Virtual filesystem: for files held in multiple locations
  – NIEES offers a testbed to give experience with SRB
• SRB and Globus Toolkit 2 are part of the National Grid Service stack

Page 14

Globus is a Toolkit

• To submit a job to run on a remote computer:

    globus-job-submit grid-data.rl.ac.uk/jobmanager-pbs /bin/hostname -f
    https://grid-data.rl.ac.uk:64001/1415/1110129853/

    globus-job-status https://grid-data.rl.ac.uk:64001/1415/1110129853/
    DONE

    globus-job-get-output https://grid-data.rl.ac.uk:64001/1415/1110129853/
    grid-data12.rl.ac.uk

  globus-job-submit returns a job contact (the https URL), which is then passed to globus-job-status and globus-job-get-output.

• Needs higher-level services
  – Brokers to allow jobs to be submitted to “a grid”
  – VO-specific developments, e.g. portals

Page 15

Storage Resource Broker

[Diagram: several SRB servers, each with a resource driver, in front of filesystems in different admin. domains]

The user sees a virtual filesystem, via:
– Command line (S-Commands) – example below
– MS Windows (InQ)
– Web based (MySRB)
– Java (JARGON)
– Web Services (MATRIX)
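As a flavour of the S-Commands, a typical session connects to SRB and then moves files in and out of the virtual filesystem. The file names below are made up for illustration; the commands themselves (Sinit, Sls, Sput, Sget, Sexit) are standard S-Commands:

    Sinit                                 # start a session (connection details come from ~/.srb/.MdasEnv)
    Sls                                   # list the current SRB collection
    Sput results.dat                      # copy a local file into the collection
    Sget results.dat copy_of_results.dat  # fetch it back, wherever SRB physically stores it
    Sexit                                 # end the session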

Page 16

Example: OxGrid, a University Campus Grid

• Single entry point for Oxford users to shared and dedicated resources

• Seamless access to National Grid Service and OSC for registered users

• Single sign-on using PKI technology integrated with current methods

[Diagram: Oxford users connect through OxGrid central management (broker, MDS/VOM, storage) to college and departmental resources, and out to the NGS and OSC.]

David Wallom

Page 17

Observations

• Need specific initial user communities, not just a vague sense that this is a good idea!
• Engage with systems managers from the first thoughts
  – Operations effort must be factored in and justifiable! Researcher enthusiasm is not enough!

• Be alert to National Grid Service deployment of middleware, and potential for interoperability

Page 18

Summary

• Campus grids
  – enable high-throughput computing
  – improve resource utilisation
  – support collaboration – sharing processes, data, storage
• They build culture, skills and infrastructure for service-oriented research…
• Advantages of using the same stack as the National Grid Service
  – Expertise and support available
  – Smaller leap to partnership in, and use of, the NGS
  – The White Rose Grid pioneered this!... and provides the expertise for others…