Page 1:

Windows HPC: Launching to expand grid usage

Windows vs. Linux or

Windows with Linux?

Page 2:

Agenda

Launching Windows HPC Server 2008

Windows HPC: who is it for?

Windows and Linux for the best fit to your needs

Windows HPC Server 2008: characteristics

Tools for Parallel Programming

Page 3:

Windows HPC launching

• Available on Nov 1

• Evaluation version for download at: www.microsoft.com/france/hpc

Page 4:

Windows HPC: Who is it for?

• Research groups who work on Windows workstations and need local computing capability for their day-to-day calculations, often as a complement to the large shared cluster in their institution that requires reservation. Their requirements:
– Use of the cluster must be as easy as possible for a scientist without IT skills
– The cluster must be manageable by their standard local IT staff

• Scientists and engineers who develop code on their Windows workstations and want to scale it out onto a cluster with as little effort as possible

• Academic institutions that want to open their grid infrastructure to new types of users, optimize their investment in hardware and foster new research initiatives

• Academic institutions that want to prepare their students for all the computing environments they will encounter in their careers

Page 5:

Windows® HPC Server 2008: an open approach

[Architecture diagram, reconstructed as a list:]

• Cluster core: Job Scheduler with failover, Node Manager, Compute Nodes, WCF Router, Job Submission APIs, Administration APIs; new TCP/IP MPI with NetworkDirect

• Applications: WCF, C#, C++, Fortran

• Storage: SQL structured storage, Windows Storage Server with DFS, parallel/clustered storage, existing cluster infrastructure, UNIX/Linux systems

• Administration: Administration Console (system, scheduling, networking, imaging, diagnostics), Windows PowerShell, System Center Configuration Manager, System Center Operations Manager, System Center Data Protection Manager, Windows Server Update Services, Software Protection Services, Microsoft and partner management utilities

• Business intelligence: SQL Server Analysis/Reporting, SQL Server Integration Services

• Clients/job submission: SharePoint, batch applications, CCS Job Console, CCS scripts, WCF applications, Windows Workflow Foundation, Excel

• Development tools: Visual Studio (C#, C++, WCF, OpenMP, MPI, MPI.NET), MPI debugging, MPI tracing, trace analysis, profiling, Fortran, numerical libraries

• Interoperability: HPC Profile to 3rd-party systems

Page 6:

Interoperability

• At application level: Services for Unix Applications

• At OS level: Novell agreements, double-OS clusters, ProActive integration

• At scheduling level: Open Grid Forum

Page 7:

• OGF: in existence since 2001
– Mission: pervasive adoption of grid technologies
– Standards through an open community process

• HPC Profile Working Group (started 2006)
– Commercial: Microsoft, Platform, Altair, …
– Research: U of Virginia, U of Southampton, Imperial College London, …
– Grid middleware: OMII-UK, Globus, EGEE, CROWN, …

Interoperability & Open Grid Forum
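The HPC Basic Profile this group defined is a SOAP interface (OGSA-BES operations carrying JSDL job descriptions), which is what makes the cross-platform demos on the following pages possible. A hand-simplified WCF client sketch; the IBasicExecutionService contract, its CreateActivity signature, and the endpoint URL are illustrative stand-ins, not the real BES WSDL:

using System;
using System.ServiceModel;  // WCF client stack

// Hypothetical, hand-simplified stand-in for the OGSA-BES port type used
// by the HPC Basic Profile; a real contract is generated from its WSDL.
[ServiceContract]
public interface IBasicExecutionService
{
    [OperationContract]
    string CreateActivity(string jsdlDocument);  // returns an activity id
}

class HpcbpClient
{
    static void Main()
    {
        // The endpoint URL is a placeholder for an HPCBP-enabled scheduler.
        var factory = new ChannelFactory<IBasicExecutionService>(
            new BasicHttpBinding(),
            new EndpointAddress("http://headnode:8080/hpcbp"));
        IBasicExecutionService bes = factory.CreateChannel();

        // A minimal JSDL description (the real document names an executable).
        string jsdl = @"<jsdl:JobDefinition xmlns:jsdl='http://schemas.ggf.org/jsdl/2005/11/jsdl'>
                          <jsdl:JobDescription>
                            <jsdl:Application/>
                          </jsdl:JobDescription>
                        </jsdl:JobDefinition>";

        Console.WriteLine("Activity: " + bes.CreateActivity(jsdl));
    }
}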

Page 8:

[Diagram: a gSOAP WS client interoperates, via the HPC Basic Profile, with schedulers on multiple platforms: LSF 7.0.1, SGE 6.1 on SUSE, Windows HPC v1, Windows HPC v2, SGE, and PBS.]

Cross-platform integration

Page 9:

[Diagram: on the left, “Isolated Application & Resource Silos”: Scheduler A on a Windows HPC Server and Scheduler B each serve their own end users, applications, and Windows or Linux cluster resources. On the right, “Integrated Resources”: the schedulers interconnect via HPCBP, so end users of Windows Compute Cluster Server (v2) applications can reach both the Windows and the Linux cluster resources.]

Scenario: Metascheduling

Page 10:

Opening to new uses and users at Umeå

• Requirements:

– Increase parallel computing capacity to support more demanding research projects

– Expand services to a new set of departments and researchers who have expressed demand for support of Windows-based applications

• Solution:

– Deployment of a dual-OS system (WHPCS2008 and Linux) on a cluster consisting of 672 blades / 5376 cores

• Results:

– Linpack on WHPCS2008 achieves 46.04 Tflops / 85.6% efficiency

– Ranked 39th in June 2008 Top500 list

Page 11:

NCSA at the University of Illinois at Urbana-Champaign

• Requirements:

– Meet the evolving needs of both academic and private-industry users of its supercomputing center

– Enable HPC for a broader set of users than the traditional ones

• Solution:

– Add Windows HPC Server to the options on its 1,200-node, 9,472-core cluster

• Results:

– Linpack on WHPCS2008 achieves 68.5 Tflops, 77.7% efficiency

– Ranked 23rd in June 2008 Top500 list

Page 12:

Top 500

The prize: NCSA’s Abe cluster, #14 on the Nov 2007 Top500 at 70% efficiency.

The goal: unseat the #13 Barcelona cluster at 63.8 TFlops.

The result: #23.

Page 13:

1184 nodes online

4 hours from bare metal to Linpack

Page 14:

Using Excel to Drive Linpack

Page 15:

============================================================================

HPLinpack 1.0a -- High-Performance Linpack benchmark -- January 20, 2004

Written by A. Petitet and R. Clint Whaley, Innovative Computing Labs., UTK

============================================================================

The following parameter values will be used:

N : 1008384

NB : 192

PMAP : Row-major process mapping

P : 74

Q : 128

PFACT : Crout

NBMIN : 4

NDIV : 2

RFACT : Right

BCAST : 1ring

DEPTH : 0

SWAP : Mix (threshold = 192)

L1 : transposed form

U : transposed form

EQUIL : yes

ALIGN : 16 double precision words

============================================================================

T/V               N      NB     P     Q          Time       Gflops

----------------------------------------------------------------------------

W00R2C4     1008384     192    74   128       9982.08   6.848e+004

----------------------------------------------------------------------------

||Ax-b||_oo / ( eps * ||A||_1 * N ) = 0.0005611 ...... PASSED

||Ax-b||_oo / ( eps * ||A||_1 * ||x||_1 ) = 0.0009542 ...... PASSED

||Ax-b||_oo / ( eps * ||A||_oo * ||x||_oo ) = 0.0001618 ...... PASSED

============================================================================

After 2.5 hours… 68.5 Tflops, 77.7% efficiency
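The headline number can be cross-checked against the HPL output above: HPL factors a dense N×N matrix at a cost of roughly 2N³/3 + 2N² floating-point operations, and the P×Q = 74 × 128 process grid matches the 9,472 cores reported for this cluster. A small sketch of the arithmetic; the per-core peak used for the efficiency estimate (4 flops per cycle at an assumed 2.33 GHz) is an assumption, since the deck does not state the clock rate:

using System;

class HplCheck
{
    static void Main()
    {
        // Values copied from the HPL output above.
        double n = 1008384.0;       // problem size N
        double seconds = 9982.08;   // wall-clock time
        int p = 74, q = 128;        // process grid, 74 * 128 = 9472 ranks

        // Dense LU factorization + solve costs about 2/3*N^3 + 2*N^2 flops.
        double gflops = (2.0 / 3.0 * n * n * n + 2.0 * n * n) / seconds / 1e9;
        Console.WriteLine("Measured: {0:F0} GFlops on {1} ranks", gflops, p * q);
        // Prints about 68481 GFlops = 68.5 TFlops, matching 6.848e+004 above.

        // Efficiency = Rmax / Rpeak. Assume 4 flops/cycle at 2.33 GHz per
        // core (an assumption; the deck does not state the hardware).
        double rpeakGflops = p * q * 2.33 * 4;
        Console.WriteLine("Efficiency: {0:P1}", gflops / rpeakGflops);  // ~77%
    }
}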

Page 16:

Top500 progression:

Windows Compute Cluster 2003:
• Winter 2005, Microsoft: 4 procs, 9.46 GFlops
• Spring 2006, NCSA, #130: 896 cores, 4.1 TF
• Spring 2007, Microsoft, #106: 2048 cores, 9 TF, 58.8%

Windows HPC Server 2008:
• Fall 2007, Microsoft, #116: 2048 cores, 11.8 TF, 77.1%
• Spring 2008, Umeå, #39: 5376 cores, 46 TF, 85.5%
• Spring 2008, Aachen, #100: 2096 cores, 18.8 TF, 76.5%
• Spring 2008, NCSA, #23: 9472 cores, 68.5 TF, 77.7%

30% efficiency improvement: on 2048 cores, efficiency rose from 58.8% (Spring 2007) to 77.1% (Fall 2007), roughly a 30% relative gain.

Page 17:

Other examples

RWTH Aachen expanded its cluster to include Windows HPC to support a growing number of Windows-based parallel applications.

“A lot of Windows-based development is going on with the Microsoft® Visual Studio® development system, and most researchers have a Windows PC on their desk,” says Christian Terboven, Project Lead for HPC on Windows at the CCC. “In the past, if they needed more compute power, these researchers were forced to port their code to UNIX because we offered HPC services primarily on UNIX.”

Dual-boot system, 256 nodes at 18.81 Tflops, ranked 100th in the June 2008 Top500 list.

A facility for breakthrough science seeks to expand its user base with Windows-based HPC.

“Windows HPC Server 2008 will help us extend our user base by taking high-performance computing to new user communities in a way we were unable to do before.”

“Porting codes from Linux to Windows HPC Server 2008 was very easy and painless. I was running the ported code within a day.”

32 nodes, 256 cores, 2 head nodes, dual-boot WHPCS2008/SuSE Linux 10.1 system.

Integrated Windows HPC into the ProActive middleware; the goal is to offer identical access to computing resources for users of Windows- and Linux-based applications.

A leading supercomputing center in Italy improves access to supercomputing resources for more researchers from the private sector, many of whom were unfamiliar with its Linux-based tools and interfaces.

“Most researchers do not have time to acquire specialized IT skills. Now they can work with an HPC cluster that has an interface similar to the ones they use in their office environments. The interface is a familiar Windows feature, and it’s very easy to understand from the beginning.”

16 nodes, 128 cores, dedicated additional Windows cluster.

Page 18:

Windows HPC Server 2008

Systems Management:
• Rapid large-scale deployment and a built-in diagnostics suite
• Integrated monitoring, management and reporting
• Familiar UI and rich scripting interface
• Integrated security via Active Directory

Job Scheduling:
• Support for batch, interactive and service-oriented applications
• High-availability scheduling
• Interoperability via OGF’s HPC Basic Profile

MPI:
• MS-MPI stack based on the MPICH2 reference implementation
• Performance improvements for RDMA networking and multi-core shared memory
• MS-MPI integrated with Windows Event Tracing

Storage:
• Access to SQL, Windows and Unix file servers
• Key parallel file server vendor support (GPFS, Lustre, Panasas)
• In-memory caching options
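To illustrate the MPI section above: since MS-MPI tracks the MPICH2 reference implementation, existing MPI codes port directly, and .NET developers can use the MPI.NET bindings listed later in this deck. A minimal MPI.NET sketch, assuming the MPI.NET assembly is referenced and the program is launched through mpiexec:

using System;
using MPI;  // MPI.NET managed bindings running over MS-MPI

class Hello
{
    static void Main(string[] args)
    {
        // MPI.Environment initializes MPI on construction and finalizes
        // it when disposed at the end of the using block.
        using (new MPI.Environment(ref args))
        {
            Intracommunicator world = Communicator.world;
            Console.WriteLine("Hello from rank {0} of {1}",
                              world.Rank, world.Size);
        }
    }
}

Run as, for example, mpiexec -n 8 Hello.exe, each of the eight ranks prints its own line.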

Page 19:

Large Scale Deployments

Page 20:

And an out-of-the-box, integrated solution for smaller environments

Page 21:

Parallel Programming

• Available now
– Development and parallel debugging in Visual Studio
– 3rd-party compilers, debuggers, runtimes, etc. available

• Emerging technologies: Parallel Framework
– LINQ/PLINQ: natural object-oriented query syntax in .NET, which PLINQ runs in parallel (see the sketch below)
– C# Futures: a way to explicitly make loops and other independent work parallel

• For the future: Parallel Computing Initiative (PCI)
– Triple investment with a new engineering team
– Focused on common tools for developing multi-core codes from desktops to clusters
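To make the PLINQ item above concrete, a minimal sketch, assuming the parallel extensions to .NET are available (the prime-counting query is illustrative):

using System;
using System.Linq;  // PLINQ lives alongside LINQ in System.Linq

class PlinqDemo
{
    // Trial-division primality test, deliberately CPU-bound.
    static bool IsPrime(int n)
    {
        if (n < 2) return false;
        for (int d = 2; d * d <= n; d++)
            if (n % d == 0) return false;
        return true;
    }

    static void Main()
    {
        // AsParallel() turns the sequence into a PLINQ query whose
        // work is partitioned across the available cores.
        int count = Enumerable.Range(2, 1000000)
                              .AsParallel()
                              .Count(IsPrime);
        Console.WriteLine("{0} primes found", count);
    }
}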

Compilers
• Visual Studio
• Intel C++
• GCC
• PGI Fortran
• Intel Fortran
• Absoft Fortran
• Fujitsu

Profilers and Tracers
• PerfMon
• ETW (for MS-MPI)
• VSPerf / VSCover
• CLRProfiler
• Vampir (being ported to Windows)
• Intel Collector/Analyzer (runs on CCS with Intel MPI)
• VTune & CodeAnalyst
• Marmot (being ported to Windows)
• MPI Lint++

Debuggers
• Visual Studio
• WinDbg
• DDT

Runtimes and Libraries
• MPI
• OpenMP
• C# Futures
• MPI.C++ and MPI.NET
• PLINQ

Page 22:

Resources

• Microsoft HPC Web site (download the evaluation version): http://www.microsoft.com/france/hpc

• Windows HPC community site: http://www.windowshpc.net

• Dual-OS cluster white paper: online soon

Page 23:

© 2008 Microsoft Corporation. All rights reserved. This presentation is for informational purposes only. Microsoft makes no warranties, express or implied, in this summary.