Blue Waters: An Extraordinary Resource for Extraordinary Science
GLCPC - MSU - 12/15/2009
Thom Dunning, William Kramer, Marc Snir, William Gropp, Wen-mei Hwu
Cristina Beldica, Brett Bode, Robert Fiedler, Merle Giles, Scott Lathrop, Mike Showerman
National Center for Supercomputing Applications, Department of Chemistry, Department of Computer Science, and Department of Electrical & Computer Engineering
Slide 2
Sustained petascale computing will enable advances in a broad range of science and engineering disciplines: molecular science, weather & climate forecasting, earth science, astronomy, health, astrophysics, life science, and materials.
Slide 3
Background: NSF's Strategy for High-End Computing
[Chart: Science and Engineering Capability (logarithmic scale) vs. fiscal year, FY07-FY11]
Track 1 System: UIUC/NCSA (~1 PF sustained)
Track 2 Systems: TACC (500+ TF peak), UT/ORNL (~1 PF peak), Track 2d (?)
Track 3 Systems: Leading University HPC Centers (10-100 TF)
Slide 4
Survey of the Scientific Community
D. Baker, University of Washington - Protein structure refinement and determination
M. Campanelli, RIT - Computational relativity and gravitation
D. Ceperley, UIUC - Quantum Monte Carlo molecular dynamics
J. P. Draayer, LSU - Ab initio nuclear structure calculations
P. Fussell, Boeing - Aircraft design optimization
C. C. Goodrich - Space weather modeling
M. Gordon, T. Windus, Iowa State University - Electronic structure of molecules
S. Gottlieb, Indiana University - Lattice quantum chromodynamics
V. Govindaraju - Image processing and feature extraction
M. L. Klein, University of Pennsylvania - Biophysical and materials simulations
J. B. Klemp et al., NCAR - Weather forecasting/hurricane modeling
R. Luettich, University of North Carolina - Coastal circulation and storm surge modeling
W. K. Liu, Northwestern University - Multiscale materials simulations
M. Maxey, Brown University - Multiphase turbulent flow in channels
S. McKee, University of Michigan - Analysis of ATLAS data
M. L. Norman, UCSD - Simulations in astrophysics and cosmology
J. P. Ostriker, Princeton University - Virtual universe
J. P. Schaefer, LSST Corporation - Analysis of LSST datasets
P. Spentzouris, Fermilab - Design of new accelerators
W. M. Tang, Princeton University - Simulation of fine-scale plasma turbulence
A. W. Thomas, D. Richards, Jefferson Lab - Lattice QCD for hadronic and nuclear physics
J. Tromp, Caltech/Princeton - Global and regional seismic wave propagation
P. R. Woodward, University of Minnesota - Astrophysical fluid dynamics
Slide 5
What These Scientists Told Us They Needed
- Maximum core performance, to minimize the number of cores needed for a given level of performance and to lessen the impact of sections of code with limited scalability
- Low-latency, high-bandwidth communications fabric, to maximize the scalability of science and engineering applications
- Large, low-latency, high-bandwidth memory subsystem, to enable the solution of memory-intensive problems
- Large-capacity, high-bandwidth I/O subsystem, to enable the solution of data-intensive problems
- Reliable operation, to enable long-running simulations
Slide 6
Diverse Large-Scale Computational Science
Slide 7
Goals of the Blue Waters Project
Science and Engineering: provide the knowledge, expertise, and services to help researchers develop applications that take full advantage of the Blue Waters computing system
Hardware and Software: sustain 1 petaflops on a range of science and engineering applications; enhance the petascale application development environment and systems software
Education: prepare the next generation of scientists and engineers for research at the frontiers of petascale computing and computation
Industrial Engagement: enable industry to use petascale computing to address their most challenging problems and enhance their competitive position
Slide 8
Focus on Sustained Performance
Blue Waters and NSF are focusing on sustained performance in a way few have before. Sustained performance is the computer's performance on the broad range of applications that scientists and engineers use every day.
- Time to solution is the metric, not ops/s; tests include the time to read data and write the results (see the sketch below)
- NSF's call emphasized sustained performance, demonstrated on a collection of application benchmarks (application + problem set), not just simplistic metrics (e.g., High-Performance Linpack)
- Applications include both petascale applications (which effectively use the full machine, solving scalability problems for both compute and I/O) and applications that use a fraction of the system
The Blue Waters project focus is on delivering sustained petaflops performance to all applications:
- Develop tools, techniques, and samples that exploit all parts of the system
- Explore new tools, programming models, and libraries to help applications get the most from the system
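As an illustration of the time-to-solution metric, here is a minimal sketch of how such a measurement might be structured in an MPI code. The phase functions read_input, solve_problem, and write_results are hypothetical stand-ins for an application's stages, not part of any Blue Waters benchmark harness:

    /* Minimal time-to-solution sketch: the timed window includes
       reading the problem set and writing the results, per the slide. */
    #include <mpi.h>
    #include <stdio.h>

    /* Hypothetical application phases; empty stubs keep this self-contained. */
    static void read_input(void)    { /* read the problem set from disk */ }
    static void solve_problem(void) { /* the science computation */ }
    static void write_results(void) { /* write the results to disk */ }

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        MPI_Barrier(MPI_COMM_WORLD);   /* start all ranks together */
        double t0 = MPI_Wtime();

        read_input();                  /* I/O is inside the timed window */
        solve_problem();
        write_results();               /* ... and so is writing results */

        MPI_Barrier(MPI_COMM_WORLD);   /* wait for the slowest rank */
        double t1 = MPI_Wtime();

        int rank;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0)
            printf("time to solution: %.3f s\n", t1 - t0);
        MPI_Finalize();
        return 0;
    }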
Slide 9
Blue Waters Project Components
- Blue Waters base system: processors, memory, interconnect, on-line storage, system software, programming environment
- Value-added software collaborations
- Value-added hardware and software
- Petascale Application Collaboration Team support
- Petascale applications (computing resource allocations)
- Outstanding user and production support: WAN connections, consulting, system management, security, operations
- Petascale Computing Facility
- Petascale education, industry and outreach
- Great Lakes Consortium for Petascale Computing
Slide 10
Blue Waters Petascale Computing System

Blue Waters Computing System

System Attribute                 Typical Cluster    Track 2      Blue Waters*
                                 (NCSA Abe)         (TACC)
Vendor                           Dell               Sun          IBM
Processor                        Intel Xeon 5300    AMD          Power7
Peak performance (PF)            0.090              0.58         -
Sustained performance (PF)       ~0.005             ~0.06        ~1.0
Number of cores                  9,600              62,976       >200,000
Amount of memory (PB)            0.0144             0.12         >0.8
Amount of disk storage (PB)      0.1                1.73         >10
Amount of archival storage (PB)  5                  2.5          >500
External bandwidth (Gbps)        40                 10           100-400

* Reference petascale computing system (no accelerators).
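To make the peak-versus-sustained gap concrete, the table's own numbers give roughly:

\[
\frac{P_{\text{sustained}}}{P_{\text{peak}}}\bigg|_{\text{Abe}} \approx \frac{0.005}{0.090} \approx 6\%,
\qquad
\frac{P_{\text{sustained}}}{P_{\text{peak}}}\bigg|_{\text{Track 2}} \approx \frac{0.06}{0.58} \approx 10\%
\]

so even well-regarded systems sustain only a small fraction of peak on real applications, which is why the Blue Waters target is stated in sustained petaflops.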
Slide 11
From Chip to Entire Integrated System
[Diagram: chip -> quad-chip MCM -> multiple MCMs -> rack/building block -> Blue Waters system, plus PCF, on-line storage, and near-line storage; color indicates the relative amount of public information]
Slide 12
Power7 Chip: Computational Heart of Blue Waters
Base technology: 45 nm, 576 mm^2, 1.2 B transistors
Chip: 8 cores; 12 execution units per core; 1-, 2-, or 4-way SMT per core
Caches: 32 KB I- and D-cache and 256 KB L2 per core; 32 MB L3 (private/shared)
Memory: dual DDR3 memory controllers; 100 GB/s sustained memory bandwidth
[Diagram: quad-chip MCM and Power7 chip]
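Sustained memory bandwidth of the kind quoted above is conventionally estimated with a STREAM-style kernel. The sketch below is illustrative only; the array size and the single-threaded, untuned loop are assumptions, not a POWER7-tuned benchmark:

    /* STREAM-style triad sketch for estimating sustained memory
       bandwidth. N is illustrative (16M doubles, ~128 MB per array). */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N (1 << 24)

    int main(void) {
        double *a = malloc(N * sizeof(double));
        double *b = malloc(N * sizeof(double));
        double *c = malloc(N * sizeof(double));
        if (!a || !b || !c) return 1;
        for (long i = 0; i < N; i++) { b[i] = 1.0; c[i] = 2.0; }

        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (long i = 0; i < N; i++)
            a[i] = b[i] + 3.0 * c[i];          /* triad: 3 arrays touched */
        clock_gettime(CLOCK_MONOTONIC, &t1);

        double sec = (t1.tv_sec - t0.tv_sec) + 1e-9 * (t1.tv_nsec - t0.tv_nsec);
        double gb  = 3.0 * N * sizeof(double) / 1e9;  /* bytes moved */
        printf("triad bandwidth: %.1f GB/s\n", gb / sec);
        free(a); free(b); free(c);
        return 0;
    }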
Slide 13
Memory Solutions
Slide 14
RAM Technologies
DIMMs, dense and fast SRAM are industry standard; eDRAM is IBM technology:
- Used in POWER4/5/6 for off-chip L3 cache
- Used in POWER7 for on-chip cache (to avoid pin limitations and support the bandwidth demand of 8 cores)
Slide 15
Cache Structure Innovation
Combines the dense, low-power attributes of eDRAM with the speed and bandwidth advantages of SRAM, all on the same chip:
- Provides low-latency L1 and L2 dedicated per core, ~3x lower latency than the L3 local region; keeps a 256 KB working set (see the sketch below); reduces L3 power requirements and improves throughput
- Provides a large, shared L3 with ~3x lower latency than memory
- Automatically migrates per-core private working-set footprints (up to 4 MB) to a fast local region per core at ~5x lower latency than the full L3 cache
- Automatically clones shared data to multiple per-core private regions
- Enables a subset of cores to utilize the entire large shared L3 cache when the remaining cores are not using it

Cache level        Capacity     Array       Policy           Comment
L1 data            32 KB        Fast SRAM   Store-thru       Local thread storage update
Private L2         256 KB       Fast SRAM   Store-in         De-coupled global storage update
Fast L3 (private)  Up to 4 MB   eDRAM       Partial victim   Reduced power footprint (up to 4 MB)
Shared L3          32 MB        eDRAM       Adaptive         Large 32 MB shared footprint
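One practical consequence of this hierarchy is that codes benefit from keeping a tile of data resident in the 256 KB L2 (or the up-to-4 MB fast L3 region) across repeated passes. A minimal cache-blocking sketch, with an illustrative tile size and hypothetical pointwise passes:

    /* Cache-blocking sketch: process data in tiles sized to one core's
       L2 (256 KB) so repeated passes hit cache instead of memory.
       TILE and the two passes are illustrative choices. */
    #include <stddef.h>

    #define TILE (32 * 1024)   /* 32K doubles x 8 bytes = 256 KB, one L2 */

    void process(double *x, size_t n) {
        for (size_t base = 0; base < n; base += TILE) {
            size_t end = base + TILE < n ? base + TILE : n;
            for (size_t i = base; i < end; i++)   /* pass 1: scale */
                x[i] *= 1.0001;
            for (size_t i = base; i < end; i++)   /* pass 2: clamp */
                x[i] = x[i] > 1.0 ? 1.0 : x[i];
        }
    }

Without blocking, each pass would stream the whole array through the cache; with it, the tile fetched by pass 1 is still resident when pass 2 reads it.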
Slide 16
Slide 17
Illinois Petascale Computing Facility at a Glance
- 88,000 GSF over two stories, 45' tall
- 30,000+ GSF of raised floor; 20,000+ GSF unobstructed net for computers; 6' clearance under the raised floor
- 24 MW initial power feeds + backup: three 8 MW feeds + one 8 MW feed for backup; 13,800 volt power; 480 volt distribution to computers
- 5,400 tons of cooling; full water-side economization for 50%+ of the year; automatic mixing of mechanical and ambient chilled water for optimal efficiency; adjacent to a (new) 6.5M gallon thermal storage tank
- Energy efficiency: PUE ~1.02 to
Slide 19
Software Environment
Full-featured OS (AIX or Linux); sockets, threads, shared memory, checkpoint/restart
Languages: C/C++, Fortran (77-2008, including CAF), UPC
I/O model: global, parallel shared file system (>10 PB) and archival storage (GPFS/HPSS); MPI I/O
Libraries: MASS, ESSL, PESSL, PETSc, visualization
Programming models: MPI/MPI-2, OpenMP, PGAS, Charm++, Cactus (a minimal hybrid example follows below)
Hardware: multicore POWER7 processor with Simultaneous MultiThreading (SMT) and Vector MultiMedia Extensions (VSX); private L1 and L2 cache per core, shared L3 cache per chip; 128 GB RAM; high-performance, low-latency interconnect supporting RDMA
Environment: traditional (command line); Eclipse IDE (application development, debugging, performance tuning, job and workflow management)
Low-level communications API supporting active messages (LAPI)
Resource manager: batch and interactive access
Performance tuning: HPC and HPCS toolkits, open-source tools
Parallel debugging at full scale
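As a small illustration of the programming models listed (not a prescribed usage pattern), a minimal hybrid MPI + OpenMP program might place one MPI rank per chip with OpenMP threads spread across its cores and SMT threads; the rank/thread layout here is an assumption:

    /* Minimal hybrid MPI+OpenMP sketch: MPI across nodes/chips,
       OpenMP threads within a chip. Layout is illustrative only. */
    #include <mpi.h>
    #include <omp.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int provided, rank;
        /* FUNNELED: only the main thread makes MPI calls */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        #pragma omp parallel
        {
            printf("rank %d, thread %d of %d\n",
                   rank, omp_get_thread_num(), omp_get_num_threads());
        }

        MPI_Finalize();
        return 0;
    }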
Slide 20
Illinois-IBM Collaborative Projects, I
Computing systems software. Goal: enhance IBM's HPC software stack. Examples: integrated system management console; petascale application development environment; computational libraries; programming models.
Science and engineering applications. Goal: prepare applications to fully utilize Blue Waters' capabilities. Process:
- Before hardware: extensive use of processor and interconnect simulators and Track 2 systems to optimize processor and communications performance
- Modeling: two modeling teams, LANL's PAL (Hoisie) and SDSC's PMaC (Snavely), are funded to fully engage with application teams
- After hardware: further optimization for the Power7 processor, node, and full-scale system
Slide 21
Illinois-IBM Collaborative Projects, II
Computing systems software and hardware. Goal: enhance the performance of the base Blue Waters system. Examples: innovative data management using new file-system features; evaluation of accelerators.
Petascale Computing Facility. Goal: advance green computing by optimizing PUE (Power Usage Effectiveness). Elements: focus on direct-liquid cooling; on-site cooling towers with ambient water; automated control to use the optimal cooling mix; efficient electrical distribution system; PUE < 1.2.
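For reference, PUE is the ratio of total facility power to the power delivered to the computing equipment, so the stated target implies:

\[
\mathrm{PUE} = \frac{P_{\text{total facility}}}{P_{\text{IT equipment}}}, \qquad
\mathrm{PUE} < 1.2 \;\Longrightarrow\; P_{\text{cooling + distribution}} < 0.2\,P_{\text{IT equipment}}
\]

that is, all cooling and power-distribution overhead combined must stay under 20% of the computing load.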
Slide 22
Illinois-IBM Collaborative Projects, III
Ease of use. Goal: improve the productivity of application teams using Blue Waters. Examples: common communication infrastructure; debugging and code management; Eclipse framework integration; tuning (RENCI performance tools, others); workflow management and resource scheduling; visualization.
Improving system effectiveness. Examples: interconnect routing; cyber-security; others possible (resiliency).
Slide 23
Blue Waters Project Petascale Computing Resource Allocations
Biological Sciences
1. Computational Microscope - Klaus Schulten, Laxmikant Kale, University of Illinois at Urbana-Champaign
2. Petascale Simulations of Complex Biological Behavior in Fluctuating Environments - Ilias Tagkopoulos, University of California, Davis
Engineering
3. Petascale Computations for Complex Turbulent Flows - Pui-Kuen Yeung, James Riley, Robert Moser, Amitava Majumdar, Georgia Institute of Technology
Slide 24
Blue Waters Project Petascale Computing Resource Allocations
Geosciences
4. Petascale Research in Earthquake System Science on Blue Waters - Thomas Jordan, Jacobo Bielak, University of Southern California
5. Enabling Large-Scale, High-Resolution, and Real-Time Earthquake Simulations on Petascale Parallel Computers - L. Wang & P. Chen, University of Wyoming
6. Testing Hypotheses about Climate Prediction at Unprecedented Resolutions on the Blue Waters System - David Randall, Ross Heikes, Colorado State University; William Large, Richard Loft, John Dennis, Mariana Vertenstein, National Center for Atmospheric Research; Cristiana Stan, James Kinter, Institute for Global Environment and Society; Benjamin Kirtman, University of Miami
7. Understanding Tornadoes and Their Parent Supercells Through Ultra-High Resolution Simulation/Analysis - Robert Wilhelmson, Brian Jewett, Matthew Gilmore, University of Illinois at Urbana-Champaign
Slide 25
Blue Waters Project Petascale Computing Resource Allocations
Mathematics & Physical Sciences: Astronomical Sciences
8. Computational Relativity and Gravitation at Petascale: Simulating and Visualizing Astrophysically Realistic Compact Binaries - Manuela Campanelli, Carlos Lousto, Hans-Peter Bischof, Joshua Faber, Yosef Zlochower, Rochester Institute of Technology
9. Enabling Science at the Petascale: From Binary Systems and Stellar Core Collapse to Gamma-Ray Bursts - Eric Schnetter, Gabrielle Allen, Mayank Tyagi, Peter Diener, Christian Ott, Louisiana State University
10. Formation of the First Galaxies: Predictions for the Next Generation of Observatories - Brian O'Shea, Michigan State University; Michael Norman, University of California at San Diego
Slide 26
Blue Waters Project Petascale Computing Resource Allocations
Mathematics & Physical Sciences: Astronomical Sciences
11. Peta-Cosmology: Galaxy Formation and Virtual Astronomy - Kentaro Nagamine, University of Nevada at Las Vegas; Jeremiah Ostriker, Princeton University; Renyue Cen, Greg Bryan
12. Petascale Simulation of Turbulent Stellar Hydrodynamics - Paul Woodward, Pen-Chung Yew, University of Minnesota, Twin Cities
Chemistry
13. Computational Chemistry at the Petascale - Monica Lamm, Mark Gordon, Theresa Windus, Masha Sosonkina, Brett Bode, Iowa State University
14. Super Instruction Architecture for Petascale Computing - Rodney Bartlett, Erik Deumens, Beverly Sanders, University of Florida; Ponnuswamy Sadayappan, Ohio State University
Slide 27
Blue Waters Project Petascale Computing Resource Allocations
Mathematics & Physical Sciences: Materials Research
15. Breakthrough Petascale Quantum Monte Carlo Calculations - Shiwei Zhang, College of William and Mary
16. Electronic Properties of Strongly Correlated Systems Using Petascale Computing - Sergey Savrasov, University of California, Davis; Kristjan Haule, Gabriel Kotliar, Rutgers University
Physics
17. Lattice QCD on Blue Waters - Robert Sugar, University of California at Santa Barbara
Slide 28
Blue Waters Project Petascale Computing Resource Allocations
Social, Behavioral and Economic Sciences
18. Simulation of Contagion on Very Large Social Networks with Blue Waters - Keith Bisset, Xizhou Feng, Virginia Polytechnic Institute and State University
Slide 29
Allocations for Blue Waters
Petascale Computing Resource Allocations (PRAC): anyone can apply! The vast majority (80%) of the resource is allocated in this manner.
- Selected by NSF based on the need for a sustained petascale platform to carry out ground-breaking research and the likelihood of being ready to use Blue Waters effectively in 2011
- PRAC awardees receive travel funds and provisional time
- Applications will be accepted on a continuing basis in the future
- Blue Waters application and consulting staff will support awardees in preparing codes
Industry: allocation process under development; cost reimbursable; up to 7% of the resources; contact the Blue Waters PSP program for details
Education: allocation process under development
GLCPC: for Great Lakes Consortium for Petascale Computing (GLCPC) members; 50,000,000 hours from the Director's reserve; allocation process under development by the GLCPC Allocation Committee
Director's Reserve: for high-risk start-up projects; allocation process under development
Slide 30
Great Lakes Consortium for Petascale Computation
Goal: facilitate the widespread and effective use of petascale computing to address frontier research questions in science, technology and engineering at research, educational and industrial organizations across the region and nation.
Charter members (60-plus universities and organizations):
Argonne National Laboratory; Fermi National Accelerator Laboratory; Illinois Mathematics and Science Academy; Illinois Wesleyan University; Indiana University*; Iowa State University; Krell Institute, Inc.; Los Alamos National Laboratory; Louisiana State University; Michigan State University*; Northwestern University*; Parkland Community College; Pennsylvania State University*; Purdue University*; The Ohio State University*; Shiloh Community Unit School District #1; Shodor Education Foundation, Inc.; SURA; University of Chicago*; University of Illinois at Chicago*; University of Illinois at Urbana-Champaign*; University of Iowa*; University of Michigan*; University of Minnesota*; University of North Carolina-Chapel Hill; University of Wisconsin-Madison*; Wayne City High School
* CIC universities
Slide 31
Petascale Education, Industry and Outreach
Education Program
- Undergraduate Petascale Education Program: professional development workshops for faculty offered throughout the year; support for undergraduate faculty to develop course materials; support for undergraduate student year-long internships with petascale research projects. Apply at www.computationalscience.org/upep
- Virtual School for Science and Engineering: Summer Schools proposed for summer 2010 on many-core processors (covering GPGPU and CUDA), scaling to petascale, and large data handling. Content from the 2008 and 2009 Summer Schools is freely available on-line. Looking for sites that want to host participants linked together by HD video-conferencing. http://www.ncsa.illinois.edu/BlueWaters/eot.html
Industrial Partner Program: facility for industrial use of petascale resources; cost-recovery program
Outreach Programs: to other communities. Example: GLCPC proposal in process - U of Chicago for PRAC proposal consultation
Slide 32
Undergraduate Petascale Education Program (UPEP)
Led by Bob Panoff, Shodor. Emphasis on engaging faculty from under-represented communities and institutions (MSI, EPSCoR, 2- and 4-year).
Three areas of emphasis:
- Professional development via campus visits and ~10 workshops per year for undergraduate faculty to incorporate computational thinking and petascale resources in the undergraduate classroom
- Support for undergraduate faculty development, over 3 years, of 30 modules in contemporary science, from desktop to grid to petascale, incorporating quantitative reasoning, computational thinking, and multi-scale modeling
- Support for immersion of 15 undergraduate students per year in year-long petascale research projects
On-going invitation for faculty and students to apply: www.computationalscience.org/upep
Slide 33
Graduate Education
The Virtual School of Computational Science and Engineering is the graduate education component of Blue Waters. The Virtual School brings together faculty and staff from research universities around the nation to fill the knowledge gap in CSE. A primary activity of the Virtual School is to organize annual Summer Schools for graduate students.
Slide 34
Summer School 2009
Two week-long, multi-site workshops were offered in the summer of 2009:
- Scaling to Petascale, August 3-7, 2009: http://www.vscse.org/summerschool/2009/scaling/
- Many-Core Processors, August 10-14, 2009: http://www.vscse.org/summerschool/2009/manycore/
A total of 232 participants attended these workshops.
Slide 35
Summer School 2010
Proposed workshops for 2010:
- Scaling to Petascale
- Many-Core Processing
- Big Data for Science (new this year)
http://www.vscse.org/summerschool/2010/workshops.html
Call for Participation in 2010: http://www.vscse.org/
We look forward to hearing from you!
Slide 36
How Can You Get Involved?
Everyone: education program - summer schools, courses, faculty workshops
Software developers: contact us if you are developing tools for HPC
Application developers: modest time allocations through the GLCPC; large awards via the NSF PRAC program. Contact the Blue Waters team if you are planning a submission! We can offer tips for preparing your proposal.
Slide 37
Acknowledgements
This research is part of the Blue Waters sustained-petascale computing project, which is supported by the National Science Foundation (award number OCI 07-25070) and the state of Illinois. Blue Waters is a joint effort of the University of Illinois at Urbana-Champaign, its National Center for Supercomputing Applications, IBM, and the Great Lakes Consortium for Petascale Computation. The work described is only achievable through the efforts of the Blue Waters project.
Slide 38
Questions?
Dr. Brett Bode, NCSA/University of Illinois
Blue Waters Software Development Manager
[email protected]
http://www.ncsa.uiuc.edu/BlueWaters
(217) 244-5187