
December 2014 HECC Report


Page 1: December 2014 HECC Report

www.nasa.gov

Project Status Report High End Computing Capability Strategic Capabilities Assets Program

Dr. Rupak Biswas – Project Manager NASA Ames Research Center, Moffett Field, CA [email protected] (650) 604-4411

January 10, 2015

Page 2: December 2014 HECC Report

Jan 10, 2015 National Aeronautics and Space Administration High-End Computing Capability Project

December Usage on Pleiades Exceeds 14.09 Million SBUs and Sets New Monthly Record

Mission Impact: Increasing Pleiades’ system capacity provides mission directorates with more resources for the accomplishment of their goals and objectives.

Images from projects that were among the top users in their respective mission directorates. Clockwise from top: Simulated radiated sound field produced by a full-scale Gulfstream aircraft during landing. E. Fares, Exa Corporation; P. Moran, NASA/Ames. Snapshot of temperature contours during the combustion process in the rocket engine’s combustor. B. Richardson, D. Westra, NASA/Marshall. High-resolution lunar gravity field from the Gravity Recovery and Interior Laboratory (GRAIL) primary and extended missions. A. Konopliv, NASA/JPL.

POC: Catherine Schulbach, [email protected], (650) 604-3180, NASA Advanced Supercomputing Division

•  December usage on the Pleiades supercomputer set a new monthly record.

•  Over 14.09 million Standard Billing Units (SBUs*) were used by NASA’s science and engineering organizations, exceeding the previous record of 13.31 million SBUs (set in November 2014) by almost 6%.

•  This increase was enabled by high demand and by efficient operations that delivered over 80% system utilization (the target is 75%).

•  Over 300 projects from ARMD, HEOMD, SMD, NESC, and NAS used time on Pleiades during December.

•  Usage for the top 10 projects ranged between 267,962 and 1.07 million SBUs, and accounted for over 45% of total usage.

•  The HECC Project continues to plan and evaluate ways to address the future requirements of NASA’s users.

*1 SBU equals 1 hour of a Pleiades Westmere 12-core node.
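The bullet arithmetic above can be checked directly from the SBU definition; a quick sketch using only the numbers quoted on this slide:

```python
# Check of the usage figures above. By definition, 1 SBU = 1 hour of a
# 12-core Pleiades Westmere node, so SBU totals convert directly to
# Westmere node-hours (and core-hours). Only numbers from this slide
# are used here.

december_sbus = 14.09e6   # December 2014 record
november_sbus = 13.31e6   # previous record, November 2014

# Increase over the previous record ("almost 6%").
increase_pct = (december_sbus - november_sbus) / november_sbus * 100

# Equivalent Westmere core-hours (12 cores per node-hour).
core_hours = december_sbus * 12

print(f"{increase_pct:.1f}% increase; {core_hours:.3e} core-hours")
```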


GRAIL Lunar Gravity. PI: Alex Konopliv, JPL

HEC Project Results Quad Chart Planetary Science SMD-13-4352 Continuing As: SMD-14-5252

Project Objectives: Determine a high-resolution lunar gravity field from the GRAIL Primary and Extended Mission spacecraft data. The expected gravity half-wavelength resolution is 3-5 km; the fit involves least-squares estimation of 2.3 million parameters.

Role of High-End Computing: NASA HEC resources enable the generation of high-resolution gravity models to further scientific investigation of the lunar interior.

Figure: High-resolution lunar gravity field from the GRAIL Primary and Extended Missions.

Recent Publications:
1. "High Resolution Lunar Gravity Fields from the GRAIL Primary and Extended Mission," Alex Konopliv, Geophysical Research Letters, 2014. http://onlinelibrary.wiley.com/doi/10.1002/2013GL059066/abstract
2. "Lunar Interior Properties from the GRAIL Mission," James Williams, Journal of Geophysical Research, 2014. http://onlinelibrary.wiley.com/doi/10.1002/2013JE004559/abstract
3. "The JPL Lunar Gravity Field to Spherical Harmonic Degree 660 from the GRAIL Primary Mission," Alex Konopliv, Journal of Geophysical Research, 2013. http://onlinelibrary.wiley.com/doi/10.1002/jgre.20097/full

Co-Is/Partners: Dah-Ning Yuan, Ryan Park, and Sami Asmar, Jet Propulsion Laboratory.

Major Results:
–  Highest-resolution gravity map of the Moon (6 km; e.g., Zuber et al., Science, 2013; Konopliv et al., GRL, 2014).
–  Improved measurement of the lunar tide and lunar moment of inertia by a factor of five (Williams et al., JGR, 2014).
–  Improved, lighter-than-expected lunar crustal density estimates (see Wieczorek et al., Science, 2013).

Science Mission Directorate
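The gravity-field determination summarized in this quad chart is, at bottom, an enormous linear least-squares problem. The following toy sketch shows the same estimation pattern at trivial scale, with synthetic data and five parameters instead of 2.3 million; it is not the GRAIL pipeline:

```python
import numpy as np

# Toy linear least-squares fit, illustrating (at trivial scale) the kind
# of estimation used for the GRAIL gravity field, where ~2.3 million
# spherical-harmonic coefficients are fit to spacecraft tracking data.
# The design matrix and observations here are synthetic placeholders.

rng = np.random.default_rng(0)

n_obs, n_params = 200, 5
A = rng.normal(size=(n_obs, n_params))           # design matrix (synthetic)
x_true = np.arange(1.0, n_params + 1.0)          # "true" parameters
y = A @ x_true + 0.01 * rng.normal(size=n_obs)   # noisy observations

# Solve min ||A x - y||_2. At GRAIL scale this step requires distributed,
# out-of-core linear algebra on HEC systems rather than a dense solve.
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(x_hat, 2))
```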

Page 3: December 2014 HECC Report


Security Plan Updated and Uploaded to the New NASA Repository

POC: Thomas Hinke, [email protected], (650) 604-3662, NASA Advanced Supercomputing Division

•  The NAS Division security plan was updated to satisfy Revision 4, the latest version of the National Institute of Standards and Technology (NIST) SP 800-53, Security and Privacy Controls for Federal Information Systems and Organizations.

•  HECC/NAS security staff are required to use the recently adopted IT Security Center (ITSC), the new system for creating and managing NASA security plans.
–  Significant work was accomplished to transform the plan into the new ITSC-required template.
–  This work was made even more challenging since ITSC has not yet been upgraded to support the Revision 4 controls.

–  The ITSC assigned a new security plan number to the NAS plan: CD-ARC-TNE-Sys-3232.

–  This update effort ensures that the NAS plan complies with the latest NASA/NIST security requirements, and that the Security team is following the latest security guidance.


The process used by HECC security experts to update and transform the NASA Advanced Supercomputing (NAS) Division security plan into a new form and to a new NASA security plan repository.

Mission Impact: Keeping current with the latest National Institute of Standards and Technology and NASA security requirements helps ensure mission success through a strong, secure HECC system.

Diagram: Previous RMS plan form and repository (NAS security plan satisfying Rev 3 of NIST SP 800-53) → update plan and procedures to comply with Rev 4 of NIST SP 800-53 → IT Security Center plan form and repository (NAS security plan satisfying Rev 4 of NIST SP 800-53).

Page 4: December 2014 HECC Report


Authentication Platform for HECC and PRACA Websites Upgraded to SiteMinder

Mission Impact: Implementation of SiteMinder’s centralized platform for secure website authentication provides users the capability to authenticate with their NASA Smartcard and standardizes website authentication for developers.

The SiteMinder authentication platform provides NASA users with an option to authenticate with their NASA Smartcard, in addition to the previously available options of using an RSA SecurID token or Agency User ID.

POCs: Jeff Melin, [email protected], (650) 604-0715; Frank Cianci, [email protected], (650) 604-2559, NASA Advanced Supercomputing Division, ADNET Systems

•  The Engineering Servers and Services (ESS) team completed the agency-directed upgrade from Launchpad to CA SiteMinder on all HECC and Problem Reporting, Analysis and Corrective Action (PRACA) websites. Twenty-nine Red Hat Linux servers were upgraded.

•  Since the NASA Enterprise Applications Competency Center (NEACC) documentation covered only Windows systems, not Linux, additional research and training were required for the team to prepare for this upgrade.

•  The ESS work on the upgrades included:
–  SATERN and CA SiteMinder training.
–  Development and documentation of the Linux installation process and settings.
–  Creation of 29 agents, 3 agent groups, 27 configuration objects, and 10 applications in SiteMinder to prepare for deployment.

–  Installation and configuration of the agent on each server during dedicated time.

•  SiteMinder offers users the capability to authenticate to web services, such as the NAS account request form, using their NASA Smartcard (badge), RSA SecurID token, or Agency User ID.


Page 5: December 2014 HECC Report


Upgrade to MicroStrategy Version 9.4 Speeds HECC Monthly Report Processing

Mission Impact: Improved MicroStrategy reporting capabilities allow continued streamlining of the lengthy and complex usage accounting process required to create the HEC and HECC monthly reports.

The HECC Tools team utilizes MicroStrategy to gather and integrate supercomputer usage data into easily understood charts and slides for the monthly HEC and HECC reports.

POCs: Ryan Spaulding, [email protected], (408) 772-6567, NASA Advanced Supercomputing Division, ADNET Systems; Blaise Hartman, [email protected], (650) 604-2539, NASA Advanced Supercomputing Division, Bay Systems

•  The HECC Tools team completed the upgrade of its MicroStrategy data analytics platform from version 9.2 to version 9.4, providing a 30-45% performance improvement for processing lengthy HEC and HECC monthly reports.

•  The new release provides the following enhancements and capabilities:
–  Significantly speeds the movement of reports from development to production by allowing reports and objects to be packaged and relocated instead of having to be re-created.

–  New “Data Blending” functionality allows multiple data sources to be inserted into a single grid, simplifying the production of reports.

–  Dashboard support enables the development of daily, weekly, and monthly dashboards of usage accounting data.

•  The team thoroughly tested the new MicroStrategy version, including development testing, report checkout, and report adjustments, in order to ensure accuracy of the 42 reports and charts that are generated each month.


Page 6: December 2014 HECC Report


Forest Dynamics Scientists Use hyperwall to Explore TB-sized Forest Disturbance Dataset

Mission Impact: The hyperwall, used in conjunction with HECC visualization expertise, gives users a powerful way to explore terabyte- and petabyte-scale datasets, and can facilitate discovery of features in the data that would go unseen on desktop displays.

Chengquan Huang (University of Maryland), Gretchen Moisen (US Forest Service), and Warren Cohen (US Forest Service) visited the NAS facility on December 12 to study the results of their North American forest disturbance study on the 128-screen hyperwall.

POCs: Tim Sandstrom, [email protected], (650) 604-1429, NASA Advanced Supercomputing Division, Computer Sciences Corp.; Jennifer Dungan, [email protected], (650) 604-3618, NASA Ames Research Center

•  The HECC Visualization group hosted North American Forest Dynamics (NAFD) scientists in December to help explore the results of a terabyte-sized forest disturbance dataset on the hyperwall.
–  The NAFD team used Pleiades to process ~57,000 Landsat scenes taken over a 25-year period and to produce a series of national map products, including disturbance history, forest recovery trajectories, and spatio-temporal patterns of forest disturbance causal factors.

–  The Visualization group used their bigview tool, which allows zooming and panning of large images on the 128-screen hyperwall. Stepping through the time series of images had the effect of animating the changes to forests.

•  During two extensive visits, the scientists gained understanding of the extent and patterns of forest harvest, mountain top mining, waves of insect damage, and other phenomena. They found “slices” of data that will be used to help convey NAFD results.


Page 7: December 2014 HECC Report


Simulations Trace Ultra-High-Energy Cosmic Rays Through the Galactic Magnetic Field *

Mission Impact: Enabled by NASA’s Pleiades supercomputer, new simulations help scientists improve their understanding of the galactic magnetic field—crucial for interpreting diverse astrophysical phenomena such as dark matter and cosmic rays.

Visualization of a galactic magnetic field model with 3D ultra-high-energy cosmic ray (UHECR) trajectories. The UHECRs shown (thick lines) are about to reach Earth (blue dot). Those from the opposite side of the Milky Way are typically deflected by the galactic magnetic field (thin lines) and cannot reach Earth, except at the highest energies. Chris Henze, Tim Sandstrom, NASA/Ames

POCs: Michael Sutherland, [email protected], (614) 292-5713, Ohio State University; Glennys Farrar, [email protected], (212) 992-8787, New York University

•  Using the most sophisticated galactic magnetic field (GMF) model to date, scientists have run the first high-resolution simulations ever conducted of ultra-high-energy cosmic ray (UHECR) trajectories through the Milky Way.

•  The simulations, run on Pleiades, backtrack the particle trajectories from Earth through the GMF. Results are helping to reveal the origin and composition of the UHECRs, while simultaneously improving our understanding of the GMF itself.

•  The simulations show that the impact of the GMF on the UHECR trajectories is much more profound than previously understood, and that galactic magnetic turbulence strongly affects their Earth arrival distributions.

•  Each simulation comprised over 51 million events and required an average of 250,000 Pleiades processor-hours to complete.

•  Future simulations incorporating astronomical observations will help to further refine the model and help scientists better understand the GMF, its origins, and its role in the history of the Milky Way.
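Conceptually, backtracking a cosmic ray means integrating the Lorentz-force equation of motion with the sign of the charge (equivalently, of time) flipped. The sketch below illustrates that idea only; the uniform field model, units, and step size are placeholder assumptions, not the simulation code used in this work:

```python
import numpy as np

def b_field(pos):
    """Placeholder field model: uniform B along z (assumption), standing
    in for a realistic galactic magnetic field (GMF) model."""
    return np.array([0.0, 0.0, 1.0])

def backtrack(pos, vel, charge, dt, steps):
    """Trace a trajectory backwards in time by flipping the charge sign,
    using a Boris-style rotation for the v x B force (no E field)."""
    q = -charge  # time reversal is equivalent to reversing the charge
    positions = [pos.copy()]
    for _ in range(steps):
        t = q * b_field(pos) * dt / 2.0
        v_prime = vel + np.cross(vel, t)
        s = 2.0 * t / (1.0 + np.dot(t, t))
        vel = vel + np.cross(v_prime, s)   # rotation preserves |v| exactly
        pos = pos + vel * dt
        positions.append(pos.copy())
    return np.array(positions)

# Backtrack one particle (arbitrary units): in a uniform B along z, the
# orbit stays in the x-y plane with constant speed.
traj = backtrack(np.zeros(3), np.array([1.0, 0.0, 0.0]), 1.0, 0.01, 5000)
```

The Boris rotation is used here because it conserves particle speed exactly, which keeps long backtracked orbits well behaved.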


* HECC provided supercomputing resources and services in support of this work

Page 8: December 2014 HECC Report


Pleiades Enables Efficient Computational Modeling of Launch Environments *

Mission Impact: Enabled by the Pleiades supercomputer, high-fidelity computational fluid dynamics simulations, combined with engineering analyses, will significantly improve the safety and reduce the operational cost of NASA Kennedy's redesigned launch complex for future space missions.

Computed maximum surface pressures on the NASA Kennedy launch tower structure over a time period during a heavy-lift launch scenario. Christoph Brehm, NASA/Ames

POCs: Emre Sozer, [email protected], (650) 604-0594; Cetin Kiris, [email protected], (650) 604-4485, NASA Advanced Supercomputing Division

•  Researchers at NASA Ames are utilizing Pleiades to perform the intensive computational fluid dynamics (CFD) simulations required to analyze the redesign of the Space Launch System (SLS) launch site infrastructure at NASA Kennedy.

•  The simulations are used to characterize the plume-induced pressure loads on the site's mobile launcher (ML) and tower, and leverage lower-fidelity analytical approaches to efficiently provide predictions for ~10,000 possible launch trajectories.
–  Unsteady simulations of the launch vehicle plume in free air were performed to obtain time-averaged flow statistics.
–  Subsequently, this time-averaged plume data was swept past the ML/tower structure, following the path of each trajectory variation and accounting for the corresponding vehicle orientation.

–  Total pressure of the plume was projected on the ML/tower structure while employing analytical corrections for supersonic impingement on the surfaces.

•  Pleiades' parallel computing capability was critical to performing the intensive, high-fidelity plume simulations and subsequent engineering analyses.
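The sweep-and-project step described above can be caricatured as sampling a precomputed, time-averaged plume field at structure points while the vehicle moves along a candidate trajectory, keeping the peak pressure seen at each point. Everything below (the field model, geometry, and units) is an invented placeholder, not the NASA tool chain:

```python
import numpy as np

def plume_pressure(r):
    """Hypothetical time-averaged plume field: pressure decays with
    distance from the plume source (placeholder model)."""
    return 100.0 / (1.0 + np.sum(r**2, axis=-1))

# Invented geometry: three structure points (e.g. panels on the tower)
# and a vehicle trajectory rising vertically past them.
structure = np.array([[5.0, 0.0, 0.0],
                      [5.0, 0.0, 10.0],
                      [5.0, 0.0, 20.0]])
trajectory = np.stack([np.zeros(50),
                       np.zeros(50),
                       np.linspace(0.0, 30.0, 50)], axis=1)

# For each vehicle position, evaluate the plume field at each structure
# point in the plume's frame, then keep the time-wise maximum load.
max_p = np.zeros(len(structure))
for vehicle_pos in trajectory:
    p = plume_pressure(structure - vehicle_pos)
    max_p = np.maximum(max_p, p)

# Each structure point sees its peak load roughly when the vehicle
# passes its height; in the real analysis this loop runs over ~10,000
# trajectory variations with supersonic-impingement corrections applied.
```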



Page 9: December 2014 HECC Report


HECC Facility Visitors and Tours in December 2014

Reinhard Schulte-Braucks (left), who heads the Global Monitoring for Environment and Security Unit at the European Space Agency (ESA), visited the NAS facility as part of his tour of NASA Ames. The tour was requested by the European Commission (EC) for potential collaborations with both the EC and the ESA.

POC: Gina Morello, [email protected], (650) 604-4462, NASA Advanced Supercomputing Division

•  HECC hosted 12 tour groups in December; guests learned about the agency-wide missions being supported by HECC assets, and some of the groups also viewed the D-Wave Two quantum computer system. Visitors this month included:
–  Dr. Bjorn Ottersten, Center for Security, Reliability and Trust, Luxembourg.
–  An Ames media press event for the Exploration Flight Test-1 launch was held at the NAS facility; Code T staff presented Orion graphics on the hyperwall.

–  James Schultz, staff writer for the Aeronautics Research Mission Directorate, NASA Headquarters.

–  A group of scientists visiting from the Massachusetts Institute of Technology who are working with Ames Kepler staff on the upcoming Transiting Exoplanet Survey Satellite (TESS) project.

–  Manuel Pinto de Abreu, the Secretary of State of the Sea, Portugal.

–  Arthur Payton, Counterintelligence Division Operations Manager, NASA Headquarters, talked with NAS Division management regarding possible collaboration and gaining time on Pleiades.

–  Dr. Reinhard Schulte-Braucks, head of the European Space Agency’s Global Monitoring for Environment and Security Unit, which also oversees the Copernicus Program.


Page 10: December 2014 HECC Report


Papers and Presentations

•  “High-Temperature Miscibility of Iron and Rock During Terrestrial Planet Formation,” S. M. Wahl, M. Militzer, Earth and Planetary Science Letters, vol. 410, December 1, 2014. * http://www.sciencedirect.com/science/article/pii/S0012821X14007067

•  “Martian Atmospheric Collapse: Idealized GCM Studies,” A. Soto, M. Mischna, T. Schneider, C. Lee, M. Richardson, Icarus, December 4, 2014. * http://www.sciencedirect.com/science/article/pii/S0019103514006575

•  “Galaxy and Mass Assembly (GAMA): The Galaxy Luminosity Functions within the Cosmic Web,” E. Eardley, et al., arXiv:1412.2141 [astro-ph.CO], December 5, 2014. * http://arxiv.org/abs/1412.2141

•  “Shallow Cavities in Multiple-Planet Systems,” P. C. Duffell, R. Dong, arXiv:1412.3560 [astro-ph.EP], December 11, 2014. * http://arxiv.org/abs/1412.3560

•  “Global Simulations of Axisymmetric Radiative Black Hole Accretion Discs in General Relativity with a Mean-Field Magnetic Dynamo,” A. Sadowski, et al., Monthly Notices of the Royal Astronomical Society, vol. 447, issue 1, December 12, 2014. * http://mnras.oxfordjournals.org/content/447/1/49.full

•  2014 American Geophysical Union Fall Meeting, December 15–19, 2014, San Francisco:
–  "Realistic Modeling of Multi-Scale MHD Dynamics of the Solar Atmosphere," I. Kitiashvili. *
–  "Advances in Realistic MHD Simulations," I. Kitiashvili. *



Page 11: December 2014 HECC Report


Papers and Presentations (cont.)

•  2014 American Geophysical Union Fall Meeting (cont.)
–  "Improving NASA GPM Data Exploitation Through Cloud-Resolving Model Empowered with Parallel AsyncIO Management," D. Kokron, T. Clune, W.-K. Tao.
–  "Three-Dimensional Magnetic Reconnection Under Low Chromospheric Conditions Using a Two-Fluid Weakly Ionized Reactive Plasma Model," N. Mansour. *
–  "Open NASA Earth Exchange: A Public-Private Partnership for Climate Change Research," R. Nemani, et al. *
–  "Linking the Solar Interior with the Corona: Observations, Models, and Data Assimilation," I. Kitiashvili, N. Mansour (Chairs), A. Wray.
–  "Open NASA Earth Exchange (OpenNEX): Strategies for Enabling Cross Organization Collaboration in the Earth Sciences," S. Ganguly, R. Nemani, et al. *
–  "Using NASA Earth Exchange (NEX) to Develop Annual US Forest Disturbance Products," R. Nemani, et al. *
–  "The Sun's Meridional Flow and Its Role in Magnetic Flux Transport and the Sunspot Cycle," D. Hathaway, L. Upton. *
–  "Revised Sunspot Numbers and the Effects on Understanding the Sunspot Cycle," D. Hathaway. *
–  "Improving Synchronic Maps with Far-Side Active Region Emergence," D. Hathaway, L. Upton. *
–  "A Spatio-Temporal Data Mining Approach to Global Scale Burned Area Modeling," R. Nemani, N. Oza, et al. *



Page 12: December 2014 HECC Report


Papers and Presentations (cont.)

•  2014 American Geophysical Union Fall Meeting (cont.)
–  "High-Resolution Climate Data Over Conterminous US Using Random Forest Algorithm," R. Nemani, et al. *
–  "Preliminary Vegetation Index Products from Suomi NPP VIIRS Illuminate the California Drought," R. Nemani, et al. *
–  "Advancing Analytics Using Big Data Climate Information System," R. Nemani (Chair).
–  "Large-Scale Image Analytics Using Deep Learning," R. Nemani, S. Ganguly, A. Michaelis, et al. *
–  "Impact of Cumulated CO2 Emission on Air Temperature: Millennial-Scale Prediction," R. Nemani, et al. *
–  "Variability of Global Atmospheric CO2 Concentrations Over Interannual to Multi-Decadal Timescales: A Linear Approximation," R. Nemani, et al. *
–  "Using Analytics to Support Petabyte-Scale Science on the NASA Earth Exchange (NEX)," R. Nemani, S. Ganguly, A. Michaelis, P. Votava. *
–  "A Semi-Automated Machine Learning Algorithm for Tree Cover Delineation from 1-m NAIP Imagery Using a High Performance Computing Architecture," S. Ganguly, et al. *
–  "Reducing Uncertainties in Satellite-Derived Forest Aboveground Biomass Estimates Using a High Resolution Forest Cover Map," G. Zhang, et al. *
–  "Mapping Urban Expansion Across North America Using Multi-Temporal Landsat and Nighttime Lights Data," A. Michaelis, G. Zhang, S. Ganguly, R. Nemani, et al. *
–  "On the Use of FOSS4G in Land Cover Fraction Estimation with Unmixing Algorithms," U. Kumar, S. Ganguly, G. Zhang, R. Nemani, et al. *



Page 13: December 2014 HECC Report


Papers and Presentations (cont.)

•  2014 American Geophysical Union Fall Meeting (cont.)
–  "Long-Term Data Records of Biophysical Parameters from Multiple Satellite Systems," S. Ganguly, R. Nemani, G. Zhang, et al. *
–  "A Quarter-Century U.S. Forest Disturbance History Mapped from Landsat," J. Dungan, et al. *
–  "Quantum Boosting and Fast Classical Metrics for Tree Cover Detection in Remote Sensing Data," S. Ganguly, R. Nemani, et al. *
–  "Cost Optimal Elastic Auto-Scaling in Cloud Infrastructure," S. Ganguly, R. Nemani, et al. *
–  "Geography of Global Forest Carbon Stocks & Dynamics," S. Ganguly, R. Nemani, et al. *
–  "Mapping Drought Impacts on Agricultural Production in California's Central Valley," F. Melton, A. Guzman, et al. *
–  "Integrating Parallel Distributed Data Mining Algorithms into the NASA Earth Exchange (NEX)," N. Oza, R. Nemani, A. Michaelis, et al. *
•  "Electron and Ion Heating by Whistler Turbulence: Three-Dimensional Particle-in-Cell Simulations," R. S. Hughes, S. P. Gary, J. Wang, Geophysical Research Letters, December 17, 2014. * http://onlinelibrary.wiley.com/doi/10.1002/2014GL062070/full
•  "Asynchronous Amazon Forest Canopy Phenology Indicates Adaption to Both Water and Light Availability," M. Jones, J. Kimball, R. Nemani, Environmental Research Letters, December 18, 2014. * http://iopscience.iop.org/1748-9326/9/12/124021/pdf/1748-9326_9_12_124021.pdf



Page 14: December 2014 HECC Report


Papers and Presentations (cont.)

•  “Radiation from Particles Accelerated in Relativistic Jet Shocks and Shear-flows,” K.-I. Nishikawa, et al., arXiv:1412.7064v1 [astro-ph.HE] December 22, 2014. http://arxiv.org/abs/1412.7064 *

•  “A Steady-State Picture of Solar Wind Acceleration and Charge State Composition Derived from a Global Wave-Driven MHD Model,” R. Oran, et al., arXiv:1412.8288v1 [astro-ph.SR] December 29, 2014. http://arxiv.org/abs/1412.8288 *



Page 15: December 2014 HECC Report


News and Events

•  NASA at the AGU Fall Meeting, December 15-19, 2014, San Francisco—NAS Division staff attended the American Geophysical Union’s fall meeting, one of the largest Earth and space science conferences in the world. They presented their work to the public at the NASA booth’s hyperwall on the exhibit floor, chaired paper sessions, and displayed posters of some of the year’s accomplishments in the poster hall.

•  Northern California NASA Scientists Bring Drought Research Home, NASA Ames press release, December 15, 2014—Researchers at NASA Ames Research Center utilized the NASA Earth Exchange (NEX) platform and HECC systems at the NAS facility to study satellite data, in order to analyze and project the effects of California's record drought. http://www.nasa.gov/ames/northern-california-nasa-scientists-bring-drought-research-home

•  The Greenland Ice Sheet: Now in HD, Ohio State University press release, December 18, 2014—Researchers at OSU hope to expand their project, which turns satellite images into publicly available elevation maps, to the Pleiades supercomputer starting in 2015. http://news.osu.edu/news/2014/12/18/the-greenland-ice-sheet-now-in-hd/


Page 16: December 2014 HECC Report

HECC Utilization

[Stacked bar chart, December 2014: utilization (0-100%) of the Pleiades, Endeavour, and Merope production systems, broken down by state: Used, Boot, Degraded, Down, Dedicated, No Jobs, Not Schedulable, Queue Not Schedulable, Held, Insufficient CPUs, Unused Devel Queue, Specific CPUs, Limits Exceeded, Dedtime Drain, Job Drain, and Share Limit.]

Page 17: December 2014 HECC Report

HECC Utilization Normalized to 30-Day Month

[Bar chart: Standard Billing Units (0 to 14,000,000) used by NAS, NLCS, NESC, SMD, HEOMD, and ARMD, shown against the allocations to organizations.]

Page 18: December 2014 HECC Report

HECC Utilization Normalized to 30-Day Month

[Three panels plot Standard Billing Units in millions (0-8) over time for ARMD, HEOMD+NESC, and SMD, each shown against its allocation. Numbered annotations mark system configuration changes:
1. Columbia 21, 23, and 24 retired; Endeavour 2 added
2. Columbia 22 retired; Endeavour 1 added
3. 32 Harpertown racks retired
4. 32 Harpertown racks retired; 46 Ivy Bridge racks added
5. 6 Ivy Bridge racks added; 20 Nehalem and 12 Westmere racks retired
6. 8 Ivy Bridge racks added mid-Feb; 8 Ivy Bridge racks added late Feb
7. 4 Ivy Bridge racks added mid-March
8. 6 Westmere racks added to Merope; Merope Harpertown retired
9. 16 Westmere racks retired; 10 Nehalem racks and 2 Westmere racks added to Merope; 3 Ivy Bridge racks added; 15 Haswell racks added
10. 16 Westmere racks retired]

Page 19: December 2014 HECC Report

Tape Archive Status

[Bar chart, December 2014: petabytes (0-120) of Unique File Data, Unique Tape Data, Total Tape Data, Tape Capacity, and Tape Library Capacity, with usage broken down by HECC, Non Mission Specific, NAS, NLCS, NESC, SMD, HEOMD, and ARMD, and capacity shown against used.]

Page 20: December 2014 HECC Report

Tape Archive Status

[Time-series chart: petabytes (0-120) of Unique Tape Data, Total Tape Data, Tape Capacity, and Tape Library Capacity over time.]

Page 21: December 2014 HECC Report

Pleiades: SBUs Reported, Normalized to 30-Day Month

[Bar chart: Standard Billing Units (0 to 14,000,000) reported for NAS, NLCS, NESC, SMD, HEOMD, and ARMD, shown against the allocations to organizations.]

Page 22: December 2014 HECC Report

Pleiades: Devel Queue Utilization

[Bar chart: Standard Billing Units (0 to 1,600,000) used in the devel queue by NAS, NLCS, NESC, SMD, HEOMD, and ARMD, shown against the devel queue allocation.]

Page 23: December 2014 HECC Report

Pleiades: Monthly Utilization by Job Length

[Bar chart, December 2014: Standard Billing Units (0 to 3,000,000) by job run time: 0-1, >1-4, >4-8, >8-24, >24-48, >48-72, >72-96, >96-120, and >120 hours.]

Page 24: December 2014 HECC Report

Pleiades: Monthly Utilization by Size and Mission

[Stacked bar chart, December 2014: Standard Billing Units (0 to 3,500,000) by job size, from 1-32 up to 16385-32768 cores, broken down by mission: NAS, NLCS, NESC, SMD, HEOMD, and ARMD.]

Page 25: December 2014 HECC Report

Pleiades: Monthly Utilization by Size and Length

[Stacked bar chart, December 2014: Standard Billing Units (0 to 3,500,000) by job size, from 1-32 up to 16385-32768 cores, broken down by job run time, from 0-1 hours up to >120 hours.]

Page 26: December 2014 HECC Report

Pleiades: Average Time to Clear All Jobs

[Line chart: hours (0-192) to clear all jobs, January 2014 through December 2014, for ARMD, HEOMD/NESC, and SMD.]

Page 27: December 2014 HECC Report

Pleiades: Average Expansion Factor

[Line chart: average expansion factor (1.00-8.00), January 2014 through December 2014, for ARMD, HEOMD, and SMD. Callout values: ARMD 6.63, HEOMD 15.87, SMD 9.06.]

Endeavour: SBUs Reported, Normalized to 30-Day Month (slide 28)

[Chart: Standard Billing Units (0 to 100,000) by mission organization: NAS, NLCS, NESC, SMD, HEOMD, ARMD, with allocations to organizations shown for comparison.]

Endeavour: Monthly Utilization by Job Length (slide 29)

[Chart: Standard Billing Units (0 to 15,000) by job run time, from 0–1 hours to >120 hours. December 2014.]

Endeavour: Monthly Utilization by Size and Mission (slide 30)

[Chart: Standard Billing Units (0 to 15,000) by job size (1–32 through 513–1,024 cores), broken out by mission organization: NAS, NESC, SMD, HEOMD, ARMD. December 2014.]

Endeavour: Monthly Utilization by Size and Length (slide 31)

[Chart: Standard Billing Units (0 to 15,000) by job size (1–32 through 513–1,024 cores), broken out by job run time, from 0–1 hours to >120 hours. December 2014.]

Endeavour: Average Time to Clear All Jobs (slide 32)

[Chart: hours (0 to 48) by month, Jan-14 through Dec-14, for ARMD, HEOMD/NESC, and SMD.]

Endeavour: Average Expansion Factor (slide 33)

[Chart: average expansion factor (1.00 to 8.00) by month, Jan-14 through Dec-14, for ARMD, HEOMD, and SMD.]

Merope: SBUs Reported, Normalized to 30-Day Month (slide 34)

[Chart: Standard Billing Units (0 to 150,000) by mission organization: NAS, NLCS, NESC, SMD, HEOMD, ARMD.]

Merope: Monthly Utilization by Job Length (slide 35)

[Chart: Standard Billing Units (0 to 120,000) by job run time, from 0–1 hours to >120 hours. December 2014.]

Merope: Monthly Utilization by Size and Mission (slide 36)

[Chart: Standard Billing Units (0 to 140,000) by job size (1–32 through 4,097–8,192 cores), broken out by mission organization: NAS, NESC, SMD, HEOMD, ARMD. December 2014.]

Merope: Monthly Utilization by Size and Length (slide 37)

[Chart: Standard Billing Units (0 to 140,000) by job size (1–32 through 4,097–8,192 cores), broken out by job run time, from 0–1 hours to >120 hours. December 2014.]

Merope: Average Expansion Factor (slide 38)

[Chart: average expansion factor (1 to 8) by month, Jan-14 through Dec-14, for ARMD, HEOMD, and SMD.]