
NSF Engineering and High Performance Computing

Pramod P. Khargonekar, Assistant Director

National Science Foundation, Directorate of Engineering

The Directorate for Engineering

Image credits, clockwise from top left: Electronic Visualization Laboratory, University of Illinois at Chicago; Philip DeCamp and Deb Roy; Integrated Surface Technologies, Menlo Park, CA; Wentai Liu, UCLA; Erik Demaine and Martin Demaine, MIT

Investing in engineering research and education and fostering innovations for benefit to society

Directorate for Engineering (ENG)

Leadership:
• Assistant Director: Pramod Khargonekar
• Deputy Assistant Director: Grace Wang
• Senior Advisor: Mihail Roco
• Program Director for Strategic Operations: Cheryl Albus
• Program Director for Evaluation and Assessment: Alexandra Medina-Borja

Divisions:
• Emerging Frontiers and Multidisciplinary Activities (EFMA): Sohi Rastegar
• Chemical, Bioengineering, Environmental, and Transport Systems (CBET): JoAnn Lighty (DD)
• Civil, Mechanical, and Manufacturing Innovation (CMMI): Deborah Goodings (DD)
• Electrical, Communications, and Cyber Systems (ECCS): Samir El-Ghazaly (DD)
• Engineering Education and Centers (EEC): Don Millard (acting DD)
• Industrial Innovation and Partnerships (IIP): Joseph Hennessy (acting DD)

Directorate for Engineering: division program areas, spanning fundamental to translational research
• IIP: Academic Partnerships; Small Business Partnerships
• EEC: Engineering Research Centers; Engineering Education; Engineering Workforce
• EFMA: EFRI
• CBET: Chemical, Biochemical, and Biotechnology Systems; Biomedical Engineering and Engineering Healthcare; Environmental Engineering and Sustainability; Transport and Thermal Fluids Phenomena
• CMMI: Advanced Manufacturing; Mechanics and Engineering Materials; Resilient and Sustainable Infrastructure; Systems Engineering and Design
• ECCS: Electronics, Photonics, and Magnetic Devices; Communications, Circuits, and Sensing Systems; Energy, Power, and Adaptive Systems

ENG Initiatives and Priorities Address National Interests
• Innovations at the Nexus of Food, Energy, and Water Systems
• Risk and Resilience
• Cyber-Enabled Materials, Manufacturing, and Smart Systems
  – Advanced Manufacturing
• Understanding the Brain
• National Nanotechnology Initiative
• Optics and Photonics
• Education and Broadening Participation
  – IUSE: Improving Undergraduate Science and Engineering
  – INCLUDES: Inclusion across the Nation of Communities of Learners that have been Underrepresented for Diversity in Engineering and Science
• Innovation Corps

ENG R&RA Budget ($M)

            FY 2014    FY 2015    FY 2016    Change over FY 2015 Estimate
            Actual*    Estimate   Request    Amount      Percent
CBET        $167.76    $177.82    $192.26    $14.44      8.1%
CMMI        195.23     209.52     222.73     13.21       6.3%
ECCS        100.37     110.43     119.24     8.81        8.0%
EEC         119.50     117.49     110.39     -7.10       -6.0%
IIP         205.99     226.98     248.11     21.13       9.3%
SBIR/STTR   159.99     177.11     194.36     17.25       9.7%
EFMA        44.27      50.07      56.49      6.42        12.8%
ENG TOTAL   $833.12    $892.31    $949.22    $56.91      6.4%

* FY 2014 actuals were adjusted to reflect EFMA reallocations in order to facilitate comparison across fiscal years.

Engineering and Computing
• Engineering has a deep and long-standing connection with computing
  – Computational modeling, data, simulation, and optimization are pervasive in all fields of engineering
  – Sensors, networks, and computation are deeply integrated into engineered systems, manufacturing, and services
  – The Engineering Directorate also supports research that enables advances in computing


ENG and CIF21 Activities (FY 2015)

[Figure: CIF21 programs for data-intensive science and community building, including BIGDATA, DIBBs, CDS&E, NRT-DESE, SI2, EarthCube, BCC, AIP, RIDIR, and NCN]

Engineering is involved in:
• CDS&E: Computational and Data-Enabled Science and Engineering
• DIBBs: Data Infrastructure Building Blocks
• NCN: Network for Computational Nanotechnology
• SI2: Software Infrastructure for Sustained Innovation
• BIGDATA
• NRT

ENG and Cloud Computing


Some Examples


NanoHUB and NCN


Science and engineering gateway for computational nanotechnology

HPC and Bio-Engineering
• First-principles modeling of biomolecular systems
  – Protein folding, complexes, ion channels, reaction kinetics
• From genome annotation to reverse engineering of biological systems
  – Metabolic network reconstruction
  – Regulatory network analysis and reconstruction
  – Populations and ecosystems
• From computational molecular biology to computational cell biology
  – Whole-cell modeling
  – Tissue and organ modeling
  – Virtual organisms
• Bioengineering
  – Structural, electrical, and physiological modeling

HPC in Power Systems and Smart Grid
• HPC is used primarily for dynamic simulation and operations studies of the power grid
  – Off-line: a simulation environment for large-scale systems to evaluate designs and proposed controls under different conditions
  – On-line:
    • faster-than-real-time simulation to investigate a large number of contingencies (outages, events); a toy version of this screening loop is sketched after this list
    • fast control calculation in case of imminent risk to the grid
• Challenges
  – Extreme scale (potentially millions of components)
  – Large geographic reach
  – Modeling and data validity
  – Integrating real-time measurements with simulation
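To make the on-line use case concrete, here is a minimal sketch of DC power-flow N-1 contingency screening in Python. The three-bus network, reactances, limits, and injections are invented for illustration and are not from the presentation; a production screening code would run thousands of such cases in parallel on detailed dynamic models rather than a static DC approximation.

```python
# Minimal, illustrative N-1 contingency screen using a DC power-flow model.
# All network data below are made-up numbers for a 3-bus example.
import numpy as np

# Lines: (from_bus, to_bus, reactance x in p.u., thermal flow limit in p.u.)
lines = [(0, 1, 0.10, 1.5),
         (0, 2, 0.20, 1.0),
         (1, 2, 0.15, 1.0)]
injections = np.array([0.0, 1.2, -1.2])   # net injection per bus (p.u.); bus 0 is the slack

def dc_flows(active_lines):
    """Solve the DC power flow B' theta = P with bus 0 as slack; return per-line flows."""
    n = len(injections)
    B = np.zeros((n, n))
    for i, j, x, _ in active_lines:
        b = 1.0 / x
        B[i, i] += b
        B[j, j] += b
        B[i, j] -= b
        B[j, i] -= b
    theta = np.zeros(n)
    theta[1:] = np.linalg.solve(B[1:, 1:], injections[1:])   # slack angle held at 0
    return [((i, j), (theta[i] - theta[j]) / x, limit)
            for i, j, x, limit in active_lines]

# N-1 screening: drop each line in turn and flag post-contingency overloads.
for k, (i, j, _, _) in enumerate(lines):
    remaining = lines[:k] + lines[k + 1:]
    try:
        overloads = [(branch, round(flow, 2)) for branch, flow, limit in dc_flows(remaining)
                     if abs(flow) > limit]
        print(f"outage of line {i}-{j}: overloads = {overloads}")
    except np.linalg.LinAlgError:
        print(f"outage of line {i}-{j}: network islanded (singular susceptance matrix)")
```

Even in this toy form, the structure mirrors the scaling challenge on the slide: the work is one independent linear solve per contingency, which is why the on-line problem maps naturally onto large parallel machines.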


ENG & HPC: Turbulence

“Turbulence occurs over a wide range of scales. The two most resource-intensive approaches are direct numerical simulation (DNS), which resolves all scales, and large eddy simulation (LES), which resolves only the large scales but models the small scales. The largest DNS performed on Blue Waters at UIUC uses over half a trillion grid points and over 200,000 CPU cores.”

“A realistic simulation of rain formation, involving atmospheric thermodynamics, phase change, particle inertia, humidity, and radiative heat transfer, would be a worthy Exascale problem that has very significant impact. It is clear that fluid dynamics has a leading role in this endeavor.”

[*Position paper by P.-K. Yeung (Georgia Tech), L. R. Collins (Cornell), S. Elghobashi (UC Irvine), C. Meneveau (Johns Hopkins Univ.), M. W. Plesniak (George Washington Univ.), W. W. Schultz (Univ. of Michigan), and K. R. Sreenivasan (New York Univ.)]
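For a rough sense of the DNS problem size quoted above (an illustrative back-of-the-envelope estimate, not a figure from the position paper): storing three double-precision velocity components at half a trillion grid points takes about

  5 × 10^11 points × 3 components × 8 bytes ≈ 12 TB per snapshot,

or on the order of 60 MB per core when distributed across 200,000 cores, before counting pressure, scratch arrays, or checkpoint I/O.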


ENG & HPC: Integrated Systems Advanced Computing Challenges*

1. Optimum process synthesis and design of systems that span multiple temporal and spatial scales
2. Optimum product/molecular design
3. How energy and healthcare systems, both of which are complex systems with multiple stakeholders, will evolve at the product, process, and supply chain levels as we try to address the national grand challenges
4. Impact of uncertainty on all of the above open challenges (a toy illustration follows below)

[*From a CAST Division of AIChE white paper: Bri-Mathias Hodge (NREL), John Siirola (Sandia National Laboratories), Selen Cremaschi (Tulsa), Nael H. El-Farra (UC Davis), Marianthi Ierapetritou (Rutgers), Karl Schnelle (Dow)]
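As a toy illustration of challenge 4 only (not taken from the white paper, and with entirely made-up numbers): sizing a single design variable against uncertain demand already changes the answer relative to a deterministic design.

```python
# Illustrative sketch: scenario-based design under uncertainty.
# We pick one capacity to minimize capital cost plus expected shortfall penalty.
import numpy as np
from scipy.optimize import minimize_scalar

scenarios = np.array([80.0, 100.0, 130.0])   # possible demand levels (assumed)
probs     = np.array([0.3, 0.5, 0.2])        # scenario probabilities (assumed)
capital_cost_per_unit      = 2.0             # cost per unit of installed capacity
shortfall_penalty_per_unit = 5.0             # cost per unit of unmet demand

def expected_cost(capacity):
    """Capital cost plus probability-weighted penalty for unmet demand."""
    shortfall = np.maximum(scenarios - capacity, 0.0)
    return capital_cost_per_unit * capacity + probs @ (shortfall_penalty_per_unit * shortfall)

result = minimize_scalar(expected_cost, bounds=(0.0, 200.0), method="bounded")
print(f"capacity minimizing expected cost ≈ {result.x:.1f} (expected cost ≈ {result.fun:.1f})")
```

Real process-systems formulations replace this scalar with large mixed-integer, multi-scale models and thousands of scenarios, which is where HPC becomes essential.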


Some HPC Issues in Engineering Research
• Need for development of dedicated software with capabilities beyond those available in commercial packages. Development and utilization of next-generation software should involve partnerships between computer scientists, domain scientists, and hardware experts.
• Multiscale/multiphysics computational resources are relevant to several programs but are not broadly available (mostly research-level codes that are neither ready nor available for distribution and broad adoption).
• Data transfer and visualization from supercomputing facilities to local resources (the desktop) are often a challenge and a bottleneck in research.
• HPC in real time is still a challenge, given limitations in data transfer. If it were available, how would it affect how engineering research is performed?
• If more real-time large-scale simulation using cloud resources were conducted, could we have more environmentally friendly critical infrastructure?

The Directorate for Engineering

Image credits, clockwise from top left: Electronic Visualization Laboratory, University of Illinois at Chicago; Philip DeCamp and Deb Roy; Integrated Surface Technologies, Menlo Park, CA; Wentai Liu, UCLA; Erik Demaine and Martin Demaine, MIT

Investing in engineering research and education and fostering innovations for benefit to society

Modeling Multi-scale Genomic Processes from Physics Principles

[Figure: hierarchy of DNA packing, from the 2 nm DNA double helix and 11 nm nucleosome, through the 30 nm chromatin fiber, 300 nm solenoid/chromatin loops, and 700 nm condensed chromatin, up to the ~1.4 μm chromosome; adapted from Lodish et al.]

Computational cost (MD = molecular dynamics; BD = Brownian dynamics):
• Histone-DNA interactions, atomistic MD (~300,000 atoms): Stampede (128 CPUs) ≈ 13 days
• Chromatin with crowding, coarse-grained BD (10,000,000 crowders): Stampede (128 CPUs) ≈ 20 days
• Chromatin, atomistic MD: 128 CPUs ≈ 112 years
• Multiple chromatin/crowding runs, coarse-grained BD: 128 CPUs ≈ 1 year

References: Lodish et al., Molecular Cell Biology, 2000; Tachiwana H. et al. (2011), Acta Cryst. D67, 578-583; Kim J. S. et al. (2011), Phys. Biol. 8, 015004; Kim J. S. et al., Phys. Rev. Lett. 106, 168102.
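Taken at face value, the slide's own timings show why coarse-graining is the enabling step: atomistic MD of chromatin on 128 CPUs (≈ 112 years, about 41,000 days of wall-clock time) versus coarse-grained BD of the same system (≈ 20 days) is roughly a 2,000-fold difference at the same core count.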

Multiscale Modeling of Nanocarriers for Drug Delivery
• Motion of red cells in flow displaces nanocarriers toward the blood vessel wall
• Nanocarriers enter the cell-free layer near the endothelium
• Ligands coating the nanocarrier surface bind to endothelial receptors

Scales and methods (a minimal Brownian-dynamics sketch of the largest scale follows below):
• Scale ≈ 10 μm: stochastic methods for trajectories
• Scale ≈ 0.1 μm: continuum methods
• Scale ≈ 10-100 nm: molecular dynamics
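As a concrete illustration of the ~10 μm stochastic-trajectory scale, here is a minimal Brownian-dynamics sketch in Python. Every parameter (plasma viscosity, carrier radius, shear rate, cell-free-layer thickness) is an assumption chosen for illustration, and the model ignores near-wall hydrodynamic corrections, red-cell collisions, and the ligand-receptor binding that the finer scales listed above would supply.

```python
# Minimal sketch (illustrative assumptions only): overdamped Brownian dynamics of a
# single rigid nanocarrier in a linear shear flow near a vessel wall.
import numpy as np

kB, T      = 1.380649e-23, 310.0     # Boltzmann constant (J/K), body temperature (K)
eta        = 1.2e-3                  # plasma viscosity, Pa*s (assumed)
a          = 100e-9                  # nanocarrier radius, m (assumed)
D          = kB * T / (6 * np.pi * eta * a)   # Stokes-Einstein diffusivity, m^2/s
shear_rate = 100.0                   # wall shear rate, 1/s (assumed)
H          = 10e-6                   # cell-free layer thickness, m (assumed)
dt, nsteps = 1e-4, 200_000           # time step (s) and number of steps

rng = np.random.default_rng(0)
x, y = 0.0, H / 2                    # start mid-layer; y is distance from the wall
first_contact = None

for step in range(nsteps):
    noise = rng.standard_normal(2) * np.sqrt(2 * D * dt)
    x += shear_rate * y * dt + noise[0]      # advection by linear shear + diffusion
    y += noise[1]                            # wall-normal motion is purely diffusive here
    y = min(max(y, a), H - a)                # reflect at the wall and at the layer edge
    if first_contact is None and y <= 1.1 * a:
        first_contact = step * dt            # hand-off point to a finer-scale binding model

if first_contact is not None:
    print(f"D = {D:.2e} m^2/s; first near-wall approach at t ≈ {first_contact:.3f} s")
else:
    print(f"D = {D:.2e} m^2/s; no near-wall approach within {nsteps * dt:.1f} s")
```

The update x(t+dt) = x(t) + u dt + sqrt(2 D dt) ξ is the standard Euler-Maruyama step for overdamped Brownian dynamics; refinements such as position-dependent mobility near the wall or margination forces from red-cell collisions are exactly what the continuum-scale and molecular-dynamics results in the multiscale pipeline would feed into this level.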