U.S. Department of Energy’s Office of Science High Performance Computing Challenges and Opportunities Dr. Daniel Hitchcock [email protected] 12/5/2005




Page 1: U.S. Department of Energy’s Office of Science High Performance Computing Challenges and Opportunities Dr. Daniel Hitchcock Daniel.Hitchcock@science.doe.gov

U.S. Department of Energy’s

Office of Science

High Performance Computing
Challenges and Opportunities

Dr. Daniel Hitchcock
[email protected]
12/5/2005

Page 2:

Office of Science

U.S. Department of Energy

2

Why is high performance computing hard?

Amdahl’s Law (no free lunch)
Moore’s Law (even when you think there’s a free lunch, there isn’t)
Software complexity

Page 3:


Amdahl’s Law

The time needed to complete a task is about the same as the sum of the times required to complete the subtasks.

T ≈ T_cpu + T_comm + T_io

T ≈ T_mesh + T_comp + T_interpret

T ≈ T_growth + T_exp + T_analysis
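The sums above can be sketched in a few lines of Python. The timings are purely illustrative (assumed, not from the talk), but they show the point: making one subtask 10x faster does not make the task 10x faster, because the total is still the sum of all the parts.

```python
def total_time(t_cpu, t_comm, t_io):
    """T ~ T_cpu + T_comm + T_io: the task takes roughly the sum of its parts."""
    return t_cpu + t_comm + t_io

# Illustrative baseline: 60 s compute, 30 s communication, 10 s of I/O.
base = total_time(60.0, 30.0, 10.0)            # 100 s total

# Make the CPU part 10x faster; communication and I/O are untouched.
faster_cpu = total_time(60.0 / 10, 30.0, 10.0)  # 46 s total

speedup = base / faster_cpu
print(f"speedup from 10x faster CPU: {speedup:.2f}x")  # about 2.17x, not 10x
```

The same arithmetic applies to the other two decompositions: speeding up the experiment does not help if the analysis dominates.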

Page 4:


Moore’s Law

The number of transistors on an integrated circuit doubles about every 2 years.

A business principle, not a law of nature!

Contributes to but not directly responsible for increases in performance!

In combination with Amdahl, yields increased software complexity!
Memory is slower than CPUs; I/O is slower than memory.
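The "memory slower than CPUs" point can be made concrete with the standard average-memory-access-time model. The numbers here are illustrative assumptions (roughly cache-hit vs. DRAM latencies), not figures from the talk.

```python
def avg_access_time(hit_time_ns, miss_rate, miss_penalty_ns):
    """Average memory access time: fast cache hits plus occasional slow trips to DRAM."""
    return hit_time_ns + miss_rate * miss_penalty_ns

# Assumed latencies: 1 ns cache hit, 100 ns to main memory.
print(avg_access_time(1.0, 0.02, 100.0))  # a mere 2% miss rate triples the average to 3.0 ns
```

This is Amdahl again in miniature: however fast the CPU gets, the average access time is dominated by the slowest level it touches.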

Page 5:


CPU Performance

Clock rates increase, but power grows as the square of frequency.

Multiple instructions per clock cycle
Memory management: caches, pin management

Clock rate increases are expected to slow, and all chips will go to 4 or more CPUs by 2010.
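The shift to multicore follows from the classic CMOS dynamic-power model, P = C · V² · f, plus the fact that higher frequency typically requires higher voltage. The capacitance and voltage values below are illustrative assumptions, chosen only to show the trade-off.

```python
def dynamic_power(cap, voltage, freq):
    """Classic CMOS dynamic-power model: P = C * V^2 * f (illustrative units)."""
    return cap * voltage**2 * freq

# One core at full frequency needs a higher voltage (assumed 1.2 V vs. 1.0 V) than
# two cores each running at half frequency, yet both deliver the same aggregate rate.
one_fast = dynamic_power(1.0, 1.2, 2.0)
two_slow = 2 * dynamic_power(1.0, 1.0, 1.0)
print(one_fast, two_slow)  # the single fast core burns noticeably more power
```

Hence the prediction on this slide: once frequency scaling stalls on power, extra transistors go into more cores rather than faster ones.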

Page 6:


Computer Architecture Primer
Computers are like cities!

[Diagram: CPUs with caches, vector and serial interfaces, joined by an interconnection network]

Page 7:


[Cartoon: speech bubbles among scientists and a plague bacterium: “It’s all so complicated, maybe I’ll just go infect someone.” “Stop complaining! Plague… you won’t catch me from computer scientists or mathematicians.” “I’m way more dangerous than either of you or them!” “If I get to them first there’ll be no flesh to infect!!!”]

What does this have to do with biology?

Page 8:


Software and Hardware Complexity

The desktop of 2010 will have 4-8 processors.

Today’s 64-processor clusters will have over 500 processors.

Petascale systems may have hundreds of thousands of processors.

How to divide up the work
Dealing with hardware failure
Managing the software
Enabling science
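The first challenge in that list, dividing up the work, can be sketched as a block partition of a 1-D index space across processors. This is a minimal illustration of the idea, not any particular SciDAC tool; the function name is invented for this example.

```python
def partition(n_items, n_workers):
    """Split n_items as evenly as possible across n_workers.

    Returns a (start, stop) half-open range for each worker; the first
    n_items % n_workers workers each get one extra item.
    """
    base, extra = divmod(n_items, n_workers)
    bounds, start = [], 0
    for w in range(n_workers):
        stop = start + base + (1 if w < extra else 0)
        bounds.append((start, stop))
        start = stop
    return bounds

print(partition(10, 4))  # [(0, 3), (3, 6), (6, 8), (8, 10)]
```

At hundreds of thousands of processors, even this simple scheme meets Amdahl head-on: any worker whose range takes longer than the rest leaves everyone else waiting.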

Page 9:


How to Succeed in High Performance Computing without really dying…

Never forget Amdahl
Remember Willy Sutton (go where the money is)
The least expensive piece of software to develop is the one you can use from someone else.
Use wetware… Collaborate

Page 10:


SciDAC Wetware for Scientific Discovery

[Diagram: SciDAC components: Scientific Teams; Scientific Application Partnerships; Collaboratory Pilot Projects; Integrated Software Infrastructure Centers (ISICs); Collaboratory Tools & Middleware; Computers and Networks; sponsoring offices ASCR and BES, BER, FES, HEP, NP. Caption: “I love Wetware!!!”]