Designing a cluster for geophysical fluid dynamics applications

Göran Broström, Dep. of Oceanography, Earth Science Centre, Göteborg University
Our cluster (with Johan Nilsson, Dep. of Meteorology, Stockholm University)

• Grant from the Knut & Alice Wallenberg foundation (1.4 MSEK)
• 48-CPU cluster
• Intel P4 2.26 GHz
• 500 MB 800 MHz RDRAM per node
• SCI cards
• Delivered by South Pole
• Run by NSC (thanks Niclas & Peter)
Timescales

• Atmospheric low pressures: 10 days
• Seasonal/annual cycles: 0.1-1 years
• Ocean eddies: 0.1-1 years
• El Niño: 2-5 years
• North Atlantic Oscillation: 5-50 years
• Turnover time of the atmosphere: 10 years
• Anthropogenic forced climate change: 100 years
• Turnover time of the ocean: 4,000 years
• Glacial-interglacial timescales: 10,000-200,000 years
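These timescales are what make this a cluster problem: even a coarse ocean model advances in steps of hours, so multi-century or multi-millennial integrations mean millions of timesteps. A trivial sketch of that arithmetic, assuming (purely for illustration) a one-hour model timestep, which is not a setting taken from the talk:

    /*
     * Rough count of model timesteps needed for long climate integrations,
     * assuming (illustrative only) a one-hour model timestep.
     */
    #include <stdio.h>

    int main(void)
    {
        double dt_hours = 1.0;                  /* assumed timestep */
        double hours_per_year = 24.0 * 365.0;
        double years[] = { 100.0, 4000.0 };     /* climate change; ocean turnover */
        int i;

        for (i = 0; i < 2; i++)
            printf("%6.0f years -> %.2e timesteps\n",
                   years[i], years[i] * hours_per_year / dt_hours);
        return 0;
    }

With these assumptions, a 100-year run already needs close to a million timesteps, and the 4,000-year ocean turnover timescale needs tens of millions.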
MIT General circulation model

• General fluid dynamics solver
• Atmospheric and ocean physics
• Sophisticated mixing schemes
• Biogeochemical modules
• Efficient solvers
• Sophisticated coordinate system
• Automatic adjoint schemes
• Data assimilation routines
• Finite difference scheme (see the sketch below)
• F77 code
• Portable
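MITgcm itself is a large F77 code; as a minimal illustration of what a finite-difference scheme means here, the sketch below (in C, with made-up grid size, diffusivity and timestep, none of them taken from the model) advances 1-D diffusion with an explicit centred-difference update:

    /*
     * Minimal sketch of an explicit finite-difference update for
     * 1-D diffusion, dT/dt = kappa * d2T/dx2.
     * Grid size, diffusivity and initial condition are illustrative only.
     */
    #include <stdio.h>

    #define N 100

    int main(void)
    {
        double t[N], tnew[N];
        double kappa = 1.0e-4;                 /* diffusivity (made-up value) */
        double dx = 1.0;                       /* grid spacing */
        double dt = 0.4 * dx * dx / kappa;     /* keeps kappa*dt/dx^2 <= 0.5 for stability */
        int i, step;

        for (i = 0; i < N; i++)                /* initial condition: a single spike */
            t[i] = (i == N / 2) ? 1.0 : 0.0;

        for (step = 0; step < 1000; step++) {  /* time-stepping loop */
            for (i = 1; i < N - 1; i++)        /* centred second difference in space */
                tnew[i] = t[i] + kappa * dt / (dx * dx)
                               * (t[i + 1] - 2.0 * t[i] + t[i - 1]);
            tnew[0] = t[0];                    /* hold boundary values fixed */
            tnew[N - 1] = t[N - 1];
            for (i = 0; i < N; i++)
                t[i] = tnew[i];
        }

        printf("T at mid-point after 1000 steps: %g\n", t[N / 2]);
        return 0;
    }

The real model applies the same idea to the 3-D primitive equations on a distributed grid, which is why per-node grid size and interconnect speed matter so much below.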
Choosing interconnection

(requires a cluster to test)

Based on earlier experience, we use SCI from Dolphinics (SCALI).
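Interconnects such as SCI and Myrinet are usually compared with a ping-pong microbenchmark that measures point-to-point latency and bandwidth. The sketch below is a generic MPI example of that idea, not a benchmark from this work; the message size and repetition count are arbitrary:

    /*
     * Minimal MPI ping-pong sketch: ranks 0 and 1 bounce a message back
     * and forth and report the average round-trip time.
     * Run with two processes, e.g. "mpirun -np 2 ./pingpong".
     */
    #include <mpi.h>
    #include <stdio.h>

    #define NBYTES 8192   /* message size in bytes (arbitrary) */
    #define NREPS  1000   /* number of round trips (arbitrary) */

    int main(int argc, char **argv)
    {
        char buf[NBYTES];
        int rank, i;
        double t0, t1;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Barrier(MPI_COMM_WORLD);
        t0 = MPI_Wtime();
        for (i = 0; i < NREPS; i++) {
            if (rank == 0) {
                MPI_Send(buf, NBYTES, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
                MPI_Recv(buf, NBYTES, MPI_CHAR, 1, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
            } else if (rank == 1) {
                MPI_Recv(buf, NBYTES, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
                MPI_Send(buf, NBYTES, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
            }
        }
        t1 = MPI_Wtime();

        if (rank == 0)
            printf("avg round trip: %g us\n", (t1 - t0) / NREPS * 1.0e6);

        MPI_Finalize();
        return 0;
    }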
Our choice

• Named Otto
• SCI cards
• P4 2.26 GHz (single CPUs)
• 800 MHz RDRAM (500 MB)
• Intel motherboards (the only ones available)
• 48 nodes
• NSC (nicely in the shadow of Monolith)
Some tests on other machines

• INGVAR: 32 nodes, AMD 900 MHz, SCI
• Idefix: 16 nodes, dual PIII 1000 MHz, SCI
• SGI 3800: 96 proc., 500 MHz
• Otto: 48 nodes, P4 2.26 GHz, SCI
• MIT, LCS (?): 32 nodes, P4 2.26 GHz, Myrinet
SCI or Myrinet?

• 120*120*20 grid points (60*60*20 grid points)
• Time spent in pressure calc. (see the sketch below)
• (Oops, I used the ifc compiler for these tests)
• (1066 MHz RDRAM?)
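The pressure calculation is the communication-heavy part of such a run: in a model like MITgcm the pressure comes from an elliptic equation solved iteratively (a conjugate-gradient type solver), and each iteration involves halo exchanges plus global sums, so interconnect latency shows up directly in this timing. As an illustration only (not the model's actual solver), the sketch below times the global dot product step of such an iteration with MPI_Allreduce; the local vector length and repetition count are arbitrary:

    /*
     * Sketch: timing a global dot product with MPI_Allreduce.
     * Iterative pressure solvers (e.g. conjugate gradient) need one or two
     * such global sums per iteration, so on many nodes their cost is set
     * largely by interconnect latency. Local vector length is arbitrary.
     */
    #include <mpi.h>
    #include <stdio.h>

    #define NLOCAL 10000   /* local vector length per process (arbitrary) */
    #define NREPS  1000    /* number of timed iterations (arbitrary) */

    int main(int argc, char **argv)
    {
        double x[NLOCAL], local, global, t0, t1;
        int rank, i, r;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        for (i = 0; i < NLOCAL; i++)       /* some local data */
            x[i] = 1.0;

        MPI_Barrier(MPI_COMM_WORLD);
        t0 = MPI_Wtime();
        for (r = 0; r < NREPS; r++) {
            local = 0.0;
            for (i = 0; i < NLOCAL; i++)   /* local part of the dot product */
                local += x[i] * x[i];
            /* global sum across all processes: one latency-bound step */
            MPI_Allreduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM,
                          MPI_COMM_WORLD);
        }
        t1 = MPI_Wtime();

        if (rank == 0)
            printf("avg time per global-sum step: %g us\n",
                   (t1 - t0) / NREPS * 1.0e6);

        MPI_Finalize();
        return 0;
    }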
Conclusions

• Linux clusters are useful in computational geophysical fluid dynamics!
• SCI cards are necessary for parallel runs on >10 nodes.
• For efficient parallelization: >50*50*20 grid points per node!
• Few users - great for development.
• Memory limitations: with 48 processors at 500 MB each, 1200*1200*30 grid points is the maximum (eddy-resolving North Atlantic, Baltic Sea); see the sizing sketch below.
• For applications similar to ours, go for SCI cards + a CPU with a fast memory bus and fast memory!
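A rough check of that memory limit, assuming (purely for illustration) 8-byte double-precision reals; the number of 3-D fields held per grid point is what the arithmetic backs out, not a figure from the talk:

    /*
     * Back-of-the-envelope check of the memory limit quoted above.
     * Assumes 48 nodes with 500 MB each and 8-byte reals; solves for how
     * many double-precision values per grid point that budget allows.
     */
    #include <stdio.h>

    int main(void)
    {
        double mem_total = 48.0 * 500.0 * 1024.0 * 1024.0;  /* total bytes */
        double npoints   = 1200.0 * 1200.0 * 30.0;          /* grid points */
        double bytes_pp  = mem_total / npoints;             /* bytes per point */
        double values_pp = bytes_pp / 8.0;                  /* 8-byte reals per point */

        printf("bytes per grid point: %.0f\n", bytes_pp);           /* ~580 */
        printf("double-precision values per point: %.0f\n", values_pp);  /* ~70 */
        return 0;
    }

Roughly 70 double-precision values per grid point is consistent with a model that carries many 3-D prognostic and diagnostic fields plus solver work arrays, which is why 1200*1200*30 points is about the ceiling for 48 nodes with 500 MB each.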