Visualisation for WSI
Ron Fowler
Visualisation Group, STFC Scientific Computing Department
Outline
• Visualisation and software engineering
• The SoFT project and STFC facilities
• Simple ParaView examples
• Running ParaView remotely and in parallel
• In-situ visualisation
Visualisation in CFD
• Used to verify simulation results
– Are the simulation results correct?
– “Integration testing” level.
• Used to understand simulation results, e.g.
– Where do the largest forces occur on a structure?
• Monitor long running simulations
– Is the simulation running correctly?
– Should it be aborted?
– “Computational steering”.
Example: wave tank simulation: focused wave
• Hildebrandt & Sriram 2014
• Used to compare simulation and experiment.
• Simulation gives insight into the flow.
Using PyFoam to monitor an OpenFOAM simulation
• Simple dynamic monitoring of a run
• Spot errors, estimate time to finish
• Stop/restart the job if needed
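In practice a tool such as PyFoam's plot watcher drives this kind of monitoring; the sketch below shows the underlying idea in plain Python, parsing initial residuals out of OpenFOAM-style solver log lines. The sample log fragment and field names are invented for illustration, and the line format is only an approximation of a real solver log.

```python
import re

# Patterns for OpenFOAM-style solver log lines (format illustrative).
TIME_RE = re.compile(r"^Time = ([\d.eE+-]+)")
RESID_RE = re.compile(r"Solving for (\w+), Initial residual = ([\d.eE+-]+)")

def parse_log(text):
    """Return {time: {field: initial_residual}} from a solver log."""
    history, current = {}, None
    for line in text.splitlines():
        m = TIME_RE.match(line)
        if m:
            current = float(m.group(1))
            history[current] = {}
            continue
        m = RESID_RE.search(line)
        if m and current is not None:
            history[current][m.group(1)] = float(m.group(2))
    return history

# Invented sample log fragment.
sample = """\
Time = 0.001
smoothSolver:  Solving for Ux, Initial residual = 0.05, Final residual = 1e-06, No Iterations 3
Time = 0.002
smoothSolver:  Solving for Ux, Initial residual = 0.01, Final residual = 1e-07, No Iterations 2
"""

hist = parse_log(sample)
print(hist)  # falling residuals suggest the run is healthy
```

A watcher would tail the live log file, re-plot the residual history each step, and flag a run whose residuals stall or diverge.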
The SoFT Project
• Project members: MMU, UoP & STFC
• Developing a software virtual wave tank environment
• Combining NLFP, incompressible and full NS solvers
• Using parallel HPC and high-end visualisation
STFC and Hartree facilities
• Hartree HPC: Blue Joule, iDataPlex, phase 2 systems
• Fast parallel file system
• High-speed connections to visualisation rooms
• Large-memory visualisation nodes.
Facilities: 3D stereo and head tracking
• 3D stereo active glasses for room display.
• Head tracking allows immersive VR.
• ParaView supports 3D and immersive displays.
• Oculus Rift etc. may be supported in the future.
Visualisation tools for CFD
• Commercial tools
– EnSight
– FieldView
– TecPlot
– Matlab
– Avizo Wind
– Etc.
• Open source tools
– ParaView
– VisIt
– VTK
– OpenDX
– SciPython/matplotlib
– Mayavi
ParaView
• Open Source
– Support via Kitware (US)
• 70+ readers:
– OpenFOAM, VTK, EnSight, Fluent, NetCDF, TecPlot…
• Range of filters:
– Isosurfaces, vectors, streamlines, volume vis., derived data, mesh quality
• Animation and video
– To screen and file
• Parallel/remote render
– HPC rendering
• In-situ visualisation
– Catalyst allows visualisation in simulation
BASIC USE OF PARAVIEW
2D dam break in ParaView & VisIt
Camera animation in ParaView
Wave tank simulation
Using ParaView remotely
• Why remote?
– Focused wave “coarse”: 52 Gbytes
– 2^3 mesh refinement: 416 Gbytes
– 4^3 mesh refinement: 3328 Gbytes
• Too large for local machines
– Need parallel file system, large memory
– Remote rendering possible
– Ideally on same HPC system as simulation
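The jump in data size follows directly from the refinement: halving the cell size in all three dimensions multiplies the cell count, and hence the output volume, by 2³ = 8 per level. A quick check against the figures on the slide:

```python
# Each refinement level halves the cell size in x, y and z,
# so the data volume grows by a factor of 2**3 = 8 per level.
coarse_gb = 52
sizes = [coarse_gb * 8**level for level in range(3)]
print(sizes)  # [52, 416, 3328] Gbytes, matching the slide
```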
ParaView on HPC
• Simple rendering: on the client
• Heavy rendering: on the server – needs graphics cards and X11, or many nodes (OSMesa)
Remote visualisation: Hartree
• Hartree HPC visualisation nodes:
– 64 Gbytes, 32 cores
– 2 Nvidia Quadro 5000 cards
– Mount parallel file system
• Access to pvserver via ssh:
– Can use parallel readers (MPI)
– User access to remote X11 server for remote rendering
• Examples used client-server
Diagram: ParaView client on the desktop connects over ssh to pvservers (MPI) on the HPC system.
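One common way to wire this up is an ssh tunnel that forwards the pvserver port back to the client, with pvserver launched under MPI on the remote node. The sketch below only assembles the command line; the host name, port and process count are hypothetical placeholders, and `mpirun` stands in for whatever MPI launcher the site provides.

```python
import shlex

# Hypothetical host, port and process count -- adjust for your site.
host, port, nprocs = "vis.hartree.example", 11111, 8

# Start pvserver under MPI on the remote node; the ParaView client then
# connects to localhost:11111, which the tunnel forwards to the server.
remote_cmd = f"mpirun -np {nprocs} pvserver --server-port={port}"
ssh_cmd = ["ssh", "-L", f"{port}:localhost:{port}", host, remote_cmd]

print(shlex.join(ssh_cmd))
```

With the tunnel up, the ParaView GUI's "Connect" dialog points at localhost on the forwarded port.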
ParaView Catalyst
• Integrate visualisation into the simulation
– Full data dumps are expensive, slow and large
– Visualisation output is small, e.g. an isosurface << full data
– Add an adaptor to the code
– Define the visualisation as a Python script
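The data-reduction argument can be made concrete with a back-of-envelope estimate (all numbers below are assumed for illustration): a volume field scales with N³ cells, while an isosurface extracted from it touches on the order of N² cells.

```python
# Assumed numbers for illustration only.
n = 1024                 # cells per edge of a cubic mesh
bytes_per_value = 8      # double-precision scalar field

full_dump = n**3 * bytes_per_value   # whole volume field on disk
# An isosurface cuts the volume, touching on the order of n**2 cells;
# assume ~2 triangles per cut cell, 3 vertices * 3 coordinates each.
isosurface = n**2 * 2 * 3 * 3 * bytes_per_value

print(full_dump // isosurface)  # -> 56: the dump is ~57x the isosurface
```

The exact ratio depends on the mesh and the surface, but the N³ vs N² scaling is why writing only the extracted visualisation is so much cheaper than dumping every field.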
ParaView Catalyst
• Advantages:
– Reduces output data
– Visualisation as the simulation proceeds
– PV scripting for the visualisation pipeline
– Support for Code_Saturne
• Disadvantages:
– Needs an adaptor (none for OpenFOAM)
– Visualisation pipeline must be defined in advance
– Increases the size of the HPC job