Underworld

A parallel, python, particle-in-cell, finite-element code for Geodynamics

Underworld is a Python-friendly geodynamics code which provides a programmable and flexible front end to all of the functionality of the code running in a parallel HPC environment. This gives the user significant advantages: access to the power of Python libraries for setting up complex problems, analysis at runtime, problem steering, and coupling of multiple problems. Underworld2 is integrated with the literate programming environment of the Jupyter notebook system, both for tutorials and as a teaching tool for solid Earth geoscience.
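As a rough illustration, the snippet below sketches the kind of notebook-style driver this front end enables: a Cartesian mesh, mesh variables for velocity, pressure and temperature, and a Stokes system assembled and solved entirely from Python. The class and keyword names follow the Underworld2 tutorial notebooks but may differ between releases, so treat this as a sketch rather than a verbatim recipe.

    import underworld as uw

    # Cartesian Q1/dQ0 mesh on the unit square
    mesh = uw.mesh.FeMesh_Cartesian(elementType="Q1/dQ0", elementRes=(64, 64),
                                    minCoord=(0., 0.), maxCoord=(1., 1.))

    velocityField    = uw.mesh.MeshVariable(mesh=mesh,         nodeDofCount=2)
    pressureField    = uw.mesh.MeshVariable(mesh=mesh.subMesh, nodeDofCount=1)
    temperatureField = uw.mesh.MeshVariable(mesh=mesh,         nodeDofCount=1)

    # free-slip velocity conditions on all walls
    iWalls = mesh.specialSets["MinI_VertexSet"] + mesh.specialSets["MaxI_VertexSet"]
    jWalls = mesh.specialSets["MinJ_VertexSet"] + mesh.specialSets["MaxJ_VertexSet"]
    freeslipBC = uw.conditions.DirichletCondition(variable=velocityField,
                                                  indexSetsPerDof=(iWalls, jWalls))

    # isoviscous Stokes flow driven by thermal buoyancy (Ra value is illustrative)
    Ra = 1.0e4
    stokes = uw.systems.Stokes(velocityField=velocityField,
                               pressureField=pressureField,
                               conditions=[freeslipBC],
                               fn_viscosity=1.0,
                               fn_bodyforce=Ra * temperatureField * (0., 1.))
    uw.systems.Solver(stokes).solve()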

Underworld2 is an open-source, particle-in-cell finite element code tuned for large-scale geodynamics simulations. The numerical algorithms allow history information to be tracked through the high-strain deformation associated with fluid flow (for example, transport of the stress tensor in a viscoelastic, convecting medium, or advection of fine-scale damage parameters by the large-scale flow). The finite element mesh can be static or dynamic, but it is not constrained to move in lock-step with the evolving geometry of the fluid. This hybrid approach is very well suited to complex fluids, which is how the solid Earth behaves on geological timescales.
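In practice, the history is carried on a Lagrangian particle swarm embedded in the mesh. The sketch below continues from the mesh and velocity field above (again using names drawn from the Underworld2 examples, so take the details as indicative): it adds a swarm with per-particle history variables and advects it through the velocity solution while the mesh itself stays fixed.

    # a swarm of material points carries the history variables through the flow
    swarm = uw.swarm.Swarm(mesh=mesh)
    swarmLayout = uw.swarm.layouts.PerCellSpaceFillerLayout(swarm=swarm, particlesPerCell=20)

    # per-particle history data, e.g. a material index and an accumulated strain
    materialIndex = swarm.add_variable(dataType="int",    count=1)
    plasticStrain = swarm.add_variable(dataType="double", count=1)
    swarm.populate_using_layout(layout=swarmLayout)
    materialIndex.data[:] = 0
    plasticStrain.data[:] = 0.

    # advect the particles through the computed velocity field; the mesh does not move
    advector = uw.systems.SwarmAdvector(swarm=swarm, velocityField=velocityField, order=2)
    dt = advector.get_max_dt()
    advector.integrate(dt)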

This website helps users of Underworld2 get started and provides a place to discuss the code with the development team and other users (through the comments / discussion group). Developers can also turn to the GitHub repository for source code, issue tracking, and pre-release features.

Governance

Underworld is funded by AuScope, which is part of the Australian Government's NCRIS initiative to provide community research infrastructure (please see www.auscope.org.au for more information).

The Underworld2 development team, led by Louis Moresi, is based in Australia at the Australian National University, the University of Melbourne, and Monash University.

All development is overseen by a steering committee drawn from the supporting organisations and representatives from the Underworld community.

Acknowledgements

We would like to acknowledge AuScope Simulation, Analysis and Modelling for providing the long-term funding which has made the project possible. Additional funding for specific improvements and additional functionality has come from the Australian Research Council (http://www.arc.gov.au). The Python toolkit was funded by the NeCTAR eresearch_tools program. Underworld2 was originally developed in collaboration with the Victorian Partnership for Advanced Computing.

Background

The numerical methods have been published in detail in Moresi et al. (2002, 2003). These papers dealt primarily with 2D applications, but in recent years we have introduced a number of improvements to the method that enable us to scale problems to 3D (Moresi et al., 2007). For example, we use a fast discrete Voronoi method to compute the integration weights of the particle-to-mesh mapping efficiently (Velić et al., 2009). We have also concentrated on extremely robust solvers / preconditioners, which are necessary because the material variations and geometrical complexity are both large and unpredictable at the start of a simulation.

The benefit of this approach comes from the separation of the computational mesh from the swarm of points which tracks the history. This allows us to retain a much more structured computational mesh than the deformation / material history would otherwise allow. We can take full advantage of the most efficient geometric multigrid solvers, and there is no need to preserve structure during any remeshing operations we undertake (for example, when we need to track a free surface or an internal interface), as in the sketch below. Although enforcing this separation introduces several complexities, we find that the benefits, for our particular class of problems, are significant.
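To make the separation concrete, the fragment below (continuing from the earlier sketches, and assuming the deform_mesh context manager from the Underworld2 API; the exact call may differ between releases) moves the mesh nodes, for instance towards a free surface, without touching the particle swarm, which remains purely Lagrangian.

    import numpy as np

    # illustrative only: deform the mesh while the swarm and its history are unaffected
    with mesh.deform_mesh():
        # depress the upper surface by up to 1% of the domain depth; the base stays fixed
        mesh.data[:, 1] = mesh.data[:, 1] * (1.0 - 0.01 * np.cos(np.pi * mesh.data[:, 0]))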

Implementation and parallelism

Underworld2 builds upon the StGermain framework described in Quenette et al. (2007), which provides the essential infrastructure for managing I/O, meshes, particle swarms, and finite element operations in a parallel (domain decomposition, message passing) environment. The numerical solvers are based around the PETSc software suite, which focuses on delivering good parallel scalability (up to thousands of cores). Our experience to date shows good scalability for thermal problems to 2000+ cores, but for the relatively memory-intensive dynamic simulations, scaling drops off beyond roughly 500 cores.
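From the user's side, the same Python driver runs unchanged in parallel: the mesh and swarm are decomposed across processes internally, and the script is simply launched under MPI. The only explicitly parallel code a model typically needs is rank-guarded output, sketched here with mpi4py (assumed to be available alongside the PETSc stack); the script name is illustrative.

    from mpi4py import MPI

    rank = MPI.COMM_WORLD.Get_rank()

    # ... build the mesh, swarm and systems exactly as in the serial sketches above ...

    # only one process writes the summary output
    if rank == 0:
        print("running on {} processes".format(MPI.COMM_WORLD.Get_size()))

    # launched as, for example:
    #     mpirun -np 512 python convection_model.py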

References

Moresi, L., Dufour, F., and Mühlhaus, H.-B., 2002, Mantle convection modeling with viscoelastic/brittle lithosphere: Numerical methodology and plate tectonic modeling: Pure and Applied Geophysics, v. 159, no. 10, p. 2335–2356, doi: 10.1007/s00024-002-8738-3.

Moresi, L., Dufour, F., and Mühlhaus, H.-B., 2003, A Lagrangian integration point finite element method for large deformation modeling of viscoelastic geomaterials: Journal of Computational Physics, v. 184, no. 2, p. 476–497.

Moresi, L., Quenette, S., Lemiale, V., Mériaux, C., Appelbe, W., and Mühlhaus, H.-B., 2007, Computational approaches to studying non-linear dynamics of the crust and mantle: Physics of the Earth and Planetary Interiors, v. 163, p. 69–82, doi: 10.1016/j.pepi.2007.06.009.

Quenette, S., Moresi, L. N., Sunter, P. D., and Appelbe, W. F., 2007, Explaining StGermain: An aspect oriented environment for building extensible computational mechanics modeling software: presented at the HIPS 2007 Workshop, Proceedings of the 19th IEEE International Parallel and Distributed Processing Symposium.

Velić, M., May, D. A., and Moresi, L. N., 2009, A fast robust algorithm for computing discrete Voronoi diagrams: Journal of Mathematical Modelling and …, doi: 10.1007/s10852-008-9097-6.