Current version: 1.3.0 (December 2023).
The software solves time-harmonic inverse
wave problems for acoustic and elastic media
using the Hybridizable Discontinuous Galerkin
method for discretization.
The latest version is available at
https://gitlab.com/ffaucher/hawen.
It combines MPI
and OpenMP
parallelism to solve large-scale applications
such as Earth imaging and helioseismology.
Contact:
Please email florian.faucher [at] inria.fr
if you use hawen
or to have your applications added on the website.
Reference:
F. Faucher,
hawen: time-harmonic wave modeling and inversion using hybridizable discontinuous Galerkin discretization,
Journal of Open Source Software, 6 (57), 2021.
The Applications page illustrates several experiments using the software, in helioseismology and for forward and inverse problems.
The Documentation page contains the User's guide with all information regarding the software utilization.
The Installation page describes the steps to compile the software.
The Tutorial page provides step-by-step examples to learn how to use the software.
License: This project is distributed under the GNU General Public License v3.0.
It is provided without warranty.
It is developed by F. Faucher
(florian.faucher [at] inria.fr).
The software solves forward and inverse time-harmonic wave problems. The PDE depends on the context (acoustic, elastic, etc.) and is discretized using the Hybridizable Discontinuous Galerkin method. The following are currently available:
In addition to this dedicated web page, the README.md
file
and the user's guide documentation
also contain information on how to install the software.
Once you have downloaded or cloned the repository, the following should be present:
Upon successful compilation, the folder
The dependencies can be installed directly from the official repositories with
For the visualization of the results, ParaView should also be installed.
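As an illustration, on a Debian/Ubuntu system the dependencies might be obtained as follows; the package names are assumptions and vary across distributions:

```shell
# Hypothetical Debian/Ubuntu package names; verify against your distribution.
sudo apt-get install gfortran openmpi-bin libopenmpi-dev   # compiler and MPI
sudo apt-get install libmumps-dev                          # MUMPS direct solver
sudo apt-get install libmetis-dev                          # Metis partitioner
sudo apt-get install libscalapack-openmpi-dev liblapack-dev libblas-dev
sudo apt-get install paraview                              # visualization
```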
For dependencies installed via the official repositories, one can
use the default configuration file to compile the code:
Go to the
By default, the acoustic forward problem is compiled; you just have to use
Upon successful compilation, the output is created in the
which should show forward_helmholtz_acoustic-iso_hdg.out.
Other formulations (e.g., elastic wave, inverse problem) can be
selected by editing the file config/make.version.
It is important to run
in between compilations. For instance, to compile the elastic forward problem after
the acoustic one has been compiled, you can use
Here, instead of the sed
command, you can simply edit
the file config/make.version
with your favourite text editor.
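As a sketch, assuming config/make.version contains a line of the form PROPAGATOR=... (the exact line format is an assumption), the switch from acoustic to elastic could look like:

```shell
# Replace the propagator selection in config/make.version (line format assumed),
# then rebuild from a clean state.
sed -i 's/^PROPAGATOR=.*/PROPAGATOR=helmholtz_elastic-iso_Su/' config/make.version
make clean
make
```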
It is possible to install using CMake;
this is still an experimental version, so please send any
comment/question in case of problems during installation.
To install with CMake, after all dependencies have
been installed, one runs in the root directory:
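A typical out-of-source invocation, sketched here with standard CMake usage, is:

```shell
# From the root of the repository: create build/ and configure the project there.
mkdir -p build
cd build
cmake ..
```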
such that the compilation files generated by CMake
will be in the newly created build/
directory.
It is possible to specify the propagator and the type of problem
for compilation using -D
flags, such as
to compile the inversion code for the helmholtz elastic-iso propagator.
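For example, such a configuration could read as follows; the flag names -DPROBLEM and -DPROPAGATOR, and the preload-file path, are assumptions inferred from the options described on this page:

```shell
# Hypothetical -D flags mirroring the PROBLEM/PROPAGATOR options of make.version.
cmake .. -DPROBLEM=inversion -DPROPAGATOR=helmholtz_elastic-iso
# Alternatively, pre-load the options from a cache file (path is an assumption,
# see cmake/preload.cmake.Template):
cmake -C ../cmake/preload.cmake ..
```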
One can also provide the options through a file, e.g.,
where preload.cmake
contains the options. An example of such a file
can be found in cmake/preload.cmake.Template.
We further refer to the documentation to link manually installed libraries.
Then, the compilation can be performed in parallel with CMake
.
For instance, using 10 cores to compile, one runs,
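With the build directory configured, the parallel compilation step might be:

```shell
# Build with 10 parallel jobs (from within build/).
cmake --build . -j 10
```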
The executables are then generated in the folder build/bin.
One can also verify the installation by going into the build
directory with cd build/, and then running
The code combines MPI
and OpenMP
parallelism, and can be launched such that one uses, for instance, several
MPI
processes with 4 threads per MPI
process.
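As an illustration (adjust paths and counts to your setup; the parameter-file name is the generic one used in this documentation), a launch with 2 MPI processes and 4 threads each could be:

```shell
# 4 OpenMP threads per MPI process
export OMP_NUM_THREADS=4
# 2 MPI processes; executable name from the acoustic forward build
mpirun -np 2 ./forward_helmholtz_acoustic-iso_hdg.out parameter=parameter-file.txt
```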
The input parameter file is parameter-file.txt;
see the user's guide documentation
for its formalism.
In the folder examples/, benchmarks can be found to test the
compiled executables.
Some experiments can be found in
the Tutorial page.
These are the mandatory library dependencies, which can be retrieved from the official repository as indicated above.
The direct solver MUMPS
is used to solve the linear system.
It is designed for the factorization of large-scale sparse matrices and
supports multiple right-hand sides. It is available at
http://mumps-consortium.org/.
Two libraries of MUMPS
must be installed prior
to hawen: libcmumps
(for single-precision
complex matrices) and libzmumps
(for double precision complex matrices).
As part of MUMPS
, BLAS
, LAPACK
and ScaLAPACK
are also required (https://www.netlib.org/scalapack/);
they are also used by hawen.
The partitioner Metis
is used to split
the mesh among the MPI
processors, that is, every processor retains a list of cells.
It is available at http://glaros.dtc.umn.edu/gkhome/metis/metis/overview.
Before the compilation, one needs to create the file
code/config/make.version_config
according to the architecture (e.g., GNU, Intel, PGI)
and indicating the path towards the external libraries.
Templates can be found in the folder code/config/config-files_make/.
In particular, the same libraries used to compile MUMPS
must be linked in the code/config/make.version_config
file.
In addition, one can select the arithmetic precision in
the file code/src/m_define_precision.f90.
The choice of problem is made prior to the compilation, such that
one executable is linked with one problem (forward or inverse,
choice of propagation). This is selected in the file
code/config/make.version
,
together with some compilation options. The main options are
DEBUG
and OPTIMIZATION
to select either the compilation in debug or with optimization
flags (e.g., -O3
flag). One should be 0
and the other 1. PROBLEM
indicates the type of problem: this can be forward
or inversion
respectively, to compile the forward or the inverse problem. PROPAGATOR
indicates the choice of PDE: this can be
helmholtz_acoustic-iso
helmholtz_elastic-iso_Su
helio_scalar-conjugated_spherical1d
helio_scalar-conjugated
MUMPS_LATEST
should be set to 1
if the installed version of MUMPS is at least 5.3.3. It is 0
by default. DEPENDENCY_ARB
should be 0
if the arb
library is not linked,
or 1
if it is linked in the file
code/config/make.version_config.
We further refer to the user's guide documentation for more information on the problems solved.
One can link the library arb
,
http://arblib.org/.
It is used to compute special functions with high precision (e.g.,
Bessel functions, whose evaluation can be unstable depending on the parameters).
In particular, it is used to compute the exact Dirichlet-to-Neumann
map condition for the scalar wave problem in helioseismology under
spherical symmetry, which requires the Whittaker special functions,
cf. [1,
2,
3].
To link this external library, you must indicate the path towards the
library in code/config/make.version_config, and the
use of arb
in the file code/config/make.version.
Once the dependencies are installed and the configuration files
code/config/make.version_config
and
code/config/make.version
are ready,
go to the code/
directory and compile the
code:
It is important to run
in between successive compilations (e.g., forward and inverse problems),
that is, every time the file config/make.version
is modified.
The code is used to solve time-harmonic wave problems for modeling and inversion. This includes visco-acoustic and visco-elastic inverse wave problems, as well as helioseismology. We provide below some illustrations of results obtained with the software.
If you are a user of hawen
, you can email
florian.faucher [at] inria.fr
to advertise your applications here.
Acoustic inversion using FRgWI
Reconstruction of the three-dimensional acoustic SEAM model
using Full Reciprocity-gap Waveform Inversion. This makes it possible
to work with data from unknown sources and to combine sparse and
dense acquisitions.
Reference:
Full reciprocity-gap waveform inversion enabling sparse-source acquisition.
F. Faucher, G. Alessandrini, H. Barucq, M. V. de Hoop, R. Gaburro & E. Sincich; Geophysics, 85 (6), 2020.
True wave-speed model.
Reconstructed wave-speed.
Elastic inversion using FRgWI
Reconstruction of elastic medium using Full Reciprocity-gap
Waveform Inversion. This makes it possible to work with data from unknown
sources, and we combine active and passive sources for
multi-resolution inversion.
Reference:
Reciprocity-gap misfit functional for Distributed Acoustic Sensing, combining
data from active and passive sources.
F. Faucher, M. V. de Hoop & O. Scherzer; Geophysics, 86 (1), 2020.
Initial P and S-wave speeds.
Reconstructed P and S-wave speeds.
Viscoacoustic ultrasound tomography
We perform visco-acoustic inversion in the context of
breast imaging from ultrasound tomography, including
attenuation model uncertainty.
Reference:
Quantitative inverse problem in visco-acoustic media under attenuation model uncertainty.
F. Faucher & O. Scherzer; Journal of Computational Physics, 470 (1), 2022.
True, initial and reconstructed wave-speed models for breast imaging. The initial guess is a homogeneous (constant) speed.
Green's kernel for the scalar modal equation
Efficient computation of the Solar Green's kernel under
spherical symmetry with applications to helioseismology: the
entire kernel for all positions is recovered from only
2 simulations.
Reference:
Efficient and accurate algorithm for the full modal
Green's kernel of the scalar wave equation in
helioseismology,
H. Barucq, F. Faucher, D. Fournier, L. Gizon & H. Pham,
SIAM Journal on Applied Mathematics, 80, 2020,
(84-page extended research report).
Green's kernel at 7mHz for mode l=100.
Solar power-spectrum.
Three-dimensional modeling of solar scalar waves
We solve the full 3D scalar wave equations to allow for non-spherical perturbations.
Computational time (MPI / 14 threads): 20 min.
Discretization mesh.
1mHz Solar wave at r=0.99.
3D elastic wave propagation in media with topography using p-adaptivity
Elastic medium of size 20x20x10 km, with topography: the cells composing the mesh are small near the surface to accurately represent the topography, and larger below to use high-order polynomials: we rely on p-adaptivity.
Computational time (MPI / 12 threads): 45 min.
Reference for acoustics: Adjoint-state method for Hybridizable Discontinuous Galerkin discretization: application to the inverse acoustic wave problem. F. Faucher & O. Scherzer; Computer Methods in Applied Mechanics and Engineering, 2020.
Adapted mesh to handle topography and elastic solution v_z at 3 Hz.
3D elastic global Earth simulation
Global Earth elastic simulation at 10mHz, using the P- and S-wave speed, the density and quality factors from the PREM Earth models.
Computational time (MPI / 14 threads = 1280 cores): 18 min.
PREM P-wave speed model.
10mHz elastic solution \Vert v \Vert, 3D visualization and cross-sections.
The software runs using a parameter file as input, where
all the details on the problem considered are given, e.g., the
dimension, the physical parameters to use, where to find the
mesh of the domain, etc. These are detailed in the
downloadable
User's guide, which contains an Index with the list of keywords.
The code combines MPI
and OpenMP
parallelism.
We illustrate the possibilities in the
Applications page while
a tutorial can be found on the dedicated page.
The code solves time-harmonic problems and relies on a direct solver.
Therefore, one should be careful regarding the memory required to
run an experiment for large 3D media.
For instance, in the Applications page,
we illustrate experiments that require terabytes of memory.
Click here to download the User's guide for the details on the utilization.
There is also the
technical documentation, obtained with doxygen.
We review the main equations for the propagation of time-harmonic waves, currently implemented in the code.
The first-order forward problem at frequency \omega is given by the solutions (p,\,\boldsymbol{v}) that solve, for a source f,
In the case without attenuation, the physical properties c and \rho are real functions. Using time-harmonic waves, the attenuation can be encoded using complex-valued parameters; for instance, we replace c_0 by c \, = \, c_0 \, - \, \mathrm{i} \alpha, a complex wave speed where \alpha encodes the attenuation, cf. [1, 2].
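As a sketch of such a first-order system with the unknowns (p,\,\boldsymbol{v}) and the properties c and \rho, a common formulation (the sign convention here is an assumption and may differ from the code) reads:

```latex
% First-order time-harmonic acoustic system (sign convention assumed):
% p pressure, \boldsymbol{v} velocity, \rho density, c wave speed, f source.
-\frac{\mathrm{i}\omega}{\rho c^2}\, p \;+\; \nabla\cdot\boldsymbol{v} \;=\; f\,, \qquad
-\mathrm{i}\omega\rho\,\boldsymbol{v} \;+\; \nabla p \;=\; \boldsymbol{0}\,.
```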
The forward problem at frequency \omega is given by the solutions (\boldsymbol{\sigma},\,\boldsymbol{u}) that solve, for a source {\boldsymbol{f}},
In the case with attenuation, the constitutive law is changed;
for instance, for the Kelvin--Voigt model, see
[1],
\boldsymbol{\sigma} \, = \, ( \boldsymbol{C} \,-\, \mathrm{i} \, \omega \boldsymbol{\eta}) \,:\, \boldsymbol{\epsilon}
constitutive law for the Kelvin--Voigt model of attenuation.
The forward problem at frequency \omega is given by the solutions (w,\,v) that solve, for a source {g}, cf. [3, 4],
For the discretization of the wave PDE, we rely on the hybridizable discontinuous Galerkin method, which has the following features, cf. [5, 6, 7],
The specific choices made in hawen
are prescribed
in [5].
Comparisons of the degrees of freedom between Finite Element (left), Internal Penalty Discontinuous Galerkin (middle) and Hybridizable Discontinuous Galerkin (right). The global system in HDG only accounts for the dof on the faces (right picture), and then the (small) local system provides the solution in the volume (middle picture). HDG is efficient for sufficiently high-order polynomials, as more inner degrees of freedom are omitted.
We provide some tutorials to illustrate the code capability, with the following:
examples/
folder.
examples
In the examples
folder from the root directory,
start by extracting the archive and moving to the folder:
The folder contains all the necessary files to run the code:
The folder input
contains the mesh of the domain
and the text file with the positions of the receivers.
The parameter files for acoustic and elastic medium are contained
in the main folder, and we refer to
the user's guide documentation
for more details.
After the executables for the forward problems have been successfully
compiled, the software
can be run combining MPI
and OpenMP
parallelism.
For instance, combining 2 MPI
processes and 2 threads per MPI
process,
one runs
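Such a run could look like the following sketch (the parameter-file name here is a placeholder; use the acoustic or elastic parameter file provided in the folder):

```shell
# 2 OpenMP threads per MPI process
export OMP_NUM_THREADS=2
# 2 MPI processes; placeholder parameter-file name
mpirun -np 2 ./forward_helmholtz_acoustic-iso_hdg.out parameter=parameter-file.txt
```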
Information on the run is printed on the screen, such as the frequency, the number
of cells in the mesh, and timers. Once the solutions are obtained, output folders
are created and contain the wavefields. One can visualize these wavefields
using ParaView:
Acoustic wave pressure solution at 0.4 Hz
read with ParaView from file
results_acoustic/wavefield/wavefield_frequency_0.00000E+00_4.00000E-01Hz_src000001.pvtu.
The color scale varies from -0.2 to 0.2.
Elastic wave solution v_z at 0.2 Hz
read with ParaView from file
results_elastic/wavefield/wavefield_frequency_0.00000E+00_2.00000E-01Hz_src000001.pvtu.
The color scale varies from -15 to 15.
Once you have downloaded the files,
you can extract the archive with
sep
format: the Marmousi wave speed and the starting
wave speed for the reconstruction. triangle
meshes: one of about
75 000 elements (only used for modeling), one of about 25 000
elements, used for the initial iterations of the inversion, and
one of about 50 000 elements, used for the iterations at higher
frequencies, to improve the resolution. results-reference.tar.xz
contains
the results that should be reproduced following the steps
given below.
2D Marmousi model, of size 9.2 by 3 km²,
from file models/cp_marmousi.H@
Starting model for reconstruction: one-dimensional profile,
from file models/cp_start-1d.H@
Reconstruction using frequencies from 2 to 10 Hz, see below.
All of the pictures use the same color scale, from 1500 to 5500 meters per second.
Step 1: use the modeling to generate the (synthetic) observed data,
that is, the solution of the acoustic wave propagation at the
receiver locations.
export OMP_NUM_THREADS=2
mpirun -np 16 forward_helmholtz_acoustic-iso_hdg.out parameter=par.modeling_75k
modeling/
folder.
The parameter file par.modeling_75k
contains the information needed
to run the problem; in particular, we note the following selected
options:
results/modeling/. In particular,
the folder results/modeling/receivers/
contains the ASCII files that represent
the solution recorded at the receiver positions.
There is one file per source and per frequency, containing one complex number per receiver.
Step 2: Run the inverse problem, that is, from the (synthetic) observed data
generated in Step 1, we try to reconstruct the Marmousi wave speed
starting from the one-dimensional variation pictured above.
Here we use two steps:
firstly we invert frequencies between 2 and 4 Hz, and a coarse mesh
and secondly we use frequencies from 5 to 10 Hz, with a refined mesh
to improve the resolution of the reconstruction.
export OMP_NUM_THREADS=2
mpirun -np 16 inversion_helmholtz_acoustic-iso_hdg.out parameter=par.fwi_set1
mpirun -np 16 inversion_helmholtz_acoustic-iso_hdg.out parameter=par.fwi_set2
inversion/
folder.
The parameter files par.fwi_set1
and par.fwi_set2
contain the information needed to run the inverse problem; in particular,
we note the following selected options:
results/fwi/. In particular,
the folder results/fwi/set2/model/
contains the reconstructed models, with one
saved after each frequency iteration.
We picture below the reconstruction.
In addition, you can have a look at the file results/fwi/set2/residual/misfit-function.txt
to see the evolution of the misfit function with the iterations.
10Hz reconstruction visualized with ParaView, i.e., on the unstructured domain,
from file results/fwi/set2/model/model_frequency_0.00000E+00_1.00000E+01Hz.pvtu
.
The color scale varies from 1500 to 5500 meters per second.
10Hz reconstruction visualized with sep
format, i.e., the unstructured solution
is projected onto a structured grid,
from file results/fwi/set2/model/vp_frequency_0.00000E+00_1.00000E+01Hz.H@.
The color scale varies from 1500 to 5500 meters per second.
If you are using hawen
in your works, please send your references to florian.faucher [at] inria.fr.
To cite the software, please use the following reference:
Here is a list of publications and preprints where hawen
is used for the numerical experiments.
For any information, suggestions and feedback
please contact
florian.faucher [at] inria.fr,
indicating ``hawen'' in the email subject.
All comments are very much appreciated.
In addition, issues regarding the code can be indicated in the
dedicated page of the gitlab
repository.
Since October 2019:
The code and website are developed and maintained by
Florian Faucher, member of the project-team Makutu at
Inria and University of Pau and Pays de l'Adour,
formerly at the Faculty of Mathematics, University of Vienna, Austria.
personal webpage: https://ffaucher.gitlab.io/cv
Computational resources used for the software are: