<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Rcapron</id>
		<title>Planets - User contributions [en]</title>
		<link rel="self" type="application/atom+xml" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Rcapron"/>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Special:Contributions/Rcapron"/>
		<updated>2026-04-15T12:22:49Z</updated>
		<subtitle>User contributions</subtitle>
		<generator>MediaWiki 1.27.7</generator>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Tool_Box_Venus_PCM&amp;diff=2125</id>
		<title>Tool Box Venus PCM</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Tool_Box_Venus_PCM&amp;diff=2125"/>
				<updated>2024-08-14T10:00:02Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* LMDZ.VENUS/Tools */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Post-processing tools provided with the Venus PCM ==&lt;br /&gt;
First and foremost, there are a number of post-processing utilities (stand-alone tools) in the ''UTIL'' and ''LMDZ.VENUS/Tools'' directories. In addition, there are some extra main programs at the dynamics-physics interface.&lt;br /&gt;
&lt;br /&gt;
=== UTIL ===&lt;br /&gt;
In the top-level ''UTIL'' directory can be found some utilities for post-processing Venus PCM outputs such as:&lt;br /&gt;
* zrecast : a utility to vertically interpolate PCM outputs (which are on the native hybrid sigma-pressure vertical coordinate) onto pressure or altitude levels. An example compilation script, &amp;lt;code&amp;gt;compile&amp;lt;/code&amp;gt;, for ''zrecast'' is also provided; adapt it to your local settings.&lt;br /&gt;
* ... To be completed ...&lt;br /&gt;
&lt;br /&gt;
=== LMDZ.VENUS/Tools ===&lt;br /&gt;
This subdirectory contains the following utilities (check out the README file present in that directory for additional information):&lt;br /&gt;
* angmom : to compute angular momentum and torque components&lt;br /&gt;
* energy : to compute specific and integrated potential and kinetic energy&lt;br /&gt;
* fft : to compute the Fourier decomposition&lt;br /&gt;
* localtime_mean_and_std : to interpolate variables at the same local time everywhere&lt;br /&gt;
* psi : to compute the streamfunction&lt;br /&gt;
* stability: to compute stability, Richardson number and distance to cyclostrophic equilibrium&lt;br /&gt;
* tem : to compute Transformed Eulerian Mean (TEM) variables&lt;br /&gt;
* tmc : to compute angular momentum transport from high-frequency outputs&lt;br /&gt;
&lt;br /&gt;
The ''startarchive2icosa'' subdirectory contains some programs and instructions (see the README there) to generate a DYNAMICO-Venus set of start files from a lon-lat &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt; file&lt;br /&gt;
&lt;br /&gt;
====  Using angmom ====&lt;br /&gt;
First, one needs to compile the subprograms using this script:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
&lt;br /&gt;
# Source the XIOS architecture environment&lt;br /&gt;
source /path/to/your/XIOS/arch.env&lt;br /&gt;
source /path/to/your/XIOS/arch.path&lt;br /&gt;
&lt;br /&gt;
# NetCDF paths&lt;br /&gt;
NETCDF_INCLUDE=$NETCDF_INCDIR&lt;br /&gt;
NETCDF_LIB=$NETCDF_LIBDIR&lt;br /&gt;
&lt;br /&gt;
# Compile the subprograms&lt;br /&gt;
mpif90 -c -g -traceback cpdet.F90 moyzon.F moyzon2.F moytim.F dx_dp.F epflux.F90 io.F90 dmass.F90 reverse.F90 \&lt;br /&gt;
$NETCDF_INCLUDE $NETCDF_LIB&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, compile the main programs with this script :&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
&lt;br /&gt;
# Source the XIOS architecture environment&lt;br /&gt;
source /path/to/your/XIOS/arch.env&lt;br /&gt;
source /path/to/your/XIOS/arch.path&lt;br /&gt;
&lt;br /&gt;
# NetCDF paths&lt;br /&gt;
NETCDF_INCLUDE=$NETCDF_INCDIR&lt;br /&gt;
NETCDF_LIB=$NETCDF_LIBDIR&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
main_programs=(&amp;quot;stability.F90&amp;quot; &amp;quot;energy.F90&amp;quot; &amp;quot;psi.F90&amp;quot; &amp;quot;tem.F90&amp;quot; &amp;quot;angmom.F90&amp;quot; &amp;quot;tmc.F90&amp;quot;)&lt;br /&gt;
&lt;br /&gt;
for program in &amp;quot;${main_programs[@]}&amp;quot;; do&lt;br /&gt;
    program_name=$(basename &amp;quot;$program&amp;quot; .F90)&lt;br /&gt;
    mpif90 -g -traceback &amp;quot;$program&amp;quot; *.o \&lt;br /&gt;
    -I /path/to/your/XIOS/inc -L /path/to/your/XIOS/lib -lxios -lstdc++ \&lt;br /&gt;
    $NETCDF_INCLUDE $NETCDF_LIB -o &amp;quot;${program_name}.e&amp;quot;&lt;br /&gt;
done&lt;br /&gt;
&lt;br /&gt;
\rm *.o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Be aware that if you modify one of the main programs and wish to recompile it, you must recompile the subprograms first; otherwise the script will fail because it cannot find the .o files needed for linking. &lt;br /&gt;
&lt;br /&gt;
After doing this, you should obtain the executable &amp;lt;code&amp;gt;angmom.e&amp;lt;/code&amp;gt;. Copy it into the directory that contains the NetCDF files you want to analyse: &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cp angmom.e /your/path/&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then execute it:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./angmom.e&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
When the program is run, you will first be asked for the name of the file you wish to use. Type it in and press Enter.&lt;br /&gt;
You will then be asked whether you wish to use the file &amp;quot;dynzon.nc&amp;quot;. If you don't want to, simply press Enter.&lt;br /&gt;
&lt;br /&gt;
Normally, at the end of execution, you should obtain a file called &amp;quot;your_file_GAM.nc&amp;quot; (if the original name was &amp;quot;your_file.nc&amp;quot;).&lt;br /&gt;
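Since the program reads its answers from standard input, the two prompts described above can also be answered non-interactively. A minimal sketch (the file name is a placeholder, and the empty second line declines the &amp;quot;dynzon.nc&amp;quot; question):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
printf 'your_file.nc\n\n' | ./angmom.e&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;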
&lt;br /&gt;
'''Nota Bene :'''&lt;br /&gt;
Here is a modified &amp;quot;alternative&amp;quot; version of angmom.F90 that does not take the &amp;lt;code&amp;gt;dynzon.nc&amp;lt;/code&amp;gt; file into account and is run with a single command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./angmom.e your_file.nc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This makes it much more practical to use.&lt;br /&gt;
&lt;br /&gt;
'''Nota Bene 2 :'''&lt;br /&gt;
If you use angmom on a fairly large file, you will probably get an execution error saying that the machine was not able to allocate enough memory. To remedy this, simply run the program as a job on a compute node. The advantage of the &amp;quot;alternative&amp;quot; script from '''Nota Bene''' above is that it makes it easier to run the program on a compute node that is not necessarily interactive. &lt;br /&gt;
&lt;br /&gt;
Example of a job submission script for this case (submit it with &amp;lt;code&amp;gt;sbatch your_submission_script&amp;lt;/code&amp;gt;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=1&lt;br /&gt;
#SBATCH --partition=zen16&lt;br /&gt;
#SBATCH --mem=24G&lt;br /&gt;
#SBATCH -J job_angmom&lt;br /&gt;
#SBATCH --time=1:00:00&lt;br /&gt;
#SBATCH --output=job_angmom.%j.out&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ARCH/arch-ifort_MESOIPSL.env&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
./angmom.e your_file.nc &amp;gt; angmom.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can follow the run of the program in angmom.out, using for example &amp;lt;code&amp;gt;tail -f angmom.out&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
== Main programs (other than GCM) but included in the Venus PCM package ==&lt;br /&gt;
There are a few other main programs that are included with the GCM. &lt;br /&gt;
&lt;br /&gt;
Advanced stuff: these main programs are located under ''LMDZ.VENUS/libf/dynphy_lonlat/phyvenus/'' as they are at the interface between lon-lat dynamics and the Venus physics package &lt;br /&gt;
&lt;br /&gt;
=== start2archive ===&lt;br /&gt;
This program collects multiple &amp;lt;code&amp;gt;start.nc&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;startfi.nc&amp;lt;/code&amp;gt; files from a series of simulations and stores them in a &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt; file. For this, one simply needs to run the &amp;lt;code&amp;gt;start2archive&amp;lt;/code&amp;gt; program in the directory. It will automatically fetch the &amp;lt;code&amp;gt;start.nc&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;startfi.nc&amp;lt;/code&amp;gt; files and generate &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt;. If a &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt; file is already present, then the current &amp;lt;code&amp;gt;start.nc&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;startfi.nc&amp;lt;/code&amp;gt; files are added to it (a &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt; file can contain multiple initial states, as long as they are on the same grid and correspond to different dates).&lt;br /&gt;
&lt;br /&gt;
The &amp;lt;code&amp;gt;start2archive&amp;lt;/code&amp;gt; program should be compiled at the same resolution as the GCM that produced the start files, using the [[The_makelmdz_fcm_GCM_Compilation_Script | makelmdz_fcm]] compilation script. &lt;br /&gt;
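As a hedged illustration (the architecture name and grid dimensions here are assumptions to adapt to your own setup), the compilation command could look like:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -p venus -d 48x32x50 start2archive&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;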
 &lt;br /&gt;
=== newstart ===&lt;br /&gt;
This program can be used to:&lt;br /&gt;
* extract (and interpolate) &amp;lt;code&amp;gt;restart.nc&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;restartfi.nc&amp;lt;/code&amp;gt; files from a &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt; file or from a pair of &amp;lt;code&amp;gt;start.nc&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;startfi.nc&amp;lt;/code&amp;gt; files. The subtle difference between the two setups is that grid interpolation (horizontal and/or vertical) is only possible when using a &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt; input file&lt;br /&gt;
* modify values and fields contained in the initial condition file&lt;br /&gt;
* Compiling &amp;lt;code&amp;gt;newstart&amp;lt;/code&amp;gt; is done using the [[The makelmdz fcm GCM Compilation Script|makelmdz_fcm]] utility. The program is then meant to be run interactively with the user providing options and choices when prompted.&lt;br /&gt;
* Once the program has run and finished without error, it will generate &amp;lt;code&amp;gt;restart.nc&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;restartfi.nc&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== rearrange_startphy ===&lt;br /&gt;
In a nutshell, this program converts a &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt; file built from Venus LMDZ simulation start files into the &amp;lt;code&amp;gt;start.nc&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;startphy.nc&amp;lt;/code&amp;gt; files needed to begin a '''Venus - DYNAMICO''' simulation. &lt;br /&gt;
You can find it here : &lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
/your/path/LMDZ.VENUS/Tools/startarchive2icosa/&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Every step is explained in the README file in the same directory.&lt;br /&gt;
&lt;br /&gt;
== The rcm1d 1D column program ==&lt;br /&gt;
The source code is located under ''LMDZ.VENUS/libf/phyvenus/dyn1d''&lt;br /&gt;
&lt;br /&gt;
... Compilation...&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Tool_Box_Venus_PCM&amp;diff=2124</id>
		<title>Tool Box Venus PCM</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Tool_Box_Venus_PCM&amp;diff=2124"/>
				<updated>2024-08-14T09:23:23Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Main programs (other than GCM) but included in the Venus PCM package */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Post-processing tools provided with the Venus PCM ==&lt;br /&gt;
First and foremost, there are a number of post-processing utilities (stand-alone tools) in the ''UTIL'' and ''LMDZ.VENUS/Tools'' directories. In addition, there are some extra main programs at the dynamics-physics interface.&lt;br /&gt;
&lt;br /&gt;
=== UTIL ===&lt;br /&gt;
In the top-level ''UTIL'' directory can be found some utilities for post-processing Venus PCM outputs such as:&lt;br /&gt;
* zrecast : a utility to vertically interpolate PCM outputs (which are on the native hybrid sigma-pressure vertical coordinate) onto pressure or altitude levels. An example compilation script, &amp;lt;code&amp;gt;compile&amp;lt;/code&amp;gt;, for ''zrecast'' is also provided; adapt it to your local settings.&lt;br /&gt;
* ... To be completed ...&lt;br /&gt;
&lt;br /&gt;
=== LMDZ.VENUS/Tools ===&lt;br /&gt;
This subdirectory contains the following utilities (check out the README file present in that directory for additional information):&lt;br /&gt;
* angmom : to compute angular momentum and torque components&lt;br /&gt;
* energy : to compute specific and integrated potential and kinetic energy&lt;br /&gt;
* fft : to compute the Fourier decomposition&lt;br /&gt;
* localtime_mean_and_std : to interpolate variables at the same local time everywhere&lt;br /&gt;
* psi : to compute the streamfunction&lt;br /&gt;
* stability: to compute stability, Richardson number and distance to cyclostrophic equilibrium&lt;br /&gt;
* tem : to compute Transformed Eulerian Mean (TEM) variables&lt;br /&gt;
* tmc : to compute angular momentum transport from high-frequency outputs&lt;br /&gt;
&lt;br /&gt;
The ''startarchive2icosa'' subdirectory contains some programs and instructions (see the README there) to generate a DYNAMICO-Venus set of start files from a lon-lat &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt; file&lt;br /&gt;
&lt;br /&gt;
== Main programs (other than GCM) but included in the Venus PCM package ==&lt;br /&gt;
There are a few other main programs that are included with the GCM. &lt;br /&gt;
&lt;br /&gt;
Advanced stuff: these main programs are located under ''LMDZ.VENUS/libf/dynphy_lonlat/phyvenus/'' as they are at the interface between lon-lat dynamics and the Venus physics package &lt;br /&gt;
&lt;br /&gt;
=== start2archive ===&lt;br /&gt;
This program collects multiple &amp;lt;code&amp;gt;start.nc&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;startfi.nc&amp;lt;/code&amp;gt; files from a series of simulations and stores them in a &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt; file. For this, one simply needs to run the &amp;lt;code&amp;gt;start2archive&amp;lt;/code&amp;gt; program in the directory. It will automatically fetch the &amp;lt;code&amp;gt;start.nc&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;startfi.nc&amp;lt;/code&amp;gt; files and generate &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt;. If a &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt; file is already present, then the current &amp;lt;code&amp;gt;start.nc&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;startfi.nc&amp;lt;/code&amp;gt; files are added to it (a &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt; file can contain multiple initial states, as long as they are on the same grid and correspond to different dates).&lt;br /&gt;
&lt;br /&gt;
The &amp;lt;code&amp;gt;start2archive&amp;lt;/code&amp;gt; program should be compiled at the same resolution as the GCM that produced the start files, using the [[The_makelmdz_fcm_GCM_Compilation_Script | makelmdz_fcm]] compilation script. &lt;br /&gt;
 &lt;br /&gt;
=== newstart ===&lt;br /&gt;
This program can be used to:&lt;br /&gt;
* extract (and interpolate) &amp;lt;code&amp;gt;restart.nc&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;restartfi.nc&amp;lt;/code&amp;gt; files from a &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt; file or from a pair of &amp;lt;code&amp;gt;start.nc&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;startfi.nc&amp;lt;/code&amp;gt; files. The subtle difference between the two setups is that grid interpolation (horizontal and/or vertical) is only possible when using a &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt; input file&lt;br /&gt;
* modify values and fields contained in the initial condition file&lt;br /&gt;
* Compiling &amp;lt;code&amp;gt;newstart&amp;lt;/code&amp;gt; is done using the [[The makelmdz fcm GCM Compilation Script|makelmdz_fcm]] utility. The program is then meant to be run interactively with the user providing options and choices when prompted.&lt;br /&gt;
* Once the program has run and finished without error, it will generate &amp;lt;code&amp;gt;restart.nc&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;restartfi.nc&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== rearrange_startphy ===&lt;br /&gt;
In a nutshell, this program converts a &amp;lt;code&amp;gt;start_archive.nc&amp;lt;/code&amp;gt; file built from Venus LMDZ simulation start files into the &amp;lt;code&amp;gt;start.nc&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;startphy.nc&amp;lt;/code&amp;gt; files needed to begin a '''Venus - DYNAMICO''' simulation. &lt;br /&gt;
You can find it here : &lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
/your/path/LMDZ.VENUS/Tools/startarchive2icosa/&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Every step is explained in the README file in the same directory.&lt;br /&gt;
&lt;br /&gt;
== The rcm1d 1D column program ==&lt;br /&gt;
The source code is located under ''LMDZ.VENUS/libf/phyvenus/dyn1d''&lt;br /&gt;
&lt;br /&gt;
... Compilation...&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Managing_the_Venus_PCM_outputs&amp;diff=2123</id>
		<title>Managing the Venus PCM outputs</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Managing_the_Venus_PCM_outputs&amp;diff=2123"/>
				<updated>2024-08-14T08:56:05Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Venus PCM - LMDZ outputs with IOIPSL */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page gives some comments and instructions on managing the Venus PCM outputs, depending on the dynamical core (LMDZ or DYNAMICO) and input/output library (IOIPSL or XIOS) that is used.&lt;br /&gt;
&lt;br /&gt;
== Generalities ==&lt;br /&gt;
* Here we describe only the outputs from the Venus physics package, not from the dynamics&lt;br /&gt;
* depending on which input/output library ([[The IOIPSL Library|IOIPSL]] or [[The XIOS Library|XIOS]]) is used, things are quite different. &amp;lt;span style=&amp;quot;color:red&amp;quot;&amp;gt;Output using IOIPSL is deprecated&amp;lt;/span&amp;gt; (but still possible); '''using XIOS should be favored and is strongly recommended'''.&lt;br /&gt;
&lt;br /&gt;
== Venus PCM - LMDZ outputs with IOIPSL ==&lt;br /&gt;
&amp;lt;span style=&amp;quot;color:red&amp;quot;&amp;gt;Warning: using IOIPSL for outputs is deprecated&amp;lt;/span&amp;gt;.&amp;lt;br&amp;gt; When outputs are made with IOIPSL, they consist of (at most) two files, ''histmth.nc'' (averages) and ''histins.nc'' (instantaneous fields), and&lt;br /&gt;
are controlled by some flags in the physiq.def file, namely:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Parameters for IOIPSL output files&lt;br /&gt;
##~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~&lt;br /&gt;
## OLD. Now we use XIOS =&amp;gt; see context_lmdz_physics.xml to tailor the output files&lt;br /&gt;
#&lt;br /&gt;
### OK_journe= y for daily output file histday.nc, =n no histday.nc output&lt;br /&gt;
### Meaningless for Venus&lt;br /&gt;
OK_journe=n&lt;br /&gt;
### OK_mensuel= y for monthly output file histmth.nc, =n no histmth.nc&lt;br /&gt;
### For Venus, only these averaged outputs&lt;br /&gt;
OK_mensuel=n&lt;br /&gt;
## rate (in days) at which the Venus histmth file is to be written               &lt;br /&gt;
# sets the output rate in histmth and/or histins &lt;br /&gt;
ecritphy=0.1&lt;br /&gt;
### OK_instan=y, make some &amp;quot;instantaneous&amp;quot; outputs (same rate as histmth)&lt;br /&gt;
OK_instan=n&lt;br /&gt;
# &lt;br /&gt;
# Output levels for the various output files&lt;br /&gt;
#&lt;br /&gt;
# output level for  &amp;quot;day&amp;quot; lev_histday&lt;br /&gt;
# - lev_hist*=1 =&amp;gt; baseline 2D fields&lt;br /&gt;
# - lev_hist*=2 =&amp;gt; baseline 3D fields (default)&lt;br /&gt;
# - lev_hist*=3 =&amp;gt; radiative transfer&lt;br /&gt;
# - lev_hist*=4 =&amp;gt; 3D tendencies&lt;br /&gt;
# - lev_hist*=5 =&amp;gt; tracers and others&lt;br /&gt;
lev_histday=2&lt;br /&gt;
#output level for &amp;quot;mth&amp;quot; lev_histmth &lt;br /&gt;
lev_histmth=2&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Note that the setup (and flags) is inherited from the Earth model, hence some odd, unusable options (like '''OK_journe''') and flag names.&lt;br /&gt;
The user can thus trigger the generation of ''histins.nc'' outputs by setting '''OK_instan=y''' and the generation of ''histmth.nc'' by setting '''OK_mensuel=y'''. In addition, the output frequency (i.e. the time between steps in the output file) can be adapted using the flag '''ecritphy''', which should be set to the (Venus) day fraction to be used. As for the fields included in the output files, these depend on the values of the flags '''lev_histday''' and '''lev_histmth''', as described in the comments above. Note that if the user needs to add variables to the output files or change the default behavior, this requires modifying the source code.&lt;br /&gt;
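For instance, to produce instantaneous outputs in ''histins.nc'' every tenth of a Venus day with the baseline 3D fields (the values below are only illustrative), one would set:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
OK_mensuel=n&lt;br /&gt;
OK_instan=y&lt;br /&gt;
ecritphy=0.1&lt;br /&gt;
lev_histmth=2&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;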
&lt;br /&gt;
If running in parallel (MPI) with IOIPSL outputs, each process will generate its own set of output files: histmth_0000.nc and histins_0000.nc for process number 0, histmth_0001.nc and histins_0001.nc for process number 1, and so on. Once the run is finished, it is up to the user to recombine these hist*_*.nc files into single files gathering the data over the entire planet, using the IOIPSL rebuild tool. It is located in the directory:&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
/your/path/IOIPSL/rebuild &lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You need to copy both the executables '''rebuild''' and '''flio_rbld''' to the directory containing your '''histins/histmth.nc''' files.&lt;br /&gt;
Then use it as follows (the example is given for histmth files; it works the same with histins files): the first name is the one you want for your new file, and the names that follow are the hist*_*.nc files you are running the rebuild on:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
rebuild histmth.nc histmth_*.nc &lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Nota Bene:&lt;br /&gt;
It is also possible to list the files explicitly (assuming you have 4 histmth files in this example); less practical, but good to know:&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
rebuild histmth.nc histmth_0000.nc histmth_0001.nc histmth_0002.nc histmth_0003.nc&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Venus PCM - LMDZ outputs with XIOS ==&lt;br /&gt;
Note that this configuration requires running in parallel and with XIOS enabled, i.e. having compiled with makelmdz_fcm options&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
-parallel mpi -io xios&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
XIOS outputs are managed and defined via dedicated XML files which must be alongside the executable when it is run. Check out the [[The XIOS Library| XIOS library]] page for more details about what the XML files should contain. In a nutshell one needs the following files:&lt;br /&gt;
* '''iodef.xml''' : the XIOS &amp;quot;master&amp;quot; input file (which includes all the other XML files)&lt;br /&gt;
* '''context_lmdz_physics.xml''' : the XIOS &amp;quot;main&amp;quot; file (concerns the physics package and includes other XML files) which also contains the description of the various input and output grids to be used.&lt;br /&gt;
* '''field_def_physics.xml''' : file which describes all the variables that the code sends to XIOS and which can be output.&lt;br /&gt;
* '''file_def_physics.xml''' : file which describes all the desired output files and their contents&lt;br /&gt;
Some examples of these xml files are given in ''LMDZ.VENUS/deftank''&lt;br /&gt;
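As a rough sketch (the field name and output frequency below are illustrative assumptions; the fields actually available are those declared in '''field_def_physics.xml'''), an output file is declared in '''file_def_physics.xml''' along these lines:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;file_definition type=&amp;quot;one_file&amp;quot;&amp;gt;&lt;br /&gt;
  &amp;lt;file id=&amp;quot;histins&amp;quot; name=&amp;quot;Xhistins&amp;quot; output_freq=&amp;quot;1d&amp;quot; enabled=&amp;quot;.true.&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;field field_ref=&amp;quot;temperature&amp;quot; /&amp;gt;&lt;br /&gt;
  &amp;lt;/file&amp;gt;&lt;br /&gt;
&amp;lt;/file_definition&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;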
&lt;br /&gt;
TODO: MORE DETAILS ON THE XML SYNTAX AND RELEVANT PARAMETERS NEEDED HERE&lt;br /&gt;
&lt;br /&gt;
== Venus PCM - DYNAMICO outputs with XIOS ==&lt;br /&gt;
Note that using XIOS is the only possibility (no IOIPSL outputs) when using DYNAMICO&lt;br /&gt;
&lt;br /&gt;
TODO: MORE DETAILS NEEDED HERE&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;br /&gt;
[[Category:Venus-LMDZ]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2104</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2104"/>
				<updated>2024-07-15T12:19:19Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Where to find .xml files */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core, enabling better performance and solving some issues of the LMDZ model. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ Venus''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
One should also read the '''PCM directory layout''' page to understand and install everything: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/PCM_directory_layout PCM directory layout].&lt;br /&gt;
That said, we explain it once more on this page to make sure everything works.&lt;br /&gt;
&lt;br /&gt;
You will also need '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For what comes next, it is more practical to have all the packages ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside one another.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture. All available architectures are listed in XIOS/arch)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to specify the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This will be the same thing every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will now appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can use the “-debug” option to compile in debug mode, but the executable will be slower to run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now, we will run a test case “without the physics”, to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, we need to copy the specific .def files for this test case (they are in ICOSAGCM); for the most part we will use the same ones as in the basic Held&amp;amp;Suarez test case.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files into the test_VENUS directory&lt;br /&gt;
cp *def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should increase the stack size to avoid segmentation faults when running. Edit your ~/.bashrc:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Open ~/.bashrc in an editor&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to ~/.bashrc to remove the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
# Then reload it&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use nano:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Open ~/.bashrc in an editor&lt;br /&gt;
nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to ~/.bashrc to remove the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
# Then reload it&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If, when running the model, you want a NetCDF file (.nc) with all the data, you should modify the .xml file “'''file_def_dynamico.xml'''” (around line 70), changing “enabled” from “false” to “true”. This enables creation of the “dynamico.nc” file, which is already re-interpolated from the DYNAMICO grid onto a lon-lat grid, making it directly usable with Ferret/Panoply.&lt;br /&gt;
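The attribute change looks like this (the other attributes shown are illustrative; only &amp;quot;enabled&amp;quot; needs to change):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;!-- before --&amp;gt;&lt;br /&gt;
&amp;lt;file id=&amp;quot;dynamico&amp;quot; name=&amp;quot;dynamico&amp;quot; enabled=&amp;quot;false&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;!-- after --&amp;gt;&lt;br /&gt;
&amp;lt;file id=&amp;quot;dynamico&amp;quot; name=&amp;quot;dynamico&amp;quot; enabled=&amp;quot;true&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;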
&lt;br /&gt;
Then, there are some changes to be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename the '''earth_const.def''' file to '''venus_const.def''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, you should rewrite '''venus_const.def''' completely to match the Venus atmosphere; here is an example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m)&lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity : real (default = 8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Heat capacity at constant pressure : real (default = 1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure : real (default = 9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After this, it is time to edit the '''run.def''' file. In short, set the '''&amp;quot;etat0&amp;quot;''' parameter to '''&amp;quot;venus&amp;quot;''', set the '''&amp;quot;physics&amp;quot;''' parameter to '''&amp;quot;Lebonnois2012&amp;quot;''', adjust '''&amp;quot;day_step&amp;quot;''' (because of the long day on Venus), and so on. Rather than explaining every parameter that changes, here is a complete example (which should work from scratch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30;strato;strato_custom;ncar;dcmip31;dcmip200;read_apbp;plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
#number of openmp task on vertical level&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# To restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
(one can compare with the previous run.def to see the differences) &lt;br /&gt;
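Since '''day_step''' is given instead of an explicit '''dt''', it is worth checking the physical time step it implies. With daysec = 20995200 s from venus_const.def and day_step = 240000:&lt;br /&gt;

```shell
# Effective dynamical time step: dt = daysec / day_step = 87.48 s
awk 'BEGIN { print 20995200 / 240000 }'
```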
&lt;br /&gt;
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, replace the path and “YOUR_ARCH” with your own architecture (in the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can watch the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_SUAREZ''' test case works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You also need the ARCH directory, which contains information about every architecture:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it above.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” is the model dimension; it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Eventually, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
All these compilation steps can be run in one go with make_icosa_lmdz:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -full&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here, the -full option ensures that every part of the model ('''IOIPSL''', '''XIOS''', '''LMDZ.COMMON''', '''ICOSAGCM''' and '''ICOSA_LMDZ''') is compiled.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .xml files ===&lt;br /&gt;
&lt;br /&gt;
See the README.md in ICOSA_LMDZ to know which '''xml''' files to take from where (note that the example quoted right below corresponds to the '''current''' way of choosing '''xml files'''; you are therefore strongly advised to open the README, which should always be up to date):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
- nudging_dynamico.xml&lt;br /&gt;
- sponge_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
At the time of writing, the '''context_lmdz_physics.xml''' contained in LMDZ.VENUS/deftank is probably missing some lines, and you will likely hit a '''&amp;quot;dom_glo&amp;quot;''' issue when running DYNAMICO; here are the lines to change/add:&lt;br /&gt;
&lt;br /&gt;
Replace line 7 entirely with the following (i.e. the content between &amp;lt;domain_definition&amp;gt; and &amp;lt;/domain_definition&amp;gt;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;domain_group id=&amp;quot;dom_glo&amp;quot; data_dim=&amp;quot;1&amp;quot; &amp;gt;&lt;br /&gt;
  &amp;lt;domain id=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/domain_group&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_regular&amp;quot; ni_glo=&amp;quot;96&amp;quot; nj_glo=&amp;quot;97&amp;quot; type=&amp;quot;rectilinear&amp;quot;  &amp;gt;&lt;br /&gt;
      &amp;lt;generate_rectilinear_domain lat_start=&amp;quot;-90&amp;quot; lat_end=&amp;quot;90&amp;quot; lon_start=&amp;quot;180&amp;quot; lon_end=&amp;quot;-176.25&amp;quot; /&amp;gt;&lt;br /&gt;
      &amp;lt;interpolate_domain order=&amp;quot;1&amp;quot;/&amp;gt;&lt;br /&gt;
&amp;lt;/domain&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_out&amp;quot; domain_ref=&amp;quot;dom_regular&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, at lines 44 to 46, add the following (line 34 of the original file):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, at lines 50 to 57, add the following (line 37 of the original file):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;!-- output grids --&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_3D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
    &amp;lt;axis axis_ref=&amp;quot;altitude&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .def files ===&lt;br /&gt;
More '''.def''' files are needed to run the complete '''Venus-DYNAMICO with LMDZ physics''' (compared to the Held&amp;amp;Suarez test case):&lt;br /&gt;
&lt;br /&gt;
- '''run_icosa.def''' : everything linked to the DYNAMICO dynamical core will be driven in this file, see this page : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_run_icosa.def_Input_File The run_icosa.def Input File]&lt;br /&gt;
&lt;br /&gt;
- '''physics.def''' : everything linked to the LMDZ physics will be driven in this file.&lt;br /&gt;
&lt;br /&gt;
- '''run.def''' : simply a &amp;quot;bridge&amp;quot; that pulls in run_icosa.def and physics.def.&lt;br /&gt;
&lt;br /&gt;
- '''z2sig.def''' : defines the vertical discretization levels; it can be found in LMDZ.VENUS. (There are several vertical discretizations: 50, 78, etc. The 50-level one is the quickest to run and is therefore the best way to test that everything works.) See this page for more information : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_z2sig.def_Input_File The z2sig.def Input File]&lt;br /&gt;
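Given the INCLUDEDEF mechanism already used above for venus_const.def, the bridging '''run.def''' can be as small as the following sketch (the actual file shipped with the model may carry more settings):&lt;br /&gt;

```
# run.def: a bridge that pulls in the dynamics and physics settings
INCLUDEDEF=run_icosa.def
INCLUDEDEF=physics.def
```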
&lt;br /&gt;
=== Where to find other needed files ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
== Using the restart.nc file to continue your simulation ==&lt;br /&gt;
If you want to continue your simulation using the &amp;quot;end data&amp;quot; of a previous one, everything is explained on the DYNAMICO page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core#Using_the_restart.nc_file_to_continue_your_simulation using restart.nc]&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2103</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2103"/>
				<updated>2024-07-10T11:56:35Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Where to find .xml files */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core; it offers better performance and solves some issues of the LMDZ model. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ Venus''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
You should also read the '''PCM directory layout''' page to understand and install everything : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/PCM_directory_layout PCM directory layout].&lt;br /&gt;
That said, we explain it once more on this page to make sure everything works.&lt;br /&gt;
&lt;br /&gt;
You also need '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For what comes next, it is more practical to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside the others.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture. Every available architecture is listed in /XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to specify the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
It works the same way every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can use the “-debug” option to compile in debug mode, but the model will run more slowly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now we will run a test case “without the physics” to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then we need to copy the .def files specific to this test case (they are in ICOSAGCM); for the most part, we will use the same ones as in the basic Held&amp;amp;Suarez test case.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files in the repository test_VENUS&lt;br /&gt;
cp *def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then you should raise the stack size limit to avoid segmentation faults at run time. Edit your ~/.bashrc:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to ~/.bashrc to remove the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use nano:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to ~/.bashrc to remove the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
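A quick sanity check (assuming the ulimit line was added as above): open a new shell, or source ~/.bashrc, and print the current stack limit; it should now report “unlimited”.&lt;br /&gt;

```shell
# Print the current soft limit on the stack size; after the change above,
# a freshly started shell should report: unlimited
ulimit -s
```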
&lt;br /&gt;
Then copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the model run to produce a NetCDF file (.nc) containing all the data, edit the .xml file “'''file_def_dynamico.xml'''”: on line 70, change “enabled” from “false” to “true”. This enables the creation of the “dynamico.nc” file, which is already a re-interpolation of the DYNAMICO grid onto a longitude-latitude grid, so it can be used directly with Ferret/Panoply.&lt;br /&gt;
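As a sketch of that one-line change (the file below is a hypothetical stand-in; the real target is “file_def_dynamico.xml”, which is XML rather than plain text), the flag can also be flipped non-interactively with sed:&lt;br /&gt;

```shell
# Create a stand-in file carrying only the attribute of interest
printf 'enabled=false\n' > demo_file_def.txt

# Flip the flag from false to true, editing the file in place
sed -i 's/enabled=false/enabled=true/' demo_file_def.txt

cat demo_file_def.txt
```

On the real file you would restrict the substitution to line 70, e.g. sed -i '70s/false/true/' file_def_dynamico.xml (back the file up first).&lt;br /&gt;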
&lt;br /&gt;
Then, there are some changes to be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename the '''earth_const.def''' file to '''venus_const.def''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, you should rewrite '''venus_const.def''' completely to match the Venus atmosphere; here is an example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m)&lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity : real (default = 8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Heat capacity at constant pressure : real (default = 1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure : real (default = 9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After this, it is time to edit the '''run.def''' file. In short, set the '''&amp;quot;etat0&amp;quot;''' parameter to '''&amp;quot;venus&amp;quot;''', set the '''&amp;quot;physics&amp;quot;''' parameter to '''&amp;quot;Lebonnois2012&amp;quot;''', adjust '''&amp;quot;day_step&amp;quot;''' (because of the long day on Venus), and so on. Rather than explaining every parameter that changes, here is a complete example (which should work from scratch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30;strato;strato_custom;ncar;dcmip31;dcmip200;read_apbp;plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
#number of openmp task on vertical level&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# To restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
(one can compare with the previous run.def to see the differences) &lt;br /&gt;
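Since '''day_step''' is given instead of an explicit '''dt''', it is worth checking the physical time step it implies. With daysec = 20995200 s from venus_const.def and day_step = 240000:&lt;br /&gt;

```shell
# Effective dynamical time step: dt = daysec / day_step = 87.48 s
awk 'BEGIN { print 20995200 / 240000 }'
```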
&lt;br /&gt;
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, replace the path and “YOUR_ARCH” with your own architecture (in the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can watch the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_and_SUAREZ''' testCase is working, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you already need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
And the ARCH directory, which contains information about every architecture:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it before.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” is the model grid dimension (longitude x latitude x vertical levels); it can be changed.&lt;br /&gt;
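&lt;br /&gt;
For instance, to build at a different resolution (the 96x64x50 triplet below is only an illustration, not a recommended setting), the same command would become:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 96x64x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;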
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Eventually, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSA_LMDZ/&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
All these compiling steps are summed up in make_icosa_lmdz:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -full&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here, the -full option ensures the compilation of each part of the model ('''IOIPSL''', '''XIOS''', '''LMDZ.COMMON''', '''ICOSAGCM''' and '''ICOSA_LMDZ''').&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .xml files ===&lt;br /&gt;
&lt;br /&gt;
See the README.md in /ICOSA_LMDZ to know which '''xml''' files to take from where (note that the example given right below corresponds to the '''current''' way of choosing '''xml files'''; it is therefore strongly advised to open the README, which should always be up to date):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
- nudging_dynamico.xml&lt;br /&gt;
- sponge_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
At the time of writing, the '''context_lmdz_physics.xml''' contained in LMDZ.VENUS/deftank is probably missing some lines, and you may run into a '''&amp;quot;dom_glo&amp;quot;''' issue when running DYNAMICO; here are the lines to change/add:&lt;br /&gt;
&lt;br /&gt;
Replace line 7 entirely with:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;domain_group id=&amp;quot;dom_glo&amp;quot; data_dim=&amp;quot;1&amp;quot; &amp;gt;&lt;br /&gt;
  &amp;lt;domain id=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/domain_group&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_regular&amp;quot; ni_glo=&amp;quot;96&amp;quot; nj_glo=&amp;quot;97&amp;quot; type=&amp;quot;rectilinear&amp;quot;  &amp;gt;&lt;br /&gt;
      &amp;lt;generate_rectilinear_domain lat_start=&amp;quot;-90&amp;quot; lat_end=&amp;quot;90&amp;quot; lon_start=&amp;quot;180&amp;quot; lon_end=&amp;quot;-176.25&amp;quot; /&amp;gt;&lt;br /&gt;
      &amp;lt;interpolate_domain order=&amp;quot;1&amp;quot;/&amp;gt;&lt;br /&gt;
&amp;lt;/domain&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_out&amp;quot; domain_ref=&amp;quot;dom_regular&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, at lines 44 to 46 (line 34 of the original file), add:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, at lines 50 to 57 (line 37 of the original file), add:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;!-- output grids --&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_3D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
    &amp;lt;axis axis_ref=&amp;quot;altitude&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .def files ===&lt;br /&gt;
More '''.def''' files are needed in order to run the complete '''Venus-DYNAMICO with LMDZ physics''' (compared to the Held&amp;amp;Suarez testCase) : &lt;br /&gt;
&lt;br /&gt;
- '''run_icosa.def''' : everything linked to the DYNAMICO dynamical core will be driven in this file, see this page : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_run_icosa.def_Input_File The run_icosa.def Input File]&lt;br /&gt;
&lt;br /&gt;
- '''physics.def''' : everything linked to the LMDZ physics will be driven in this file.&lt;br /&gt;
&lt;br /&gt;
- '''run.def''' : just a &amp;quot;bridge&amp;quot; for run_icosa.def and physics.def.&lt;br /&gt;
&lt;br /&gt;
- '''z2sig.def''' : defines the vertical discretization levels, to be found in LMDZ.VENUS. (There are several vertical discretizations: 50, 78 levels, etc. 50 levels will be quicker to run, and is therefore the best choice to test that everything works.) See this page for more information : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_z2sig.def_Input_File The z2sig.def Input File]&lt;br /&gt;
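&lt;br /&gt;
Since '''run.def''' is only a bridge, it typically does nothing more than chain the other files through the INCLUDEDEF mechanism; a minimal sketch, assuming the file names listed above, would be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# run.def : just includes the other .def files&lt;br /&gt;
INCLUDEDEF=run_icosa.def&lt;br /&gt;
INCLUDEDEF=physics.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;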
&lt;br /&gt;
=== Where to find other needed files ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
== Using the restart.nc file to continue your simulation ==&lt;br /&gt;
If you want to continue your simulation using the &amp;quot;end-data&amp;quot; of a previous one, everything is explained on the DYNAMICO page : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core#Using_the_restart.nc_file_to_continue_your_simulation using restart.nc]&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2102</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2102"/>
				<updated>2024-07-10T09:57:28Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Where to find .def files */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core, enabling better performance and solving some issues of the LMDZ model. To know more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ Venus''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
One should also see the '''PCM directory layout''' page to understand and install everything : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/PCM_directory_layout PCM directory layout].&lt;br /&gt;
That said, we will explain it once more in this page to be sure everything works.&lt;br /&gt;
&lt;br /&gt;
Then, you should have '''XIOS''' too; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
It is more practical for what comes next, to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside each other.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(You have to replace “YOUR_ARCH” with your architecture. Every architecture is listed in /XIOS/arch)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
No need to specify everything in the command line, just the name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This will be the same thing every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will now appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the /ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can use the “-debug” option to compile in debug mode, but the executable will be slower to run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now, we will run a testCase “without the physics”, to verify that the Dynamical Core works alone.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, we need to copy the specific .def files for this testCase (they are in /ICOSAGCM); we will use (for the most part) the same ones as in the basic Held&amp;amp;Suarez testCase.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files in the repository test_VENUS&lt;br /&gt;
cp *def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should modify the stack size to avoid any segmentation fault when running. Change your ~/.bashrc:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to your ~/.bashrc to unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use nano:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to your ~/.bashrc to unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If, when running the model, you want a NetCDF file (.nc) with all the data, you should modify the .xml file “'''file_def_dynamico.xml'''”, line 70, changing “enabled” from “false” to “true”. This will enable the creation of the “dynamico.nc” file, which is already a re-interpolation of the DYNAMICO grid onto a longitude-latitude grid, making it directly usable with Ferret/Panoply.&lt;br /&gt;
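&lt;br /&gt;
For reference, the line to edit looks something like the sketch below (the attributes other than “enabled” are illustrative and may differ in your version of the file):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;!-- before : NetCDF output disabled --&amp;gt;&lt;br /&gt;
&amp;lt;file id=&amp;quot;dynamico&amp;quot; name=&amp;quot;dynamico&amp;quot; enabled=&amp;quot;false&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- after : dynamico.nc will be created --&amp;gt;&lt;br /&gt;
&amp;lt;file id=&amp;quot;dynamico&amp;quot; name=&amp;quot;dynamico&amp;quot; enabled=&amp;quot;true&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;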
&lt;br /&gt;
Then, there are some changes to be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename the '''earth_const.def''' file to '''venus_const.def''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, you should change '''venus_const.def''' completely, to match the Venus atmosphere; here is an example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m)&lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity : real (default = 8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Heat capacity at constant pressure : real (default = 1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure : real (default = 9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After this, it is time to change the '''run.def''' file. In short, you should change the '''&amp;quot;etat0&amp;quot;''' parameter to '''&amp;quot;venus&amp;quot;''', the '''&amp;quot;physics&amp;quot;''' parameter to '''&amp;quot;Lebonnois2012&amp;quot;''', the '''&amp;quot;day_step&amp;quot;''' (because of the long day on Venus), etc. Rather than explaining all the different parameters that change, here is an example of a complete file (that should work from scratch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30|strato|strato_custom|dcmip31|dcmip200|read_apbp|plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
# Number of OpenMP tasks on vertical levels&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# To restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
(one can compare with the previous run.def to see the differences) &lt;br /&gt;
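&lt;br /&gt;
A quick way to compare is to diff the two files (assuming the test_HELD_SUAREZ directory from the DYNAMICO documentation sits alongside test_VENUS):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
diff run.def ../test_HELD_SUAREZ/run.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;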
&lt;br /&gt;
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the Slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node core and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, you should modify the path and replace “YOUR_ARCH” with your architecture (for the source command). Note that we are not using OpenMP here, as it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can follow the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the code is finished running, something like this should appear (at the end of the icosa_gcm.out):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_and_SUAREZ''' testCase is working, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you already need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
And the ARCH directory, which contains information about every architecture:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it before.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” is the model grid dimension (longitude x latitude x vertical levels); it can be changed.&lt;br /&gt;
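&lt;br /&gt;
For instance, to build at a different resolution (the 96x64x50 triplet below is only an illustration, not a recommended setting), the same command would become:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 96x64x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;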
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Eventually, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSA_LMDZ/&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
All these compiling steps are summed up in make_icosa_lmdz:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -full&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here, the -full option ensures the compilation of each part of the model ('''IOIPSL''', '''XIOS''', '''LMDZ.COMMON''', '''ICOSAGCM''' and '''ICOSA_LMDZ''').&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .xml files ===&lt;br /&gt;
&lt;br /&gt;
See the README.md in /ICOSA_LMDZ to know which '''xml''' files to take from where (note that the example given right below corresponds to the '''current''' way of choosing '''xml files'''; it is therefore strongly advised to open the README, which should always be up to date):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
- nudging_dynamico.xml&lt;br /&gt;
- sponge_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
At the time of writing, the context_lmdz_physics.xml contained in LMDZ.VENUS/deftank is probably missing some lines, and you may run into a '''&amp;quot;dom_glo&amp;quot;''' issue when running DYNAMICO; here are the lines to change/add:&lt;br /&gt;
&lt;br /&gt;
Replace line 7 entirely with:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;domain_group id=&amp;quot;dom_glo&amp;quot; data_dim=&amp;quot;1&amp;quot; &amp;gt;&lt;br /&gt;
  &amp;lt;domain id=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/domain_group&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_regular&amp;quot; ni_glo=&amp;quot;96&amp;quot; nj_glo=&amp;quot;97&amp;quot; type=&amp;quot;rectilinear&amp;quot;  &amp;gt;&lt;br /&gt;
      &amp;lt;generate_rectilinear_domain lat_start=&amp;quot;-90&amp;quot; lat_end=&amp;quot;90&amp;quot; lon_start=&amp;quot;180&amp;quot; lon_end=&amp;quot;-176.25&amp;quot; /&amp;gt;&lt;br /&gt;
      &amp;lt;interpolate_domain order=&amp;quot;1&amp;quot;/&amp;gt;&lt;br /&gt;
&amp;lt;/domain&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_out&amp;quot; domain_ref=&amp;quot;dom_regular&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, at lines 44 to 46 (line 34 of the original file), add :&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, at lines 50 to 57 (line 37 of the original file), add :&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;!-- output grids --&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_3D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
    &amp;lt;axis axis_ref=&amp;quot;altitude&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .def files ===&lt;br /&gt;
More '''.def''' files are needed to run the complete '''Venus-DYNAMICO with LMDZ physics''' setup than for the Held&amp;amp;Suarez test case : &lt;br /&gt;
&lt;br /&gt;
- '''run_icosa.def''' : everything related to the DYNAMICO dynamical core is driven by this file, see this page : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_run_icosa.def_Input_File The run_icosa.def Input File]&lt;br /&gt;
&lt;br /&gt;
- '''physics.def''' : everything related to the LMDZ physics is driven by this file.&lt;br /&gt;
&lt;br /&gt;
- '''run.def''' : just a &amp;quot;bridge&amp;quot; including run_icosa.def and physics.def.&lt;br /&gt;
&lt;br /&gt;
- '''z2sig.def''' : defines the vertical discretization levels; found in LMDZ.VENUS. (There are several vertical discretizations: 50, 78, etc. The 50-level one is the quickest to run, and therefore the best way to test that everything works.) See this page for more information : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_z2sig.def_Input_File The z2sig.def Input File]&lt;br /&gt;
&lt;br /&gt;
=== Where to find other needed files ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
== Using the restart.nc file to continue your simulation ==&lt;br /&gt;
If you want to continue your simulation from the end state of a previous one, everything is explained on the DYNAMICO page : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core#Using_the_restart.nc_file_to_continue_your_simulation using restart.nc]&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2101</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2101"/>
				<updated>2024-07-10T09:55:43Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Where to find .def files */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is a recently developed dynamical core that offers better performance and solves some issues of the LMDZ dynamical core. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ Venus''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
One should also see the '''PCM directory layout''' page to understand and install everything : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/PCM_directory_layout PCM directory layout].&lt;br /&gt;
That said, we will explain it once more in this page to be sure everything works.&lt;br /&gt;
&lt;br /&gt;
You also need '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For what comes next, it is more practical to have all the packages ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside each other.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture. The available architectures are listed in XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to specify the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The same applies every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will now appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can use the “-debug” option to compile in debug mode, but the resulting executable will run more slowly : &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now, we will run a test case “without the physics” to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, we need to copy the .def files specific to this test case (they are in ICOSAGCM); for the most part, we will use the same ones as in the basic Held&amp;amp;Suarez test case.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files in the repository test_VENUS&lt;br /&gt;
cp *def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should modify the stack size to avoid any segmentation fault when running. Change your ~/.bashrc:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This option will unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use nano:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This option will unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” from ICOSAGCM/bin into the test directory test_VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the run to produce a NetCDF (.nc) file with all the data, you should modify the .xml file “'''file_def_dynamico.xml'''”, line 70, changing “false” to “true” for the “enabled” attribute. This enables the creation of the “dynamico.nc” file, which is already a re-interpolation of the DYNAMICO grid onto a longitude-latitude grid, making it directly usable with Ferret/Panoply.&lt;br /&gt;
&lt;br /&gt;
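One way to flip such a flag without opening an editor is a simple text substitution. The attribute string below is hypothetical, shown only to illustrate the substitution; check the real file_def_dynamico.xml for the exact syntax (XIOS also accepts Fortran-style booleans such as “.FALSE.”):&lt;br /&gt;

```python
import re

# Hypothetical attribute string in the spirit of file_def_dynamico.xml;
# the real file may use a different boolean spelling, e.g. ".FALSE.".
attrs = 'id="dynamico" output_freq="1d" enabled="false"'

# Turn the output file on by rewriting the "enabled" attribute
patched = re.sub(r'enabled="false"', 'enabled="true"', attrs)
print(patched)  # id="dynamico" output_freq="1d" enabled="true"
```
&lt;br /&gt;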
Then, some changes must be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename '''earth_const.def''' to '''venus_const.def''' :&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, you should rewrite '''venus_const.def''' entirely to match the Venus atmosphere; here is an example :&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m) &lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity : real (default = 8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate (rad/s)&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Specific heat capacity at constant pressure : real (default = 1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure : real (default = 9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
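A quick sanity check of these constants (a standalone Python sketch; the numbers are copied from the example above):&lt;br /&gt;

```python
import math

# Values copied from the venus_const.def example above
daysec = 20995200.0    # length of a Venus day (s)
omega_file = 2.992e-7  # rotation rate given in the file (rad/s)

# Rotation rate implied by the day length
omega_check = 2.0 * math.pi / daysec
print(omega_check)       # ~2.99e-7 rad/s, consistent with omega_file

# Day length expressed in Earth days
print(daysec / 86400.0)  # 243.0
```
&lt;br /&gt;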
After this, it is time to change the '''run.def''' file. In short, you should change the '''&amp;quot;etat0&amp;quot;''' parameter to '''&amp;quot;venus&amp;quot;''', the '''&amp;quot;physics&amp;quot;''' parameter to '''&amp;quot;Lebonnois2012&amp;quot;''', the '''&amp;quot;day_step&amp;quot;''' parameter (because of the long day on Venus), etc. Rather than explaining all the parameters that change, here is an example of a complete script (that should work from scratch) :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30|strato|strato_custom|dcmip31|dcmip200|read_apbp|plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
#number of openmp task on vertical level&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# To restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
(one can compare with the previous run.def to see the differences) &lt;br /&gt;
&lt;br /&gt;
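Since '''day_step''' is given instead of an explicit '''dt''', it is worth checking the time step it implies (a small Python check, using the values from the files above):&lt;br /&gt;

```python
# Values from venus_const.def and run.def above
daysec = 20995200      # seconds per Venus day
day_step = 240000      # dynamical steps per day (run.def)
write_period = 314928  # seconds between two outputs (run.def)

# Effective dynamical time step
dt = daysec / day_step
print(dt)  # 87.48 s per step

# Number of dynamical steps between two outputs
print(write_period * day_step / daysec)  # 3600.0
```
&lt;br /&gt;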
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the Slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node core and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, you should modify the path and replace “YOUR_ARCH” with your architecture (in the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can follow the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_and_SUAREZ''' test case works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you already need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
And the ARCH directory, which contains information about every architecture.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it before.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” is the model grid size; it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
All these compilation steps can be performed at once with make_icosa_lmdz :&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -full&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here, the -full option ensures that each part of the model ('''IOIPSL''', '''XIOS''', '''LMDZ.COMMON''', '''ICOSAGCM''' and '''ICOSA_LMDZ''') is compiled.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .xml files ===&lt;br /&gt;
&lt;br /&gt;
See the README.md in ICOSA_LMDZ to know which '''xml''' files to take from where (note that the example reproduced just below corresponds to the '''current''' way of choosing '''xml files'''; it is therefore strongly advised to open the README, which should always be up to date) :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
- nudging_dynamico.xml&lt;br /&gt;
- sponge_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
At the time of writing, the context_lmdz_physics.xml provided in LMDZ.VENUS/deftank may be missing some lines, in which case you will likely encounter a '''&amp;quot;dom_glo&amp;quot;''' error when running DYNAMICO. Here are the lines to change/add :&lt;br /&gt;
&lt;br /&gt;
Replace line 7 entirely with :&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;domain_group id=&amp;quot;dom_glo&amp;quot; data_dim=&amp;quot;1&amp;quot; &amp;gt;&lt;br /&gt;
  &amp;lt;domain id=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/domain_group&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_regular&amp;quot; ni_glo=&amp;quot;96&amp;quot; nj_glo=&amp;quot;97&amp;quot; type=&amp;quot;rectilinear&amp;quot;  &amp;gt;&lt;br /&gt;
      &amp;lt;generate_rectilinear_domain lat_start=&amp;quot;-90&amp;quot; lat_end=&amp;quot;90&amp;quot; lon_start=&amp;quot;180&amp;quot; lon_end=&amp;quot;-176.25&amp;quot; /&amp;gt;&lt;br /&gt;
      &amp;lt;interpolate_domain order=&amp;quot;1&amp;quot;/&amp;gt;&lt;br /&gt;
&amp;lt;/domain&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_out&amp;quot; domain_ref=&amp;quot;dom_regular&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
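The '''dom_regular''' numbers above correspond to a regular 3.75° × 1.875° longitude-latitude grid; a quick check (plain Python, values copied from the snippet):&lt;br /&gt;

```python
# Values copied from the dom_regular domain definition above
ni_glo, nj_glo = 96, 97
lon_start, lon_end = 180.0, -176.25
lat_start, lat_end = -90.0, 90.0

# Spacing between grid points along each axis
dlon = (lon_end - lon_start) / (ni_glo - 1)
dlat = (lat_end - lat_start) / (nj_glo - 1)

print(abs(dlon))           # 3.75 degrees
print(abs(dlon) * ni_glo)  # 360.0, the 96 points cover the full circle
print(dlat)                # 1.875 degrees, 97 points from pole to pole
```
&lt;br /&gt;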
Then, at lines 44 to 46 (line 34 of the original file), add :&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, at lines 50 to 57 (line 37 of the original file), add :&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;!-- output grids --&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_3D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
    &amp;lt;axis axis_ref=&amp;quot;altitude&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .def files ===&lt;br /&gt;
More '''.def''' files are needed to run the complete '''Venus-DYNAMICO with LMDZ physics''' setup than for the Held&amp;amp;Suarez test case : &lt;br /&gt;
&lt;br /&gt;
- '''run_icosa.def''' : everything related to the DYNAMICO dynamical core is driven by this file, see this page : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_run_icosa.def_Input_File The run_icosa.def Input File]&lt;br /&gt;
&lt;br /&gt;
- '''physics.def''' : everything related to the LMDZ physics is driven by this file.&lt;br /&gt;
&lt;br /&gt;
- '''run.def''' : just a &amp;quot;bridge&amp;quot; including run_icosa.def and physics.def.&lt;br /&gt;
&lt;br /&gt;
- '''z2sig.def''' : defines the vertical discretization levels; found in LMDZ.VENUS. (There are several vertical discretizations: 50, 78, etc. The 50-level one is the quickest to run, and therefore the best way to test that everything works.)&lt;br /&gt;
&lt;br /&gt;
=== Where to find other needed files ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
== Using the restart.nc file to continue your simulation ==&lt;br /&gt;
If you want to continue your simulation from the end state of a previous one, everything is explained on the DYNAMICO page : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core#Using_the_restart.nc_file_to_continue_your_simulation using restart.nc]&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2100</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2100"/>
				<updated>2024-07-10T09:50:20Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Compilation - DYNAMICO */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is a recently developed dynamical core that offers better performance and solves some issues of the LMDZ dynamical core. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ Venus''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
One should also see the '''PCM directory layout''' page to understand and install everything : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/PCM_directory_layout PCM directory layout].&lt;br /&gt;
That said, we will explain it once more in this page to be sure everything works.&lt;br /&gt;
&lt;br /&gt;
You also need '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For what comes next, it is more practical to have all the packages ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside each other.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture. The available architectures are listed in XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to specify the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The same applies every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will now appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can use the “-debug” option to compile in debug mode, but the resulting executable will run more slowly : &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now, we will run a test case “without the physics” to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then we need to copy the .def files specific to this test case (they are in /ICOSAGCM); for the most part we will use the same ones as in the basic Held&amp;amp;Suarez test case.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files into the test_VENUS directory&lt;br /&gt;
cp *.def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then you should raise the stack size limit, to avoid segmentation faults at run time. Open your ~/.bashrc with an editor (gedit, or nano if gedit does not work), add the line below, then reload the file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc   # or: nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to ~/.bashrc to remove the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
# Then reload the file&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If, when running the model, you want a NetCDF file (.nc) with all the data, edit the .xml file “'''file_def_dynamico.xml'''” at line 70 and change “enabled” from “false” to “true”. This makes the run create the file “dynamico.nc”, which is already re-interpolated from the DYNAMICO grid onto a longitude-latitude grid and can therefore be used directly with Ferret/Panoply.&lt;br /&gt;
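As a quick sketch, the toggle can also be scripted with sed, assuming the attribute appears literally as enabled="false" on the relevant line (check your file_def_dynamico.xml first; the spelling is an assumption). The example below runs on a scratch copy so it is self-contained:

```shell
# Demonstrate the edit on a scratch copy of the file.
# The attribute spelling enabled="false" is an assumption; verify it in
# your real file_def_dynamico.xml before applying the same sed there.
printf '<file id="dynamico" output_freq="1ts" enabled="false">\n' > demo_file_def.xml
sed -i 's/enabled="false"/enabled="true"/' demo_file_def.xml
grep 'enabled=' demo_file_def.xml
```

Note that `sed -i` edits the file in place (GNU sed syntax, as found on the Linux clusters used here).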
&lt;br /&gt;
Then there are some changes to be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename the '''earth_const.def''' file to '''venus_const.def''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, you should rewrite '''venus_const.def''' entirely to match the Venus atmosphere; here is an example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m)&lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity : real (default = 8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate (rad/s)&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Heat capacity at constant pressure : real (default = 1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure (Pa) : real (default = 9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After this, it is time to edit the '''run.def''' file. In short, you should set the '''&amp;quot;etat0&amp;quot;''' parameter to '''&amp;quot;venus&amp;quot;''', set the '''&amp;quot;physics&amp;quot;''' parameter to '''&amp;quot;Lebonnois2012&amp;quot;''', adjust '''&amp;quot;day_step&amp;quot;''' (because of the long day on Venus), and so on. Rather than explaining every parameter that changes, here is a complete example (which should work from scratch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30;strato;strato_custom;ncar;dcmip31;dcmip200;read_apbp;plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
#number of openmp task on vertical level&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# To restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
(one can compare with the previous run.def to see the differences) &lt;br /&gt;
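As a quick sanity check on the values above, the dynamical time step implied by day_step is daysec / day_step, with daysec = 20995200 s taken from venus_const.def:

```shell
# Implied dynamical time step in seconds: daysec / day_step
awk 'BEGIN { printf "%.2f\n", 20995200 / 240000 }'   # 87.48 s
```

A step of roughly 87 s is consistent with commenting out the explicit "dt = 720" line in favour of day_step.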
&lt;br /&gt;
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, you should replace the path and “YOUR_ARCH” with your own architecture (for the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can watch the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_and_SUAREZ''' test case works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
And the ARCH directory, which contains information about every architecture:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it before.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” is the model dimension; it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
All these compilation steps are summed up in make_icosa_lmdz:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -full&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here, the -full option ensures the compilation of each part ('''IOIPSL''', '''XIOS''', '''LMDZ.COMMON''', '''ICOSAGCM''' and '''ICOSA_LMDZ''') of the model.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .xml files ===&lt;br /&gt;
&lt;br /&gt;
See the README.md in /ICOSA_LMDZ to know which '''xml''' files to take from where (note that the example reproduced below corresponds to the '''current''' way of choosing '''xml files'''; it is therefore strongly advised to open the README, which should always be up to date):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
- nudging_dynamico.xml&lt;br /&gt;
- sponge_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
At the time of writing this documentation, the context_lmdz_physics.xml contained in LMDZ.VENUS/deftank is probably missing some lines, and you may get a '''&amp;quot;dom_glo&amp;quot;''' error when running DYNAMICO; here are the lines to change/add:&lt;br /&gt;
&lt;br /&gt;
Line 7 is to be completely replaced by:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;domain_group id=&amp;quot;dom_glo&amp;quot; data_dim=&amp;quot;1&amp;quot; &amp;gt;&lt;br /&gt;
  &amp;lt;domain id=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/domain_group&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_regular&amp;quot; ni_glo=&amp;quot;96&amp;quot; nj_glo=&amp;quot;97&amp;quot; type=&amp;quot;rectilinear&amp;quot;  &amp;gt;&lt;br /&gt;
      &amp;lt;generate_rectilinear_domain lat_start=&amp;quot;-90&amp;quot; lat_end=&amp;quot;90&amp;quot; lon_start=&amp;quot;180&amp;quot; lon_end=&amp;quot;-176.25&amp;quot; /&amp;gt;&lt;br /&gt;
      &amp;lt;interpolate_domain order=&amp;quot;1&amp;quot;/&amp;gt;&lt;br /&gt;
&amp;lt;/domain&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_out&amp;quot; domain_ref=&amp;quot;dom_regular&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then at lines 44 to 46, add (line 34 of the original file):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then at lines 50 to 57, add (line 37 of the original file):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;!-- output grids --&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_3D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
    &amp;lt;axis axis_ref=&amp;quot;altitude&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .def files ===&lt;br /&gt;
More '''.def''' files are needed to run the complete '''Venus-DYNAMICO with LMDZ physics''' (compared to the Held&amp;amp;Suarez test case):&lt;br /&gt;
&lt;br /&gt;
- '''run_icosa.def''' : everything related to the DYNAMICO dynamical core is controlled in this file; see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_run_icosa.def_Input_File The run_icosa.def Input File]&lt;br /&gt;
&lt;br /&gt;
- '''physics.def''' : everything related to the LMDZ physics is controlled in this file.&lt;br /&gt;
&lt;br /&gt;
- '''run.def''' : just a &amp;quot;bridge&amp;quot; that includes run_icosa.def and physics.def.&lt;br /&gt;
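As an illustration of the "bridge" role, a minimal run.def could consist only of INCLUDEDEF lines, using the same INCLUDEDEF syntax already shown above for venus_const.def (a hedged sketch; check the deftank examples for the actual layout):

```
# Minimal run.def sketch: delegate everything to the two included files
INCLUDEDEF=run_icosa.def
INCLUDEDEF=physics.def
```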
&lt;br /&gt;
=== Where to find others needed files ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
== Using the restart.nc file to continue your simulation ==&lt;br /&gt;
If you want to continue your simulation using the &amp;quot;end data&amp;quot; of your previous one, everything is explained on the DYNAMICO page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core#Using_the_restart.nc_file_to_continue_your_simulation using restart.nc]&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2099</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2099"/>
				<updated>2024-07-10T09:45:50Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Where to find .def files */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core, enabling better performance and solving some issues of the LMDZ model. To know more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ Venus''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
One should also see the '''PCM directory layout''' page to understand and install everything : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/PCM_directory_layout PCM directory layout].&lt;br /&gt;
That said, we will explain it once more on this page, to be sure everything works.&lt;br /&gt;
&lt;br /&gt;
Then, you should have '''XIOS''' too; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For what comes next, it is more practical to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside the others.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture. All available architectures are listed in /XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
You do not need to specify the full file names on the command line, only the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
It works the same way every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will now appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can use the “-debug” option to compile in debug mode, but the resulting executable will run more slowly.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now we will run a test case “without the physics”, to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then we need to copy the .def files specific to this test case (they are in /ICOSAGCM); for the most part we will use the same ones as in the basic Held&amp;amp;Suarez test case.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files into the test_VENUS directory&lt;br /&gt;
cp *.def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then you should raise the stack size limit, to avoid segmentation faults at run time. Open your ~/.bashrc with an editor (gedit, or nano if gedit does not work), add the line below, then reload the file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc   # or: nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to ~/.bashrc to remove the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
# Then reload the file&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If, when running the model, you want a NetCDF file (.nc) with all the data, edit the .xml file “'''file_def_dynamico.xml'''” at line 70 and change “enabled” from “false” to “true”. This makes the run create the file “dynamico.nc”, which is already re-interpolated from the DYNAMICO grid onto a longitude-latitude grid and can therefore be used directly with Ferret/Panoply.&lt;br /&gt;
&lt;br /&gt;
Then there are some changes to be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename the '''earth_const.def''' file to '''venus_const.def''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, you should rewrite '''venus_const.def''' entirely to match the Venus atmosphere; here is an example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m)&lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity : real (default = 8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate (rad/s)&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Heat capacity at constant pressure : real (default = 1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure (Pa) : real (default = 9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After this, it is time to edit the '''run.def''' file. In short, you should set the '''&amp;quot;etat0&amp;quot;''' parameter to '''&amp;quot;venus&amp;quot;''', set the '''&amp;quot;physics&amp;quot;''' parameter to '''&amp;quot;Lebonnois2012&amp;quot;''', adjust '''&amp;quot;day_step&amp;quot;''' (because of the long day on Venus), and so on. Rather than explaining every parameter that changes, here is a complete example (which should work from scratch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30;strato;strato_custom;ncar;dcmip31;dcmip200;read_apbp;plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
#number of openmp task on vertical level&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# To restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
(one can compare with the previous run.def to see the differences) &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, modify the path and “YOUR_ARCH” to match your architecture (for the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To check that the code is running properly, you can follow the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
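As a convenience, the completion check above can be scripted; this is a minimal sketch that only assumes the log name and the “Time elapsed” end marker shown in the sample output (the check_run name is our own, not part of the model):&lt;br /&gt;

```shell
# Sketch: report whether a DYNAMICO run log shows normal completion.
# Assumes only the log name and the "Time elapsed" end marker shown above;
# the function name check_run is ours, not part of the model.
check_run() {
    log="${1:-icosa_gcm.out}"
    if grep -q "Time elapsed" "$log"; then
        echo "run completed"
    else
        echo "run still going or crashed"
    fi
}
```

Call it as `check_run icosa_gcm.out` once the job ends.&lt;br /&gt;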
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_SUAREZ''' test case works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
And the ARCH directory, which contains information about each architecture.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
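Before moving on, you can verify this layout with a short helper; a minimal sketch whose folder list matches the “ls” output above (the check_layout name is our own):&lt;br /&gt;

```shell
# Sketch: verify that the trunk contains the expected package folders
# before compiling; the folder list matches the "ls" output above.
# check_layout is our own helper name, not part of the model.
check_layout() {
    root="${1:-.}"
    missing=""
    for d in ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS; do
        [ -d "$root/$d" ] || missing="$missing $d"
    done
    if [ -z "$missing" ]; then
        echo "layout ok"
    else
        echo "missing:$missing"
    fi
}
```

Run it as `check_layout /your/path/trunk/` to see what is still missing.&lt;br /&gt;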
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it before.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” sets the model dimensions; it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
All these compilation steps are summed up in make_icosa_lmdz:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -full&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here, the -full option ensures that each part of the model ('''IOIPSL''', '''XIOS''', '''LMDZ.COMMON''', '''ICOSAGCM''' and '''ICOSA_LMDZ''') is compiled.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .xml files ===&lt;br /&gt;
&lt;br /&gt;
See the README.md in /ICOSA_LMDZ to know which '''xml''' files to take from where (note that the example reproduced below corresponds to the '''current''' way of choosing '''xml files'''; it is therefore strongly advised to check the README, which should always be up to date):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
- nudging_dynamico.xml&lt;br /&gt;
- sponge_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
At the time of writing, the context_lmdz_physics.xml contained in LMDZ.VENUS/deftank is probably missing some lines, and you may run into a '''&amp;quot;dom_glo&amp;quot;''' issue when running DYNAMICO. Here are the lines to change or add:&lt;br /&gt;
&lt;br /&gt;
Replace line 7 entirely with:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;domain_group id=&amp;quot;dom_glo&amp;quot; data_dim=&amp;quot;1&amp;quot; &amp;gt;&lt;br /&gt;
  &amp;lt;domain id=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/domain_group&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_regular&amp;quot; ni_glo=&amp;quot;96&amp;quot; nj_glo=&amp;quot;97&amp;quot; type=&amp;quot;rectilinear&amp;quot;  &amp;gt;&lt;br /&gt;
      &amp;lt;generate_rectilinear_domain lat_start=&amp;quot;-90&amp;quot; lat_end=&amp;quot;90&amp;quot; lon_start=&amp;quot;180&amp;quot; lon_end=&amp;quot;-176.25&amp;quot; /&amp;gt;&lt;br /&gt;
      &amp;lt;interpolate_domain order=&amp;quot;1&amp;quot;/&amp;gt;&lt;br /&gt;
&amp;lt;/domain&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_out&amp;quot; domain_ref=&amp;quot;dom_regular&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, at lines 44 to 46 (line 34 in the original file), add:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, at lines 50 to 57 (line 37 in the original file), add:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;!-- output grids --&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_3D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
    &amp;lt;axis axis_ref=&amp;quot;altitude&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
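A quick grep can tell you whether your copy of the file already declares the “dom_glo” domain described above (the has_dom_glo name is our own helper, not part of the model):&lt;br /&gt;

```shell
# Sketch: check whether a context_lmdz_physics.xml already declares the
# "dom_glo" domain shown above; has_dom_glo is our own helper name.
has_dom_glo() {
    if grep -q 'domain id="dom_glo"' "$1"; then
        echo "dom_glo present"
    else
        echo "dom_glo missing"
    fi
}
```

Run it as `has_dom_glo context_lmdz_physics.xml` before launching DYNAMICO.&lt;br /&gt;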
&lt;br /&gt;
=== Where to find .def files ===&lt;br /&gt;
More '''.def''' files are needed to run the complete '''Venus-DYNAMICO with LMDZ physics''' setup than for the Held&amp;amp;Suarez test case:&lt;br /&gt;
&lt;br /&gt;
- '''run_icosa.def''' : everything related to the DYNAMICO dynamical core is driven by this file; see this page : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_run_icosa.def_Input_File The run_icosa.def Input File]&lt;br /&gt;
&lt;br /&gt;
- '''physics.def''' : everything related to the LMDZ physics is driven by this file.&lt;br /&gt;
&lt;br /&gt;
- '''run.def''' : simply a &amp;quot;bridge&amp;quot; that includes run_icosa.def and physics.def.&lt;br /&gt;
&lt;br /&gt;
=== Where to find other needed files ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
== Using the restart.nc file to continue your simulation ==&lt;br /&gt;
If you want to continue your simulation from the end state of a previous one, everything is explained on the DYNAMICO page : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core#Using_the_restart.nc_file_to_continue_your_simulation using restart.nc]&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2098</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2098"/>
				<updated>2024-07-10T09:39:25Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Where to find .def files */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core, enabling better performance and solving some issues of the LMDZ model. To know more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ Venus''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
One should also see the '''PCM directory layout''' page to understand and install everything : [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/PCM_directory_layout PCM directory layout].&lt;br /&gt;
That said, we will explain it once more in this page to be sure everything works.&lt;br /&gt;
&lt;br /&gt;
You will also need '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
It is more practical for what comes next to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside the others.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture; the available architectures are listed in /XIOS/arch)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to specify the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
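The naming convention above also means the available architecture names can be listed by stripping the “arch-” prefix and “.env” suffix; a minimal sketch (the list_archs name is our own helper):&lt;br /&gt;

```shell
# Sketch: list the architecture names available in an arch/ directory by
# stripping the "arch-" prefix and ".env" suffix from the files, following
# the naming convention described above. list_archs is our own helper name.
list_archs() {
    for f in "$1"/arch-*.env; do
        [ -e "$f" ] || continue
        b=$(basename "$f" .env)
        echo "${b#arch-}"
    done
}
```

Run it as `list_archs /your/path/trunk/XIOS/arch` to see what you can pass to --arch.&lt;br /&gt;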
&lt;br /&gt;
The same applies every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will now appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the /ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The “-debug” option compiles in debug mode, which is useful for troubleshooting but slower to run; remove it from the command above for production runs.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now we will run a test case “without the physics”, to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, we need to copy the .def files specific to this test case (they are in /ICOSAGCM); for the most part, we will use the same ones as in the basic Held&amp;amp;Suarez test case.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files in the repository test_VENUS&lt;br /&gt;
cp *def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should modify the stack size to avoid segmentation faults when running. Edit your ~/.bashrc:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
# Then reload your ~/.bashrc&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use nano:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
# Then reload your ~/.bashrc&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
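After sourcing ~/.bashrc, you can verify that the limit was actually lifted; a minimal sketch (the stack_check name is our own helper):&lt;br /&gt;

```shell
# Sketch: check whether the stack size limit was actually lifted after
# sourcing ~/.bashrc; stack_check is our own helper name.
stack_check() {
    s=$(ulimit -s)
    if [ "$s" = "unlimited" ]; then
        echo "stack ok"
    else
        echo "stack limited to $s kB"
    fi
}
```

Run `stack_check` in a fresh shell; if it still reports a limit, the ulimit line was not picked up.&lt;br /&gt;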
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the model run to produce a NetCDF file (.nc) with all the data, modify the .xml file “'''file_def_dynamico.xml'''”: on line 70, change “false” to “true” for “enabled”. This enables creation of the “dynamico.nc” file, which is already a re-interpolation of the DYNAMICO grid onto a longitude-latitude grid, making it directly usable with Ferret/Panoply.&lt;br /&gt;
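That one-line change can also be made with sed; a minimal sketch (the enable_output name is our own helper, and the line number 70 is taken from the description above; verify it in your own copy, as it may have moved between versions):&lt;br /&gt;

```shell
# Sketch: flip the "enabled" flag on line 70 of file_def_dynamico.xml,
# as described above. Check the line number in your own copy first; it
# may have moved between versions. enable_output is our own helper name.
enable_output() {
    sed -i '70s/enabled="false"/enabled="true"/' "$1"
}
```

Run it as `enable_output file_def_dynamico.xml` in your test directory.&lt;br /&gt;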
&lt;br /&gt;
Then, some changes must be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename the '''earth_const.def''' file to '''venus_const.def''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, you should rewrite '''venus_const.def''' completely to match the Venus atmosphere; here is an example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m)&lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity : real (default=8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate (rad/s)&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Specific heat capacity at constant pressure : real (default=1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure : real (default=9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
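Since the file defines kappa as Rd/cpp, the specific gas constant implied by the two values above can be recovered directly; a quick sanity check with awk:&lt;br /&gt;

```shell
# Sketch: recover the implied specific gas constant Rd from the kappa and
# cpp values in the file above, using the relation kappa = Rd/cpp given there.
awk 'BEGIN { kappa = 0.2857143; cpp = 1004; printf "Rd = %.2f J/kg/K\n", kappa * cpp }'
```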
&lt;br /&gt;
After this, it is time to change the '''run.def''' file. In short, you should set the '''&amp;quot;etat0&amp;quot;''' parameter to '''&amp;quot;venus&amp;quot;''', set the '''&amp;quot;physics&amp;quot;''' parameter to '''&amp;quot;Lebonnois2012&amp;quot;''', adjust '''&amp;quot;day_step&amp;quot;''' (because of the long day on Venus), etc. Rather than explaining every parameter that changes, here is an example of a complete file (that should work from scratch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30;strato;strato_custom;ncar;dcmip31;dcmip200;read_apbp;plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
#number of openmp task on vertical level&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# To restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
(one can compare with the previous run.def to see the differences) &lt;br /&gt;
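That comparison can be done with diff; a minimal sketch (the diff_defs name is our own helper, and the Held&amp;amp;Suarez path is the one used earlier on this page):&lt;br /&gt;

```shell
# Sketch: show what changed between the run.def you started from and the
# Venus one ("|| true" because diff exits non-zero when the files differ).
# diff_defs is our own helper name.
diff_defs() {
    diff -u "$1" "$2" || true
}
# e.g. diff_defs /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ/run.def run.def
```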
&lt;br /&gt;
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, modify the path and “YOUR_ARCH” to match your architecture (for the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To check that the code is running properly, you can follow the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_SUAREZ''' test case works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
And the ARCH directory, which contains information about each architecture.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it before.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” sets the model dimensions; it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
All these compilation steps are summed up in make_icosa_lmdz:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -full&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here, the -full option ensures that each part of the model ('''IOIPSL''', '''XIOS''', '''LMDZ.COMMON''', '''ICOSAGCM''' and '''ICOSA_LMDZ''') is compiled.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .xml files ===&lt;br /&gt;
&lt;br /&gt;
See the README.md in /ICOSA_LMDZ to know which '''xml''' files to take from where (note that the example reproduced below corresponds to the '''current''' way of choosing '''xml files'''; it is therefore strongly advised to check the README, which should always be up to date):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
- nudging_dynamico.xml&lt;br /&gt;
- sponge_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
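After these three copy steps (plus the two files to be created according to the README), your '''Venus_DYNAMICO''' directory should contain something like this (a sketch only; the exact list depends on your ICOSAGCM version):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
ls /your/path/Venus_DYNAMICO&lt;br /&gt;
context_dynamico.xml        context_lmdz_physics.xml  field_def_dynamico.xml&lt;br /&gt;
context_input_dynamico.xml  field_def_physics.xml     file_def_dynamico.xml&lt;br /&gt;
iodef.xml                   nudging_dynamico.xml      sponge_dynamico.xml&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;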
At the time of writing, the context_lmdz_physics.xml in LMDZ.VENUS/deftank is probably missing some lines, and you may run into a '''&amp;quot;dom_glo&amp;quot;''' error when running DYNAMICO; here are the lines to change/add:&lt;br /&gt;
&lt;br /&gt;
Replace line 7 entirely with:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;domain_group id=&amp;quot;dom_glo&amp;quot; data_dim=&amp;quot;1&amp;quot; &amp;gt;&lt;br /&gt;
  &amp;lt;domain id=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/domain_group&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_regular&amp;quot; ni_glo=&amp;quot;96&amp;quot; nj_glo=&amp;quot;97&amp;quot; type=&amp;quot;rectilinear&amp;quot;  &amp;gt;&lt;br /&gt;
      &amp;lt;generate_rectilinear_domain lat_start=&amp;quot;-90&amp;quot; lat_end=&amp;quot;90&amp;quot; lon_start=&amp;quot;180&amp;quot; lon_end=&amp;quot;-176.25&amp;quot; /&amp;gt;&lt;br /&gt;
      &amp;lt;interpolate_domain order=&amp;quot;1&amp;quot;/&amp;gt;&lt;br /&gt;
&amp;lt;/domain&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_out&amp;quot; domain_ref=&amp;quot;dom_regular&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, at lines 44 to 46 (line 34 in the original file), add:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, at lines 50 to 57 (line 37 in the original file), add:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;!-- output grids --&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_3D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
    &amp;lt;axis axis_ref=&amp;quot;altitude&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .def files ===&lt;br /&gt;
More '''.def''' files are needed to run the complete '''Venus-DYNAMICO with LMDZ physics''' (compared to the Held&amp;amp;Suarez testCase):&lt;br /&gt;
&lt;br /&gt;
- '''run_icosa.def''' : everything related to the DYNAMICO dynamical core is driven by this file.&lt;br /&gt;
&lt;br /&gt;
- '''physics.def''' : everything related to the LMDZ physics is driven by this file.&lt;br /&gt;
&lt;br /&gt;
- '''run.def''' : just a &amp;quot;bridge&amp;quot; towards run_icosa.def and physics.def.&lt;br /&gt;
&lt;br /&gt;
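Since '''run.def''' is only a bridge, it can be as simple as two INCLUDEDEF lines (a minimal sketch; make sure the file names match those of your run_icosa.def and physics.def):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# run.def : bridge towards the dynamics and physics parameter files&lt;br /&gt;
INCLUDEDEF=run_icosa.def&lt;br /&gt;
INCLUDEDEF=physics.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;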
=== Where to find other needed files ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
== Using the restart.nc file to continue your simulation ==&lt;br /&gt;
If you want to continue your simulation from the end state of a previous one, everything is explained on the DYNAMICO page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core#Using_the_restart.nc_file_to_continue_your_simulation using restart.nc]&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2097</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2097"/>
				<updated>2024-07-10T09:30:07Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Running Venus - DYNAMICO */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core, enabling better performance and solving some issues of the LMDZ model. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ Venus''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
One should also see the '''PCM directory layout''' page to understand and install everything: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/PCM_directory_layout PCM directory layout].&lt;br /&gt;
That said, we will explain it once more on this page to make sure everything works.&lt;br /&gt;
&lt;br /&gt;
Then, you should also have '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
It is more practical for what comes next to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside the others.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture. All available architectures are listed in /XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to specify the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This will be the same thing every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the /ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The “-debug” option compiles in debug mode; it makes the model slower to run, so remove it for production runs.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now, we will run a testCase “without the physics”, to verify that the Dynamical Core works alone.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, we need to copy the specific .def files for this testCase (they are in /ICOSAGCM); for the most part, we will use the same ones as in the basic Held&amp;amp;Suarez testCase.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files into the test_VENUS directory&lt;br /&gt;
cp *def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should increase the stack size to avoid segmentation faults when running. Edit your ~/.bashrc:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Open your ~/.bashrc in an editor&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to remove the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
# Then reload your ~/.bashrc&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use nano:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Open your ~/.bashrc in an editor&lt;br /&gt;
nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to remove the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
# Then reload your ~/.bashrc&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” (found in ICOSAGCM/bin) into the test directory test_VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If, when running the model, you want a NetCDF file (.nc) with all the data, you should modify the .xml file “'''file_def_dynamico.xml'''”, line 70, changing “false” to “true” for “enabled”. This will enable the creation of the “dynamico.nc” file; it is already re-interpolated from the DYNAMICO grid onto a longitude-latitude grid, which makes it usable directly with Ferret/Panoply.&lt;br /&gt;
&lt;br /&gt;
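The line in question looks roughly like this (the id and the other attributes are illustrative and may differ in your version of the file; only the “enabled” attribute matters here):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;!-- before: output disabled --&amp;gt;&lt;br /&gt;
&amp;lt;file id=&amp;quot;dynamico&amp;quot; name=&amp;quot;dynamico&amp;quot; enabled=&amp;quot;false&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- after: output enabled --&amp;gt;&lt;br /&gt;
&amp;lt;file id=&amp;quot;dynamico&amp;quot; name=&amp;quot;dynamico&amp;quot; enabled=&amp;quot;true&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;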
Then, there are some changes to be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename '''earth_const.def''' to '''venus_const.def''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, you should rewrite '''venus_const.def''' completely to match the Venus atmosphere; here is an example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m)&lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity: real (default = 8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate (rad/s)&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Heat capacity at constant pressure: real (default = 1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure: real (default = 9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After this, it is time to modify the '''run.def''' file. In short, you should set the '''&amp;quot;etat0&amp;quot;''' parameter to '''&amp;quot;venus&amp;quot;''', the '''&amp;quot;physics&amp;quot;''' parameter to '''&amp;quot;Lebonnois2012&amp;quot;''', adjust '''&amp;quot;day_step&amp;quot;''' (because of the long Venus day), etc. Rather than explaining every parameter that changes, here is an example of a complete file (that should work from scratch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30;strato;strato_custom;ncar;dcmip31;dcmip200;read_apbp;plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
#number of openmp task on vertical level&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
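# Note: with daysec = 20995200 s and day_step = 240000, the time step is&lt;br /&gt;
# dt = daysec/day_step = 87.48 s, so write_period = 314928 s corresponds&lt;br /&gt;
# to exactly 3600 dynamical time steps (one output every ~3.65 Earth days)&lt;br /&gt;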
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# To restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
(one can compare with the previous run.def to see the differences) &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, you should modify the path and replace “YOUR_ARCH” with your architecture (for the source command). Note that we are not using OpenMP here, as it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can follow the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the code is finished running, something like this should appear (at the end of the icosa_gcm.out):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_and_SUAREZ''' testCase works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you already need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
And the ARCH directory, which contains information about every architecture:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it before.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” is the model resolution; it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Eventually, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSA_LMDZ/&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
All these compilation steps are summed up in make_icosa_lmdz:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -full&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here, the -full option ensures the compilation of each part ('''IOIPSL''', '''XIOS''', '''LMDZ.COMMON''', '''ICOSAGCM''' and '''ICOSA_LMDZ''') of the model.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .xml files ===&lt;br /&gt;
&lt;br /&gt;
See the README.md in /ICOSA_LMDZ to know which '''xml''' files to take from where (note that the example right below corresponds to the '''current''' way of choosing '''xml files''' at the time of writing; it is therefore strongly advised to open the README, which should always be up to date):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
- nudging_dynamico.xml&lt;br /&gt;
- sponge_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
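After these three copy steps (plus the two files to be created according to the README), your '''Venus_DYNAMICO''' directory should contain something like this (a sketch only; the exact list depends on your ICOSAGCM version):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
ls /your/path/Venus_DYNAMICO&lt;br /&gt;
context_dynamico.xml        context_lmdz_physics.xml  field_def_dynamico.xml&lt;br /&gt;
context_input_dynamico.xml  field_def_physics.xml     file_def_dynamico.xml&lt;br /&gt;
iodef.xml                   nudging_dynamico.xml      sponge_dynamico.xml&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;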
At the time of writing, the context_lmdz_physics.xml in LMDZ.VENUS/deftank is probably missing some lines, and you may run into a '''&amp;quot;dom_glo&amp;quot;''' error when running DYNAMICO; here are the lines to change/add:&lt;br /&gt;
&lt;br /&gt;
Replace line 7 entirely with:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;domain_group id=&amp;quot;dom_glo&amp;quot; data_dim=&amp;quot;1&amp;quot; &amp;gt;&lt;br /&gt;
  &amp;lt;domain id=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/domain_group&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_regular&amp;quot; ni_glo=&amp;quot;96&amp;quot; nj_glo=&amp;quot;97&amp;quot; type=&amp;quot;rectilinear&amp;quot;  &amp;gt;&lt;br /&gt;
      &amp;lt;generate_rectilinear_domain lat_start=&amp;quot;-90&amp;quot; lat_end=&amp;quot;90&amp;quot; lon_start=&amp;quot;180&amp;quot; lon_end=&amp;quot;-176.25&amp;quot; /&amp;gt;&lt;br /&gt;
      &amp;lt;interpolate_domain order=&amp;quot;1&amp;quot;/&amp;gt;&lt;br /&gt;
&amp;lt;/domain&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;domain id=&amp;quot;dom_out&amp;quot; domain_ref=&amp;quot;dom_regular&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, at lines 44 to 46 (line 34 in the original file), add:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_glo&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, at lines 50 to 57 (line 37 in the original file), add:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;!-- output grids --&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_3D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
    &amp;lt;axis axis_ref=&amp;quot;altitude&amp;quot; /&amp;gt;&lt;br /&gt;
&amp;lt;/grid&amp;gt;&lt;br /&gt;
&amp;lt;grid id=&amp;quot;grid_2D_out&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;domain domain_ref=&amp;quot;dom_out&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Where to find .def files ===&lt;br /&gt;
&lt;br /&gt;
=== Where to find other needed files ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
== Using the restart.nc file to continue your simulation ==&lt;br /&gt;
If you want to continue your simulation from the end state of a previous one, everything is explained on the DYNAMICO page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core#Using_the_restart.nc_file_to_continue_your_simulation using restart.nc]&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2096</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2096"/>
				<updated>2024-07-10T09:07:53Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Running Venus - DYNAMICO */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core, enabling better performance and solving some issues of the LMDZ model. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ Venus''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
One should also see the '''PCM directory layout''' page to understand and install everything: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/PCM_directory_layout PCM directory layout].&lt;br /&gt;
That said, we will explain it once more on this page to make sure everything works.&lt;br /&gt;
&lt;br /&gt;
Then, you should also have '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
It is more practical for what comes next to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside the others.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture. All available architectures are listed in /XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to specify the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This will be the same thing every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the /ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The “-debug” option compiles in debug mode; it makes the model slower to run, so remove it for production runs.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now, we will run a testCase “without the physics”, to verify that the Dynamical Core works alone.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, we need to copy the specific .def files for this testCase (they are in /ICOSAGCM); for the most part, we will use the same ones as in the basic Held&amp;amp;Suarez testCase.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files into the test_VENUS directory&lt;br /&gt;
cp *def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
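The copy steps above can be sketched as a single script. The trunk_demo layout below is a mock-up standing in for your real /your/path/trunk; only the directory names from this page are assumed.&lt;br /&gt;

```shell
# Hypothetical sketch: gather the Held & Suarez .def files and the DYNAMICO
# .xml files into a fresh test_VENUS directory, failing loudly if a source
# directory is missing. trunk_demo mocks the real trunk/ tree.
trunk=trunk_demo
mkdir -p "$trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ" \
         "$trunk/ICOSAGCM/xml/DYNAMICO_XML" \
         "$trunk/test_VENUS"
touch "$trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ/run.def" \
      "$trunk/ICOSAGCM/xml/DYNAMICO_XML/field_def_dynamico.xml" \
      "$trunk/ICOSAGCM/xml/iodef.xml"
for src in "$trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ" "$trunk/ICOSAGCM/xml/DYNAMICO_XML"; do
  [ -d "$src" ] || { echo "missing: $src"; exit 1; }
done
cp "$trunk"/ICOSAGCM/TEST_CASE/HELD_SUAREZ/*def "$trunk/test_VENUS/"
cp "$trunk"/ICOSAGCM/xml/DYNAMICO_XML/*xml "$trunk/test_VENUS/"
cp "$trunk/ICOSAGCM/xml/iodef.xml" "$trunk/test_VENUS/"
ls "$trunk/test_VENUS"
```

Replace trunk_demo with your real trunk path (and drop the mkdir/touch mock-up lines) to use this for real.&lt;br /&gt;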
&lt;br /&gt;
Then, you should raise the stack size limit to avoid segmentation faults at run time. Add the following line to your ~/.bashrc, then reload it:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Open ~/.bashrc in an editor&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to remove the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
# Reload the configuration&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use another editor such as nano (&amp;lt;code&amp;gt;nano ~/.bashrc&amp;lt;/code&amp;gt;); the line to add is the same.&lt;br /&gt;
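This edit can also be scripted so that re-running the setup never duplicates the line. This is only a sketch; it operates on a throw-away bashrc_demo file instead of your real ~/.bashrc.&lt;br /&gt;

```shell
# Hypothetical sketch: append "ulimit -s unlimited" to an rc file only if it
# is not already present (grep -qx matches the whole line exactly).
rc=bashrc_demo
touch "$rc"
grep -qx 'ulimit -s unlimited' "$rc" || echo 'ulimit -s unlimited' >> "$rc"
# Running it a second time is a no-op:
grep -qx 'ulimit -s unlimited' "$rc" || echo 'ulimit -s unlimited' >> "$rc"
grep -c 'ulimit -s unlimited' "$rc"
```

Point rc at ~/.bashrc (and source it afterwards) to apply this for real.&lt;br /&gt;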
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the run to produce a NetCDF file (.nc) with all the data, edit the .xml file “'''file_def_dynamico.xml'''” at line 70, changing “false” to “true” for “enabled”. This enables creation of the “dynamico.nc” file, whose fields are already re-interpolated from the DYNAMICO grid onto a longitude-latitude grid, so it can be opened directly with Ferret/Panoply.&lt;br /&gt;
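The flip from “false” to “true” can also be done with sed instead of an editor. This is a sketch on a one-line stand-in file; in the real file_def_dynamico.xml, check the element around line 70 first, since the layout may differ between versions.&lt;br /&gt;

```shell
# Hypothetical sketch: switch the enabled attribute from false to true.
# file_def_demo.txt stands in for the relevant line of file_def_dynamico.xml.
printf 'enabled="false"\n' > file_def_demo.txt
sed -i 's/enabled="false"/enabled="true"/' file_def_demo.txt
cat file_def_demo.txt
```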
&lt;br /&gt;
Then, there are some changes to be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename the '''earth_const.def''' file to '''venus_const.def''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, you should rewrite '''venus_const.def''' entirely to match the Venus atmosphere; here is an example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m)&lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity : real (default = 8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate (rad/s)&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Heat capacity at constant pressure : real (default = 1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure : real (default = 9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
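Since kappa = Rd/cpp, the specific gas constant implied by a const file is Rd = kappa * cpp; checking that product catches inconsistent edits. A sketch on a stand-in file holding the values above:&lt;br /&gt;

```shell
# Hypothetical sanity check: derive Rd = kappa * cpp from a .def file.
# venus_const_demo.def stands in for the real venus_const.def.
printf 'kappa = 0.2857143\ncpp = 1004\n' > venus_const_demo.def
awk -F'=' '
  $1 ~ /^kappa/ {kappa=$2}
  $1 ~ /^cpp/   {cpp=$2}
  END {printf "Rd = %.1f J/kg/K\n", kappa*cpp}
' venus_const_demo.def
```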
&lt;br /&gt;
After this, it is time to edit the '''run.def''' file. In short, set the '''&amp;quot;etat0&amp;quot;''' parameter to '''&amp;quot;venus&amp;quot;''', the '''&amp;quot;physics&amp;quot;''' parameter to '''&amp;quot;Lebonnois2012&amp;quot;''', adjust '''&amp;quot;day_step&amp;quot;''' (because of the long Venus day), and so on. Rather than describing every parameter that changes, here is a complete example (which should work from scratch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30;strato;strato_custom;ncar;dcmip31;dcmip200;read_apbp;plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
#number of openmp task on vertical level&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# To restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
(You can compare this with the original run.def to see what changed.)&lt;br /&gt;
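One way to do that comparison without wading through comments is to strip comment and blank lines before diffing. A sketch on two stand-in files (only day_step differs here):&lt;br /&gt;

```shell
# Hypothetical sketch: diff two .def files, ignoring comments and blank lines,
# so only the parameters that actually differ are shown.
printf '# time step\nday_step = 480\n' > run_orig.def
printf '# time step\nday_step = 240000\n' > run_venus.def
strip() { grep -vE '^[[:space:]]*(#|$)' "$1" | sort; }
strip run_orig.def  > orig.stripped
strip run_venus.def > venus.stripped
diff orig.stripped venus.stripped || true   # diff exits 1 when files differ
```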
&lt;br /&gt;
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the Slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, replace the path and “YOUR_ARCH” with your own settings (for the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
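Those substitutions can be made in one pass with sed instead of editing by hand. The template line, target path and architecture below are illustrative only:&lt;br /&gt;

```shell
# Hypothetical sketch: fill in the path and YOUR_ARCH placeholders of a job
# script template. job_demo.slurm.in stands in for your Slurm script.
printf 'source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env\n' > job_demo.slurm.in
sed -e 's|/your/path/trunk|/data/me/trunk|' \
    -e 's/YOUR_ARCH/ifort_MESOIPSL/' \
    job_demo.slurm.in > job_demo.slurm
cat job_demo.slurm
```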
&lt;br /&gt;
To verify that the code is running properly, you can follow the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
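To compare runs quickly, the wall-clock time can be pulled out of the log with awk. A sketch on a stand-in log reproducing the lines above:&lt;br /&gt;

```shell
# Hypothetical sketch: extract the "Time elapsed" value from an icosa_gcm.out
# log. The demo file mimics the final lines of a real run.
printf 'GETIN restart_file_name = restart\nTime elapsed :    601.628763000000\n' > icosa_gcm_demo.out
awk '/Time elapsed/ {print $4}' icosa_gcm_demo.out
```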
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_SUAREZ''' test case works, we can “plug” the '''DYNAMICO''' dynamical core into real physics. For this, you need '''LMDZ''' installed alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You also need the ARCH directory, which contains information about each architecture:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': it was installed just above.&lt;br /&gt;
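The expected layout above can be verified with a short loop before launching the compilation chain. A sketch using a mocked layout_demo directory; run the loop alone inside your real trunk/:&lt;br /&gt;

```shell
# Hypothetical sketch: check that every sibling directory required for the
# coupled build is present, printing each missing one.
mkdir -p layout_demo/ARCH layout_demo/ICOSAGCM layout_demo/ICOSA_LMDZ \
         layout_demo/LMDZ.COMMON layout_demo/LMDZ.VENUS layout_demo/IOIPSL \
         layout_demo/XIOS
cd layout_demo
missing=0
for d in ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS; do
  if [ ! -d "$d" ]; then
    echo "missing: $d"
    missing=1
  fi
done
if [ "$missing" -eq 0 ]; then
  echo "all components present"
fi
cd ..
```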
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” is the model grid dimension; it can be changed.&lt;br /&gt;
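The -d argument packs the grid size as three numbers separated by “x” (assumed here to be longitude, latitude and vertical levels); splitting it makes each dimension explicit:&lt;br /&gt;

```shell
# Sketch: split the -d grid specification into its three dimensions
# (assumed here to be longitude x latitude x vertical levels).
dims="48x32x50"
set -- $(echo "$dims" | tr x ' ')
nlon=$1; nlat=$2; nlev=$3
echo "lon=$nlon lat=$nlat levels=$nlev"
```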
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
All these compilation steps can be run at once with make_icosa_lmdz:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -full&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here, the -full option ensures that each component of the model ('''IOIPSL''', '''XIOS''', '''LMDZ.COMMON''', '''ICOSAGCM''' and '''ICOSA_LMDZ''') is compiled.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the README.md in /ICOSA_LMDZ to find out which files to take from where. (Note that what follows reflects the way the XML files were organised when this documentation was written; it is therefore strongly advised to open the README, which should always be up to date.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
- nudging_dynamico.xml&lt;br /&gt;
- sponge_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
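The three copy steps above can be folded into one loop that reports missing files instead of stopping half-way. The src_* directories below are mock-ups standing in for the ICOSAGCM, ICOSA_LMDZ and LMDZ.VENUS trees:&lt;br /&gt;

```shell
# Hypothetical sketch: gather the XML files needed in the run directory,
# reporting any that are missing. Mocked sources replace the real trees.
mkdir -p src_dyn src_itf src_phy Venus_DYNAMICO_demo
touch src_dyn/field_def_dynamico.xml src_itf/iodef.xml src_phy/context_lmdz_physics.xml
for f in src_dyn/field_def_dynamico.xml \
         src_itf/iodef.xml \
         src_phy/context_lmdz_physics.xml; do
  if [ -f "$f" ]; then
    cp "$f" Venus_DYNAMICO_demo/
  else
    echo "missing: $f"
  fi
done
ls Venus_DYNAMICO_demo
```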
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
== Using the restart.nc file to continue your simulation ==&lt;br /&gt;
If you want to continue your simulation from the end state of a previous one, everything is explained on the DYNAMICO page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core#Using_the_restart.nc_file_to_continue_your_simulation using restart.nc]&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2095</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2095"/>
				<updated>2024-07-10T08:52:50Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Connection Venus - DYNAMICO */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core, offering better performance and fixing some issues of the LMDZ model. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ Venus''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
One should also read the '''PCM directory layout''' page to understand and install everything: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/PCM_directory_layout PCM directory layout].&lt;br /&gt;
That said, we explain the steps once more on this page to make sure everything works.&lt;br /&gt;
&lt;br /&gt;
Then, you also need '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
It is more practical for what comes next, to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside each other.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture. The available architectures are listed in /XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to spell out the file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The same convention applies every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the /ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The “-debug” option compiles in debug mode; the resulting executable runs more slowly, so drop it for production runs.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now, we will run a test case “without the physics”, to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, we need to copy the specific .def files for this test case (they are in /ICOSAGCM); for the most part, they are the same as in the basic Held&amp;amp;Suarez test case.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files into the test_VENUS directory&lt;br /&gt;
cp *def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should raise the stack size limit to avoid segmentation faults at run time. Add the following line to your ~/.bashrc, then reload it:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Open ~/.bashrc in an editor&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to remove the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
# Reload the configuration&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use another editor such as nano (&amp;lt;code&amp;gt;nano ~/.bashrc&amp;lt;/code&amp;gt;); the line to add is the same.&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the run to produce a NetCDF file (.nc) with all the data, edit the .xml file “'''file_def_dynamico.xml'''” at line 70, changing “false” to “true” for “enabled”. This enables creation of the “dynamico.nc” file, whose fields are already re-interpolated from the DYNAMICO grid onto a longitude-latitude grid, so it can be opened directly with Ferret/Panoply.&lt;br /&gt;
&lt;br /&gt;
Then, there are some changes to be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename the '''earth_const.def''' file to '''venus_const.def''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, you should rewrite '''venus_const.def''' entirely to match the Venus atmosphere; here is an example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m)&lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity : real (default = 8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate (rad/s)&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Heat capacity at constant pressure : real (default = 1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure : real (default = 9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After this, it is time to edit the '''run.def''' file. In short, set the '''&amp;quot;etat0&amp;quot;''' parameter to '''&amp;quot;venus&amp;quot;''', the '''&amp;quot;physics&amp;quot;''' parameter to '''&amp;quot;Lebonnois2012&amp;quot;''', adjust '''&amp;quot;day_step&amp;quot;''' (because of the long Venus day), and so on. Rather than describing every parameter that changes, here is a complete example (which should work from scratch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30;strato;strato_custom;ncar;dcmip31;dcmip200;read_apbp;plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
#number of openmp task on vertical level&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# To restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
(You can compare this with the original run.def to see what changed.)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the Slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, replace the path and “YOUR_ARCH” with your own settings (for the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can follow the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_SUAREZ''' test case works, we can “plug” the '''DYNAMICO''' dynamical core into real physics. For this, you need '''LMDZ''' installed alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You also need the ARCH directory, which contains information about each architecture:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': it was installed just above.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” sets the model grid dimensions; it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Eventually, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSA_LMDZ/&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
All these compilation steps can be run in one go with make_icosa_lmdz:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -full&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here, the -full option ensures that every component of the model ('''IOIPSL''', '''XIOS''', '''LMDZ.COMMON''', '''ICOSAGCM''' and '''ICOSA_LMDZ''') is compiled.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the README.md in /ICOSA_LMDZ to know which files to copy from where:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
== Using the restart.nc file to continue your simulation ==&lt;br /&gt;
If you want to continue your simulation from the end state of a previous one, everything is explained on the DYNAMICO page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core#Using_the_restart.nc_file_to_continue_your_simulation using restart.nc]&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2094</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2094"/>
				<updated>2024-07-10T08:51:03Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Installation - DYNAMICO */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core; it enables better performance and solves some issues of the LMDZ model. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ Venus''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
One should also read the '''PCM directory layout''' page to understand and install everything: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/PCM_directory_layout PCM directory layout].&lt;br /&gt;
That said, we explain the steps again on this page to make sure everything works.&lt;br /&gt;
&lt;br /&gt;
You also need '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
It is more practical for what comes next, to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside each other.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(You have to replace “YOUR_ARCH” with your architecture. The available architectures are listed in /XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to spell out the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The same applies every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will now appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the /ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The “-debug” option compiles in debug mode; the resulting executable runs slower, so drop it for production runs.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now, we will run a testCase “without the physics”, to verify that the Dynamical Core works alone.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then we need to copy the .def files specific to this testCase (they are in /ICOSAGCM); for the most part, we will use the same ones as in the basic Held&amp;amp;Suarez testCase.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files in the repository test_VENUS&lt;br /&gt;
cp *def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should raise the stack size limit to avoid segmentation faults at run time. Add the following line to your ~/.bashrc, then source it:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This option will unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use nano:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This option will unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the model run to produce a NetCDF file (.nc) with all the data, modify “'''file_def_dynamico.xml'''”: at line 70, change “false” to “true” for the “enabled” attribute. This enables creation of the “dynamico.nc” file, which is already a re-interpolation of the DYNAMICO grid onto a longitude-latitude grid, so it can be used directly with Ferret/Panoply.&lt;br /&gt;
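One way to make that one-line edit from the command line (a sketch only: the stand-in file below exists just so the example is self-contained, and the line number 70 comes from the text above, so check it against your own copy of '''file_def_dynamico.xml'''):&lt;br /&gt;

```shell
# Create a 70-line stand-in file (hypothetical content; a real
# file_def_dynamico.xml holds XIOS file definitions instead).
{ seq 69 | sed 's/.*/filler/'; echo 'file id="dynamico" enabled="false"'; } > file_def_dynamico.xml

# The actual edit: on line 70, switch enabled="false" to "true"
sed -i '70s/enabled="false"/enabled="true"/' file_def_dynamico.xml

# Show the edited line to confirm the change
sed -n '70p' file_def_dynamico.xml
```

You can of course make the same change in a text editor; the point is only that the “enabled” attribute controls whether the “dynamico.nc” output is written.&lt;br /&gt;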
&lt;br /&gt;
Then, there are some changes to be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename '''earth_const.def''' to '''venus_const.def''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, rewrite '''venus_const.def''' completely to match the Venus atmosphere; here is an example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m)&lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity : real (default=8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Heat capacity at constant pressure : real (default=1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure : real (default=9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After this, it is time to change the '''run.def''' file. In short, you should set the '''&amp;quot;etat0&amp;quot;''' parameter to '''&amp;quot;venus&amp;quot;''', set the '''&amp;quot;physics&amp;quot;''' parameter to '''&amp;quot;Lebonnois2012&amp;quot;''', adjust '''&amp;quot;day_step&amp;quot;''' (because of the long day on Venus), and so on. Rather than explaining every parameter that changes, here is a complete example (which should work from scratch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30;strato;strato_custom;ncar;dcmip31;dcmip200;read_apbp;plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
#number of openmp task on vertical level&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# If you want to restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
(one can compare with the previous run.def to see the differences) &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the Slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, replace the path and “YOUR_ARCH” (in the source command) with your own settings. Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can watch the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the code is finished running, something like this should appear (at the end of the icosa_gcm.out):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_and_SUAREZ''' testCase works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you already need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update -r 2655 -q ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can also make a directory named '''ARCH''' and put your '''arch-YOUR_ARCH.env''' and '''arch-YOUR_ARCH.path''' files in it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir ARCH&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/arch&lt;br /&gt;
cp arch-YOUR_ARCH.env /your/path/trunk/ARCH&lt;br /&gt;
cp arch-YOUR_ARCH.path /your/path/trunk/ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it before.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” sets the model grid dimensions; it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Eventually, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSA_LMDZ/&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
All these compilation steps can be run in one go with make_icosa_lmdz:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -full&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here, the -full option ensures that every component of the model ('''IOIPSL''', '''XIOS''', '''LMDZ.COMMON''', '''ICOSAGCM''' and '''ICOSA_LMDZ''') is compiled.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the README.md in /ICOSA_LMDZ to know which files to copy from where:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
== Using the restart.nc file to continue your simulation ==&lt;br /&gt;
If you want to continue your simulation from the end state of a previous one, everything is explained on the DYNAMICO page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core#Using_the_restart.nc_file_to_continue_your_simulation using restart.nc]&lt;br /&gt;
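For reference, restarting boils down to the two run.def keys that already appear (commented out) in the run.def example earlier on this page; a minimal sketch:&lt;br /&gt;

```
# run.def: restart from the end state of a previous simulation
etat0 = start_file
start_file_name = start
```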
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2091</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2091"/>
				<updated>2024-07-04T10:00:26Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Compilation Venus - DYNAMICO */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core; it enables better performance and solves some issues of the LMDZ model. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
You also need '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
It is more practical for what comes next, to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside each other.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(You have to replace “YOUR_ARCH” with your architecture. The available architectures are listed in /XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to spell out the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The same applies every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will now appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the /ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The “-debug” option compiles in debug mode; the resulting executable runs slower, so drop it for production runs.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now, we will run a testCase “without the physics”, to verify that the Dynamical Core works alone.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then we need to copy the .def files specific to this testCase (they are in /ICOSAGCM); for the most part, we will use the same ones as in the basic Held&amp;amp;Suarez testCase.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files in the repository test_VENUS&lt;br /&gt;
cp *def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should raise the stack size limit to avoid segmentation faults at run time. Add the following line to your ~/.bashrc, then source it:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This option will unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use nano:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This option will unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the model run to produce a NetCDF file (.nc) with all the data, modify “'''file_def_dynamico.xml'''”: at line 70, change “false” to “true” for the “enabled” attribute. This enables creation of the “dynamico.nc” file, which is already a re-interpolation of the DYNAMICO grid onto a longitude-latitude grid, so it can be used directly with Ferret/Panoply.&lt;br /&gt;
&lt;br /&gt;
Then some changes must be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename the '''earth_const.def''' file to '''venus_const.def''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, you should rewrite '''venus_const.def''' entirely so that its constants match the Venus atmosphere; here is an example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m)&lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity : real (default = 8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate (rad/s)&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Specific heat at constant pressure : real (default = 1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure (Pa) : real (default = 9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After this, it is time to edit the '''run.def''' file: in short, change the '''&amp;quot;etat0&amp;quot;''' parameter to '''&amp;quot;venus&amp;quot;''', the '''&amp;quot;physics&amp;quot;''' parameter to '''&amp;quot;Lebonnois2012&amp;quot;''', the '''&amp;quot;day_step&amp;quot;''' parameter (because of the long day on Venus), etc. Rather than explaining every parameter that changes, here is a complete example (which should work from scratch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30|strato|strato_custom|dcmip31|dcmip200|read_apbp|plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
#number of openmp task on vertical level&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# To restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
(one can compare with the previous run.def to see the differences) &lt;br /&gt;
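&lt;br /&gt;
Assuming you kept a backup of the original Held&amp;amp;Suarez run.def, a plain diff shows the changes at a glance. The file names and parameter values below are a hypothetical miniature, just to illustrate the command:&lt;br /&gt;

```shell
# Build two tiny stand-in run.def files; in practice, diff your edited
# run.def against a backup of the Held-Suarez original
printf 'etat0 = venus\nday_step = 240000\n' | tee run.def.demo
printf 'etat0 = heldsz\nday_step = 14400\n' | tee run.def.orig

# diff exits with status 1 when the files differ, hence the trailing "|| true"
diff run.def.orig run.def.demo || true
```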
&lt;br /&gt;
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the Slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, adjust the path and replace “YOUR_ARCH” with your architecture (in the source command). Note that OpenMP is not used here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can watch the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the Held&amp;amp;Suarez-type test case works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update -r 2655 -q ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can also make a directory named '''ARCH''' and put your '''arch-YOUR_ARCH.env''' and '''arch-YOUR_ARCH.path''' files in it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir ARCH&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/arch&lt;br /&gt;
cp arch-YOUR_ARCH.env /your/path/trunk/ARCH&lt;br /&gt;
cp arch-YOUR_ARCH.path /your/path/trunk/ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it before.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” sets the model grid dimensions (longitude x latitude x vertical levels); it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Eventually, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
All these compilation steps can be performed at once with make_icosa_lmdz:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -full&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here, the -full option ensures that every part of the model ('''IOIPSL''', '''XIOS''', '''LMDZ.COMMON''', '''ICOSAGCM''' and '''ICOSA_LMDZ''') is compiled.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the README.md in ICOSA_LMDZ to know which files to take from where:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
== Using the restart.nc file to continue your simulation ==&lt;br /&gt;
If you want to continue your simulation using the &amp;quot;end data&amp;quot; of a previous one, everything is explained on the DYNAMICO page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core#Using_the_restart.nc_file_to_continue_your_simulation using restart.nc]&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2090</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2090"/>
				<updated>2024-06-25T14:27:29Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core; it offers better performance and solves some issues of the LMDZ model. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
You also need '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For what comes next, it is more practical to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside the others.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture; all available architectures are listed in XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to write the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, the 3 files are '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
It works the same way every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The “-debug” option compiles the model in debug mode; it runs more slowly, so remove the option for production runs.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now we will run a test case “without the physics” to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then we need to copy the specific .def files for this test case (they are in ICOSAGCM); for the most part, we will use the same ones as in the basic Held&amp;amp;Suarez test case.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files in the repository test_VENUS&lt;br /&gt;
cp *def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then you should raise the stack size limit to avoid segmentation faults at run time. Edit your ~/.bashrc (with gedit, or nano if gedit is unavailable), add the ulimit line below, then reload the file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc   # or: nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to ~/.bashrc; it removes the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
# Then reload the configuration&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the model run to produce a NetCDF file (.nc) containing all the data, edit the .xml file “'''file_def_dynamico.xml'''” at line 70 and change “false” to “true” for the “enabled” attribute. This enables creation of the “dynamico.nc” file, which is already re-interpolated from the DYNAMICO grid onto a longitude-latitude grid, so it can be used directly with Ferret/Panoply.&lt;br /&gt;
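&lt;br /&gt;
As a sketch, the flag can also be flipped from the command line with sed. The file below is a simplified, hypothetical stand-in for the real '''file_def_dynamico.xml'''; on the real file, adjust the pattern so it only touches the intended line.&lt;br /&gt;

```shell
# Demo on a one-line stand-in file; in the real file_def_dynamico.xml the
# enabled="false" attribute sits on the file definition around line 70
printf 'file id="dynamico" enabled="false"\n' | tee demo_file_def.txt

# Flip the flag (GNU sed in-place edit)
sed -i 's/enabled="false"/enabled="true"/' demo_file_def.txt

# Check the result
grep 'enabled=' demo_file_def.txt
```

Back up the real file before editing it, or simply change the line in a text editor.&lt;br /&gt;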
&lt;br /&gt;
Then some changes must be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename the '''earth_const.def''' file to '''venus_const.def''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, you should rewrite '''venus_const.def''' entirely so that its constants match the Venus atmosphere; here is an example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m)&lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity : real (default = 8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate (rad/s)&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Specific heat at constant pressure : real (default = 1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure (Pa) : real (default = 9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After this, it is time to edit the '''run.def''' file: in short, change the '''&amp;quot;etat0&amp;quot;''' parameter to '''&amp;quot;venus&amp;quot;''', the '''&amp;quot;physics&amp;quot;''' parameter to '''&amp;quot;Lebonnois2012&amp;quot;''', the '''&amp;quot;day_step&amp;quot;''' parameter (because of the long day on Venus), etc. Rather than explaining every parameter that changes, here is a complete example (which should work from scratch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30|strato|strato_custom|dcmip31|dcmip200|read_apbp|plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
#number of openmp task on vertical level&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# To restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
(one can compare with the previous run.def to see the differences) &lt;br /&gt;
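&lt;br /&gt;
Assuming you kept a backup of the original Held&amp;amp;Suarez run.def, a plain diff shows the changes at a glance. The file names and parameter values below are a hypothetical miniature, just to illustrate the command:&lt;br /&gt;

```shell
# Build two tiny stand-in run.def files; in practice, diff your edited
# run.def against a backup of the Held-Suarez original
printf 'etat0 = venus\nday_step = 240000\n' | tee run.def.demo
printf 'etat0 = heldsz\nday_step = 14400\n' | tee run.def.orig

# diff exits with status 1 when the files differ, hence the trailing "|| true"
diff run.def.orig run.def.demo || true
```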
&lt;br /&gt;
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the Slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, adjust the path and replace “YOUR_ARCH” with your architecture (in the source command). Note that OpenMP is not used here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can watch the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the Held&amp;amp;Suarez-type test case works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update -r 2655 -q ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can also make a directory named '''ARCH''' and put your '''arch-YOUR_ARCH.env''' and '''arch-YOUR_ARCH.path''' files in it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir ARCH&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/arch&lt;br /&gt;
cp arch-YOUR_ARCH.env /your/path/trunk/ARCH&lt;br /&gt;
cp arch-YOUR_ARCH.path /your/path/trunk/ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it before.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” sets the model grid dimensions (longitude x latitude x vertical levels); it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Eventually, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the README.md in ICOSA_LMDZ to know which files to take from where:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
== Using the restart.nc file to continue your simulation ==&lt;br /&gt;
If you want to continue your simulation using the &amp;quot;end-data&amp;quot; of a previous one, everything is explained on the DYNAMICO page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core#Using_the_restart.nc_file_to_continue_your_simulation using restart.nc]&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2089</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2089"/>
				<updated>2024-06-25T14:11:09Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core; it offers better performance and solves some issues of the LMDZ model. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
You also need '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
It is more practical for what comes next, to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside each other.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture; all available architectures are listed in /XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to give the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This will be the same thing every time you have to specify your architecture.&lt;br /&gt;
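As a quick illustration of the naming scheme, here is a small shell sketch that derives the names accepted by “--arch” from the arch-*.fcm files; the directory and file names below are stand-ins built for the demo, not your real installation.&lt;br /&gt;

```shell
# Demo only: build a throwaway arch directory; in practice point ARCHDIR
# at /your/path/trunk/XIOS/arch instead.
ARCHDIR=$(mktemp -d)
touch "$ARCHDIR/arch-ifort_MESOIPSL.fcm" "$ARCHDIR/arch-gfortran_linux.fcm"
names=""
for f in "$ARCHDIR"/arch-*.fcm; do
  b=${f##*/arch-}            # strip the directory and the "arch-" prefix
  names="$names ${b%.fcm}"   # strip the ".fcm" suffix
done
echo "available arch names:$names"
```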
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the /ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The “-debug” option compiles in debug mode, which runs slower; omit it for production runs.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now we will run a test case “without the physics” to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then we need to copy the .def files specific to this test case (they are in /ICOSAGCM); for the most part, we use the same ones as in the basic Held&amp;amp;Suarez test case.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files to the test_VENUS directory&lt;br /&gt;
cp *def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should modify the stack size to avoid any segmentation fault when running. Change your ~/.bashrc:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This option will unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use nano:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This option will unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” (it is placed in ICOSAGCM/bin) in the test directory test_VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the model run to produce a NetCDF file (.nc) with all the data, edit “'''file_def_dynamico.xml'''” at line 70 and change “enabled” from “false” to “true”. This makes the run create the “dynamico.nc” file, which is already a re-interpolation of the DYNAMICO grid onto a longitude-latitude grid, so it can be used directly with Ferret/Panoply.&lt;br /&gt;
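If you prefer to make this change from the command line, a sed one-liner can flip the flag. The exact attribute layout in file_def_dynamico.xml may differ, so check around line 70 first; the sketch below demonstrates on a stand-in file.&lt;br /&gt;

```shell
# Demo on a stand-in file; run the sed line on the real
# file_def_dynamico.xml after checking how the attribute is written there.
cat > file_def_demo.xml <<'EOF'
<file id="dynamico" name="dynamico" enabled="false">
EOF
sed -i 's/enabled="false"/enabled="true"/' file_def_demo.xml
grep 'enabled' file_def_demo.xml
```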
&lt;br /&gt;
Then, a few changes must be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename the '''earth_const.def''' file to '''venus_const.def''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, rewrite '''venus_const.def''' entirely to match the Venus atmosphere; here is an example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m)&lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity : real (default = 8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Specific heat at constant pressure : real (default = 1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure : real (default = 9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After this, it is time to change the '''run.def''' file. In short, set the '''&amp;quot;etat0&amp;quot;''' parameter to '''&amp;quot;venus&amp;quot;''', set the '''&amp;quot;physics&amp;quot;''' parameter to '''&amp;quot;Lebonnois2012&amp;quot;''', adjust '''&amp;quot;day_step&amp;quot;''' (because of the long day on Venus), and so on. Rather than explaining every parameter that changes, here is a complete example (that should work from scratch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30;strato;strato_custom;ncar;dcmip31;dcmip200;read_apbp;plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
#number of openmp task on vertical level&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# To restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
(one can compare with the previous run.def to see the differences) &lt;br /&gt;
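One sanity check worth doing on these values: with the Venus day length from venus_const.def (daysec = 20995200 s), day_step = 240000 gives the implied dynamical time step, which you can compute with a one-liner:&lt;br /&gt;

```shell
# dt = daysec / day_step, using the values from the .def files above
awk 'BEGIN { printf "dt = %.2f s per dynamical step\n", 20995200 / 240000 }'
```

so each dynamical step covers 87.48 s of the Venus day.&lt;br /&gt;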
&lt;br /&gt;
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, adjust the path and replace “YOUR_ARCH” with your architecture (in the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can follow the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the Held&amp;amp;Suarez-type test case works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update -r 2655 -q ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can also make a directory named '''ARCH''' and put your '''arch-YOUR_ARCH.env''' and '''arch-YOUR_ARCH.path''' files in it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir ARCH&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/arch&lt;br /&gt;
cp arch-YOUR_ARCH.env /your/path/trunk/ARCH&lt;br /&gt;
cp arch-YOUR_ARCH.path /your/path/trunk/ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
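A quick way to spot a missing package before compiling is to loop over the expected folder names. The sketch below uses a throwaway directory as a stand-in for /your/path/trunk, with only some of the folders created so the check has something to report.&lt;br /&gt;

```shell
# Demo: only three of the expected folders exist in this stand-in trunk,
# so the loop reports the other four as missing.
TRUNK=$(mktemp -d)   # stand-in for /your/path/trunk in this demo
mkdir "$TRUNK/ARCH" "$TRUNK/ICOSAGCM" "$TRUNK/XIOS"
missing=""
for d in ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS; do
  [ -d "$TRUNK/$d" ] || missing="$missing $d"
done
echo "missing:$missing"
```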
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it before.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” is the model grid dimension; it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSA_LMDZ/&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare a run directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the README.md in /ICOSA_LMDZ to see which files to copy from where:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2088</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2088"/>
				<updated>2024-06-25T14:07:11Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Execution - Test_Case Held&amp;amp;Suarez - DYNAMICO */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core; it offers better performance and solves some issues of the LMDZ model. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
You also need '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
It is more practical for what comes next, to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside each other.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture; all available architectures are listed in /XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to give the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This will be the same thing every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the /ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The “-debug” option compiles in debug mode, which runs slower; omit it for production runs.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Venus (type Held&amp;amp;Suarez) - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now we will run a test case “without the physics” to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_VENUS'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then we need to copy the .def files specific to this test case (they are in /ICOSAGCM); for the most part, we use the same ones as in the basic Held&amp;amp;Suarez test case.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files to the test_VENUS directory&lt;br /&gt;
cp *def ../../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_VENUS&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should modify the stack size to avoid any segmentation fault when running. Change your ~/.bashrc:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This option will unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use nano:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This option will unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” (it is placed in ICOSAGCM/bin) in the test directory test_VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_VENUS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the model run to produce a NetCDF file (.nc) with all the data, edit “'''file_def_dynamico.xml'''” at line 70 and change “enabled” from “false” to “true”. This makes the run create the “dynamico.nc” file, which is already a re-interpolation of the DYNAMICO grid onto a longitude-latitude grid, so it can be used directly with Ferret/Panoply.&lt;br /&gt;
&lt;br /&gt;
Then, a few changes must be made to the run.def and earth_const.def files.&lt;br /&gt;
First, rename the '''earth_const.def''' file to '''venus_const.def''':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mv earth_const.def venus_const.def&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, rewrite '''venus_const.def''' entirely to match the Venus atmosphere; here is an example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Planetary radius (m)&lt;br /&gt;
radius = 6.0518e6&lt;br /&gt;
&lt;br /&gt;
# Length of a day (s)&lt;br /&gt;
daysec = 20995200&lt;br /&gt;
&lt;br /&gt;
# Gravity : real (default = 8.87)&lt;br /&gt;
g = 8.87&lt;br /&gt;
&lt;br /&gt;
# Planetary rotation rate&lt;br /&gt;
omega = 2.992e-7&lt;br /&gt;
&lt;br /&gt;
# kappa=Rd/cpp&lt;br /&gt;
kappa = 0.2857143&lt;br /&gt;
&lt;br /&gt;
# Specific heat at constant pressure : real (default = 1004.70885)&lt;br /&gt;
cpp = 1004&lt;br /&gt;
&lt;br /&gt;
# Reference pressure : real (default = 9200000)&lt;br /&gt;
preff = 9.2e6&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
After this, it is time to change the '''run.def''' file. Rather than explaining all the variables that change, here is a complete example (that should work from scratch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#---------------- Mesh ----------------&lt;br /&gt;
&lt;br /&gt;
# Number of subdivisions on a main triangle : integer (default=40)&lt;br /&gt;
nbp = 40&lt;br /&gt;
&lt;br /&gt;
# Number of vertical layers : integer (default=19)&lt;br /&gt;
llm = 19&lt;br /&gt;
&lt;br /&gt;
# Vertical grid : [std|ncar|ncarl30;strato;strato_custom;ncar;dcmip31;dcmip200;read_apbp;plugin] (default=std)&lt;br /&gt;
disvert = std&lt;br /&gt;
&lt;br /&gt;
# Mesh optimisation : number of iterations : integer (default=0)&lt;br /&gt;
optim_it = 1000&lt;br /&gt;
&lt;br /&gt;
# Sub splitting of main rhombus : integer (default=1)&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&lt;br /&gt;
#number of openmp task on vertical level&lt;br /&gt;
omp_level_size=1&lt;br /&gt;
&lt;br /&gt;
#---------------- Numerics ----------------&lt;br /&gt;
&lt;br /&gt;
# Advection called every itau_adv time steps : integer (default=2)&lt;br /&gt;
itau_adv = 1&lt;br /&gt;
&lt;br /&gt;
# Time step in s : real (default=480)&lt;br /&gt;
# dt = 720&lt;br /&gt;
# Alternative to specifying &amp;quot;dt&amp;quot;, specify number of steps per day : day_step&lt;br /&gt;
day_step = 240000&lt;br /&gt;
&lt;br /&gt;
# Number of tracers : integer (default=1)&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Time and output ----------------&lt;br /&gt;
&lt;br /&gt;
# Time style : [none|dcmip] (default=dcmip)&lt;br /&gt;
time_style = none&lt;br /&gt;
&lt;br /&gt;
# Run length in s : real (default=??)&lt;br /&gt;
# run_length = 1036800&lt;br /&gt;
# Alternative to specifying &amp;quot;run_length&amp;quot;, specify number of days to run : ndays&lt;br /&gt;
ndays=1&lt;br /&gt;
&lt;br /&gt;
# Interval in s between two outputs : integer (default=??)&lt;br /&gt;
write_period = 314928&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#---------------- Planet ----------------&lt;br /&gt;
&lt;br /&gt;
INCLUDEDEF=venus_const.def&lt;br /&gt;
&lt;br /&gt;
#---------------- Physical parameters ----------------&lt;br /&gt;
&lt;br /&gt;
# Initial state : &lt;br /&gt;
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)&lt;br /&gt;
etat0 = venus&lt;br /&gt;
&lt;br /&gt;
# To restart from the &amp;quot;end&amp;quot; of a previous simulation&lt;br /&gt;
#etat0=start_file&lt;br /&gt;
 &lt;br /&gt;
#start_file_name=start&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# Physics package : [none|held_suarez|dcmip] (default=none)&lt;br /&gt;
physics = Lebonnois2012&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for grad(div) : real (default=5000)&lt;br /&gt;
tau_graddiv = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of grad(div) dissipation : integer (default=1)&lt;br /&gt;
nitergdiv = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for curl(curl) : real (default=5000)&lt;br /&gt;
tau_gradrot = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of curl(curl) dissipation : integer (default=1)&lt;br /&gt;
nitergrot = 2&lt;br /&gt;
&lt;br /&gt;
# Dissipation time for div(grad) : real (default=5000)&lt;br /&gt;
tau_divgrad = 18000&lt;br /&gt;
&lt;br /&gt;
# Exponent of div(grad) dissipation : integer (default=1)&lt;br /&gt;
niterdivgrad = 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
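For reference, the commented-out &amp;quot;dt&amp;quot; and the &amp;quot;day_step&amp;quot; entry above are two ways of specifying the same quantity: the time step is the length of the (planetary) day in seconds divided by day_step. A quick sanity check of that relation in plain shell arithmetic (the 86400 s day length is an illustrative Earth value, not the Venus one, which comes from the planet constants file):&lt;br /&gt;
&lt;br /&gt;
```shell
# dt = day_length / day_step
# 86400 s is an illustrative (Earth) day length; the real value comes from
# the planet constants (venus_const.def for Venus)
day_length=86400
day_step=120
echo $((day_length / day_step))   # prints 720
```
&lt;br /&gt;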
&lt;br /&gt;
Everything is now ready to run the model. Go to '''test_VENUS''', then use the slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_VENUS&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, you should replace the path and “YOUR_ARCH” with your own architecture (for the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can monitor the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the code has finished running, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_SUAREZ''' test case works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you already need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update -r 2655 -q ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can also make a directory named '''ARCH''' and put your '''arch-YOUR_ARCH.env''' and '''arch-YOUR_ARCH.path''' files in it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir ARCH&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/arch&lt;br /&gt;
cp arch-YOUR_ARCH.env /your/path/trunk/ARCH&lt;br /&gt;
cp arch-YOUR_ARCH.path /your/path/trunk/ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it before.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” is the model grid dimension (longitude x latitude x vertical levels); it can be changed.&lt;br /&gt;
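The three numbers can be split out as follows (a trivial sketch; the ordering longitude x latitude x vertical levels is assumed here, following the usual LMDZ convention):&lt;br /&gt;
&lt;br /&gt;
```shell
# Split the -d argument of makelmdz_fcm into its components
dim="48x32x50"
echo "${dim//x/ }"   # prints: 48 32 50
```
&lt;br /&gt;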
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Eventually, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSA_LMDZ/&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the README.md in ICOSA_LMDZ to know which files to take from where:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=The_DYNAMICO_dynamical_core&amp;diff=2087</id>
		<title>The DYNAMICO dynamical core</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=The_DYNAMICO_dynamical_core&amp;diff=2087"/>
				<updated>2024-06-25T13:43:11Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What is DYNAMICO ? ==&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is a recent dynamical core for atmospheric general circulation models (GCM). It is based on an icosahedral hexagonal grid projected on the sphere, a hybrid pressure-based terrain-following vertical coordinate, second-order enstrophy-conserving finite-difference discretization and positive-definite advection.&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is coded in Fortran and meant to be used in a massively parallel environment (using MPI and OpenMP) and has been coupled to a number of physics packages, such as the Earth LMDZ6 physics package (see https://lmdz-forge.lmd.jussieu.fr/mediawiki/LMDZPedia/index.php/Accueil ; search there for the keyword DYNAMICO to get some related documentation) but also the planetary version of the Mars, Venus or Generic Planetary Climate Models (PCM).&lt;br /&gt;
&lt;br /&gt;
The DYNAMICO source code is freely available and can be downloaded using git:&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
The DYNAMICO project page (be warned that the information there is outdated and relates to an earlier, now obsolete, svn version) can be found at http://forge.ipsl.jussieu.fr/dynamico &lt;br /&gt;
&lt;br /&gt;
== Installing and running DYNAMICO ==&lt;br /&gt;
Here we just describe how to compile and run DYNAMICO by itself, i.e. without being coupled to any physics.&lt;br /&gt;
This is essentially done as an exercise to check that it has been correctly installed, before moving on to the more complex (and complete!) case of compiling and running with a given physics package.&lt;br /&gt;
&lt;br /&gt;
=== Prerequisites ===&lt;br /&gt;
There are a few prerequisites to installing and using DYNAMICO:&lt;br /&gt;
# An MPI library must be available (i.e. installed and ready to use)&lt;br /&gt;
# BLAS and LAPACK libraries must be available.&lt;br /&gt;
# The XIOS library, compiled with that same MPI library, must also be available. Check out the [[The_XIOS_Library|XIOS library]] page for some information on installing it.&lt;br /&gt;
&lt;br /&gt;
=== Downloading and compiling DYNAMICO ===&lt;br /&gt;
Using git&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
will create an '''ICOSAGCM''' directory containing all the necessary source code.&lt;br /&gt;
Note that it is advised that this directory be alongside the '''XIOS''' library directory (some relative paths in the dynamico arch*.env files assume this is the case; if not, you will need to modify these files accordingly).&lt;br /&gt;
&lt;br /&gt;
In the '''ICOSAGCM''' directory is the ''make_icosa'' compilation script, which is based on FCM and thus requires that adequate architecture ASCII files be available. The '''arch''' subdirectory contains examples for a few machines. Assuming you want to compile using ''somemachine'' architecture files (i.e. files ''arch/arch-somemachine.fcm'' , ''arch/arch-somemachine.env'', ''arch/arch-somemachine.path'' are available and contain the adequate information) you would run:&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch somemachine -job 8&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
If compilation went well, then you will find the executable '''icosa_gcm.exe''' in the '''bin''' subdirectory.&lt;br /&gt;
&lt;br /&gt;
==== For the experts: more about the arch files and their content ====&lt;br /&gt;
TO BE WRITTEN&lt;br /&gt;
&lt;br /&gt;
==== For the experts: more about the make_icosa script ====&lt;br /&gt;
To know more about possible options to the ''make_icosa'' script:&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
./make_icosa -h&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Running a simple Held and Suarez test case ===&lt;br /&gt;
DYNAMICO comes with a simple test case corresponding to a Held and Suarez simulation, an idealized benchmark in which the physics is replaced by a Newtonian relaxation of temperature and a Rayleigh friction on low-level winds (Held and Suarez, 1994).&lt;br /&gt;
&lt;br /&gt;
We will run this test case “without the physics”, to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_HELD_SUAREZ'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, we need to copy the specific .def files for this test case (they are in ICOSAGCM/TEST_CASE/HELD_SUAREZ):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files into the directory test_HELD_SUAREZ&lt;br /&gt;
cp *def ../../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should raise the stack size limit to avoid segmentation faults when running. Open your ~/.bashrc, add the &amp;quot;ulimit&amp;quot; line below, then reload the file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This line removes the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use nano:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This line removes the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” into the test directory test_HELD_SUAREZ (it is located in ICOSAGCM/bin):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If, when running the model, you want a NetCDF file (.nc) with all the data, you should edit the .xml file “'''file_def_dynamico.xml'''” at line 70, changing “false” to “true” for the “enabled” attribute. This enables creation of the “dynamico.nc” file, which is already re-interpolated from the DYNAMICO grid onto a longitude-latitude grid and can therefore be used directly with Ferret/Panoply.&lt;br /&gt;
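If you prefer not to open an editor, the flag can also be flipped with sed. This assumes the attribute is written exactly as enabled=&amp;quot;false&amp;quot; in your copy of the file, and note that it will flip every such occurrence, so check line 70 first:&lt;br /&gt;
&lt;br /&gt;
```shell
# Switch the output file definition on (in-place edit, GNU sed)
sed -i 's/enabled="false"/enabled="true"/' file_def_dynamico.xml
```
&lt;br /&gt;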
&lt;br /&gt;
Everything is ready to run the model. Go to '''test_HELD_SUAREZ''', then use the slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_HELD_SUAREZ&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, you should replace the path and “YOUR_ARCH” with your own architecture (for the source command). Note that the test case is designed assuming you will be using 10 MPI processes. Note also that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
And here is another one that should work on Irene-Rome (assuming you are in project &amp;quot;gen10391&amp;quot;):&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
# Partition to run on:&lt;br /&gt;
#MSUB -q rome&lt;br /&gt;
# project to run on &lt;br /&gt;
#MSUB -A  gen10391&lt;br /&gt;
# disks to use&lt;br /&gt;
#MSUB -m  scratch,work,store&lt;br /&gt;
# Job name&lt;br /&gt;
#MSUB -r job_mpi&lt;br /&gt;
# Job standard output:&lt;br /&gt;
#MSUB -o job_mpi.%I&lt;br /&gt;
# Job standard error:&lt;br /&gt;
#MSUB -e job_mpi.%I&lt;br /&gt;
# number of OpenMP threads c&lt;br /&gt;
#MSUB -c 1&lt;br /&gt;
# number of MPI tasks n&lt;br /&gt;
#MSUB -n 40&lt;br /&gt;
# number of nodes to use N&lt;br /&gt;
#MSUB -N 1&lt;br /&gt;
# max job run time T (in seconds)&lt;br /&gt;
#MSUB -T 7200&lt;br /&gt;
&lt;br /&gt;
source ../dynamico/arch.env&lt;br /&gt;
ccc_mprun -l icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can monitor the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the code has finished running, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Moreover, the following NetCDF output files should have been produced:&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
Ai.nc    daily_output_native.nc  restart.nc  time_counter.nc&lt;br /&gt;
apbp.nc  daily_output.nc         start0.nc   dynamico.nc&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
where:&lt;br /&gt;
* ''Ai.nc'', ''apbp.nc'' and ''time_counter.nc'' are unimportant &lt;br /&gt;
* ''start0.nc'' is a file containing the initial conditions of the simulation&lt;br /&gt;
* ''restart.nc'' is the output file containing the final state of the simulation&lt;br /&gt;
* ''daily_output_native.nc'' is the output file containing a time series of a selection of variables on the native (icosahedral) grid&lt;br /&gt;
* ''daily_output.nc'' is the output file containing a time series of a selection of variables re-interpolated on a regular longitude-latitude grid&lt;br /&gt;
* ''dynamico.nc'' is the output file containing all your simulation data, that can be easily plotted with ferret (for example)&lt;br /&gt;
&lt;br /&gt;
=== Using the restart.nc file to continue your simulation ===&lt;br /&gt;
If you want to use '''restart.nc''' to avoid restarting the simulation from scratch, here is the procedure to follow:&lt;br /&gt;
&lt;br /&gt;
In the run.def file, for the first launch, set the variable &amp;quot;nqtot&amp;quot; to 1 instead of 0.&lt;br /&gt;
&lt;br /&gt;
Then, run the model as usual.&lt;br /&gt;
&lt;br /&gt;
Once the execution is complete, a '''restart.nc''' file will be created (in addition to all the other .nc files); rename it to &amp;quot;start.nc&amp;quot;:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
mv restart.nc start.nc&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, modify the run.def file a second time. Change the variable &amp;quot;etat0 = held_suarez&amp;quot; to &amp;quot;etat0 = start_file&amp;quot;, and add an additional line. You would then have these lines:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
# etat0 = held_suarez (old line commented out)&lt;br /&gt;
etat0 = start_file&lt;br /&gt;
start_file_name = start&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note that even if your file is named &amp;quot;start.nc&amp;quot;, it is &amp;quot;start&amp;quot; that needs to be specified in the run.def (the .nc extension is taken into account automatically).&lt;br /&gt;
&lt;br /&gt;
Then, run the model as usual with the same script.slurm and sbatch.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
sbatch script.slurm&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You should see that the iterations start where those of the previous simulation stopped, which you can check with this command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want to do this again, remember to always use the newest '''restart.nc''' file as the starting point (renaming it to start.nc, etc.).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Mixed bag of comments about the run's setup and outputs ====&lt;br /&gt;
For those interested in more details about the key aspects and main parameters:&lt;br /&gt;
* The run control parameters are set in the ''run.def'' ASCII file, which is read at run-time. This is where one specifies, for instance, the model resolution (parameters nbp: number of subdivisions of the main triangles, and llm: number of vertical levels), the time step (parameter day_step: number of steps per day) and the length of the run (parameter ndays: number of days to run)&lt;br /&gt;
* The sub-splitting of the main rhombi (parameters nsplit_i and nsplit_j) controls the overall number of tiles (sub-domains). As a rule of thumb, when running in parallel (MPI) one wants as many sub-domains as available MPI processes. Since the icosahedron is made up of 10 rhombi, this implies that one should target using a total of 10*nsplit_i*nsplit_j processes.&lt;br /&gt;
* Outputs made by XIOS are controlled via the XML files, namely ''file_def_dynamico.xml''&lt;br /&gt;
* and so much more...&lt;br /&gt;
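The rule of thumb for the process count can be checked with a one-line computation (pure shell arithmetic, nothing model-specific):&lt;br /&gt;
&lt;br /&gt;
```shell
# Total number of MPI processes to target:
# 10 rhombi, each subdivided into nsplit_i * nsplit_j tiles
nsplit_i=2
nsplit_j=2
echo $((10 * nsplit_i * nsplit_j))   # prints 40
```
&lt;br /&gt;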
&lt;br /&gt;
[[Category:FAQ]]&lt;br /&gt;
[[Category:DYNAMICO]]&lt;br /&gt;
[[Category:Generic-DYNAMICO]]&lt;br /&gt;
[[Category:Mars-DYNAMICO]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=The_DYNAMICO_dynamical_core&amp;diff=2086</id>
		<title>The DYNAMICO dynamical core</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=The_DYNAMICO_dynamical_core&amp;diff=2086"/>
				<updated>2024-06-25T13:40:26Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What is DYNAMICO ? ==&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is a recent dynamical core for atmospheric general circulation models (GCM). It is based on an icosahedral hexagonal grid projected on the sphere, a hybrid pressure-based terrain-following vertical coordinate, second-order enstrophy-conserving finite-difference discretization and positive-definite advection.&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is coded in Fortran and meant to be used in a massively parallel environment (using MPI and OpenMP) and has been coupled to a number of physics packages, such as the Earth LMDZ6 physics package (see https://lmdz-forge.lmd.jussieu.fr/mediawiki/LMDZPedia/index.php/Accueil ; search there for the keyword DYNAMICO to get some related documentation) but also the planetary version of the Mars, Venus or Generic Planetary Climate Models (PCM).&lt;br /&gt;
&lt;br /&gt;
The DYNAMICO source code is freely available and can be downloaded using git:&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
The DYNAMICO project page (be warned that the information there is outdated and relates to an earlier, now obsolete, svn version) can be found at http://forge.ipsl.jussieu.fr/dynamico &lt;br /&gt;
&lt;br /&gt;
== Installing and running DYNAMICO ==&lt;br /&gt;
Here we just describe how to compile and run DYNAMICO by itself, i.e. without being coupled to any physics.&lt;br /&gt;
This is essentially done as an exercise to check that it has been correctly installed, before moving on to the more complex (and complete!) case of compiling and running with a given physics package.&lt;br /&gt;
&lt;br /&gt;
=== Prerequisites ===&lt;br /&gt;
There are a few prerequisites to installing and using DYNAMICO:&lt;br /&gt;
# An MPI library must be available (i.e. installed and ready to use)&lt;br /&gt;
# BLAS and LAPACK libraries must be available.&lt;br /&gt;
# The XIOS library, compiled with that same MPI library, must also be available. Check out the [[The_XIOS_Library|XIOS library]] page for some information on installing it.&lt;br /&gt;
&lt;br /&gt;
=== Downloading and compiling DYNAMICO ===&lt;br /&gt;
Using git&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
will create a '''dynamico''' directory containing all the necessary source code.&lt;br /&gt;
Note that it is advised that this directory be alongside the '''XIOS''' library directory (some relative paths in the dynamico arch*.env files assume this is the case; if not, you will need to modify these files accordingly).&lt;br /&gt;
&lt;br /&gt;
In the '''dynamico''' directory is the ''make_icosa'' compilation script, which is based on FCM and thus requires that adequate architecture ASCII files be available. The '''arch''' subdirectory contains examples for a few machines. Assuming you want to compile using ''somemachine'' architecture files (i.e. files ''arch/arch-somemachine.fcm'' , ''arch/arch-somemachine.env'', ''arch/arch-somemachine.path'' are available and contain the adequate information) you would run:&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch somemachine -job 8&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
If compilation went well, then you will find the executable '''icosa_gcm.exe''' in the '''bin''' subdirectory.&lt;br /&gt;
&lt;br /&gt;
==== For the experts: more about the arch files and their content ====&lt;br /&gt;
TO BE WRITTEN&lt;br /&gt;
&lt;br /&gt;
==== For the experts: more about the make_icosa script ====&lt;br /&gt;
To know more about possible options to the ''make_icosa'' script:&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
./make_icosa -h&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Running a simple Held and Suarez test case ===&lt;br /&gt;
DYNAMICO comes with a simple test case corresponding to a Held and Suarez simulation, an idealized benchmark in which the physics is replaced by a Newtonian relaxation of temperature and a Rayleigh friction on low-level winds (Held and Suarez, 1994).&lt;br /&gt;
&lt;br /&gt;
We will run this test case “without the physics”, to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_HELD_SUAREZ'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, we need to copy the specific .def files for this test case (they are in ICOSAGCM/TEST_CASE/HELD_SUAREZ):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files into the directory test_HELD_SUAREZ&lt;br /&gt;
cp *def ../../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should raise the stack size limit to avoid segmentation faults when running. Open your ~/.bashrc, add the &amp;quot;ulimit&amp;quot; line below, then reload the file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This line removes the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use nano:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This line removes the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” into the test directory test_HELD_SUAREZ (it is located in ICOSAGCM/bin):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If, when running the model, you want a NetCDF file (.nc) with all the data, you should edit the .xml file “'''file_def_dynamico.xml'''” at line 70, changing “false” to “true” for the “enabled” attribute. This enables creation of the “dynamico.nc” file, which is already re-interpolated from the DYNAMICO grid onto a longitude-latitude grid and can therefore be used directly with Ferret/Panoply.&lt;br /&gt;
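If you prefer not to open an editor, the flag can also be flipped with sed. This assumes the attribute is written exactly as enabled=&amp;quot;false&amp;quot; in your copy of the file, and note that it will flip every such occurrence, so check line 70 first:&lt;br /&gt;
&lt;br /&gt;
```shell
# Switch the output file definition on (in-place edit, GNU sed)
sed -i 's/enabled="false"/enabled="true"/' file_def_dynamico.xml
```
&lt;br /&gt;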
&lt;br /&gt;
Everything is ready to run the model. Go to '''test_HELD_SUAREZ''', then use the slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_HELD_SUAREZ&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, you should replace the path and “YOUR_ARCH” with your own architecture (for the source command). Note that the test case is designed assuming you will be using 10 MPI processes. Note also that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
And here is another one that should work on Irene-Rome (assuming you are in project &amp;quot;gen10391&amp;quot;):&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
# Partition to run on:&lt;br /&gt;
#MSUB -q rome&lt;br /&gt;
# project to run on &lt;br /&gt;
#MSUB -A  gen10391&lt;br /&gt;
# disks to use&lt;br /&gt;
#MSUB -m  scratch,work,store&lt;br /&gt;
# Job name&lt;br /&gt;
#MSUB -r job_mpi&lt;br /&gt;
# Job standard output:&lt;br /&gt;
#MSUB -o job_mpi.%I&lt;br /&gt;
# Job standard error:&lt;br /&gt;
#MSUB -e job_mpi.%I&lt;br /&gt;
# number of OpenMP threads c&lt;br /&gt;
#MSUB -c 1&lt;br /&gt;
# number of MPI tasks n&lt;br /&gt;
#MSUB -n 40&lt;br /&gt;
# number of nodes to use N&lt;br /&gt;
#MSUB -N 1&lt;br /&gt;
# max job run time T (in seconds)&lt;br /&gt;
#MSUB -T 7200&lt;br /&gt;
&lt;br /&gt;
source ../dynamico/arch.env&lt;br /&gt;
ccc_mprun -l icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can monitor the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Moreover, the following NetCDF output files should have been produced:&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
Ai.nc    daily_output_native.nc  restart.nc  time_counter.nc&lt;br /&gt;
apbp.nc  daily_output.nc         start0.nc   dynamico.nc&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
where:&lt;br /&gt;
* ''Ai.nc'', ''apbp.nc'' and ''time_counter.nc'' are unimportant &lt;br /&gt;
* ''start0.nc'' is a file containing the initial conditions of the simulation&lt;br /&gt;
* ''restart.nc'' is the output file containing the final state of the simulation&lt;br /&gt;
* ''daily_output_native.nc'' is the output file containing a time series of a selection of variables on the native (icosahedral) grid&lt;br /&gt;
* ''daily_output.nc'' is the output file containing a time series of a selection of variables re-interpolated on a regular longitude-latitude grid&lt;br /&gt;
* ''dynamico.nc'' is the output file containing all your simulation data, that can be easily plotted with ferret (for example)&lt;br /&gt;
&lt;br /&gt;
=== Using the restart.nc file to continue your simulation ===&lt;br /&gt;
If you want to use '''restart.nc''' to avoid restarting the simulation from scratch, here is the procedure to follow:&lt;br /&gt;
&lt;br /&gt;
In the run.def file, for the first launch, set the variable &amp;quot;nqtot&amp;quot; to &amp;quot;1&amp;quot; instead of 0.&lt;br /&gt;
&lt;br /&gt;
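That is, run.def should contain this line (a sketch, assuming the variable is already present in the file):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
nqtot = 1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;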
Then, run the model as usual.&lt;br /&gt;
&lt;br /&gt;
Once the execution is complete, a '''restart.nc''' file will be created (in addition to all the other .nc files); rename it to &amp;quot;start.nc&amp;quot;:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
mv restart.nc start.nc&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Next, modify the run.def file a second time: change &amp;quot;etat0 = held_suarez&amp;quot; to &amp;quot;etat0 = start_file&amp;quot; and add an additional line, so that you end up with these lines:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
# etat0 = held_suarez (old line commented out)&lt;br /&gt;
etat0 = start_file&lt;br /&gt;
start_file_name = start&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note that even though your file is named &amp;quot;start.nc&amp;quot;, it is &amp;quot;start&amp;quot; that must be specified in run.def (the .nc extension is accounted for automatically).&lt;br /&gt;
&lt;br /&gt;
Then, run the model as usual with the same script.slurm and sbatch.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
sbatch script.slurm&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You should see, with the following command, that the iterations start where those of the previous simulation stopped:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want to do this again, remember to always use the newest '''restart.nc''' file as the starting point (renaming it to start.nc, and so on).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Mixed bag of comments about the run's setup and outputs ====&lt;br /&gt;
For those interested in more details about the key aspects and main parameters:&lt;br /&gt;
* The run control parameters are set in the ''run.def'' ASCII file which is read at run-time. This is where for instance one specifies the model resolution (parameters nbp: number of subdivisions of main triangles, and llm: number of vertical levels), time step (parameter day_step: number of steps per day) and length of the run (parameter ndays: number of days to run)&lt;br /&gt;
* The sub-splitting of the main rhombi (parameters nsplit_i and nsplit_j) controls the overall number of tiles (sub-domains). As a rule of thumb, when running in parallel (MPI) one wants to have as many sub-domains as available MPI processes. Since the icosahedron is made up of 10 rhombi, this implies that one should target using a total of 10*nsplit_i*nsplit_j processes.&lt;br /&gt;
* The outputs made by XIOS are controlled via the XML files, namely ''file_def_dynamico.xml''&lt;br /&gt;
* and so much more...&lt;br /&gt;
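&lt;br /&gt;
To make these concrete, here is a sketch of what the corresponding entries in ''run.def'' can look like (the values below are illustrative only, not recommendations; with nsplit_i = nsplit_j = 1 one gets 10 sub-domains, i.e. 10 MPI processes):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# resolution: nbp subdivisions of the main triangles, llm vertical levels&lt;br /&gt;
nbp = 40&lt;br /&gt;
llm = 19&lt;br /&gt;
# time step (steps per day) and run length (days)&lt;br /&gt;
day_step = 480&lt;br /&gt;
ndays = 10&lt;br /&gt;
# tiling: 10*nsplit_i*nsplit_j sub-domains in total&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;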
&lt;br /&gt;
[[Category:FAQ]]&lt;br /&gt;
[[Category:DYNAMICO]]&lt;br /&gt;
[[Category:Generic-DYNAMICO]]&lt;br /&gt;
[[Category:Mars-DYNAMICO]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=The_DYNAMICO_dynamical_core&amp;diff=2085</id>
		<title>The DYNAMICO dynamical core</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=The_DYNAMICO_dynamical_core&amp;diff=2085"/>
				<updated>2024-06-25T13:28:08Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Running a simple Held and Suarez test case */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What is DYNAMICO ? ==&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is a recent dynamical core for atmospheric general circulation models (GCM). It is based on an icosahedral hexagonal grid projected on the sphere, a hybrid pressure-based terrain-following vertical coordinate, second-order enstrophy-conserving finite-difference discretization and positive-definite advection.&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is coded in Fortran, is meant to be used in a massively parallel environment (using MPI and OpenMP), and has been coupled to a number of physics packages, such as the Earth LMDZ6 physics package (see https://lmdz-forge.lmd.jussieu.fr/mediawiki/LMDZPedia/index.php/Accueil ; search there for the keyword DYNAMICO to get some related documentation) but also the physics of the planetary Mars, Venus and Generic Planetary Climate Models (PCMs).&lt;br /&gt;
&lt;br /&gt;
The DYNAMICO source code is freely available and can be downloaded using git:&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
The DYNAMICO project page (be warned that the information there is somewhat outdated and relates to an earlier, now obsolete, svn version) can be found at http://forge.ipsl.jussieu.fr/dynamico &lt;br /&gt;
&lt;br /&gt;
== Installing and running DYNAMICO ==&lt;br /&gt;
Here we just describe how to compile and run DYNAMICO by itself, i.e. without being coupled to any physics.&lt;br /&gt;
This is essentially done as an exercise to check that it has been correctly installed, before moving on to the more complex (and complete!) case of compiling and running with a given physics package.&lt;br /&gt;
&lt;br /&gt;
=== Prerequisites ===&lt;br /&gt;
There are a couple of prerequisites to installing and using DYNAMICO:&lt;br /&gt;
# An MPI library must be available (i.e. installed and ready to use)&lt;br /&gt;
# BLAS and LAPACK libraries must be available.&lt;br /&gt;
# The XIOS library, compiled with that same MPI library, must also be available. Check out the [[The_XIOS_Library|XIOS library]] page for some information on installing it.&lt;br /&gt;
&lt;br /&gt;
=== Downloading and compiling DYNAMICO ===&lt;br /&gt;
Using git&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
will create a '''dynamico''' directory containing all the necessary source code.&lt;br /&gt;
Note that it is advised to place this directory alongside the '''XIOS''' library directory (some relative paths in the dynamico arch*.env files assume this is the case; if not, you will need to modify these files accordingly).&lt;br /&gt;
&lt;br /&gt;
In the '''dynamico''' directory is the ''make_icosa'' compilation script, which is based on FCM and thus requires that adequate architecture ASCII files be available. The '''arch''' subdirectory contains examples for a few machines. Assuming you want to compile using ''somemachine'' architecture files (i.e. files ''arch/arch-somemachine.fcm'' , ''arch/arch-somemachine.env'', ''arch/arch-somemachine.path'' are available and contain the adequate information) you would run:&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch somemachine -job 8&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
If compilation went well, you will find the executable '''icosa_gcm.exe''' in the '''bin''' subdirectory.&lt;br /&gt;
&lt;br /&gt;
==== For the experts: more about the arch files and their content ====&lt;br /&gt;
TO BE WRITTEN&lt;br /&gt;
&lt;br /&gt;
==== For the experts: more about the make_icosa script ====&lt;br /&gt;
To know more about possible options to the ''make_icosa'' script:&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
./make_icosa -h&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Running a simple Held and Suarez test case ===&lt;br /&gt;
DYNAMICO comes with a simple test case corresponding to a Held and Suarez simulation. In a nutshell, Held and Suarez (1994, Bull. Amer. Meteor. Soc.) proposed a standard benchmark for dry dynamical cores in which the physics is replaced by a simple Newtonian relaxation of temperature towards a prescribed, zonally symmetric equilibrium profile, plus a Rayleigh friction damping the low-level winds.&lt;br /&gt;
&lt;br /&gt;
We will run a test case “without the physics” to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_HELD_SUAREZ'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, we need to copy the .def files specific to this test case (they are under ICOSAGCM):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files in the repository test_HELD_SUAREZ&lt;br /&gt;
cp *def ../../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should increase the stack size to avoid segmentation faults at run time. Open your ~/.bashrc with an editor (gedit, or nano if gedit doesn’t work), add the “ulimit” line below, then source the file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc   # or: nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
# Then reload your ~/.bashrc&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_HELD_SUAREZ:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If, when running the model, you want a NetCDF file (.nc) with all the data, you should edit the .xml file “'''file_def_dynamico.xml'''” (line 70) and change “false” to “true” for the “enabled” attribute. This enables creation of the “dynamico.nc” file, whose fields are already re-interpolated from the DYNAMICO grid onto a longitude-latitude grid, making it directly usable with Ferret/Panoply.&lt;br /&gt;
&lt;br /&gt;
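For illustration, the line to edit looks something like the following (a sketch only; the exact &amp;quot;id&amp;quot;, &amp;quot;name&amp;quot; and other attributes in your version of '''file_def_dynamico.xml''' may differ):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;xml&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;!-- before: enabled=&amp;quot;false&amp;quot; ; after: --&amp;gt;&lt;br /&gt;
&amp;lt;file id=&amp;quot;dynamico&amp;quot; name=&amp;quot;dynamico&amp;quot; enabled=&amp;quot;true&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;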
Everything is ready to run the model. Go to '''test_HELD_SUAREZ''', then use the slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_HELD_SUAREZ&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, you should replace the path and “YOUR_ARCH” with your own architecture (for the source command). Note that the test case is designed assuming you will be using 10 MPI processes. Note also that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
And here is another one that should work on Irene-Rome (assuming you are in project &amp;quot;gen10391&amp;quot;):&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
# Partition to run on:&lt;br /&gt;
#MSUB -q rome&lt;br /&gt;
# project to run on &lt;br /&gt;
#MSUB -A  gen10391&lt;br /&gt;
# disks to use&lt;br /&gt;
#MSUB -m  scratch,work,store&lt;br /&gt;
# Job name&lt;br /&gt;
#MSUB -r job_mpi&lt;br /&gt;
# Job standard output:&lt;br /&gt;
#MSUB -o job_mpi.%I&lt;br /&gt;
# Job standard error:&lt;br /&gt;
#MSUB -e job_mpi.%I&lt;br /&gt;
# number of OpenMP threads c&lt;br /&gt;
#MSUB -c 1&lt;br /&gt;
# number of MPI tasks n&lt;br /&gt;
#MSUB -n 40&lt;br /&gt;
# number of nodes to use N&lt;br /&gt;
#MSUB -N 1&lt;br /&gt;
# max job run time T (in seconds)&lt;br /&gt;
#MSUB -T 7200&lt;br /&gt;
&lt;br /&gt;
source ../dynamico/arch.env&lt;br /&gt;
ccc_mprun -l icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can monitor the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Moreover, the following NetCDF output files should have been produced:&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
Ai.nc    daily_output_native.nc  restart.nc  time_counter.nc&lt;br /&gt;
apbp.nc  daily_output.nc         start0.nc   dynamico.nc&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
where:&lt;br /&gt;
* ''Ai.nc'', ''apbp.nc'' and ''time_counter.nc'' are unimportant &lt;br /&gt;
* ''start0.nc'' is a file containing the initial conditions of the simulation&lt;br /&gt;
* ''restart.nc'' is the output file containing the final state of the simulation&lt;br /&gt;
* ''daily_output_native.nc'' is the output file containing a time series of a selection of variables on the native (icosahedral) grid&lt;br /&gt;
* ''daily_output.nc'' is the output file containing a time series of a selection of variables re-interpolated on a regular longitude-latitude grid&lt;br /&gt;
* ''dynamico.nc'' is the output file containing all your simulation data, that can be easily plotted with ferret (for example)&lt;br /&gt;
&lt;br /&gt;
==== Mixed bag of comments about the run's setup and outputs ====&lt;br /&gt;
For those interested in more details about the key aspects and main parameters:&lt;br /&gt;
* The run control parameters are set in the ''run.def'' ASCII file which is read at run-time. This is where for instance one specifies the model resolution (parameters nbp: number of subdivisions of main triangles, and llm: number of vertical levels), time step (parameter day_step: number of steps per day) and length of the run (parameter ndays: number of days to run)&lt;br /&gt;
* The sub-splitting of the main rhombi (parameters nsplit_i and nsplit_j) controls the overall number of tiles (sub-domains). As a rule of thumb, when running in parallel (MPI) one wants to have as many sub-domains as available MPI processes. Since the icosahedron is made up of 10 rhombi, this implies that one should target using a total of 10*nsplit_i*nsplit_j processes.&lt;br /&gt;
* The outputs made by XIOS are controlled via the XML files, namely ''file_def_dynamico.xml''&lt;br /&gt;
* and so much more...&lt;br /&gt;
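&lt;br /&gt;
To make these concrete, here is a sketch of what the corresponding entries in ''run.def'' can look like (the values below are illustrative only, not recommendations; with nsplit_i = nsplit_j = 1 one gets 10 sub-domains, i.e. 10 MPI processes):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# resolution: nbp subdivisions of the main triangles, llm vertical levels&lt;br /&gt;
nbp = 40&lt;br /&gt;
llm = 19&lt;br /&gt;
# time step (steps per day) and run length (days)&lt;br /&gt;
day_step = 480&lt;br /&gt;
ndays = 10&lt;br /&gt;
# tiling: 10*nsplit_i*nsplit_j sub-domains in total&lt;br /&gt;
nsplit_i = 1&lt;br /&gt;
nsplit_j = 1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;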
&lt;br /&gt;
[[Category:FAQ]]&lt;br /&gt;
[[Category:DYNAMICO]]&lt;br /&gt;
[[Category:Generic-DYNAMICO]]&lt;br /&gt;
[[Category:Mars-DYNAMICO]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2084</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2084"/>
				<updated>2024-06-25T13:10:03Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Execution - Test_Case Held&amp;amp;Suarez - DYNAMICO */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core, enabling better performance and solving some issues of the LMDZ model. To know more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
Then, you also need '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
It is more practical, for what comes next, to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside the others.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture; the available architectures are listed in XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to specify everything on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This will be the same thing every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The “-debug” option (included above) compiles in debug mode; the model will run slower, so remove it for production runs.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Held&amp;amp;Suarez - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now, we will run a test case “without the physics” to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_HELD_SUAREZ'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, we need to copy the .def files specific to this test case (they are under ICOSAGCM):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files in the repository test_HELD_SUAREZ&lt;br /&gt;
cp *def ../../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should increase the stack size to avoid segmentation faults at run time. Open your ~/.bashrc with an editor (gedit, or nano if gedit doesn’t work), add the “ulimit” line below, then source the file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc   # or: nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# Add this line to unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
# Then reload your ~/.bashrc&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_HELD_SUAREZ:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If, when running the model, you want a NetCDF file (.nc) with all the data, you should edit the .xml file “'''file_def_dynamico.xml'''” (line 70) and change “false” to “true” for the “enabled” attribute. This enables creation of the “dynamico.nc” file, whose fields are already re-interpolated from the DYNAMICO grid onto a longitude-latitude grid, making it directly usable with Ferret/Panoply.&lt;br /&gt;
&lt;br /&gt;
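For illustration, the line to edit looks something like the following (a sketch only; the exact &amp;quot;id&amp;quot;, &amp;quot;name&amp;quot; and other attributes in your version of '''file_def_dynamico.xml''' may differ):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;xml&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;!-- before: enabled=&amp;quot;false&amp;quot; ; after: --&amp;gt;&lt;br /&gt;
&amp;lt;file id=&amp;quot;dynamico&amp;quot; name=&amp;quot;dynamico&amp;quot; enabled=&amp;quot;true&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;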
Everything is ready to run the model. Go to '''test_HELD_SUAREZ''', then use the slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_HELD_SUAREZ&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example for spirit1):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, you should replace the path and “YOUR_ARCH” with your own architecture (for the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can monitor the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_and_SUAREZ''' test case works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you already need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update -r 2655 -q ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can also make a directory named '''ARCH''' and put your '''arch-YOUR_ARCH.env''' and '''arch-YOUR_ARCH.path''' files in it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir ARCH&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/arch&lt;br /&gt;
cp arch-YOUR_ARCH.env /your/path/trunk/ARCH&lt;br /&gt;
cp arch-YOUR_ARCH.path /your/path/trunk/ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it before.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” sets the model grid dimensions (longitude x latitude x vertical levels); it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the README.md in /ICOSA_LMDZ to know which files to take and from where:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2083</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2083"/>
				<updated>2024-06-25T13:09:04Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Execution - Test_Case Held&amp;amp;Suarez - DYNAMICO */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core; it offers better performance and solves some issues of the LMDZ dynamical core. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ''' (and everything related to it); see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
You also need '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For what comes next, it is more practical to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside the others.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture. The available architectures are listed in /XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to specify the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The same convention applies every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will now appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the /ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The “-debug” option compiles in debug mode; drop it if you want a faster executable.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Held&amp;amp;Suarez - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now, we will run a test case “without the physics” to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_HELD_SUAREZ'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, we need to copy the .def files specific to this test case (they are in /ICOSAGCM):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files into the test_HELD_SUAREZ directory&lt;br /&gt;
cp *def ../../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should raise the stack size limit to avoid segmentation faults when running. Open your ~/.bashrc with your editor of choice (gedit, nano, etc.) and add the following line:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# This line removes the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then reload your ~/.bashrc:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” into the test directory test_HELD_SUAREZ (it is located in ICOSAGCM/bin):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the model run to produce a NetCDF file (.nc) with all the data, edit the .xml file “'''file_def_dynamico.xml'''” at line 70 and change “enabled” from “false” to “true”. This enables the creation of the “dynamico.nc” file, which is already a re-interpolation of the DYNAMICO grid onto a longitude-latitude grid, making it usable directly with Ferret/Panoply.&lt;br /&gt;
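&lt;br /&gt;
As a rough sketch (the attribute names and values here are illustrative; the exact layout may differ between versions), the edited line in “'''file_def_dynamico.xml'''” looks something like:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;xml&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;!-- only the &amp;quot;enabled&amp;quot; attribute needs to change, from &amp;quot;false&amp;quot; to &amp;quot;true&amp;quot; --&amp;gt;&lt;br /&gt;
&amp;lt;file id=&amp;quot;dynamico&amp;quot; name=&amp;quot;dynamico&amp;quot; enabled=&amp;quot;true&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;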
&lt;br /&gt;
Everything is ready to run the model. Go to '''test_HELD_SUAREZ''', then use the Slurm command “'''sbatch'''” to submit a job to the cluster:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_HELD_SUAREZ&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, you should adapt the path and replace “YOUR_ARCH” with your architecture (for the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can follow the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_and_SUAREZ''' test case works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you need '''LMDZ''' already installed, alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update -r 2655 -q ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can also make a folder named '''ARCH''' and put your '''arch-YOUR_ARCH.env''' and '''arch-YOUR_ARCH.path''' files in it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir ARCH&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/arch&lt;br /&gt;
cp arch-YOUR_ARCH.env /your/path/trunk/ARCH&lt;br /&gt;
cp arch-YOUR_ARCH.path /your/path/trunk/ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders: for '''ICOSAGCM''', repeat the previous sections.&lt;br /&gt;
&lt;br /&gt;
For '''LMDZ.COMMON''', '''LMDZ.VENUS''', and '''IOIPSL''', follow the documentation for installing the Venus GCM: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it above.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” sets the model grid dimensions (longitude x latitude x vertical levels); it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the README.md in /ICOSA_LMDZ to know which files to take and from where:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2082</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2082"/>
				<updated>2024-06-25T13:06:45Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: /* Execution - Test_Case - DYNAMICO */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core; it offers better performance and solves some issues of the LMDZ dynamical core. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ''' (and everything related to it); see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
You also need '''XIOS'''; to get it, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For what comes next, it is more practical to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside the others.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture. The available architectures are listed in /XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
There is no need to specify the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The same convention applies every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will now appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the /ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The “-debug” option compiles in debug mode; drop it if you want a faster executable.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case Held&amp;amp;Suarez - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now, we will run a test case “without the physics” to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_HELD_SUAREZ'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, we need to copy the .def files specific to this test case (they are in /ICOSAGCM):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files into the test_HELD_SUAREZ directory&lt;br /&gt;
cp *def ../../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, you should raise the stack size limit to avoid segmentation faults when running. Open your ~/.bashrc with your editor of choice (gedit, nano, etc.) and add the following line:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# This line removes the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then reload your ~/.bashrc:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” into the test directory test_HELD_SUAREZ (it is located in ICOSAGCM/bin):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the model run to produce a NetCDF file (.nc) with all the data, edit the .xml file “'''file_def_dynamico.xml'''” at line 70 and change “enabled” from “false” to “true”. This enables the creation of the “dynamico.nc” file, which is already a re-interpolation of the DYNAMICO grid onto a longitude-latitude grid, making it usable directly with Ferret/Panoply.&lt;br /&gt;
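&lt;br /&gt;
As a rough sketch (the attribute names and values here are illustrative; the exact layout may differ between versions), the edited line in “'''file_def_dynamico.xml'''” looks something like:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;xml&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;!-- only the &amp;quot;enabled&amp;quot; attribute needs to change, from &amp;quot;false&amp;quot; to &amp;quot;true&amp;quot; --&amp;gt;&lt;br /&gt;
&amp;lt;file id=&amp;quot;dynamico&amp;quot; name=&amp;quot;dynamico&amp;quot; enabled=&amp;quot;true&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;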
&lt;br /&gt;
Everything is ready to run the model. Go to '''test_HELD_SUAREZ''', then use the Slurm command “'''sbatch'''” to submit a job to the cluster:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_HELD_SUAREZ&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, you should adapt the path and replace “YOUR_ARCH” with your architecture (for the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can follow the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_and_SUAREZ''' test case works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you need '''LMDZ''' already installed, alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update -r 2655 -q ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can also make a folder named '''ARCH''' and put your '''arch-YOUR_ARCH.env''' and '''arch-YOUR_ARCH.path''' files in it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir ARCH&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/arch&lt;br /&gt;
cp arch-YOUR_ARCH.env /your/path/trunk/ARCH&lt;br /&gt;
cp arch-YOUR_ARCH.path /your/path/trunk/ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders: for '''ICOSAGCM''', repeat the previous sections.&lt;br /&gt;
&lt;br /&gt;
For '''LMDZ.COMMON''', '''LMDZ.VENUS''', and '''IOIPSL''', follow the documentation for installing the Venus GCM: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it above.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” sets the model grid dimensions (longitude x latitude x vertical levels); it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the README.md in /ICOSA_LMDZ to know which files to take and from where:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2080</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2080"/>
				<updated>2024-06-21T13:53:28Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is a recently developed dynamical core that offers better performance and addresses some limitations of the LMDZ dynamical core. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should already have installed '''LMDZ''' (and everything it depends on); see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
You also need '''XIOS'''; to fetch it:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
It is more practical for what comes next, to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside each other.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture. The available architectures are listed in XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
You only need to give the architecture name on the command line, not the file names. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The same applies every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The “-debug” option compiles in debug mode; the resulting executable runs more slowly, so drop this option for production runs.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now we will run a test case without the physics, to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_HELD_SUAREZ'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then we need to copy the .def files specific to this test case (they are in ICOSAGCM):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files into the test_HELD_SUAREZ directory&lt;br /&gt;
cp *def ../../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, raise the stack size limit to avoid segmentation faults at runtime. Open your ~/.bashrc with an editor (gedit, or nano if gedit is not available) and add the following line:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Remove the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then reload the configuration:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
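After reloading, you can check that the limit was actually lifted; a minimal sanity check (“unlimited” is what bash prints when no limit is set):&lt;br /&gt;

```shell
# Raise the soft stack limit, then display the current value
ulimit -s unlimited
ulimit -s
```

If the second command prints “unlimited”, the setting took effect; otherwise your system enforces a finite hard limit.&lt;br /&gt;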
&lt;br /&gt;
Then copy the executable “icosa_gcm.exe” into the test directory (it is located in ICOSAGCM/bin):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the run to produce a NetCDF file (.nc) with all the data, edit “'''file_def_dynamico.xml'''” and, at line 70, change the value of the “enabled” attribute from “false” to “true”. This makes the model write the “dynamico.nc” file, which is already re-interpolated from the DYNAMICO grid onto a longitude-latitude grid, so it can be opened directly with Ferret/Panoply.&lt;br /&gt;
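The edit can also be scripted. Here is a minimal, self-contained sketch on a stand-in file; the exact attribute spelling enabled="false" is an assumption to check against your actual file_def_dynamico.xml:&lt;br /&gt;

```shell
# Stand-in for the relevant line of file_def_dynamico.xml (line 70 in the real file)
cat > file_def_demo.xml <<'EOF'
<file id="dynamico" name="dynamico" enabled="false"/>
EOF

# Flip the 'enabled' attribute so the dynamico.nc output is written
sed -i 's/enabled="false"/enabled="true"/' file_def_demo.xml
grep 'enabled' file_def_demo.xml
```

On the real file, replace file_def_demo.xml with the path to your file_def_dynamico.xml.&lt;br /&gt;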
&lt;br /&gt;
Everything is ready to run the model. Go to '''test_HELD_SUAREZ''', then use the Slurm command “'''sbatch'''” to submit a job to the cluster:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_HELD_SUAREZ&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, replace the path and “YOUR_ARCH” with your own settings (for the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can watch the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
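In a batch workflow you can test for that final line instead of reading the log by eye. A minimal sketch on a simulated log (in practice you would grep the icosa_gcm.out produced above):&lt;br /&gt;

```shell
# Simulate the tail of icosa_gcm.out shown above
printf 'GLOB  -0.999E-15 0.000E+00\nTime elapsed :    601.628763000000\n' > icosa_gcm_demo.out

# A completed run ends with a "Time elapsed" line; count its occurrences
grep -c 'Time elapsed' icosa_gcm_demo.out
# prints: 1
```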
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_SUAREZ''' test case works, we can “plug” the '''DYNAMICO''' dynamical core into the real physics. For this you need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update -r 2655 -q ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can also make a directory named '''ARCH''' and put your '''arch-YOUR_ARCH.env''' and '''arch-YOUR_ARCH.path''' files in it:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir ARCH&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/arch&lt;br /&gt;
cp arch-YOUR_ARCH.env /your/path/trunk/ARCH&lt;br /&gt;
cp arch-YOUR_ARCH.path /your/path/trunk/ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If '''ICOSAGCM''' or '''test_HELD_SUAREZ''' is missing, repeat the previous sections.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it above.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the physics package):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” is the model resolution; it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSA_LMDZ/&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the run directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the README.md in ICOSA_LMDZ to see which files to take from where:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
TO CONTINUE&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2078</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2078"/>
				<updated>2024-06-20T09:44:10Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is a recently developed dynamical core that offers better performance and addresses some limitations of the LMDZ dynamical core. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
You also need '''XIOS'''; to fetch it:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co -r 2319 -q http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
It is more practical for what comes next, to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside each other.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(Replace “YOUR_ARCH” with your architecture; the available architectures are listed in XIOS/arch.)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
You do not need to give the full file names on the command line, just the architecture name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The same applies every time you have to specify your architecture.&lt;br /&gt;
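As a convenience, the available architecture names can be derived from the arch files themselves. This is just a sketch, assuming XIOS is checked out under the current directory as on this page:

```shell
# List the architecture names usable with --arch, derived from the
# arch-NAME.fcm files shipped in XIOS/arch
ls XIOS/arch/arch-*.fcm | sed -e 's|.*/arch-||' -e 's|\.fcm$||'
```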
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The “-debug” option compiles in debug mode; the resulting executable runs slower, so remove this option for production runs.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now we will run a test case without the physics, to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_HELD_SUAREZ'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then we need to copy the .def files specific to this test case (they are in ICOSAGCM/TEST_CASE/HELD_SUAREZ):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files to the test_HELD_SUAREZ directory&lt;br /&gt;
cp *def ../../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then you should raise the stack size limit to avoid segmentation faults at run time. Open your ~/.bashrc, add the “ulimit” line below, then source the file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This option will unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use nano:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This option will unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” into the test directory (it is located in ICOSAGCM/bin):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the run to produce a NetCDF file (.nc) with all the data, edit the .xml file “'''file_def_dynamico.xml'''” at line 70 and change “enabled” from “false” to “true”. This makes the run create the “dynamico.nc” file, in which the DYNAMICO grid is already re-interpolated onto a longitude-latitude grid, so it can be used directly with Ferret/Panoply.&lt;br /&gt;
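For a scripted alternative to editing the file by hand, a sed one-liner can flip the flag. The exact attribute layout in your file_def_dynamico.xml may differ, so treat this as a sketch and check the file first:

```shell
# Switch the dynamico.nc output on by turning enabled="false" into enabled="true"
# (assumes the attribute is written exactly like this in file_def_dynamico.xml)
sed -i 's/enabled="false"/enabled="true"/' file_def_dynamico.xml
```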
&lt;br /&gt;
Everything is ready to run the model. Go to '''test_HELD_SUAREZ''', then use the Slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_HELD_SUAREZ&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240 GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496 GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, adjust the path and replace “YOUR_ARCH” with your architecture in the source command. Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To check that the code is running properly, you can follow the “icosa_gcm.out” file directly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
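A quick way to confirm that the run reached its normal end is to look for that final line. A small check, assuming the output file name used above:

```shell
# Report success if the final "Time elapsed" line is present in the log
grep -q "Time elapsed" icosa_gcm.out && echo "Run completed" || echo "Run did not finish"
```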
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_SUAREZ''' test case works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you already need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update -r 2655 -q ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can also make a directory named '''ARCH''' and put your '''arch-YOUR_ARCH.env''' and '''arch-YOUR_ARCH.path''' files in it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir ARCH&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/arch&lt;br /&gt;
cp arch-YOUR_ARCH.env /your/path/trunk/ARCH&lt;br /&gt;
cp arch-YOUR_ARCH.path /your/path/trunk/ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If one of these folders is missing: for '''ICOSAGCM''', repeat the previous sections.&lt;br /&gt;
&lt;br /&gt;
For '''LMDZ.COMMON''', '''LMDZ.VENUS''' and '''IOIPSL''', follow the documentation for installing the Venus GCM: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': it was installed just above.&lt;br /&gt;
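The layout above can be verified with a short loop. This is only a sketch, assuming the trunk layout listed on this page:

```shell
# Check that every package expected alongside the others in /your/path/trunk/ is present
for d in ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ; do
  [ -d "$d" ] || echo "Missing: $d"
done
```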
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it.&lt;br /&gt;
&lt;br /&gt;
(Some of this compilation has already been done, but this recaps everything.)&lt;br /&gt;
&lt;br /&gt;
'''Everything has to be compiled in the right order!'''&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the physics package):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” sets the model grid dimensions; it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSA_LMDZ/&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare a run directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the README.md in ICOSA_LMDZ to know which files to take from where:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus-Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2077</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2077"/>
				<updated>2024-06-20T09:43:40Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core; it offers better performance and solves some issues of the LMDZ model. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
Then, you should have '''XIOS''' too, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co -r 2319 -q http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
It is more practical for what comes next, to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside each other.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(You have to replace “YOUR_ARCH” by your architecture. Every architecture is listed in /XIOS/arch)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
No need to specify everything in the command line, just the name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This will be the same thing every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can use the “-debug” option to compile in debug mode, but it will be slower to run.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now we will run a test case without the physics, to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_HELD_SUAREZ'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then we need to copy the .def files specific to this test case (they are in ICOSAGCM/TEST_CASE/HELD_SUAREZ):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files in the repository test_HELD_SUAREZ&lt;br /&gt;
cp *def ../../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then you should raise the stack size limit to avoid segmentation faults at run time. Open your ~/.bashrc, add the “ulimit” line below, then source the file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
gedit ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This option will unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If gedit doesn’t work, use nano:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
nano ~/.bashrc&lt;br /&gt;
&lt;br /&gt;
# This option will unlimit the stack size&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” into the test directory (it is located in ICOSAGCM/bin):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the run to produce a NetCDF file (.nc) with all the data, edit the .xml file “'''file_def_dynamico.xml'''” at line 70 and change “enabled” from “false” to “true”. This makes the run create the “dynamico.nc” file, in which the DYNAMICO grid is already re-interpolated onto a longitude-latitude grid, so it can be used directly with Ferret/Panoply.&lt;br /&gt;
&lt;br /&gt;
Everything is ready to run the model. Go to '''test_HELD_SUAREZ''', then use the slurm command “'''sbatch'''” to submit a job to the cluster.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_HELD_SUAREZ&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node core and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, you should modify the path and “YOUR_ARCH” with your architecture (for the source command). Note that we are not using OpenMP here, it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is properly running, you can show directly the “icosa_gcm.out” file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the code is finished running, something like this should appear (at the end of the icosa_gcm.out):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_SUAREZ''' test case works, we can “plug” the '''DYNAMICO''' dynamical core into some real physics. For this, you already need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you should have '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update -r 2655 -q ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can also make a directory named '''ARCH''' and put your '''arch-YOUR_ARCH.env''' and '''arch-YOUR_ARCH.path''' files in it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir ARCH&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/arch&lt;br /&gt;
cp arch-YOUR_ARCH.env /your/path/trunk/ARCH&lt;br /&gt;
cp arch-YOUR_ARCH.path /your/path/trunk/ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': we just installed it before.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” sets the model grid dimensions; it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSA_LMDZ/&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the README.md in ICOSA_LMDZ to know which files to take from where:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO/ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Venus Model]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	<entry>
		<id>http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2076</id>
		<title>Venus - DYNAMICO</title>
		<link rel="alternate" type="text/html" href="http://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php?title=Venus_-_DYNAMICO&amp;diff=2076"/>
				<updated>2024-06-20T09:42:46Z</updated>
		
		<summary type="html">&lt;p&gt;Rcapron: Created page with &amp;quot;= Venus - DYNAMICO =  Dynamico is the recently developed Dynamical core, enabling better performance and solving some issues of the LMDZ model. To know more about it, see this...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Venus - DYNAMICO =&lt;br /&gt;
&lt;br /&gt;
DYNAMICO is the recently developed dynamical core; it offers better performance and solves some issues of the LMDZ model. To learn more about it, see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_DYNAMICO_dynamical_core The DYNAMICO dynamical core]&lt;br /&gt;
&lt;br /&gt;
== Installation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Before installing '''DYNAMICO''', you should have previously installed '''LMDZ''' (and everything that refers to it), see this page:&lt;br /&gt;
[https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
Then, you should have '''XIOS''' too, do this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn co -r 2319 -q http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Or see this page: [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
It is more practical for what comes next, to have each package ('''LMDZ.COMMON''', '''LMDZ.VENUS''', '''XIOS''', etc.) installed alongside each other.&lt;br /&gt;
&lt;br /&gt;
Then you should compile '''XIOS''' (only once):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/XIOS&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(You have to replace “YOUR_ARCH” by your architecture. Every architecture is listed in /XIOS/arch)&lt;br /&gt;
&lt;br /&gt;
For each architecture, there will be 3 files: '''arch-YOUR_ARCH.env''', '''arch-YOUR_ARCH.path''', and '''arch-YOUR_ARCH.fcm'''.&lt;br /&gt;
&lt;br /&gt;
No need to specify everything in the command line, just the name. For example, if my architecture is “'''ifort_MESOIPSL'''”, there will be the 3 files '''arch-ifort_MESOIPSL.env''', '''arch-ifort_MESOIPSL.path''', and '''arch-ifort_MESOIPSL.fcm''', but my command line will be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./make_xios --prod --arch ifort_MESOIPSL --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This will be the same thing every time you have to specify your architecture.&lt;br /&gt;
&lt;br /&gt;
To install '''DYNAMICO''', you should clone the '''GitLab''' repository (once more, alongside '''XIOS''', etc.):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A new folder named “'''ICOSAGCM'''” will appear; it contains the model.&lt;br /&gt;
&lt;br /&gt;
== Compilation - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Go to the ICOSAGCM directory, then compile the model:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can use the “-debug” option to compile in debug mode, but it will be slower to run.&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_gcm.exe'''” will be in ICOSAGCM/bin.&lt;br /&gt;
&lt;br /&gt;
== Execution - Test_Case - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now we will run a test case without the physics, to verify that the dynamical core works on its own.&lt;br /&gt;
&lt;br /&gt;
To do this, make a new folder “'''test_HELD_SUAREZ'''”, alongside '''ICOSAGCM''' and '''XIOS'''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then we need to copy the .def files specific to this test case (they are in ICOSAGCM/TEST_CASE/HELD_SUAREZ):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Go where the .def files are&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
# Copy the .def files in the repository test_HELD_SUAREZ&lt;br /&gt;
cp *def ../../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Do the same for the .xml files:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML&lt;br /&gt;
cp *xml ../../../test_HELD_SUAREZ&lt;br /&gt;
&lt;br /&gt;
cd ..&lt;br /&gt;
cp iodef.xml ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, raise the stack size limit to avoid segmentation faults at run time. Open your ~/.bashrc with gedit, nano, or any other editor, add the following line, then reload the file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# This line removes the stack size limit&lt;br /&gt;
ulimit -s unlimited&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
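You can confirm that the limit took effect by printing the current soft stack limit; after the change above it should read “unlimited” in a fresh shell:

```shell
# Print the current soft stack limit (a number in KB, or "unlimited").
ulimit -s
```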
&lt;br /&gt;
Then, copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/bin&lt;br /&gt;
cp icosa_gcm.exe ../../test_HELD_SUAREZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you want the run to produce a NetCDF file (.nc) with all the data, edit line 70 of “'''file_def_dynamico.xml'''” and change “enabled” from “false” to “true”. The run will then create a “dynamico.nc” file in which the DYNAMICO grid is already re-interpolated onto a longitude-latitude grid, so it can be opened directly with Ferret or Panoply.&lt;br /&gt;
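If you prefer not to edit the file by hand, the flag can be flipped with sed. This is only a sketch: the one-line XML created below is a stand-in, while the real file_def_dynamico.xml carries this attribute around line 70.

```shell
# Stand-in for the real file_def_dynamico.xml (attribute layout assumed).
cat > file_def_dynamico.xml <<'EOF'
<file id="output_dynamico" name="dynamico" enabled="false">
EOF

# Flip the "enabled" attribute from false to true in place (GNU sed).
sed -i 's/enabled="false"/enabled="true"/' file_def_dynamico.xml
grep 'enabled=' file_def_dynamico.xml
```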
&lt;br /&gt;
Everything is ready to run the model. Go to '''test_HELD_SUAREZ''', then use the Slurm command “'''sbatch'''” to submit a job to the cluster:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/test_HELD_SUAREZ&lt;br /&gt;
sbatch script_d_execution.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Slurm script (example):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --nodes=1&lt;br /&gt;
#SBATCH --ntasks-per-node=8&lt;br /&gt;
#SBATCH --cpus-per-task=1&lt;br /&gt;
#SBATCH --partition=zen4 # zen4: 64 cores/node and 240GB of memory&lt;br /&gt;
##SBATCH --partition=zen16 # zen16: 32 cores/node and 496GB of memory&lt;br /&gt;
#SBATCH -J job_mpi_omp&lt;br /&gt;
#SBATCH --time=0:20:00&lt;br /&gt;
#SBATCH --output %x.%j.out&lt;br /&gt;
&lt;br /&gt;
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env&lt;br /&gt;
&lt;br /&gt;
export OMP_NUM_THREADS=1&lt;br /&gt;
export OMP_STACKSIZE=400M&lt;br /&gt;
&lt;br /&gt;
mpirun icosa_gcm.exe &amp;gt; icosa_gcm.out 2&amp;gt;&amp;amp;1&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this script, adapt the path and replace “YOUR_ARCH” with your architecture (in the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).&lt;br /&gt;
&lt;br /&gt;
To verify that the code is running properly, you can follow the “icosa_gcm.out” file as it is written:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tail -f icosa_gcm.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once the run has finished, something like this should appear at the end of icosa_gcm.out:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GETIN restart_file_name = restart&lt;br /&gt;
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang&lt;br /&gt;
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01&lt;br /&gt;
&lt;br /&gt;
Time elapsed :    601.628763000000    &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
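Checking for the final timing line is an easy way to script this verification. The sketch below fabricates a short icosa_gcm.out for illustration; on the cluster you would run only the grep against the real log.

```shell
# Stand-in log, mimicking the end of a successful run.
cat > icosa_gcm.out <<'EOF'
GETIN restart_file_name = restart
Time elapsed :    601.628763000000
EOF

# The "Time elapsed" line is only printed when the run reaches its end.
if grep -q "Time elapsed" icosa_gcm.out; then
    echo "run completed"
else
    echo "run did not finish; inspect the log" >&2
fi
```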
&lt;br /&gt;
== Connection Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Now that we have verified that the '''HELD_SUAREZ''' test case works, we can plug the '''DYNAMICO''' dynamical core into some real physics. For this you already need '''LMDZ''', alongside '''XIOS''' and '''DYNAMICO'''.&lt;br /&gt;
&lt;br /&gt;
In addition, you need '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
svn update -r 2655 -q ICOSA_LMDZ&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can also make a directory named '''ARCH''' and put your '''arch-YOUR_ARCH.env''' and '''arch-YOUR_ARCH.path''' files in it:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir ARCH&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/arch&lt;br /&gt;
cp arch-YOUR_ARCH.env /your/path/trunk/ARCH&lt;br /&gt;
cp arch-YOUR_ARCH.path /your/path/trunk/ARCH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
ls&lt;br /&gt;
ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
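Before compiling, it can save time to verify that all expected siblings are present. The helper below is a sketch (the function name `check_trunk` is an illustration); the folder list comes from the listing above.

```shell
# Hypothetical helper: report any folder missing from the trunk directory.
check_trunk() {
    trunk="$1"; missing=0
    for d in ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ; do
        if [ ! -d "$trunk/$d" ]; then
            echo "missing: $d"
            missing=1
        fi
    done
    if [ "$missing" -eq 0 ]; then
        echo "all folders present"
    fi
}

# example:
# check_trunk /your/path/trunk
```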
&lt;br /&gt;
If you are missing one of these folders, repeat the previous sections for '''ICOSAGCM'''.&lt;br /&gt;
&lt;br /&gt;
Or follow the documentation for installing the Venus GCM (which will give you '''LMDZ.COMMON''', '''LMDZ.VENUS''', '''IOIPSL'''): [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/Quick_Install_and_Run_Venus_PCM Quick Install and Run Venus PCM]&lt;br /&gt;
&lt;br /&gt;
For '''XIOS''': [https://lmdz-forge.lmd.jussieu.fr/mediawiki/Planets/index.php/The_XIOS_Library The XIOS Library]&lt;br /&gt;
&lt;br /&gt;
For '''ICOSA_LMDZ''': it was installed just above.&lt;br /&gt;
&lt;br /&gt;
== Compilation Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
Everything needs to be properly compiled before you can run the model. Here’s how to do it:&lt;br /&gt;
&lt;br /&gt;
(some compilation has already been done, but this recaps everything)&lt;br /&gt;
&lt;br /&gt;
!! Everything has to be compiled in the right order !!&lt;br /&gt;
&lt;br /&gt;
Compile '''IOIPSL''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.COMMON/ioipsl&lt;br /&gt;
./install_ioipsl_YOUR_ARCH.bash&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''XIOS''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../../XIOS/&lt;br /&gt;
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Compile '''LMDZ.COMMON''' (the Physics packages):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../LMDZ.COMMON/&lt;br /&gt;
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
“-d 48x32x50” sets the grid dimensions of the model; it can be changed.&lt;br /&gt;
&lt;br /&gt;
Compile '''ICOSAGCM''' (the Dynamical Core):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSAGCM/&lt;br /&gt;
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, compile '''ICOSA_LMDZ''':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd ../ICOSA_LMDZ/&lt;br /&gt;
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The executable “'''icosa_lmdz.exe'''” will be in /ICOSA_LMDZ/bin/.&lt;br /&gt;
&lt;br /&gt;
== Running Venus - DYNAMICO ==&lt;br /&gt;
&lt;br /&gt;
After compiling everything in the right order, we need to prepare the run directory. Make a new one alongside the others:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/&lt;br /&gt;
mkdir Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the README.md in ICOSA_LMDZ to know which files to take from where:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;text&amp;quot;&amp;gt;&lt;br /&gt;
organization of XML files and synchronization with code&lt;br /&gt;
-------------------------------------------------------&lt;br /&gt;
&lt;br /&gt;
from ICOSAGCM/xml [DYNAMICO dynamical core]&lt;br /&gt;
- context_input_dynamico.xml&lt;br /&gt;
- field_def_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
from ICOSA_LMDZ/xml [INTERFACE]&lt;br /&gt;
- iodef.xml&lt;br /&gt;
&lt;br /&gt;
from LMDZ.VENUS/deftank [LMDZ physics]&lt;br /&gt;
- field_def_physics.xml&lt;br /&gt;
- context_lmdz_physics.xml&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM/xml&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version &lt;br /&gt;
- context_dynamico.xml&lt;br /&gt;
&lt;br /&gt;
to be created and adapted from ICOSAGCM test cases&lt;br /&gt;
&amp;gt;&amp;gt; check compatibility when changing ICOSAGCM version&lt;br /&gt;
- file_def_dynamico.xml&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSAGCM/xml/DYNAMICO:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO&lt;br /&gt;
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From ICOSA_LMDZ:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/ICOSA_LMDZ/xml&lt;br /&gt;
cp iodef.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From LMDZ.VENUS:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd /your/path/trunk/LMDZ.VENUS/deftank&lt;br /&gt;
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
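The three copy steps above can also be scripted so that a missing file stops the preparation immediately instead of being silently skipped. This is a sketch; the helper name `gather` and the commented example paths are illustrations to adapt to your tree.

```shell
# Hypothetical helper: copy each listed file into the run directory,
# failing loudly on the first one that does not exist.
gather() {
    dest="$1"; shift
    for f in "$@"; do
        if [ ! -f "$f" ]; then
            echo "not found: $f" >&2
            return 1
        fi
        cp "$f" "$dest"
    done
}

# example (run from the trunk directory, paths as used above):
# gather Venus_DYNAMICO \
#     LMDZ.VENUS/deftank/field_def_physics.xml \
#     LMDZ.VENUS/deftank/context_lmdz_physics.xml
```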
&lt;br /&gt;
[[Category:Venus Models]]&lt;/div&gt;</summary>
		<author><name>Rcapron</name></author>	</entry>

	</feed>