Revision as of 14:49, 18 October 2024

Venus - DYNAMICO

DYNAMICO is the recently developed dynamical core; it enables better performance and solves some issues of the LMDZ model. To learn more about it, see this page: The DYNAMICO dynamical core.

Installation - DYNAMICO

Before installing DYNAMICO, you should already have installed LMDZ Venus (and everything related to it); see this page: Quick Install and Run Venus PCM

One should also read the PCM directory layout page to understand and install everything: PCM directory layout.

You should have XIOS (please see this page for its installation: The XIOS Library).

Finally, you also need DYNAMICO installed and compiled.

Execution - Test_Case Venus (type Held&Suarez) - DYNAMICO

Now, we will run a test case "without the physics" to verify that the dynamical core works on its own.

To do this, make a new folder “test_VENUS”, alongside ICOSAGCM and XIOS.

cd /your/path/trunk/
mkdir test_VENUS

Then, we need to copy the specific .def files for this test case (they are in /ICOSAGCM); for the most part, we will use the same ones as in the basic Held&Suarez test case.

# Go where the .def files are
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ

# Copy the .def files in the repository test_VENUS
cp *def ../../../test_VENUS

Do the same for the .xml files:

cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML
cp *xml ../../../test_VENUS

cd ..
cp iodef.xml ../../test_VENUS

Then, you should modify the stack size to avoid segmentation faults when running. Edit your ~/.bashrc:

gedit ~/.bashrc

# This removes the limit on the stack size
ulimit -s unlimited

source ~/.bashrc

If gedit doesn’t work, use nano:

nano ~/.bashrc

# This removes the limit on the stack size
ulimit -s unlimited

source ~/.bashrc
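To confirm the change actually takes effect, you can check the limit from a fresh shell (a quick sanity check; whether "unlimited" is allowed depends on the hard limit set by your system, which is usually permissive on Linux clusters):

```shell
# Spawn a shell that raises the soft stack limit, then print it back;
# "unlimited" means the segfault workaround is active.
bash -c 'ulimit -s unlimited 2>/dev/null; ulimit -s'
```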

Then, copy the executable "icosa_gcm.exe" (located in ICOSAGCM/bin) into the test directory test_VENUS:

cd /your/path/trunk/ICOSAGCM/bin
cp icosa_gcm.exe ../../test_VENUS

If, when running the model, you want a NetCDF file (.nc) with all the data, edit the .xml file "file_def_dynamico.xml" at line 70, changing "false" to "true" for "enabled". This enables the creation of the "dynamico.nc" file, which is already a re-interpolation of the DYNAMICO grid onto a longitude-latitude grid, making it directly usable with Ferret/Panoply.
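If you prefer the command line to a text editor, a sed one-liner can flip the flag. This is only a sketch, demonstrated here on a scratch copy, because the exact attribute spelling on line 70 of your file_def_dynamico.xml may differ; check it first, then run the sed line on the real file:

```shell
# Sketch only: demonstrated on a scratch copy of the file. The <file ...> line
# below is a stand-in; inspect the real line 70 before running sed on it.
cd "$(mktemp -d)"
printf '<file id="dynamico" output_freq="1d" enabled="false">\n' > file_def_dynamico.xml
sed -i 's/enabled="false"/enabled="true"/' file_def_dynamico.xml
grep -c 'enabled="true"' file_def_dynamico.xml   # prints: 1
```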

Then, there are some changes to be made to the run.def and earth_const.def files. First, rename earth_const.def to venus_const.def:

mv earth_const.def venus_const.def

Next, you should change venus_const.def completely to match the Venus atmosphere. Here is an example:

# Planetary radius (m)
radius = 6.0518e6

# Length of a day (s)
daysec = 20995200

# Gravity: real (default = 8.87)
g = 8.87

# Planetary rotation rate
omega = 2.992e-7

# kappa = Rd/cpp
kappa = 0.2857143

# Heat capacity at constant pressure: real (default = 1004.70885)
cpp = 1004

# Reference pressure: real (default = 9200000)
preff = 9.2e6
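Since kappa is defined as Rd/cpp (see the comment above), it is worth checking that the two values you enter are mutually consistent. The example values imply Rd ≈ 287 J/kg/K, an Earth-like gas constant; treat this purely as a consistency check between kappa and cpp:

```shell
# Implied specific gas constant: Rd = kappa * cpp, with the example values above
awk 'BEGIN { printf "Rd = %.1f J/kg/K\n", 0.2857143 * 1004 }'   # prints: Rd = 286.9 J/kg/K
```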

After this, it is time to change the run.def file. In short, you should change the "etat0" parameter to "venus", the "physics" parameter to "Lebonnois2012", "day_step" (because of the long day on Venus), etc. Rather than explaining every parameter that changes, here is an example of a complete script (that should work from scratch):


#---------------- Mesh ----------------

# Number of subdivisions on a main triangle : integer (default=40)
nbp = 40

# Number of vertical layers : integer (default=19)
llm = 19

# Vertical grid : [std|ncar|ncarl30;strato;strato_custom;ncar;dcmip31;dcmip200;read_apbp;plugin] (default=std)
disvert = std

# Mesh optimisation : number of iterations : integer (default=0)
optim_it = 1000

# Sub splitting of main rhombus : integer (default=1)
nsplit_i = 1
nsplit_j = 1

#number of openmp task on vertical level
omp_level_size=1

#---------------- Numerics ----------------

# Advection called every itau_adv time steps : integer (default=2)
itau_adv = 1

# Time step in s : real (default=480)
# dt = 720
# Alternative to specifying "dt", specify number of steps per day : day_step
day_step = 240000

# Number of tracers : integer (default=1)
nqtot = 1


#---------------- Time and output ----------------

# Time style : [none|dcmip] (default=dcmip)
time_style = none

# Run length in s : real (default=??)
# run_length = 1036800
# Alternative to specifying "run_length", specify number of days to run : ndays
ndays=1

# Interval in s between two outputs : integer (default=??)
write_period = 314928


#---------------- Planet ----------------

INCLUDEDEF=venus_const.def

#---------------- Physical parameters ----------------

# Initial state : 
#   [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)
etat0 = venus

# To restart from the "end" of a previous simulation
#etat0=start_file

#start_file_name=start


# Physics package : [none|held_suarez|dcmip] (default=none)
physics = Lebonnois2012

# Dissipation time for grad(div) : real (default=5000)
tau_graddiv = 18000

# Exponent of grad(div) dissipation : integer (default=1)
nitergdiv = 2

# Dissipation time for curl(curl) : real (default=5000)
tau_gradrot = 18000

# Exponent of curl(curl) dissipation : integer (default=1)
nitergrot = 2

# Dissipation time for div(grad) : real (default=5000)
tau_divgrad = 18000

# Exponent of div(grad) dissipation : integer (default=1)
niterdivgrad = 2

(one can compare with the previous run.def to see the differences)
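Note that day_step sets the number of dynamical steps per (Venus) day, so the effective time step is daysec/day_step, with daysec taken from venus_const.def. A quick check with the example values above:

```shell
# Effective time step: dt = daysec / day_step
awk 'BEGIN { printf "dt = %.2f s\n", 20995200 / 240000 }'   # prints: dt = 87.48 s
```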


Everything is now ready to run the model. Go to test_VENUS, then use the slurm command “sbatch” to submit a job to the cluster.

cd /your/path/trunk/test_VENUS
sbatch script_d_execution.slurm

Slurm script (example for spirit1):

#!/bin/bash
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=8
#SBATCH --cpus-per-task=1
#SBATCH --partition=zen4 # zen4: 64cores/node and 240GB of memory
##SBATCH --partition=zen16 # zen16: 32 cores/node core and 496GB of memory
#SBATCH -J job_mpi_omp
#SBATCH --time=0:20:00
#SBATCH --output %x.%j.out

source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env

export OMP_NUM_THREADS=1
export OMP_STACKSIZE=400M

mpirun icosa_gcm.exe > icosa_gcm.out 2>&1

In this script, you should modify the path and replace "YOUR_ARCH" with your architecture (for the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).

To verify that the code is running properly, you can follow the "icosa_gcm.out" file directly:

tail -f icosa_gcm.out

Once the code is finished running, something like this should appear (at the end of the icosa_gcm.out):

GETIN restart_file_name = restart
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01

Time elapsed :    601.628763000000    


Connection Venus - DYNAMICO

Now that we have verified that the HELD_and_SUAREZ test case works, we can "plug" the DYNAMICO dynamical core into some real physics. For this, you already need LMDZ, alongside XIOS and DYNAMICO.

In addition, you should have ICOSA_LMDZ/. Your trunk folder should look like this:

ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_HELD_SUAREZ  

Please now follow how to compile DYNAMICO with a physics package.

Running Venus - DYNAMICO

After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:

cd /your/path/trunk/
mkdir Venus_DYNAMICO

Where to find .xml files

See the README.md in ICOSA_LMDZ/ to know which xml files to take from where (note that the example written below corresponds to the current way of choosing xml files; it is therefore strongly advised to open the README, which should always be up to date):

organization of XML files and synchronization with code
-------------------------------------------------------

from ICOSAGCM/xml [DYNAMICO dynamical core]
- context_input_dynamico.xml
- field_def_dynamico.xml
- nudging_dynamico.xml
- sponge_dynamico.xml

from ICOSA_LMDZ/xml [INTERFACE]
- iodef.xml

from LMDZ.VENUS/deftank [LMDZ physics]
- field_def_physics.xml
- context_lmdz_physics.xml

-----

to be created and adapted from ICOSAGCM/xml
>> check compatibility when changing ICOSAGCM version 
- context_dynamico.xml

to be created and adapted from ICOSAGCM test cases
>> check compatibility when changing ICOSAGCM version
- file_def_dynamico.xml

From ICOSAGCM/xml/DYNAMICO/:

cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../../Venus_DYNAMICO

From ICOSA_LMDZ:

cd /your/path/trunk/ICOSA_LMDZ/xml
cp iodef.xml ../../Venus_DYNAMICO

From LMDZ.VENUS:

cd /your/path/trunk/LMDZ.VENUS/deftank
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO

At the time of writing, the context_lmdz_physics.xml contained in LMDZ.VENUS/deftank is probably missing some lines, and you will likely hit a "dom_glo" issue when running DYNAMICO. Here are the lines to change/add:

Replace line 7 entirely with the following (i.e. replace everything between <domain_definition> and </domain_definition>):

<domain_group id="dom_glo" data_dim="1" >
  <domain id="dom_glo" />
</domain_group>

<domain id="dom_regular" ni_glo="96" nj_glo="97" type="rectilinear"  >
      <generate_rectilinear_domain lat_start="-90" lat_end="90" lon_start="180" lon_end="-176.25" />
      <interpolate_domain order="1"/>
</domain>

<domain id="dom_out" domain_ref="dom_regular"/>

Then at lines 44 to 46, add the following (line 34 in the original file):

<grid id="grid_2D">
    <domain domain_ref="dom_glo" />
</grid>

Then at lines 50 to 57, add the following (line 37 in the original file):

</grid>
<!-- output grids -->
<grid id="grid_3D_out">
    <domain domain_ref="dom_out" />
    <axis axis_ref="altitude" />
</grid>
<grid id="grid_2D_out">
    <domain domain_ref="dom_out" />

Where to find .def files

More .def files are needed to run the complete Venus-DYNAMICO with LMDZ physics (compared to the Held&Suarez test case):

- run_icosa.def : everything linked to the DYNAMICO dynamical core is driven by this file; see this page : The run_icosa.def Input File

- physics.def : everything linked to the LMDZ physics is driven by this file.

- run.def : just a "bridge" to run_icosa.def and physics.def.

- z2sig.def : defines the vertical discretization levels; found in LMDZ.VENUS (there are several vertical discretizations: 50, 78 levels, etc. 50 levels runs fastest and is therefore the best way to test that everything works). See this page for more information : The z2sig.def Input File
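Since run.def is described above as just a "bridge", it usually amounts to a pair of INCLUDEDEF lines (a minimal sketch assuming the file names listed above; see the linked pages for the authoritative contents):

```text
# run.def: bridge pulling in the dynamics and physics settings
INCLUDEDEF=run_icosa.def
INCLUDEDEF=physics.def
```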

Where to find other needed files

TO CONTINUE

Using the restart.nc file to continue your simulation

If you want to continue your simulation using the "end-data" of the previous one, everything is explained on the DYNAMICO page: using restart.nc
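As the commented-out lines in the Held&Suarez run.def earlier on this page suggest, restarting essentially means pointing etat0 at a start file (a sketch only; the linked restart.nc page is authoritative):

```text
# In the run configuration, replace etat0 = venus with:
etat0 = start_file
start_file_name = start
```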