Venus - DYNAMICO

Dynamico is the recently developed Dynamical core, enabling better performance and solving some issues of the LMDZ model. To know more about it, see this page: The DYNAMICO dynamical core.

Installation - DYNAMICO

Before installing DYNAMICO, you should have previously installed LMDZ Venus (and everything related to it); see this page: Quick Install and Run Venus PCM

You should also read the PCM directory layout page to understand how everything is organized and installed: PCM directory layout.

You should have XIOS (please see this page for its installation: The XIOS Library).

Finally, you also need DYNAMICO installed and compiled.
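For reference, here is a minimal sketch of how XIOS and DYNAMICO were retrieved and compiled in an earlier revision of this page (the pages linked above remain authoritative; the svn revision number and “YOUR_ARCH” are examples to adapt, the available architectures being listed in XIOS/arch and ICOSAGCM/arch):

# Get and compile XIOS alongside the other packages (only once)
cd /your/path/trunk/
svn co -r 2319 -q http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS
cd XIOS
./make_xios --prod --arch YOUR_ARCH --job 8

# Get and compile DYNAMICO; the clone creates a folder named ICOSAGCM containing the model
cd /your/path/trunk/
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM
cd ICOSAGCM
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8

The executable “icosa_gcm.exe” then appears in ICOSAGCM/bin.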

Execution - Test_Case Venus (type Held&Suarez) - DYNAMICO

Now, we will run a testCase “without the physics”, to verify that the Dynamical Core works alone.

To do this, make a new folder “test_VENUS”, alongside ICOSAGCM and XIOS.

cd /your/path/trunk/
mkdir test_VENUS

Then, we need to copy the specific .def files for this testCase (they are in /ICOSAGCM); for the most part, we will use the same ones as in the Held&Suarez basic testCase.

# Go where the .def files are
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ

# Copy the .def files in the repository test_VENUS
cp *def ../../../test_VENUS

Do the same for the .xml files:

cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML
cp *xml ../../../test_VENUS

cd ..
cp iodef.xml ../../test_VENUS

Then, copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_VENUS:

cd /your/path/trunk/ICOSAGCM/bin
cp icosa_gcm.exe ../../test_VENUS

If, when running the model, you want a NetCDF file (.nc) with all the data, you should modify the .xml file “file_def_dynamico.xml”: at line 70, change “false” to “true” for the “enabled” attribute. This enables the creation of the “dynamico.nc” file, whose output is already reinterpolated from the DYNAMICO grid onto a regular longitude-latitude grid, which makes it directly usable with Ferret/Panoply.
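For instance, assuming the “enabled” attribute is indeed on line 70 of your copy of the file (check first, the line number may differ between versions), this can be done by hand or with a sed one-liner:

cd /your/path/trunk/test_VENUS
# Switch "false" to "true" on line 70 of file_def_dynamico.xml (adapt the line number if needed)
sed -i '70s/false/true/' file_def_dynamico.xml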

Then, there are some changes to be made to the run.def and earth_const.def files. First, rename the earth_const.def file to venus_const.def:

mv earth_const.def venus_const.def

Next, you should change the venus_const.def completely, to match the Venus atmosphere.
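Purely as an illustration (the parameter names below are assumptions based on a typical constants file; the dedicated venus_const.def page is the authoritative reference), the idea is to replace the Earth values with Venus ones, e.g. a planetary radius of about 6051.8 km and a surface gravity of about 8.87 m/s2:

# Hypothetical excerpt of venus_const.def; take the real names and values from the dedicated page
radius = 6.0518e6   # Venus radius (m)
g = 8.87            # Venus surface gravity (m/s2)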


After this, it is time to change the run.def file. In short, you should change the “etat0” parameter to “venus”, the “physics” parameter to “Lebonnois2012”, the “day_step” (because of the long Venusian day), etc. Rather than explaining all the different parameters that change, see the dedicated example of a complete file (which should work from scratch):

(one can compare it with the run.def for the Held&Suarez test case to see the differences)
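For orientation only (the dedicated example linked above is authoritative, and the day_step value is deliberately not given here), the kind of change involved looks like this:

# Illustrative excerpt of run.def for Venus-DYNAMICO
etat0 = venus
physics = Lebonnois2012
# day_step must also be adapted to the long Venusian day (see the dedicated example for the value)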


Everything is now ready to run the model. Go to test_VENUS, then use the slurm command “sbatch” to submit a job to the cluster.

cd /your/path/trunk/test_VENUS
sbatch script_d_execution.slurm

Slurm script (example for spirit1):

#!/bin/bash
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=8
#SBATCH --cpus-per-task=1
#SBATCH --partition=zen4 # zen4: 64cores/node and 240GB of memory
##SBATCH --partition=zen16 # zen16: 32 cores/node core and 496GB of memory
#SBATCH -J job_mpi_omp
#SBATCH --time=0:20:00
#SBATCH --output %x.%j.out

source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env

export OMP_NUM_THREADS=1
export OMP_STACKSIZE=400M

mpirun icosa_gcm.exe > icosa_gcm.out 2>&1

In this script, you should replace the path and “YOUR_ARCH” with your own architecture (for the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).

To verify that the code is running properly, you can watch the “icosa_gcm.out” file directly:

tail -f icosa_gcm.out

Once the code has finished running, something like this should appear at the end of icosa_gcm.out:

GETIN restart_file_name = restart
      masse     advec mass     rmsdpdt      energie   enstrophie     entropie     rmsv     mt.ang
GLOB  -0.999E-15 0.000E+00  0.79047E+01    0.110E-02    0.261E+00    0.155E-02    0.743E+01    0.206E-01

Time elapsed :    601.628763000000    

Connection Venus - DYNAMICO

Now that we have verified that the Held&Suarez-type testCase is working, we can “plug” the DYNAMICO dynamical core into some real physics. For this, you already need LMDZ, alongside XIOS and DYNAMICO.

In addition, you should have ICOSA_LMDZ/. Your trunk folder should look like this:

ARCH ICOSAGCM ICOSA_LMDZ LMDZ.COMMON LMDZ.VENUS IOIPSL XIOS test_VENUS

Please now follow how to compile DYNAMICO with a physics package.
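For reference, in an earlier revision of this page the final compilation step looked like the sketch below, presumably run from the ICOSA_LMDZ directory (the linked page remains authoritative; “YOUR_ARCH” and the ARCH path are to be adapted):

cd /your/path/trunk/ICOSA_LMDZ
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps

The resulting executable “icosa_lmdz.exe” will be in ICOSA_LMDZ/bin.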

Running Venus - DYNAMICO

After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:

cd /your/path/trunk/
mkdir Venus_DYNAMICO

Where to find .xml files

See the README.md in ICOSA_LMDZ/ to know which xml files to take from where (note that the example given right below corresponds to the current way of choosing xml files at the time of writing; it is therefore strongly advised to open the README, which should always be up to date):

organization of XML files and synchronization with code
-------------------------------------------------------

from ICOSAGCM/xml [DYNAMICO dynamical core]
- context_input_dynamico.xml
- field_def_dynamico.xml
- nudging_dynamico.xml
- sponge_dynamico.xml

from ICOSA_LMDZ/xml [INTERFACE]
- iodef.xml

from LMDZ.VENUS/deftank [LMDZ physics]
- field_def_physics.xml
- context_lmdz_physics.xml

-----

to be created and adapted from ICOSAGCM/xml
>> check compatibility when changing ICOSAGCM version 
- context_dynamico.xml

to be created and adapted from ICOSAGCM test cases
>> check compatibility when changing ICOSAGCM version
- file_def_dynamico.xml

From ICOSAGCM/xml/DYNAMICO/:

cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../../Venus_DYNAMICO

From ICOSA_LMDZ:

cd /your/path/trunk/ICOSA_LMDZ/xml
cp iodef.xml ../../Venus_DYNAMICO

From LMDZ.VENUS:

cd /your/path/trunk/LMDZ.VENUS/deftank
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO

At the time of writing, the context_lmdz_physics.xml contained in LMDZ.VENUS/deftank is probably missing some lines, and you may get a “dom_glo” error when running DYNAMICO. Here are the lines to change/add:

Replace line 7 entirely with the following (i.e. replace everything between <domain_definition> and </domain_definition>):

<domain_group id="dom_glo" data_dim="1" >
  <domain id="dom_glo" />
</domain_group>

<domain id="dom_regular" ni_glo="96" nj_glo="97" type="rectilinear"  >
      <generate_rectilinear_domain lat_start="-90" lat_end="90" lon_start="180" lon_end="-176.25" />
      <interpolate_domain order="1"/>
</domain>

<domain id="dom_out" domain_ref="dom_regular"/>

Then, at lines 44 to 46 (line 34 of the original file), add:

<grid id="grid_2D">
    <domain domain_ref="dom_glo" />
</grid>

Then, at lines 50 to 57 (line 37 of the original file), add:

</grid>
<!-- output grids -->
<grid id="grid_3D_out">
    <domain domain_ref="dom_out" />
    <axis axis_ref="altitude" />
</grid>
<grid id="grid_2D_out">
    <domain domain_ref="dom_out" />

Where to find .def files

More .def files are needed in order to run the complete Venus-DYNAMICO with LMDZ physics (compared to the Held&Suarez testCase):

- run_icosa.def: everything related to the DYNAMICO dynamical core is controlled in this file; see this page: The run_icosa.def Input File

- physics.def: everything related to the LMDZ physics is controlled in this file.

- run.def: just a “bridge” that brings together run_icosa.def and physics.def (see the sketch after this list).

- z2sig.def: defines the vertical discretization levels; it can be found in LMDZ.VENUS. (There are several vertical discretizations, e.g. 50 or 78 levels; 50 levels is quicker to run and is therefore the best way to test that everything works.) See this page for more information: The z2sig.def Input File
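As a sketch of the “bridge” role mentioned above (assuming the usual .def INCLUDEDEF mechanism; check the run.def actually provided with the model), run.def essentially just pulls in the two other files:

# Hypothetical minimal run.def acting as a bridge
INCLUDEDEF=run_icosa.def
INCLUDEDEF=physics.def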

Where to find other needed files

TO CONTINUE

Using the restart.nc file to continue your simulation

If you want to continue your simulation using the “end data” of a previous one, everything is explained on the DYNAMICO page: using restart.nc