Venus - DYNAMICO
DYNAMICO is the recently developed dynamical core, offering better performance and solving some issues of the LMDZ model. To learn more about it, see this page: The DYNAMICO dynamical core
Installation - DYNAMICO
Before installing DYNAMICO, you should already have installed LMDZ (and everything related to it); see this page: Quick Install and Run Venus PCM
Then, you need XIOS too:
cd /your/path/trunk/
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS
Or see this page: The XIOS Library
For what comes next, it is more practical to have each package (LMDZ.COMMON, LMDZ.VENUS, XIOS, etc.) installed alongside the others.
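For reference, after the installation steps in this guide, /your/path/trunk/ should contain (at least) these folders side by side:
ICOSAGCM  IOIPSL  LMDZ.COMMON  LMDZ.VENUS  XIOS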
Then you should compile XIOS (only once):
cd /your/path/trunk/XIOS
./make_xios --prod --arch YOUR_ARCH --job 8
(Replace “YOUR_ARCH” with your architecture. All available architectures are listed in /XIOS/arch.)
For each architecture, there are 3 files: arch-YOUR_ARCH.env, arch-YOUR_ARCH.path, and arch-YOUR_ARCH.fcm.
There is no need to spell out the full file names on the command line, just the architecture name. For example, if my architecture is “ifort_MESOIPSL”, the 3 files are arch-ifort_MESOIPSL.env, arch-ifort_MESOIPSL.path, and arch-ifort_MESOIPSL.fcm, but my command line is simply:
./make_xios --prod --arch ifort_MESOIPSL --job 8
The same applies every time you have to specify your architecture.
To install DYNAMICO, you should clone the GitLab repository (once more, alongside XIOS, etc.):
cd /your/path/trunk/
git clone https://gitlab.in2p3.fr/ipsl/projets/dynamico/dynamico.git ICOSAGCM
A new folder named “ICOSAGCM” will appear; it contains the model.
Compilation - DYNAMICO
Go to the /ICOSAGCM directory, then compile the model:
cd /your/path/trunk/ICOSAGCM/
./make_icosa -parallel mpi_omp -with_xios -arch YOUR_ARCH -job 8 -debug
The “-debug” option compiles the model in debug mode, which helps when testing but runs slower; remove it for production runs.
The executable “icosa_gcm.exe” will be in ICOSAGCM/bin.
Execution - Test_Case Venus (type Held&Suarez) - DYNAMICO
Now, we will run a test case “without the physics”, to verify that the dynamical core works on its own.
To do this, make a new folder “test_VENUS” alongside ICOSAGCM and XIOS:
cd /your/path/trunk/
mkdir test_VENUS
Then, we need to copy the specific .def files for this test case (they are in /ICOSAGCM); we will use, for the most part, the same ones as in the basic Held&Suarez test case.
# Go where the .def files are
cd /your/path/trunk/ICOSAGCM/TEST_CASE/HELD_SUAREZ
# Copy the .def files into the test_VENUS directory
cp *def ../../../test_VENUS
Do the same for the .xml files:
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO_XML
cp *xml ../../../test_VENUS
cd ..
cp iodef.xml ../../test_VENUS
Then, you should modify the stack size to avoid segmentation faults when running. Edit your ~/.bashrc (with gedit, or nano if gedit doesn’t work):
gedit ~/.bashrc
Add the following line, then reload the file:
# This option removes the stack size limit
ulimit -s unlimited
source ~/.bashrc
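To check that the new limit is active in your shell, print the current stack size; it should say “unlimited”:
ulimit -s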
Then, copy the executable “icosa_gcm.exe” (located in ICOSAGCM/bin) into the test directory test_VENUS:
cd /your/path/trunk/ICOSAGCM/bin
cp icosa_gcm.exe ../../test_VENUS
If you want the run to produce a NetCDF file (.nc) with all the data, you should modify the .xml file “file_def_dynamico.xml”: at line 70, change “false” to “true” for the “enabled” attribute. This enables creation of the “dynamico.nc” file, which is already a re-interpolation of the DYNAMICO grid onto a longitude-latitude grid, so it can be used directly with Ferret/Panoply.
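If you prefer doing this edit from the command line, a sed one-liner such as this should work, assuming the “enabled” flag really sits on line 70 of your copy of the file:
cd /your/path/trunk/test_VENUS
# Replace the first "false" on line 70 with "true"
sed -i '70s/false/true/' file_def_dynamico.xml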
Then, there are some changes to be made to the run.def and earth_const.def files. First, rename earth_const.def to venus_const.def:
mv earth_const.def venus_const.def
Next, you should change venus_const.def completely to match the Venus atmosphere. Here is an example:
# Planetary radius (m)
radius = 6.0518e6
# Length of a day (s): 243 Earth days, Venus's rotation period
daysec = 20995200
# Gravity : real (default = 8.87)
g = 8.87
# Planetary rotation rate (rad/s), i.e. 2*pi/daysec
omega = 2.992e-7
# kappa = Rd/cpp
kappa = 0.2857143
# Heat capacity at constant pressure : real (default = 1004.70885)
cpp = 1004
# Reference pressure (Pa) : real (default = 9200000)
preff = 9.2e6
After this, it is time to change the run.def file. In short, you should change the "etat0" parameter to "venus", the "physics" parameter to "Lebonnois2012", the "day_step" parameter (because of the long day on Venus), etc. Rather than explaining every parameter that changes, here is an example of a complete run.def (that should work from scratch):
#---------------- Mesh ----------------
# Number of subdivisions on a main triangle : integer (default=40)
nbp = 40
# Number of vertical layers : integer (default=19)
llm = 19
# Vertical grid : [std|ncar|ncarl30;strato;strato_custom;ncar;dcmip31;dcmip200;read_apbp;plugin] (default=std)
disvert = std
# Mesh optimisation : number of iterations : integer (default=0)
optim_it = 1000
# Sub splitting of main rhombus : integer (default=1)
nsplit_i = 1
nsplit_j = 1
# Number of OpenMP tasks on the vertical levels
omp_level_size = 1

#---------------- Numerics ----------------
# Advection called every itau_adv time steps : integer (default=2)
itau_adv = 1
# Time step in s : real (default=480)
# dt = 720
# Alternative to specifying "dt": specify the number of steps per day with "day_step"
# (with daysec = 20995200 s, day_step = 240000 gives dt = 87.48 s)
day_step = 240000
# Number of tracers : integer (default=1)
nqtot = 1

#---------------- Time and output ----------------
# Time style : [none|dcmip] (default=dcmip)
time_style = none
# Run length in s : real (default=??)
# run_length = 1036800
# Alternative to specifying "run_length": specify the number of days to run with "ndays"
ndays = 1
# Interval in s between two outputs : integer (default=??)
write_period = 314928

#---------------- Planet ----------------
INCLUDEDEF=venus_const.def

#---------------- Physical parameters ----------------
# Initial state :
# [jablonowsky06|academic|dcmip[1-4]|heldsz|dcmip2_schaer_noshear] (default=jablonowsky06)
etat0 = venus
# To restart from the "end" of a previous simulation:
#etat0 = start_file
#start_file_name = start
# Physics package : [none|held_suarez|dcmip] (default=none)
physics = Lebonnois2012
# Dissipation time for grad(div) : real (default=5000)
tau_graddiv = 18000
# Exponent of grad(div) dissipation : integer (default=1)
nitergdiv = 2
# Dissipation time for curl(curl) : real (default=5000)
tau_gradrot = 18000
# Exponent of curl(curl) dissipation : integer (default=1)
nitergrot = 2
# Dissipation time for div(grad) : real (default=5000)
tau_divgrad = 18000
# Exponent of div(grad) dissipation : integer (default=1)
niterdivgrad = 2
(one can compare with the previous run.def to see the differences)
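For instance, a quick way to see those differences, assuming the original Held&Suarez run.def is still in the test-case directory it was copied from:
cd /your/path/trunk/test_VENUS
diff ../ICOSAGCM/TEST_CASE/HELD_SUAREZ/run.def run.def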
Everything is now ready to run the model. Go to test_VENUS, then use the slurm command “sbatch” to submit a job to the cluster.
cd /your/path/trunk/test_VENUS
sbatch script_d_execution.slurm
Slurm script (example for spirit1):
#!/bin/bash
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=8
#SBATCH --cpus-per-task=1
#SBATCH --partition=zen4    # zen4: 64 cores/node and 240 GB of memory
##SBATCH --partition=zen16  # zen16: 32 cores/node and 496 GB of memory
#SBATCH -J job_mpi_omp
#SBATCH --time=0:20:00
#SBATCH --output %x.%j.out
source /your/path/trunk/ICOSAGCM/arch/arch-YOUR_ARCH.env
export OMP_NUM_THREADS=1
export OMP_STACKSIZE=400M
mpirun icosa_gcm.exe > icosa_gcm.out 2>&1
In this script, you should modify the path and replace “YOUR_ARCH” with your architecture (in the source command). Note that we are not using OpenMP here; it is not functional for now (TO UPDATE).
To verify that the code is running properly, you can watch the “icosa_gcm.out” file directly:
tail -f icosa_gcm.out
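You can also check that your job is still queued or running with Slurm:
squeue -u $USER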
Once the code is finished running, something like this should appear (at the end of the icosa_gcm.out):
GETIN restart_file_name = restart
        masse     advec mass   rmsdpdt      energie    enstrophie  entropie   rmsv       mt.ang
GLOB  -0.999E-15  0.000E+00    0.79047E+01  0.110E-02  0.261E+00   0.155E-02  0.743E+01  0.206E-01
Time elapsed :    601.628763000000
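If you enabled the “dynamico.nc” output earlier (in file_def_dynamico.xml), you can quickly inspect its header with the standard NetCDF tools, assuming they are available on your cluster:
ncdump -h dynamico.nc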
Connection Venus - DYNAMICO
Now that we have verified that the Held&Suarez-type test case works, we can “plug” the DYNAMICO dynamical core into some real physics. For this, you already need LMDZ, alongside XIOS and DYNAMICO.
In addition, you should have ICOSA_LMDZ:
cd /your/path/trunk/
svn update -r 2655 -q ICOSA_LMDZ
You can also make a directory named ARCH, and put your arch-YOUR_ARCH.env and arch-YOUR_ARCH.path files in it:
cd /your/path/trunk/
mkdir ARCH
cd /your/path/trunk/ICOSAGCM/arch
cp arch-YOUR_ARCH.env /your/path/trunk/ARCH
cp arch-YOUR_ARCH.path /your/path/trunk/ARCH
Once more, it is more practical to install every new package alongside the others. Here’s what you should have after all the previous steps:
cd /your/path/trunk/
ls
ARCH  ICOSAGCM  ICOSA_LMDZ  LMDZ.COMMON  LMDZ.VENUS  IOIPSL  XIOS  test_VENUS
If you are missing one of these folders:
For ICOSAGCM, repeat the previous sections.
For LMDZ.COMMON, LMDZ.VENUS and IOIPSL, follow the documentation for installing the Venus GCM: Quick Install and Run Venus PCM
For XIOS, see: The XIOS Library
For ICOSA_LMDZ: we just installed it above.
Compilation Venus - DYNAMICO
Everything needs to be properly compiled before you can run the model. Here’s how to do it:
(Some of the compilation has already been done in the previous sections, but this recaps everything.)
!! Everything has to be compiled in the right order !!
Compile IOIPSL:
cd /your/path/trunk/LMDZ.COMMON/ioipsl
./install_ioipsl_YOUR_ARCH.bash
Compile XIOS:
cd ../../XIOS/
./make_xios --prod --arch YOUR_ARCH --arch_path ../ARCH --job 8
Compile LMDZ.COMMON (the physics package):
cd ../LMDZ.COMMON/
./makelmdz_fcm -arch YOUR_ARCH -io xios -p venus -d 48x32x50 -j 8 gcm -parallel mpi
“-d 48x32x50” sets the model dimensions (48 points in longitude, 32 in latitude, 50 vertical levels); it can be changed.
Compile ICOSAGCM (the dynamical core):
cd ../ICOSAGCM/
./make_icosa -parallel mpi -with_xios -arch YOUR_ARCH -job 8
Finally, compile ICOSA_LMDZ:
cd ../ICOSA_LMDZ/
./make_icosa_lmdz -p venus -parallel mpi -arch YOUR_ARCH -arch_path ../ARCH -job 8 -nodeps
The executable “icosa_lmdz.exe” will be in /ICOSA_LMDZ/bin/.
Running Venus - DYNAMICO
After compiling everything in the right order, we need to prepare the directory. Make a new one alongside the others:
cd /your/path/trunk/
mkdir Venus_DYNAMICO
See the README.md in /ICOSA_LMDZ to know which files to take from where:
organization of XML files and synchronization with code
-------------------------------------------------------
from ICOSAGCM/xml [DYNAMICO dynamical core]
- context_input_dynamico.xml
- field_def_dynamico.xml
from ICOSA_LMDZ/xml [INTERFACE]
- iodef.xml
from LMDZ.VENUS/deftank [LMDZ physics]
- field_def_physics.xml
- context_lmdz_physics.xml
-----
to be created and adapted from ICOSAGCM/xml
>> check compatibility when changing ICOSAGCM version
- context_dynamico.xml
to be created and adapted from ICOSAGCM test cases
>> check compatibility when changing ICOSAGCM version
- file_def_dynamico.xml
From ICOSAGCM/xml/DYNAMICO/:
cd /your/path/trunk/ICOSAGCM/xml/DYNAMICO
cp context_input_dynamico.xml field_def_dynamico.xml dynamico.xml nudging_dynamico.xml sponge_dynamico.xml ../../../Venus_DYNAMICO
From ICOSA_LMDZ:
cd /your/path/trunk/ICOSA_LMDZ/xml
cp iodef.xml ../../Venus_DYNAMICO
From LMDZ.VENUS:
cd /your/path/trunk/LMDZ.VENUS/deftank
cp field_def_physics.xml context_lmdz_physics.xml ../../Venus_DYNAMICO
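At this point, the Venus_DYNAMICO directory should contain all the files copied above (the context_dynamico.xml and file_def_dynamico.xml files mentioned in the README still have to be created and adapted):
cd /your/path/trunk/Venus_DYNAMICO
ls
context_input_dynamico.xml  context_lmdz_physics.xml  dynamico.xml  field_def_dynamico.xml  field_def_physics.xml  iodef.xml  nudging_dynamico.xml  sponge_dynamico.xml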
TO CONTINUE