[develop] Update ufs-weather-model hash and UPP hash and use upp-addon-env spack-stack environment (ufs-community#1136)

* Update ufs-weather-model hash to 38a29a6 (September 19)
* Update UPP hash to 81b38a8 (August 13)
* All Tier-1 modulefiles/build_*.lua files have been updated to use the upp-addon-env spack-stack environment
* srw_common.lua was updated to use g2/3.5.1 and g2tmpl/1.13.0 (these are required for UPP)
* .cicd/Jenkinsfile was updated to replace cheyenne entries with derecho.
* The doc/tables/Tests.csv table had nco-mode WE2E tests removed
* The doc/UsersGuide/CustomizingTheWorkflow/ConfigWorkflow.rst documentation was updated to match the updated ush/config_defaults.yaml file.
* The .github/CODEOWNERS file was updated to add Bruce Kropp to the list of reviewers
* The exregional_plot_allvars.py and exregional_plot_allvars_diff.py scripts were updated to address changes made to the postxconfig-NT-fv3lam.txt file.
* Updated ush/config_defaults.yaml to revise the PE_MEMBER01 calculation and the OMP_NUM_THREADS_RUN_FCST documentation so that the run_fcst task runs properly on Tier-1 platforms now that threading is enabled.
* The ush/machine/*.yaml files were updated so that the run_fcst task runs properly on Tier-1 platforms now that threading is enabled.
* There are not enough resources on Jet to run the high-resolution WE2E tests (136 (ReqNodeNotAvail)). Commented out these tests in the comprehensive.jet test suite and removed one test from the coverage.jet test suite.
* The ufs-case-studies WE2E tests are currently failing on Derecho: even though the required files are named correctly and are available, the get_extrn_ics/lbcs tasks report that the files are not present. Commented out these tests in comprehensive.derecho and removed them from coverage.derecho. Issue ufs-community#1144 (ufs-case-studies WE2E tests fail on Derecho in get_extrn_ics/lbcs) was opened to track this problem.
MichaelLueken authored Nov 1, 2024
1 parent 563dd40 commit 87b26cc
Showing 41 changed files with 191 additions and 82 deletions.
4 changes: 2 additions & 2 deletions .cicd/Jenkinsfile
@@ -12,9 +12,9 @@ pipeline {
parameters {
// Allow job runner to filter based on platform
// Use the line below to enable all PW clusters
// choice(name: 'SRW_PLATFORM_FILTER', choices: ['all', 'cheyenne', 'gaea', 'hera', 'jet', 'orion', 'hercules', 'pclusternoaav2use1', 'azclusternoaav2eus1', 'gclusternoaav2usc1'], description: 'Specify the platform(s) to use')
// choice(name: 'SRW_PLATFORM_FILTER', choices: ['all', 'derecho', 'gaea', 'hera', 'jet', 'orion', 'hercules', 'pclusternoaav2use1', 'azclusternoaav2eus1', 'gclusternoaav2usc1'], description: 'Specify the platform(s) to use')
// Use the line below to enable the PW AWS cluster
// choice(name: 'SRW_PLATFORM_FILTER', choices: ['all', 'cheyenne', 'gaea', 'hera', 'jet', 'orion', 'hercules', 'pclusternoaav2use1'], description: 'Specify the platform(s) to use')
// choice(name: 'SRW_PLATFORM_FILTER', choices: ['all', 'derecho', 'gaea', 'hera', 'jet', 'orion', 'hercules', 'pclusternoaav2use1'], description: 'Specify the platform(s) to use')
choice(name: 'SRW_PLATFORM_FILTER', choices: ['all', 'derecho', 'gaea', 'hera', 'jet', 'orion', 'hercules'], description: 'Specify the platform(s) to use')
// Allow job runner to filter based on compiler
choice(name: 'SRW_COMPILER_FILTER', choices: ['all', 'gnu', 'intel'], description: 'Specify the compiler(s) to use to build')
2 changes: 1 addition & 1 deletion .github/CODEOWNERS
@@ -3,7 +3,7 @@

# These owners will be the default owners for everything in the repo.
#* @defunkt
* @mkavulich @gsketefian @JeffBeck-NOAA @RatkoVasic-NOAA @BenjaminBlake-NOAA @ywangwof @chan-hoo @panll @christinaholtNOAA @christopherwharrop-noaa @danielabdi-noaa @mark-a-potts @jkbk2004 @willmayfield @dmwright526 @gspetro-NOAA @natalie-perlin @EdwardSnyder-NOAA @MichaelLueken @rickgrubin-noaa
* @mkavulich @gsketefian @JeffBeck-NOAA @RatkoVasic-NOAA @BenjaminBlake-NOAA @ywangwof @chan-hoo @panll @christinaholtNOAA @christopherwharrop-noaa @danielabdi-noaa @mark-a-potts @jkbk2004 @willmayfield @dmwright526 @gspetro-NOAA @natalie-perlin @EdwardSnyder-NOAA @MichaelLueken @rickgrubin-noaa @BruceKropp-Raytheon

# Order is important. The last matching pattern has the most precedence.
# So if a pull request only touches javascript files, only these owners
4 changes: 2 additions & 2 deletions Externals.cfg
@@ -12,7 +12,7 @@ protocol = git
repo_url = https://github.com/ufs-community/ufs-weather-model
# Specify either a branch name or a hash but not both.
#branch = develop
hash = a1143cc
hash = 38a29a6
local_path = sorc/ufs-weather-model
required = True

@@ -21,7 +21,7 @@ protocol = git
repo_url = https://github.com/NOAA-EMC/UPP
# Specify either a branch name or a hash but not both.
#branch = develop
hash = be0410e
hash = 81b38a8
local_path = sorc/UPP
required = True

6 changes: 3 additions & 3 deletions doc/UsersGuide/CustomizingTheWorkflow/ConfigWorkflow.rst
@@ -1097,7 +1097,7 @@ For each workflow task, certain parameter values must be passed to the job sched
For more information, see the `Intel Development Reference Guide <https://www.intel.com/content/www/us/en/docs/cpp-compiler/developer-guide-reference/2021-10/thread-affinity-interface.html>`__.

``OMP_NUM_THREADS_RUN_FCST``: (Default: 2)
The number of OpenMP threads to use for parallel regions. Corresponds to the ``atmos_nthreads`` value in ``model_configure``.
The number of OpenMP threads to use for parallel regions. Corresponds to the ``ATM_omp_num_threads`` value in ``ufs.configure``.

``OMP_STACKSIZE_RUN_FCST``: (Default: "1024m")
Controls the size of the stack for threads created by the OpenMP implementation.
@@ -1163,12 +1163,12 @@ Write-Component (Quilting) Parameters
``PRINT_ESMF``: (Default: false)
Flag that determines whether to output extra (debugging) information from :term:`ESMF` routines. Note that the write component uses ESMF library routines to interpolate from the native forecast model grid to the user-specified output grid (which is defined in the model configuration file ``model_configure`` in the forecast run directory). Valid values: ``True`` | ``False``

``PE_MEMBER01``: (Default: ``'{{ LAYOUT_Y * LAYOUT_X + WRTCMP_write_groups * WRTCMP_write_tasks_per_group if QUILTING else LAYOUT_Y * LAYOUT_X}}'``)
``PE_MEMBER01``: (Default: ``'{{ OMP_NUM_THREADS_RUN_FCST * (LAYOUT_Y * LAYOUT_X + WRTCMP_write_groups * WRTCMP_write_tasks_per_group) if QUILTING else OMP_NUM_THREADS_RUN_FCST * (LAYOUT_Y * LAYOUT_X)}}'``)
The number of MPI processes required by the forecast. When QUILTING is true, it is calculated as:

.. math::
LAYOUT\_X * LAYOUT\_Y + WRTCMP\_write\_groups * WRTCMP\_write\_tasks\_per\_group
OMP\_NUM\_THREADS\_RUN\_FCST * (LAYOUT\_X * LAYOUT\_Y + WRTCMP\_write\_groups * WRTCMP\_write\_tasks\_per\_group)
``WRTCMP_write_groups``: (Default: "")
The number of write groups (i.e., groups of :term:`MPI` tasks) to use in the write component. Each write group will write to one set of output files (a ``dynf${fhr}.nc`` and a ``phyf${fhr}.nc`` file, where ``${fhr}`` is the forecast hour). Each write group contains ``WRTCMP_write_tasks_per_group`` tasks. Usually, one write group is sufficient. This may need to be increased if the forecast is proceeding so quickly that a single write group cannot complete writing to its set of files before there is a need/request to start writing the next set of files at the next output time.
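To make the updated default concrete, here is a minimal Python sketch of the PE_MEMBER01 calculation documented above; the layout, write-group, and thread values are illustrative placeholders, not defaults from config_defaults.yaml.

# Minimal sketch of the updated PE_MEMBER01 default shown above.
def pe_member01(layout_x, layout_y, write_groups, write_tasks_per_group,
                omp_num_threads_run_fcst, quilting=True):
    """Mirror the Jinja expression for PE_MEMBER01."""
    compute_pes = layout_x * layout_y
    if quilting:
        compute_pes += write_groups * write_tasks_per_group
    # The change in this commit scales the PE count by the OpenMP thread count.
    return omp_num_threads_run_fcst * compute_pes

# Illustrative values: 5x2 layout, one write group of 2 tasks, 2 threads -> 2 * (10 + 2) = 24
print(pe_member01(5, 2, 1, 2, 2))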
8 changes: 1 addition & 7 deletions doc/tables/Tests.csv
@@ -1,6 +1,5 @@
Fundamental,Comprehensive,Test Name,PREDEF_GRID_NAME,CCPP_PHYS_SUITE,EXTRN_MDL_NAME_ICS,EXTRN_MDL_NAME_LBCS,DATES (UTC),FCST_LEN_HRS (hrs),est. core hours, walltime (min),notes
yes,yes,grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta,RRFS_CONUScompact_25km,FV3_RRFS_v1beta,HRRR,RAP,2020081000,3,8,22,
yes,yes,nco_grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_timeoffset_suite_GFS_v16,RRFS_CONUS_25km,FV3_GFS_v16,FV3GFS,FV3GFS,2022081012,6,10,15,
yes,yes,grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15p2,RRFS_CONUS_25km,FV3_GFS_v15p2,FV3GFS,FV3GFS,2019070100,6,7,10,
yes,yes,grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v17_p8_plot,RRFS_CONUS_25km,FV3_GFS_v17_p8,FV3GFS,FV3GFS,2019070100,6,11,15,
yes,yes,grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_HRRR_suite_HRRR,RRFS_CONUScompact_25km,FV3_HRRR,HRRR,HRRR,2020081000,24,26,20
@@ -50,11 +49,6 @@ yes,yes,grid_RRFS_CONUS_25km_ics_NAM_lbcs_NAM_suite_GFS_v16,RRFS_CONUS_25km,FV3_
,yes,grid_SUBCONUS_Ind_3km_ics_RAP_lbcs_RAP_suite_RRFS_v1beta,SUBCONUS_Ind_3km,FV3_RRFS_v1beta,RAP,RAP,2020080103,3,20,22,
,yes,MET_ensemble_verification_only_vx,RRFS_CONUS_25km,*none*,*none*,*none*,2019061500,6,1,15,Runs ensemble verification tasks on staged data without running the rest of the workflow
,yes,MET_verification_only_vx,RRFS_CONUS_25km,*none*,*none*,*none*,2019061500,6,1,8,Runs verification tasks on staged data without running the rest of the workflow
,yes,nco,RRFS_CONUS_25km,FV3_GFS_v16,FV3GFS,FV3GFS,2022040700,6,7,20,
,yes,nco_ensemble,RRFS_CONUS_25km,FV3_GFS_v15p2,FV3GFS,FV3GFS,2019070100 2019070112 2019070200 2019070212,6,55,20,
,yes,nco_grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16,RRFS_CONUS_13km,FV3_GFS_v16,FV3GFS,FV3GFS,2019061500,6,26,20,
,yes,nco_grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15_thompson_mynn_lam3km,RRFS_CONUS_3km,FV3_GFS_v15_thompson_mynn_lam3km,FV3GFS,FV3GFS,2019061500,6,320,25,
,yes,nco_grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR,RRFS_CONUScompact_25km,FV3_HRRR,HRRR,RAP,2020081000,6,12,17,
,yes,pregen_grid_orog_sfc_climo,RRFS_CONUS_25km,FV3_GFS_v15p2,FV3GFS,FV3GFS,2019070100,6,6,17,
,yes,specify_EXTRN_MDL_SYSBASEDIR_ICS_LBCS,RRFS_CONUS_25km,FV3_GFS_v15p2,FV3GFS,FV3GFS,2021061500,6,6,19,
,yes,specify_template_filenames,RRFS_CONUS_25km,FV3_GFS_v15p2,FV3GFS,FV3GFS,2019070100,6,6,28,
@@ -74,4 +68,4 @@ yes,yes,grid_RRFS_CONUS_25km_ics_NAM_lbcs_NAM_suite_GFS_v16,RRFS_CONUS_25km,FV3_
,,aqm_grid_AQM_NA13km_suite_GFS_v16,AQM_NA_13km,FV3_GFS_v16,FV3GFS,FV3GFS,2023021700 2023021706,6,,,This is an air-quality model test that requires special compilation to run; not supported in this release
,,grid_RRFS_NA_3km_ics_FV3GFS_lbcs_FV3GFS_suite_RRFS_v1beta,RRFS_NA_3km,FV3_RRFS_v1beta,FV3GFS,FV3GFS,2019070100,3,,,The RRFS_NA_3km domain currently has segfault problems--this test is not run
,,subhourly_post,RRFS_CONUScompact_25km,FV3_RRFS_v1beta,HRRR,RAP,2020081000,3,,,Subhourly post tasks are currently broken--these tests are not run
,,subhourly_post_ensemble_2mems,RRFS_CONUScompact_25km,FV3_RRFS_v1beta,HRRR,RAP,2020081000,3,,,Subhourly post tasks are currently broken--these tests are not run
,,subhourly_post_ensemble_2mems,RRFS_CONUScompact_25km,FV3_RRFS_v1beta,HRRR,RAP,2020081000,3,,,Subhourly post tasks are currently broken--these tests are not run
2 changes: 1 addition & 1 deletion modulefiles/build_derecho_intel.lua
@@ -6,7 +6,7 @@ the CISL machine Derecho (Cray) using Intel@2021.10.0
whatis([===[Loads libraries needed for building the UFS SRW App on Derecho ]===])

prepend_path("MODULEPATH","/lustre/desc1/scratch/epicufsrt/contrib/modulefiles_extra")
prepend_path("MODULEPATH", "/glade/work/epicufsrt/contrib/spack-stack/derecho/spack-stack-1.6.0/envs/unified-env/install/modulefiles/Core")
prepend_path("MODULEPATH", "/glade/work/epicufsrt/contrib/spack-stack/derecho/spack-stack-1.6.0/envs/upp-addon-env/install/modulefiles/Core")

load(pathJoin("stack-intel", os.getenv("stack_intel_ver") or "2021.10.0"))
load(pathJoin("stack-cray-mpich", os.getenv("stack_cray_mpich_ver") or "8.1.25"))
6 changes: 3 additions & 3 deletions modulefiles/build_gaea_intel.lua
@@ -5,11 +5,11 @@ the NOAA RDHPC machine Gaea C5 using Intel-2023.1.0

whatis([===[Loads libraries needed for building the UFS SRW App on Gaea C5 ]===])

prepend_path("MODULEPATH","/ncrc/proj/epic/spack-stack/spack-stack-1.6.0/envs/unified-env/install/modulefiles/Core")
stack_intel_ver=os.getenv("stack_intel_ver") or "2023.1.0"
prepend_path("MODULEPATH","/ncrc/proj/epic/spack-stack/spack-stack-1.6.0/envs/upp-addon-env/install/modulefiles/Core")
stack_intel_ver=os.getenv("stack_intel_ver") or "2023.2.0"
load(pathJoin("stack-intel", stack_intel_ver))

stack_mpich_ver=os.getenv("stack_mpich_ver") or "8.1.25"
stack_mpich_ver=os.getenv("stack_mpich_ver") or "8.1.28"
load(pathJoin("stack-cray-mpich", stack_mpich_ver))

stack_python_ver=os.getenv("stack_python_ver") or "3.10.13"
2 changes: 1 addition & 1 deletion modulefiles/build_hera_gnu.lua
@@ -7,7 +7,7 @@ whatis([===[Loads libraries needed for building the UFS SRW App on Hera using GN

prepend_path("MODULEPATH", "/scratch2/NCEPDEV/stmp1/role.epic/installs/gnu/modulefiles")
prepend_path("MODULEPATH", "/scratch2/NCEPDEV/stmp1/role.epic/installs/openmpi/modulefiles")
prepend_path("MODULEPATH", "/scratch2/NCEPDEV/stmp1/role.epic/spack-stack/spack-stack-1.6.0_gnu13/envs/ufs-wm-srw-rocky8/install/modulefiles/Core")
prepend_path("MODULEPATH", "/scratch2/NCEPDEV/stmp1/role.epic/spack-stack/spack-stack-1.6.0_gnu13/envs/upp-addon-env/install/modulefiles/Core")

load("stack-gcc/13.3.0")
load("stack-openmpi/4.1.6")
2 changes: 1 addition & 1 deletion modulefiles/build_hera_intel.lua
@@ -8,7 +8,7 @@ whatis([===[Loads libraries needed for building the UFS SRW App on Hera ]===])
prepend_path("MODULEPATH","/contrib/sutils/modulefiles")
load("sutils")

prepend_path("MODULEPATH", "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.6.0/envs/unified-env-rocky8/install/modulefiles/Core")
prepend_path("MODULEPATH", "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.6.0/envs/upp-addon-env/install/modulefiles/Core")

stack_intel_ver=os.getenv("stack_intel_ver") or "2021.5.0"
load(pathJoin("stack-intel", stack_intel_ver))
2 changes: 1 addition & 1 deletion modulefiles/build_hercules_intel.lua
@@ -5,7 +5,7 @@ the MSU machine Hercules using intel-oneapi-compilers/2022.2.1

whatis([===[Loads libraries needed for building the UFS SRW App on Hercules ]===])

prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/unified-env/install/modulefiles/Core")
prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/upp-addon-env/install/modulefiles/Core")

load("stack-intel/2021.9.0")
load("stack-intel-oneapi-mpi/2021.9.0")
2 changes: 1 addition & 1 deletion modulefiles/build_jet_intel.lua
@@ -5,7 +5,7 @@ the NOAA RDHPC machine Jet using Intel-2021.5.0

whatis([===[Loads libraries needed for building the UFS SRW App on Jet ]===])

prepend_path("MODULEPATH","/contrib/spack-stack/spack-stack-1.6.0/envs/unified-env-rocky8/install/modulefiles/Core")
prepend_path("MODULEPATH","/contrib/spack-stack/spack-stack-1.6.0/envs/upp-addon-env/install/modulefiles/Core")

load("stack-intel/2021.5.0")
load("stack-intel-oneapi-mpi/2021.5.1")
2 changes: 1 addition & 1 deletion modulefiles/build_noaacloud_intel.lua
@@ -5,7 +5,7 @@ the NOAA cloud using Intel-oneapi

whatis([===[Loads libraries needed for building the UFS SRW App on NOAA cloud ]===])

prepend_path("MODULEPATH", "/contrib/spack-stack-rocky8/spack-stack-1.6.0/envs/ue-intel/install/modulefiles/Core")
prepend_path("MODULEPATH", "/contrib/spack-stack-rocky8/spack-stack-1.6.0/envs/upp-addon-env/install/modulefiles/Core")
prepend_path("MODULEPATH", "/apps/modules/modulefiles")
load("gnu")
load("stack-intel")
2 changes: 1 addition & 1 deletion modulefiles/build_orion_intel.lua
@@ -5,7 +5,7 @@ the MSU machine Orion using intel-oneapi-compilers/2021.9.0

whatis([===[Loads libraries needed for building the UFS SRW App on Orion ]===])

prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.6.0/envs/unified-env-rocky9/install/modulefiles/Core")
prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.6.0/envs/upp-addon-env/install/modulefiles/Core")

load("stack-intel/2021.9.0")
load("stack-intel-oneapi-mpi/2021.9.0")
4 changes: 2 additions & 2 deletions modulefiles/srw_common.lua
@@ -10,8 +10,8 @@ load("fms/2023.04")

load("bacio/2.4.1")
load("crtm/2.4.0.1")
load("g2/3.4.5")
load("g2tmpl/1.10.2")
load("g2/3.5.1")
load("g2tmpl/1.13.0")
load("ip/4.3.0")
load("sp/2.5.0")
load("w3emc/2.10.0")
3 changes: 2 additions & 1 deletion modulefiles/tasks/gaea/python_srw.lua
@@ -3,5 +3,6 @@ unload("python")
load("conda")

setenv("SRW_ENV", "srw_app")
setenv("LD_PRELOAD", "/opt/cray/pe/gcc/12.2.0/snos/lib64/libstdc++.so.6")
setenv("LD_PRELOAD", "/usr/lib64/libstdc++.so.6")
setenv("FI_VERBS_PREFER_XRC", "0")

1 change: 0 additions & 1 deletion modulefiles/wflow_gaea.lua
@@ -11,7 +11,6 @@ load("rocoto")
load("conda")

pushenv("MKLROOT", "/opt/intel/oneapi/mkl/2023.1.0/")
setenv("LD_PRELOAD", "/opt/cray/pe/gcc/12.2.0/snos/lib64/libstdc++.so.6")

if mode() == "load" then
LmodMsgRaw([===[Please do the following to activate conda:
3 changes: 0 additions & 3 deletions parm/model_configure
@@ -1,5 +1,3 @@
total_member: 1
PE_MEMBER01: {{ PE_MEMBER01 }}
start_year: {{ start_year }}
start_month: {{ start_month }}
start_day: {{ start_day }}
@@ -13,7 +11,6 @@ ENS_SPS: .false.
dt_atmos: {{ dt_atmos }}
calendar: 'julian'
memuse_verbose: .false.
atmos_nthreads: {{ atmos_nthreads }}
restart_interval: {{ restart_interval }}
output_1st_tstep_rst: .false.
write_dopost: {{ write_dopost }}
25 changes: 19 additions & 6 deletions parm/ufs.configure
@@ -19,14 +19,14 @@ EARTH_attributes::

# ATM #
ATM_model: fv3
ATM_petlist_bounds: -1 -1
ATM_petlist_bounds: 0 {{ pe_member01_m1 }}
ATM_attributes::
Verbosity = 0
::

# AQM #
AQM_model: aqm
AQM_petlist_bounds: -1 -1
AQM_petlist_bounds: 0 {{ aqm_pe_member01_m1 }}
AQM_attributes::
Verbosity = 0
Diagnostic = 0
@@ -45,8 +45,21 @@ runSeq::
{% else %}
# EARTH #
EARTH_component_list: ATM
ATM_model: fv3
runSeq::
ATM
::
EARTH_attributes::
Verbosity = 0
::

# ATM #
ATM_model: fv3
ATM_petlist_bounds: 0 {{ pe_member01_m1 }}
ATM_omp_num_threads: {{ atm_omp_num_threads }}
ATM_attributes::
Verbosity = 0
Diagnostic = 0
::

# Run Sequence #
runSeq::
ATM
::
{% endif %}
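The new ufs.configure template variables are filled in by the SRW workflow generation scripts; the Python sketch below only illustrates one plausible reading of them, assuming pe_member01_m1 means PE_MEMBER01 minus one (zero-based PET indices) and atm_omp_num_threads is taken from OMP_NUM_THREADS_RUN_FCST. It is not repository code.

# Illustrative sketch of how the template variables could be derived.
pe_member01 = 24               # PE_MEMBER01, e.g. from the formula shown earlier
omp_num_threads_run_fcst = 2   # OMP_NUM_THREADS_RUN_FCST

context = {
    # Assumption: the "_m1" suffix means "minus one"; PET indices are
    # zero-based, so the ATM component spans PETs 0 .. PE_MEMBER01 - 1.
    "pe_member01_m1": pe_member01 - 1,
    # Assumption: threads per MPI task handed to ESMF-managed threading.
    "atm_omp_num_threads": omp_num_threads_run_fcst,
}

print("ATM_petlist_bounds: 0 {pe_member01_m1}".format(**context))
print("ATM_omp_num_threads: {atm_omp_num_threads}".format(**context))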
10 changes: 8 additions & 2 deletions scripts/exregional_plot_allvars.py
@@ -429,7 +429,7 @@ def setup_logging(debug=False):
t1a = time.perf_counter()

# Sea level pressure
slp = data1.select(name="Pressure reduced to MSL")[0].values * 0.01
slp = data1.select(name="MSLP (Eta model reduction)")[0].values * 0.01
slpsmooth = ndimage.gaussian_filter(slp, 13.78)

# 2-m temperature
@@ -484,7 +484,13 @@ def setup_logging(debug=False):
)

# Composite reflectivity
refc = data1.select(name="Maximum/Composite radar reflectivity")[0].values
# refc is the 37th entry in the GRIB2 post output file
# First rewind to the start of the GRIB2 file
data1.rewind()
# Advance 36 entries in the GRIB2 file
data1.seek(36)
# Read values from the 37th entry in the GRIB2 file
refc = data1.readline().values

if fhr > 0:
# Max/Min Hourly 2-5 km Updraft Helicity
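For context, the index-based composite-reflectivity read above uses standard pygrib calls. The sketch below shows the same idea in isolation; the file name is a hypothetical example of UPP post output, not a path from the repository.

# Minimal sketch of the record-index read used in exregional_plot_allvars.py.
import pygrib

data1 = pygrib.open("rrfs.t00z.prslev.f006.grib2")  # hypothetical filename

# Rewind to the start of the file, skip the first 36 records, and read the
# 37th record (composite reflectivity in the updated postxconfig layout).
data1.rewind()
data1.seek(36)
refc = data1.readline().values

# pygrib also provides 1-based random access, which avoids the manual seek.
refc_same = data1.message(37).values

data1.close()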
9 changes: 7 additions & 2 deletions scripts/exregional_plot_allvars_diff.py
@@ -446,9 +446,9 @@ def setup_logging(debug=False):
t1a = time.perf_counter()

# Sea level pressure
slp_1 = data1.select(name="Pressure reduced to MSL")[0].values * 0.01
slp_1 = data1.select(name="MSLP (Eta model reduction)")[0].values * 0.01
slpsmooth_1 = ndimage.gaussian_filter(slp_1, 13.78)
slp_2 = data2.select(name="Pressure reduced to MSL")[0].values * 0.01
slp_2 = data2.select(name="MSLP (Eta model reduction)")[0].values * 0.01
slpsmooth_2 = ndimage.gaussian_filter(slp_2, 13.78)
slp_diff = slp_2 - slp_1

@@ -544,7 +544,12 @@ def setup_logging(debug=False):
qpf_diff = qpf_2 - qpf_1

# Composite reflectivity
# refc is the 37th entry in the GRIB2 post output file
data1.rewind()
data1.seek(36)
refc_1 = data1.select(name="Maximum/Composite radar reflectivity")[0].values
data2.rewind()
data2.seek(36)
refc_2 = data2.select(name="Maximum/Composite radar reflectivity")[0].values

if fhr > 0:
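When the postxconfig-NT-fv3lam.txt control file changes, both the GRIB field names and the record order in the post output can shift, which is what forced the plot-script updates above. One way to check what a post output file actually contains is to print its pygrib inventory; the file name below is a hypothetical example.

# List every GRIB2 record so renamed fields (e.g., "Pressure reduced to MSL"
# vs. "MSLP (Eta model reduction)") and their record numbers can be confirmed
# before editing the plot scripts.
import pygrib

grbs = pygrib.open("rrfs.t00z.prslev.f006.grib2")  # hypothetical filename
for grb in grbs:
    print(grb.messagenumber, grb.name, grb.level, grb.typeOfLevel)
grbs.close()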
18 changes: 9 additions & 9 deletions tests/WE2E/machine_suites/comprehensive.derecho
@@ -1,12 +1,12 @@
2020_CAD
2020_CAPE
2019_hurricane_barry
2019_halloween_storm
2019_hurricane_lorenzo
2019_memorial_day_heat_wave
2020_denver_radiation_inversion
2020_easter_storm
2020_jan_cold_blast
#2020_CAD
#2020_CAPE
#2019_hurricane_barry
#2019_halloween_storm
#2019_hurricane_lorenzo
#2019_memorial_day_heat_wave
#2020_denver_radiation_inversion
#2020_easter_storm
#2020_jan_cold_blast
community
custom_ESGgrid
custom_ESGgrid_Central_Asia_3km