[develop]: Update ConfigWorkflow chapter to HEAD of develop (ufs-community#915)

* This PR updates the ConfigWorkflow.rst file to align with the current state of config_defaults.yaml in develop.
* This PR also adds in-code documentation for undocumented variables in config_defaults.yaml.
* It also cleans up a few odds and ends in other parts of the documentation to fix errors and incorporate updates that came into develop very recently.

---------

Co-authored-by: Michael J. Kavulich, Jr <kavulich@ucar.edu>
Co-authored-by: Michael Lueken <63728921+MichaelLueken@users.noreply.github.com>
Co-authored-by: Christina Holt <56881914+christinaholtNOAA@users.noreply.github.com>
4 people authored Oct 5, 2023
1 parent 0b1b070 commit 7d2cbfd
Showing 8 changed files with 1,692 additions and 734 deletions.
4 changes: 2 additions & 2 deletions docs/UsersGuide/source/BuildingRunningTesting/BuildSRW.rst
@@ -294,8 +294,8 @@ If the ``devbuild.sh`` approach failed, users need to set up their environment t

.. code-block:: console
source etc/lmod-setup.sh gaea
source etc/lmod-setup.csh gaea
source /path/to/ufs-srweather-app/etc/lmod-setup.sh gaea
source /path/to/ufs-srweather-app/etc/lmod-setup.csh gaea
.. note::

@@ -69,12 +69,12 @@ Build the Container
------------------------

.. hint::
If a ``singularity: command not found`` error message appears when working on Level 1 platforms, try running: ``module load singularity``.
If a ``singularity: command not found`` error message appears when working on Level 1 platforms, try running: ``module load singularity`` or (on Derecho) ``module load apptainer``.
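
For orientation, a minimal check-and-load sequence might look like the following; the exact module names and versions vary by platform, so treat these commands as illustrative rather than prescriptive:

.. code-block:: console

   module spider singularity      # list available Singularity/Apptainer modules, if any
   module load singularity        # on Derecho, use: module load apptainer
   singularity --version          # confirm the container runtime is on the PATH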

Level 1 Systems
^^^^^^^^^^^^^^^^^^

On most Level 1 systems, a container named ``ubuntu20.04-intel-srwapp-develop.img`` has already been built at the following locations:
On most Level 1 systems, a container named ``ubuntu20.04-intel-ue-1.4.1-srw-dev.img`` has already been built at the following locations:

.. table:: Locations of pre-built containers

@@ -95,29 +95,27 @@ On most Level 1 systems, a container named ``ubuntu20.04-intel-srwapp-develop.im
+--------------+--------------------------------------------------------+

.. note::
* Singularity is only available on the Gaea C5 partition, and therefore container use is only supported on Gaea C5.
* On Gaea, Singularity/Apptainer is only available on the C5 partition, and therefore container use is only supported on Gaea C5.
* The NOAA Cloud containers are accessible only to those with EPIC resources.

Users can simply set an environment variable to point to the container:

.. code-block:: console
export img=/path/to/ubuntu20.04-intel-srwapp-develop.img
export img=/path/to/ubuntu20.04-intel-ue-1.4.1-srw-dev.img
Users may convert the container ``.img`` file to a writable sandbox. This step is required when running on Cheyenne but is optional on other systems:

.. code-block:: console
singularity build --sandbox ubuntu20.04-intel-srwapp $img
.. COMMENT: What about on Derecho?
When making a writable sandbox on Level 1 systems, the following warnings commonly appear and can be ignored:

.. code-block:: console
INFO: Starting build...
INFO: Verifying bootstrap image ubuntu20.04-intel-srwapp-develop.img
INFO: Verifying bootstrap image ubuntu20.04-intel-ue-1.4.1-srw-dev.img
WARNING: integrity: signature not found for object group 1
WARNING: Bootstrap image could not be verified, but build will continue.
@@ -241,7 +239,7 @@ To generate the forecast experiment, users must:
#. :ref:`Set experiment parameters <SetUpConfigFileC>`
#. :ref:`Run a script to generate the experiment workflow <GenerateWorkflowC>`

The first two steps depend on the platform being used and are described here for Level 1 platforms. Users will need to adjust the instructions to their machine if their local machine is a Level 2-4 platform.
The first two steps depend on the platform being used and are described here for Level 1 platforms. Users will need to adjust the instructions to match their machine configuration if their local machine is a Level 2-4 platform.

.. _SetUpPythonEnvC:

@@ -277,8 +275,6 @@ The ``wflow_<platform>`` modulefile will then output instructions to activate th
then the user should run ``conda activate workflow_tools``. This will activate the ``workflow_tools`` conda environment. The command(s) will vary from system to system, but the user should see ``(workflow_tools)`` in front of the Terminal prompt at this point.
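
As an illustrative sketch (the exact command and environment name are printed by the ``wflow_<platform>`` modulefile on each system), activation looks roughly like this:

.. code-block:: console

   conda activate workflow_tools
   # The prompt should now be prefixed with the environment name:
   (workflow_tools) [user@host ufs-srweather-app]$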

.. COMMENT: Containers are old and still say regional_workflow...
.. _SetUpConfigFileC:

Configure the Workflow
@@ -295,13 +291,13 @@ where:
* ``-c`` indicates the compiler on the user's local machine (e.g., ``intel/2022.1.2``)
* ``-m`` indicates the :term:`MPI` on the user's local machine (e.g., ``impi/2022.1.2``)
* ``<platform>`` refers to the local machine (e.g., ``hera``, ``jet``, ``noaacloud``, ``mac``). See ``MACHINE`` in :numref:`Section %s <user>` for a full list of options.
* ``-i`` indicates the container image that was built in :numref:`Step %s <BuildC>` (``ubuntu20.04-intel-srwapp`` or ``ubuntu20.04-intel-srwapp-develop.img`` by default).
* ``-i`` indicates the container image that was built in :numref:`Step %s <BuildC>` (``ubuntu20.04-intel-srwapp`` or ``ubuntu20.04-intel-ue-1.4.1-srw-dev.img`` by default).

For example, on Hera, the command would be:

.. code-block:: console
./stage-srw.sh -c=intel/2022.1.2 -m=impi/2022.1.2 -p=hera -i=ubuntu20.04-intel-srwapp-develop.img
./stage-srw.sh -c=intel/2022.1.2 -m=impi/2022.1.2 -p=hera -i=ubuntu20.04-intel-ue-1.4.1-srw-dev.img
.. attention::

44 changes: 14 additions & 30 deletions docs/UsersGuide/source/BuildingRunningTesting/DefaultVarsTable.rst
@@ -11,36 +11,28 @@ Table of Variables in ``config_defaults.yaml``
* - Group Name
- Configuration variables
* - User
- RUN_ENVIR, MACHINE, ACCOUNT, HOMEdir, USHdir, SCRIPTSdir, JOBSdir, SORCdir, PARMdir, MODULESdir, EXECdir, VX_CONFIG_DIR, METPLUS_CONF, MET_CONFIG, UFS_WTHR_MDL_DIR, ARL_NEXUS_DIR
- RUN_ENVIR, MACHINE, ACCOUNT, HOMEdir, USHdir, SCRIPTSdir, JOBSdir, SORCdir, PARMdir, MODULESdir, EXECdir, METPLUS_CONF, UFS_WTHR_MDL_DIR, ARL_NEXUS_DIR
* - Platform
- WORKFLOW_MANAGER, NCORES_PER_NODE, TASKTHROTTLE, BUILD_MOD_FN, WFLOW_MOD_FN, BUILD_VER_FN, RUN_VER_FN, SCHED,PARTITION_DEFAULT, QUEUE_DEFAULT, PARTITION_HPSS,
QUEUE_HPSS, PARTITION_FCST, QUEUE_FCST, REMOVE_MEMORY, RUN_CMD_SERIAL, RUN_CMD_UTILS, RUN_CMD_FCST, RUN_CMD_POST, RUN_CMD_PRDGEN, RUN_CMD_AQM,
RUN_CMD_AQMLBC, SCHED_NATIVE_CMD, CCPA_OBS_DIR, MRMS_OBS_DIR, NDAS_OBS_DIR, NOHRSC_OBS_DIR, DOMAIN_PREGEN_BASEDIR, PRE_TASK_CMDS,
RUN_CMD_AQMLBC, SCHED_NATIVE_CMD, PRE_TASK_CMDS, CCPA_OBS_DIR, NOHRSC_OBS_DIR, MRMS_OBS_DIR, NDAS_OBS_DIR, DOMAIN_PREGEN_BASEDIR,
TEST_EXTRN_MDL_SOURCE_BASEDIR, TEST_AQM_INPUT_BASEDIR, TEST_PREGEN_BASEDIR, TEST_ALT_EXTRN_MDL_SYSBASEDIR_ICS, TEST_ALT_EXTRN_MDL_SYSBASEDIR_LBCS,
TEST_VX_FCST_INPUT_BASEDIR, FIXgsm, FIXaer, FIXlut, FIXorg, FIXsfc, FIXshp, FIXgsi, FIXcrtm, FIXcrtmupp, EXTRN_MDL_DATA_STORES
TEST_VX_FCST_INPUT_BASEDIR, FIXgsm, FIXaer, FIXlut, FIXorg, FIXsfc, FIXshp, FIXcrtm, FIXcrtmupp, EXTRN_MDL_DATA_STORES
* - Workflow
- WORKFLOW_ID, RELATIVE_LINK_FLAG, USE_CRON_TO_RELAUNCH, CRON_RELAUNCH_INTVL_MNTS, CRONTAB_LINE, LOAD_MODULES_RUN_TASK_FP, EXPT_BASEDIR, EXPT_SUBDIR, EXEC_SUBDIR,
EXPTDIR, DOT_OR_USCORE, EXPT_CONFIG_FN, CONSTANTS_FN, RGNL_GRID_NML_FN, FV3_NML_FN, FV3_NML_BASE_SUITE_FN, FV3_NML_YAML_CONFIG_FN, FV3_NML_BASE_ENS_FN,
FV3_EXEC_FN, DATA_TABLE_FN, DIAG_TABLE_FN, FIELD_TABLE_FN, DIAG_TABLE_TMPL_FN, FIELD_TABLE_TMPL_FN, MODEL_CONFIG_FN, NEMS_CONFIG_FN, AQM_RC_FN, AQM_RC_TMPL_FN,
FV3_NML_BASE_SUITE_FP, FV3_NML_YAML_CONFIG_FP, FV3_NML_BASE_ENS_FP, DATA_TABLE_TMPL_FP, DIAG_TABLE_TMPL_FP, FIELD_TABLE_TMPL_FP,
MODEL_CONFIG_TMPL_FP, NEMS_CONFIG_TMPL_FP, AQM_RC_TMPL_FP, DATA_TABLE_FP, FIELD_TABLE_FP, NEMS_CONFIG_FP, FV3_NML_FP, FV3_NML_CYCSFC_FP,
FV3_NML_RESTART_FP, FV3_NML_STOCH_FP, FV3_NML_RESTART_STOCH_FP, FCST_MODEL, WFLOW_XML_FN, GLOBAL_VAR_DEFNS_FN, ROCOTO_YAML_FN, EXTRN_MDL_VAR_DEFNS_FN,
MODEL_CONFIG_TMPL_FP, NEMS_CONFIG_TMPL_FP, AQM_RC_TMPL_FP, DATA_TABLE_FP, FIELD_TABLE_FP, NEMS_CONFIG_FP, FV3_NML_FP,
FV3_NML_STOCH_FP, FCST_MODEL, WFLOW_XML_FN, GLOBAL_VAR_DEFNS_FN, ROCOTO_YAML_FN, EXTRN_MDL_VAR_DEFNS_FN,
WFLOW_LAUNCH_SCRIPT_FN, WFLOW_LAUNCH_LOG_FN, GLOBAL_VAR_DEFNS_FP, ROCOTO_YAML_FP, WFLOW_LAUNCH_SCRIPT_FP, WFLOW_LAUNCH_LOG_FP, FIXdir, FIXam,
FIXclim, FIXlam, THOMPSON_MP_CLIMO_FN, THOMPSON_MP_CLIMO_FP, CCPP_PHYS_SUITE, CCPP_PHYS_SUITE_FN, CCPP_PHYS_SUITE_IN_CCPP_FP, CCPP_PHYS_SUITE_FP, CCPP_PHYS_DIR,
FIELD_DICT_FN, FIELD_DICT_IN_UWM_FP, FIELD_DICT_FP, GRID_GEN_METHOD, PREDEF_GRID_NAME, DATE_FIRST_CYCL, DATE_LAST_CYCL, INCR_CYCL_FREQ, FCST_LEN_HRS,
FCST_LEN_CYCL, LONG_FCST_LEN, CYCL_HRS_SPINSTART, CYCL_HRS_PRODSTART, BOUNDARY_LEN_HRS, BOUNDARY_LONG_LEN_HRS, BOUNDARY_PROC_GROUP_NUM,
PREEXISTING_DIR_METHOD, VERBOSE, DEBUG, COMPILER, SYMLINK_FIX_FILES, DO_REAL_TIME, COLDSTART, WARMSTART_CYCLE_DIR,
FCST_LEN_CYCL, LONG_FCST_LEN, PREEXISTING_DIR_METHOD, VERBOSE, DEBUG, COMPILER, SYMLINK_FIX_FILES, DO_REAL_TIME, COLDSTART, WARMSTART_CYCLE_DIR,
* - NCO
- envir_default, NET_default, RUN_default, model_ver_default, OPSROOT_default, COMROOT_default, DATAROOT_default, DCOMROOT_default, LOGBASEDIR_default,
COMIN_BASEDIR, COMOUT_BASEDIR, NWGES, NWGES_BASEDIR, DBNROOT_default, SENDECF_default, SENDDBN_default, SENDDBN_NTC_default, SENDCOM_default,
COMIN_BASEDIR, COMOUT_BASEDIR, DBNROOT_default, SENDECF_default, SENDDBN_default, SENDDBN_NTC_default, SENDCOM_default,
SENDWEB_default, KEEPDATA_default, MAILTO_default, MAILCC_default
* - gsi
- niter1, niter2, l_obsprvdiag, diag_radardbz, write_diag_2, bkgerr_vs, bkgerr_hzscl, usenewgfsberror, netcdf_diag, binary_diag, readin_localization,
beta1_inv, ens_h, ens_v, regional_ensemble_option, grid_ratio_fv3, grid_ratio_ens, i_en_perts_io, q_hyb_ens, ens_fast_read, l_PBL_pseudo_SurfobsT,
l_PBL_pseudo_SurfobsQ, i_use_2mQ4B, i_use_2mT4B, i_T_Q_adjust, l_rtma3d, i_precip_vertical_check, HYBENSMEM_NMIN, ANAVINFO_FN, ANAVINFO_DBZ_FN,
ENKF_ANAVINFO_FN, ENKF_ANAVINFO_DBZ_FN, CONVINFO_FN, BERROR_FN, OBERROR_FN, HYBENSINFO_FN, cld_bld_hgt, l_precip_clear_only, l_qnr_from_qr, beta_recenter
* - rrfs
- DO_RRFS_DEV, DO_NLDN_LGHT, DO_ENKFUPDATE, DO_DACYCLE, DO_SURFACE_CYCLE
* - task_make_grid
- GRID_DIR, ESGgrid_LON_CTR, ESGgrid_LAT_CTR, ESGgrid_DELX, ESGgrid_DELY, ESGgrid_NX, ESGgrid_NY, ESGgrid_WIDE_HALO_WIDTH, ESGgrid_PAZI,
GFDLgrid_LON_T6_CTR, GFDLgrid_LAT_T6_CTR, GFDLgrid_NUM_CELLS, GFDLgrid_STRETCH_FAC, GFDLgrid_REFINE_RATIO, GFDLgrid_ISTART_OF_RGNL_DOM_ON_T6G,
@@ -53,7 +45,7 @@ Table of Variables in ``config_defaults.yaml``
- EXTRN_MDL_NAME_ICS, EXTRN_MDL_ICS_OFFSET_HRS, FV3GFS_FILE_FMT_ICS, EXTRN_MDL_SYSBASEDIR_ICS, USE_USER_STAGED_EXTRN_FILES,
EXTRN_MDL_SOURCE_BASEDIR_ICS, EXTRN_MDL_FILES_ICS
* - task_get_extrn_lbcs
- EXTRN_MDL_NAME_LBCS, LBC_SPEC_INTVL_HRS, EXTRN_MDL_LBCS_OFFSET_HRS, FV3GFS_FILE_FMT_LBCS, LBCS_SEARCH_HRS, EXTRN_MDL_LBCS_SEARCH_OFFSET_HRS, EXTRN_MDL_SYSBASEDIR_LBCS,
- EXTRN_MDL_NAME_LBCS, LBC_SPEC_INTVL_HRS, EXTRN_MDL_LBCS_OFFSET_HRS, FV3GFS_FILE_FMT_LBCS, EXTRN_MDL_SYSBASEDIR_LBCS,
USE_USER_STAGED_EXTRN_FILES,EXTRN_MDL_SOURCE_BASEDIR_LBCS, EXTRN_MDL_FILES_LBCS
* - task_make_ics
- KMP_AFFINITY_MAKE_ICS, OMP_NUM_THREADS_MAKE_ICS, OMP_STACKSIZE_MAKE_ICS, USE_FVCOM, FVCOM_WCSTART, FVCOM_DIR, FVCOM_FILE, VCOORD_FILE
@@ -71,14 +63,6 @@ Table of Variables in ``config_defaults.yaml``
- KMP_AFFINITY_RUN_PRDGEN, OMP_NUM_THREADS_RUN_PRDGEN, OMP_STACKSIZE_RUN_PRDGEN, DO_PARALLEL_PRDGEN, ADDNL_OUTPUT_GRIDS: []
* - task_plot_allvars:
- COMOUT_REF, PLOT_FCST_START, PLOT_FCST_INC, PLOT_FCST_END, PLOT_DOMAINS
* - task_analysis_gsi
- TN_ANALYSIS_GSI, TN_OBSERVER_GSI, TN_OBSERVER_GSI_ENSMEAN, KMP_AFFINITY_ANALYSIS, OMP_NUM_THREADS_ANALYSIS, OMP_STACKSIZE_ANALYSIS, OBSPATH_TEMPLATE
* - task_process_radarref
- RADAR_REF_THINNING, RADARREFL_MINS, RADARREFL_TIMELEVEL, OBS_SUFFIX
* - task_get_da_obs
- NLDN_NEEDED, NLDN_LIGHTNING, NSSLMOSAIC, RAP_OBS_BUFR
* - task_process_bufrobs
- OBSPATH_TEMPLATE
* - task_nexus_emission
- PPN_NEXUS_EMISSION, KMP_AFFINITY_NEXUS_EMISSION, OMP_NUM_THREADS_NEXUS_EMISSION, OMP_STACKSIZE_NEXUS_EMISSION
* - task_bias_correction_o3
@@ -88,19 +72,19 @@ Table of Variables in ``config_defaults.yaml``
* - Global
- USE_CRTM, CRTM_DIR, DO_ENSEMBLE, NUM_ENS_MEMBERS, ENSMEM_NAMES, FV3_NML_ENSMEM_FPS, ENS_TIME_LAG_HRS, DO_SHUM, DO_SPPT, DO_SKEB, ISEED_SHUM, ISEED_SPPT, ISEED_SKEB, NEW_LSCALE, SHUM_MAG, SHUM_LSCALE, SHUM_TSCALE, SHUM_INT,
SPPT_MAG, SPPT_LOGIT, SPPT_LSCALE, SPPT_TSCALE, SPPT_INT, SPPT_SFCLIMIT,
SKEB_MAG, SKEB_LSCALE, SKEP_TSCALE, SKEB_INT, SKEBNORM, SKEB_VDOF, USE_ZMTNBLCK, DO_SPP, ISEED_SPP, SPP_VAR_LIST, SPP_MAG_LIST, SPP_LSCALE,
SPP_TSCALE, SPP_SIGTOP1, SPP_SIGTOP2, SPP_STDDEV_CUTOFF, DO_LSM_SPP, LSM_SPP_TSCALE, LSM_SPP_LSCALE, ISEED_LSM_SPP, LSM_SPP_VAR_LIST,
SKEB_MAG, SKEB_LSCALE, SKEP_TSCALE, SKEB_INT, SKEBNORM, SKEB_VDOF, USE_ZMTNBLCK, DO_SPP, SPP_VAR_LIST, SPP_MAG_LIST, SPP_LSCALE,
SPP_TSCALE, SPP_SIGTOP1, SPP_SIGTOP2, SPP_STDDEV_CUTOFF, ISEED_SPP, DO_LSM_SPP, LSM_SPP_TSCALE, LSM_SPP_LSCALE, ISEED_LSM_SPP, LSM_SPP_VAR_LIST,
LSM_SPP_MAG_LIST, HALO_BLEND, PRINT_DIFF_PGR
* - Verification
- OBS_CCPA_APCP01h_FN_TEMPLATE, OBS_CCPA_APCPgt01h_FN_TEMPLATE, OBS_MRMS_REFC_FN_TEMPLATE, OBS_MRMS_RETOP_FN_TEMPLATE,
OBS_NDAS_SFCorUPA_FN_TEMPLATE, OBS_NDAS_SFCorUPA_FN_METPROC_TEMPLATE, VX_FCST_MODEL_NAME, VX_FIELDS, VX_APCP_ACCUMS_HRS, VX_FCST_INPUT_BASEDIR,
- OBS_CCPA_APCP01h_FN_TEMPLATE, OBS_CCPA_APCPgt01h_FN_TEMPLATE, OBS_NOHRSC_ASNOW_FN_TEMPLATE, OBS_MRMS_REFC_FN_TEMPLATE, OBS_MRMS_RETOP_FN_TEMPLATE,
OBS_NDAS_SFCorUPA_FN_TEMPLATE, OBS_NDAS_SFCorUPA_FN_METPROC_TEMPLATE, VX_FCST_MODEL_NAME, VX_FIELDS, VX_APCP_ACCUMS_HRS, VX_ASNOW_ACCUMS_HRS, VX_FCST_INPUT_BASEDIR,
VX_OUTPUT_BASEDIR, VX_NDIGITS_ENSMEM_NAMES, FCST_SUBDIR_TEMPLATE, FCST_FN_TEMPLATE, FCST_FN_METPROC_TEMPLATE, NUM_MISSING_OBS_FILES_MAX, NUM_MISSING_FCST_FILES_MAX
* - cpl_aqm_parm
- CPL_AQM, DO_AQM_DUST, DO_AQM_CANOPY, DO_AQM_PRODUCT, DO_AQM_CHEM_LBCS, DO_AQM_GEFS_LBCS, DO_AQM_SAVE_AIRNOW_HIST, DO_AQM_SAVE_FIRE, DCOMINbio_default,
DCOMINdust_default, DCOMINcanopy_default, DCOMINfire_default, DCOMINchem_lbcs_default, DCOMINgefs_default, DCOMINpt_src_default,
DCOMINairnow_default, COMINbicor, COMOUTbicor, AQM_CONFIG_DIR, AQM_BIO_FILE, AQM_DUST_FILE_PREFIX, AQM_DUST_FILE_SUFFIX, AQM_CANOPY_FILE_PREFIX,
AQM_CANOPY_FILE_SUFFIX, AQM_FIRE_FILE_PREFIX, AQM_FIRE_FILE_SUFFIX, AQM_FIRE_FILE_OFFSET_HRS, AQM_FIRE_ARCHV_DIR, AQM_RC_FIRE_FREQUENCY,
AQM_RC_PRODUCT_FN, AQM_RC_PRODUCT_FREQUENCY, AQM_LBCS_FILES, AQM_GEFS_FILE_PREFIX, AQM_GEFS_FILE_CYC, NEXUS_INPUT_DIR, NEXUS_FIX_DIR,
NEXUS_GRID_FN, NUM_SPLIT_NEXUS: 3NEXUS_GFS_SFC_OFFSET_HRS, NEXUS_GFS_SFC_DIR, NEXUS_GFS_SFC_ARCHV_DIR
NEXUS_GRID_FN, NUM_SPLIT_NEXUS, NEXUS_GFS_SFC_OFFSET_HRS, NEXUS_GFS_SFC_DIR, NEXUS_GFS_SFC_ARCHV_DIR
* - Rocoto
- attrs, cycledefs, entities, log, tasks: taskgroups
- attrs, cycledefs, entities, log, tasks, taskgroups
6 changes: 2 additions & 4 deletions docs/UsersGuide/source/BuildingRunningTesting/RunSRW.rst
@@ -206,7 +206,7 @@ In future shells, you can activate and use this environment with:
source ~/conda/etc/profile.d/conda.sh
conda activate uwtools
See the `workflow-tools respository <https://github.com/ufs-community/workflow-tools>`__ for additional documentation.
See the `workflow-tools repository <https://github.com/ufs-community/workflow-tools>`__ for additional documentation.

Modify a ``wflow_<platform>`` File
``````````````````````````````````````
@@ -561,8 +561,6 @@ output over the :term:`CONUS`. It generates graphics plots for a number of varia
* Max/Min 2-5 km updraft helicity
* Sea level pressure (SLP)

.. COMMENT: * 500 hPa heights, winds, and vorticity --> seems to be omitted? Why?
This workflow task can produce both plots from a single experiment and difference plots that compare the same cycle from two experiments. When plotting the difference, the two experiments must be on the same domain and available for
the same cycle starting date/time and forecast hours. Other parameters may differ (e.g., the experiments may use different physics suites).

@@ -702,7 +700,7 @@ Run the following command from the ``ufs-srweather-app/ush`` directory to genera
The last line of output from this script, starting with ``*/1 * * * *`` or ``*/3 * * * *``, can be saved and used later to automatically run portions of the workflow if users have the Rocoto workflow manager installed on their system.
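
For example, a saved crontab entry typically pairs the relaunch interval with the experiment's launch script. The line below is a hypothetical sketch; the experiment path and interval are placeholders, and the exact line to use is the one printed by the workflow generation script:

.. code-block:: console

   */3 * * * * cd /path/to/expt_dirs/test_community && ./launch_FV3LAM_wflow.sh called_from_cron="TRUE"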

This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The flowchart in :numref:`Figure %s <WorkflowGeneration>` describes the experiment generation process. The ``generate_FV3LAM_wflow.py``:
This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The flowchart in :numref:`Figure %s <WorkflowGeneration>` describes the experiment generation process. The ``generate_FV3LAM_wflow.py`` script:

#. Runs the ``setup.py`` script to set the configuration parameters. This script reads three other configuration scripts in order:

24 changes: 21 additions & 3 deletions docs/UsersGuide/source/BuildingRunningTesting/WE2Etests.rst
@@ -26,6 +26,9 @@ The list of fundamental and comprehensive tests can be viewed in the ``ufs-srwea

For convenience, the WE2E tests are currently grouped into the following categories (under ``ufs-srweather-app/tests/WE2E/test_configs/``):

* ``aqm``
This category tests the :term:`AQM` configuration of the SRW App.

* ``custom_grids``
This category tests custom grids aside from those specified in ``ufs-srweather-app/ush/predef_grid_params.yaml``. These tests help ensure a wide range of domain sizes, resolutions, and locations will work as expected. These test files can also serve as examples for how to set your own custom domain.

@@ -38,15 +41,30 @@ For convenience, the WE2E tests are currently grouped into the following categor
* ``grids_extrn_mdls_suites_nco``
This category of tests ensures that the workflow running in **NCO mode** (i.e., with ``RUN_ENVIR`` set to ``"nco"``) completes successfully for various combinations of predefined grids, physics suites, and input data from different external models. Note that in NCO mode, an operational run environment is used. This involves a specific directory structure and variable names (see :numref:`Section %s <NCOModeParms>`).

* ``ufs_case_studies``
This category tests that the workflow running in community mode completes successfully when running cases derived from the `ufs-case-studies repository <https://github.com/dtcenter/ufs-case-studies>`__.

* ``verification``
This category specifically tests the various combinations of verification capabilities using METPlus.

* ``release_SRW_v1``
This category was reserved for the official "Graduate Student Test" case for the Version 1 SRW code release.

* ``wflow_features``
This category of tests ensures that the workflow completes successfully with particular features/capabilities activated.

.. note::

Users should be aware that some tests assume :term:`HPSS` access.

* ``custom_ESGgrid_Great_Lakes_snow_8km`` and ``MET_verification_only_vx_time_lag`` require HPSS access, as well as ``rstprod`` access on both :term:`RDHPCS` and HPSS.
* On certain machines, the *community* test assumes HPSS access. If the ``ush/machine/*.yaml`` file contains the following lines, and these paths are different from what is provided in ``TEST_EXTRN_MDL_SOURCE_BASEDIR``, users will need to have HPSS access or modify the tests to point to another data source:

.. code-block:: console
data:
ics_lbcs:
FV3GFS:
RAP:
HRRR:
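
For illustration, a hypothetical entry of this form maps each external model to a staged data location; the paths below are placeholders, not actual platform paths:

.. code-block:: console

   data:
     ics_lbcs:
       FV3GFS: /path/to/staged/FV3GFS/files   # placeholder
       RAP: /path/to/staged/RAP/files         # placeholder
       HRRR: /path/to/staged/HRRR/files       # placeholder
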
Some tests are duplicated among the above categories via symbolic links, both for legacy reasons (when tests for different capabilities were consolidated) and for convenience when a user would like to run all tests for a specific category (e.g., verification tests).

Running the WE2E Tests