Configuring chgres to Work With Older NCEP FNL Analyses #954
I don't have read permissions for your files here.
I have updated the permissions - it should be readable now.
I see what is happening. Doing a dump of fnl_20080910_00_00.grib2, I see what looks like soil temperature in records 235-238:
If I check these records for the GRIB2 discipline, I see '0' - meteorological products:
https://www.nco.ncep.noaa.gov/pmb/docs/grib2/grib2_doc/grib2_table4-1.shtml#2
So, to get …
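For anyone following along: the discipline George is checking lives in octet 7 of each GRIB2 message's indicator section (Section 0), per the GRIB2 specification. A minimal, self-contained Python sketch (scanning synthetic in-memory messages, not the actual FNL file) of how to pull it out:

```python
import struct

def grib2_disciplines(data: bytes):
    """Scan a buffer of concatenated GRIB2 messages and return the
    discipline octet (Section 0, octet 7) of each edition-2 message."""
    disciplines = []
    pos = 0
    while True:
        pos = data.find(b"GRIB", pos)
        if pos == -1:
            break
        # Section 0 is 16 octets: "GRIB", 2 reserved octets, discipline,
        # edition number, then an 8-byte big-endian total message length.
        discipline = data[pos + 6]
        edition = data[pos + 7]
        msg_len = struct.unpack(">Q", data[pos + 8:pos + 16])[0]
        if edition == 2:
            disciplines.append(discipline)
        pos += msg_len

    return disciplines

def fake_msg(discipline: int) -> bytes:
    """Build a minimal synthetic GRIB2 message for demonstration only."""
    body = b"GRIB" + b"\x00\x00" + bytes([discipline, 2])
    total = len(body) + 8 + 4  # + length field + "7777" end section
    return body + struct.pack(">Q", total) + b"7777"

# One message filed under discipline 0 (meteorological), one under 2
# (land surface) - the distinction at issue in this thread.
buf = fake_msg(0) + fake_msg(2)
print(grib2_disciplines(buf))  # [0, 2]
```

In practice you would inspect a real file with wgrib2 rather than raw bytes, but the sketch shows why records 235-238 can "look like" soil temperature while still being filed under discipline 0.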
I see - is there a way to get chgres_cube to use those even though they are meteorological data?
Not without some code modifications. Are you using the head of develop?
Yeah, it's the support/HAFS branch, which I believe was branched off of develop: commit e6eec36 (HEAD, origin/support/HAFS, origin/HEAD, support/HAFS)
The code mods to get past the error should be minimal. But there could be other issues with this old data. I can try to run the case myself. Don't get rid of the input data or the fort.41 namelist. Make sure I have read permissions on the data.
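A hypothetical sketch of the kind of change involved - the real chgres_cube GRIB2 reader in UFS_UTILS is Fortran, and the names and the second parameter triple below are assumptions for illustration, not actual UFS_UTILS identifiers. The idea is to accept a soil-temperature record under more than one (discipline, category, parameter) identity:

```python
# Hypothetical sketch, not chgres_cube source. (discipline, category,
# parameter number) triples treated as soil temperature:
SOIL_TEMP_IDS = {
    (2, 0, 2),   # discipline 2 (land surface products): soil temperature
    (0, 0, 0),   # discipline 0 (meteorological): plain temperature, as
                 # the 2008-era FNL files appear to encode it (assumption)
}

def is_soil_temp(discipline: int, category: int, number: int) -> bool:
    """Accept a soil-temperature record under any known identity."""
    return (discipline, category, number) in SOIL_TEMP_IDS

# A discipline-0 record that a strict discipline-2 check would reject:
print(is_soil_temp(0, 0, 0))  # True
print(is_soil_temp(0, 1, 8))  # False
```

A real fix would also need to check the level type (depth below land surface) so that ordinary atmospheric temperature records are not mistaken for soil temperature.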
OK, I will leave it untouched for now.
Andy
I can't get to your working directory. Can you leave the fort.41 file somewhere for me and make sure I can read all the files it points to? Thanks.
I changed the working directory to be readable - it's here: /work2/noaa/aoml-hafs1/ahazelto/hafstmp2024/HAFS_tutorial_test0_oldgfs/2008091000/09L/atm_ic/. Let me know if you still can't access it.
Got stopped at this point:
OK, does it look better now? drwxr-sr-- 12 ahazelto aoml-hafs1 4096 May 29 15:55 hafstmp2024/
Try …
I did a chmod -R 755 on /work2/noaa/aoml-hafs1/ahazelto/hafstmp2024/HAFS_tutorial_test0_oldgfs/. Does that work?
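An aside on the permissions back-and-forth above: to reach a file, every directory component of the path needs execute (search) permission for the reading user, which is what `chmod -R 755` grants within the tree - but parent directories above it (like the `drwxr-sr--` one shown earlier) must also be traversable. A small Python sketch, run on a temporary tree rather than the Orion paths, that flags components blocking "other" users:

```python
import os
import stat
import tempfile

def blocked_components(path):
    """Return directory components of `path` that deny 'other' users
    execute (search) permission - any one of them blocks access."""
    blocked = []
    parts = os.path.abspath(path).split(os.sep)
    for i in range(1, len(parts) + 1):
        component = os.sep if i == 1 else os.sep.join(parts[:i])
        st = os.stat(component)
        if stat.S_ISDIR(st.st_mode) and not (st.st_mode & stat.S_IXOTH):
            blocked.append(component)
    return blocked

# Demo: mkdtemp creates a mode-0700 directory, and the child is 0750 -
# both lack o+x, so both show up as blockers.
root = tempfile.mkdtemp()
locked = os.path.join(root, "locked")
os.mkdir(locked, 0o750)
print(blocked_components(locked))
```

This is why fixing permissions only on the working directory itself was not enough until the enclosing directories were opened up as well.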
Yes. I am able to reproduce the error. Don't remove any of your files. My test data and script are here: /work2/noaa/da/ggayno/save/chgres.old.grib2
Hi George, just wanted to follow up on this and see if you had any further insights or suggestions? Thanks! |
I documented what is happening here: #954 (comment) Code changes will be required to work with this data. I can assist if you want to try updating the code yourself.
Hello,
We are hoping to run HAFS on some older hurricanes (for example, Hurricane Ike in 2008). We were attempting to initialize from the NCEP FNL analyses (https://rda.ucar.edu/datasets/ds083.2/). Many of the variables there appear to be the same as those in the current GFS analyses, and chgres_cube did seem to create some of the atmospheric ICs successfully.
The log file for this test is on Orion: /work2/noaa/aoml-hafs1/ahazelto/logs/hafs_atm_ic.log
The files used for ICs/BCs are here: /work2/noaa/aoml-hafs1/ahazelto/hafs-input/GFS_FNL/gfs.20080910/00/atmos/
Is there a way to modify chgres to use these files, or provide a proxy for some of the needed soil variables?
Thanks,
Andy Hazelton