The grib_filter option on http://nomads.ncep.noaa.gov is used to download
subsets of NCEP's forecasts. As the name suggests, it works with grib (version 2)
files and acts as a filter: it lets you select variables, levels, and a regional subset.
As model resolution grows faster than bandwidth or the money to pay for
bandwidth, users will have to start using facilities such as grib_filter.
Grib_filter is an interactive facility for downloading subsets of the data, so it seems
unsuited to downloading more than a few files. However, we were tricky. Once you learn how
to download the data interactively, you can click a button and generate a magic URL.
Once you have generated this magic URL, you give it to curl or wget to download
the data. Using scripting 101, you can write a script that downloads the data for other times and
forecast hours. Using cronjobs 101, you can run that script every day and get your daily forecasts.
All my examples use the bash shell and the curl program. The procedure
is simple, so many users have implemented it in their own ways.
For my example, I am using the 1x1 GFS forecasts on http://nomads.ncep.noaa.gov. I selected the directory
gfs.2013082218 and the file gfs.t18z.pgrbf00.grib2. Then I selected three levels
(500, 700 and 900 mb) and five variables (HGT, RH, TMP, UGRD, VGRD). I enabled the
subregion option ("make subregion") and selected the domain 20N-60N, 250E-330E.
Finally I selected the option "Show the URL only for web programming".
When I click on "Download", I get the magic URL:
If you study the above URL, you see that the arguments start after the question mark and
are separated by ampersands (&). You may also notice that slashes have been
replaced by %2F. This is all standard stuff when working with URLs. If you look closely,
you see the name of the file being processed (file=gfs.t18z.pgrbf00.grib2) as well as the
directory (dir=%2Fgfs.2013082218). (%2F translates into a /.)
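The decoding is easy to do in a script. Here is a minimal sketch (the dir value is copied from the example above) that undoes the %2F escaping with sed:

```shell
# Decode the percent-encoded directory parameter from the magic URL.
# %2F is the URL escape for "/", so a simple substitution recovers the path.
dir_param="%2Fgfs.2013082218"
decoded=$(printf '%s' "$dir_param" | sed 's|%2F|/|g')
echo "$decoded"     # prints /gfs.2013082218
```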
Here is a prototype script to "execute" the URL and download the file. In the script,
I've replaced the date codes and forecast hour with variables.
#!/bin/bash
# define URL: paste the magic URL here, with the date code and
# forecast hour replaced by variables (the "..." is a placeholder)
URL="..."
# download file
curl "$URL" -o download.grb
# add a sleep to prevent a denial of service in case of missing file
sleep 10
I copied the above script into a file, changed the date code, and chmod'ed it to 755.
It ran. If it doesn't run, the error message will be in download.grb.
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 71185    0 71185    0     0    660      0 --:--:--  0:01:47 --:--:--  2821
ebis@linux-landing:/tmp$ wgrib2 download.grb
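Since a failed request leaves the server's error message in download.grb, it helps to check the file before feeding it to wgrib2. A minimal sketch, assuming error replies are plain text while grib2 data starts with the four bytes "GRIB" (the function name is mine):

```shell
# Return success (exit 0) when the file begins with the grib magic bytes "GRIB".
# A server error reply is text/HTML, so the check fails on it.
is_grib() {
    [ "$(head -c 4 "$1" 2>/dev/null)" = "GRIB" ]
}

# usage: if is_grib download.grb ; then echo "got data" ; else cat download.grb ; fi
```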
Converting the above script to download the required forecast hours, to
run with the current date, and to run as a cron job is scripting 101.
If you have problems, ask your local scripting guru for help. Want to convert
the above bash script to Windows? I am sure it can be done, and a Windows
guru can help.
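That scripting-101 step can be sketched as a loop over forecast hours. The URL template below is a placeholder, not a real magic URL -- paste your own, keeping ${hr} where the forecast hour appears and ${date} where the date code appears:

```shell
#!/bin/bash
# Sketch: build one request URL per forecast hour from a template.
# The "..." pieces stand for the rest of your own magic URL.
date=2013082218                 # or, in a cron job: date=$(date -u +%Y%m%d)18
for hr in 00 06 12 18 24 ; do
    URL="...file=gfs.t18z.pgrbf${hr}.grib2...dir=%2Fgfs.${date}..."
    echo "$URL"                 # for real downloads, replace echo with:
    # curl "$URL" -o "gfs_f${hr}.grb"
    # sleep 10                  # pause in case files are missing
done
```

A crontab entry such as 45 23 * * * /home/me/get_gfs.sh (cronjobs 101) would then run it every evening; the start time here is only a guess at when the 18Z files appear.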
Compatibility: as a design criterion, new versions of grib_filter should
not break existing user scripts. The API should not change between versions
of grib_filter. However, the model will evolve over
time, and so will the available fields. That can break user scripts.
Comments: the current grib_filter (as of 8-2013) is labeled as beta code and has been
in operations for many years. We will upgrade grib_filter, but the old URLs will
remain compatible with the new version. The new grib_filter will allow interpolation
to different grids and extraction of values at lon-lat locations.
The screen capture is based on nomads 2013-08. Future screens may look different.