The grib_filter option on http://nomads.ncep.noaa.gov is used to download
subsets of NCEP's forecasts. As the name suggests, the option works with grib (version 2)
files and is a filter. This option allows you to select regional subsets as well as levels and variables.
As model resolution grows faster than bandwidth, or the money to pay for additional
bandwidth, users will have to start using facilities such as grib_filter.
Grib_filter is an interactive facility for downloading subsets of the data, so it seems
unsuited to the needs of forecast operations. However, we were tricky. Once you learn how
to download the data interactively, you can click a button and generate a magic URL.
Once you have generated this magic URL, you give it to curl or wget to download
the data. Using scripting 101, you can write a script that downloads the data for other
times and forecast hours.
All my examples use the bash shell and the curl program. The procedure
is simple, so converting it to another programming language is straightforward.
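
At its core, every example boils down to handing the magic URL to a download program.
A minimal sketch (it assumes the generated URL is already in the shell variable URL;
the output file name is my own choice):

curl "$URL" -o download.grb
# or, equivalently, with wget
wget "$URL" -O download.grb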
For my example, I am using the 1x1 GFS forecasts on http://nomads.ncep.noaa.gov. I selected the directory
gfs.2013072218 and the file gfs.t18z.pgrbf00.grib2. Then I selected 3 levels
(500, 700 and 1000 mb) and 5 variables (HGT, RH, TMP, UGRD, VGRD). I enabled the
subregion option ("make subregion") and selected the domain 20N-60N, 250E-330E.
Finally, I selected the option "Show the URL only for web programming".
When I clicked on "Download", I got the magic URL:
URL=
http://nomads.ncep.noaa.gov/cgi-bin/filter_gfs.pl?file=gfs.t18z.pgrbf00.grib2&lev_500_mb=on&lev_700_mb=on&lev_1000_mb=on&var_HGT=on&var_RH=on&var_TMP=on&var_UGRD=on&var_VGRD=on&subregion=&leftlon=250&rightlon=330&toplat=60&bottomlat=20&dir=%2Fgfs.2013072218
If you study the above URL, you see that the arguments start after the question mark and
are separated by ampersands (&). You may also notice that slashes have been
replaced by %2F. This is all standard stuff when working with URLs. If you look closely,
you see the name of the file being processed (file=gfs.t18z.pgrbf00.grib2) as well as the
directory (dir=%2Fgfs.2013072218). (%2F translates into a /.)
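If you ever have to build the dir argument yourself, the only encoding you need to
reproduce is that %2F. A minimal sketch (the variable names are my own):

dir="/gfs.2013072218"
# replace each / with its percent-encoded form, %2F
encoded_dir=$(echo "$dir" | sed 's|/|%2F|g')
echo "dir=$encoded_dir"    # prints dir=%2Fgfs.2013072218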
Here is a prototype script to "execute" the URL and download the file. In the script,
I've replaced the date codes and forecast hour with variables.
#!/bin/sh
#
# define URL
#
fhr=00          # forecast hour
hr=18           # cycle (run) hour
date=20130821   # date code, YYYYMMDD
URL="http://nomads.ncep.noaa.gov/cgi-bin/filter_gfs.pl?\
file=gfs.t${hr}z.pgrbf${fhr}.grib2&\
lev_500_mb=on&lev_700_mb=on&lev_1000_mb=on&\
var_HGT=on&var_RH=on&var_TMP=on&var_UGRD=on&\
var_VGRD=on&subregion=&leftlon=250&\
rightlon=330&toplat=60&bottomlat=20&\
dir=%2Fgfs.${date}${hr}"
# download file
curl "$URL" -o download.grb
# sleep so that repeated runs cannot turn into a denial of service when the file is missing
sleep 1
I copied the above script into a file, changed the date code, and chmod'ed it to 755.
It ran. If it doesn't run, the error message will be in download.grb.
ebis@linux-landing:/tmp$ ./test.sh
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 71185    0 71185    0     0    660      0 --:--:--  0:01:47 --:--:--  2821
ebis@linux-landing:/tmp$ wgrib2 download.grb
1:0:d=2013082118:HGT:500 mb:anl:
2:6406:d=2013082118:TMP:500 mb:anl:
3:9906:d=2013082118:RH:500 mb:anl:
4:12991:d=2013082118:UGRD:500 mb:anl:
5:18567:d=2013082118:VGRD:500 mb:anl:
6:24143:d=2013082118:HGT:700 mb:anl:
7:30549:d=2013082118:TMP:700 mb:anl:
8:34049:d=2013082118:RH:700 mb:anl:
9:37134:d=2013082118:UGRD:700 mb:anl:
10:42295:d=2013082118:VGRD:700 mb:anl:
11:47456:d=2013082118:TMP:1000 mb:anl:
12:51372:d=2013082118:RH:1000 mb:anl:
13:54457:d=2013082118:UGRD:1000 mb:anl:
14:59618:d=2013082118:VGRD:1000 mb:anl:
15:64779:d=2013082118:HGT:1000 mb:anl:
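One caution: when a request fails (for example, the file is not on the server yet),
the server's error message, not grib data, ends up in download.grb. Since grib files
begin with the four ASCII characters "GRIB", a quick sanity check is easy to add.
A sketch, using the file name from the script above:

# abort if download.grb does not start with the grib magic bytes
if [ "$(head -c 4 download.grb)" != "GRIB" ]; then
    echo "download failed; the server said:"
    cat download.grb
    exit 1
fi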
Converting the above script to download the required forecast hours, to run with the
current date, and to run as a cron job is scripting 101. If you have problems, ask your
local scripting guru for help. Want to convert the above bash script to Windows? I am
sure it can be done, and a Windows guru can help.
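As a starting point, here is one way such a script might look. It is a sketch, not
operational code: it assumes GNU date, 6-hourly files out to 48 hours, and the same
levels, variables and subregion as before; the output file names are my own.

#!/bin/sh
# download several forecast hours from today's 18Z GFS run
hr=18
date=$(date -u +%Y%m%d)
for fhr in 00 06 12 18 24 30 36 42 48
do
    URL="http://nomads.ncep.noaa.gov/cgi-bin/filter_gfs.pl?\
file=gfs.t${hr}z.pgrbf${fhr}.grib2&\
lev_500_mb=on&lev_700_mb=on&lev_1000_mb=on&\
var_HGT=on&var_RH=on&var_TMP=on&var_UGRD=on&\
var_VGRD=on&subregion=&leftlon=250&\
rightlon=330&toplat=60&bottomlat=20&\
dir=%2Fgfs.${date}${hr}"
    curl "$URL" -o "gfs.t${hr}z.pgrbf${fhr}.grb"
    # be polite to the server between requests
    sleep 1
done

A crontab entry along the lines of "0 22 * * * /path/to/get_gfs.sh" (the path is
hypothetical) would then run it once a day, a few hours after the 18Z cycle starts.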
Comments: the current grib-filter (as of 8/2013) is labeled as beta code but has been
in operations for many years. We will upgrade the grib-filter, but the old URLs will
remain compatible with the new grib-filter. The new grib-filter will allow interpolation
to different grids and extraction of values at lon-lat locations.