Radar coverage

What fraction of the US population is within x kilometers of a weather radar? We can answer this question for the lower 48 states using the w2merger caches, a grid of US population density, and digital elevation (terrain) data. The WDSS-II program is called coverageStats, and I ran it like this:

coverageStats -i $HOME/.w2mergercache/_55.000_-130.000_500.000___NMQWD_0.010_0.010___33_3500_7000/ -E conus/conusterrain.nc -P conus/nap10ag.asc.gz -o `pwd`/coverage -h 0:5 --verbose

I define a place as covered by a weather radar if it is scanned by that radar at a height below 5 km above ground level. The population density data is in Esri Grid format from Columbia University. The digital elevation data is from the USGS (the GTOPO30 dataset) and has been converted to netcdf using the WDSS-II tool topoBreak. Radar coverage information comes from the MRMS CONUS 1km resolution cache (created using createCache).

Here’s what the result looks like:

[Figure: pop_mindist_frac]
70% of the US population is covered by a weather radar that is less than 100 km away. A little less than 20% of the US population is either not covered at all or is covered only by a radar beam that is more than 5 km above ground level. Note that these numbers take beam blockage into account.

Here’s a map of what area is covered at what height.

[Figure: DistanceToNearestRadar_20130120-120000]
One thing to realize is that, because I started with the MRMS cache, parts of Canada are included in these statistics.

If you want to try out different assumptions (What if I drop radar X? Do not consider Department of Defense radars? Use height of 3km to 5km? etc.), feel free to run coverageStats yourself.
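For example, to recompute coverage for beam heights between 3 km and 5 km above ground level, you could keep the same inputs and change only the -h range (the cache, terrain, and population paths below are just the ones from my command above; the output directory is a new placeholder):

coverageStats -i $HOME/.w2mergercache/_55.000_-130.000_500.000___NMQWD_0.010_0.010___33_3500_7000/ -E conus/conusterrain.nc -P conus/nap10ag.asc.gz -o `pwd`/coverage_3to5 -h 3:5 --verbose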

 


ldm2netcdf now handles SAILS correctly

The implementation of the Supplemental Adaptive Intra-Volume Low-Level Scan (SAILS) on the WSR-88D radars presented a problem for the WDSS-II ingestor ldm2netcdf because it relied on VCP definitions stored in XML configuration files. Those XML files defined which elevation angle matched up with each tilt. However, SAILS can insert a supplemental 0.5 degree scan into the existing VCP at any time. Without any changes, ldm2netcdf would incorrectly label the new 0.5 degree tilt as the next expected tilt defined in its VCP XML file.

To solve this problem, ldm2netcdf now processes Message 5 in the Level-II data stream (the RDA Volume Coverage Pattern data) to map each incoming tilt to its correct elevation. The new 0.5 degree elevations get correctly labeled and saved just like any other 0.5 degree tilt.

Algorithms listening for 0.5 degree elevations will be notified of these new tilts just as they are for any other scan. Algorithms that listen to all tilts will insert them into the constantly updating virtual volume as the latest 0.5-degree tilt of data for that elevation. So, with this change to ldm2netcdf, downstream algorithms such as w2qcnndp, w2vil, and w2merger handle the SAILS tilts transparently.

If you do not want the SAILS elevations to be inserted into the data stream, you can specify the ‘-e’ option on the ldm2netcdf command line to separate out the extra SAILS tilts. The SAILS tilts will then be saved into a separate directory, such as Reflectivity_SAILS or AliasedVelocity_SAILS. We do not recommend this, as you are essentially throwing away the extra information.
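As a rough sketch, an invocation might look like the line below. Only the -e flag is the point here; the -i, -o, and -s input, output, and radar-name options are my assumption about the usual ingest arguments, so check ldm2netcdf -help (or however you normally run it) for the exact flags:

ldm2netcdf -i /data/ldm/raw -o /data/netcdf -s KTLX -e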

Finally, we took this opportunity to eliminate some outdated command line options in ldm2netcdf. The first is the ‘-D’ option for dealiasing: the dealiasing code in ldm2netcdf is very old, and the dealias2d command provides much better results. The second is the ‘-c’ option for compositing, which will be removed since w2vil does a much better job of creating composites.

The new changes are being tested and will be rolled out when all the kinks are worked out.

Reading in HRRR files

There are many occasions in which it is useful to read model data into WDSS-II. Oftentimes this data comes from the RUC/RAP model, and gribToNetcdf makes it very simple to get that data into a format readable by most of the other WDSS-II algorithms. However, if you are interested in using data from the higher resolution HRRR model, reading in the data is not quite so straightforward.
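For the RUC/RAP case, a minimal gribToNetcdf run might look something like the line below; treat the flags as an assumption on my part (I am assuming the standard WDSS-II -i/-o input and output convention, as with coverageStats above) and the paths as placeholders:

gribToNetcdf -i /data/rap/grib2 -o /data/rap/netcdf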

In order to get HRRR data into a format readable by WDSS-II, you must first create a configuration file specifying which meteorological variables you are interested in. The reason for the configuration file is simply to save time and space: each HRRR file contains over 100 variables, and if you are interested in only a few of them, why waste valuable processing time and disk space reading in data you do not need?

While this may all seem rather cumbersome at first, it is actually not too difficult. Just follow the steps below (a consolidated sketch of the commands appears after the list), and you will be working with HRRR data in no time.

  1. First of all, you need to make sure you have two environment variables set correctly: JAVA_HOME needs to be set to the location of your Java installation, and WDSSII_INSTALL_DIR needs to be set to the wdssiijava directory.
  2. Copy the file wdssiijava/example/griddataingest.xml into the current directory, and rename it hrrr.xml.
  3. Edit hrrr.xml. The inputDir and outputDir variables must be changed to match where your data is. The filenamePatterns variable must also be changed to match something in your input file names (e.g., if all of your files are named 20130106_****, you could set filenamePatterns to “2013”). Finally, the listVariables variable needs to be uncommented and set to “true”.
  4. Run: ./w2java.sh org.wdssii.ncingest.GridDatasetIngest ./hrrr.xml. This will create your configuration file, varList.xml. This file will contain all of the meteorological variables in your HRRR data.
  5. Edit varList.xml. For each variable that you are interested in utilizing, change its value from “false” to “true”.
  6. Edit hrrr.xml. Set listVariables to false.
  7. Run ./w2java.sh org.wdssii.ncingest.GridDatasetIngest ./hrrr.xml once more, and voila, your netcdf HRRR files will be waiting for you in the directory that you specified.
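Putting the steps together, the whole sequence looks roughly like this; the exported paths are placeholders for your own Java installation and wdssiijava checkout, and the w2java.sh invocation is exactly the one from steps 4 and 7:

export JAVA_HOME=/usr/java/latest              # placeholder: your Java installation
export WDSSII_INSTALL_DIR=$HOME/wdssiijava     # placeholder: your wdssiijava directory
cp $WDSSII_INSTALL_DIR/example/griddataingest.xml ./hrrr.xml
# edit hrrr.xml: set inputDir, outputDir, filenamePatterns, and listVariables to "true"
./w2java.sh org.wdssii.ncingest.GridDatasetIngest ./hrrr.xml    # writes varList.xml
# edit varList.xml (flip the variables you want to "true"), then set listVariables to "false" in hrrr.xml
./w2java.sh org.wdssii.ncingest.GridDatasetIngest ./hrrr.xml    # writes netcdf files to outputDir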

For future processing, you will simply need to change the inputDir and outputDir variables in hrrr.xml. If you are interested in processing different or additional variables, simply change those variables from “false” to “true” in varList.xml.
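As an illustration of what that edit looks like, a varList.xml entry might resemble the hypothetical fragment below; the element layout and variable names here are my guess (the varList.xml generated in step 4 shows the real layout and the exact HRRR variable names), and the only point is flipping a value from false to true:

<!-- hypothetical varList.xml fragment; element and variable names are illustrative -->
<variable name="Temperature_height_above_ground">true</variable>
<variable name="Pressure_surface">false</variable>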

In order to use wdssiijava, you need to have Java 7 or higher installed and in your PATH.

VCP dependence removed from WDSS-II

In the past, in order to create a volumetric product in WDSS-II, it was required that the VCP used by the radar be known. This was not a problem for users working with data from the WSR-88D network, but for those utilizing data from outside of that network, a few extra steps were required, including the creation of a “fake” VCP file that contained the levels at which the radar had scanned.

However, the WSR-88D network recently added two new concepts to its scanning strategies. The Automated Volume Scan Evaluation and Termination (AVSET) concept allows a site to skip higher-elevation scans when no storms are detected. The Supplemental Adaptive Intra-Volume Low-Level Scan (SAILS) gives radars in the 88D network the capability of adding a supplemental 0.5 degree scan at any time.

While AVSET and SAILS have many advantages, the combination of these concepts has made using a radar's VCP to help build volumetric products unreliable. Therefore, rather than depending on the VCP to build virtual volumes, we have removed the VCP dependence from all of our products. This means that when working with data from outside the WSR-88D network, including data from outside the US, users no longer need to create these “fake” VCP files, nor does the VCP need to be defined in the data. Users simply need to make sure that an appropriate expiry time is specified for each scan (using the ExpiryInterval attribute in the netcdf files) so that old data ages off in a timely fashion.
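If your existing netcdf files lack an expiry time, one way to add one is with the NCO utility ncatted, as in the sketch below. This assumes ExpiryInterval is a global integer attribute and that the value is in seconds; check one of your WDSS-II netcdf files with ncdump -h to confirm the attribute's location, type, and units before relying on this, and note that the filename is a placeholder:

ncatted -O -a ExpiryInterval,global,o,l,900 KTLX_Reflectivity_20130120-120000.netcdf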

Algorithms affected include w2vil and w2circ.