Tile Cache Management - Change Detection Level

*(Many thanks to the folks at the City of Portland GIS Dept for their inspiration and debugging on this process)*

Creating and maintaining large Tile Caches can be a time-consuming process, especially when the underlying data is constantly being updated throughout your organization. In a perfect world, each and every change would be recorded and those records could be used to update only a subsection of the Tile Cache. If you can obtain a list of areas where changes have occurred, you can use an Inclusion Mask (see Advanced Options on the Tile Exporter Window) to tell the Arc2Earth tile exporter where to create new tiles. This can work really well, but in practice (especially in large organizations) it tends to break down for these reasons:

  1. Changes can occur from multiple ArcMap instances
  2. Changes can occur from different departments with datasets being merged together afterwards
  3. Changes can occur from outside the ArcGIS ecosystem (SQL databases, etc.)
  4. Attribute value changes can occur from many different sources

For these reasons, getting a list of all areas in your Tile Cache that have changed is difficult and error-prone. The only solution would be to recreate the entire cache on a regular basis, which is perfectly acceptable for small caches but can be a non-starter for even medium-sized ones.

**Change Detection Level**

In an attempt to alleviate (not solve) some of these issues, we've added some new functionality to Arc2Earth that will try to detect these changes automatically. As it turns out, one of the most efficient ways to do this is by examining the pixels of the tile images themselves.

The basic idea is to compare the tiles being drawn with the tiles that already exist in the cache. If the pixel signature of each tile matches exactly, it can be assumed that no features changed in that area. This comparison only occurs on a single level within your cache, and compared to the time it can save, the actual comparison time is trivial.
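To make the single-level comparison concrete, here's a minimal sketch of enumerating the tiles at the detection level that cover a map extent, i.e. the set of tiles whose signatures would be compared against the cache. It uses standard XYZ/Web Mercator tiling math rather than anything Arc2Earth-specific, and the function names and sample extent are assumptions for illustration only.

```python
import math

def lonlat_to_tile(lon, lat, zoom):
    """Convert a lon/lat in degrees to XYZ tile indices at the given zoom."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

def detection_level_tiles(xmin, ymin, xmax, ymax, zoom):
    """All (x, y) tiles at the detection level that cover the map extent."""
    x_left, y_top = lonlat_to_tile(xmin, ymax, zoom)       # top-left corner
    x_right, y_bottom = lonlat_to_tile(xmax, ymin, zoom)   # bottom-right corner
    return [(x, y)
            for x in range(x_left, x_right + 1)
            for y in range(y_top, y_bottom + 1)]

# Hypothetical map extent (roughly the Portland, OR area) at detection level 8
tiles = detection_level_tiles(-123.2, 45.2, -122.2, 45.8, 8)
print(len(tiles), "tiles to compare against the cache")
```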

Feature geometry changes (add/update/delete) obviously affect the drawing of the tile, but attribute changes also affect the signature because they usually drive feature renderers. Once a tile is deemed identical, it can be inferred that all tiles below it are also identical (see caveats below), and this is where the process can really save a large amount of time. Entire areas of the cache can be eliminated from the export and, more importantly, those are the tiles in the lower levels of the cache where most of the export time is spent.
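As a rough illustration of why this matters, here's a back-of-the-envelope sketch, assuming a standard quadtree scheme where each tile has four children; the level numbers are just the 5-10/8 example used in the workflow below.

```python
def tiles_pruned_below(detection_level, end_level):
    """Descendant tiles (levels detection_level+1 .. end_level) under one unchanged tile."""
    return sum(4 ** d for d in range(1, end_level - detection_level + 1))

# Example: cache levels 5-10 with the detection level at 8
print(tiles_pruned_below(8, 10))  # 4 + 16 = 20 tiles skipped per unchanged level-8 tile
```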

Here are two sample tiles that show how the pixel signature is calculated. The new tile has a random marker graphic added to simulate an edit to the datasource. When creating a signature, instead of checking each pixel, an MD5 hash is calculated directly from the bytes of the file. The resulting hashes are compared, and when they differ, the geographic extent of the tile is added to the Inclusion Mask for the export. All subsequent levels will use the Inclusion Mask instead of the full extent (a sketch of this comparison follows the sample hashes below).

New Tile Hash: CcIjUx1JDBk+t29cMyGVDQ==
Existing Tile Hash: omX8p+Y/WnrpzKWdrZAcnQ==
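Here is a minimal sketch of that byte-level comparison. The tile paths are hypothetical, and the base64-encoded MD5 digest is an assumption chosen to match the format of the sample hashes above; the actual file layout Arc2Earth uses is not shown here.

```python
import base64
import hashlib

def tile_signature(tile_path):
    """Hash the raw bytes of a tile image file -- no per-pixel comparison needed."""
    with open(tile_path, "rb") as f:
        return base64.b64encode(hashlib.md5(f.read()).digest()).decode("ascii")

# Hypothetical paths for a freshly drawn tile and its cached counterpart
new_sig = tile_signature("export/8/40/91.png")
old_sig = tile_signature("cache/8/40/91.png")

if new_sig != old_sig:
    # In the real export, this tile's geographic extent would be added
    # to the Inclusion Mask used by all lower levels
    print("change detected at tile 8/40/91")
```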

Here's a general workflow:

  1. You need an existing cache to compare against, so make sure you've run the full export at least once.

  2. On the next export, choose a zoom level within your cache that will serve as the detection level. In general, this should be two levels above your end level (e.g. for tile cache levels 5-10, choose 8 as the CDL). However, you should test this value with your actual data/map to come up with the best possible level.

  3. To simulate a change to the underlying data, add a simple marker graphic to the map for testing.

  4. Run the export and watch the Progress Log. When your Change Detection Level is exported, note the detected area in the log.

  5. All levels below the Change Detection Level will still be exported; however, they will use the Inclusion Mask (a list of extents to export) rather than the entire map extent. There will not be much difference in time for your small test, but when it's done on a large extent/cache, the time savings can be dramatic (see the sketch below).
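A minimal sketch of how that Inclusion Mask filtering could work; the extent representation, function names, and coordinates are assumptions for illustration, not the actual Arc2Earth implementation.

```python
def intersects(a, b):
    """Axis-aligned extent test; extents are (xmin, ymin, xmax, ymax)."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def should_export(tile_extent, inclusion_mask):
    """Export a lower-level tile only if it touches a changed area."""
    return any(intersects(tile_extent, changed) for changed in inclusion_mask)

# One changed extent detected at the Change Detection Level (hypothetical coordinates)
inclusion_mask = [(-122.70, 45.50, -122.65, 45.55)]

print(should_export((-122.69, 45.51, -122.68, 45.52), inclusion_mask))  # True: re-render
print(should_export((-122.40, 45.30, -122.39, 45.31), inclusion_mask))  # False: skip
```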

Change Detection Caveats:

  1. Choosing the proper level can be error-prone. Going too high will potentially miss underlying changes, while going too low obviates the benefits of the process.
  2. If you make use of Scale Dependent Visibility or Renderers, make sure that your level is set low enough to determine if values (features) from that layer are taken into account.
  3. Counterintuitively, tiles that are drawn with Anti-Aliasing are actually more accurate when detecting change.