These graphs show in a little more detail what our crawls have been yielding. While the frequency of new URLs per site does seem to fall off hyperbolically, the pattern is consistent: the total number of new URLs is about the same for each crawl, and the distribution looks about the same as well.
For these graphs, I have omitted frequencies <= 5. On average, around 950 sites per crawl have <= 5 new pages.
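To make the counting concrete, here is a rough sketch of how the per-site new-URL counts and the plotted frequency distribution could be computed. This is not the actual pipeline; the crawl snapshots are assumed to be dicts mapping each site to the set of URLs downloaded for it.

```python
from collections import Counter

def new_url_counts(prev_crawl, curr_crawl):
    """Per-site count of URLs seen in the current crawl but not the previous one.

    prev_crawl / curr_crawl: assumed dicts mapping site -> set of URLs downloaded.
    A site that yielded no pages at all simply has an empty set,
    and so contributes a zero to the distribution.
    """
    return {site: len(curr_crawl.get(site, set()) - prev_crawl.get(site, set()))
            for site in curr_crawl}

def histogram(counts, min_count=6):
    """Frequency distribution of new-URL counts, omitting counts <= 5
    as in the graphs above."""
    return Counter(c for c in counts.values() if c >= min_count)
```

For example, a site whose current crawl adds one URL contributes a 1 to the counts but is dropped from the histogram by the <= 5 cutoff.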
Why So Many Zeroes?
After digging through the crawls, I uncovered why so many sites have zero new pages. Any time there's a broken or moved link, wget either downloads nothing, or downloads only an "index.html" file for the moved page and then stops. Either way, no real pages are downloaded for that site, so nothing new turns up when the pages are checked for redundancy.
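A simple post-crawl check along these lines could flag sites that hit this failure mode. This is only a sketch of the heuristic, assuming each site's wget output lands in its own directory:

```python
import os

def crawl_looks_broken(site_dir):
    """Heuristic flag for the failure mode described above: wget either
    downloaded nothing, or downloaded a lone index.html from a moved
    page and then stopped.

    site_dir: assumed to be the output directory for one site's crawl.
    """
    if not os.path.isdir(site_dir):
        return True  # wget never created the directory: nothing downloaded
    files = [name for _, _, names in os.walk(site_dir) for name in names]
    if not files:
        return True  # directory exists but is empty
    # A single index.html and nothing else suggests a moved page
    return len(files) == 1 and files[0] == "index.html"
```

Sites flagged this way would account for the zeroes without indicating anything wrong with the sites themselves.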
As for the sites with very few new URLs, I can't find anything wrong with the crawls. It just seems that those sites don't update their content as regularly.