From: Gerald Rode <>
Date: Sat, 7 Apr 2007 14:28:32 -0700

I attached a breakdown of the scraped torrent data by site. This data
only represents active torrents (those still found on their respective
sites). There's a large set of null hashes, which appears to represent
earlier scraped data, since we currently retrieve hashes with a high
success rate. I'm assuming we want these null hashes to be updated.
Again, for sites that require downloading the torrent to retrieve the
hash, the process becomes much slower, especially when the site places
limits on downloads - for example Snarf-it, which I have concluded
limits an IP to 30 downloads. Right now I am using a few web proxies to
get around such download limits. Also, as the data may show, some sites
may need more attention in the searching scheme, as a low URL count
would suggest - either that, or they just don't have much content.
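For what it's worth, the proxy rotation I'm using amounts to cycling
through a proxy list and switching before any single IP reaches the
per-site cap. A minimal sketch of the scheduling logic (the proxy
addresses and the 30-download cap are placeholders; Snarf-it's actual
limit may differ):

```python
import itertools

# Hypothetical proxy pool; real addresses would go here.
PROXIES = ["http://proxy-a:8080", "http://proxy-b:8080", "http://proxy-c:8080"]

# Assumed per-IP download cap, based on the ~30-download limit
# observed on Snarf-it.
DOWNLOADS_PER_PROXY = 30

def proxy_schedule(num_downloads):
    """Yield (download_index, proxy) pairs, rotating to the next proxy
    every DOWNLOADS_PER_PROXY downloads so no single IP exceeds the cap."""
    proxy_cycle = itertools.cycle(PROXIES)
    proxy = next(proxy_cycle)
    for i in range(num_downloads):
        if i > 0 and i % DOWNLOADS_PER_PROXY == 0:
            proxy = next(proxy_cycle)  # cap reached: move to next proxy
        yield i, proxy

# Example: 65 downloads spread across three proxies under a 30-per-IP cap.
assignments = list(proxy_schedule(65))
```

The actual fetching just attaches the scheduled proxy to each request;
the scheduling above is the only interesting part.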

Received on Fri Sep 14 2007 - 10:56:24 BST

This archive was generated by hypermail 2.2.0 : Sun Sep 16 2007 - 22:19:49 BST