Dev Status Update - 13 April 2007

From: Ben Grodsky <>
Date: Fri, 13 Apr 2007 13:05:22 -0700



* proxy installer w/ QA

        * one issue identified already: receiving anything other than the original client set doesn't do anything

* did publish for Push

        * PHP putting stuff into the db

        * talked w/ Ivan about d/ling queue, have to talk to Jay about search frame ???

* started looking at relateds ???


* edit queue working

* have to work on getting sources from available supply INSTEAD of dumps

* have to work on Search and Filter

        * Search interfaces w/ Ty's stuff

        * so, Search is more important than Favorites

* working on Favorites and GeoID stuff

        * geoID more important

        * geoID should be done today

* some soulseek issues

        * may be resolved by adding server interfaces

* synced out new version of the ed2k server, so it's easier to deal w/ server lists


* 5 acceptances for new term.

* Hahn getting data from Danny

        * happy so far.

* BT

        * changed trackers

        * isohunt wiped out all stuff

        * new tracker got banned b/c we used it w/ old user names

        * did well on Korn test

        * getting Paramount back up


* gnutella host browser

        * using Sujay's host IPs

        * running for 1 week

        * 30mm rows??? of data per day

* working on EDU - IP now

* ares table for data feed is corrupted.

        * Tables on this MI server are having issues for Ty and Sujay

        * 9esh isn't having problems w/ tables he creates here

* Danny was transferring the data feed manually for Univ Studs.

        * Now automatically repairing the table on delay.

        * Repairing tables shouldn't be automatic, as it locks the db for the time it's being repaired (usually 30 min)
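One way to avoid the automatic-repair lock noted above would be to gate the repair on a check, so the db only locks when the table is actually corrupted. A minimal Python sketch, assuming a hypothetical `run_sql()` helper; the table names and the stubbed client are illustrative, not the real feed code:

```python
def run_sql(statement):
    # Stub for illustration; the real version would use a MySQL client.
    # Pretend the ares feed table is the only corrupted one.
    table = statement.split()[2].strip("`")
    if statement.startswith("CHECK TABLE") and table == "ares_feed":
        return "corrupt"
    return "OK"

def maybe_repair(table):
    """Issue REPAIR TABLE only if CHECK TABLE reports a problem."""
    if run_sql(f"CHECK TABLE `{table}` FAST") == "OK":
        return False  # healthy: skip the ~30 min lock a repair would take
    run_sql(f"REPAIR TABLE `{table}`")
    return True

print(maybe_repair("ares_feed"))  # corrupted -> repaired -> True
print(maybe_repair("bt_feed"))    # healthy -> left alone -> False
```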


* proxy server on 6 machines now

* minimal resource usage

* run at this level for a few days

        * if no problems, let Jay know how to scale-up testing

        * client doesn't have much functionality

        * test proxy part and the light/distinct ed2k spoofing


* Sujay's, Andrew's and Jeff's website computer is not stable.

        * Moving to another machine

* making some changes for Timbaland one-off

* should have the Google map for MI by next week

* changes for Billboard for Hahn.


* snarf-it

        * only allows 30 downloads/day/IP

                * kluged work-around cycling thru dev IPs in office and web proxies

        * QA warned of searching limitation
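The IP-cycling work-around above could look roughly like this; a sketch only, with made-up addresses and the 30/day cap taken from the notes:

```python
from itertools import cycle
from collections import Counter

DAILY_CAP = 30  # snarf-it's per-IP daily download limit
# Hypothetical pool of office dev IPs and web proxies to rotate through.
proxies = ["10.0.0.11", "10.0.0.12", "proxy-a.example", "proxy-b.example"]

usage = Counter()
pool = cycle(proxies)

def next_proxy():
    """Return the next address that still has quota, or None if all are spent."""
    for _ in range(len(proxies)):
        p = next(pool)
        if usage[p] < DAILY_CAP:
            usage[p] += 1
            return p
    return None

# With 4 addresses the pool supports 4 * 30 = 120 downloads per day.
grabs = [next_proxy() for _ in range(121)]
print(grabs.count(None))  # the 121st request finds every address exhausted -> 1
```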

* connection issues for the BT searcher fixed

        * changed some code to better deal w/ HTML


* gnutella demand collection stopped

* ramping up # of machines to see when we hit the limiter threshold

* machine where the stable version was running crashed, so 3 days of bad/no data

* test system was collecting data, but since it's only for testing, the data from there isn't much more stable

* moving to the LB system for searching b/c the # of searches is decreasing with additional tracks

* initial results of Gnutella protection

        * hitting about as hard as Ivan

        * top results a little higher than Ivan's

        * lower results aren't placing as well as Ivan's stuff

        * start transitioning from Ivan's leaf protection --> your ultra-peer protection.

                * Turn on one cabinet at DCa - these should be better computers

                * the DCa bandwidth has been mostly resolved.

                * Goals

                        * get Ivan computers

                        * probably start covering non-singles

        * Basic issue is that the DCb machines being used mostly hover at 80% CPU

        * Ivan using

                * PM2 - 5 cabinets of 3 GHz Dells

                * PM1 - using ~1.7 Gbps

        * Sujay

                * PM1

                        * using 35 servers ( of them crappy)

                        * using ~400 Mbps


* generally optimizing

* queries weren't going thru completely b/c of a db server connection bottleneck

* running hopefully-stable code now to process a full day of Ares data

* might have another gnutella-day of data processed by this afternoon.

* Have to cut through the crap

        * porn on gnutella

                * probably easier to attack

                * adds an extra layer of kluge: using gnutella to drive ares searches to filter out Spanish.

        * spanish on ares


* thumbnail generator issues

        * overcame some black screen thumbnail issue

        * basic filtering formula to identify good thumbnails: take 60 screen caps over x-period of time, then pick the thumbnail with the largest file size.

                * Some artifacts (black squares) showing up in big view

                * artifacts issue isn't critical, now just concerned w/ how well the thumbnailing service scales w/ increased users on page.

                * Stephan's command-line version allows increased RAM to be available

                        * java pukes (virtual memory spew)
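The filtering formula above (take 60 caps, keep the biggest file) can be sketched as follows; `capture_frame()` is a stand-in for the real screen-cap step, with black frames faked as tiny files since they compress to almost nothing:

```python
import random

def capture_frame(timestamp):
    # Stub: return (timestamp, byte_size) for a JPEG grabbed at that time.
    # Pretend the first few seconds of the clip are black; black frames
    # compress to tiny files, so file size is a cheap quality proxy.
    is_black = timestamp < 5
    size = 900 if is_black else random.randint(20_000, 90_000)
    return (timestamp, size)

def pick_thumbnail(duration_secs, num_caps=60):
    """Capture num_caps frames evenly over the clip, keep the biggest file."""
    step = duration_secs / num_caps
    caps = [capture_frame(i * step) for i in range(num_caps)]
    return max(caps, key=lambda cap: cap[1])

best_time, best_size = pick_thumbnail(600)
print(best_time >= 5, best_size >= 20_000)  # -> True True (black frame never wins)
```

The same size heuristic is what makes the black-square artifacts mostly a display issue rather than a selection issue: an all-black capture can't win the file-size comparison.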

* working on publishing side

        * networking drive is long-term solution

        * Stephan may be able to re-design the applet to handle more than 1 transfer