If each aggregator agent gets a slice of the database, how many feeds do you think each can reasonably handle in a timely manner without chewing up bandwidth and processor? 10k maybe?

I'm thinking that each node would cycle through their list every 20 minutes. That'd be roughly 72 checks per day, per feed.

If the average feed size is 150 KB, and conditional requests (see below) mean a feed only gets fully downloaded when it actually updates, that puts the total download burden around 1 GB per day. That seems reasonable to me if someone knows what they are doing and chooses to participate.
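
Back-of-envelope, using the numbers above plus an assumed update rate of roughly once per feed per day:

  24 h / 20 min                = 72 checks per feed per day
  10,000 feeds x 150 KB        = ~1.5 GB for one full-download pass
  72 x 1.5 GB                  = ~108 GB/day if every check pulled the whole feed
  ~1 update per feed per day   = ~1.5 GB/day actually transferred

So the ~1 GB/day figure holds as long as conditional requests turn most of those 72 checks into header-only exchanges, and plenty of feeds update less often than daily.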

The agent would kick off checks for all 10,000 feeds at once and just wait for the batch to finish. The checking process would also use HEAD/Last-Modified and the usual tricks to keep from wasting bandwidth when nothing has changed.
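
Here's a rough sketch (Go, standard library only) of what one cycle could look like. The feed list, the lastSeen cache, and the concurrency cap of 200 are all made-up placeholders, and it uses a conditional GET with If-Modified-Since rather than a separate HEAD, which is the same not-modified trick with one less round trip:

package main

import (
	"fmt"
	"net/http"
	"sync"
	"time"
)

// lastSeen maps feed URL -> the Last-Modified value saved from the
// previous successful download (hypothetical in-memory cache).
var lastSeen sync.Map

func checkFeed(client *http.Client, url string) error {
	req, err := http.NewRequest(http.MethodGet, url, nil)
	if err != nil {
		return err
	}
	// Conditional request: server only sends the body if the feed changed.
	if lm, ok := lastSeen.Load(url); ok {
		req.Header.Set("If-Modified-Since", lm.(string))
	}
	resp, err := client.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()

	switch resp.StatusCode {
	case http.StatusNotModified:
		return nil // nothing changed; no bandwidth spent on the body
	case http.StatusOK:
		// ... parse resp.Body here ...
		if lm := resp.Header.Get("Last-Modified"); lm != "" {
			lastSeen.Store(url, lm)
		}
		return nil
	default:
		return fmt.Errorf("%s: unexpected status %d", url, resp.StatusCode)
	}
}

func runCycle(urls []string) {
	client := &http.Client{Timeout: 30 * time.Second}
	sem := make(chan struct{}, 200) // cap on in-flight requests (tunable)
	var wg sync.WaitGroup
	for _, u := range urls {
		wg.Add(1)
		sem <- struct{}{}
		go func(u string) {
			defer wg.Done()
			defer func() { <-sem }()
			if err := checkFeed(client, u); err != nil {
				fmt.Println("check failed:", err)
			}
		}(u)
	}
	wg.Wait() // wait for the whole batch before the next cycle
}

func main() {
	urls := []string{"https://example.com/feed.xml"} // placeholder list
	for {
		runCycle(urls)
		time.Sleep(20 * time.Minute) // the 20-minute cycle from above
	}
}

Capping concurrency instead of literally opening 10,000 connections at once keeps sockets and file descriptors under control, while still finishing each pass well inside the 20-minute window.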
