The scenario: servers produce files (archives) locally, and those archives then need to be accessible on demand to all the other servers, as fast as possible. I'm having trouble deciding which route to take.
1. Purchase a dedicated server with a high-speed network link and large storage to hold the archives. The other servers would talk to this central repository and download archives as needed.
2. A variation of the above: create a P2P file-sharing network between the servers themselves. This would eliminate the single point of failure and scale nicely as new nodes are added.
I would like to hear your input. Thanks!