My organization would like to share files via the gnutella network from one of our Windows 2000 servers. We need a reliable method that provides maximum uptime without requiring a user to stay logged into the system just so that BearShare or another standard gnutella client can run. Is anyone aware of a gnutella client that can be configured to run as an NT service?
We could run the service on a Linux box if necessary, but we don’t have one set up, making NT preferable.
You do realize that this is a really bad idea? As soon as your server is discovered (which it will be, eventually, if it’s exposed to the Internet; there are people who scan the net hunting for new servers), it will become linked into the main Gnutella network and you’ll become part of one of the largest MP3-swapping networks in the country. This will chew up your bandwidth; in my experience, just passing the catalog and query metadata through the network uses about 300 kb/s of your connectivity.
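To put that figure in perspective, here’s a back-of-the-envelope calculation (assuming “300 kb/s” means 300 kilobits per second; the constant is just that poster’s estimate):

```python
# Rough cost of Gnutella routing overhead at a sustained 300 kbit/s.
RATE_KBITS_PER_SEC = 300

bytes_per_sec = RATE_KBITS_PER_SEC * 1000 / 8   # kilobits -> bytes: 37,500 B/s
gb_per_day = bytes_per_sec * 86_400 / 1e9       # 86,400 seconds per day
gb_per_month = gb_per_day * 30

print(f"{gb_per_day:.1f} GB/day, {gb_per_month:.0f} GB/month")
# → 3.2 GB/day, 97 GB/month
```

That’s roughly 97 GB a month of metadata before you serve a single file — real money on a metered circa-2001 business line.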
Use a web server. There’s a standard client available for virtually any platform, and you won’t have to worry about being co-opted like this.
We are using a web server already, but we’re hoping to expand our audience by reaching users of the file sharing services. Kind of like advertising on cable and network, I suppose.
You raise a valid concern regarding bandwidth. I’m wondering, is there a way to throttle the amount of bandwidth used by a gnutella client to a reasonable level?
Paul

Yeah.
Don’t hold strong opinions about things you don’t understand.
Not with the Gnutella client itself. If your routers are capable of traffic shaping or traffic limiting, you can restrict total network traffic on the Gnutella service port, but this will also reduce your ability to serve files.
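For the curious, the shaping a router does at the port level is conceptually just a token bucket. A minimal sketch in Python (the class, names, and numbers are illustrative, not anything a real Gnutella client or router exposes):

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: the same idea a traffic-shaping router
    applies per port, here capping sends at `rate_bytes_per_sec`."""

    def __init__(self, rate_bytes_per_sec, burst_bytes):
        self.rate = rate_bytes_per_sec
        self.capacity = burst_bytes      # max tokens the bucket can hold
        self.tokens = burst_bytes        # start full: allow an initial burst
        self.last = time.monotonic()

    def consume(self, nbytes):
        """Return True if nbytes may be sent now; False means wait/drop."""
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return True
        return False

# Cap at 37,500 bytes/s (~300 kbit/s), allowing 8 KB bursts.
bucket = TokenBucket(rate_bytes_per_sec=37_500, burst_bytes=8_192)
```

The point stands, though: shaping at the router throttles the port, it doesn’t make the Gnutella client itself any smarter about bandwidth.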
In general, p2p systems use at least an order of magnitude more bandwidth than client-server systems to accomplish the same goal, and are less reliable to boot. There are also serious security and data-integrity concerns with almost all p2p networking systems (the exception seems to be FreeNet, which uses public-key cryptography to sign content, providing security and data protection at the expense of even more bandwidth). All this leads me to the conclusion that there is no business-viable use for p2p networking at this time.
Well, not quite. You raise valid points, but they’re all moot if he’s talking about setting up a private gnutella network on his trusted network, behind his firewall, with boatloads-o-bandwidth (relatively speaking).
To answer the original question: the Windows 2000 Resource Kit has a program called srvany.exe which lets you run any program you want as a service. Obviously, you should think very hard about what sorts of vulnerabilities this could open, but technically speaking, it’s not hard.
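Concretely, the srvany setup is two steps: register the wrapper as a service with instsrv, then point it at the program via the registry. A sketch, assuming the Resource Kit tools are installed under C:\Reskit (the service name “Gnutella” and all paths here are placeholders; check the Resource Kit docs for your install):

```bat
:: Register srvany.exe as a service named "Gnutella" (name is arbitrary)
instsrv Gnutella C:\Reskit\srvany.exe

:: Tell srvany which program to launch, via the service's Parameters key
reg add HKLM\SYSTEM\CurrentControlSet\Services\Gnutella\Parameters /v Application /d "C:\Program Files\BearShare\BearShare.exe"

:: Start the service
net start Gnutella
```

Note the program then runs under the service account with no interactive desktop, which is exactly why you should weigh the vulnerability question first.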
In such an environment, there is little reason not to use a client-server model, though. And you still have data integrity issues (unless you’re a Microsoftie and believe that turtle networks really are a good idea).
Sure there is… one of the same reasons peer-to-peer is useful on the Internet: you don’t need a large centralized server that takes effort to maintain, yet you still get discoverability of the files that are out there.
You’re right: the bandwidth costs are much higher. But if it’s warranted by the effort you save and the collaboration you foster, do it. No sense fretting about “wasted” bandwidth if it’s actually buying you something worthwhile.