It seems to me that many sites are wasting a lot of bandwidth by providing file downloads over HTTP. If 10,000 users download a 20MB file, the website uses 200,000MB of bandwidth, assuming nothing fancy is done by the ISPs, like caching, etc. (I'm just trying to provide a simple example.)

Here's my idea: Rather than providing a direct link to the desired file, give a link to a torrent associated with it. By itself, this would definitely fail. If the file isn't popular, no one will be able to get it. So the key to this idea is that the webserver would need to seed the file.

This is how I see it:

  • Worst case scenario:

    • The file is something really obscure, something that only one person in the world would want.
    • Eventually, this person finds out where to get the file, and starts to use the torrent.
    • Since it is only the webserver seeding, the person downloads at the same rate he would have over HTTP.
    • There's some overhead for running the BitTorrent client, and perhaps a tracker as well. (I don't really know anything about torrent trackers.)
  • Best case scenario:

    • The file is something that everyone really wants, like the latest Windows update.
    • The initial downloaders download at the base rate, but later downloaders can download from the server, as well as their peers.
    • Users are able to download the file as fast as they possibly can.
    • The cost to the company hosting the file asymptotes.
    • A month later, the one person who hasn't turned their computer on for a long time connects and downloads from the webserver, since no one else is downloading anymore.

Right now, there are some obstacles that might turn people off of the idea:

  • You need to run a BitTorrent client on the server. There's some overhead associated with this.
  • You have to add another file to the client whenever you offer another file for download.
  • BitTorrent hasn't been integrated into the popular web browsers. Your grandmother doesn't want to download another program to download... whatever it is that grandmothers might be interested in downloading...

Pretend that these problems have been solved. Are there any other problems with this idea that I haven't thought of?

Also, are there any initiatives out there that are trying to promote this?


Here's an anecdote

BitTorrent only seems to work well if you set up your ADSL router/firewall/etc to pinhole a port through to your computer, so it can listen for incoming connections.

Without this, it generally seems to run dog-slow.

Pinholing ports is well beyond the ability of most internet users, so these users perceive that BitTorrent is dog-slow.

Nobody wants to use something that is dog-slow unless they have to (e.g. for piracy).

Conclusion: zero demand (and probably a backlash) if people provided BitTorrent for downloads instead of HTTP.

In the case of WoW, my friends that play it all have a list of HTTP mirror sites and go download their patches via those, as it is far quicker for them to get the files this way.



  1. Browsers don't have built in Torrent Clients.
  2. Nobody wants to seed some random file. They will download it and then close the client.
  3. People only seed files on "sharing" sites that punish users for bad ratios. You can't do that on a regular site.

Blizzard uses a BitTorrent-like technology to distribute patches for their popular World of Warcraft game...

I think this faces challenges on a lot of fronts:

  • Searching and indexing of torrents is currently done by a strange consortium of tracker sites, many of which track torrents for content of dubious legal standing. There is no Google-like player in the space to find stuff.

  • The reliance on BT as an active means of distributing illegal movies and music has prompted bodies like the MPAA and RIAA to pursue ISPs with a very heavy hand... to the point that some ISPs actively try to block torrent traffic, legal or not.

  • I wonder what the bandwidth costs would be for "mega" seeders that are always up and running w/ a desired file. From a sheer network cost perspective, I wonder if the net cost ends up equalizing over time.

  • There's really no way to "roll back" a file. If I'm a software distributor and I realize I just shipped a bad version of a file to the public and need to correct the situation, once that file is in the public torrent stream, it's just out there. At best I can open up a second torrent with the corrected file, but there's nothing forcing clients to get the newer version instead of the one with the bug, security hole, or whatever else made me, as the IP owner, want to squash that file.

I think in specialized situations like the WoW patcher it works (mostly). But as a means for replacing HTTP downloads period? Don't see it happening anytime soon. Hell, FTP and gopher are still out there... :P


This is already part of the BitTorrent spec: you can add an "HTTP seed" (web seed) to a torrent file. All the clients I've tried recently understand this bit of the spec.

So your server becomes one of the peers, simply by hosting the file in the normal http way.
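As a rough sketch of what that looks like on the wire, here is the shape of a metainfo (.torrent) file carrying a web-seed URL via the `url-list` key from the web-seeding extension. The tracker and file URLs are hypothetical, and the bencoder is a minimal illustration, not a complete implementation:

```python
# Sketch: a metainfo dict with a "url-list" web-seed entry, bencoded.
# URLs are hypothetical; the bencoder below handles just what we need.
import hashlib

def bencode(obj):
    """Minimal bencoder for ints, str/bytes, lists, and dicts."""
    if isinstance(obj, int):
        return b"i%de" % obj
    if isinstance(obj, str):
        obj = obj.encode()
    if isinstance(obj, bytes):
        return b"%d:%s" % (len(obj), obj)
    if isinstance(obj, list):
        return b"l" + b"".join(bencode(x) for x in obj) + b"e"
    if isinstance(obj, dict):
        # bencoding requires dictionary keys in sorted order
        return b"d" + b"".join(bencode(k) + bencode(v)
                               for k, v in sorted(obj.items())) + b"e"
    raise TypeError(type(obj))

data = b"x" * (2 * 16384)  # stand-in for the file contents
piece_len = 16384
pieces = b"".join(hashlib.sha1(data[i:i + piece_len]).digest()
                  for i in range(0, len(data), piece_len))

metainfo = {
    "announce": "http://tracker.example.com/announce",  # hypothetical tracker
    # Web-seeding clients will also fetch pieces from this plain HTTP URL,
    # so the ordinary web server acts as a seed.
    "url-list": ["http://www.example.com/files/bigfile.bin"],
    "info": {
        "name": "bigfile.bin",
        "length": len(data),
        "piece length": piece_len,
        "pieces": pieces,
    },
}

torrent = bencode(metainfo)
print(torrent[:11])  # b'd8:announce'
```

The point is that the only server-side additions over plain HTTP hosting are this small metainfo file and a tracker; the file itself stays on the web server untouched.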

Best case - several downloaders come to your site at once, and download from you and each other, cutting your bandwidth, and increasing their download speed.

Worst case - only one person downloads at a time - you have a teeny bit of extra bandwidth for the tracker protocol. For a large download, this will not be a material amount of bandwidth.

There is a separate issue that if you are Large Company Inc, why should your customers help you with bandwidth bills when you could just shell out for an akamai-like content distribution system. I assume this is why the likes of Apple and Microsoft can't use it for OS media distribution, when the likes of Debian can.


Don't forget that the RIAA / MPAA make bit torrent look illegal, so a lot of companies don't allow you to run bit torrent software.


Everyone seems to be assuming that the torrent client has to be a separate program.

It should be part of the browser and work transparently as long as the browser is open. Ideally it won't give you access to change ports, seeding rules, etc.; it will just work. Running at lowest priority, even a couple of KB/sec per user will add up across a lot of users. As soon as the user moves or deletes the downloaded file, the seeding stops.

Things that are needed:

  • Ubiquitous browser support
  • Out-of-the-box support for Apache/IIS.



The best answer is not to replace HTTP downloads altogether: that will not happen any time soon.

The best answer is to offer the choice, and to speed-limit the HTTP download so that it's still usable, but people who can, and who know how, will prefer the BitTorrent option (no one likes to wait; that's human nature). There are already provisions in the BitTorrent specs for using normal HTTP connections as seeds via Range: headers, just as the popular download managers do when fetching chunks of a file, so the HTTP server effectively acts as a seed itself.
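A minimal sketch of that Range-header mechanism, with a toy in-process server standing in for the real web server (the handler is illustrative, not production code):

```python
# Sketch: fetching one "piece" of a file with an HTTP Range request, the
# way a download manager or web-seeding client treats an HTTP server as a
# seed. The tiny server below only exists to make the example runnable.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

FILE = b"A" * 100_000  # stand-in for the hosted download

class RangeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Parse "Range: bytes=start-end" and serve just that slice.
        start, end = self.headers["Range"].split("=")[1].split("-")
        start, end = int(start), int(end)
        body = FILE[start:end + 1]
        self.send_response(206)  # Partial Content
        self.send_header("Content-Range", f"bytes {start}-{end}/{len(FILE)}")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), RangeHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/bigfile.bin",
    headers={"Range": "bytes=16384-32767"},  # request the second 16 KiB "piece"
)
with urllib.request.urlopen(req) as resp:
    chunk = resp.read()
print(resp.status, len(chunk))  # 206 16384
server.shutdown()
```

Each piece of the torrent maps to a byte range like this, so any server that honors Range requests can fill in whatever the swarm can't supply.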

This is the option the Blizzard patch downloader took... and it does suck. You know why? Their own FAQs actually have the answer: DSL lines don't like their upload stream being saturated; if it is, it seriously hurts download speed. There is simply no mechanism in the patcher to limit uploads, short of turning off the P2P part entirely (which means it only uses the HTTP seeds; wow, now there really is an idea...), so saturate it does, and hey presto, you have no download bandwidth left... Oops.

So now we're down to the overhead simply being the need to run a tracker on your system somewhere. And away you go. You don't even need an initial seed: you have your HTTP seed for that.


"Torrents suck up bandwidth. Http downloads don't. I can't use the internet at my place when there is a torrent in progress."

This has very little to do with torrents versus HTTP downloads, and everything to do with your router or firewall's traffic shaping (or lack of it). This is an easy thing to set up.

If a company wanted to use a torrent distribution system, there is nothing stopping them from having say a dozen servers throughout the world that are always seeding the files. That way their users can download them from the server, but at the same time are also providing uploads for other users.

I think the bigger issue is to do with ease of use for users (1 click download for HTTP vs using another program besides a browser (or browser+plugin)).


Also, business users behind 10 different firewalls and 10 IT snoopers aren't in the best position to be ferrying around bittorrent packets.


Well, some companies are actually using torrents to distribute their products. I'm thinking of Linux distributions, for instance, which get a lot of traffic at each release.

Now why isn't it mainstream... Maybe because you need to upload... Maybe because it takes time to make a standard... You would need to rewrite some of the HTML standard, wouldn't you?

For instance, adding an optional bittorrent attribute to the A tag would be awesome.


20MB isn't that big of a deal for a major site using a CDN. For smaller sites, a lot of them do use bittorrent.

However, if you're thinking that typical websites and their content would switch over to bittorrent, this will never happen for one major reason: latency. You know how it takes several minutes for a decent torrent to get up and running? How'd you like to have that wait for your daily sites? It would suck. Not to mention, it now requires a % of your upstream bandwidth, constantly. In addition to paying for your connection, now you have a bandwidth charge just for typical browsing.

So to sum up: clients don't support it, it wouldn't help most sites, it would introduce poor-performance scenarios, and you'd pay the upstream tax.


Jonathon - Opera has built-in BitTorrent support; Firefox has the firetorrent/foxtorrent extensions.

Grant - It depends on your BitTorrent client, but most have throttling settings, so you can cap the amount of bandwidth it uses. AFAIK, most regular browser downloads don't have this. The reason BitTorrent slows down the internet so much is that TCP needs some free upstream for acknowledgements; if your uploads saturate the line, ACKs get delayed and your downloads stall. Just set the upload bandwidth a bit below your maximum, and you'll be fine.
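The "set the upload a bit below your max" advice boils down to rate limiting, which most clients implement with something like a token bucket. A minimal sketch (all rates illustrative):

```python
# Sketch of upload throttling: a token bucket that caps the send rate
# below the line's maximum so TCP ACKs still get through promptly.
import time

class TokenBucket:
    def __init__(self, rate_bytes_per_sec):
        self.rate = rate_bytes_per_sec
        self.tokens = 0.0
        self.last = time.monotonic()

    def throttle(self, nbytes):
        """Block until nbytes may be sent at the configured rate."""
        while True:
            now = time.monotonic()
            # refill tokens for elapsed time, capped at one second's worth
            self.tokens = min(self.rate, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= nbytes:
                self.tokens -= nbytes
                return
            time.sleep((nbytes - self.tokens) / self.rate)

# Cap uploads at 40 KB/s on, say, a 50 KB/s upstream line.
bucket = TokenBucket(40_000)
start = time.monotonic()
for _ in range(4):
    bucket.throttle(20_000)  # "send" four 20 KB chunks
elapsed = time.monotonic() - start
print(round(elapsed, 1))  # ~2.0s: 80 KB paced out at 40 KB/s
```

The 10 KB/s of headroom in this example is what keeps the acknowledgement traffic flowing, which is exactly the knob the Blizzard patcher was missing.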


"Blizzard uses a bittorrent like technology to distribute patches to their popular World of Warcraft game"

And it always stunk. It was always faster to wait in line at a file download site and download it myself than to use that horrible patcher.

This is one of the problems with torrents: some take forever to download, since you have to seed at the same time and that saturates your network...


The short answer: BT won't replace HTTP because it requires peers (by that I mean other people who have downloaded the file) to keep a client running the whole time, eating their upload bandwidth.

Some companies/organizations have tried this. Most Linux distros provide both an HTTP/FTP link and a torrent file. Thing is, most people downloading Linux know what's going on on their systems, so they can decide to keep seeding the torrent.

Now, suppose browsers by default download the file, and seed it for X amount of time. Now imagine some hapless user seeding 100 files he downloaded in the last day/week/(X amount of time)... I don't think he'll have much bandwidth left to send the simplest HTTP request.


I'd like to add that Amazon S3, another CDN, provides BitTorrent support.

Built to be flexible so that protocol or functional layers can easily be added. Default download protocol is HTTP. A BitTorrent protocol interface is provided to lower costs for high-scale distribution. Additional interfaces will be added in the future.

Amazon S3 Functionality
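For what it's worth, the way S3 exposes this is by serving a .torrent file for any publicly readable object when you append ?torrent to its GET URL, with S3 itself acting as the seed. A sketch (bucket and key names hypothetical):

```python
# Sketch: S3's torrent interface is just a URL subresource. For a public
# object, appending "?torrent" to its GET URL returns a .torrent file
# whose seed is S3 itself. Bucket and key below are made up.
bucket, key = "my-bucket", "downloads/bigfile.bin"

http_url = f"https://{bucket}.s3.amazonaws.com/{key}"   # plain HTTP download
torrent_url = http_url + "?torrent"                     # BitTorrent download

print(http_url)
print(torrent_url)
```

So a site hosting on S3 can offer both links side by side with no extra infrastructure, which is essentially the "offer the choice" approach suggested earlier in the thread.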


BitTorrent generally gets a bad name, as it is most often associated with piracy. Apple has said they are going to use BitTorrent to distribute apps, updates, and such, paying people a small amount (probably a few cents) for every MB they upload. BitTorrent is still relatively new, so it will take time to catch on. The fact that BitTorrent is overwhelmingly used for piracy is the number one hindrance: companies start to use BitTorrent, and the next thing you know, people are also downloading movies and TV shows.


Torrents suck up bandwidth. Http downloads don't. I can't use the internet at my place when there is a torrent in progress.