It’s kind of silly, but I still really dig the idea behind torrenting and peer-to-peer sharing of data. It’s cool to think about any old computer helping pass along some odd bits & bytes, whether it’s a goofy drawing or a strange story.
AI model weights. Patches for MMOs (World of Warcraft famously used this to good effect).
I think a good chunk of the Internet Archive is available as torrents, at least the software collections and public domain media.
You can also download a torrent of the whole of Wikipedia, with and without images.
Do you know how big those two Wikipedia downloads are?
Not a direct answer to your question, but this is where I download my stuff from, and it also shows size.
https://library.kiwix.org/#lang=eng
Edit: Wikipedia is available there, the full thing is 109.89GB. I wonder how up-to-date it is.
As of last year, English Wikipedia, articles only, text only, was about 22GB compressed (text compresses pretty efficiently), according to the current version of this page:
As of 2 July 2023, the size of the current version of all articles compressed is about 22.14 GB without media
Some other sources describe the uncompressed offline copies as being around 50 GB, with another 100 GB or so for images.
Wikimedia, which includes all the media types, has about 430 TB of media stored.
I don’t think they do it anymore, but Spotify started out with a P2P network on the backend.
Super smart way of bootstrapping such a thing without having to pay huge server costs up front.

They took it out ten years ago. It was super smart, and there are still situations where it would be helpful, like when a new Taylor Swift album drops and takes the service offline.
Not literal torrenting, but the protocols are very similar (since they are both P2P data sharing):
Windows updates can be downloaded from other computers on your local network.
Steam now tries downloading games from other computers you are logged in on. You can also opt in to serve other accounts on your local network.
Downloading actual Linux ISOs with BitTorrent is so much faster than downloading them directly from the distro’s mirror. I always use BitTorrent to download new Linux distros I’d like to try.
Also, I believe P2P protocols are still popular in Korea because ISPs there actually charge website operators for bandwidth delivered to Korean customers. Twitch pulled out of Korea because of this. I think their competitors there, e.g. AfreecaTV, use P2P for their streams.
PeerTube uses BitTorrent to stream video.
This might be stretching the definition of “common” and “torrenting,” but BitTorrent created BitTorrent Sync, which used similar tech for personal file synchronization. It was later rebranded as Resilio and still exists today.
An open-source alternative that works in a similar fashion, SyncThing, also exists.
I would consider this to be one of the intended functions of torrent files. Torrents started as a faster way to share files peer-to-peer. If a few people had a large file on their machines, they could each upload part of it to someone who needs it, essentially multiplying their upload bandwidth. This became less popular as internet speeds increased, except for “illegal” stuff. I would definitely try one of these…if I had more than one computer.
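That upload-multiplying effect is easy to put rough numbers on. Here’s a back-of-envelope sketch in Python; the file size, link speeds, and peer count are all made-up assumptions, and the P2P figure is an idealized lower bound that real swarms only approach:

```python
def client_server_time(file_gb: float, server_up_gbps: float, n_clients: int) -> float:
    """Seconds for a lone server to push the file to every client:
    every byte crosses the server's uplink once per client."""
    return n_clients * file_gb * 8 / server_up_gbps

def p2p_time_bound(file_gb: float, server_up_gbps: float,
                   peer_up_gbps: float, n_clients: int) -> float:
    """Idealized swarm completion time: peers that already hold
    pieces re-upload them, so the aggregate upload capacity is the
    seeder's plus every peer's."""
    gigabits = file_gb * 8
    total_up = server_up_gbps + n_clients * peer_up_gbps
    return max(gigabits / server_up_gbps,        # seeder must emit the file once
               n_clients * gigabits / total_up)  # total bits / total capacity

# Assumed scenario: a 4 GB ISO, one 10 Gbit/s seed box, and
# 30 downloaders that each upload at 1 Gbit/s.
print(client_server_time(4, 10, 30))  # 96.0 s of server uplink time
print(p2p_time_bound(4, 10, 1, 30))   # 24.0 s: the swarm is ~4x faster
```

Even with peers uploading at a tenth of the server’s speed, the aggregate capacity dwarfs what the server could do alone, which is the whole trick.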
A common use case for SyncThing is keeping a password file up to date between, say, your PC and your phone. It’ll even work remotely, thanks to the presence of relays.
(The downsides include pretty heavy battery usage.)
IIRC Steam uses BitTorrent to help users download game assets. There’s still an option to switch it off, so it must still be going.
PeerTube uses WebTorrent to offload hosting of huge files.
Odysee uses something similar to do the same. (At least they claim to, but the last time I dug into it, it seemed to be hosted “regularly.”)
Spotify famously had their own P2P thing going in their desktop apps in the early days. Saved them a pretty penny back when hosting was expensive.
Coming to a browser near you is IPFS.
sharing fan edits
Clonezilla uses bittorrent for one of its massive deployment modes. I work at a university, and whenever we have to deploy an OS image, the ten gigabit uplink between the storage server and the classroom switches always gets saturated in unicast/interactive mode. Using bittorrent mode gets around this issue because once a computer has downloaded a chunk of the image, it can seed it for the rest of the computers within the subnet. One massive limitation is that the target computer has to have enough storage space for both the downloaded image and the deployed OS too.
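The gain from the BitTorrent deployment mode is easy to estimate. A rough sketch with made-up numbers (the image size, machine count, and uplink speed below are assumptions, not measurements from the scenario above):

```python
# Back-of-envelope for a classroom imaging run.
IMAGE_GB = 40      # assumed OS image size
MACHINES = 30      # assumed number of lab machines
UPLINK_GBPS = 10   # 10 Gbit/s uplink from the storage server

# Unicast: every byte of the image crosses the saturated server
# uplink once per machine.
unicast_min = MACHINES * IMAGE_GB * 8 / UPLINK_GBPS / 60

# BitTorrent mode (idealized): the server only pushes the image out
# roughly once; machines then trade chunks over the classroom switch.
bittorrent_min = IMAGE_GB * 8 / UPLINK_GBPS / 60

print(f"unicast: {unicast_min:.1f} min, bittorrent: {bittorrent_min:.1f} min")
```

The uplink stops being the bottleneck as soon as the first chunks land on the classroom side, which matches the saturation behavior described above.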
I remember when BitTorrent was relatively new and controversial. BBC’s iPlayer hadn’t been around very long, and they said they were going to start using BitTorrent tech for streaming. Guessing that never came to fruition, though.
Any large file is going to be much quicker getting through BT as long as there are enough seeders. OS distros, patches, P2P files, 4K anything, etc.
One funny use I discovered when I was cloning a lot of computers: even on a closed LAN, BitTorrent with local discovery was stupidly fast at distributing a big set of files across a pile of computers, compared to rsync. It was also much easier to set up.
It’s a really interesting question. I wonder what underlying economics and ideologies are at play in its decline. Economies of scale for large server farms? Desire for control of the content/copyright? The structure and shape of the network?
I guess it has some implications for stream versus download approaches to content?
If I recall, Spotify moved away from it just because the client/server model got way cheaper and the P2P model had some limitations for their future business plans. I remember them mentioning that offering a family plan was a challenge with their P2P architecture when people on the same network/account were using it at the same time.
It was probably also part of the move to smartphones. Spotify was just a desktop program for a long time and, while I’m not an expert, I would guess the P2P model made a lot more sense on desktop with a good connection than early smartphones on flaky 2G/3G connections. They might have had to run a client/server model for iOS and/or Android anyway.
Very interesting, thank you. I guess then the centralised server must have some sort of economy of scale.
In my head, I’m comparing the network to the electricity grid with certain shapes of network making different technologies more or less feasible. I would guess the internet network is probably similar to the electricity grid in most places having fewer hubs and lines of high bandwidth rather than a more evenly distributed network. Maybe the analogy is bad though.