Between 1999 and the early 2000s, a generation of software rewrote the rules of what it means to download, share and distribute. It wasn’t a single invention but a sequence: each project was born to solve a limitation of the one before it, and every technical answer carried a cultural one. Understanding those years means understanding why P2P didn’t fade like a fad but became a way of thinking.

The limit of Napster

Napster, in 1999, showed the world that file sharing could be simple. The service kept a central index of the songs users shared; search was fast, and transfers happened directly between peers. To users it was magic; architecturally it was a compromise: there was still a single point that could be shut down, seized, or simply targeted in court. That’s exactly what happened.
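
To make the compromise concrete, here is a minimal sketch of the central-index idea (the names and the port are illustrative, not the real Napster protocol): the server only answers "who has this song?", and the bytes travel directly between peers.

```python
# Minimal sketch of a Napster-style central index (illustrative, not the real
# protocol): the server maps titles to the peers that claim to share them.
class CentralIndex:
    def __init__(self):
        self.catalog = {}  # song title -> set of peer addresses

    def announce(self, peer: str, titles: list[str]) -> None:
        """A peer registers the songs it is sharing."""
        for title in titles:
            self.catalog.setdefault(title, set()).add(peer)

    def search(self, title: str) -> list[str]:
        """Return the peers that share a title; the download happens elsewhere."""
        return sorted(self.catalog.get(title, set()))


index = CentralIndex()
index.announce("10.0.0.5:6699", ["song_a.mp3", "song_b.mp3"])
index.announce("10.0.0.7:6699", ["song_b.mp3"])
print(index.search("song_b.mp3"))  # ['10.0.0.5:6699', '10.0.0.7:6699']
# Take this one machine offline and every search fails: that is the lever.
```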

The technical lesson was clear: as long as a center exists, a lever exists. Anyone who wants a resilient network has to remove that point as well. Projects experimenting with more distributed architectures soon emerged.

Gnutella and the idea of pure decentralization

Gnutella, released in 2000, removed the centralized index. Queries traveled from node to node, propagated by controlled flooding. The network no longer had a single brain: knowing any one node was enough to join, ask questions and receive answers. Shutting one node down stopped nothing.

The price of this freedom was efficiency. Searches cost bandwidth, nodes had to propagate messages on behalf of others, and results could be incomplete. But the idea was out: decentralization is not just an ethic, it is an architecture that works.
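
A toy model makes the cost visible. The sketch below is not the Gnutella wire protocol (the names and the TTL value are illustrative); it simply floods a query across a small graph of nodes until a hop limit runs out, and counts how many messages a single search generates.

```python
# Toy model of TTL-limited query flooding: every node forwards the query to its
# neighbours, so one search produces traffic all over the network.
from collections import deque

def flood_query(neighbours, shares, start, wanted, ttl=4):
    """neighbours: node -> adjacent nodes; shares: node -> filenames it offers.
    Returns (nodes that answered, number of messages generated)."""
    hits, messages, seen = set(), 0, {start}
    queue = deque([(start, ttl)])
    while queue:
        node, remaining = queue.popleft()
        if wanted in shares.get(node, set()):
            hits.add(node)
        if remaining == 0:
            continue
        for peer in neighbours.get(node, ()):
            messages += 1              # every forwarded copy is someone else's bandwidth
            if peer not in seen:       # real clients deduplicate by message ID
                seen.add(peer)
                queue.append((peer, remaining - 1))
    return hits, messages

# A small ring of four nodes: the query finds the file, but crosses every edge to do it.
topology = {"a": {"b", "c"}, "b": {"a", "d"}, "c": {"a", "d"}, "d": {"b", "c"}}
files = {"d": {"album.ogg"}}
print(flood_query(topology, files, start="a", wanted="album.ogg"))  # ({'d'}, 8)
```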

Why BitTorrent was so efficient

In 2001 Bram Cohen released the first version of BitTorrent. His insight was mechanical and elegant: a large file is split into many pieces, and each peer downloads different pieces from different sources, immediately sharing them with whoever needs them. You don’t wait to have everything before giving: you give while receiving. The consequences follow directly:

  • the swarm grows with the number of users instead of slowing down;
  • a tracker coordinates the peers, but content does not pass through it;
  • .torrent files describe content with hashes, not with links (sketched after this list);
  • distributing a huge video costs the publisher far less than serving every copy from a single server.
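
The hash part is worth seeing in code. The sketch below is a simplification of what .torrent metadata makes possible (real torrents carry bencoded metainfo with SHA-1 or, in v2, SHA-256 piece hashes; the function names here are illustrative): the publisher records one hash per piece, and any downloader can verify a piece received from any stranger before passing it on.

```python
# Pieces plus per-piece hashes: trust comes from the hash list, not from the peer.
import hashlib

PIECE_SIZE = 256 * 1024  # 256 KiB, a typical piece size

def make_piece_hashes(data: bytes) -> list[str]:
    """What the publisher does once: split the payload and hash every piece."""
    return [hashlib.sha1(data[i:i + PIECE_SIZE]).hexdigest()
            for i in range(0, len(data), PIECE_SIZE)]

def verify_piece(piece: bytes, index: int, piece_hashes: list[str]) -> bool:
    """What every downloader does before sharing a piece onward."""
    return hashlib.sha1(piece).hexdigest() == piece_hashes[index]

payload = b"\x00" * (2 * PIECE_SIZE + 1000)   # stand-in for a large file
hashes = make_piece_hashes(payload)
print(verify_piece(payload[:PIECE_SIZE], 0, hashes))                 # True
print(verify_piece(b"tampered" + payload[8:PIECE_SIZE], 0, hashes))  # False
```

Because every piece can be checked on arrival, it does not matter which member of the swarm sent it; that is what lets strangers cooperate safely.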

BitTorrent became the Internet’s unofficial infrastructure for distributing heavy files: Linux distros, game updates, scientific archives, community backups. It is not "pirate software", it is a protocol, and like all protocols it does what it is asked to do.

eMule, credits, queues, communities

Started in 2002 as a free, open source project, eMule inherited the eDonkey (ED2K) network and added something fundamental: a credit and queue system that tries to reward those who share. Peers who give upload bandwidth earn priority in other peers’ queues. It’s a technical response to a social problem: free riding, the users who download without redistributing.
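
The mechanism fits in a few lines. The sketch below is modelled loosely on eMule’s published credit rules, with simplified constants rather than the client’s actual code: a peer’s place in your upload queue is its waiting time multiplied by a credit modifier that grows when that peer has uploaded to you.

```python
# Credit-style queue priority (simplified constants, not eMule's exact code):
# peers that gave us data wait less in our upload queue.
import math

def credit_modifier(uploaded_mb: float, downloaded_mb: float) -> float:
    """What a peer uploaded to us vs. what we sent back -> multiplier between 1x and 10x."""
    if uploaded_mb < 1.0:
        return 1.0                                   # too little history to reward
    ratio_a = uploaded_mb * 2 / downloaded_mb if downloaded_mb > 0 else 10.0
    ratio_b = math.sqrt(uploaded_mb + 2)
    return max(1.0, min(ratio_a, ratio_b, 10.0))

def queue_score(waiting_seconds: float, uploaded_mb: float, downloaded_mb: float) -> float:
    """Waiting time still counts, but sharing multiplies it."""
    return waiting_seconds * credit_modifier(uploaded_mb, downloaded_mb)

# A peer that gave us 50 MB and took 10 MB overtakes a free rider who has been
# waiting just as long: a technical nudge against downloading without giving back.
print(queue_score(600, 50, 10) > queue_score(600, 0, 40))  # True
```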

eMule later added support for Kademlia (Kad), one of the first widely deployed DHT networks: a distributed hash table that lets you find content and peers without central servers. The direction is clear: fewer and fewer coordination points, more and more logic delegated to the network itself.
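
The core trick of Kademlia also fits in a few lines. In the sketch below (tiny 8-bit IDs for readability; real networks use 128- or 160-bit IDs and iterative lookups over the network), nodes and content keys share one ID space, and "closeness" is simply the XOR of two numbers.

```python
# Kademlia's distance metric: XOR the IDs, smaller means closer.
def xor_distance(a: int, b: int) -> int:
    return a ^ b

def closest_nodes(known_nodes: list[int], key: int, k: int = 3) -> list[int]:
    """Pick the k known nodes whose IDs are XOR-closest to a content key."""
    return sorted(known_nodes, key=lambda node: xor_distance(node, key))[:k]

# Whoever ends up closest to a key is responsible for knowing who has that content,
# so lookups converge without any central server.
nodes = [0b00010110, 0b10110001, 0b01100011, 0b01100000]
print(closest_nodes(nodes, key=0b01100101))  # [96, 99, 22]: the 0b011000xx nodes come first
```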

The first P2P applications were not just software: they were social experiments in trust, reputation, cooperation and conflict with centralized models.

From file sharing to a new technical culture

Anyone who used eMule or BitTorrent learned concepts we now take for granted: hashes to verify integrity, seeders and leechers, share ratios, DHT, trackers, swarms. It was, effectively, a mass school of distributed systems. People got used to the idea that content can come from many sources at once, and that participating — not just consuming — can be the norm.

It’s no coincidence that Bitcoin, IPFS, mesh networks and modern distributed storage all borrow pieces of this grammar. File sharing was a laboratory, and its patterns stuck around.

What remains today

BitTorrent is still in use, eMule survives in niches, and Gnutella is essentially a closed chapter. But the legacy of that era is not measured by the persistence of clients: it is measured by how we now think about CDNs, streaming, replication and software distribution. Every time a system scales well because "more users bring more resources," there is a fragment of the idea Cohen put into code twenty-three years ago.