It is widely believed that Internet platforms have become so large that they can no longer be dislodged from their position of dominance. Many believe that we have no choice but to apply the full force of competition regulation to protect consumers from the harm caused by their size and market power.
But are these platforms really too big to fail? Or are we in the midst of a cycle that has already repeated itself and will continue to do so in the future?
When I first accessed the Internet, it was through VSNL’s Gateway Internet Access service that offered a 9.6 kilobits per second connection over the notoriously spotty dial-up lines. Despite those obstacles, it was clear to early adopters like me that the Internet was a vast storehouse of exceptionally diverse and useful content.
At the time, the World Wide Web was little more than a collection of HTML pages loosely linked to each other. This made it really hard to find the information you needed unless you knew the URL of the specific webpage it was stored on.
I have previously written about Archie, the world’s first Internet search engine, which attempted to solve this problem by indexing webpages by title. Early Internet browsing involved trawling through the Archie index, hoping to retrieve relevant information by guessing at a page’s content from its title.
The limitations inherent in this approach prompted organizations such as Yahoo and AltaVista to invest heavily in curation. Armies of librarians were hired to organize the Internet, personally visiting hundreds of thousands of websites to manually sort them into a hierarchy that would yield more effective search results. This is how we navigated the web when I first went online. But it was already clear that, given the rate at which the Internet was growing, it would very soon be impossible for humans to keep up with the pace.
In 1996, news of a magical new search engine began filtering through the academic grapevine, one that could generate highly relevant results by ranking pages based on the importance of their back-link data. When I first used Google, it was as magical as I had been promised it would be. I was able to access more relevant information than Yahoo or AltaVista had ever been able to offer. This upstart search engine was so confident in its accuracy that it had a button on its search page that bypassed the results altogether and took you straight to its top-ranked website. And it rarely disappointed.
Since then, search has been our primary means of accessing the Internet. But even though it has served us well for more than two decades, I am beginning to notice a decline in its quality and precision. In the face of diverse and often conflicting sources of information, I have found it increasingly difficult to locate material I can trust.
This has forced me to turn to curation once again. Wirecutter is now my first port of call for product reviews, though I often visit even more specific websites for things I’m passionate about—DPReview.com (for photography equipment), Head-Fi.org (for high-end headphones) and alllattelove.com (for coffee).
As good as algorithmic recommendations are, I love finding new artists from playlists manually curated by friends who share my taste in music. I’ve stopped relying on online book recommendations unless they come from the reading lists of people I admire or subscribe to. It has gotten to the point where I am more confident of getting a sufficiently diverse range of relevant perspectives on a given issue from authors on Substack and discussions on Reddit than from general search results.
Benedict Evans summed it up perfectly in a tweet: “All search grows until it needs curation. All curation grows until it needs search.”
It is an unavoidable truth that technology develops in cycles, often swinging like a pendulum between extremes. If today’s technology giants seem unassailable, it may only be because we cannot yet see the upstart lurking around the corner that is about to be their downfall.
Take the Internet itself. Even though it was originally designed to be open, our access today is almost exclusively mediated by services that determine what we get to see. The content, commerce, entertainment and social connections we consume are pre-packaged into endlessly scrolling feeds of information, an online experience that is a far cry from the Internet’s original open vision.
I don’t mention this to denigrate those platforms or bemoan the current state of the Internet, but because I believe the pendulum is already swinging in the opposite direction. Over the past five years, it has been impossible to ignore the rise of decentralized solutions, such as blockchain-based services and decentralized autonomous organizations, that have emerged as a counter-argument to this narrative of extreme centralization and seek to wrest the once-open Internet from the hands of a few.
While I broadly agree that we should have fair rules in place to protect consumers from harm, I am not yet convinced that we need them because Big Tech has grown too big to fail. On the contrary, if history is any guide, we are probably on the verge of the next big cyclical change.
After all, incumbent technologies have always been displaced by the next big thing.
Rahul Matthan is a partner at Trilegal and also has a podcast by the name Ex Machina. His Twitter handle is @matthan.