Scale is the antithesis of specialization. An idea is boiled down to its core for implementation, but once it succeeds, pressure mounts to scale it, and soon the core starts expanding. Before long, adjacencies get targeted and conquered – either directly or via partners (since the flavor of the season is extensible platforms, partners come in and build extensions, sometimes into areas that clearly push past the boundaries the founding fathers of the platform had conceived). The service starts behaving like an aggregator. Some platforms (they did not call themselves platforms then) started off as aggregators. Take market data terminals, for example. They became aggregators the moment they started plugging in more and more exchanges and liquidity venues, more newswires and more brokers. Common functionalities got overlaid so that every incremental addition benefited from them. Soon enough it became a mad scramble – add as much as possible, as much as the pipes would allow. Market data vendors started crisscrossing their content assets across different platforms. Very soon a couple of dominant players were carrying most of what there was to carry. So how does this model get disrupted?
Death can come in many ways. A single swift bullet rarely kills the incumbents in a deep-seated market such as this (I would like to believe that what is true for the market data industry will also hold for industries with similar characteristics). Death comes by a thousand cuts. My colleague Kunal Mehta recently drew my attention to Owlin. Owlin is a bottom-up news service that threatens to disrupt the market news vendors (like Reuters and Bloomberg) by scraping information off platforms where news tends to break quicker (‘break’ does not, however, mean verified – anyone who has run a newsroom, or consumes news to trade on price movements, knows the difference). If Owlin succeeds, it will have inflicted an early cut (and even if it fails in the implementation, someone else will find a better way to the same intended outcome). Aggregators also face the threat of disintermediation – competitors figuring out the raw data source, getting there first, and adding more value to the basic information than the aggregator does. In an era where information reaches the public domain quicker than ever before, this is a clear and present danger for aggregators and a clear and present opportunity for disruptors. One area where aggregators still have a moat is content relationships. Over time, painstakingly, they have built relationships with content creators who need a wide and effective distribution platform, which the aggregators have provided. Consider brokers who want to maximize the distribution reach of their research among investors but cannot make the capital investment to build the infrastructure. They have been loyal (though slightly irritable) customers of aggregators for a long while (yes, this model too can be broken with a crowdsourced experts’ network, but the actual moat here is not the content – it is credibility).
A thousand cuts is a nice phrase. It has an element of mortality to it, but how useful is it for consumers in the industry being slashed by the blades of the blind watchmaker? Aggregators make life easy for users, who trade the bleeding edge for convenience. It is acceptable to live with a sub-standard instant messaging system embedded in the aggregator’s platform because it is just that much easier to use than a best-of-breed alternative. History tells us that sooner or later the bleeding edge catches up. Early adopters who bravely withstand the pain of ditching convenience are rewarded for their choice, and others follow in their footsteps. The small band behind the pied piper soon becomes a crowd. At a critical point in this crowd-swell, people start thinking of mashups – the best of breed brought together in a way that users can control (as opposed to ceding control to the aggregator). The integration, messy at the beginning, becomes elegant with the passage of time, with each micro-mashup (specific to an implementation) getting a development roadmap of its own.
Unsurprisingly, perhaps, aggregators see this situation through an identical set of optics. Hence the preponderance of “extensible”, “open” (albeit walled-garden) architecture platforms, ready for the Great Bypass as and when it arrives. The next wave will perhaps be to acquire the bleeding-edge capabilities themselves (is that why rumors are rife – as are rebuttals – of Bloomberg eyeing LinkedIn?).