Outcooperating the competition: Building platform-ecosystem last movers by embracing a long-term and inclusive perspective
When we speak about ecosystems and platforms, new forms of organization, and the role of software in their development, we often lack a shared framework: a business and technology architecture that acts as a reference point. In reality, though, a convergent view of digitally enabled ecosystems of interacting parties is emerging strongly from market practice. It is worth highlighting because it can provide a foundation for further thinking. We will look into these emerging patterns of organizing markets and into a potential way to orchestrate and design a set of incentives that can make it possible for an ecosystem to be a cooperative ‘last mover’: an ecosystem-weaving initiative that aims at becoming the standard, the place to be for innovation to happen.
The essential question we investigate is whether it is possible to create ecosystem strategies that reduce the case for destructive competition and maximize the case for collaboration: plugging in, integrating special capabilities, composability, modularity and, eventually, a wholly systemic actualization where all parties thrive. Such a system would be post-competitive and represent a powerful new way to redesign markets for accelerated innovation.
The components of a platform-ecosystem
Modern, software-powered platform-ecosystem initiatives are essentially based on three key spaces of value creation where exchanges happen:
- the marketplace;
- the main ‘product’ features (the essential UX);
- the so-called extension platform.
Indeed, a weaver of a platform-powered ecosystem normally aims and operates to:
- create a marketplace that enables certain experiences of exchange of niche product/services between parties (producer-consumer), normally monetizing through a ‘take rate’;
- provide a main user experience around a set of enabling services and products -- centrally provided by the platform owner -- often in the form of a SaaS offering or other more capital-intensive services (such as logistics), often targeted at producers in the marketplace and, in a more limited set of cases, at consumers;
- create an environment where other third parties can develop so-called ‘extensions’ to the main user experience in the form of apps, templates, and plug-ins. In many cases this happens by adopting a so-called ‘reverse API’ paradigm, where extensions are effectively pieces of software that run in tight connection with the main UX and are often optimized for it (following strict UX guidelines); these apps connect -- for data and further workflow execution -- to other pieces of software running in external contexts where the users may also have a connected identity, information and data (such as a Shopify seller that syncs ecommerce with a bookkeeping solution through an extension).
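The ‘reverse API’ pattern described above can be sketched minimally: the main UX owns the event stream and calls into registered extensions, rather than extensions calling the platform. All names below (`MainUX`, `Extension`, `on_event`) are hypothetical illustrations, not any real platform’s API.

```python
# Minimal sketch of the 'reverse API' extension pattern: the platform-owned
# main UX emits events, and extensions registered with it react to those
# events, syncing data to external systems (e.g. a bookkeeping service).
# All names here are illustrative assumptions.

class Extension:
    """Runs in tight connection with the main UX and reacts to its events."""
    def __init__(self, name):
        self.name = name
        self.synced = []  # stands in for calls to an external back end

    def on_event(self, event, payload):
        # e.g. push each completed sale to an external bookkeeping ledger
        if event == "sale.completed":
            self.synced.append(payload)


class MainUX:
    """The core experience: it owns the event stream and calls extensions."""
    def __init__(self):
        self.extensions = []

    def register(self, ext):
        self.extensions.append(ext)

    def emit(self, event, payload):
        for ext in self.extensions:
            ext.on_event(event, payload)


ux = MainUX()
books = Extension("bookkeeping")
ux.register(books)
ux.emit("sale.completed", {"order": 1, "total": 42.0})
print(len(books.synced))  # 1
```

The control flow is inverted relative to a classic API: the platform decides when extension code runs, which is what lets it enforce strict UX guidelines while extensions keep their own external data and identity.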
In most cases, the visible, integrated stack would end here, hiding at least three more key layers. Indeed, in the platform-owned back end you would find:
- a certain ‘grammar’: what domain-driven design calls the domain model;
- a data layer where all generated data are kept safe and accessible;
- eventually, the infrastructure on top of which the system runs.
All these need to exist to grant system execution. The stack can therefore be seen as:
- the marketplace;
- the main UX (the core ‘product’ features);
- the extensions [→];
- the domain model [→];
- the data layer [→];
- the infrastructure [→].
It’s important to note that (as the symbol [→] hints) the extensions that run on the platform may also use an extended domain model, store data on different data layers, and run on different modularized infrastructure components. For example, the accounting extension that allows the ecommerce owner to keep books synchronized with sales may store data in a private, controlled space and possibly connect with a public tax filing infrastructure.
Understanding and challenging the framework
Normally such a platform framework would be run by a large company. This company would aim at gaining a defensible advantage through a defensibility flywheel, such as a scale advantage, a lock-in by becoming essential to adopters’ workflows, or a proprietary technology or data advantage.
After becoming ‘too big to shortcut’, the platform would play a game of balance between control -- over all the interfaces between the layers -- and enablement -- providing services valued by all the entities involved. Entities give up a degree of independence to reap the benefits of being part of the ecosystem, such as greater demand generation or improved efficiency. The platform-owning company would likely seek defensibility and control of the ecosystem by leveraging self-reinforcing multi-sided network effects.
Given all this, what are the challenges that prevent new, more inclusive and less centralizing approaches to running an ecosystem? And would such a system be desirable, or simply a more efficient means of achieving innovation?
The essential role of interfaces is well captured by Shea’s Law, recorded in David Akin’s Laws of Spacecraft Design: ‘The ability to improve a design occurs primarily at the interfaces. This is also the prime location for screwing it up.’
We want to explore the effect of two major drivers. On the one hand, we want to explore what happens when we liberate the interfaces between layers and components from the monopolistic control of a single ruling party. We believe that clear and stable interfaces increase the system’s overall capability to generate broader plurality and optionality, and bring more resilience.
On the other hand, we also envision that transitioning towards shared governance of interfaces, and embracing less centralized incentive structures, would bring a longer-term focus: we assume that -- as some studies have shown -- organizations that are co-managed and co-owned show a broader tendency towards what Hirschman’s Exit, Voice, and Loyalty framing labels ‘voice’ and ‘loyalty’. Essentially, this makes them better equipped for the long term.
Creating clear interfaces between layers would also be essential to facilitate the evolution of each layer. In pace-layering terms, the infrastructure and domain model layers are much slower to evolve than the services and products layer, which normally changes much faster.
The role of the main UX
In a system architecture like the one outlined above, the provider of the main UX would be in charge of:
- implementing the core set of ‘product’ functionalities specified in the domain;
- providing the differentiating element on top of the core domain functionalities (such as with capital intensive services that plug into the functionality);
- building its own marketplace(s) of services;
- building its own marketplace of extensions;
- managing the policing and security of both marketplaces.
By standardizing the interfaces between the main UX and the extensions it would be possible to have multiple players provide alternative main UXs. Services marketplaces could also be loosely bundled with the main UX(s). For example, a marketplace featuring consultants aiming to provide services to adopters of the software stack would need the experts to be familiar with the domain model and main UX, but wouldn’t require deeper integration. If the domain model also included the single marketplace entity and its reputation, it would be technically possible for experts to provide services across different main UXs’ marketplaces while leveraging the same reputation. With such an untangled UX, and thus hard-to-attain defensibility, the main differentiators for main UX providers would be full compatibility with the ecosystem of service providers and extensions and, furthermore, providing the best experience across the core set of features plus differentiating additions, while retaining compatibility with the whole ecosystem.
The thickness of the main UX depends on the degree of standardization in the business process enabled: the more standard the process, the thicker the main UX is nudged to become. It would also be possible to imagine such an ecosystem sporting a very thin main UX, even a disappearing one: in this case, the extensions would all share the same domain model but implement partially overlapping sets of features, allowing interoperability while providing their own ‘view’ on the domain. The reason to have a main UX, and not only extensions, would be to provide basic curation services (policing and security) that help the adopter navigate the extensions market. The main UX provider would also be best suited to run the services marketplaces, and would be in charge of standardizing transactions (for example with a payment system, distribution, and reputation-based browsing).
The case for having a main UX, and the inherent difficulty of standardizing the interface between the main UX and the extensions, indicate that a likely outcome is a strong coupling between a main UX and a certain ecosystem of extensions.
A thick main UX player would have to deliver a tangible amount of enabling value to the ecosystem by running the transactions engine (for both the extensions and the services marketplaces) and the overall evolutionary learning engine at scale. The thicker the main UX, the more empowerment and services it will need to provide to the ecosystem to justify that thickness. Since defensibility options would certainly be limited for the main UX provider in such an unbundled market -- due to the openness of the domain model and interfaces -- building trust and empowering features for the ecosystem would require a different financing path. The current financing path for network- and platform-based organizations is indeed largely based on investing upfront with the aim of creating defensibility and lock-in: platforms struggle to overcome the so-called chicken-and-egg problem and rely on massive subsidies in the early stage to create the traction that -- in the longer term -- allows them to create network effects. In an unbundled ecosystem such as the one we’re exploring, we would see (and to some extent are already seeing) the application of new incentive design approaches that allow an early and steady creation of trust between users and the platform by cementing a reciprocal set of incentives for success from the beginning. Tools such as bonding curves, augmented bonding curves, or similar crypto primitives -- designed to create early-stage utility (either financial, by giving rights to future profits, or functional, by allocating special governance rights) while network effects and product maturity have not yet materialized -- may fruitfully contribute to building such early trust and solve the problem explained by Chris Dixon in his landmark work on crypto tokens.
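The bonding-curve mechanic mentioned above can be illustrated with a minimal sketch: the token price grows with supply, so early participants acquire a stake cheaply and benefit if the ecosystem later attracts demand. The linear curve and its parameters here are illustrative assumptions, not any real token design.

```python
# Minimal sketch of a linear bonding curve. Price rises with token supply,
# giving early supporters a financial incentive before network effects and
# product maturity have materialized. SLOPE is an illustrative parameter.

SLOPE = 0.5  # price per token per unit of supply (illustrative)

def spot_price(supply):
    """Instantaneous price at a given circulating supply."""
    return SLOPE * supply

def buy_cost(supply, amount):
    """Reserve paid to mint `amount` tokens at current `supply`:
    the area under the price line between supply and supply + amount."""
    return SLOPE * ((supply + amount) ** 2 - supply ** 2) / 2

def sell_return(supply, amount):
    """Reserve returned when burning `amount` tokens (the same area,
    walked back down the curve)."""
    return buy_cost(supply - amount, amount)

# An early buyer pays far less per token than a later one.
early = buy_cost(0, 100) / 100      # average price for the first 100 tokens
late = buy_cost(1000, 100) / 100    # average price once 1000 tokens exist
print(early, late)  # 25.0 525.0
```

Augmented bonding curves layer extra mechanics on top of this (vesting, a funding pool taking a cut of buys), but the core trust-building property is the same: the earlier a participant commits, the larger their upside if the ecosystem succeeds.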
Finally, it’s worth noting that in such an open architecture, direct-to-customer extensions and a parallel, main-UX-mediated distribution may also co-exist. An example of such a separation and standardization of interfaces can be seen in the development of the WordPress ecosystem where, thanks to the standardization of the domain model and the openness of interfaces, a plethora of different approaches has emerged, such as headless CMSs and second-order ecosystems like Elementor’s, built on top of the WordPress domain model and back end.
The domain model, data layer, and infrastructure
The domain model on top of which the system runs would then act as a common, shared model of the system and would contain all the definitions and actions worth specifying to ensure consistency and compatibility across the different implementations of the main UX, and between the main UX and the extensions. The domain model would represent the actual underlying protocol and would also be the root of the implementation of the data architecture. Such a domain protocol would clearly need to be subject to shared governance processes to ensure that all points of view are respected and that changes in the domain model do not dramatically impact a subset of the ecosystem players.
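As a rough illustration of the domain model acting as a shared protocol, one could imagine every main UX and extension importing the same entity definitions and invariants, keeping their data interoperable by construction. All names below are hypothetical, not drawn from any real protocol.

```python
# Sketch of a shared domain model as the common protocol: entities and
# invariants defined once, used by every main UX and extension.
# All entity names and fields are illustrative assumptions.

from dataclasses import dataclass

@dataclass(frozen=True)
class Seller:
    seller_id: str
    reputation: float = 0.0  # portable across main UXs, as discussed above

@dataclass(frozen=True)
class Sale:
    order_id: str
    seller_id: str
    total: float

def validate_sale(sale: Sale) -> bool:
    """A shared invariant every implementation must enforce:
    a sale cannot have a negative total."""
    return sale.total >= 0

s = Sale(order_id="o-1", seller_id="s-1", total=19.9)
print(validate_sale(s))  # True
```

The frozen dataclasses stand in for the ‘grammar’ of the protocol: any party can extend the model with its own types, but the shared definitions and invariants are what governance would protect from breaking changes.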
The data layer would need to be transparent, accessible and auditable, supporting some level of federation between local clusters to allow certain spaces of information to remain closed -- but compatible and available for settlement. To some extent, the common infrastructure and the data model could also overlap in such a system, especially in an implementation based on a permissionless digital ledger. In this case, multiple nodes would be responsible for the execution of the open ledger and the validation of all transactions, and token engineering would be needed to ensure the relevant incentives for the nodes composing the network to run the validation work.
More federated architectures could also be designed to reduce the ledger validation work, by designing trustless settlement layers between clusters made of trusted entities (for example, settling inter-organizational transactions on the ledger while keeping intra-organizational transactions in a trust-based environment).
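The routing logic implied by this federated design can be sketched as follows; the organization names and transaction shape are illustrative assumptions.

```python
# Sketch of federated settlement: transactions inside one trusted
# organization stay in a local, trust-based log, while transactions that
# cross organizational boundaries are routed to a shared settlement ledger
# (which, in a real system, would carry the validation cost).

local_log = []      # trust-based, kept per organization
shared_ledger = []  # trustless settlement layer between organizations

def record(tx):
    if tx["from_org"] == tx["to_org"]:
        local_log.append(tx)      # intra-organizational: no validation work
    else:
        shared_ledger.append(tx)  # inter-organizational: settled on-ledger

record({"from_org": "A", "to_org": "A", "amount": 10})
record({"from_org": "A", "to_org": "B", "amount": 5})
print(len(local_log), len(shared_ledger))  # 1 1
```

Only boundary-crossing transactions hit the shared ledger, which is precisely how such a design reduces the total validation work compared with putting every transaction on the ledger.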
The type of interface and data model coupling would vary with the type of data being exchanged: for exchanges that need historicization, typical of financial transactions, sitting on a distributed ledger means distributing the validation work to nodes. If the need to share a data model for interoperability doesn’t entail having consistent ledgers with auditable information, then the need for shared infrastructure would likely disappear, leaving room for other types of provisioning of common infrastructure to emerge, such as cloud providers.
In such an architecture, extensions run on the premise of using the same shared domain model (reflected in the interfaces) -- ensuring their compatibility with all the main UXs -- and may extend the domain model or use a complementary extension of it into other domains. Extensions could technically also wrap and integrate different operational infrastructures (such as further logistics or computation infrastructures) and make them available to the parties. In the pace layering view, extensions are the most likely to capture new and emerging behaviors and are subject to strong innovation pressures to keep competing. The main UX(s) and the common domain model continuously exercise an attraction mechanism on features that emerge in the extension ecosystem or in the marketplaces as they mature: in an innovate-leverage-commoditize (ILC) cycle, extensions and marketplaces likely generate most of the innovations, which are then gradually integrated into the main UX through continuous institutionalization. This cycle is continuous: as the main UX grows it may bureaucratize and become too monolithic and big. This makes the case for breaking it down into smaller market niches and further specifying the domain model, effectively giving space to the birth of a further, more vertical, ecosystem.
Outcooperating the competition
In a usual context we would see a single organization investing widely, iterating fast and creating a main UX and a data-infrastructure layer on top of a proprietary domain model.
This company would likely start by providing a single-user value proposition in the main UX and gradually introduce marketplace features and extensions. Other trajectories also exist, albeit this is the most foreseeable today. So when would a pluralistic, cooperative model be worth applying? Why should this traditional model be overcome? Which interfaces make sense to agree on?
One could argue that -- if possible -- agreeing on a shared protocol representing a common domain model would make sense to allow inter-system cooperation and interoperability. But in a world of winner-take-all competition such a proposition doesn’t make sense: if you’re competing for a certain niche and your value proposition depends on acquiring network effects, you shouldn’t focus on enhancing low-layer interoperability. The existing financing and technological patterns that have pushed towards a winner-take-all perspective are, however, being challenged by several essential innovations. First, as we’ve explained briefly, mainly thanks to decentralized finance and governance patterns and crypto-token design, new ways to finance early-stage ecosystem development in a more pluralistic way are emerging. Furthermore, the emergence of these technologies further reduces transaction costs, makes self-executing multi-party contracts (effectively partially autonomous organizations) possible, and puts into question the centralized approach to platform building and ecosystem weaving simply by making alternative ways possible.
Minimizing the attractiveness of exit by ensuring the stability of interfaces (a major concern for third parties that accept to produce under a platform enablement regime) would increase trust in the platform and push the development of specific IP inside the extensions, while leaving the platform’s enabling services as the basis of a larger ecosystem. The need to compete with the ecosystem would be minimized, while the need to compete ‘within the ecosystem’ would remain, creating innovations that would be gradually captured and institutionalized by a trusted party -- the main UX provider -- inside a trusted governance and financial process -- the domain model evolution. Considering that all interfaces would be open and externalizable, one could argue that competition would not disappear but simply move away from the interface. Interfaces and the domain model, instead, would effectively have to be managed under a ‘commons’ regime and would require a governance process inspired by the eight key principles pointed out by Elinor Ostrom.
From this perspective, we could in no way imagine innovation happening through the governance process, which would instead be more a guarantee of ecosystem stability and thrivability. The main UX providers would compete among themselves and with the extensions for user attention, but much of the network value would accrue in shared, non-enclosable spaces.
One could argue that the market’s natural tendency would push ecosystems to compete with each other and thus invalidate most of the ideas outlined in this article. On the other hand, standardization processes have always been part of the history of industry development, and of the Internet itself, and experiences in shared-governance ecosystems are growing, providing promising results in terms of growth and innovation enablement. The Open Compute Project -- a shared-governance platform born to ‘apply the benefits of open source and open collaboration to hardware and rapidly increase the pace of innovation in, near and around the data center’ -- is now projected to intermediate and facilitate circa $12 billion of gross merchandise value in 2023. Uniswap, a decentralized exchange protocol that connects ‘developers, liquidity providers and traders’ and lets them ‘participate in a financial marketplace that is open and accessible to all’, now enables almost $1 billion in transactions every day and is governed through the interplay of a public forum and a governance token (UNI) that was distributed to the stakeholders in the community at a certain issuance moment.
Based on early research and interviews, we foresee that, as a complement to goodwill and long-term commitment, creating interlocking financial incentives -- which increase every participant’s skin in the game for the others’ success -- may represent a promising way to keep the case for cooperation stronger than that for exit, fork, and competition. As an example, organizations developing extensions should be given options to access not only the governance processes related to the domain model and protocol, but also equity of the main UX provider they decide to connect to. As we’ve anticipated, a coupling between a particular main UX provider and an ecosystem of entities is foreseeable: extensions, evidently delegating some decision-making power to the main UX, may trade this loss of power for a stake in the success of the main UX they optimize for. These incentives -- traditionally related to equity holding and transferred through complex and bureaucratic processes -- are being streamlined through new technological approaches, of which token engineering is undoubtedly the most representative.
Conclusions and further work
In this article we presented a view of how emerging trends in tech, infrastructure and the development of ecosystems are gradually making possible a different approach to ecosystem building. This approach is more integrative and cooperative: it seeks to enable competition for innovation in certain spaces while incentivizing cooperation and the creation of shared innovation through interface standardization, shared governance, and new types of financial incentives. The emergence of new technologies is making these new directions possible, and case studies confirm that this direction already constitutes an appropriate approach to ecosystem building. Such an approach is advisable to incumbents and upstarts that intend to weave long-term ecosystemic initiatives and embrace a perspective of openness and long-termism versus land grab and exploitation.
This research stream emerged as, at Boundaryless, in collaboration with other entities, we’re exploring the opportunity to create a software-powered ecosystem around a common protocol of organizing, embracing a long-term and cooperative approach that can outcooperate the competition.
Simone Cicero is co-creator of the Platform Design Toolkit and co-founder of Boundaryless. He was included in the 2020 Thinkers50 Radar list of upcoming thinkers.