Padi UMKM — A Complexity Case

When I first designed what later became Padi UMKM, I did not do it in a boardroom. I did it at home, during long months of WFH in the middle of the Covid-19 pandemic. I drew the system on sheets of paper spread across the floor. At that time, my head was full of ideas about ecosystems, complexity theory, and complexity economics. I was not thinking about building another digital platform. I was thinking about how economic coordination itself breaks down under systemic shock, and how new coordination patterns might emerge when old ones collapse. In that sense, Padi UMKM was born less from a product mindset than from an ecosystem mindset, with complexity theory consciously in the background.

When the pandemic hit, what collapsed was not only the economy. What collapsed was the coordination logic of the economy. Supply chains broke, demand evaporated, SMEs lost access to markets, and institutions discovered that their standard operating procedures were designed for stability, not for systemic disruption. Many organisations reacted by accelerating digital projects, launching platforms, and optimising internal processes. That helped, but it did not address the deeper problem. The economic ecosystem itself had lost its organising structure. Actors that were rational in isolation could no longer produce coherent outcomes collectively. This is how complex systems behave under stress: when established coordination patterns fail, local rationality no longer aggregates into systemic order.

Padi UMKM did not start as a brilliant digital product idea. It started as a response to a coordination failure across a fragmented system of SOEs, SMEs, banks, regulators, ministries, and development agencies. All were acting with good intentions, yet through incompatible logics, timelines, and mandates. The system was not short of initiatives; it was short of coherence. In complexity terms, the economy had been pushed far from equilibrium, and the challenge was not optimisation but reorganisation. What was needed was not another tool, but a new pattern of interaction among heterogeneous agents.

The real innovation of Padi UMKM was therefore not the platform. The platform was the easy part. The digital workforce of Telkom Group can design platforms; that is an operational capability. The platform was necessary, and it became the core infrastructure of the ecosystem, but it was not the breakthrough. The breakthrough was the deliberate redefinition of roles within the economic system. SOEs were repositioned so that their procurement operations became a capability for creating new markets, i.e. an SME-based market structure. SMEs were not framed as beneficiaries of aid, but as economic agents that could be structurally integrated into formal procurement and value creation. Banks and financial institutions were not treated merely as lenders, but as part of an enabling architecture that combined financing with capability development and pathways to export. What changed was not a feature set. What changed was the pattern of interaction between economic actors.

The formal launch of Padi UMKM itself was not initiated by Telkom or by the Ministry of SOEs. It was planned within the nationwide BBI (Bangga Buatan Indonesia) program, because the central government needed a real, executable instrument to accelerate domestic economic circulation under crisis. Telkom committed to developing the platform, even though it was still imperfect at the time. The urgency was national, not corporate. This matters, because it positioned Padi UMKM from the beginning not as a corporate product launch, but as a systemic intervention embedded in a national recovery narrative. The early external promotion of Padi UMKM, beyond the internal SOE environment, was also driven by the BBI program. Over time, almost by systemic selection rather than by design, Padi UMKM became the de facto e-commerce infrastructure for BBI, as other platforms could not fit the specific institutional and ecosystemic roles required by the program.

From the beginning, we made a counterintuitive choice in the way the system was governed. Telkom deliberately limited its role to being the product and platform owner. The ecosystem itself was not branded as Telkom’s program. The community was symbolically owned by the Ministry of SOEs and by SOEs collectively. Even the name Padi UMKM did not originate from Telkom. This was not a political compromise; it was a strategic design choice grounded in complexity thinking. In complex systems, ecosystems tend to collapse when one actor over-claims ownership. When the platform owner also claims to own the ecosystem, other actors reduce their commitment, hedge their participation, or quietly resist. By stepping back from symbolic ownership, Telkom created space for other institutions to step forward. The platform provided the infrastructure, but the legitimacy of the ecosystem was deliberately distributed across actors.

At some point, something structurally interesting happened. The initiative crossed a threshold where no single actor could kill it anymore. The CEO of Telkom could not simply shut it down because the ecosystem had become institutionally embedded beyond Telkom. The Minister of SOEs could not dismantle it easily because it had become part of the official narrative of national economic recovery. The President could not disown it because it had been publicly positioned as a success story through BBI, PEN, and related programs. This was not political theatre. This was the moment when the system acquired path dependence. Once an initiative becomes embedded across multiple layers of institutional narrative and governance, it ceases to be a project and becomes part of the system itself. At that point, you are no longer managing a programme. You are dealing with a living economic structure.

Value in Padi UMKM did not come from transactions alone. It emerged from the coupling of multiple layers of interaction. Transactions between SOEs and SMEs were reinforced by access to credit, by certification mechanisms that enabled formal participation, by development programmes that upgraded SME capabilities, and by pathways to export markets. None of these elements, on their own, would have been transformative. The transformation emerged from their interaction. This is how complex economies create value: not through linear pipelines, but through ecosystems in which different forms of capital, i.e. financial, institutional, social, and operational, reinforce one another over time.

Internally in Telkom, there was a structural separation of roles that proved critical. The Digital Business Directorate (DDB) operated at the product and business level. Its logic was operational: build, run, scale, monetise, and maintain the platform. Even as the platform owner and economic keystone, it remained only one agent within the broader ecosystem. In parallel, the Synergy Subdirectorate under the Strategic Portfolio Directorate worked at the ecosystem level. This role was not about features, roadmaps, or KPIs. It was about sensing emergent patterns of collaboration, mediating conflicts between institutions, and navigating collisions between policy signals and organisational incentives. In the early phase, the Synergy team also played a foundational role in organising cross-SOE agreements, preparing the multi-actor launch, embedding Padi UMKM within the BBI program, and connecting it with multiple SME build-up initiatives involving the Ministry of SMEs, the Ministry of Trade, and other institutions. This work was not linear project management; it was ecosystem orchestration under uncertainty.

In Indonesia’s context, the interaction between SOEs, SMEs, banks, and regulators is not merely complex; it is quasi-chaotic. Mandates overlap, incentives conflict, and policies evolve at different speeds and under different political pressures. In such an environment, precise prediction is an illusion. What becomes possible instead is navigation: sensing where constructive patterns of emergence are forming, dampening destructive feedback loops before they escalate, and shaping the boundaries within which the ecosystem evolves. This is not classical management. This is leadership under complexity.

As a result of its early success, there was a moment when the government, again through the BBI programme, asked to expand Padi UMKM to cover all government agencies (K/L/PD). On paper, this looked like success, with an enormous projected GMV. In reality, it carried a systemic risk. Full integration into the broader government procurement apparatus would have imposed rigid compliance structures and administrative constraints that could have frozen the adaptive dynamics that made the ecosystem work. The decision to return that expansion to LKPP, while positioning Telkom only as a platform provider for LKPP, was a deliberate choice to preserve modularity and flexibility over symbolic scale. In complex systems, scale without adaptability is not growth; it is fragility disguised as success.

What this experience ultimately taught us is uncomfortable for traditional management thinking. In complex economic ecosystems, you cannot engineer outcomes. You can only design conditions: boundaries, incentives, roles, and narratives that make constructive emergence more likely than destructive collapse. The platform mattered. The technology mattered. But what mattered more was the humility to accept that once an ecosystem becomes alive, you are no longer the architect standing outside the system. You are one of the agents operating within it.

The strategic lesson for C-level leadership is this. In times of systemic disruption, competitive advantage no longer lies primarily in having the most sophisticated product or the fastest execution. It lies in the capability to shape interaction spaces across institutions, sectors, and policy domains. Leadership shifts from control to stewardship. Strategy shifts from optimisation to navigation. And success is no longer measured only by ownership, but by whether the system you helped catalyse can survive, adapt, and continue to create value even when you step back.

That, ultimately, is what Padi UMKM represents. Not a digital product success story, but a case of how leadership, strategy, and technology can be recomposed to operate effectively in a complex, adaptive economy under crisis. It is an ecosystem in motion. It is Synergy in action.

Navigating Business at the Edge of Chaos

This is a speech preparation for the CIMA & AICPA Strategic Leaders Breakfast Talk, to be held in mid-February 2026, under the theme ‘Leadership in the Age of Disruption — Strategic Leadership for Modern Finance Professionals’. I will deliver the presentation from the perspective of complexity science and complexity economics, before exploring the practical implications for management accounting professionals.

In the current economic landscape, business must be understood as the development of an ecosystem that operates as a complex adaptive system (CAS). Within this framework, autonomous agents, both internal to the firm and across broader business networks, possess the capacity for independent decision-making and activity. From this complexity perspective, phenomena such as VUCA (Volatility, Uncertainty, Complexity, and Ambiguity) and disruption are no longer viewed as external threats to be mitigated or overcome. Instead, they are recognised as engines of evolution and qualitative opportunities to redesign business architecture. Strategy shifts from the mere optimisation of saturated, linear models toward the cultivation of dynamic ecosystems that generate new value through the process of emergence.

The optimal zone for such innovation is the Edge of Chaos, which is a critical transition state where a system balances order and stability with disorder and change. It is precisely in this zone, rather than in a state of total equilibrium, where optimal innovation occurs. For the modern enterprise, the Edge of Chaos is not a threat to be avoided, but a strategic space to be occupied and, if necessary, intentionally created. Competitive advantage in this regime is defined not by scale or static efficiency, but by architectural flexibility and the velocity of learning in response to constant internal and external feedback loops.

Leadership within this complex environment requires a fundamental shift in identity toward that of an ecologist. The leader’s primary duty is no longer the top-down control of outputs, but the creation of conditions and cultures that enable teams to self-organise. This involves managing the delicate tension at the Edge of Chaos, introducing enough healthy friction to trigger innovation without descending into systemic anarchy. Rigid and brittle SOPs are replaced by simple rules or heuristics that guide autonomous decision-making amidst ambiguity. Leaders must facilitate safe-to-fail probing, i.e. launching multiple, simultaneous, low-cost experiments to detect strategic signals and opportunities that traditional analytical models inevitably miss.

Strategic management in the exponential era demands ambidextrous design, balancing the exploitation of core operations with the continuous exploration of new ventures through modular structures. This necessitates the orchestration of resources far beyond traditional organisational boundaries, incorporating partners, start-ups, and regulators into platform-based strategies. Strategy is viewed as a process of co-evolution, where the organisation constantly reinvents itself to remain congruent with a shifting environment.

Finally, Management Accounting (MA) serves as the vital navigation instrument in this journey through the Strategic Planning for Exponential Era (SPX) framework. MA must evolve to support dynamic feasibility, utilising Real Options Analysis to value investments as strategic options—the right to expand, delay, or pivot—rather than rigid, one-way capital bets. This implementation includes Agile Capital Budgeting, where funds are allocated to strategic “buckets” rather than granular, unproven projects. By abandoning the stagnation of rigid annual budgets in favour of Rolling Forecasts and Throughput Accounting, MA ensures that resource allocation is driven by real-time feedback and the velocity of value conversion. Ultimately, the most profound business developments are market-creating innovations that not only ensure sustainability but actively uplift the economy and quality of life for society.
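
To make the real-options intuition concrete, here is a minimal sketch of valuing the flexibility to delay an investment with a one-period binomial lattice. The figures, the function name, and the up/down factors are all hypothetical illustrations, not part of the SPX framework itself.

```python
# A minimal real-options sketch: valuing the flexibility to delay an investment
# using a one-period binomial lattice. All figures are hypothetical.

def option_to_delay(v0, up, down, risk_free, cost):
    """Value of waiting one period before committing capital `cost`
    to a project whose present value v0 moves up or down by the given factors."""
    vu, vd = v0 * up, v0 * down                  # project value next period
    payoff_u = max(vu - cost, 0.0)               # invest next period only if worthwhile
    payoff_d = max(vd - cost, 0.0)
    p = ((1 + risk_free) - down) / (up - down)   # risk-neutral probability
    return (p * payoff_u + (1 - p) * payoff_d) / (1 + risk_free)

npv_now = 100.0 - 110.0                          # committing today destroys value
flex_value = option_to_delay(v0=100.0, up=1.4, down=0.8, risk_free=0.05, cost=110.0)
print(f"NPV if committed today: {npv_now:.1f}")
print(f"Value of the option to delay: {flex_value:.1f}")
```

With these hypothetical numbers the project has a negative NPV if committed today, yet the flexibility to wait is clearly valuable, which is exactly the point of treating investments as strategic options rather than one-way bets.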

Synergy Value as Emergence

When considering mergers, acquisitions, alliances, or even intra-group synergies, it is useful to shift our perspective away from additive arithmetic and towards the philosophy of emergence. In complex systems, including business ecosystems as complex adaptive systems, value does not reside solely within the parts; rather, it arises through the patterned interactions between them. This emergent phenomenon is precisely what in corporate finance is labelled synergy value. In formal terms, we may describe the total incremental value of a collaboration as

ΔV = V(x; G) − ∑ᵢ Vᵢ(xᵢ)

where V(x; G) denotes the value of the whole system, generated by the vector of resources and activities x under a specific governance structure G, and ∑ᵢ Vᵢ(xᵢ) represents the sum of the values of each entity in isolation. The very fact that ΔV may be greater than zero testifies to emergence: complementarities in action, dependencies properly orchestrated, and adaptive patterns unfolding across the system.
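
As a toy illustration of this decomposition, the short sketch below compares a hypothetical joint value function (with an interaction term standing in for the coupling that governance G enables) against the sum of standalone values. The functional forms and numbers are invented purely for illustration.

```python
# Toy illustration of ΔV = V(x; G) − Σᵢ Vᵢ(xᵢ): the value function is hypothetical,
# with two entities whose joint value includes an interaction term.

import math

def value_alone(resource):
    return 10.0 * math.sqrt(resource)            # standalone value of one entity

def value_together(r1, r2, coupling=2.0):
    # Hypothetical joint value under a governance structure that couples the two
    # resource bases: the standalone values plus an interaction term.
    return value_alone(r1) + value_alone(r2) + coupling * math.sqrt(r1 * r2)

r1, r2 = 16.0, 9.0
delta_v = value_together(r1, r2) - (value_alone(r1) + value_alone(r2))
print(f"Emergent synergy ΔV = {delta_v:.2f}")    # positive: emergence, not mere addition
```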

The Levers of Emergent Synergy

Four principal levers determine whether emergent value materialises or evaporates. The first is complementarity, or what economists term supermodularity. This describes the situation in which activities reinforce each other such that the marginal return of undertaking one activity is enhanced by the undertaking of another; formally, the cross-partial derivatives are positive (∂²V/∂xᵢ ∂xⱼ > 0). It is here that the popular slogan “one plus one equals more than two” has rigorous grounding.
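
A quick numerical way to probe supermodularity is to approximate the cross-partial derivative of a candidate value function with finite differences. The value function below is hypothetical; the point is only that a positive cross-partial signals complementarity between two activities.

```python
# Numerical check of supermodularity: approximate the cross-partial ∂²V/∂x₁∂x₂
# of a hypothetical joint value function with central finite differences.

import math

def V(x1, x2):
    # Hypothetical value function with a complementarity term between x1 and x2.
    return 10 * math.sqrt(x1) + 10 * math.sqrt(x2) + 2 * math.sqrt(x1 * x2)

def cross_partial(f, x1, x2, h=1e-4):
    return (f(x1 + h, x2 + h) - f(x1 + h, x2 - h)
            - f(x1 - h, x2 + h) + f(x1 - h, x2 - h)) / (4 * h * h)

# > 0: doing more of one activity raises the marginal return of the other.
print(cross_partial(V, 16.0, 9.0))
```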

The second lever is the interdependence structure. Every collaboration has a topology of dependencies, where some assets act as complements, others as substitutes, and some nodes become bottlenecks through which the value of the entire system is channelled. In business ecosystems, mapping this structure is indispensable, for it often dictates whether modularity and flexible linkages suffice, or whether full absorption is required.

The third lever is defined by the adaptive rules of the system. A collaboration is not static; it is a complex adaptive system in which local decisions, feedback loops, and routines create new global patterns. Where local experimentation is permitted, and where feedback loops are properly designed, valuable behaviours diffuse through the organisation or alliance. Where rigidity prevails, the system is condemned to stasis, and synergy remains a theoretical promise rather than an emergent reality.

Finally, there is the matter of orchestration capacity. This refers to the dynamic capabilities of leadership—sensing opportunities, seizing them through resource allocation, and reconfiguring the system as environments change. Ashby’s principle of requisite variety reminds us that the variety of governance and decision-making tools must match the variety and volatility of the environment. Without adequate orchestration, even strong complementarities and favourable topologies may collapse under the weight of integration costs.

Applications Across Collaboration Types

In mergers and acquisitions, the choice of integration model should mirror the degree of interdependence. The celebrated Haspeslagh–Jemison framework reminds us that absorption is not always optimal; linkage or preservation may unlock more emergent value when autonomy is vital. The risk of the so-called synergy mirage lies precisely in misjudging complementarities and ignoring the time it takes for emergent patterns to stabilise. Thus, every acquisition is less a completed transaction than a hypothesis about the future, whose proof lies in the integration process.

In alliances and joint ventures, synergy takes the form of options on emergence. Here, limited commitments allow parties to test complementarities without over-committing capital. The collaborative form is well-suited to contexts of uncertainty, where exploration of emergent patterns is required. Ecosystem logic also applies: co-opetition and the management of network externalities often define the extent of emergent value.

For intra-group business synergy, emergence must be cultivated across corporate units. Here, Herbert Simon’s notion of near-decomposability becomes instructive: groups should design modular interfaces so that subsidiaries adapt locally yet align globally. To maintain cooperation, emergent rents must be shared fairly; cooperative game theory suggests the Shapley value as one method of allocating incremental value in proportion to each unit’s marginal contribution. Without such fairness, group members are tempted to defect, undermining the collaborative potential of the system.
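
For readers who want to see the Shapley allocation mechanically, here is a small sketch that averages each unit’s marginal contribution over all join orders. The coalition values are hypothetical; in practice they would come from synergy accounting estimates.

```python
# A small Shapley-value sketch for sharing emergent synergy across group units.
# The coalition values below are hypothetical and supplied by hand.

from itertools import permutations

def shapley_values(players, coalition_value):
    """Average marginal contribution of each player over all join orders."""
    totals = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = []
        for p in order:
            before = coalition_value(frozenset(coalition))
            coalition.append(p)
            after = coalition_value(frozenset(coalition))
            totals[p] += after - before
    return {p: t / len(orders) for p, t in totals.items()}

# Hypothetical characteristic function: standalone values plus pairwise synergies.
v = {
    frozenset(): 0, frozenset({"A"}): 40, frozenset({"B"}): 25, frozenset({"C"}): 15,
    frozenset({"A", "B"}): 80, frozenset({"A", "C"}): 60, frozenset({"B", "C"}): 45,
    frozenset({"A", "B", "C"}): 110,
}

print(shapley_values(["A", "B", "C"], lambda s: v[s]))
```

By construction the allocations sum to the value of the grand coalition, so the incremental value is fully distributed in proportion to marginal contributions, which is precisely the fairness property invoked above.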

Measuring and Governing Emergence

Because synergy is emergent, it resists simple enumeration. Yet it is not beyond the reach of disciplined measurement. One may begin with a complementarity map, estimating where cross-partials are most positive, and therefore where joint action may yield the greatest return. Alongside, an ecosystem dependency graph may be drawn, in the spirit of Ron Adner’s ecosystem mapping, to reveal missing complements and bottlenecks whose removal could unlock value.

Where uncertainty is high, the logic of real options should prevail. Pilot projects, staged investments, or minority stakes serve as options to explore emergent potential without risking catastrophic downside. Parallel to this, a system of synergy accounting may be implemented, in which incremental value is decomposed using Shapley allocations, thereby aligning incentives with marginal contributions to the whole.

The Philosophical Bottom Line

Synergy lives not in assets but in interactions. Corporate actions—whether a merger, an alliance, or an intra-group initiative—are best understood as interventions in a complex system. When complementarities are strong, interdependencies are designed with care, adaptive rules permit experimentation, and orchestration capacity is sufficient, emergent synergy is more than a hopeful metaphor; it becomes an observable reality. Conversely, where these levers are mismanaged, the promised “1 + 1 > 2” dissolves into disappointment, integration costs, and value destruction.

Thus, the philosophy of emergence, long a staple of complexity science, is not an academic curiosity but a practical guide to business collaboration. It teaches us that the true measure of a deal or alliance lies not in the parts themselves, but in the patterns of interaction that the collaboration enables.

The Flawed Global Ecosystem Strategy

Last century, the US stood as the pinnacle of industrial power. With unmatched manufacturing capacity, cutting-edge innovation, and a dynamic domestic labour force, the US not only produced at scale, but also created a vast middle class through industrial employment. But since the early 21st century, this dominance has eroded. Despite the continued global success of Apple, Microsoft, and others, the US found its industrial core hollowed out. This paradox—where the strategy won, but the nation did not—is at the heart of this exploration.

The US led the global shift toward liberalisation and globalisation, embracing free trade, deregulation, and offshoring as strategies for economic growth and competitive advantage. These ideas crystallised during the Reagan-Thatcher era and were institutionalised in policies such as NAFTA and the support for China’s entry into the WTO. The logic was simple: relocate labor-intensive manufacturing to lower-cost countries, focus domestically on high-value services and innovation, and reap the benefits of global efficiency.

For US corporations, this approach worked magnificently. Apple built one of the most valuable ecosystems in the world, with tightly integrated design, software, services, and hardware. But much of this hardware was manufactured and assembled overseas, particularly in China. Microsoft dominated software and enterprise services, but its global cloud and platform ecosystem increasingly depended on international data centers, developer networks, and supply chains that were vulnerable to political shifts.

What became apparent over time was that these ecosystem-based strategies—while brilliant in achieving scale, market control, and profitability—were fundamentally fragile. They were built on assumptions of a stable global environment, unrestricted cross-border flows of labour, capital, and data, and a geopolitical consensus that no longer exists. The COVID-19 pandemic, the US-China trade war, and the rise of protectionist and nationalist policies globally exposed just how brittle these supply chains and platform dependencies were.

The heart of the flaw is in the over-optimisation for efficiency at the expense of resilience. By offshoring critical manufacturing, the US lost not only jobs but also industrial knowledge, logistics infrastructure, and the ability to rapidly pivot production domestically in times of crisis. This strategic vulnerability became clear when shortages of semiconductors, PPE, and other essentials during the pandemic brought entire industries to a standstill.

Moreover, the US model of capitalism encouraged short-termism. Public companies were driven to maximise quarterly earnings and shareholder returns, often by cutting labor costs or outsourcing rather than reinvesting in domestic capacity. Labor unions weakened significantly, and with them, the political and social infrastructure that once supported a strong working class. The cultural shift toward a “knowledge economy” reinforced the idea that physical production was less valuable than digital platforms, intellectual property, and financial engineering.

This ideology extended into the UK as well, which closely mirrored US strategies in economic liberalisation. Under Thatcher in the 1980s, the UK privatised major industries, deregulated finance, crushed unions, and repositioned itself as a global hub for services—especially financial services. The “Big Bang” of 1986 opened up London’s financial markets, turning the City into a magnet for global capital. Much like the US, the UK allowed its manufacturing base to atrophy in favour of high-value services concentrated in the Southeast, particularly London.

However, the UK, unlike the US, lacked the scale, resource diversity, and global technological dominance to buffer the negative effects of this transition. The result was stark regional inequality, declining productivity, and chronic underinvestment in infrastructure and education in much of the country. Brexit, in many ways, was the political expression of this economic alienation—a rebellion against globalisation, centralisation, and the perception of being “left behind.”

In both countries, we see a core contradiction: while companies triumphed globally, the broader national economies suffered from fragility, inequality, and a loss of sovereignty in key strategic sectors. The ecosystem-based strategies of firms like Apple and Microsoft continue to generate massive returns, but they do so by depending on fragile geopolitical arrangements, low-cost labor overseas, and complex, just-in-time logistics networks that are increasingly prone to disruption.

The irony is that ecosystems, as conceptualised in nature, thrive on diversity, redundancy, and mutual support. Business ecosystems, as built by the tech giants, often lack these qualities. They tend toward centralisation, dominance, and efficiency, making them look more like monocultures than true ecosystems. When stress hits—in the form of sanctions, pandemics, or trade wars—these systems do not bend; they break.

So is the ecosystem model flawed? Not entirely. It remains one of the most powerful frameworks for value creation in a networked economy. But it needs to evolve. Firms must build ecosystems that are not just efficient, but resilient and adaptable. This means diversifying supply chains, investing in local capabilities, supporting the long-term health of partners, and accounting for political and environmental risks.

Nations, too, must rethink their approach. A return to protectionism is not the answer, but neither is blind faith in market liberalism. Strategic sectors must be rebuilt or supported domestically not only for economic competitiveness but for national resilience. Policies must incentivise long-term investment, regional regeneration, and industrial policy aligned with innovation.

Ultimately, the story of the past few decades is not that globalization and liberalization were inherently wrong. Rather, they were applied too narrowly, with too little foresight, and with insufficient regard for the long-term health of national economies. The US and the UK offer lessons—both cautionary and hopeful—for any country navigating the next era of global business, where resilience, sovereignty, and inclusive prosperity will be just as important as efficiency and innovation.

Information at the Heart of Complexity

In The Complex World, a book written by David Krakauer as an introduction to the foundations of Complexity Theory, a striking passage in the chapter on Information, Computation, and Cognition declares: “information and information processing lie at the heart of the sciences of complexity.” This powerful statement not only encapsulates the essence of complexity science but also invites us to explore how foundational ideas from information theory and historical philosophy have reshaped our understanding of the intricate systems that govern nature, technology, and society.

At the forefront of this intellectual revolution stands Claude Shannon, whose seminal 1948 work laid the groundwork for modern information theory. Shannon introduced the concept of quantifying information through measures such as entropy and redundancy, offering a robust mathematical framework to analyse how messages are encoded, transmitted, and decoded. His groundbreaking insights transformed the way we understand communication and paved the way for examining complex systems through the lens of information exchange.
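
As a small illustration of Shannon’s measure, the snippet below computes the entropy of a few symbol sequences. It is a minimal sketch of the formula H = −Σ p·log₂(p) rather than anything taken from Shannon’s 1948 paper itself.

```python
# Minimal illustration of Shannon entropy H = −Σ p·log2(p), the measure Shannon
# introduced to quantify the information content of a message source.

import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))        # 0 bits per symbol: perfectly predictable
print(shannon_entropy("abababab"))        # 1 bit per symbol
print(shannon_entropy("the quick brown fox jumps over the lazy dog"))
```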

[Image: Claude Shannon]

Building on Shannon’s legacy, early pioneers like Norbert Wiener in cybernetics explored how feedback loops and control mechanisms underpin both living organisms and machines. These studies revealed that all systems — whether biological, electronic, or social — operate through continuous cycles of processing and exchanging information. This realisation led to a shift in perspective: rather than viewing components in isolation, researchers began to see the dynamic interactions and feedback as the true drivers of emergent behavior.

Central to complexity science is surely the idea that complex systems are composed of numerous interacting parts whose collective behavior gives rise to phenomena that are not apparent from the properties of individual components. The complexity of information itself reflects the system’s potential for emergence. As information becomes more intricate, its diverse possibilities create the fertile ground for spontaneous order and structure to arise. In this sense, the complexity embedded within information mirrors the layered reality it represents.

Analytically, viewing systems as networks of information processors has led to the development of powerful computational models. Cellular automata, agent-based simulations, and network analyses allow scientists to investigate how simple local rules of interaction can culminate in sophisticated global patterns. These models quantify the flow of information and reveal that small changes in how data is processed can lead to dramatic shifts in system behavior—underscoring the role of information in driving emergent phenomena.
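
The cellular-automaton idea can be shown in a few lines: an elementary automaton (Wolfram’s rule 110, chosen here purely as an illustrative example) updates each cell from only its two neighbours, yet the printout exhibits globally structured patterns.

```python
# A minimal elementary cellular automaton (Wolfram rule 110): each cell's next
# state depends only on itself and its two neighbours, yet global structure emerges.

def step(cells, rule=110):
    n = len(cells)
    nxt = []
    for i in range(n):
        # Encode the 3-cell neighbourhood as a number 0..7 and look up its new state
        # in the corresponding bit of the rule number (periodic boundary conditions).
        neighbourhood = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        nxt.append((rule >> neighbourhood) & 1)
    return nxt

cells = [0] * 31 + [1] + [0] * 31          # a single "on" cell in the middle
for _ in range(20):
    print("".join("█" if c else " " for c in cells))
    cells = step(cells)
```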

Furthermore, this perspective is enriched by concepts such as Holland’s signals and boundaries, which describe how interactions at the edges of systems give rise to organised patterns. Signals act as the carriers of information across boundaries, defining the interfaces where local interactions take place. These interactions are critical in establishing the rules by which complex behaviors emerge, demonstrating that even at the micro-level, the quality and complexity of information can have far-reaching implications on the overall structure and dynamics of a system.

Ultimately, the convergence of Shannon’s revolutionary insights, the pioneering work in cybernetics, and the evolution of systems theory all lead us to the compelling conclusion mentioned above: information and information processing lie at the heart of the sciences of complexity. This understanding not only provides a unifying framework across disciplines but also highlights how the inherent complexity of information — measured in its entropy and intricate signals — mirrors and shapes the emergent realities of our world.

Signals and Boundaries

John Holland’s “Signals and Boundaries” has become a touchstone in the study of complex adaptive systems (CAS), offering an intuitive way to understand how local interactions give rise to emergent behavior. At its core, Holland’s framework posits that signals (the carriers of information) and boundaries (the limits that define and protect modules) play a pivotal role in the organisation, adaptation, and evolution of complex systems. His insights have helped shape our understanding of how simple, localised exchanges can lead to intricate global patterns.

The framework’s influence is widespread, resonating strongly within academic circles including the SFI. Scholars have incorporated Holland’s ideas into broader discussions on network theory and modularity, using them as a bridge between traditional adaptation models and more modern computational approaches. By emphasising the dual roles of communication through signals and compartmentalisation via boundaries, Holland provided researchers with a practical toolkit for analysing the dynamics of ecosystems, technological platforms, and social networks.

[Image: Holland’s Signals and Boundaries, read at Soekarno-Hatta International Airport]

A significant strength of Holland’s theory lies in its capacity to illustrate how local interactions can generate emergent complexity. When agents within a system interact, they exchange signals that serve as feedback loops—adjusting behavior and influencing neighboring agents. Meanwhile, boundaries help maintain structure by isolating specific interactions from external noise, allowing subsystems to develop independently yet remain interconnected. This delicate balance between isolation and connectivity is what drives the self-organisation and adaptation observed in complex systems.
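
The sketch below is a loose, hypothetical illustration of that balance rather than Holland’s formal model: agents average the signals circulating freely inside their own module, while only a small leakage parameter lets signals cross boundaries, so modules adapt semi-independently yet slowly co-align.

```python
# Loose, hypothetical sketch of the signals-and-boundaries intuition: agents exchange
# signals freely inside a boundary, but only a small fraction leaks across boundaries.

import random

random.seed(1)

def simulate(n_modules=3, agents_per_module=5, steps=50, leakage=0.05):
    # Each agent holds a scalar "behaviour"; the signals it receives are the
    # behaviours of others, mostly from inside its own module.
    state = [[random.random() for _ in range(agents_per_module)] for _ in range(n_modules)]
    for _ in range(steps):
        new_state = []
        for m, module in enumerate(state):
            inside = sum(module) / len(module)                     # signals within the boundary
            outside = sum(sum(o) for i, o in enumerate(state) if i != m) / (
                (n_modules - 1) * agents_per_module)               # weak cross-boundary signal
            target = (1 - leakage) * inside + leakage * outside
            new_state.append([a + 0.5 * (target - a) for a in module])
        state = new_state
    return state

for m, module in enumerate(simulate()):
    print(f"module {m}: " + ", ".join(f"{a:.2f}" for a in module))
```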

However, the notion that complexity is solely the product of local interactions has its critics. Some argue that focusing exclusively on bottom-up processes might neglect the role of global influences and top-down causation. In many systems, overarching constraints, environmental factors, and collective dynamics impose patterns and behaviors that local interactions alone cannot fully explain. This perspective contends that emergent phenomena may also be shaped by these global forces, suggesting a need for models that integrate both micro-level interactions and macro-level structures.

One contrasting perspective within the complexity paradigms is the idea of strong emergence. Proponents of strong emergence assert that certain higher-level properties of a system are fundamentally irreducible to the interactions of its constituent parts. In this view, while local interactions are essential, they cannot entirely account for phenomena that manifest at the macro scale. The emergent behaviors observed in complex systems may require explanations that go beyond the sum of local interactions, implying that there are holistic properties at play that necessitate a different conceptual approach.

There is also a growing consensus among some researchers that a dual approach—one that synthesises both local and global perspectives—is necessary for a complete understanding of complexity. Network theorists and systems dynamicists, for example, have highlighted the importance of long-range correlations and global feedback loops that complement local interactions. This integrated approach recognises that while signals and boundaries are crucial, the interplay with broader systemic forces can drive self-organisation and adaptation in ways that are not captured by local dynamics alone.

Holland’s signals and boundaries framework remains a seminal contribution to complexity science, celebrated for its clarity and applicability across diverse domains. It has provided a powerful lens for examining how decentralised, local interactions can lead to emergent behavior—a notion that has profoundly influenced our understanding of ecosystems, technological platforms, and social networks. Yet, as our grasp of complex systems deepens, it is equally important to acknowledge and incorporate contrasting views, such as the roles of strong emergence and global influences, to capture the full richness of complexity. This ongoing dialogue not only enriches the theoretical landscape but also drives innovation in how we model and manage complex systems in practice.

Cities Development as CAS

The research titled “Inter-City Firm Connections and the Scaling of Urban Economic Indicators” by Yang, Jackson, and Kempes, published in PNAS Nexus (Nov 2024), presents a fresh perspective on how cities generate economic output. While traditional urban scaling theories focus on how local, intra-city interactions drive economic productivity, this study argues that inter-city connections — especially through multinational firms — play an equally, if not more, significant role. By analysing GDP data from cities in the US, EU, and PRC, alongside the Global Network Connectivity (GNC) of multinational firms, the study reveals that cities with higher inter-city connectivity exhibit higher-than-expected GDP, even after accounting for population size. This finding challenges the conventional idea that urban scaling is driven solely by local social interactions, offering a new lens for understanding complexity in urban systems.

This study is an example of how complexity science can be applied to real-world systems like cities. Cities, as complex adaptive systems (CAS), exhibit emergent behaviours, such as superlinear scaling of GDP, where larger cities tend to be disproportionately more productive. Traditionally, this emergent property was attributed to denser local social interactions. However, the authors introduce a new dimension of complexity by demonstrating how inter-city firm connections serve as an additional mechanism for economic emergence. Using the concept of networked systems, cities are modelled as nodes connected by firms, and the GNC score quantifies the strength of these connections. The research shows that GDP is influenced not just by a city’s local population but also by its position within this global network. This insight extends the complexity science framework by highlighting the role of cross-city organisational linkages in shaping global economic output.

The study also provides methodological advances that enrich the complexity science toolkit. It uses Scale-Adjusted Metropolitan Indicators (SAMI) to compare how cities “overperform” or “underperform” in GDP relative to expectations. This allows for a nuanced view of which cities benefit most from inter-city connections. Furthermore, the use of multilevel regression models that incorporate both local (population) and global (GNC) factors reveals the nonlinear dynamics at play. Such nonlinear scaling, where population alone cannot explain GDP growth, suggests the presence of feedback loops where better-connected cities become more prosperous, and prosperous cities become better connected. These insights underscore how complexity science can offer more accurate, multi-layered models of urban growth, moving beyond simplistic population-based approaches.
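
On synthetic data (not the authors’ dataset, and assuming NumPy is available), the following sketch mimics the two-step logic: fit the log-log scaling relation, treat the residuals as SAMI-style deviations, and then regress those deviations on a hypothetical connectivity score.

```python
# Sketch of the scaling logic on synthetic data: fit log(GDP) = log(Y0) + β·log(pop),
# take the residuals as SAMI-style deviations, then ask how much a connectivity
# score explains of those deviations. Numbers are generated, not from the paper.

import numpy as np

rng = np.random.default_rng(0)

n_cities = 200
log_pop = rng.uniform(11, 17, n_cities)                    # ln population
gnc = rng.normal(0, 1, n_cities)                           # hypothetical connectivity score
beta_true, gamma_true = 1.15, 0.10
log_gdp = 2.0 + beta_true * log_pop + gamma_true * gnc + rng.normal(0, 0.1, n_cities)

# Step 1: classic scaling fit (β > 1 means larger cities overperform per capita).
X = np.column_stack([np.ones(n_cities), log_pop])
coef, *_ = np.linalg.lstsq(X, log_gdp, rcond=None)
sami = log_gdp - X @ coef                                  # scale-adjusted residuals

# Step 2: regress the residuals on connectivity, mimicking the inter-city argument.
Xg = np.column_stack([np.ones(n_cities), gnc])
coef_g, *_ = np.linalg.lstsq(Xg, sami, rcond=None)

print(f"estimated scaling exponent β ≈ {coef[1]:.2f}")
print(f"connectivity effect on SAMI residuals ≈ {coef_g[1]:.3f}")
```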

The implications of this research go beyond academic curiosity. For policymakers, it suggests that urban economic development strategies should prioritise enhancing global connectivity. Cities can benefit from strengthening ties with multinational firms, facilitating cross-city collaborations, and becoming key nodes in the global urban network. This is a shift from the classic focus on improving only local conditions, such as infrastructure or intra-city mobility. For complexity science, this study exemplifies how theories of self-organisation, emergence, and adaptive networks can be operationalised in practical, high-impact research. The work highlights the potential for developing a more comprehensive urban scaling model that integrates both local and global processes. By bridging concepts from complexity science with urban development, the study opens new possibilities for future research into how global interconnections influence local outcomes, from economic growth to social inequality.

Source: Vicky Chuqiao Yang, Jacob J. Jackson, Christopher P. Kempes, 2024. Inter-city firm connections and the scaling of urban economic indicators. PNAS Nexus 3(11). DOI: 10.1093/pnasnexus/pgae503

Complexity Science for AI?

AI technologies are primarily driven by advancements in machine learning, particularly deep learning and natural language processing. An example is ChatGPT, which is built on the Transformer architecture and employs deep neural networks with attention mechanisms to process and generate human-like text. While the architecture of these models is inherently complex, characterised by vast parameters and intricate layers, they do not rely heavily on Complexity Science as a core framework in their design or functionality.

There are actually some indirect connections between AI and Complexity Science. Deep neural networks, for instance, can be conceptualised as complex systems where simple components (neurons) interact to produce emergent behaviours, such as understanding and generating language. While Complexity Science provides valuable insights into such emergent phenomena, these principles are not the primary foundation for AI model development. Complexity Science concepts are also applied in training optimisation, where researchers study high-dimensional optimisation landscapes, convergence properties, and loss surface dynamics to improve the stability and efficiency of training processes. Interpretability in AI benefits from complexity-based approaches like network theory and information theory, which help uncover how information flows through neural networks. Another area of overlap is robustness and generalisation, where ideas from Complexity Science and statistical mechanics, such as phase transitions and criticality, aid in understanding why large, over-parameterised models perform well in real-world scenarios.

Despite these connections, we must acknowledge that Complexity Science has not been a major driving force in the development of AI technologies like ChatGPT. The creation of such models relies more on advances in neural network architectures, data processing, and algorithmic optimisation than on the theoretical foundations of Complexity Science.

There are some opportunities if we can enrich AI with principles from Complexity Science. It could enhance the adaptability, robustness, and interpretability of AI systems by providing better ways of managing dynamic, non-linear interactions and uncertainty in real-world environments. This integration could enable the creation of AI models that handle emergent behaviours more effectively, excel in predictive analytics, and exhibit greater resilience, moving the field closer to achieving general, human-like intelligence.

The challenges lie mainly on the side of Complexity Science. One major limitation is its lack of standardised, predictive methods that can be broadly applied to complex systems. Most Complexity Science models are descriptive or exploratory, emphasising qualitative understanding over quantitative prediction. Additionally, complex systems are often highly context-dependent, making it difficult to generalise findings or develop uniform approaches. Computational intensity is another barrier; many complexity-based models, such as agent-based simulations, struggle to scale to the large datasets typical in AI. Furthermore, Complexity Science has historically focused on theoretical and simulation-based methods, while AI thrives on data-driven approaches, creating a methodological gap. Finally, the absence of a unified theoretical framework in Complexity Science still makes it challenging to translate its principles into practical, standardised tools for AI.

Complexity Science offers profound insights into the behaviour of complex systems but remains underdeveloped in areas critical for its integration with AI, such as predictive capability, scalability, and standardisation. As interdisciplinary research progresses and computational capabilities grow, these limitations may be addressed, unlocking new opportunities for AI systems to benefit from the rich, nuanced perspectives of Complexity Science.

Non-Accumulative Adaptability

Exploring the ideas about adaptation and emergence as a part of ecosystem (i.e. complex adaptive system — CAS) development, I think it is more exciting when we see it through the combined lenses of CAS, Schumpeter, Kuhn, Foucault, and Lyotard. Each of these perspectives explores how change does not just happen bit by bit, but instead in bold and disruptive leaps, as transformations that completely alter the playing field, whether we’re talking about economies, sciences, societies, or even our basic understanding of the world.

CAS implies that change is a matter of adaptive cycles — cycles of growth, accumulation, collapse, and renewal. An ecosystem grows and accumulates resources until it hits a limit. Then its whole structure becomes unsustainable, collapses, and reboots in a new way — it reorganises itself with fresh relationships and opportunities. This cycle is anything but smooth; it’s like a forest fire clearing the way for new growth, and it’s essential for resilience and long-term adaptability. This model resonates closely with Schumpeter’s idea of creative destruction in economies. Schumpeter saw capitalism as a system where innovation doesn’t build up neatly on top of the old but bulldozes it — new technologies, businesses, and products disrupt markets, toppling established companies and paving the way for the next wave of growth. For Schumpeter, entrepreneurs drive this cycle, constantly reinventing the economy and shifting the landscape in unexpected ways.

Thomas Kuhn brought a similar idea into science with his concept of paradigm shifts. In Kuhn’s view, science isn’t a smooth, cumulative process of adding one discovery to the next. Instead, it moves forward in fits and starts. Scientists work within a “paradigm” — a shared framework for understanding the world — until enough anomalies build up that the whole system starts to feel shaky. At that point, someone comes along with a radically new idea that doesn’t just tweak the existing framework but replaces it. Kuhn’s paradigm shift is a profound reimagining of the rules, kind of like Schumpeter’s creative destruction but applied to the way we think and know. It’s as if science periodically wipes the slate clean and rebuilds itself from a fresh perspective.

As a Gen-Xer, I must also mention Michel Foucault. Foucault offered a more historical spin on these ideas with his concept of epistemes. Foucault believed that every era has its own underlying structure of knowledge, shaping how people perceive and think about the world. These epistemes don’t evolve smoothly; they’re punctuated by abrupt shifts where the entire basis of understanding changes. Just like in a Kuhnian paradigm shift, when a new episteme takes over, it fundamentally changes what questions are even worth asking, as well as who holds power in the discourse. In Foucault’s view, knowledge isn’t just a collection of facts piling up—it’s tied to shifts in power and perspective, with each era replacing the last in a way that’s not fully compatible with what came before.

Then there’s Jean-François Lyotard, who takes the idea a step further by challenging the very idea of cumulative “progress” altogether. As a postmodernist, Lyotard argued that the grand narratives that used to make sense of history, science, and knowledge are breaking down. Instead of one single, upward trajectory, we’re left with multiple, fragmented stories that don’t fit neatly together. Knowledge, for Lyotard, is no longer a matter of moving toward some ultimate truth but an evolving patchwork of perspectives. This rejection of a single narrative echoes Schumpeter’s and Kuhn’s visions of disruption and replacement over seamless continuity. Lyotard’s work suggests that, in knowledge and culture alike, stability is always provisional, subject to the next seismic shift in understanding.

Let’s imagine they can talk together

So when we look at all these thinkers together, a fascinating picture emerges. In CAS, Schumpeter’s economics, Kuhn’s science, Foucault’s history, and Lyotard’s philosophy, progress is not about slowly stacking up ideas or wealth. Instead, it’s about cycles of buildup, breakdown, and renewal — each shift leaving behind remnants of the old and bringing forth something fundamentally new. This kind of progress isn’t just unpredictable; it’s fueled by disruption, tension, and revolution. These thinkers collectively remind us that the most transformative changes come from breaking with the past, not from adding to it. Progress, in this view, is a story of radical leaps, creative destruction, paradigm shifts, and fresh starts—where each new phase is a bold departure from what came before.

Enterprise Architecture for Digital Transformation

Lapalme discussed “Three Schools of Thought on Enterprise Architecture” in IT Professional in 2012. Korhonen and Halén explored this further in “Enterprise Architecture for Digital Transformation”.

Schools of Thought on EA:

  • The Enterprise IT Architecting (EITA) school views enterprise architecture as “the glue between business and IT”. Focusing on enterprise IT assets, it aims at business-IT alignment, operational efficiency and IT cost reduction. It is based on the tenet that IT planning is a rational, deterministic and economic process. EA is perceived as the practice for planning and designing the architecture.
  • The Enterprise Integrating (EI) school views enterprise architecture as the link between strategy and execution. EA addresses all facets of the enterprise in order to coherently execute the strategy. The environment is seen both as a generator of forces that the enterprise is subject to and as something that can be managed. EA is utilized to enhance understanding and collaboration throughout the business.
  • The Enterprise Ecological Adaptation (EEA) school views EA as the means for organizational innovation and sustainability. The enterprise and its environment are seen as coevolving: the enterprise and its relationship to the environment can be systemically designed so that the organization is “conducive to ecological learning, environmental influencing and coherent strategy execution.” EA fosters sense making and facilitates transformation in the organization.

Levels of Enterprise Architecture

  • Technical Architecture (AT) has an operational focus on reliability and present day asset utilization and is geared to present-day value realization. This is the realm of traditional IT architecture, information systems design and development, enterprise integration and solution architecture work. AT also addresses architectural work practices and quality standards, e.g. architectural support of implementation projects, development guidelines, and change management practices. In terms of organizational structure, AT would pertain to the technical level of organization, where the products are produced or services are provided.
  • Socio-Technical Architecture (AS) plays an important role as the link between strategy and execution. The business strategy is translated to a coherent design of work and the organization so that enterprise strategy may be executed utilizing all its facets, including IT. AS is about creating enterprise flexibility and capability to change rather than operational optimization: the focus on reliability is balanced with focus on validity in anticipation of changes, whose exact nature cannot be accurately predicted. AS would pertain to the managerial level of organization, where the business strategy is translated to the design of the organization.
  • Ecosystemic Architecture (AE) is an embedded capability that not only addresses the initial design and building of a robust system but also the successive designs and continual renewal of a resilient system. The architecture must allow for co-evolution with its business ecosystem, industry, markets, and the larger society. AE would pertain to the institutional level of organization, where the organization relates to its business ecosystem, industry, markets, and the larger society.

Adaptation and Maladaptation

Source: Korhonen, J.J., Halén, M. 2017. Enterprise Architecture for Digital Transformation. IEEE 19th Conference on Business Informatics. DOI: 10.1109/CBI.2017.45
