Designing the Tech Future: A Two-Part Conversation with Olaf J. Groth on Innovation, Power, and Global Governance – Part 2

In Part I of this exclusive interview, Olaf J. Groth explored the shifting geopolitics of innovation, outlining the forces that will shape the global tech landscape in the coming decades. He identified the “6Cs” — from cognitive technologies and CRISPR to climate change and China–US rebalancing — as key drivers of transformation, while stressing the foundational role of data, energy, and technical talent in the race for AI, quantum computing, and digital infrastructure leadership. Drawing comparisons between the US, China, and the EU, he highlighted each region’s strengths and vulnerabilities, examined the interplay between governments and private innovators, and addressed the opportunities and risks for emerging economies. Groth also warned of the challenges posed by fragmented “splinternets,” emphasising the need for new global accords on AI, data, and cybersecurity to ensure innovation thrives in an interconnected yet geopolitically tense world.

In Part II, Olaf J. Groth explores techno-globalism, the cross-border flow of technology, talent, and data, and the governance challenges it creates. He discusses the need for adaptive institutions, cross-border standards for AI and data, and the FLP-IT framework for resilient tech strategies. He also highlights the role of “design activist leaders,” strategic insights for companies, and a proposed global policy for algorithmic accountability to secure innovation, trust, and digital sovereignty.

PART II: The Rise of Techno-Globalism – Can Governance Catch Up?

WFA: You’ve written about “techno-globalism.” How do you define it, and why is it so critical in today’s era of borderless technologies?

OJG:  Techno-globalism, as I define it, is the cross-border flow and interdependence of technologies, talent, data, and innovation systems that increasingly shape our global economy, governance structures, and societal norms. It’s the counterweight to techno-nationalism—where states seek to control and weaponize technology for strategic advantage.

What makes techno-globalism so critical today is that we’re living in an era where technologies like AI, quantum computing, and biotech don’t respect national borders. Supply chains are global, talent is distributed, and data flows are constant. No single country can—or should—go it alone. The complexity and scale of our shared challenges, from climate change to pandemics to digital security, demand collaborative technological solutions.

At the same time, techno-globalism raises urgent questions about digital sovereignty, ethical governance, and the balance between innovation and control. Navigating this tension is one of the defining leadership challenges of our time.

WFA: Are current governance models—like those of the UN or WTO—equipped to regulate frontier technologies like AI and quantum, or do we need entirely new structures?

OJG:  I’d say current governance models like the UN, WTO, and even regional frameworks were built for a different era—one defined by slower, more predictable industrial progress and nation-state-centric policymaking. They’re increasingly ill-equipped to regulate the pace, complexity, and boundary-blurring nature of frontier technologies like AI and quantum. These institutions require more than an update; they need a thorough upgrade, if not a redesign.

These technologies evolve exponentially, transcend borders, and involve actors—corporations, labs, startups, even autonomous agents—that often operate outside traditional diplomatic channels. That doesn’t mean we discard existing institutions, but we do need new adaptive, polycentric structures that can govern in real time, across domains, and with input from a broader set of stakeholders—technologists, ethicists, civil society, and the private sector.

I’ve argued that this is not just a governance challenge; it’s a legitimacy challenge. If we don’t update our mechanisms for collective decision-making, we risk a vacuum—one that could be filled by digital authoritarianism, regulatory fragmentation, or techno-feudal power concentrations. The future demands governance that’s anticipatory, inclusive, and as agile as the technologies it aims to steward.

WFA: Can you elaborate on what cross-border governance might look like for algorithms and data standards? Is this realistic in today’s divided world?

OJG:  As I’ve written and spoken about, cross-border governance for algorithms and data standards isn’t just aspirational—it’s essential. Algorithms increasingly shape decisions across finance, healthcare, defense, and public discourse. Data fuels those algorithms, and both move fluidly across borders, often without the consent or awareness of the individuals or societies affected.

What this governance might look like is a layered, modular system—not a one-size-fits-all treaty, but a framework of interoperable standards and norms. Think of it as a digital Bretton Woods 2.0: agreements on transparency, accountability, and privacy that can be adapted locally but are rooted in shared principles. These could be stewarded by coalitions of like-minded countries, tech companies, and civil society actors, with auditability and conditional access baked in. It’s also where mechanisms like algorithmic passports, data trusts, or dynamic consent models come into play.
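To make one of these mechanisms concrete, here is a minimal sketch in Python of what an “algorithmic passport” record might carry. The AlgorithmicPassport and AuditAttestation names and fields are hypothetical illustrations of the idea, not an existing standard.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AuditAttestation:
    """One independent audit result against a named rule set (illustrative)."""
    auditor: str        # accredited third-party auditor
    standard: str       # e.g. a jurisdiction's transparency rule set
    passed: bool
    audit_date: date

@dataclass
class AlgorithmicPassport:
    """Hypothetical 'passport' that travels with a deployed model across borders."""
    model_id: str
    provider: str
    intended_use: str                    # declared purpose, e.g. "credit scoring"
    data_provenance: list[str]           # jurisdictions / licenses of training data
    consent_basis: str                   # e.g. "dynamic consent via a data trust"
    attestations: list[AuditAttestation] = field(default_factory=list)

    def admissible_in(self, required_standards: set[str]) -> bool:
        """Conditional access: admit the model only if it carries a passing
        attestation for every standard the receiving jurisdiction requires."""
        passed = {a.standard for a in self.attestations if a.passed}
        return required_standards.issubset(passed)

# Illustrative use: a regulator checks the passport against its own requirements.
passport = AlgorithmicPassport(
    model_id="credit-risk-v3", provider="ExampleCo",
    intended_use="credit scoring",
    data_provenance=["EU (licensed)", "US (first-party)"],
    consent_basis="dynamic consent via a data trust",
    attestations=[AuditAttestation("AuditLabX", "transparency/logging", True, date(2025, 1, 15))],
)
print(passport.admissible_in({"transparency/logging"}))                 # True
print(passport.admissible_in({"transparency/logging", "bias-audit"}))   # False
```

A receiving regulator would call something like admissible_in() with its own required standards; that is one way the “auditability and conditional access baked in” idea could take data-structure form.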

Is it realistic in today’s fragmented geopolitical climate? It’s hard—but fragmentation is precisely why it’s urgent. If we don’t build connective tissue across jurisdictions, we risk regulatory chaos, digital protectionism, or worse—techno-authoritarian lock-in. Techno-globalism demands that we find ways to cooperate, even amid competition. The stakes are too high to leave governance as an afterthought.

WFA: In The Great Remobilization, you discuss “design activist leaders.” What qualities or mindsets define such a leader in a world of accelerating tech disruption?

OJG:  In The Great Remobilization, we define “design activist leaders” (DALs) as those who don’t just react to disruption – they shape its direction with intention, foresight, and moral clarity. These leaders recognize that we’re living in a liminal phase – a period between worlds – where old institutions are eroding and new systems are not yet fully formed. In that void, leadership becomes a design act. DALs pick very selectively which parts of old systems to keep and which to recombine or reinvent before they put new structures and systems in place. Four qualities and mindsets stand out:

Zeroth Principles Discovery: Assumes that nothing is impossible and refuses to be limited by the constraints of current systems. DALs are able to see, envision, and imagine new building blocks for better systems. They are not beholden to the well-trodden path of first principles and well-established logic, but create new ones.

Systems Diagnostic & Foresight: Thinks in actor systems rather than industries, territorial boundaries, or narrow market segments. Understands not only the first-order but also the second- and third-order effects of decisions and actions.

Cross-Tribal Network Empathy: Empathizes with the perspectives and needs of others, appeals to different tribes, builds bridges, and incorporates them into solution models. Rather than acting as rugged-individual “hero” archetypes who pronounce “my way or the highway,” DALs communicate that understanding complexity can lead to greater agility.

Hybrid Trust Building: Integrates nodes across physical vs. virtual worlds, Web2 platform vs. Web3 protocol paradigms, and global institutions vs. clubs and tribes into business strategy or policy design.

In a world of accelerating tech disruption, design activist leaders don’t wait for permission or perfect information. They act with purpose, prototyping the future while staying anchored in values that transcend any single technology cycle.

WFA: How can the FLP-IT framework help policymakers develop more agile and resilient technology strategies?

OJG:  As we’ve written in The Great Remobilization, the FLP‑IT framework is a strategic foresight tool designed precisely for moments of volatility and rapid technological disruption – exactly what policymakers are facing today. Combined with our Design Activist Leader framework, it becomes a guide for strategic leadership, not just for thinking and planning.

FLP‑IT helps policymakers build more agile and resilient strategies by forcing a structured yet flexible engagement with complexity: scanning the Forces reshaping the landscape, discerning the new Logic they create, mapping the Phenomena that emerge from them, assessing their Impact on institutions and stakeholders, and Triaging priorities for action.

This model doesn’t offer a fixed blueprint—it offers a mindset and method for navigating ambiguity. It empowers policymakers not to simply react to disruption, but to design through it—proactively shaping a more coherent, values-driven, and interoperable technology future.
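As a minimal illustration of how a policy team might work with the framework in practice, here is a sketch in Python that captures one FLP‑IT pass as structured data. The FLPITScan class, its fields, and the example entries are hypothetical and are not taken from the book.

```python
from dataclasses import dataclass, field

@dataclass
class FLPITScan:
    """One FLP-IT pass for a policy domain (illustrative structure only)."""
    domain: str                  # e.g. "national AI compute strategy"
    forces: list[str]            # macro forces reshaping the landscape
    logic: str                   # the new strategic logic those forces create
    phenomena: list[str]         # concrete developments to watch
    impact: dict[str, str]       # stakeholder -> expected effect
    triage: list[str] = field(default_factory=list)  # ranked priorities for action

# Hypothetical example of a single scan a policy team might maintain and revisit.
scan = FLPITScan(
    domain="national AI compute strategy",
    forces=["export controls", "energy constraints", "frontier-model scaling"],
    logic="compute access becomes a lever of both competitiveness and leverage",
    phenomena=["sovereign AI cloud initiatives", "chip subsidy races"],
    impact={"regulators": "pressure for interoperable standards",
            "industry": "rising compliance and localization costs"},
    triage=["secure allied compute partnerships", "fund audit and assurance capacity"],
)
print(scan.triage[0])  # top priority after triage
```

Running such a scan per policy domain, and revisiting it as forces shift, is one way to keep the mindset and method adaptive rather than static.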

WFA: You’ve spoken with military, corporate, and civic leaders. What patterns of concern or opportunity do you hear most frequently from them?

OJG:  In my conversations with military, corporate, and civic leaders across regions, a few patterns consistently emerge – both in terms of deep concern and strategic opportunity.

First, there’s a shared anxiety about losing control of complexity. Whether it’s generals grappling with autonomous systems in warfare, CEOs facing AI-driven business model disruption, or civic leaders navigating the misinformation crisis, they’re all contending with exponential technologies outpacing institutional readiness. The sense that our governance, ethics, and economic models were built for a slower era comes up again and again. There’s a reason we saw a record number of CEO firings or resignations last year in the US. Things are just getting too complex, too accelerated, and too pressurized to stay on top of all that change quickly enough.

Second, there’s a growing recognition that trust is the most valuable currency. And it’s in short supply – hard to obtain and easy to lose. Leaders are worried about eroding public trust in institutions, platforms, and even scientific knowledge. That erosion creates volatility, and volatility corrodes the legitimacy needed to govern effectively in times of change.

But on the opportunity side, I hear a rising appetite for designing new systems, not just tweaking old ones. Military leaders are exploring how to build human-machine teaming frameworks grounded in ethical constraints. Corporate leaders are looking for data architectures and AI models that enable both personalization and privacy. Civic leaders are hungry for policy tools that foster agency and participation, especially for younger generations.

Across the board, I hear a desire for more anticipatory, cross-sectoral collaboration. Leaders know that no single institution can manage the intersection of tech, geopolitics, and economics alone. They’re looking for new alliances, new playbooks—and frameworks like FLP-IT—to help them reimagine resilience at scale.

WFA: How should companies prepare for the increasing intersection of geopolitics and digital governance?

OJG:  As I wrote in my February 2024 article for the World Economic Forum, “Tech at the Centre of Geopolitics: 5 Strategic Capabilities for GeoTech Organizations,” companies should build five interlocking capabilities:

  1. Develop radical foresight via systems thinking
    - As the WEF article outlines, firms need foresight and systems-thinking evaluation to sense and interpret integrated geotech forces—such as China, climate, cybersecurity, COVID-like bioshocks, and cognitive/crypto disruptions—in a unified way.
    - This means embedding horizon scanning, weak‑signal detection, and complexity modeling into core strategy teams, not just R&D.
  2. Use data-driven benchmarking to evaluate readiness
    - Establish metrics dashboards comparing your digital governance and geopolitical risk resilience against global best practices—precisely what “data‑driven best practices benchmarking” calls for (a simple sketch of this appears at the end of this answer).
    - This equips leadership with objective insights into where compliance, technology, or supply‑chain resilience may fall short in different jurisdictions.
  3. Run scenario simulations for portfolios and supply chains
    - The framework’s “simulations of actors and positions” capability urges companies to stress-test geotech risks—like trade barriers, AI export controls, or regional data localization—across business units, product lines, and supplier networks.
    - These exercises enable dynamic strategy adaptation: rerouting, prioritizing, or redesigning offerings in response to emerging geopolitical norms.
  4. Integrate geotech into budgets and execution
    - The WEF piece highlights the need to integrate geotech considerations into both HQ and business-unit planning—everything from intelligent supply‑chain platforms to “nano‑factory” models.
    - In practice, that means tagging investments not just by ROI but by governance and sovereignty risk exposure.
  5. Establish GeoTech Response Teams
    - A central “geotech response team,” staffed with multidisciplinary expertise—from cognitive science and climate policy to national security and tech ethics—anchors, calibrates, and operationalizes the above four capabilities across silos.
    - These teams, properly empowered by C‑suite and board mandates, ensure cross-functional alignment and agility in responding to global digital-policy disruptions.

In sum, preparing for the geopolitics-digital governance intersection means operationalizing these five capabilities—through foresight, data, simulation, budget discipline, and dedicated GeoTech teams. Together, they build the techno‑geopolitical organizational readiness, agility, and resilience needed not only to weather but to shape the digital geopolitics of the 21st century.
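For the benchmarking and budgeting capabilities above, here is a minimal sketch in Python of how a GeoTech response team might score readiness gaps against best-practice benchmarks and tag investments by geotech risk exposure. The metric names, weights, and thresholds are placeholders, not values from the WEF article.

```python
from dataclasses import dataclass

@dataclass
class ReadinessMetric:
    name: str          # e.g. "data-localization compliance coverage"
    score: float       # self-assessed, 0.0 - 1.0
    benchmark: float   # global best-practice reference, 0.0 - 1.0
    weight: float      # relative importance set by the GeoTech response team

def readiness_gap(metrics: list[ReadinessMetric]) -> dict[str, float]:
    """Weighted gap to best practice per metric; positive means falling short."""
    return {m.name: round(m.weight * (m.benchmark - m.score), 3) for m in metrics}

def tag_investment(roi: float, sovereignty_risk: float, governance_risk: float) -> str:
    """Tag an investment by ROI and geotech risk exposure (illustrative thresholds)."""
    if max(sovereignty_risk, governance_risk) > 0.7:
        return "review: high geotech exposure"
    return "proceed" if roi > 0.15 else "defer"

gaps = readiness_gap([
    ReadinessMetric("data-localization compliance coverage", 0.55, 0.90, 0.4),
    ReadinessMetric("supplier concentration in restricted jurisdictions", 0.40, 0.80, 0.6),
])
print(gaps)  # weighted gaps per metric, here 0.14 and 0.24
print(tag_investment(roi=0.22, sovereignty_risk=0.75, governance_risk=0.3))
# -> "review: high geotech exposure"
```

The design intent is simply to make geotech exposure a first-class input alongside ROI, which is what the budgeting capability above calls for.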

WFA: If you could implement one global policy tomorrow to improve the governance of frontier technologies, what would it be – and why?

OJG:  If I could implement one global policy tomorrow to improve the governance of frontier technologies, it would be the mandatory creation and adoption of interoperable algorithmic transparency and accountability standards—anchored in principles of sovereignty, security, and shared innovation.

Why? Because algorithms now function as invisible infrastructure for everything from credit and healthcare access to military targeting and infrastructure resilience. Yet most of them operate in a black box—opaque to regulators, vulnerable to manipulation, and exploitable by bad actors. Without transparency, we cannot ensure security. And without shared standards, we risk both technological fragmentation and a race to the bottom in safety, ethics, and strategic control.

This global policy would establish a modular framework for algorithmic accountability that supports three imperatives:

  1. National economic competitiveness – By providing clarity and trust in AI systems, such standards would reduce regulatory uncertainty, lower transaction costs for cross-border tech partnerships, and enable firms to scale innovations globally. Countries that help shape and adopt these standards would enjoy first-mover advantages in setting the rules of the game—much like GAAP or Basel III shaped global finance. This is about industrial policy via governance leadership.
  2. National security and cybersecurity – Transparent algorithms are harder to poison, spoof, or hijack. Interoperable audit protocols would help governments and vetted third parties detect adversarial inputs, algorithmic backdoors, or unintended escalation risks in defense and critical infrastructure systems. Think of it as a cybersecurity dividend from good governance. It would also enable more trusted AI cooperation among allies—especially where joint command, deterrence, or intelligence systems are involved.
  3. Digital sovereignty with global interoperability – Each country could adapt implementation to its legal system and cultural context, but within a shared framework of cross-border recognition—akin to “algorithmic passports” or mutual assurance treaties. This ensures that sovereignty and competitiveness don’t require autarky, and that open societies can remain open without becoming strategically exposed.
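To illustrate the third imperative, here is a minimal sketch in Python of how cross-border recognition of algorithmic audits might be checked. The jurisdictions, standard names, and recognition pairs are invented for illustration and do not reflect any existing treaty.

```python
# Hypothetical mutual-recognition check: each jurisdiction keeps its own required
# rule set, but recognizes audit attestations issued under partner frameworks.
REQUIRED_STANDARDS = {
    "EU":  {"transparency/logging", "bias-audit", "human-oversight"},
    "US":  {"transparency/logging", "incident-reporting"},
    "JPN": {"transparency/logging", "bias-audit"},
}

MUTUAL_RECOGNITION = {  # pairs that accept each other's audit attestations
    ("EU", "JPN"), ("JPN", "EU"), ("US", "JPN"), ("JPN", "US"),
}

def deployable(attested: set[str], home: str, target: str) -> bool:
    """A system attested at home can deploy in target if the target's required
    standards are covered, or if the two jurisdictions mutually recognize audits
    and the home requirements are met (illustrative logic only)."""
    if REQUIRED_STANDARDS[target] <= attested:
        return True
    return (home, target) in MUTUAL_RECOGNITION and REQUIRED_STANDARDS[home] <= attested

print(deployable({"transparency/logging", "bias-audit"}, home="JPN", target="EU"))
# -> True, via mutual recognition of Japan's audit regime
print(deployable({"transparency/logging"}, home="US", target="EU"))
# -> False, no recognition pathway and EU requirements not covered
```

The point of the design is that each jurisdiction keeps its own required rule set (sovereignty) while mutual recognition keeps the system interoperable rather than autarkic.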

This policy is not a silver bullet—but it gives us the architectural backbone to align innovation, trust, and power in the AI age. It’s a foundation for managing not just the tools we build, but the societal systems they are quietly redesigning beneath us. Without it, we risk drifting into a fragmented digital Cold War. With it, we have a shot at building a pluralistic, secure, and prosperous digital order.

Olaf, thank you for sharing your remarkable insights with World Future Awards and for continuing to inspire global dialogue around the future of innovation and governance. Your work is not only visionary but essential for navigating the complex intersections of technology, geopolitics, and leadership.

Visit Olaf’s LinkedIn profile for more information on his work.