Leading in the Age of AI: A Conversation That Changed the Way I Think

There are some conversations that stay with you. My recent interview with Sami Makelainen was one of them.

We spoke about AI, ethics, critical thinking, leadership, and the quiet risks that many of us aren’t paying enough attention to. And as someone who sits in boardrooms and leads teams, I walked away feeling both energised and deeply challenged.

We are in the middle of a technological revolution, one that’s happening faster than any of us can fully grasp. AI is no longer a “what if.” It’s here. It’s in our comms, our operations, our decisions. But what struck me most is not what the tech can do—it’s what we risk losing if we don’t lead it thoughtfully.

Sami spoke about the erosion of critical thinking. When we hand over tasks—writing, analysis, even decision-making—to machines, we risk outsourcing the very skills that make us human. And while AI can be a powerful intern (as Sami puts it: "eager to help, but with no real-world experience"), it should not become the default decision-maker. That role still belongs to us.

We also explored the concept of “moral crumble zones”—where humans are left to absorb the blame for systemic failures driven by automation. In a world where AI is shaping outcomes, who’s really accountable when things go wrong? As leaders, we need to start designing systems where responsibility and power are aligned, not outsourced.

Another quiet reality we unpacked: employees are already using AI tools in their day-to-day work—but many are doing so in secret. Surveys suggest that anywhere from 40% to 75% of knowledge workers are using tools like ChatGPT without informing their managers or organisations. Why? Because in many companies, these tools are banned. Or employees fear that admitting efficiency gains might lead to heavier workloads—or even job loss.

This creates a dangerous kind of shadow IT—“shadow AI”—where the organisation believes it's mitigating risk by banning AI, while in fact increasing it. Employees resort to using unsecured, consumer-grade tools without governance, training, or ethical guidance. It’s not just a tech issue. It’s a cultural one.

This is why AI literacy matters—at every level of the organisation.

And this is where leadership is critical. Not to know everything about how AI works, but to ask the right questions, to invest in thoughtful experimentation, and to support a culture of safe, supported adoption. Sami's intern metaphor is the best framing I've heard for AI: eager to help, high potential—but in need of guidance, structure, and supervision. Train it well and it can be a powerful asset. Leave it alone and it may look busy while making costly mistakes.

One of the most important ideas we explored was the gap between technological advancement and organisational readiness. Most leaders don’t need to become AI experts—but they do need to become fluent translators. The future-fit leader isn’t the one who knows every algorithm. It’s the one who can bridge the gap between technical teams and the boardroom—and make ethical, strategic decisions in real time.

And we need to close the AI literacy divide—not just across senior teams, but across entire organisations. Because here’s the truth: without guidance, AI won’t reflect your company’s culture, values, or ethics. It will reflect the bias of its training data—or the person who prompted it first.

One thing I can’t stop thinking about is this: AI is only as inclusive as the people who build and train it. If we’re not consciously embedding diverse perspectives into these systems, we risk hard-coding bias at scale. The ethical implications are enormous—and yet, too often, we treat them as afterthoughts rather than foundations.

But it’s not all cautionary. This is also a moment of immense possibility.

Sami spoke about creating “no-regrets moves”—strategies that make sense no matter how the future unfolds. Scenario planning, AI literacy, and a commitment to human-centred leadership are just a few. These aren't just good business practices—they’re survival strategies for a future where change is the only constant.

As leaders, we must be willing to ask hard questions, even when the answers are unclear. We must invest in our own digital fluency, create space for experimentation, and above all, lead with clarity of vision and flexibility in execution.

This is not about resisting the future. It’s about shaping it—with wisdom, with courage, and with a deep sense of responsibility to the people we serve.