Categories
Skills

When did we lose the love of doing the work?

I have a confession…I like to work.

Yet, I find myself in rare company these days, as so many seek to use AI to ‘do the work’ instead of collaborating with it to do their best work.

I think that’s gonna be a big problem and I have some thoughts on that.

Before GPT

The first time I used a generative AI tool was back in early 2023, when a colleague introduced me to a platform called GPT 2.5 from, at the time, a little-known company called OpenAI.

ChatGPT didn’t exist yet.

This was its basic form, before we experienced life through a prompt bar. The only people playing around at this point were nerds like me. Once I got past the not-so-friendly interface, I saw this tech’s early potential.

At that time, I kept thinking this would be a great way to collaborate with technology to do better work.

What I couldn’t see at that point, or perhaps didn’t want to recognise, is humanity’s desire for instant gratification and the obsession with outsourcing or delegating every piece of work. The current AI marketing from all corners of the industry leans on this sense of ‘work is bad, so let AI do it for you’. I know that sounds like some weird slogan from a commercial in the ’60s.

The purpose of work

I get a great deal of value from my AI tool stack.

Perhaps I’m the weird one, but my focus with AI is to help me do my best work, not outsource it.

The work, very much like learning, is where the hard stuff happens.

The ‘aha’ moments you would never have conceived without the focused effort, the seemingly unrelated events that craft a connective bridge of ideas which lead to something incredible.

My industry of workplace learning has (or had) a saying: “Learning is the work and the work is learning” – it’s something like that.

It seems like too many of us have fallen out of love with doing the work.

Again, the problem isn’t AI, it’s us.

Our intentions have become skewed in the promise of an era where an artificial intelligence will do anything and everything for you. Yet, we rarely sit back to ask “Just because we can, does it mean we should?”, and even if we can, do we really want to?

Doing ‘the work’ is a big part of purpose for many.

Purpose, meaning and fulfilment are a dumpster fire rolling quickly across society as we race to delegate, automate and outsource everything in the pursuit of “reclaiming time” or “being efficient”.

We may not see it now, but it’s coming.

A bit of effort, struggle and focus is not bad for you, so don’t discount “doing the work”.

“If you don’t make mistakes, you’re not working on hard enough problems and that’s a big mistake.” – Frank Wilczek


Before you go… 👋

If you like my writing and think “Hey, I’d like to hear more of what this guy has to say” then you’re in luck.

You can join me every Tuesday morning for more tools, templates and insights for the modern L&D pro in my weekly newsletter.

Categories
Artificial intelligence

3 Ways L&D Actually Adds Value In AI Adoption

L&D teams trying to drive, and/or being held responsible for, total company AI adoption is a fool’s errand.

We both know this is stupid, yet I see too many instances of companies trying to “train” their way into successful AI adoption.

Of course, L&D teams alone aren’t going to make any organisation achieve meaningful AI adoption (however you measure that). Yet, we do have a part to play, and recognising where we can best support is critical.

So, let’s explore where L&D can support meaningful AI adoption.

There’s a ton of talk about AI adoption.

It’s odd because “adoption” has many definitions, dependent on the context and environment. The common pitfall is to measure adoption as ‘use of AI tools’ alone.

As we know, with previous technology, usage alone doesn’t mean meaningful adoption.

Setting what adoption looks like in your organisation is not a task for the L&D team.

Yet, we have an opportunity to contribute to long-term and meaningful adoption of AI across workforces as part of a wider collaboration in a community.

Let’s talk about that…

It takes more than access

Let’s go beyond the veil of bullshit we see online.

Access to an AI tool alone means nothing, and putting on one-hour lunch and learns to “make people learn AI” is a comical upskilling strategy.

If you’re a long-time reader, you’ve heard me become a broken record when I talk about what it takes to nurture meaningful and long-term change. We have much to consider with context, culture and constraints in each environment. No two workplaces are the same; that’s why the cookie-cutter “adoption frameworks” make me laugh.

They’re a good point of inspiration, but you shouldn’t follow them like a strict set of instructions.

That said, what do we need to consider beyond tools?

People, Systems and Tools

As you’ve probably guessed, launching new technology and tools alone rarely leads to meaningful adoption.

There’s a bigger ecosystem at play.

We have to consider:

1/ People

Where are people at today, and how do we meet them?

Everyone will have a different understanding, maturity and receptiveness to something new and unknown. In AI’s case, we have a mix of emotions from “will this take my job” to “I want it to do all this stuff I hate doing”.

The most difficult part of a change process is people, because we’re all so unpredictable.

2/ Systems

Quite simply, how we work today.

What are the tried, tested and trusted conscious and unconscious systems we have in place? This covers both how we execute tasks and how we think about executing those tasks (deep, I know).

We each follow different types of systems in our day to day.

Understanding what these are and how AI will impact those is key to this change.

3/ Tools

The part you’re likely most familiar with.

Here, we should consider the tools in use today alongside new ones being deployed, and how to bridge the gap in both understanding and knowing when and where to deploy them.

Too many forget the ‘when and where’ part at their own peril.

Where you can add value

For us to recognise where we can provide support and drive value, we must recognise what’s changing.

I think this framework from BCG can help us recognise the moments where performance support is most needed in an AI transformation.

They propose it for navigating AI transformation at scale, and through an L&D lens, I see it as a starting point for what to map against when focusing on how best to support workforces.

It’s built on two key dimensions:

1️⃣ AI Maturity

It progresses from tool-based adoption by individuals to workflow transformation, to full, agent-led orchestration. Most organisations, and even teams within them, operate across multiple stages at once, not in a linear path.

2️⃣ Workforce Impact

This spans how tasks are executed, to what skills are needed, to how teams are structured, to how organisational culture must evolve to support new ways of working.

While this covers the wider transformation AI brings across businesses, it acts as a roadmap for L&D.

A roadmap is often what we need because it’s not uncommon for senior leaders to treat “training” (as they call it) as a boomerang that’s thrown at will when they decide people need to know stuff.

The framework above provides a view of where the friction, pain points and problems exist in the cycle of change. That’s where we should focus.

Map it out

I mentioned before not to blindly follow frameworks, and that advice is the same here.

This view from BCG is a useful foundation for each of us to think about “where can we add value”, but it will look different for each environment.

So, I’d recommend you map out what your organisational journey looks like today.

Explore the 3 pillars of tasks, talent and teams across your business, and how and where AI is starting to (or might) impact these. It’s here that you will uncover the friction and pain points where we can be of most service.

Some of that will be through tooling, no doubt. Yet, I feel pretty safe in saying you’ll be spending a good deal of your time navigating changes within people and systems.

Final thoughts

I’m going to leave it here for now, folks.

There’s much to say, of course, but only so much attention span I can ask you to give.

I’m thinking of expanding some of this thought into a long-form video. If that sounds like something you’d like to see, let me know.

In the meantime, some additional resources to explore on this include:



Categories
Learning Technology

My 7 Best Thoughts on Learning, Life and Technology From 2025

I wrote 156,000 words across 52 editions to 5,000 people in my newsletter across 2025.

These thoughts might not be the most popular according to my stats, but they’re the ones I believe are the most meaningful and that I enjoyed writing.

Ready?

Here we go…

1/ The Anatomy of A Modern L&D Team

Now, I update this article every year.

I call it an article, but it’s more like a playbook for the modern L&D leader. I’ve been publishing a new edition of this every year to help leaders craft a team, skills and tech stack to navigate today’s world.

In 2025, it had its biggest update.

And yes, AI had a lot to do with that, yet it goes beyond technology.

In the almost unstoppable AI takeover this year, one thing became clear to me: the human element is more crucial than ever.

Read more

2/ Everything L&D Teams Need To Know About AI Agents

2025 was supposed to be the year of AI agents – but was it?

I’m not so sure.

This time last year, every tech and AI CEO preached that 2025 would be the year AI agents hit the big time. While I’m not convinced the hype delivered, I do believe these will become important parts of the work ecosystem in the years ahead.

Yet, something that grinds my gears is when a lot of social media gurus try to confuse and deceive the everyday human on what exactly AI agents are.

So, that led me to create this mini-guide for L&D to understand AI agents without the BS, and explore how they can impact and amplify work as we know it.

Read more

3/ The Dangers Of Accepting What You See Online

This isn’t exclusively an L&D thing, yet I really wanted to say something about the state of what I see (and I’m sure you do) online.

The tipping point for me came when I saw one too many so-called ‘L&D influencers’ continually spread misinformation through clickbaity headlines about research they didn’t actually read.

There’s a reason you should read beyond the headline.

And with the 2025 word of the year being claimed by “Rage bait”, I believe we need to look deeper into what we see, hear and read in online spaces.

If you want to discover why being a skeptical hippo could improve your mind, your ability to learn and your emotions, then this one is for you, dear reader.

Read more

4/ How AI Is Redefining the Way We Assess Learning

Ok, I’m big on the future of learning not focusing on recall with stupid end-of-course tests and quizzes, but shifting to human reasoning.

The catalyst for this? Yes, you guessed it…AI.

In this one, I propose that now is the time to ditch the memory games in place of true activities that nurture human intelligence through the use of modern tech solutions.

If you fancy shaking things up in 2026, then come join me in this one.

Read more

5/ Why Skill Erosion is a Real Problem That No One Can Ignore

I kinda think of this post as a sequel to my analysis on “The Hidden Impact of AI on Your Skills”.

Somehow, it’s been a year since I hit publish on that one.

The message of that piece was to think deeply about the over-reliance we will easily slip into with AI, and how easy it will be to convince ourselves we’re learning how to do something, when in reality, AI is doing it for us.

A year later, I only see more activity, which has amplified both the over-reliance and the illusion.

That’s not to say there aren’t people rejecting total delegation to AI, or people finding the balance between artificial and human intelligence.

As society obsesses over what it gains from AI, perhaps we should be asking what we lose, too.

Read more

6/ How To Stop AI From Hijacking Your Thinking

Let’s be honest, we’re all using AI in our day to day in some way, and that’s a good thing.

It’s an incredible piece of technology in the right hands, of course.

But as more of the online world becomes AI-ified, you must ask: Are you thinking with AI, or is AI shaping your thinking?

In this kinda mind-bending article, we explore what happens to human thinking when so many outsource it to an artificial construct.

Read more

7/ Where L&D Adds Real Value In The AI Noise

Let’s end it on a positive one, shall we?

I love learning, and I’ve loved my L&D career. We add so much value in many situations and that’s what keeps me writing more words every week.

While AI is changing the world, I don’t see it replacing human learning or the people in organisations working to amplify it. It will look different, but it won’t die (fyi, human learning will never die).

There’s been a ton of talk about AI adoption over the last two years.

It’s odd because “adoption” has many definitions, dependent on the context and environment. The common pitfall is to measure adoption as ‘use of AI tools’ alone.

As we know with previous technology, usage alone doesn’t mean meaningful adoption.

Setting what adoption looks like in your organisation is not a task for the L&D team.

Yet, we have an opportunity to contribute to long-term and meaningful adoption of AI across workforces as part of a wider collaboration in a community.

Let’s talk about that.

Read more

Final thoughts

Choosing your best work is always hard, and I’m sure if you asked me to pick again a week from now, I might have a different combination.

But for now, this is it.

Hopefully, I can keep talking about these topics in more detail with you across the next year, both here and in the weekly Steal These Thoughts newsletter.




Categories
Artificial intelligence Skills

Why Skill Erosion is a Real Problem That No One Can Ignore

I kinda think of this post as a sequel to my analysis on “The Hidden Impact of AI on Your Skills”.

Somehow, it’s been a year since I hit publish on that one.

Isn’t it funny how time works? I remember so clearly spending months researching and putting all the pieces together to look deeper into the real impact of AI on skills so far, and now, here I am talking about it like some sort of ancient text.

My reminiscing aside…

The message of that piece was to think deeply about the over-reliance we will easily slip into with AI, and how easy it will be to convince ourselves we’re learning how to do something, when in reality, AI is doing it for us.

A year later, I only see more activity, which has amplified both the over-reliance and the illusion.

That’s not to say there aren’t people rejecting total delegation to AI, or people finding the balance between artificial and human intelligence.

We’ll talk about some of those later.

Consequences

It’s such a serious sounding word, isn’t it?

Like something your parents would say to you.

Our choices can lead to consequences in many forms; that’s the risk we all take. Not to keep sounding like some old stoic, but life is essentially all about risk.

Back in October last year, when I spoke about AI over-reliance and the illusion of expertise, I only covered in small detail what the consequences of those choices could mean.

A year later, it’s clear to me what that is: skill erosion.

The Great Erosion of Skills

Do you remember just after the pandemic, when every headline was something like “The Great ‘x’ of blah blah?”

I’m happy to make a contribution to that movement 😂.

Jokes aside, you might be noticing some people’s skill sets are eroding through lack of use, and some aren’t even learning the skills at all. This is being driven by the change in the tasks we now deliver.

As AI gets better and better at completing a wide variety of tasks, it means we (as humans) do less in certain areas.

That is not always a bad thing.

Cognitive offloading of some tasks can amplify our ability to perform better in the workplace. A good example of this is GPS. Before we had GPS in our lives to guide us to destinations, we’d spend hours poring over gigantic maps with tiny text, trying to figure out the best route.

Now, at the touch of a button, we’re guided without having to activate one brain cell.

There’s another side to this coin, though.

Humans, for the most part, want to take the path of least resistance and favour instant gratification over the challenge (I’m no different here).

The problem is that real learning and thus improved performance are about navigating the challenges. It’s really hard to learn how, what and why if you don’t experience the struggle.

AI doesn’t take this away all on its own; how we use AI does.

In our quest for “more time”, “creative freedom”, “improved efficiency” and every other statement that tech CEOs blurt out about AI, we’ve become obsessed with the automation of everything.

This creates the consequences I’m talking about.

What we lose and what we gain

I always remember an old colleague saying, “You can have it all, you just can’t have it at the same time”.

While it was in relation to something else, I can’t help but think it fits well in this conversation.

I’ve found life to be a series of trade-offs.

If you say yes to one thing, you’re saying no to something else. It sounds like easy math (and it is), but it’s by no means a simple equation.

I’m not the first to consider the impact of AI in this way.

The folks at Gartner have been covering this as they look towards what is putting future workforces at risk.

Here’s an excerpt to ponder:

Gartner predicts that by 2028, 40% of employees will be trained and coached by AI when entering new roles, up from less than 5% today.

While this shift promises faster onboarding and adaptive, scalable learning, it also means fewer chances for employees to learn from experienced peers. Junior staff, who once relied on mentorship and hands-on experience, will learn primarily from AI tools, while senior staff increasingly depend on AI to complete complex work.

This shift accelerates the loss of foundational skills and weakens expert mentorship and relationship development across the organization.

Source: Gartner

We have skills eroding through lack of practice and application, and, it seems, new skills not being built at all as future generations enter the workforce.

Harold Jarche put it nicely when he said, “One key factor in understanding how we learn and develop skills is that experience cannot be automated”.

So, what can be done?

Are we doomed to roam the world skill-less and watch AI-powered tools suck the life out of the world itself? Of course not, there is a way, my fellow human hacker.

Strategies and tactics to prevent skill erosion

So, instead of moaning about the great wave of skill erosion, I’d rather focus on doing something about it.

The good news is there’s a lot we can all do.

If you haven’t already, you can find a ton of my guidance in these articles:

  1. The Hidden Impact of AI on Your Skills
  2. How To Stop AI From Hijacking Your Thinking

Saves me repeating myself like a broken record here.

Plus, the folks at Gartner offer some basic but useful actions for the workforce:

  • Watch for AI mistakes and rising error costs, and keep manual checks in place where AI is used most.
  • Retain your senior staff and encourage peer learning to slow skill loss.
  • Focus on roles at risk and review your talent strategies regularly to keep key skills strong.
  • Pair AI with human oversight and maintain manual checks as a backup for AI.
  • Encourage employees to continue exercising core skills (e.g., analysis, coding, problem-solving) even when AI tools are available — through simulations, rotations and shadowing.
  • Use AI simulations and adaptive training, but make sure people still learn from each other.

My question to you: What would you add?

Final thoughts

There’s much more to ponder on this.

Like with everything in this space, whether it happens or not is down to your individual choices and intentions. So, if you want to craft a career for the long haul, make smarter choices when it comes to your skills.

