
How To Build a Modern Learning and Development Team

Many corporate L&D teams are slow to evolve their philosophies.

The world’s getting psyched about AI, but 90% of L&D functions still use e-learning as their revolutionary delivery method.

We’re in the middle of a transformation that could make or break what we know as the L&D function.

This means we have to rewire not only what we do, but how we do it. That’ll lead to a new look for our local L&D teams. One shaped around providing the best service to your workforce (aka your customers).

Perhaps the first question should be whether we even call it an L&D team anymore. But I’ll leave that debate for another day.


1/ Define the goals of a high-performing learning team

We start with our ‘why’.

Why are we doing all this and what for? Deep stuff, I know, but we need this clarity before we can continue.

Let’s keep this simple. My take is:

  1. Enable performance and build organisational capability
  2. Provide return on investment to the business
  3. Support the organisational strategy with the right skills at the right time

The ultimate goal of any workplace learning function is to enable and support people in performing at work.

We serve our audience by improving their performance.

They feel good. We feel good, and the company feels good. Plus, better performance should mean more business revenue and access to improved career opportunities.

But…for the business…

Corporate L&D is about making an impact on the bottom line. There is no getting away from that. No matter how much it sucks to say.

And that can be achieved by focusing on performance.


2/ Explore the tasks and skills your L&D team needs

Before we explore roles, you need to get clear on the tasks your team does and the skills needed to do these.

To get clear on the tasks and skills you need, try out this exercise.

Tasks vs Skills assessment

Create a table like the one below.

Spend time with your team to analyse the tasks they do today and cross-reference this with the skills the team needs to do these.

Tasks | Skills
What tasks will your team deliver? | What skills will they require to do this?
(Table: The Task + Skill pre-assessment)

Use this data to shape how and what your team delivers.

If you have no clue where to start, fret not. I have a wealth of industry data to help you get clarity on the skills side.

The modern L&D professional’s must-have skills

Here’s what we see from industry analysts in 2025.

1/ RedThread Research

First up is RedThread Research with their Future-proofing L&D: Developing the Right Skills report.

They asked 400 L&D pros: What skills will L&D functions need in the future? Their comments were grouped into 30 skills across the 6 skill categories you see below.

Source: RedThread Research

The 5 main skills L&D teams self-identified were:

  1. Using and applying AI
  2. Data and analysis
  3. Learning design
  4. Tech literacy (or what I call Digital Intelligence)
  5. Human skills

2/ The Josh Bersin Company

In addition, HR research analyst Josh Bersin offers this perspective on the evolution of the L&D team’s role up to the modern day.

Source: It’s Time For an AI Revolution Report 2025 | Bersin

3/ The Learning and Performance Institute

Then we have this collection of 30 skills that the Learning and Performance Institute positions as a marker of what teams need.

Source: LPI Capability Map

Where RedThread, Bersin and the LPI take a far more polished corporate approach, my own analysis is more minimalist in tone, and I couldn’t round up this section without including it.

Every year, I review and distil the latest industry data to identify the 7 essential skills for modern L&D pros. I use data points from industry bodies and research houses (like the above), plus I map this to data that the 5,000+ readers of my weekly newsletter share with me through annual surveys.

This, coupled with my direct work with organisations, gives me a constant view of ‘what is actually needed’ for L&D teams to survive and thrive.

The image below shows the focus for 2024.

Fret not, the 2025 edition is coming in a few months 👀.

an image of the 7 skills modern learning and development professionals need.

Despite what you read in this section, context is king.

Your org, its culture and business goals will vary. So you need the skills that support this environment.

No matter where you land, I hope we can all agree that the modern L&D team should focus on performance instead of pushing education.


3/ How to identify the most critical roles for your L&D team

Now this, my friend, is much harder to align on in 2025.

I wrote the first edition of this guide in 2018. The only decent tool to surface content back then was Google, yet the times have changed.

We both see how all forms of artificial intelligence are reshaping how we work.

Although tasks and skills can be identified, who (or what) actually delivers on them is not so clear-cut anymore. We’re using LLMs and AI assistants to support us with many tasks in a collaborative approach, while AI agents are quickly coming into the discussion as “do it for me” solutions.

Now, identifying who does what is going to come down to:

  • What can only a human do well to the level required
  • What can a human and AI do well together
  • Where is AI more efficient than a human

This is not set in stone as a framework.

It’s how I’m currently assessing roles that could exist in an L&D team in 2025, and I do expect that they will not all be done by a human.

What could L&D look like in 2027?

2027 is an arbitrary number, chosen only because I don’t like to look too far into the future.

To help frame how we can think about a modern L&D team, let’s look at the makeup of workplace teams at large with the help of Microsoft.

Microsoft Work Trend Index Report 2025

I think we’ll see a mix of these over the years.

Probably not a linear progression, especially with how fast tech moves.

Taking all this into account, this is my bet on roles within the L&D team. They could be fulfilled by a human, a human with AI or just AI.

The answer all depends on the tech, tasks and the context of what you do.

And yes, you might lead a team of AI and humans, too.

Bonus: Get a 3 minute overview of that Microsoft research

The anatomy of a modern L&D function

Let’s unpack what modern learning and development could be.

While this section would historically be classified as “roles”, I think of them more as categories where several roles can exist. Maybe we should treat them as mini-departments.

Note that as we’re reshaping what an L&D function looks like in a world of AI, this will not include many traditional roles that are currently accepted as the standard.

Our goal is not to bolt on AI to the existing model.

We’re transforming!

I get that this is very anti-traditional.

While today the function focuses too much on creation, I don’t think that will be the case as we look beyond 2027. Our industry is always slow to adapt, yet by then, I believe we’ll have mostly figured out how to operate creation and curation with AI.

That’s not to say human oversight won’t be needed, because it will.

But I don’t see as many design or creator roles on the scene when 1-2 people, powered with the right tools, can do the work of a much larger team.

We’re production-focused today, yet I see that balance swinging to the more meaningful side of enablement in the long term.

This isn’t about bolting on AI but rather redesigning what you deliver, and performance enablement is the more human-centred approach to workplace learning (or whatever we call it in 2027).

3 powerful modern L&D roles to invest in

Let’s pretend you put a gun to my head and demand an answer to “what are the most impactful roles?” These would be my answers:

1/ The Learning Strategist

A slight bias to this title as it’s the one I’ve given myself in my business.

When you run the show, you can do crazy stuff like this. Anyway, the strategist function is what it says on the tin: it helps define and refine the strategy for the L&D department.

It’s too easy for L&D teams to wander around doing anything and everything, while pretending it’s ‘all part of the plan’, only to have nothing to show for it all come year’s end.

That’s why you need this role.

It may not be one person, but it needs to exist.

Your strategist will:

  • Define the team’s performance and capability strategy (with some sort of CLO)
  • Determine what needs to be done, why and define the return on investment
  • Direct your team throughout the year to focus on the right things, not more things, to achieve the shared strategy.

This is the core of the role, and you can make it as big or as small as you’d like.

2/ The Learning Experience Architect

These aren’t instructional designers, and they won’t focus on building stuff in authoring tools.

Learning Experience Architects (LXAs) focus on building the right experiences to solve real problems. Forget about creating paint-by-numbers delivery experiences.

These people will be front-facing into the business, so they’ll be focused on performance consulting to design and deliver both digital and physical experiences across the organisation (with and without AI).

Your Learning Experience Architect will:

  • Build strong relationships, trust and bring credibility to the L&D team’s work
  • Map out the real problems with stakeholders to design and deploy the best solution (and maybe that’s not an L&D one)
  • Develop the philosophy and design architecture for performance
  • Lead human experiences across physical and digital spaces

L&D is still a business of people.

No matter how much AI integrates across our workflow, people still buy from people, and your LXAs will be a key part in influencing behaviour change.

3/ Product Manager / Tech Architect

Naming conventions aside, you can’t have an L&D function without a dedicated focus on technology.

To do that well, you need people who have no clue about learning design, methodologies and all that noise, yet they can work with those who do to craft the best tech stack.

This role spearheads the technology stack that supports both your team and performance across the business.

Your Product Manager/Tech Architect will:

  • Research, test and identify the most useful technology for company performance initiatives
  • Craft the user experience and interface of all L&D technology across your organisation
  • Support the design and deployment of in-house and external tools
  • Advise the L&D team on tools and infrastructure to support all team performance projects

4 shared skills every L&D team needs to have

I know the world likes to put people in boxes, but I’m not a fan of that.

Your L&D team will be full of specialists and generalists, yet they do need a shared language of skills across the function.

Let’s focus on some of the core shared skills needed:

🤖 AI Fluency

As I update this post in 2025, the last 3 years have been dominated by one word: “AI”.

We say ‘AI’, but what we mean is generative AI, which has kicked off this wider AI arms race. AI itself has been around a long time, yet generative AI, deployed in particular through Large Language Models like ChatGPT, has sparked mass attention across the space.

It would be foolish of me not to recognise this impact and how the skill of intelligently leveraging it is a must, no matter your industry.

AI in itself is not a skill, but knowing how to use it is.

A diagram illustrating how to build AI skills at work, showing a Venn diagram with two overlapping circles labeled 'Experiment & explore' and 'Structured experiences,' with 'Good Enough' in the intersection.

The term “AI Literacy” has been thrown around to describe the collection of skills and behaviours humans need to expertly leverage tools in this category. I’ve recently come to appreciate the term “AI Fluency”, introduced by the Anthropic team, instead. It sounds less education-ish, in my opinion.

No matter what you call it, we all need it.

AI Fluency looks like this:

  • Foundational understanding of how AI works (all types, not just generative)
  • The ability to determine when AI could be a useful part of a solution, and with this, when it should be used and where.
  • The capability to collaborate with AI tools to deliver an outcome (the how).
  • Understanding the opportunities and limitations of AI systems from a technical and ethical perspective

Back when I worked in a team, I was always the ‘tech’ guy.

Everyone would come to me with anything related to tech. That can’t happen with AI; it never should have with tech in general. It’s everyone’s shared responsibility to know how to use AI intelligently.

Plus, it’s a must-have set of skills and behaviours for the modern-day workplace.

It seems we’ll all need to be AI-native.

Resources:

  1. The AI for L&D Crash Course
  2. AI in L&D resources, frameworks and system playbooks
  3. AI tool and system tutorials

📣 Storytelling (inc marketing and sales)

I’m cheating here a bit because I’m grouping a few categories under one banner.

We’re in the business of people, which means we must be able to connect with them and influence them through change.

That means you have to be a good storyteller.

People don’t change for no reason, and I’m a big believer that good stories entertain people, but great stories change people.

Marketing and sales are both forms of storytelling as they serve the purpose of telling the stories of your work, people and the results.

What I’m not saying is you need to learn to be an expert marketer and salesman, but you’d be wise to understand a few techniques.

Our industry falls too often into the trap of ‘build it and they will come’. Sadly, they don’t.

You can have the most amazing learning content in human existence, but if no one knows it exists, what it does and why it’s important for them, you’re pretty much digging your own grave.

You need the whole team to buy into this way of thinking.

It’s something we each do every day, in many ways.

You market/sell your skills to a potential employer, you market ideas to business leaders, and you even use marketing techniques to convince your crush to go on a date with you.

Resources:

  1. What L&D gets wrong about marketing, and how to do the stuff that works
  2. How to use the power of content marketing to amplify your L&D products
  3. The ultimate mini-guide to positioning your L&D products for success
  4. How to master the art of storytelling for business
  5. The rare skills L&D pros need to thrive and survive in the modern workplace

🤝 Performance consulting

Think of Performance Consulting as being the workplace detective of the L&D world.

It’s not just about throwing a training program at a problem and hoping it sticks (no ‘spraying and praying’, as I was once told).

Nope. You dig deeper.

You chat with the team, look at the data, and figure out what’s really happening. Then, you come up with a game plan that might be training, but could also be other stuff like better tools, process changes, or even a simple conversation.

The end game is making sure everyone performs better and the business scores a win.

The art of consulting seems lost in L&D teams.

We take a lot of requests but ask few questions ourselves. Sometimes, you have to; the nature of your organisation can be tough to change.

Your mission is to partner with the workforce to understand their needs and propose the best solution. That might not even be an L&D solution 😨.

Resources:

  1. Unpack why performance consulting is so important for the modern L&D pro here.
  2. Craft your skills in this area with The Art of Performance Consulting Masterclass for L&D Teams

🖥️ Digital Intelligence

We’re biological beings living in an increasingly digital world.

This skill isn’t exclusive to our industry. It’s a must for every human.

I’ve spent too much of my career watching people shy away from tech. You cannot do that anymore. I hate to sound like one of those morons on social media who say, “Do this or be left behind”. But I’m going to make an exception here.

If you don’t invest in your digital intelligence, you will be left behind.

Defining Digital Intelligence

Let’s keep this simple.

It’s about being savvy, aware and adaptable with new digital technologies. You don’t need to be an expert but you must be aware of what’s available. Be curious, always.

Our world needs more digital-savvy pros.

As the world of learning continues to be eaten up by tech, you would be wise to become fluent in the language of technology and become a valued strategic business partner.

Graphic displaying the concept of Digital Intelligence, emphasizing skills such as Tech Ability, Digital Literacy, Ethical Awareness, and Agility & Experimentation.

Even outside of L&D, a basic level of understanding of how different technologies work, connect and support one another is essential.

Not every member of your team needs to be an expert in this field, but they should have a shared understanding.

Again, some of you reading this might be resistant and suggest that a technology team should do this.

Your team will know the application of any learning tech better than anyone. They will understand how it works in practice, so keeping up to date with modern technology will always be an enhancer.

This knowledge will separate your team from the industry.

If you keep up with trends, absorb what is useful, avoid what is not, and apply what works for your audience, you will create a high-performing learning function.


The L&D leader

You may know it as the Head of Learning, Director of Learning, Chief Learning Officer or whatever the flavour-of-the-month title is today.

Maybe this is you right now, or it’s one of your aspirations.

Nonetheless, let’s unpack what it means to be an L&D leader in the modern era.

What does an L&D leader do?

The learning leadership role for me is one of a strategist, coach and enabler.

What I learnt quite quickly when heading up a department myself is that 80% of my tasks moved to nurturing people.

I still got involved in projects from a strategic standpoint, but doing the actual project build wasn’t part of the gig. This was difficult to get my head around at first.

You’re now in charge of the compass and the map, plus your crew of L&D pros.

That doesn’t mean you stop learning.

Like your team, you will be on top of evolving trends, insights and practices which allow you to nurture a clear vision for the team to work towards.

You enable your people to do what they need to.

A modern learning leader should provide the freedom to experiment, propose new ideas, test stuff and ultimately innovate the way you work.

Of course, context, culture and organisational constraints play a big part in what you can achieve. So don’t be too hard on yourself about this.


This will evolve

As I said from the get-go, this is just how I would build a team today.

Your working context will differ, and these recommendations will evolve as the world changes and the dynamics of the workplace with it. So, don’t take what I’ve shared as set in stone.

I’m sure a year from now my recommendations could look different.

Instead, use these as the foundational building blocks to the function you want to shape.


Final thoughts

When you look at your L&D team today, ask yourself:

  1. Are they providing the service your audience needs?

  2. If not, why not?

  3. Has your team been set up to deliver the service your workplace requires?

I hope this helps.

If you have any questions or comments, you can leave them below, message me directly, or find me for a chat on LinkedIn.


Before you go… 👋

If you like my writing and think “Hey, I’d like to hear more of what this guy has to say” then you’re in luck.

You can join me every Tuesday morning for more tools, templates and insights for the modern L&D pro in my weekly newsletter.


The 7 Skills L&D Teams Need For Today’s World

Updated: Aug 2024

Practising what we preach in the L&D industry is not as common as you might think.

We always talk about the latest learning philosophies, skills and experiences other people need. Rarely do I see or hear anyone talk about the modern L&D skills needed to navigate today’s world.

In a world increasingly shaped by remote work and digital platforms, the challenges for L&D teams are mounting.


How You Can Build a ‘100 Million Dollar’ Skillset

Let me throw a big question your way.

Are you working on building your 100-million-dollar skills?

I’m not saying these skills will magically deposit a cool $100 million in your bank account tomorrow (but wouldn’t that be something?).

The term “100-million dollar skills” isn’t about actual dollar value (sorry).

It’s a metaphor borrowed from Alex Hormozi to describe the ultra-valuable skills that can elevate your career opportunities and overall wealth. It’s about focusing on the quality of skills, not just quantity.

Why High-Value Skills Matter

Historically, people crafted better opportunities with fancy job titles.

Thankfully, the world has changed.

Niche skills make you stand out; consider them your career currency.

They’re the skills that separate you from the pack. I’m talking about the way you communicate, your ability to think critically, make sound judgments, and how you work intelligently with AI.

Why focus on high-value skills?

I’d hope the answer to this is obvious but I’ll play ball anyway.

Like money, you can compound these skills to unlock better opportunities further down the road. We’re playing an infinite game in a finite space after all. Think of it as investing in a high-yield stock that keeps giving back yearly. Building skills in high demand ensures that you are a key player in your field with an advantage very few possess.

As Tim Ferriss said: “Don’t try to be one of, be the only”.

The 100 million dollar skills principle in action

Speaking of Tim Ferriss, he’s a good example to explore on this.

For those who don’t know, Tim is the popular author of the ‘4-Hour’ books, host of his own podcast, and, I suppose, a ‘productivity/self-help consultant’.

Tim’s career began in technology startups in Silicon Valley, a highly competitive environment where efficiency and rapid learning are crucial.

Although he found early success, Ferriss was overwhelmed by overwork and stress, pushing him to seek more efficient ways to manage his time and productivity.

Recognising the need for a change, Ferriss started focusing on what he terms “meta-learning”, a skill of learning how to learn efficiently and effectively.

He explored various techniques for time management, productivity, and personal optimisation, aiming to work smarter, not harder. This exploration led to the development of the “4-Hour” concept, which he first applied to his personal health and fitness routines.

I’ve always struggled to clearly define what Tim actually does.

His skill has always felt like ‘Tim Ferriss’ because he’s the only one doing the many things he does in the way he does them. Meta-learning sounds much better, though.

How Tim unlocked unique opportunities with niche skills

Ferriss’s breakthrough came with the publication of “The 4-Hour Workweek”.

A book that encapsulated his principles of lifestyle design and productivity. The book, which details how to outsource life tasks, automate business processes, and design an ideal lifestyle, struck a chord with a global audience tired of the traditional 9-to-5 grind.

Including me, back in 2015.

The success of “The 4-Hour Workweek” transformed Tim from a stressed entrepreneur into a leading voice in life hacking (do people still use this word?) and personal productivity.

His ability to distill complex subjects into actionable advice proved to be a high-value skill, setting him apart from other self-help authors.

Especially at the time, because many self-help authors acted like gurus, whereas Tim adopted a professor approach of showing, not just telling.

Building on his success, Tim continued to expand his niche skills into other areas, including cooking (“The 4-Hour Chef”) and fitness (“The 4-Hour Body”). Each project leveraged his meta-learning skills, showing others how to master complex skills quickly and efficiently.

He also launched a popular podcast, “The Tim Ferriss Show”, where he interviews world-class performers from diverse areas to share their experience.

This podcast has run for over a decade with millions of listeners.

Today, Tim Ferriss is recognised not just for his books and podcast but for his unique approach to learning and productivity.

Like I said, he’s kinda known for doing Tim Ferriss stuff which no one else is even attempting.

His mastery of meta-learning, combined with his skill in communicating these concepts to a broad audience, has not only built his career but showcases the power of niche skills in creating a successful and influential career.

The 9-5 example

You might read the above example and think “That’s cool but I’m not going to be able to do that”.

I totally get that. Tim is in the 1% of that category.

So, what could this look like for us in the 9-5 game? 

Let me tell you the story of my pal, Dave. He’s a great guy and works a 9-5 (probably a few more hours here and there) like most of us.

On the surface, you might not think Dave is killing it in the Career Game.

But in reality, he’s crafted a set of skills which has turned him into an in-demand consultant able to command an annual salary of up to $150k. How is he doing this, you ask? AI, quantum physics, world-class heart surgeon?

No – his skills are niche in Excel and data visualisation.

Were you expecting something sexier? Most people are. Dave is not doing anything revolutionary. He discovered early in his career that people are terrified of Excel.

They love the data output and beautiful visualisations, yet seeing the rows induces a sense of doom.

Dave didn’t see doom here, he saw opportunity.

He told me, “I saw an opportunity to scale something I could do well and tolerate what others couldn’t”. He quickly found his skills in demand in his first organisation because no one else wanted to tame the beast of Excel.

Dave became the Excel and data king 👑.

It turns out that people will pay kings very well. You could do this too, as could I. Everyone has access to and uses Excel. We all produce data in many apps, yet most of us suck at it. Dave understood that and built his 100-million-dollar skillset around it.

You can do this in any job and industry with the millions of apps we each use.

Let’s unpack the blueprint to do that together ↓

How to identify your High-Value Skills

Identifying which skills can catapult your career into that $100-million valuation starts with a good look at your current job and industry.

Ask yourself:

  • What skills are most admired and rewarded in my field?
  • Which abilities do top performers in my sector possess that I can develop?
  • How do my unique insights and capabilities stand out?

The idea is to zero in on skills that add significant value to your work and enhance your unique selling proposition.

Whether it’s exceptional project management, innovative problem-solving, or cutting-edge tech proficiency, these are the skills that can define a high-value career.

4 ways to compound these High-Value Skills

  1. Focused Learning: Pick one skill at a time to develop. Trying to master multiple skills simultaneously often leads to mediocrity. If critical thinking is your target, dedicate time to courses, books, and activities that enhance that skill specifically.
  2. Practical Application: Apply what you learn in real-world scenarios. If you’re improving your tech skills, work on projects that allow you to use new tools. Real-world application cements learning far more effectively than theory alone.
  3. Feedback and Iteration: Seek feedback from peers and mentors. Understand how your skills are perceived and where you can improve further.
  4. Network and Collaborate: Engage with others who excel in areas you aspire to master. Networking isn’t about swapping business cards anymore. It’s about exchanging ideas and strategies that can help refine your own skills.

Final Thoughts

Building your 100-million dollar skills isn’t about adding more to your plate.

Be brutally specific on the 3 – 5 skills that can make the difference in your industry or even cross-industry. Start today, focus deeply, and create your own opportunities.

Oh, and be like Dave.


Before you go… 👋

If you like my writing and think “Hey, I’d like to hear more of what this guy has to say” then you’re in luck.

You can join me every Tuesday morning for more tools, templates and insights for the modern L&D pro in my weekly newsletter.


Where L&D Adds Real Value In The AI Noise

There’s a ton of talk about AI adoption.

It’s odd because “adoption” has many definitions depending on the context and environment. The common pitfall is to measure adoption as ‘use of AI tools’ alone.

As we know with previous technology, usage alone doesn’t mean meaningful adoption.

Setting what adoption looks like in your organisation is not a task for the L&D team.

Yet, we have an opportunity to contribute to the long-term, meaningful adoption of AI across workforces as part of a wider, community-based collaboration.

Let’s talk about that…

It takes more than access

Let’s go beyond the veil of bullshit we see online.

Access to an AI tool alone means nothing, and putting on one-hour lunch and learns to “make people learn AI” is a comical upskilling strategy.

If you’re a long-time reader, you’ve heard me become a broken record when I talk about what it takes to nurture meaningful and long-term change.

We have much to consider with context, culture and constraints in each environment. No two workplaces are the same, that’s why the cookie-cutter “adoption frameworks” make me laugh.

They’re a good point of inspiration but you shouldn’t follow them like a strict set of instructions.

Saying that, what is it we need to consider beyond tools?

Read on…

People, Systems and Tools

As you’ve probably guessed, launching new technology and tools alone rarely leads to meaningful adoption.

There’s a bigger ecosystem at play.

We have to consider:

1/ People

Where are people at today and how do we meet them?

Everyone will have a different understanding, maturity and receptiveness to something new and unknown. In AI’s case, we have a mix of emotions from “will this take my job” to “I want it to do all this stuff I hate doing”.

The most difficult part of a change process is people because we’re all so unpredictable.

2/ Systems

Quite simply, how we work today.

What are the tried, tested and trusted conscious and unconscious systems we have in place? This covers both how we execute tasks and how we think about executing those tasks (deep, I know).

We each follow different types of systems in our day to day.

Understanding what these are and how AI will impact those is key in this change.

3/ Tools

The part you’re most likely familiar with.

Here, we should consider the tools in use today alongside new ones being deployed, and how to bridge the gap in both understanding and knowing when and where to deploy them.

Too many forget the ‘when and where’ part at their own peril.

Where you can add value

Source: BCG

For us to recognise where we can provide support and drive value, we must note what’s changing.

I think this framework from BCG can help recognise the moments where performance support is most needed during AI transformation.

They propose it for navigating AI transformation at scale, and through an L&D lens, I see it as a reference point to map against when focusing on how best to support the workforce.

It’s built on two key dimensions:

1️⃣ AI Maturity

It progresses from tool-based adoption by individuals, to workflow transformation, to full, agent-led orchestration. Most organisations, and even teams within them, operate across multiple stages at once, not in a linear path.

2️⃣ Workforce Impact

This spans how tasks are executed, what skills are needed, how teams are structured, and how organisational culture must evolve to support new ways of working.

While this covers the wider transformation AI brings across businesses, it acts as a roadmap for L&D.

A roadmap is often what we need, because it’s not uncommon for senior leaders to treat “training” (as they call it) as a boomerang that’s thrown at will when they decide people need to know stuff.

The framework above provides a view of where the friction, pain points and problems exist in the cycle of change. That’s where we should focus.

Map it out

I mentioned before not to blindly follow frameworks, and that advice applies here too.

This view from BCG is a useful foundation for each of us to think about “where can we add value”, but it will look different for each environment.

So, I’d recommend you map out what your organisational journey looks like today.

Explore the three pillars of tasks, talent and teams across your business, and how and where AI is starting to (or might) impact them. It’s here you will uncover the friction and pain points where we can be of most service.

Some of that will be through tooling, no doubt.

Yet, I feel pretty safe in saying you’ll be spending a good deal of your time navigating changes within people and systems.

Final thoughts

There’s much to say, of course, but only so much attention span I can ask you to give.

I’m thinking of expanding some of this thought into a long-form video. If that sounds like something you’d like to see, let me know.

In the meantime, some additional resources to explore on this include:


Before you go… 👋

If you like my writing and think “Hey, I’d like to hear more of what this guy has to say” then you’re in luck.

You can join me every Tuesday morning for more tools, templates and insights for the modern L&D pro in my weekly newsletter.


Why Skill Erosion is a Real Problem That No One Can Ignore

I kinda think of this post as a sequel to my analysis on “The Hidden Impact of AI on Your Skills”.

Somehow, it’s been a year since I hit publish on that one.

Isn’t it funny how time works? I remember so clearly spending months researching and putting all the pieces together to look deeper into the real impact of AI on skills so far, and now, here I am talking about it like some sort of ancient text.

My reminiscing aside…

The message of that piece was to think deeply about the over-reliance we will easily slip into with AI, and how easy it will be to convince ourselves we’re learning how to do something, when in reality, AI is doing it for us.

A year later, I only see more activity, which has amplified both.

That’s not to say there aren’t those who reject total delegation to AI, and those finding the balance between artificial and human intelligence.

We’ll talk about some of those later.

Consequences

It’s such a serious sounding word, isn’t it?

Like something your parents would say to you.

Our choices can lead to consequences in many forms. That’s the risk we all take, and, not to keep sounding like some old stoic, life is essentially all about risk.

Back in October last year, when I spoke about AI over-reliance and the illusion of expertise, I only covered in small detail what the consequences of those choices could be.

A year later, it’s clear to me that the consequence is skill erosion.

The Great Erosion of Skills

Do you remember just after the pandemic, when every headline was something like “The Great ‘x’ of blah blah?”

I’m happy to make a contribution to that movement 😂.

Jokes aside, you might be noticing some people’s skill sets are eroding through lack of use, and some aren’t even learning the skills at all. This is being driven by the change in the tasks we now deliver.

As AI gets better and better at completing a wide variety of tasks, it means we (as humans) do less in certain areas.

That is not always a bad thing.

Cognitive offloading of some tasks can amplify our ability to perform better in the workplace. A good example of this is GPS. Before we had GPS in our lives to guide us to destinations, we’d spend hours poring over gigantic maps with tiny text, trying to figure out the best route.

Now, at the touch of a button, we’re guided without having to activate one brain cell.

There’s another side to this coin, though.

Humans, for the most part, want to take the path of least resistance and favour instant gratification over the challenge (I’m no different here).

The problem is that real learning and thus improved performance are about navigating the challenges. It’s really hard to learn how, what and why if you don’t experience the struggle.

AI doesn’t take this away all on its own; how we use AI does.

In our quest for “more time”, “creative freedom”, “improved efficiency” and every other statement that tech CEOs blurt out about AI, we’ve become obsessed with the automation of everything.

This creates the consequences I’m talking about.

What we lose and what we gain

I always remember an old colleague saying, “You can have it all, you just can’t have it at the same time”.

While it was in relation to something else, I can’t help but think it fits well in this conversation.

I’ve found life to be a series of trade-offs.

If you say yes to one thing, you’re saying no to something else. It sounds like easy math (and it is), but it’s by no means a simple equation.

I’m not the first to consider the impact of AI in this way.

The folks at Gartner have been covering this as they look at what is putting the future workforce at risk.

Here’s an excerpt to ponder:

Gartner predicts that by 2028, 40% of employees will be trained and coached by AI when entering new roles, up from less than 5% today.

While this shift promises faster onboarding and adaptive, scalable learning, it also means fewer chances for employees to learn from experienced peers. Junior staff, who once relied on mentorship and hands-on experience, will learn primarily from AI tools, while senior staff increasingly depend on AI to complete complex work.

This shift accelerates the loss of foundational skills and weakens expert mentorship and relationship development across the organization.

Source: Gartner

We have skills eroding through lack of practice and application, and, it seems, skill creation drying up for future generations entering the workforce.

Harold Jarche put it nicely when he said, “One key factor in understanding how we learn and develop skills is that experience cannot be automated”.

So, what can be done?

Are we doomed to roam the world skill-less and watch AI-powered tools suck the life out of the world itself? Of course not. There is a way, my fellow human hacker.

Strategies and tactics to prevent skill erosion

So, instead of moaning about the great wave of skill erosion, I’d rather focus on doing something about it.

The good news is there’s a lot we can all do.

If you haven’t already, you can find a ton of my guidance in these articles:

  1. The Hidden impact of AI on your skills
  2. How to stop AI from hijacking your thinking?

Saves me repeating myself like a broken record here.

Plus, the folks at Gartner offer some basic but useful actions for the workforce:

  • Watch for AI mistakes and rising error costs, and keep manual checks in place where AI is used most.
  • Retain your senior staff and encourage peer learning to slow skill loss.
  • Focus on roles at risk and review your talent strategies regularly to keep key skills strong.
  • Pair AI with human oversight and maintain manual checks as a backup for AI.
  • Encourage employees to continue exercising core skills (e.g., analysis, coding, problem-solving) even when AI tools are available — through simulations, rotations and shadowing.
  • Use AI simulations and adaptive training, but make sure people still learn from each other.

My question to you: What would you add?

Final thoughts

There’s much more to ponder on this.

Like with everything in this space, whether it happens or not is down to your individual choices and intentions. So, if you want to craft a career for the long haul, make smarter choices when it comes to your skills.


Before you go… 👋

If you like my writing and think “Hey, I’d like to hear more of what this guy has to say” then you’re in luck.

You can join me every Tuesday morning for more tools, templates and insights for the modern L&D pro in my weekly newsletter.


The Dangers Of Accepting What You See Online

Once upon a time, I enjoyed using social media.

Sounds weird to say nowadays, I know. Yet, when I first started using platforms like LinkedIn nearly 15 years ago, it was both a different time and place.

Platforms felt more conversational and less driven by clickbait. The algorithms weren’t so optimised for mass outrage.

I used to learn tons as a mid-twenty something trying to navigate the odd corporate world in London.

I had a ritual of saving posts to read later and call upon as a sort of personal learning system.

I adopted the same approach in the early days on Instagram.

Specifically trying to take my body from the grasshopper lightweight it was, to some form of decent-sized guy who didn’t look like he could slide through the cracks of doors.

Again, I’d absorb whatever good stuff I could.

Back in 2012, fitness influencers weren’t really a thing, so I didn’t have to cut through any noise.

Today, the story couldn’t be farther from this.

I only use LinkedIn these days, and even that is becoming a struggle because I’m met daily with disinformation, misinformation, smart people saying dumb things and a host of selfies desperately being used in the search for attention.

What concerns me most these days is so many people’s inability to ‘read beyond the headlines’.

I’ve seen this most often in the last few years with the sharing of an absurd amount of research and reports on AI. Look, I understand the attention game and why people indulge in the shock factor to garner attention.

The problem is that these people and posts are proliferating an epidemic of often incorrect statements, and viewers aren’t being nearly skeptical enough.

They say that AI is killing our critical thinking and analytical judgment, yet we seem to be doing that fine ourselves by not questioning what we see.

A great quote I keep in my notes folder reminds me of the need to both doubt and ask questions: “Don’t believe everything you think or see”.

So, my question is, when did we stop looking beyond the veil? And why don’t we ask questions anymore or do our own research?

I’m not expecting an answer, I’m just throwing it out there.

A few case studies

We see case studies on this almost daily.

There are probably hundreds at this stage, yet we have two recent ones which I’m sure most of you have seen. It’s going to feel like I’m picking on MIT here, but that’s not the intention.

The issue is how the data produced is being used by third parties, and how that impacts the global narrative.

The latest MIT Study, which claims 95% of organisations are getting zero return on Gen AI projects, has been doing the rounds in the last week. Now, while this makes a great clickbait headline and social post, we need to look deeper.

If more people went to the “Research and methodology” sections of reports, they would be surprised.

→ Ignoring this makes smart people say dumb things.

For this particular example, Ethan Mollick posted a great note on how this paper was researched. I’ve provided that section for you to check out below:

I’ll let you draw your own conclusions, yet I don’t feel 52 interviews over a 6-month timeline is a good enough sample to be used, as it is by many people posting across social media, as “the global view”.

The devil is in the details, as they say.

Our second example, again from MIT (sorry, I do love you really), goes back to June and gave rise to more clickbait and social discourse.

This has nothing to do with the research or the researchers themselves. They set out to see how using AI tools affected an individual’s ability to write essays. They conducted this with only 54 people and on this one task.

The main finding was that AI can help in the short term, but if you always use and rely on it, you’ll diminish your ability to write an essay without it.

Cool, sounds like common sense to me.

But that research gave rise to crazy headlines like this:

See how quickly that turned?

Did anyone reading these headlines, whether in news apps or social posts, look beyond them? From what I see, no.

This is where the problem exists.

Even the lead researcher on this report called out the same thing and set the facts straight in their own post.

It’s not necessarily new, yet algorithm-based platforms are loving the attention it creates.

Like I said, this problem is not isolated to one report; it’s everywhere across social media and, as such, society at large.

Posts with clickbait headlines and mass engagement are spreading messages that are often misleading and, in some situations, harmful.

So, what can you do?

Become a skeptical hippo

Ok, what I’m not saying here is to become some kind of conspiracy theorist.

Instead, I want you to engage that powerful operating system that sits in each of our skulls. When you see headlines like we’ve covered or clickbaity posts, try the following:

1. Go to the primary source

  • Locate the actual report or paper (not just a blog post or tweet about it).
  • Even if you don’t read every detail, scan:
    • Abstract (what the study did and found).
    • Methods (how many people, what was tested, what tools were used).
    • Limitations (almost always at the end).

2. Ask 3 key questions

When you see a claim, pause and ask yourself:

  • Who was studied? (demographics, sample size, context).
  • What exactly was measured? (recall, ownership, not general intelligence).
  • How broad are the claims? (exploratory finding vs universal truth).

If the headline claims more than the study actually measured, that’s a red flag.

3. Notice the language

  • Headlines often use absolutes (“ChatGPT destroys learning!”).
  • Scientific reports usually use tentative language (“suggests,” “indicates,” “preliminary”). Spotting this mismatch helps you resist being pulled into the hype.

4. Slow down your consumption

  • Disinformation spreads because social media rewards speed + emotion.
  • Slow thinking (literally taking a minute to check the source or read the abstract) interrupts that cycle and gives you space to process critically.

TL;DR: Read beyond the headlines, ask the questions and embrace those skeptical hippo eyes.


📝 Final thoughts

While this might partly sound like a human raging against the machine, I hope my sentiments of ‘do your own research’ and ‘be more skeptical and reach your own conclusions’ come through.

I don’t believe we need to worry about AI killing our critical thinking and analytical judgment when we’re doing that just fine all by ourselves.


Before you go… 👋

If you like my writing and think “Hey, I’d like to hear more of what this guy has to say” then you’re in luck.

You can join me every Tuesday morning for more tools, templates and insights for the modern L&D pro in my weekly newsletter.


How To Stop AI From Hijacking Your Thinking

While I’m not entirely on the doomsday train of “AI will destroy all human thinking” on its own, I can’t ignore the level of stupidity that some humans exhibit when working with AI.

I shouldn’t be surprised, really, as the path of least resistance is paved with instant gratification, which is a dopamine daydream for the digitally addicted.

Still…

What happens to human thinking when so many outsource it to an artificial construct?

I’m saying this as much to myself as I am to you.

This is turning into a strange “dear diary” entry, but stick with me.

This is the end…or is it?

We both see the polarising views plastered across social feeds.

Logic seems to be lost in most conversations.

Most posts are either “AI will destroy your brain” or “outsource all your thinking to AI”. I don’t know about you, but I’m not cool with either of those options.

It doesn’t help when the majority blindly believe every headline that’s emotionally tweaked to grab attention. Taking time to look beneath the surface usually paints a different picture. Yes, I’m looking at you, MIT study.

Moving away from all the noise, only one question is worth asking right now:

Are you thinking with AI, or is AI shaping your thinking?

Maybe it’s doing both, and maybe you’re aware of that.

Saying all that, here are a few points I think are worth exploring.

What happens to ‘human thinking’ if we over-rely on AI?

This is a real grey area for me.

I’ve seen countless examples where too much AI support leads to less flexing of human skills (most notably common sense and deep thinking), and I’ve seen examples where human skills have improved.

In my own practice, my critical thinking skills have improved with weekly AI use over the last two years. It’s what I class as an unexpected but welcome benefit.

This doesn’t happen for all, though.

It depends on the person and their intent, of course.

Research and experiences seem to affirm my thoughts that the default will be to over-rely.

I mean, why wouldn’t you?

This is why any AI skills program you’re building must focus on behaviours and mindset, not just ‘using a tool.’

You can only make smart decisions if you know when, why, and how to work with AI.

One unignorable insight I’ve uncovered from collecting research over the last few years, and psychoanalysing that together with AI, is the importance of confidence in your capabilities to enable you to think critically with AI.

A diagram illustrating how confidence shapes critical thinking with AI, featuring four quadrants: Independent Thinker, Balanced Evaluator, Avoidant User, and Blind Follower, framed by high and low AI literacy and trust levels.
Where are you playing?

This is the battleground of most social spaces today.

High-performing organisations and teams will be those that think critically with AI, not outsource their thinking to it.

Being a “Balanced Evaluator” is the gold standard. So, we could say that thinking about thinking is the new premium skill (more on that later).

The combination of high AI literacy (skills, understanding how AI works, limitations) with high trust (knowing the right tool for the job and a willingness to use it effectively) is not straightforward.

That’s where you come in as the local L&D squad.

To be here, you must critically engage with AI by asking when, how, and why to trust its output. This requires questioning, verifying, and a dose of scepticism that too many fail to apply but sorely regret when it backfires.

Also, don’t interpret “AI Trust” as blind faith. This is built through experimenting and learning how the best tools work.

What does meaningful learning with AI look like?

I (probably like some of you) have beef with traditional testing in educational systems.

It’s a memory game, rather than “Do you know how to think about and break down x problem to find the right answer?” We celebrate memory, not thinking (bizarre world).

My beef aside, research shows partnering intelligently with AI could change this.

This article, a collaboration between The Atlantic and Google, which focuses on “How AI is playing a central role in reshaping how we learn through Metacognition”, gives me hope.

The TL;DR (too long; didn’t read) of the article is that using AI tools can enhance metacognition, aka thinking about thinking, at a deeper level.

The idea is, as Ben Kornell, managing partner of the Common Sense Growth Fund, puts it, “In a world where AI can generate content at the push of a button, the real value lies in understanding how to direct that process, how to critically evaluate the output, and how to refine one’s own thinking based on those interactions.”

In other words, AI could shift us to prize ‘thinking’ over ‘building alone.’

And that’s going to be an important thing in a land of ‘do it for me.’

Side note: I covered my view on the future of learning ditching recall and focusing on human reasoning in a previous post. You’ll find a bunch of examples showing this in action there.

To learn, you must do

The Atlantic article shared two learning-focused experiments by Google.

In the first, pharmacy students interacted with an AI-powered simulation of a distressed patient demanding answers about their medication.

  • The simulation is designed to help students hone communication skills for challenging patient interactions.
  • The key is not the simulation itself, but the metacognitive reflection that follows.
  • Students are encouraged to analyse their approach: what worked, what could have been done differently, and how their communication style affected the patient’s response.

The second example asks students to create a chatbot.

Coincidentally, I used the same exercise in one of my “AI for Business Bootcamps” last year.

It’s never been easier for the everyday human to create AI-powered tools with no-code platforms.

Yet, you and I both know that easy doesn’t mean simple.

I’m sure you’ve seen the mountain of dumb headlines with someone saying we don’t need marketers/sales/learning designers because we can do it all in ‘x’ tool.

Ha ha ha ha is what I say to them.

Clicking a button that says ‘create’ with one sentence doesn’t mean anything.

To demonstrate this to my students, we spent 3 hours in an “AI Assistant Hackathon.” This involved the design, build, and delivery of a working assistant.

What they didn’t know is that I wasn’t expecting them to build a product that worked.

Not well, anyway.

I spent the first 20 minutes explaining that creating a ‘good’ assistant has nothing to do with what tool you build it in and everything to do with how you design it, ya know, the A-Z user experience.

Social media will try to convince you that all it takes is 10 minutes to build a high-performing chatbot.

While that’s true from a tech perspective, the product and its performance will suck.

You need to think deeply about it

When the students completed the hackathon, one thing became clear.

It’s not as simple or easy to create a high-quality product, and you’re certainly not going to do it in minutes.

But, like I said, the activity’s goal was not to actually build an assistant, but rather, to understand how to think deeply about ‘what it takes’ to build a meaningful product.

I’m talking about:

  • Understanding the problem you’re solving
  • Why it matters to the user
  • Why the solution needs to be AI-powered
  • How the product will work (this covers the user experience and interface)

Most students didn’t complete the assistant/chatbot build, and that’s perfect.

It’s perfect because they learned, through real practice, that it takes time and a lot of deep thinking to build a meaningful product.

“It’s not about whether AI helped write an essay, but about how students directed the AI, how they explained their thought process, and how they refined their approach based on AI feedback. These metacognitive skills are becoming the new metrics of learning.”

Shantanu Sinha, Vice President and General Manager of Google for Education

AI is only as good as the human using it

Perhaps the greatest ‘mistake’ made in all this AI excitement is forgetting the key ingredient for real success.

And that’s you and me, friend.

Like any tool, it only works in the hands of a competent and informed user.

I learned this fairly young when a power drill was thrust into my hands for a DIY mission. Always read the instructions, folks (another story for another time).

Anyway, all my research and real-life experience with building AI skills have shown me one clear lesson.

You need human skills to unlock AI’s capabilities.

You won’t go far without a strong sense (and clarity) of thinking, and the analytical judgment to review outputs.

Embrace your Human Chain of Thought

Yes, I made up this phrase (sort of).

Let me give you some context…

Early iterations of Large Language Models (LLMs) from all the big AI names you know today weren’t great at thinking through problems or explaining how they got to an answer.

That ability to break down problems and display its thinking is called a Chain of Thought technique.

This was comically exposed with any maths problem you’d throw at these early-stage LLMs.

They would struggle with even the most basic requests.

It’s a little different today, as we have reasoning models. These have been trained to specifically showcase how they solve your problems and present that information in a step-by-step fashion.

We now expect all the big conversational AI tools to do this, so why don’t we value the same in humans?

Those who nurture this will have greater command of their career.

So don’t ignore your Human Chain of Thought.

Focusing your energy on the ability to explain your reasoning is far more useful in a world littered with tech products that can recall info on command.

Tools to enhance, not erode your thinking with AI

A couple of useful tools and frameworks to get you firing those neurons from the most powerful tool at your disposal (fyi, it’s your brain).

1/ Good prompting is just clear thinking

Full disclosure: There’s no such thing as a perfect prompt.

They’re often messy, don’t always work the same way every time, and need continuous iteration.

Saying that, you can do a lot (and I mean a lot!) to set yourself up for success.

Here’s a (sorta) framework I use to help think critically before, during and after working with AI.

Flowchart displaying a framework for critical thinking with AI, highlighting steps including assess, pre-prompt, output analysis, challenge, role reverse, and prompt.

Step 1: Assess

Can AI even help with your task? (It’s not magic, so yes, you need to ask that)

Step 2: Before the prompt

  • What does the LLM need to know to successfully support you?
  • What does ‘good look like’?
  • Do you have examples?

And, most importantly, don’t prompt and ghost.

Step 3: Analyse the output

  • Does this sound correct?
  • Is it factual?
  • What’s missing?

Step 4: Challenge & question

I’m not talking about a police investigation here. 

Just ask:

  • Based on my desired outcome, have we missed anything?
  • From what you know about me, is there anything else I should know about ‘x’? (works best with ChatGPT custom instructions and memory)
  • What could be a contrarian take on this?

Step 5: Flip the script

Now we turn the tables by asking ChatGPT to ask you questions:

Using the data/provided context or content (delete as needed), you will ask me clarifying questions to help shape my understanding of the material.

They should be critical and encourage me to think deeply about the topics and outcomes we’ve covered so far. Let’s start with one question at a time, and build on this.

This is a powerful way to develop your critical skills and how you collaborate with AI.

P.S. Get more non-obvious insights and guidance on AI prompting in my short course designed specifically for busy people like you.

2/ Unpack the problem

Before you start building that next ‘thing’, check out this little framework, which has helped me to do my best work over the last decade.

3/ Partner with AI, don’t use it like a one-click delivery service

If I had a dollar for every time I said this, I’d be a billionaire by next year.

Often, it’s the small and simple actions that can bring the most valued results.

That’s not to say it’s easy to do.

In this video, I share how you can use AI to improve your critical thinking as a thought partner.

Final thoughts

There’s much more to say about this, friend.

But we’ll pause here for now.

Thinking is cool, and thinking about thinking is even cooler.

Let your brain dwell on that for a bit. AI can be an extension of your thinking, but never let it shape it.

Keep being smart, curious and inquisitive as I know you are.


Before you go… 👋

If you like my writing and think “Hey, I’d like to hear more of what this guy has to say” then you’re in luck.

You can join me every Tuesday morning for more tools, templates and insights for the modern L&D pro in my weekly newsletter.