With nearly 20 years at the forefront of learning technology, I help L&D professionals harness technology to improve performance and skills. My mission is to simplify complex tech, making it accessible and actionable. I work with leading global Fortune 500 companies, and share weekly insights with 5,000 readers in my Steal These Thoughts newsletter.
The LMS is dying (apparently), Roger, the latest automated AI workflow, is going to replace my content engine and make me trillions of dollars while I sleep (allegedly), and Ninja claw, Tigerclaw or Dragon claw will do life for me so I can just wander around asking “why am I here?” (how fabulous).
Jokes aside (I do love them, though), I’ve never seen such a typhoon of pressure on the everyday human.
So, I think we all need to…BREATHE.
Better? Great.
What we can focus on is: What actually matters to us?
I believe that part of the answer to that is our skills. I get it’s hard to build the “right” skills to work with AI. That’s why I focus on skill sprints of 3-6 months these days because so much is expiring, evolving and emerging all at once.
In this one, we’re exploring 3 high-value AI skills you need in 2026.
What if your AI skills from last year are already outdated?
I know that’s a scary question, but damn does it move fast.
One week I’m happy making my AI assistants smarter with internal knowledge files, and the next I’m running around extending their capabilities with MCPs, plugins and all sorts of sorcery.
My point being, sitting still for too long on the technical side of AI tools is becoming somewhat dangerous. I know too many people chasing after tools, but I’m more interested in cross-platform skills that enable high value from the majority of tools.
I have 3 today which I believe will serve you well across the never-ending spectrum of AI applications.
What are AI skills?
An engineer will have a different view on this vs an L&D pro.
I break AI skills into two components:
1/ The technical: “How do I use ‘x’ tool and its features?”
2/ The behavioural: how you think, judge and work alongside the tool.
AI ‘fluency’ and ‘literacy’ are better ways for us to frame “AI skills”
These frameworks acknowledge that using a generative AI tool (the most popular being LLM assistants) is about more than the technical skill of operating the tool alone. Personally, I like the term ‘AI fluency’ because the technology moves so fast, which means both the tech skills and the behaviours you need to work with it are forever updating.
For the purpose of today’s conversation, we’ll focus specifically on the technical skills side of this because I have written a lot lately on the behavioural side.
Know what’s expiring, evolving and emerging (The 3 E’s)
I do a quarterly skills review.
It used to be every half year, but AI changed that plan.
Your skills are an ever-flowing organism (best analogy I could think of in this moment). That means they never stand still, and without attention, they’ll degrade.
If your goal is to build a talent stack (a combo of your skills, behaviours and attitudes) that’ll keep you employed for the long term, you’d be wise to perform regular maintenance on yours too.
I use a simple framework I picked up from a Gartner report over a decade ago.
All you do is look at your current skills and ask:
What skills are expiring and no longer serve me and/or the world today?
What skills do I need to evolve to meet the demands of today?
What are the emerging skills I can get ahead of?
While I say simple, that doesn’t always equal easy.
You could even spin this as a bit of a life analogy too. We all need to know what to leave behind, what to double down on and what to keep a lookout for.
I’ve tagged each of the 3 AI skills I recommend you invest in with one of these categories to give you a sense of where it sits in your current skill cycle.
3 high-value AI skills you need in 2026
1️⃣ Prompting [Evolving]
No, it’s not dead, but it has evolved.
AI skills have such a short shelf life, and prompting was hailed as the most important skill of the century. While I didn’t agree then (or now), it is the main way we interact with LLMs.
At its core, prompting is just inputting a query.
A lot of frameworks from 2023-2025 are no longer needed because models have become much more capable, with memory and custom instructions. Prompting is also weird because no one template works, and two people with the exact same prompt can get wildly different outputs.
You still need to prompt, just less skilfully than before because of our next two skills.
2️⃣ Context Engineering & Context Layering [Emerging]
These sound the same, but I’m framing them in two ways.
One for builders/engineers and the other for non-technical users.
For builders, context engineering is the science of high-signal curation. They’re not stuffing AI with every possible file. The goal is to build dynamic systems that “just-in-time” retrieve the exact tool, specific knowledge chunk, or agent-to-agent communication needed for the next step.
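As a rough sketch of that builder-side idea (every name and the naive scoring method here are my own illustration, not any vendor's API), "just-in-time" retrieval means scoring your knowledge chunks against the current task and passing along only the most relevant ones, rather than stuffing everything into the prompt:

```python
def score(chunk: str, query: str) -> int:
    """Naive relevance score: count of shared words. A real system would
    use embeddings, but the curation principle is the same."""
    return len(set(chunk.lower().split()) & set(query.lower().split()))

def retrieve_context(knowledge_base: list[str], query: str, top_k: int = 2) -> list[str]:
    """Return only the top_k chunks most relevant to the query."""
    ranked = sorted(knowledge_base, key=lambda c: score(c, query), reverse=True)
    return ranked[:top_k]

# A toy internal knowledge base (illustrative content only)
knowledge_base = [
    "Expense claims must be filed within 30 days of purchase.",
    "Annual leave requests go through the HR portal.",
    "The onboarding checklist lives in the shared drive.",
]

question = "How do I file an expense claim?"
context = retrieve_context(knowledge_base, question)

# Only the curated chunks reach the model, not the whole knowledge base
prompt = "Context:\n" + "\n".join(context) + "\n\nQuestion: " + question
```

The design choice is the point: the system decides what the model sees at each step, which keeps the signal high and the noise out.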
This is not too dissimilar from what end users like me and you can now do.
As we start using more AI agents, knowing how to provide the right source, in the right format, at the right time is critical. We already see these opportunities with memory, custom instructions, connecting our favourite apps and uploading multiple file sources into our chats.
Context engineering for the everyday human is about building the infrastructure around the AI, so it has everything it needs to make better decisions.
An art and science in itself.
We could say that prompting + context layering + skills/instructions = the 3 layers of a strong AI response.
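To make that equation concrete, here's a toy sketch of the three layers assembled into a single request. The layer headings and assembly order are my own framing, not any tool's required format:

```python
def build_request(instructions: str, context: str, prompt: str) -> str:
    """Stack the 3 layers: persistent instructions (how to behave),
    curated context (what to know), and the prompt (what to do now)."""
    return (
        f"## Instructions\n{instructions}\n\n"
        f"## Context\n{context}\n\n"
        f"## Task\n{prompt}"
    )

request = build_request(
    instructions="You are an L&D assistant. Keep answers under 100 words.",
    context="Our org follows a 70-20-10 learning model.",
    prompt="Draft a one-line intro for a new manager training email.",
)
```

In practice, tools handle the first two layers for you via custom instructions and memory; the value is in deliberately curating what goes into each one.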
Fyi, I have a guide and a video series on context layering/engineering coming in a few weeks.
3️⃣ Creating Skills for AI [Emerging]
The best example of this is Claude Code and Cowork.
You can create a skills package which teaches the LLM how to perform a task. It’s a combination of context, instructions and guidelines that can repeatedly do the task with little oversight.
To make the best use of this, you’ll need to get comfortable with creating Markdown files. In these, you’ll unpack not only how to complete a task but why, and all the micro elements that make up the big choices. Then you maintain that skill just like you do your own.
We’ll see this capability with more LLMs across this year. The great thing is you can take your .md files to any tool, so maybe we’ll all be building .md files now.
The real skill here is knowing how to break down a process for an LLM to understand with a combination of instructions, examples and evaluations.
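For a sense of what that breakdown looks like, here's a hypothetical skill file. The exact layout varies by tool, and this structure is my own illustration rather than Anthropic's official format:

```markdown
# Skill: Weekly newsletter summary

## Purpose
Condense a draft newsletter into a 3-bullet summary for social posts.

## Instructions
1. Read the full draft before summarising.
2. Keep each bullet under 20 words.
3. Preserve the author's casual tone; no corporate jargon.

## Example
Input: a 1,500-word draft on AI skills.
Output: 3 bullets, each naming one skill and why it matters now.

## Evaluation
A good output covers every major section and fits in one social post.
```

The decomposition is the skill: purpose, step-by-step instructions, a worked example, and a way to judge the output.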
Final thoughts
Ok, folks.
That’s my thinking out loud for the day.
Obvs, AI moves so fast that these are relevant skills right now, but maybe not forever.
Before you go… 👋
If you like my writing and think “Hey, I’d like to hear more of what this guy has to say” then you’re in luck.
You can join me every Tuesday morning for more tools, templates and insights for the modern L&D pro in my weekly newsletter.
Whenever someone asks me, “How do I learn AI?” I have to pause because it feels like it’s a double-loaded question.
I understand why people want to ask me that, especially in my line of work and visibility in showcasing different use cases of artificial intelligence specifically in the realm of learning and development.
Despite this, it’s still really hard for me to give a concrete and clear answer.
It’s difficult because there are so many courses, resources, and experiences that are available to you. For the most part, they all sound the same, look the same, and tell us that they’re gonna give us the same outcome.
I’m always of the belief that there isn’t one tool, one method, or one strategy to rule them all.
And… to top it all off, the term “learn AI” is pretty ambiguous.
What I assume they mean is to understand how a generative tool, like an LLM, works and how they can work with it to get their desired results.
What is the context of the individual and organisation?
What are the constraints they may be experiencing?
What is their current starting point?
Without understanding all of these key inputs, it’s very hard for anyone to give you a clear answer to the question I’m constantly posed.
Now I know you’re probably reading this and thinking “oh God please stop writing so much and just tell me what I should do.”
So that is exactly what we are going to do today.
While I’m not gonna give you a clear 100% “this is the course to rule them all”, what I will do is share a selection of choices.
What is AI Fluency?
I’ll be using this term a lot, so let’s unpack it.
A simple way to define AI fluency is the ability to understand, use, and effectively collaborate with AI tools in your day-to-day life and work.
AI fluency is not about turning people into software programmers or highly technical experts. Instead, it simply means equipping people with the confidence and capability to think and work alongside AI.
In practical, easy-to-understand terms, being “AI fluent” means you know how to do the following:
Know when to use it: Understanding what AI is actually good at, recognizing its limitations, and deciding when a task needs a human touch instead.
Communicate clearly: Knowing how to give AI tools the right instructions (prompts), context, and constraints so they give you helpful and accurate results.
Apply human judgment: Critically evaluating the information AI gives you rather than taking it at face value. This means being able to spot mistakes, biases, or missing context.
Use it responsibly: Understanding the basic ethical guidelines of AI, such as protecting sensitive or private data, and taking ultimate responsibility for whatever the AI helps you create.
I like this framing because it’s about treating AI as a capable “thought partner” to make your life easier, while always relying on your own human judgment to steer the ship.
Now we’ve clarified that, onwards we go.
How I assessed these zero-cost AI courses
Okay, here’s the method to my madness with this assessment.
I’ve looked across the market at all of the courses which are tagged with ‘AI fluency’.
There are a lot of free resources on YouTube, with varying degrees of quality, but I’ve not included them here because that’s really down to your own taste.
This led me to focus my testing on the leading AI and tech companies building the tools we know. The idea is since they’re building these tools, they might just be a good place to learn about their capabilities.
With that focus, I spent the last six weeks completing courses from the following companies:
Google
DeepLearning.ai
Anthropic
Microsoft
That gives us four courses in total.
You’ll notice one big omission here from OpenAI.
I did visit OpenAI’s Academy to try to find some resources and courses for the general population, but I found the Academy very difficult to navigate. I didn’t find much beyond the ability to sign up for webinars and some rather outdated-looking help articles.
That’s not to say that I dislike OpenAI at all.
I would be more than happy to add them in once I see an improved offering and can feel confident recommending it to you.
My recommendations for each course will include:
What does the course cover?
Who would it benefit the most?
What will you tangibly walk away with that you can use the next day?
Now, a disclaimer for you…
The right choice for you is contextual, and depends on the 3 questions I posed in the opening section. I’ll give you my recommendations on which courses are truly worth your time in the sea of hundreds currently available.
Yet, your choice on which of the 4 is most useful to you is…well, down to you.
You can use these options for your company, too
Long-time readers will know I’m all about the value!
Since I’m not part of the AI bros club (let’s be honest, my hair is far too fabulous and my fashion sense too cool for that), where you get literally nothing beyond “This changes everything” comments, I’m making these recommendations not only for your development but for that of the teams you support too.
A lot of organisations are asking you to lead or play a big support role in adoption of AI.
The good news is once you’re confident with the content, you can then use it to your advantage to enable your org to be just as smart with AI (if you want to, of course).
The 10-second answer
The 4 AI fluency courses I’d recommend for L&D pros
Let’s get this show on the road.
This will be the only playbook you need, and I’ll do my best to keep it updated as the years roll by, or until my battery dies (hopefully, not anytime soon).
AI For Everyone by DeepLearning.AI
Funny name for this one because I don’t believe it’s for everyone.
Of all the courses I share today, this one is the most advanced-ish, so I’d reserve this for those who are farther down the yellow brick road of AI awareness.
Who it’s for:
This was one of the first AI courses I came across in mid-2023, although it covers much more than generative AI. It’s taught by Andrew Ng, co-founder of Coursera and founder of Google Brain (a deep learning predecessor to the better-known Google DeepMind), and an actual computer scientist.
So yes, this guy legit knows what he’s talking about.
While I found this experience great for my inner nerd, it’s certainly only for those who want a broader understanding of everything AI, not just LLMs and everything under the generative umbrella.
Maybe one best placed for my fellow learning tech folks.
What it does well:
The course gives you everything you’d need to know about AI without being an engineer.
Talking head videos and resources are clearly explained, and easy to download for your future viewing pleasure.
Where it falls short:
Now, this only falls short depending on your context.
If you’re not an engineer, or don’t care about moving into a specialised AI role, it will probably overwhelm you. It’s stuffed with lots of info, and most of it isn’t relevant if you just use a standard LLM every day.
It also has no hands-on use of AI tools, though I find it hard to see where that could be weaved in among all the Matrix-level explanations.
Verdict:
❌ Skip if you’re among the 95% of average AI users.
✅ Dive in if you’re an engineer, curious nerd or want to make a career pivot to AI as a broader category.
Google AI Essentials by Google
While Google came late to the useful LLM party, it can’t be denied they’re certainly a frontrunner with their abundance of tools and Godzilla-sized ecosystem.
Who it’s for:
This is pitched at the pure beginner. If you or your teams look at an LLM and have no clue what’s going on, this is the place to start.
Google’s team covers the basics in a way that’ll give you better standing in those workplace conversations.
What it does well:
Like I said, it’s framed perfectly for beginners, and in the traditional online course delivery we all know, but maybe don’t love. There’s plenty of focus on low-level tasks where you can collaborate with AI, and a surprisingly good overview of prompting too.
Where it falls short:
Like most choices on this list, the videos are super corny.
Even more so with these Google ones. I feel like the real humans could be avatars because the delivery is so scripted and robotic that it breaks any form of immersion.
If you know the basics and are confident with generic LLM activities, you won’t get much from this.
Note: If you want to go to the next level, Google offers a paid professional certificate which bolts on a “How to build with AI” category, missing from all of the free courses above. Free-to-access tools like Google AI Studio are criminally underrated because they’re pitched as “developer” tools but are very easy to use once you’re set up with some basic knowledge. This course provides that next step in your AI fluency journey.
Verdict:
❌ Skip if you’ve been using LLMs for a few years and know the basics.
✅ Dive in if you’re in beginner territory and need to build the foundations of understanding and working with LLMs
AI Fluency: Framework and Foundations by Anthropic
I know I said “this isn’t a competition with one winner”…but…
This one is my most highly recommended for everyone.
I like this one because it’s focused on “how to work and think with Gen AI and LLMs as a category”, rather than, “here’s how to use our tool”. It does a good job on prompting you (pun intended) to think about how you engage with AI and its outputs.
This is more valuable for the long game with AI. The tools will change, but investing in how you think and work with AI won’t.
Who it’s for:
It’s for anyone ready to move beyond tools and templates and start thinking with AI.
So, basically, everyone reading this.
“Simply having access to AI doesn’t make you competent and fluent with AI.” – Anthropic
What it does well:
The course gives you a clear fluency framework you can pick up and implement with a few tweaks. The exercises (when you find them) are practical, immersive, and teach you to think about how you’re working with AI, not just what you’re asking it.
There’s a strong breakdown of techniques that improve both AI outputs and your own thinking in the process, and I liked the encouragement to work alongside an LLM to solve the problems the course unpacks.
Where it falls short:
The UX needs work.
Exercises are buried below the fold with no direction to scroll down. I was surprised an instance of Claude wasn’t embedded for these. Why not include live demo exercises in the videos to actually model thinking with AI?
There’s also a section covering transformers, compute, and tokens. Nice to know, but for most people this isn’t the advantage the course frames it as.
Verdict:
✅ Take it.
It teaches you how to work with AI, not just how to use a tool. That alone sets it apart from 90% of AI courses out there right now.
I can’t think of someone who wouldn’t benefit from it.
Microsoft’s AI fluency course (for Copilot users)
I know a lot of companies are using Microsoft infrastructure, so it was a no-brainer to include this one.
The ultimate test of whether this is right for you and your teams is whether you’re using Copilot as your go-to company LLM.
Who it’s for:
The easy answer is if your organisation’s LLM of choice is Copilot and you want to make the most of it across all the popular MS apps.
What it does well:
A comprehensive walkthrough of Copilot that’s perfect for any MS-powered organisation. If you’re using Microsoft tools, you can’t go wrong here.
You also get a good overview of AI Fluency (according to Microsoft) and ideas on how to amplify its principles across a team, department and org level.
Where it falls short:
It’s painfully long, and stuffed with too much unnecessary content.
When you look at it, it’s just poor design. There’s some good stuff here, but it’s buried under mountains of content; it felt like the brief was ‘how much can we stuff in?’. That’s the biggest problem with this one. If you’re a Copilot user, it’s made for you, but you’ll be wading through a lot of ‘nice to know’ rather than ‘this is actually gonna help me do better work’ content.
I don’t like the gamified angle with experience points either.
That might be just me, yet I felt like I was in some ’90s retro game at points, and I didn’t want to be there.
Verdict:
❌ Skip if you’re not using Copilot
✅ Dive in if you’re using Microsoft across your organisation with Copilot, but select the parts that make the most sense for your work, as you don’t need everything on offer here
1. Take Google’s AI Essentials course to build your foundations in generative AI.
2. Follow it with Anthropic’s AI Fluency course to learn how to think and work with AI.
3. Create a free Google account to access Google AI Studio, where you can test prompts, build apps and learn about agents in one place.
That’s it.
Google’s AI Essentials gives you everything you need to know about generative AI, and Anthropic’s fluency course builds on this by teaching you how to work and talk to AI without degrading your power of thought. Google’s AI Studio is the cherry on top as it lets you practice everything in one space with free access to top models.
That’s my zero-cost learning plan for you 😉.
Final thoughts
There you have it, friend.
These are my picks of the most impactful zero-cost AI courses worth checking out. Like I said, there’s no overall winner, just what works best based on your context, experience level, and goals.
I hope this helps you build your skills and enable those around you with AI too.
Let me know your thoughts on these courses, and any I missed that you think I should check out.
See you next time, human.
It’s everything, everywhere, all at once, and kinda feels like a chokehold at times, or maybe that’s just me.
Yes, I have contributed to this myself in my domain of learning. I love to endlessly explore how modern technology can enhance and amplify human learning, but these AI bros on social media are making it hard for me to keep enjoying that.
Nonetheless, while AI is cool, sexy and an integral part of the infrastructure of how work and learning get done, there’s more to life than those two little letters.
So, I thought what better way to bring some balance to AI everything than by sharing some good old-fashioned analogue tools that any human can plug and play.
We compare ourselves to others when we shouldn’t, fear the tech takeover and stay glued to sensationalist headlines curated by outlets that position themselves as ‘news’.
It’s tough, and I feel it too.
Before the tidal wave of AI, ‘purpose at work’ used to be one of the top drivers in both L&D and employee engagement strategies. It’s fallen to the side in the rush to jump on the AI bandwagon, yet, it feels like a crisis of purpose could be on the rise.
If AI is threatening to do everything that we do, where does that leave me and you?
It’s for that reason that purpose both at work and in real life is having more of a moment.
But I feel like it has to be said, as it is the ultimate test in my opinion.
If you’re not prepared to cough up, let’s say, $100 a year to use your L&D product, then don’t expect your workforce to do it.
This is the same question I ask when crafting my products and services. We each vote with our time, attention and money. It’s the ultimate compliment for someone to say ‘yes’ to all three.
I can sleep at night knowing I say YES to these.
I encourage you to reflect on the same at the start of every year when everyone talks about ‘Learning Strategy’.
Don’t just focus on strategy, understand the value.
Ask your whole team, if this was a paid product, would we all pay to use it?
The answer to this is everything you need to know.
Dropbox, the cloud storage provider, was created in this way. Drew Houston, the CEO, was so frustrated with existing solutions that he built his own. He pays for it, and it turns out millions would pay for it too.
If you wouldn’t buy your own product, why should anyone else?
3/ The simple skill-building strategy to stay relevant
Ahh skills…why do we insist on making it so hard?
Our industry is built to support best-in-class skills, yet we find so many ways to make it complicated with terminology like ontologies, taxonomies and the latest ‘skills-based systems’, whatever that means.
I feel exhausted just reading that last sentence.
It can be simpler, it should be simpler.
Part of my rituals at the beginning of the year involves analysing my skillset, but with none of the complex tools our industry chucks at us. Instead, I use something much simpler to ensure I have the most cutting-edge skills to do what I do, and keep ahead of the pack.
This is what I do.
I grab a notebook or open a doc and do the following:
List my current high-level skills
The emerging technology, trends and challenges in my industry
Then, I ask these questions:
What skills are expiring and no longer serve me and/or the world today?
What skills do I need to evolve to meet the demands of today?
What are the emerging skills I can get ahead of?
Yes, it’s that simple.
You can call me crazy, but I believe you could graft this onto a much larger population of a workforce, too. We’re often convinced that it all needs to be complex to be valuable, but that’s not right.
Sometimes the simple things can have the biggest impact.
Final thoughts
Ok, that’s it for this one, friend.
Expect more analogue and digital tools to keep coming your way. At the end of the day, we all know that any tool is only good in the hands of a competent human.
→ If you’ve found this helpful, please consider sharing it wherever you hang out online, tag me in and share your thoughts.
Yet, I find myself in rare company these days as so many seek to use AI to ‘do the work’ instead of collaborating to do ‘your best work’.
I think that’s gonna be a big problem and I have some thoughts on that.
Before GPT
The first time I used a generative AI tool was back before ChatGPT launched, when a colleague introduced me to an early GPT model from, at the time, a little-known company called OpenAI.
ChatGPT didn’t exist yet.
This was its basic form, before we experienced life through a prompt bar. The only people playing around at this point were nerds like me. Once I got past the not-so-friendly interface, I saw this tech’s early potential.
At that time, I kept thinking this would be a great way to collaborate with technology to do better work.
What I couldn’t see at that point, or perhaps didn’t want to recognise, is humanity’s desire for instant gratification and the obsession with outsourcing/delegating every piece of work. The current AI marketing from all corners of the industry leans on this sense of ‘work is bad, so let AI do it for you’. I know that sounds like some weird slogan from a commercial in the ’60s.
The purpose of work
I get a great deal of value from my AI tool stack.
Perhaps I’m the weird one, but my focus with AI is to help me do my best work, not outsource it.
The work, very much like learning, is where the hard stuff happens.
The ‘aha’ moments you would never have conceived without the focused effort, the seemingly unrelated events that craft a connective bridge of ideas which lead to something incredible.
My industry of workplace learning has (or had) a saying: “Learning is the work and the work is learning” – it’s something like that.
It seems like too many of us have fallen out of love with doing the work.
Again, the problem isn’t AI, it’s us.
Our intentions have become skewed in the promise of an era where an artificial intelligence will do anything and everything for you. Yet, we rarely sit back to ask “Just because we can, does it mean we should?”, and even if we can, do we really want to?
Doing ‘the work’ is a big part of purpose for many.
Purpose, meaning and fulfilment are a dumpster fire quickly rolling across society as we race to delegate, automate and outsource everything in the pursuit of ‘reclaiming time’ or ‘being efficient’.
We may not see it now, but it’s coming.
A bit of effort, struggle and focus is not bad for you, so don’t discount “Doing the work”.
“If you don’t make mistakes, you’re not working on hard enough problems and that’s a big mistake.” – Frank Wilczek
L&D teams trying to drive, and/or being held responsible for, total company AI adoption is a fool’s errand.
We both know this is stupid, yet I see too many instances of companies trying to “train” their way into successful AI adoption.
Of course, L&D teams alone aren’t going to make any organisation achieve meaningful AI adoption (however you measure that). Yet, we do have a part to play, and recognising where we can best support is critical.
So, let’s explore where L&D can support meaningful AI adoption.
It’s odd, because ‘adoption’ has many definitions depending on the context and environment. The common pitfall is to measure adoption as ‘use of AI tools’ alone.
As we know, with previous technology, usage alone doesn’t mean meaningful adoption.
Setting what adoption looks like in your organisation is not a task for the L&D team.
Yet, we have an opportunity to contribute to long-term and meaningful adoption of AI across workforces as part of a wider collaboration in a community.
Let’s talk about that…
It takes more than access
Let’s go beyond the veil of bullshit we see online.
Access to an AI tool alone means nothing, and putting on one-hour lunch and learns to “make people learn AI” is a comical upskilling strategy.
If you’re a long-time reader, you’ve heard me become a broken record when I talk about what it takes to nurture meaningful and long-term change. We have much to consider with context, culture and constraints in each environment. No two workplaces are the same, that’s why the cookie-cutter “adoption frameworks” make me laugh.
They’re a good point of inspiration, but you shouldn’t follow them like a strict set of instructions.
Saying that, what is it we need to consider beyond tools?
People, Systems and Tools
As you’ve probably guessed, launching new technology and tools alone rarely leads to meaningful adoption.
There’s a bigger ecosystem at play.
We have to consider:
1/ People
Where are people at today, and how do we meet them?
Everyone will have a different understanding, maturity and receptiveness to something new and unknown. In AI’s case, we have a mix of emotions from “will this take my job” to “I want it to do all this stuff I hate doing”.
The most difficult part of a change process is people, because we’re all so unpredictable.
2/ Systems
Quite simply, how we work today.
What are the tried, tested and trusted conscious and unconscious systems we have in place? This covers both how we execute tasks and how we think about executing those tasks (deep, I know).
We each follow different types of systems in our day to day.
Understanding what these are and how AI will impact those is key to this change.
3/ Tools
The part you’re most likely more familiar with.
Here, we should consider the tools in use today alongside new ones being deployed, and how to bridge the gap in both understanding and knowing when and where to deploy them.
Too many forget the ‘when and where’ part at their own peril.
For us to recognise where we can provide support and drive value, we must recognise what’s changing.
I think this framework from BCG can help us recognise the moments where performance support is most needed during AI transformation.
They propose it for navigating AI transformation at scale, and through an L&D lens, I see this as a conversation point of what to map against when focusing on how best to support workforces.
It’s built on two key dimensions:
1️⃣ AI Maturity
It progresses from tool-based adoption by individuals to workflow transformation, to full, agent-led orchestration. Most organisations, and even teams within them, operate across multiple stages at once, not in a linear path.
2️⃣ Workforce Impact
This spans how tasks are executed, to what skills are needed, to how teams are structured, to how organisational culture must evolve to support new ways of working.
While this covers the wider transformation AI brings across businesses, it acts as a roadmap for L&D.
A roadmap is often what we need because it’s not uncommon for senior leaders to treat “training” (as they call it) as a boomerang that’s thrown at will when they decide people need to know stuff.
The framework above provides a view of where the friction, pain points and problems exist in the cycle of change. That’s where we should focus.
Map it out
I mentioned before not to blindly follow frameworks, and that advice is the same here.
This view from BCG is a useful foundation for each of us to think about “where can we add value”, but it will look different for each environment.
So, I’d recommend you map out what your organisational journey looks like today.
Explore the 3 pillars of tasks, talent and teams across your business and how/where AI is starting to and might impact these. It’s here that you will uncover the friction and pain points where we can be of most service.
Some of that will be through tooling, no doubt. Yet, I feel pretty safe in saying you’ll be spending a good deal of your time navigating changes within people and systems.
Final thoughts
I’m going to leave it here for now, folks.
There’s much to say, of course, but only so much attention span I can ask you to give.
I’m thinking of expanding some of this thought into a long-form video. If that sounds like something you’d like to see, let me know.
In the meantime, some additional resources to explore on this include: