Category: Artificial intelligence

The Ultimate Guide To Using (or Avoiding) AI At Work

There’s a time and place for everything.

  • Dumb hairstyles = school and college
  • Overpriced tight-fitting clothes to impress the opposite sex = your 20s
  • Not being judged for eating an entire chocolate log = Christmas
  • Using generative AI tools = ?

While I hope you agree with the rest, the last one is debatable.

Depending on your relationship with AI, your view on ‘when’ to use its delightful powers can be vastly skewed.

The ‘AI cultists’, as I like to call them, will proclaim we should use AI for everything, while the ‘doomsdayers’ will warn you not to touch it or you’ll lose your humanity.

Of course, the truth of the matter is not so clear-cut.

There’s an interconnected web of assessments and decisions to be made. The good thing is that this is all human-powered. The world has been so focused on ‘how’ to use new tools that we’ve paid little attention to ‘why’ and ‘when’.

Let’s change that.


📌 Key insights

  • AI is a tool, not a saviour
  • Boring and basic tasks are where AI shines brightest
  • Balance your understanding and application for maximum benefits
  • AI is not a hammer

Assess tasks not jobs for AI

I appreciate LinkedIn CEO Ryan Roslansky’s concept of assessing ‘tasks, not jobs’ in the context of generative AI at work.

This idea originates from Ryan’s Redefining Work article, where he explores how AI will accelerate workforce learning and amplify the importance of skills.

Ryan suggests moving away from viewing jobs as titles, and instead, seeing them as a collection of tasks. These tasks will inevitably evolve alongside AI and other technological advancements. He recommends breaking your job down into its primary daily tasks.

You can bucket those tasks in this format:

  1. Tasks AI can fully take on for you, like summarising meeting notes or email chains.
  2. Tasks where AI can help improve your work and efficiency, like helping to write code or content.
  3. Tasks that require your unique skills – your people skills – like creativity and collaboration.
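
If you like to see this in a more structured way, here’s a tiny sketch in Python of what that bucketing can look like once you’ve listed your daily tasks. The tasks shown are purely illustrative – swap in your own.

```python
from enum import Enum

class AIBucket(Enum):
    DELEGATE = "AI can fully take this on"
    ASSIST = "AI can improve my work and efficiency"
    HUMAN = "Requires my unique people skills"

# Illustrative only – replace with the real tasks from your own role.
my_tasks = {
    "Summarise meeting notes": AIBucket.DELEGATE,
    "Summarise long email chains": AIBucket.DELEGATE,
    "Draft first-pass course content": AIBucket.ASSIST,
    "Write helper code for reporting": AIBucket.ASSIST,
    "Facilitate a team workshop": AIBucket.HUMAN,
    "Coach a struggling colleague": AIBucket.HUMAN,
}

for task, bucket in my_tasks.items():
    print(f"{task:35s} -> {bucket.value}")
```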

This sets the stage for how I currently recommend working with AI.

Where AI helps best

You might see glamorous examples of generative AI tools on social media.

In reality, the majority of benefits come from tackling boring and basic tasks. I’m talking about writing better emails, summarising reports, and brainstorming ideas.

It’s smart to delegate simple, mundane, yet time-consuming tasks to AI.

This creates space for more human-centred work.

I don’t understand why some people seem determined to have AI handle the human elements. What a boring life that would be! I want AI to handle the laundry via a workflow so I can focus on building cool stuff – not the other way around.

Asana's report on the leading use cases for generative AI at work.
Source: Asana AI at Work Report
A Gallup chart showing how AI tools are used differently across job levels.
Source: Gallup

A bunch of smart folks have done lots of research on this.

The above visuals come from Gallup and Asana, but I want to talk a little bit about a joint research project from Boston Consulting Group and Harvard.

These two powerhouses wanted to cut through the hype to see if AI tools like ChatGPT can improve productivity and performance. They worked with 758 BCG consultants (about 7% of their individual contributor-level staff) and split them into three groups:

  • One without AI access
  • One with GPT-4
  • Another with GPT-4 plus some training on prompt engineering

These consultants tackled 18 real-world consulting tasks to see how AI would affect their work.

The results? Pretty impressive, I’ve got to say.

The consultants using AI managed to complete 12.2% more tasks and knocked them out 25.1% faster. But here’s what really caught my attention – the quality of their work shot up by more than 40%!

It’s one thing to do something at speed, but another to do it at such high quality too.

That’s the trap I see happening in every industry right now. Too many prioritise speed over quality. You can have both if you craft the right skills to collaborate with AI.

There was a catch though (when is there not!).

When consultants tried to use AI for tasks it wasn’t built for, their performance dropped by 19%.

I don’t see this as a negative. It’s very helpful to know where the limitations are. You cannot have a balanced approach without this. Another particularly interesting outcome was how the consultants ended up using AI.

Some folks took a hybrid approach, blending AI with their expertise, while others went all-in and relied heavily on AI.

Both styles seemed to work, but context was key.

Those marked as novice employees found the biggest performance gains, while the boost was smaller for those classed as experienced workers. Even so, the latter category still saw a modest lift of 15% on most tasks.

TBH, I’d take that on most days.

You can’t buy time

Time is a fickle thing.

It’s our most precious and non-renewable resource.

If you’ve been to any of my keynotes in the past year, you will have heard me touch upon this. Perhaps it’s the broadened awareness of my mortality.

It’s probably got something to do with being very close to 40 years old, which my 23-year-old self didn’t expect to happen.

My impending mid-life crisis aside, time is something you should care about deeply.

You can always make more money, but you can’t buy more time.

The biggest promise and opportunity with AI tools is being able to reclaim that precious resource.

I’m not fussed about making 6-figures or building teams with AI only. I’m much more invested in getting time back to spend with those close to me and doing more of the human stuff I love at work.

We’re starting to see what people are doing with some of these time gains.

In Oliver Wyman’s AI for business research, they estimate Gen AI could save 300 billion work hours globally each year. I think that would be a wonderful outcome (as long as it doesn’t involve me doing more washing!).

A Boston Consulting Group chart showing how people are using the time they save by using AI.
Source: Boston Consulting Group (BCG)

Where AI Is Not Your Friend

I know this might break some hearts, but…AI is not your saviour.

Life is a mix of opportunities and pitfalls.

Research from BCG and Harvard offers an important lesson: generative AI works exceptionally well when used for tasks it can handle. However, beyond that, it’s the wild west.

As always, context is key in decision-making, and tools are constantly improving. This is where I like to appeal to everyone’s common sense. Yet, as I’m often reminded, common sense, it seems, isn’t so common these days.

It’s impossible for me to cover every task across every industry you might encounter.

Instead, here’s a general framework to help you determine when to use generative AI. The summary is simple: AI works well with tasks with pre-defined guidelines and less severe consequences of a f**k up. It should not be relied upon in what I class as ‘mission critical’ matters, aka the human stuff.

Over-reliance on AI is already a significant threat to education, work, and life.

We explored this in a recent edition on the “Hidden Impact of AI on Your Skills”.

In schools, new research has shown generative AI harms students’ learning because they over-rely on these tools, quickly losing key human skills. More alarmingly, we’ve seen the rise of AI companions as therapists and friends among 18–24-year-olds (especially men), replacing vital human connections.

This is why I always emphasise helping people develop the mindset and behaviours to use AI intelligently. Note: I define ‘using AI intelligently’ as understanding the why, what, how, and when of AI applications versus tasks.

Adoption can easily slip into addiction.

An easy framework to decide when and when not to use generative AI.
Choose wisely, human

How to identify tasks AI can help with

This is the thing we all need help with.

Where can and can’t AI help me?

There’s no clear-cut answer to this. I’d love to give you some fancy 2×2 framework but I don’t believe that will serve you well. Each scenario is context-specific, and generative AI tech is evolving so fast.

I tend to think about my tasks in a macro and micro view.

Your macro tasks can easily be broken down into sub-tasks (micro). We’ve talked about continuing to invest in your thinking in this era of AI. This is something that requires deep thought and reverse engineering your ideal outcomes.

As an example, I use a little table like this:

An easy decision-making framework to identify tasks AI can help with at work.
Simple and effective

It’s not fancy, but it does the job.

We have two macro tasks:

  • Presenting insights and actions on the L&D function’s performance to senior leaders
  • Launching a new internal course

For our first task, my outcome is to deliver a presentation to senior leadership on L&D performance.

So, I break down (in my mind) the micro tasks to reach that, as you see above. I then assign each of those to a column. Note: The first column can be automation without AI.

I don’t use this for every task, only those that I believe, with my current experience of Gen AI, could be an opportunity to work smarter.

What’s key is the AI components are always low to mid-level, and the mission-critical parts are always done by me (the human).
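
To make that concrete, here’s a rough sketch of the same table as code. The column names and micro tasks below are my illustrative assumptions based on the description above, not the exact contents of my table.

```python
# A rough sketch of the macro/micro table described above.
# Column labels and micro tasks are illustrative assumptions.
task_plan = {
    "Present L&D performance insights to senior leaders": [
        ("Pull completion and feedback data from the LMS", "Automation (no AI needed)"),
        ("Summarise open-text learner feedback", "AI (low/mid-level)"),
        ("Draft a first-pass slide outline", "AI (low/mid-level)"),
        ("Decide the narrative and recommendations", "Human (mission critical)"),
        ("Present and handle questions from leaders", "Human (mission critical)"),
    ],
    "Launch a new internal course": [
        ("Schedule comms and calendar invites", "Automation (no AI needed)"),
        ("Draft announcement copy", "AI (low/mid-level)"),
        ("Design the learning experience", "Human (mission critical)"),
    ],
}

for macro, micro_tasks in task_plan.items():
    print(macro)
    for micro, owner in micro_tasks:
        print(f"  - {micro:45s} -> {owner}")
```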

Final thoughts

Knowing how to use AI tools is useful.

But understanding why and when to call upon their power is an advantage.

As we’ve covered, there is no one right way to assess this. The simplest part (imo) is to get clear on what are the uniquely human tasks in your work. Mark these as ‘mission critical’ – so you have zero or very minimal AI assistance.

Your low- and mid-level tasks should become clearer from this.

I say this sooo often, but it’s a damn good quote and continues to be relevant in this space:

With great power comes great responsibility

Uncle Ben (Spider-Man’s uncle)

Think wisely about when to wield that power.


Before you go… 👋

If you like my writing and think “Hey, I’d like to hear more of what this guy has to say” then you’re in luck.

You can join me every Tuesday morning for more tools, templates and insights for the modern L&D pro in my weekly newsletter.

Category: Learning Technology

A Practical Guide to Choosing the Best Learning Tech For Business

Don’t you love buying new stuff?

I do, especially when it comes to new tech. Around this time every year, I find myself doing a strange dance with Apple’s newest iPhone. It’s new, has shiny colours, and has attractive buttons – I have no clue what they do, but I want them!

The problem is there’s a difference between a want and a need.

And I may want…but I do not need it.

This happens inside companies too.

In research for my 2022 talk “How to stop buying bloated learning tech”, I discovered that the average company provides over 88 different workplace tools for employees. A crazy stat, I know.

If you think about it, it’s not really that staggering in today’s environment.

I’ve worked with large organisations with 20+ LMS/LXPs and tiny organisations with nothing but a Google doc.

We’re in a confusing land of boundless technology and pressure from market expectations. We lose our way with acquiring shiny new things when we would do well to step back and figure out what’s the right tool for the job.

I want to help you get unstuck from this.

After a decade-plus of dealing with vendors, L&D teams and senior leaders (I’m looking at you, CFOs), I’ve picked up a few useful tools to navigate the buying process without losing my sanity.

I hope these serve you well.


📌 Key insights

  • Before buying, clearly understand the problem you’re solving and who it benefits.
  • Use a simple process to evaluate tech options (see framework below).
  • Always ‘try before you buy’ with a pilot if possible.
  • Choose tools that genuinely meet your needs, and resist the temptations!

Hard questions, easy decisions

An easy win, before you even embark on the “let’s buy this new thing” journey, is to answer these 3 questions:

1️⃣ What are you solving?

2️⃣ Who are you serving?

3️⃣ What is the best tool for that job?

While they seem simple, that doesn’t mean they’re easy to answer.

Your response to these should be treated as a guiding light throughout your buying process. So keep going back to these if you get off-track.

On top of this, I’ve previously shared my “Ultimate Guide To Buying New Learning Technology Checklist”. It’s a meaty one which walks through the process I used with a business of 30,000-plus people.

The key takeaways are to figure out if you really need anything new and pick partners, not providers. Read the whole thing to learn from my wins and avoid my failures.

How to find the right tool for you 

The process of acquiring new tech goes through a pre, during and post-cycle.

Everything I shared above covers the ‘pre’ phase. So before you eyeball the below, run through those to set yourself up for success (I still can’t shake that term from my corpo days).

Now, we’ll cover what to do in the assessment stage.

By this point, you’ve spoken to a few providers and seen some demos. Things are getting a little bit more serious, and you need a framework to help you decide who to continue dating or ditch.

My Zero-Cost Assessment Framework

Ok, here’s what I do during the dating game of “Who will be our new learning tool?”.

I’m still waiting for Netflix to get back to me on the show concept, btw.

This framework stops me from making bad decisions. We all make bad dating decisions, so don’t be hard on yourself. You’ll note the framework pulls from what we spoke about in the ‘pre’ phase. That’s why you should cover that before scrolling any further.

To assess our options, I create a spreadsheet.

Yes, you read that correctly. I don’t use AI because I don’t need to. However, you can introduce it into the later stage of this for some ‘devil’s advocate’ perspectives.

Right, our spreadsheet (or table) looks like this:

Now, all you have to do is provide the answers for each supplier for a bit of competitive analysis.

Before we get to that, let’s unpack these 4 stages in a bit more detail:

  1. Research: This throws back to our ‘pre-analysis’. Do the tool and supplier help you solve the problem? Make sure it aligns.

  2. Assess: Does it differ from the tools you have? Here I find it useful to ponder if it’s a product or a feature, meaning are you buying something that one of your existing tools might add on? You want a unique product, not a copy of something you already have with a different look.

  3. Connection: Not enough teams consider interoperability between tech. Will the new tool connect with your existing stack to share data? Does it support SSO, and what APIs are available out of the box? No good having a shiny new toy that won’t play with anyone else. This is where teams get burned most.

  4. Test: When I became a head of L&D, my firm rule with new tech was “Unless we can test it, we don’t buy it”. Looking at staged demos and a few client stories isn’t enough. You need to get hands-on. The best companies will do this for you.

    Anything from 4 weeks to 3 months is perfect.

    You don’t want to be that person who signs off on a multi-year contract on the promise of a product demo. I’ve been there, it sucks and procurement calls for contract breaks are not delightful.

Save yourself, friend.
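
If you’d rather script the table than click around a spreadsheet, here’s a minimal sketch in Python. The suppliers and answers are made up, the columns simply follow the four stages above, and writing it out as a CSV gives you a file you can hand to an LLM in the next step.

```python
import csv

# Hypothetical suppliers and answers – replace with your own notes.
suppliers = [
    {
        "Supplier": "VendorA",
        "Research (solves our problem?)": "Yes - targets onboarding content gaps",
        "Assess (product or feature?)": "Product - no overlap with current stack",
        "Connection (SSO / APIs?)": "SSO via SAML, REST API out of the box",
        "Test (pilot offered?)": "6-week pilot with 50 users",
    },
    {
        "Supplier": "VendorB",
        "Research (solves our problem?)": "Partially - strong on reporting only",
        "Assess (product or feature?)": "Feature - our LXP roadmap covers this",
        "Connection (SSO / APIs?)": "SSO only, no public API",
        "Test (pilot offered?)": "Demo environment only",
    },
]

# Write the comparison out so it can be shared or fed to an LLM later.
with open("supplier_assessment.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(suppliers[0].keys()))
    writer.writeheader()
    writer.writerows(suppliers)

print("Wrote supplier_assessment.csv")
```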

Analyse with AI

When you’ve answered these questions for each supplier, you can use AI to do a competitive analysis.

You don’t have to, but I feel like I’m committing a cardinal sin if I don’t mention it.

Upload your document to your LLM of choice, and ask:

  • “Give me a competitive analysis of the suppliers in this document. Provide a high-level summary of no more than 100 words. I want a clear outcome and your reasons why you chose a particular supplier as the best option”

  • “Let’s play devil’s advocate with this analysis. What could I be missing? What haven’t I asked or considered as part of this process?”

  • “What might be the unexpected and unintended consequences on our current tech stack for users if we introduce this tool? (Note: You will need to provide the context on your tech stack)”

  • “Rank every tool in order of suitability and provide in-depth reasons as to why you ranked in this order based on my requirements”

  • “Create an exec summary of the most suitable tool that I can share with my CPO and CFO.” Power up this prompt by providing an example of what a good summary looks like, and the key points to cover.

I think you get my drift.

AI can be a useful thought partner when you have structured data.
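
If your assessment lives in a file, here’s a rough sketch of how you could run the first of those prompts programmatically, using OpenAI’s Python client as one example of ‘your LLM of choice’. The model and file names are assumptions; uploading the document through the chat interface works just as well.

```python
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in your environment

# Read the assessment table filled in earlier (hypothetical file name).
with open("supplier_assessment.csv") as f:
    assessment = f.read()

prompt = (
    "Give me a competitive analysis of the suppliers in this document. "
    "Provide a high-level summary of no more than 100 words. I want a clear "
    "outcome and your reasons why you chose a particular supplier as the best option.\n\n"
    f"{assessment}"
)

response = client.chat.completions.create(
    model="gpt-4o",  # assumption - use whichever model you have access to
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```

The other prompts work the same way – swap the text and keep the document attached.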

Don’t be seduced by market expectations

It’s easy to be starstruck by technology.

It feels smarter to buy the new thing, instead of fixing the old one. We currently live in an age where AI is often treated like the second coming of the tech gods. I’m pitched at least 3 new tools a day in my DMs.

But let’s not forget that any tool, no matter how advanced, is only as good as the problem it solves.

Read that line again. Let it sink in.

On your journey, you will find lots of tools that want to date you. So, always keep in mind the problem you’re solving.


Final thoughts

Ok, we’ve covered the dating game of L&D tech.

A little time, research and reflection can save you from a dreadful relationship. Try these frameworks out, and let me know how you get on.


Before you go… 👋

If you like my writing and think “Hey, I’d like to hear more of what this guy has to say” then you’re in luck.

You can join me every Tuesday morning for more tools, templates and insights for the modern L&D pro in my weekly newsletter.

Category: Artificial intelligence

The Hidden Impact of AI on Your Skills

The world has tons of reports/research on performance with Gen AI tools.

It has even more industry use cases by the day.

We know that, when used intelligently, these tools can enhance performance (see this, this, and this). Plus, it’s clear that we see the biggest short-term ROI in the boring and basic tasks.

I’ve found in L&D (and, to be fair, many industries) that we get distracted. We focus so often on the ‘shiny thing’ that we continually miss the point.

If AI ‘does it for you’, what happens to your skills?

Although I like the power, potential, and continued promise of AI tools, I’m troubled by the unexpected consequences of the manic pursuit of ‘AI at all costs.’

Especially, AI’s impact on skills.

I sense that we already over-rely on certain tools, and in doing so, we both create illusions of capabilities and fail to invest in moments of intentional learning.

Granted, a lot of this comes down to the intent and ability of human users.

But with all-time high levels of use across millions of Gen AI tools, and all-time low levels of AI literacy, we could be heading for a skills car crash of our own design.

Let’s unpack that.

📌 Key Insights:

  • Smart Gen AI use can expand skill capabilities for a limited time
  • We aren’t improving skills in most cases; mastery requires more than AI alone
  • The majority of people will over-rely on AI tools and become ‘de-skilled’
  • AI tools can help us improve critical thinking processes
  • We must be more intentional in how we approach skill-building in an age of ‘do this task for me’

The Controversial Idea: Skills will be destroyed if we let AI do everything

So many people are scared that AI will take their job.

They think they’ll lose because AI tools can do the tasks better.

But what if you lose, not because AI does your job better, but because you over-relied on the temporary power it grants? You’d be the master of your own demise.

It’s easy to think, “That will never happen to me.”

Maybe it won’t.

But I’d ask you to consider your use of AI tools today. My assumption is that most people use them in a ‘do this thing for me’ approach, rather than a “show me how to do this.”

Here exists a problem we aren’t paying enough attention to.

An AI-first approach will damage the capabilities and potential of skills (if you allow it).

This is somewhat of an observation for now, but it’s a dark reality I’d like to avoid.

My thinking behind this comes both from real-world experience with consulting clients on Gen AI skills programs, and what I’ve seen in more advanced research this year.

An excellent piece of research from Boston Consulting Group has been one of my favourites on this topic. It unpacks, with an experiment involving 480 of their consultants, that Gen AI can increase productivity and expand capabilities.

That’s the headline, of course.

AI’s impact on skills: What we know today

The problem with most research and reports is that most people don’t read beyond the headline.

Hence why we have so many cult-like statements about Gen AI’s endless power. It is powerful, in the right hands. But any power comes at a cost.

For those willing to go deeper, we find both a bundle of exciting opportunities and critical challenges.

Here are some we haven’t discussed ↓

1/ Gen AI grants short-term superpowers

No surprise here, I think.

Gen AI tools grant easy access to skills we don’t possess. They can amplify the level of our current skills too. The BCG team has coined this the ‘exoskeleton’ effect.

Explained in their own words:

“We should consider generative AI as an exoskeleton: a tool that empowers workers to perform better and do more than either the human or Gen AI can on their own.”

Being a nerd, I compare this to something like Iron Man.

For those not familiar with the never-ending Marvel films, Tony Stark is a character who has no superpowers (but is a highly intelligent human). To play in the realm of superheroes, he creates his own suit of armour that gives him access to incredible capabilities he doesn’t have as a human.

The caveat is that he needs the suit to do those things.

Essentially, using an AI tool is like being given a superpower you can use only for 20 minutes. It exponentially increases your abilities, but without it, you go back to your normal state. And everyone has the same access to this power.

BCG found the same in this research.

We could call this a somewhat false confidence dilemma.

This presents a few challenges to navigate:

  • How do we combat the illusion of expertise?
  • What happens when you don’t have access to AI?
  • How do we stop addiction to the ‘easy option’?

Spoiler: I don’t have all the answers.

However, this temporary boost in abilities often leads to another problem – the illusion of expertise.

2/ We have to fight the illusion of expertise

This is a big challenge for us.

Getting people to look beyond AI’s illusion of expertise.

You know what I’m talking about.

Now that everyone has access to creation tools, they all think they’re learning designers who can create their own amazing products. We both know how that’s going to turn out.

As an example from my own work: I can build a decent website with AI, which does the heavy coding for me. But I can’t do it without the AI, not unless I actually learn how to code.

Yes, I built x, but I’m not a software engineer.

There’s a big difference, and sadly, I see people falling into this trap already.

Now, with all this new tech, I don’t need to know the ‘why’ or ‘how’ behind something being built by AI.

But what does this mean for my skills?

A big part of skill acquisition focuses on the ‘why’ and ‘what’, in my opinion. I don’t need to know every little detail, but it helps to have a basic understanding.

I see a few unintended consequences if we don’t clearly define what is a ‘short-term expansion enabled by tech’ and what is ‘true skill acquisition’:

  • We over-rely on AI tools, and this over-reliance erodes critical thinking skills, a key element in real-world problem-solving
  • We lose context, a sense of understanding
  • Our human reasoning skills will erode in the face of “But AI can tell me”

Each of us will fall into different groups based on our motivations.

I’m not saying we’re all collectively going to become skill-less zombies addicted to a digital crack of Gen AI tools, but it will be a reality for some.

An image of a broken phone showing an AI assistant that a human has over-relied on for building skills

What happens to human skills if we over-rely on AI?

This is a real grey area for me.

I’ve seen countless examples where too much AI support leads to less flexing of human skills (most notably common sense), and I’ve seen examples where human skills have improved.

In my own practice, my critical thinking skills have improved with weekly AI use these last two years. It’s what I class as an unexpected but welcome benefit.

This doesn’t happen for all, though.

It depends on the person, of course.

BCG’s findings seem to affirm my thoughts that the default will be to over-rely. I mean, why wouldn’t you? This is why any AI skills program you’re building must focus on behaviours and mindset, not just ‘using a tool.’

You can only make smart decisions if you know when, why, and how to work with AI.

But we can learn with AI too

I (probably like some of you) have beef with traditional testing in educational systems.

It’s a memory game, rather than “Do you know how to think about and break down x problem to find the right answer?” Annoying! We celebrate memory, not thinking (bizarre world).

My beef aside, research shows partnering intelligently with AI could change this.

This article from The Atlantic and Google, which focuses on “How AI is playing a central role in reshaping how we learn through Metacognition”, gives me hope.

The TL;DR (too long; didn’t read) of the article is that using AI tools can enhance metacognition, aka thinking about thinking, at a deeper level.

The idea is, as Ben Kornell, managing partner of the Common Sense Growth Fund, puts it, “In a world where AI can generate content at the push of a button, the real value lies in understanding how to direct that process, how to critically evaluate the output, and how to refine one’s own thinking based on those interactions.”

In other words, AI could shift us to prize ‘thinking’ over ‘building alone.’

And that’s going to be an important thing in a land of ‘do it for me.’

To truly do so, you must know

Google’s experiments included two learning-focused examples.

In the first example, pharmacy students interacted with an AI-powered simulation of a distressed patient demanding answers about their medication.

  • The simulation is designed to help students hone communication skills for challenging patient interactions.
  • The key is not the simulation itself, but the metacognitive reflection that follows.
  • Students are encouraged to analyse their approach: what worked, what could have been done differently, and how their communication style affected the patient’s response.

The second example asks students to create their own chatbot.

Strangely, I used the same exercise in my recent “AI For Business Bootcamp” with 12 students.

Obviously, great minds think alike 😉.

It’s never been easier for the everyday human to create AI-powered tools with no-code platforms.

Yet, you and I both know that easy doesn’t mean simple. I’m sure you’ve seen the mountain of dumb headlines with someone saying we don’t need marketers/sales/learning designers because we can do it all in ‘x’ tool.

Ha ha ha ha is what I say to them.

Clicking a button that says ‘create’ with one sentence doesn’t mean anything.

To demonstrate this to my students, we spent 3 hours in an “AI Assistant Hackathon.” This involved the design, build, and delivery of a working assistant.

What they didn’t know is I wasn’t expecting them to build a product that worked.

Not well, anyway.

I spent the first 20 minutes explaining that creating a ‘good’ assistant has nothing to do with what tool you build it in and everything to do with how you design it.

Social media will try to convince you that all it takes is 10 minutes to build a chatbot.

While that’s true from a tech perspective, the product, and its performance, will suck.

Just because you can, doesn’t mean you will (not without effort!)

When the students completed the hackathon, one thing became clear.

It’s not simple or easy to create a high-quality product, and you’re certainly not going to do it in minutes.

But, like I said, the activity’s goal was not to actually build an assistant, but rather, to understand how to think deeply about ‘what it takes’ to build a meaningful product.

I’m talking about:

  • Understanding the problem you’re solving
  • Why it matters to the user
  • Why the solution needs to be AI-powered
  • How the product will work (this covers the user experience and interface)

Most students didn’t complete the assistant/chatbot build, and that’s perfect.

It’s perfect because they learned, through real practice, that it takes time and a lot of deep thinking to build a meaningful product.

“It’s not about whether AI helped write an essay, but about how students directed the AI, how they explained their thought process, and how they refined their approach based on AI feedback. These metacognitive skills are becoming the new metrics of learning.”

Shantanu Sinha, Vice President and General Manager of Google for Education

AI is only as good as the human using it

The section title says it all.

Perhaps the greatest ‘mistake’ made in all this AI excitement is forgetting the key ingredient for real success.

And that’s you and me, friend.

Like any tool, it only works in the hands of a competent and informed user.

I learned this fairly young when a power drill was thrust into my hands for a DIY mission. Always read the instructions, folks (another story for another time).

Anyway, all my research and real-life experience with building AI skills has shown me one clear lesson.

You need human skills to unlock AI’s capabilities.

You won’t go far without a strong sense (and clarity) of thinking, and the analytical judgment to review outputs.

Going back to the BCG report, a few things to note that support this:

1/ Companies are confusing AI ‘augmenting’ with ‘skill building’

As we touched on earlier, AI gives you temporary superpowers.

Together (you and AI) you can do wonderful things. Divided, not so much (unless you have the prerequisite knowledge to do the task).

We can already see both companies and workers confusing their abilities to (actually) perform a task.

AI gives both a false sense of skills, and terror at the lack of them.

2/ Most people can’t evaluate AI outputs

Again, any of us can code with AI.

But that doesn’t mean we know what’s going on or how to check if it’s correct.

This is the trap anyone can fall into. Knowing how to validate AI outputs is critical. We need to pay more attention to this. You know, thinking about thinking, and all that.

An AI framework on when to use it and when not to.

3/ Without context, you’re doomed

Content without context is worthless.

That’s a general rule. Exceptions apply at times. Nonetheless, you need the context of when and when not to use AI tools to get results.

As we know, it’s not a silver bullet.

The solution to this is getting a better understanding of Gen AI fundamentals.

Another BCG report, in collaboration with Harvard, discovered that success in work tasks with AI came down to knowing when is the right time to call on those superpowers.


How to help humans use AI for REAL learning

Ok, we can see a potential problem if left unchecked.

Here’s a few ideas, tools and actions to do something about it:

1/ Cover AI fundamentals

Too often ignored with people going straight to tools.

Yet, knowing how and why a technology works means you become the chess player, and not a chess piece that’s moved by every new model and tool.

The world has lots of resources to help you with this.

Here’s some from my locker:

2/ Don’t confuse ‘do it for me’ with ‘learning to do’

While AI can enable individuals to complete tasks they wouldn’t be able to do independently, this doesn’t automatically translate to skill acquisition.

Help people recognise the difference.

To truly learn anything, you need a combination of:

  1. Understanding key concepts
  2. Engaging in practice
  3. Committing to improve

3/ Nurture your Human Chain of Thought

I introduced this concept in last week’s edition.

You might have heard me say “AI is only as good as the human using it” like a broken record.

You need human skills to unlock AI’s full potential.

4/ Encourage critical thinking before and after using AI

Infographic illustrating a structured approach to engaging with AI tools, featuring boxes labeled 'Assess', 'Pre-Prompt', 'Output Analysis', 'Prompt', 'Role Reverse', and 'Challenge' with relevant questions and tasks for each.

Despite what social media gurus say, we all very much need to use our brains when working with AI.

If you want to do useful stuff, that is.

I’ve previously shared a system you can use to achieve this in all your AI interactions. You’ll stand out from the digital zombies with this.

5/ Prompt an Engineer’s Mindset

BCG refers to this as the ‘engineer’s mindset’, as it originates mostly from engineering roles (both physical and digital).

I call it the ‘Builder’s mindset’, and I think this is a cheat code for life.

I would say I’m only as successful as I have been because of it. I learned it during my teenage years of coding in SQL and Java. It’s built around the principles of understanding what, why, and how of building anything.

Back in the day, I used it to build SQL-based reporting applications.

I didn’t even think about building the app before I knew more about the consumer.

Simple things like:

  • Who are they?
  • What problems are they having?
  • Why are those problems happening?
  • What would this look like if it were easier for them?

Over the years, I’ve adapted this into all my work, especially writing.

As of today, before I begin any work, I ask:

  1. Why am I building this?
  2. What problem is it solving?
  3. Does it pass the ‘So what?’ test?
  4. How will I build it?

I can only solve a problem or create a meaningful post/product/newsletter/video if I know the above.

Like a builder, you piece together an end goal.

When you reveal this, the next part is easy → Reverse engineer this process.

As this is such an important point, I need more than the written word to explain this.

So, here’s a short video where I explain how to use this framework:

Modern ways to reshape skill-building with AI

I’ve spoken a lot about AI coaches.

We can throw AI tutors into that mix, too.

Here’s how I see the difference btw:

  • AI Tutor = Breaks down concepts and works in more of a professor style
  • AI Coach = Works with you in a live environment to solve challenges together. Basically, the new “Learning in the flow” but with AI.

Of course, these terms are interchangeable, and the capabilities can be merged.

FYI, today’s NL partner, Sana, is doing a great job in this department with their soon-to-be-released AI tutor. You should check that out.

Often, I find it’s easier to show you what I’m talking about with AI than to try to describe it, so here are examples of both:

Using AI as a Tutor with Google AI Studio

Using AI as a Coach with Google AI Studio

In case you’re wondering, I use Google AI Studio to show these features because it’s easy to access for most people.

It’s a sandbox where you can experiment.

But treat it as just that: a place to experiment, not something to rely on for work. For Tutor and Coach tools in the workplace, more options are entering the market.
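
If you’d rather experiment in code than in the AI Studio interface, here’s a rough sketch using the google-generativeai Python SDK, which runs on the same models and API key as AI Studio. The model name and the wording of the two system instructions are my own assumptions about how you might frame a ‘tutor’ versus a ‘coach’.

```python
import google.generativeai as genai

genai.configure(api_key="YOUR_AI_STUDIO_KEY")  # key from Google AI Studio

# Two framings of the same model - the wording is illustrative, not canonical.
tutor = genai.GenerativeModel(
    "gemini-1.5-flash",  # assumption - pick any available model
    system_instruction=(
        "You are an AI tutor. Break concepts down step by step, check my "
        "understanding with questions, and never just hand me the answer."
    ),
)

coach = genai.GenerativeModel(
    "gemini-1.5-flash",
    system_instruction=(
        "You are an AI coach. Work with me on my live challenge, ask what I've "
        "tried, and help me reflect on my own reasoning before suggesting options."
    ),
)

print(tutor.generate_content("Explain what an API is, for an L&D audience.").text)
print(coach.generate_content("I'm stuck choosing metrics for a new course launch.").text)
```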

Final thoughts

So, will AI destroy or amplify your skills?

Only if you let it.

This is by no means a closed book. No doubt, I’ll cover more on this as time goes on.

For now, be smart:

  1. Craft your builder’s mindset
  2. Borrow superpowers but build real ones through practice.
  3. AI is powerful and has great potential, but don’t forget the unique human and technical skills you need to be ‘fit for life.’

Before you go… 👋

If you like my writing and think “Hey, I’d like to hear more of what this guy has to say” then you’re in luck.

You can join me every Tuesday morning for more tools, templates and insights for the modern L&D pro in my weekly newsletter.

Category: Learning Technology

The YouTube Growth Playbook: 4 Proven Strategies For Creators

I run lots of experiments in a “How does x thing work” style.

Sometimes I share these, and often I don’t. Mostly because I’m not sure people would care if it’s a non-L&D topic.

Anyway, I’ve inadvertently found myself growing my YouTube channel by 60% in subscribers and 240% in views over the last 8 months, with 2-3 hours of work a month.

In this, I’ve learnt a lot about how the platform works. So, I’ve crafted a sort of YouTube Growth Playbook. I have no plans to be a YouTuber but I’m fascinated by how it works.

After many conversations and lots of reflection, I think there’s a lot to learn from the YouTube design process for L&D design.

From choosing the right idea to designing the user experience to packaging it for a viewer, I see so many similar traits with L&D.

That seems like something worthwhile for us to unpack.

The Experiment: Growing a YouTube Channel in 2-3 hours a month

As mentioned, I have no plans to be a YouTuber.

I just need somewhere to store the videos I share on LinkedIn and this newsletter. My solution is YouTube. I use it more like a storage locker rather than trying to attract people to it.

At the start of the year, I became curious to learn ‘what makes people click’ on a video.

This inevitably sent me down a black hole of learning. I like to think we can agree that is a good type of black hole to fall into. I discovered that crafting high-quality YouTube videos is pretty similar to good end-to-end learning design.

You can transfer a lot of frameworks and skills from writing and marketing into this space.

What happened?

I’ve been running a growth experiment with YouTube for 8 months.

→ It’s increased subscribers by 60%.

→ It’s increased watch hours by 240%.

I’m not out to become a YouTuber, but I do love to learn how things work.

Since our world works on a lot of digital platforms, and we each use them to promote work and business, I like to know how this game is played.

Here are my 4 biggest lessons from taking my channel from 300 subs to over 1,000 (as of this post) in 8 months, with 2-3 hours of input a month:

  • Writing is more important than you think: For scripts, descriptions with SEO, titles and hooks.
  • Good design matters: From your thumbnails to how the video looks and flows.
  • Experiment often: This is especially true with thumbnails, descriptions and titles. Some of the biggest players on this platform do this, and it’s something which has worked for me.
  • Build evergreen content, not just for today: Play the long game. I have a set of 4 to 5 high-performing videos that bring in 80% of subs and views. Why? Because they’re evergreen content that transcends the years with relevance.

Note: I could have achieved way more if I was dedicating myself to this, but that’s not my plan.

4 proven YouTube growth strategies for creators, designers and educators

After 8 months of experimentation, I landed on 4 principles that enhance both growth on YouTube and serve as good learning design.

Let’s unpack these in more detail.

Why writing is an essential skill for YouTube growth

Writing is more important than you think

You’re forgiven for thinking that a successful video is all about, well…good video.

It actually starts with good writing.

Writing is one of the most important skills you need. Without good writing (and editing) you cannot:

  • Get buy-in for your work
  • Build a compelling message
  • Explain complex problems
  • Design quality user experiences

And much more.

Specifically for YouTube, writing comes in:

  • Generating video ideas
  • Scriptwriting
  • Video titles
  • SEO-rich video descriptions
  • Transcripts and subtitles

We use almost all of this for good learning design.

Until you can clearly write down what you’re solving and how you’re solving it for the user, your product doesn’t have much hope.

One thing I like about the way YouTube Strategists (yes, they’re a real thing) think about design is ‘packaging’. This term is used to describe how you put all the moving pieces together to craft an experience people will engage with.

What I can tell you is that this doesn’t stop when the video is filmed.

It’s about how you script your intro, the video title you choose, your video description and how you will distribute this to your audience. I’ve found the best people obsess over these things to a militaristic level, and that makes sense.

You cannot do well long-term without this approach.

This is no different in L&D. The work is not over when you finish building an experience (no matter if physical or digital). Think about how you will ‘package’ your product to drive value for users.

Good design matters

This transcends industries and products.

I’d go as far as to say it’s a ‘fact of life’.

In YouTube land, thumbnails, video titles and descriptions are gold. They’re what bring in the viewers, but without a well-designed video behind them, they’ll flop.

This is something I learned early on.

The dopamine video hit

I wrote a devilishly enticing title and designed a clickbaity thumbnail for a bare-bones video.

The goal was to see how much of an effect these variables have on a sub-par product. Long story short, the video did very well in the first 12-24hrs. It racked up a few thousand views, but the people spoke with downvotes.

Despite the high engagement in the first 24 hours, the video died.

Why? It was the most downvoted video I released, and YouTube takes note of that. You can have thousands of views and all the dopamine-spiking materials you want, but if many people start downvoting, YouTube stops recommending.

Lesson: You can’t hide rubbish with rainbow paint.

The big hits

I don’t have and don’t want any viral videos.

I detest the word ‘viral’ in this digital age. Instead, I focus on what I do best: ‘how-to’ educational videos. Probably most of the ones you’ve come to know me for.

The videos that have performed well for me checked these boxes:

  • They were well-planned with a clear structure for the viewer’s experience
  • Each had a short intro which explained ‘what you’ll get’ from this
  • They solved a problem that people in my niche face daily
  • They provided a painkiller for the pain the viewer was experiencing
  • Each had a clear and compelling title and description

Examples of two of my best-performing videos showing these strategies are below:

As you can see, nothing special in particular.

The main thing they do is solve a problem. Just like in L&D. So, solve the problem and you’ll do well.

Research-backed YouTube growth strategies

Instead of re-sharing everything I’ve ever learnt about this topic, here are a few resources from the people I’ve learnt from, and their best advice:

Paddy Galloway: The (Original) YouTube Strategist

Until I came across Paddy, I’d never heard of the phrase ‘YouTube Strategist’.

That was 5 years ago.

Today, it’s pretty common for strategy teams to exist. What I enjoy about Paddy is his critical design mind. You can take how he thinks about YouTube and apply it to many industries. He’s one of those people who can see how it all works together.

The best video I’ve seen where he shares some of his work is below.

Here’s 2 of my favourite strategies he shares:

  1. The CCN (Core, Casual, New) Framework: It’s hard to capture everyone, and you probably shouldn’t try to. I love the old saying “If you design for everyone, you design for no-one”.

    What you can work on is how your target audience will experience content. Core people will watch anything, casuals drop in from time to time, and new viewers are those discovering the experience for the first time.

    If we’re being honest, L&D for most companies is a numbers game, and this is something to keep in mind. Listen to Paddy’s explanation.


  2. What makes good video ‘packaging’: In the YT world, this refers to the thumbnail, description and title of the video, hence the word ‘package’.

    This is about discoverability.

    I wonder how many L&D teams think about this when their content or workshop sign-up is plastered on an LMS and company intranet.

    Will you risk your product being buried behind poor packaging? More on this from Paddy.

What makes good packaging w/ Aprilynne Alter

I stumbled on Aprilynne’s videos while researching how to create worthy thumbnails and titles that support user discoverability.

I didn’t want my hard video work to go unnoticed because the packaging sucks. Lucky for me, Aprilynne had done a ton of research on what works in this space.

As I watched the video, I couldn’t help but think about the similarities when it comes to filtering content on an LMS or LXP.

With so much on these systems, you need to help your best (and most important) content stand out.

The same design principles for this transcend YouTube.

There are two videos I recommend you check out. The first focuses on what makes a good YouTube thumbnail (which you can apply to your content system of choice), and the second unpacks the thought process behind this with the creator.

These strategies have worked well in my experiments so far.

What L&D audiences experience

Take a look at the below.

Looks familiar, right? You know what it is in 5 seconds – a typical course catalog.

This is what your audience experiences when you push them to a content system. There is nothing inherently bad about this image. It’s just not very clear, compelling or ultimately helpful for them.

An image of a learning management system course catalog and how it could be improved by applying techniques used for YouTube content

I’m sure these could be all worthwhile courses for your audience, but they kinda all blend into the background.

As an example, we can see three science-related options for AI. Each contains a similar title and quite generic imagery. There’s not much to differentiate on that front.

However, the one clear difference are the higher education brand names, which will probably sway a prospective viewer.

Out of the 3, I’d expect 90% will choose Harvard because it’s well-known and is positioned as being prestigious.

That doesn’t mean the others are worse, and the viewer may never know. Perhaps they’re better, but they haven’t been able to communicate that specificity on the screen, where it matters.

This is where ‘packaging’ is key.

The importance of good packaging

The one tile that stands out to me is the one at the bottom left.

Yes, the Google one, but why?

  • The background image is minimal but a contrast to the same stock images used by the rest
  • It has a very clean visual look so I’m not struggling to make sense of it
  • The title is the most specific one on the screen. It tells me everything I need to make an informed decision in a few seconds. The others are a bit vague and require further investigation. This turns into Russian roulette because I won’t spend time on them all. This is where brand familiarity helps.

A good test is to put this image into PowerPoint and create a second slide that’s blank (complete white space).

Swipe between the two slides, one with the image and the other with white space. Now note where your eyes are naturally drawn on the image.

This is the course you’re most likely to click.

Use this to reverse engineer the reasons behind the decision. What attracted you and why?

Assume you’re not Harvard

With many off-the-shelf content libraries, you’ll be competing against more well-known brands.

Your job is to stand out in that sea to ensure company content is front and centre. So you need to package things differently. Build clear, compelling and specific imagery, titles and descriptions.

In sum: The little things matter more than you think.

Experiment

Probably the thing we should do more.

Not just in design, but with ongoing packaging of content and courses.

One thing I noticed about bigger YouTube Creators is how they tweak the elements of their packaging if performance is not as expected.

Sometimes the thumbnail doesn’t work, or the title doesn’t land; maybe the description isn’t specific enough for the engine to rank it higher in searches.

So, they change it.

Perhaps days later, sometimes hours. They look at audience data to uncover what’s not clicking. You can do the same with courses and content.

If your work is not hitting the mark after a week, try:

  • New visual design – mix up your imagery
  • Change the title – we all know most of us just read headlines, so make yours compelling
  • Description – most learning platforms’ search engines work much like YouTube’s. They look for relevant keywords in titles and descriptions, matched against what a user has searched for. Make sure you have those words in yours.

Often, it’s not about re-creating the product, it’s a matter of re-packaging it.

In sum: Don’t kill the product if it doesn’t work first time. Improve the packaging and distribution.
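
A quick, low-tech way to act on the description point above: a small Python sketch that flags whether the search terms your audience actually uses appear in a course’s title and description. The course and keywords here are made up.

```python
def keyword_coverage(title: str, description: str, keywords: list[str]) -> dict[str, bool]:
    """Return which target search keywords appear in the title or description."""
    text = f"{title} {description}".lower()
    return {kw: kw.lower() in text for kw in keywords}

# Hypothetical course packaging and the terms learners actually search for.
title = "Working Smarter: An Introduction"
description = "A short course on improving how you plan your week."
search_terms = ["time management", "productivity", "prioritisation", "planning"]

for term, present in keyword_coverage(title, description, search_terms).items():
    print(f"{term:15s} {'found' if present else 'MISSING - consider adding it'}")
```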

Create evergreen content

This phrase might be new to you, so allow me to explain.

Evergreen means something that is continually fresh and relevant. In a world littered with short-form content and experiences driven by instant gratification, you want to play the slow growth game.

Simply put, this means building products (I use this word here to describe content and courses) that last for the long term.

You do this by updating them for relevance.

Sadly, I don’t see this happen often in many L&D teams. The default is to create another new thing rather than update what you have. I tried not to mention AI today, yet I find a good use case for it here in helping you to update old content.

Try sharing your current content with the AI tool of choice and ask it to:

  • Check for relevance
  • Search for new research on the topic
  • Suggest improvements in style and structure
  • Pose thoughtful questions for the audience

I’ve found this helpful with my website.

You’re not asking the tool to ‘create this for me’; you’re working with it as a design assistant.

In sum: Before you create something new, consider how you can refresh something old.

The 3 key elements of a YouTube growth playbook

📝 Final thoughts

OK, friend, we got through a lot today.

  • Don’t let good products get lost with poor packaging
  • It’s the little things that matter: titles, descriptions and imagery
  • Experiment with distribution and packaging: Don’t kill the product, improve the packaging
  • Before building something new, refresh what’s old – this is (probably) the best option
  • Your work needs good design, packaging and distribution to succeed

Bonus: 🤖 Tool for you

To help you apply the topics we’ve covered today, I’ve built a custom AI-assistant with all the videos, tips and knowledge sources in this edition.

It’s not going to make you a superstar YouTuber earning mega bucks, but it will help you put today’s thoughts into practice.

Try out the YouTube Strategist.


Before you go… 👋

If you like my writing and think “Hey, I’d like to hear more of what this guy has to say” then you’re in luck.

You can join me every Tuesday morning for more tools, templates and insights for the modern L&D pro in my weekly newsletter.

Category: Artificial intelligence

How To Turn Your Ideas Into Working Prototypes With AI

As generative AI advances show no sign of slowing down, neither do the growing uses we’re finding for it as experience designers.

One that I’m growing fond of is the ability to build working prototypes for L&D ideas, apps and other creations.

No doubt, as I have, you spend a lot of time trying to help stakeholders visualise what ‘x’ thing could look like.

This normally involves scrappy drawings in a notebook or standing in front of people to do your best Bob Ross impression on a whiteboard.

People nod, yet it’s pretty hard for them to see how your idea would work and how users would engage with it. However, what if I told you a tool exists that can turn your ideas and scribbles into a semi-working prototype in 15 minutes? One you can share with users to get instant feedback.

Sounds like a scam, I know.

Thankfully, it’s quite real. I want to show you a handy feature from Claude (another LLM like ChatGPT) to achieve this.

Here’s how it works.

Getting started with Claude

Claude is another conversational AI tool like ChatGPT.

It works in much the same way, apart from a few features. Much of its team is made up of ex-OpenAI employees (OpenAI being ChatGPT’s creator), and the company is backed by a huge investment from Amazon.

Before we unleash you with building apps, you need to do the following:

  1. Visit Claude’s website and sign-up for a free account
  2. Access your profile on the left-hand side of the screen at the bottom with your email/username
  3. Click ‘settings’ to access your ‘profile’ screen. Here, scroll to the bottom of this page and select ‘enable artifacts’

Artifacts is what we’re going to use to build interactive prototypes of your ideas. This allows Claude to generate code snippets, text documents, and loads of different designs.

The advantage is that it enables you to view semi-working apps in real time and tweak these.

How to transform your ideas into working prototypes

I’m not going to write out the step-by-step playbook here.

Instead, you can follow along to get everything you need with me in this video:

What can you use Claude Artifacts for?

A bunch of stuff.

Remember those whiteboard sessions and scribbles in your notebook from earlier? You can turn those into something a stakeholder can actually play with.

Instead of saying, “Here’s how it can work in my notebook scribbles”, you can say, “I’ve built a simple prototype to give you a feel for how it could work – try it here”.

A few ideas you could build:

  • A content library
  • Low-level LMS or LXP interfaces – handy if you wish to see how people behave with a new experience
  • Landing pages
  • Course pages
  • Simple apps
  • Websites

There’s much more you can do.
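
And if you ever want to generate a throwaway prototype from a script rather than the chat window, here’s a rough sketch using Anthropic’s Python SDK. The model name and prompt are assumptions; in the web UI you’d simply paste a similar prompt and let Artifacts render the result.

```python
import anthropic

client = anthropic.Anthropic()  # expects ANTHROPIC_API_KEY in your environment

prompt = (
    "Build a simple prototype of an internal content library landing page: "
    "a search bar, three featured course cards and a category filter. "
    "Return a single self-contained HTML file with inline CSS, no explanations."
)

message = client.messages.create(
    model="claude-3-5-sonnet-20240620",  # assumption - use a current Claude model
    max_tokens=4000,
    messages=[{"role": "user", "content": prompt}],
)

# Save the generated HTML so you can open it in a browser and share it.
with open("prototype.html", "w") as f:
    f.write(message.content[0].text)

print("Saved prototype.html - open it in your browser.")
```

Open the saved file in a browser and you’ve got something a stakeholder can actually click around.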

Everyone can build, but not everyone is a builder

As always, the power of new tools is in the hands of a skilled practitioner.

You will be able to build something of more value than the everyday employee can, because you know the context of your field. Look at this type of tool as another design assistant in how you build L&D tech solutions.

The barrier for entry to build stuff is getting lower every week.

That doesn’t mean everyone is a builder though (more on that in the future).


Before you go… 👋

If you like my writing and think “Hey, I’d like to hear more of what this guy has to say” then you’re in luck.

You can join me every Tuesday morning for more tools, templates and insights for the modern L&D pro in my weekly newsletter.