
Generation Ready: Building the Foundations for AI-Proficient Education in England’s Schools


Paper | 1st September 2025

Contributors: Kayla Crowley-Carbery, Robert Johnson


Foreword

Education systems globally stand at a pivotal juncture. Artificial intelligence is not just a technological advancement that disrupts the world of work; it fundamentally changes the way we learn. The process of change in learning began three decades ago, when the spread of the internet and personal computers provided instant access to quantities of information without precedent in human history. The current change, accelerated by AI, further drives the demand for deeper thinking, analysis and creative skills.

In an AI-driven world, success will be defined not by what one knows, but by the depth of one’s thought, creativity and mastery of collaboration. Learning is becoming more dynamic and personalised, with AI tools serving as adaptive tutors that cater to each student’s unique needs and pace. However, we must remind ourselves that AI is a tool, not a teacher. Learning with AI will only reach deep learning levels if it is facilitated and guided by a professional in pedagogy – a teacher who empowers students to navigate a world of unprecedented knowledge and complexity.

In Estonia, we are taking a leap into the AI era in education by bringing artificial-intelligence tools into our classrooms this autumn. We are driven by the recognition of the risk of cognitive offloading, where students use AI to bypass learning rather than deepen their understanding. This requires a shift in pedagogical approaches towards emphasising problem-solving, critical thinking, collaborative work and ethical reasoning. We believe that successful innovation in education rests on two foundations. First, it must be supported by teachers, and therefore a teacher-training and empowerment programme for the use of AI is a fundamental element of educational innovation. Second, to implement a national programme, we have established a national governance structure for innovation that combines scientific guidance with implementation in partnership with tech companies.

This report by the Tony Blair Institute for Global Change (TBI) is to be commended for its timely and insightful analysis of how England’s education system must adapt to the age of artificial intelligence. The report provides a crucial examination of the challenges and opportunities facing England’s education system in this rapidly evolving landscape, and its findings demand our immediate attention. As policymakers, we must recognise that AI is no longer a distant prospect; it is already shaping the way our students learn. The question, therefore, is not whether AI will impact education, but how effectively we will harness its potential to empower learners and drive change.

The report presents a clear vision for England. By taking action across four key pillars – reforming the curriculum, building teacher confidence, equipping families and upgrading digital infrastructure – England can not only prepare its students for the AI era but also set a new global standard for how education systems adapt to technological change. The “AI-literacy gap”, where many teachers lack the confidence and training to effectively integrate AI into their classrooms, is a significant obstacle to innovation. The task for policymakers is to ensure that educators have access to high-quality professional development and ongoing support to build their expertise.

Equally important is the need to address the growing skills shortage in AI-related fields. By reforming the curriculum, promoting computer-science education and creating clear pathways for students to pursue technical careers, countries can strengthen their talent pipeline. Moreover, we must ensure that all students, regardless of their background, have access to AI education and the necessary digital infrastructure to participate fully in this transformation. This requires targeted interventions to address inequalities in access to technology, teacher training and resources.

As policymakers in England consider the insights presented here, it is crucial to recognise that the integration of AI into education is rapidly becoming a global race. Nations that proactively equip their teachers and students with AI literacy and skills will secure a decisive advantage in the future economy. We in Estonia will be watching with great interest.

Kristina Kallas

Minister of Education and Research of Estonia


Executive Summary

England’s education system is not preparing pupils for a future shaped by artificial intelligence. Without urgent reform, young people will leave school unready for an AI-driven labour market, undermining national competitiveness and deepening inequality.

AI proficiency is fast becoming essential for modern life, yet the national conversation has not kept pace. In response to AI’s growing presence in schools, some would rather pull the handbrake than take the wheel – banning its use, clinging to traditional methods or turning away out of fear that it will make young people less willing or able to think for themselves.

These reactions misread the moment. AI is no longer on the horizon; it is here, already embedded in pupils’ lives and reshaping the skills needed to thrive. Education should not be shielded from disruption – it must be used to shape it.

The stakes are high: economic resilience, national competitiveness and social mobility all hinge on a coherent response.

Generative AI is transforming both the types of jobs available and the skills required to do them – even in fields once considered untouchable, such as law, medicine, software engineering and journalism. Up to 3 million existing jobs could be displaced, while job postings requiring AI skills have grown 3.6 times faster than all UK jobs in the past decade.[_],[_]

AI readiness is also becoming a global race. Countries such as South Korea, Estonia, China, Singapore and Germany are embedding AI literacy and digital infrastructure into their education systems as national priorities, preparing an AI-ready workforce. The UK risks being outpaced.

Another clear risk is the emergence of a new disadvantage gap, where pupils who master AI-era skills surge ahead, leaving others further behind.

Yet despite these high stakes, far too little is being done to adapt. Curriculum design, teaching practice, parental engagement and digital infrastructure remain out of step with the demands of the AI era.

Preparing pupils to thrive requires much more than marginal tweaks; it demands a fundamental change in both what and how they learn. Foundational subject knowledge remains vital, as it is the basis for discernment – the ability to ask good questions, separate fact from misinformation and make informed decisions – but must be complemented by three broad sets of skills:

1. AI literacy – learning about AI and how to use it (all pupils)

2. Human-centred capabilities – such as critical thinking, problem-solving, communication and ethical reasoning (all pupils)

3. AI technical skills – designing and deploying AI systems (technical career paths)

Of course, these skills alone are not enough – AI tools must also be safe and effective, a point the TBI will return to in future work – but they remain a vital prerequisite for the modern era. They are not only workplace essentials but also fundamental to thriving in an AI-shaped world, influencing how we manage our finances, build relationships, learn, make decisions, safeguard our wellbeing and engage with society.

The current system is veering off course. There are significant gaps in AI literacy. TBI-commissioned analysis finds that only one in five teachers says that anyone at their school teaches pupils how AI works and what it is, and just one in ten reports AI being integrated into subject teaching.[_]

This leads to pupils using AI ineffectively, often to bypass rather than deepen learning – already, almost one in five secondary pupils uses AI to do all of their schoolwork.[_] It also reflects a skills gap among teachers, who generally lack adequate support: 43 per cent rate their AI confidence at just three out of ten, and 91 per cent of those using AI are self-taught, according to TBI-commissioned insights.[_],[_]

Parents are also under-equipped: almost one in four adults struggles with routine digital tasks and most know little about AI’s potential for learning.[_] Poor digital infrastructure across many schools and homes critically undermines the prospect of building widespread AI proficiency.

Shortfalls in human-centred capabilities persist too. Employers prize skills such as communication and critical thinking, not least in higher-skilled roles, but shortages are widespread. And despite rising demand for AI technical skills, many employers cannot fill vacant roles, while the education system is not adapting: a fifth of non-selective state schools do not offer GCSE computer science at all, while just 13 per cent of the Key Stage 4 cohort take GCSE computer science.[_],[_]

Equally alarming is the opening of a new digital divide, driven by uneven access to AI, guidance and skills development. According to TBI-commissioned analysis, schools with larger disadvantaged cohorts are less likely than others to teach pupils how AI works or use it in class, and are less likely to offer GCSE computer science.[_],[_] Independent schools are three times more likely than state schools to have a school-wide AI strategy.[_] A third of pupils lack continuous access to a device at home on which they can do online schoolwork.[_]

This is a progressive opportunity to recast the purpose of education in England. AI proficiency must be a universal entitlement, not a niche privilege. With bold reform, England can set a global benchmark for how education systems adapt to technological disruption – ensuring every pupil, regardless of background, is ready to thrive in the age of AI.

Policy Foundations to Build AI Proficiency in Schools

To meet this challenge, the government must act across four pillars:

Pillar 1 – Pupils: Make AI Proficiency a Core Outcome of Schooling in England

  • Embed foundational AI literacy in the primary curriculum from Key Stage 2, using the SEAME framework (Social & Ethical, Application, Model, Engine).

  • Require all Year 7 pupils to take a standalone, age-appropriate AI-literacy module, introducing key concepts, applications and ethical considerations.

  • Revise the statutory computing curriculum (secondary) to include AI data use and applied-AI competencies such as digital-media use, basic cyber-security and data literacy.

  • Require all pupils to complete a Certificate in Applied Computing before leaving school, combining modular, self-paced learning with hands-on projects. This would cover core programming, computational thinking and real-world digital skills such as prompting and critiquing AI, using data, basic coding and responsible tool use.

  • Reform GCSE computer science to emphasise practical application, incorporating machine learning, ethics and project-based assessments such as building chatbots or analysing data sets.

  • Review and revise the English Baccalaureate (EBacc) and Progress 8 frameworks to better incentivise the uptake of computer science.

  • Treat AI as a cross-curricular competency. Integrate AI across all subjects to enhance learning and higher-order skills, with updated subject specifications and assessable outcomes.

  • Overhaul the curriculum review by creating a statutorily independent body to lead curriculum and assessment reform, with government retaining the final say. Introduce staggered subject-review cycles and build a data platform, supported by expert input and AI-augmented consultations, to integrate diverse evidence and reflect real-world needs.

Pillar 2 – Teachers and Leaders: Build Workforce Confidence and Capacity to Teach With and About AI

  • To localise training and support, create a national network of AI hubs offering peer-led professional development and teacher accreditation. Introduce government-funded fast-track pathways for AI lead teachers in every school, tied to career-boosting progression and leadership routes.

  • Address the low uptake of GCSE computer science among girls by building on targeted initiatives, with AI hubs and school AI leads incubating new approaches and mainstreaming proven ones.

  • Launch an AI-mastery pedagogical framework to guide how AI is embedded in teaching and learning, evolving over time as technology advances and best practice emerges.

  • Update the Teachers’ Standards and Early Career Framework to include AI competence.

  • Introduce a new National Professional Qualification (NPQ) in Leading AI and Digital Innovation to support school leadership.

Pillar 3 – Families: Equip Parents and Carers to Support AI Readiness at Home

  • Require every school to create a parental-engagement plan on AI, driven by the school’s new AI lead. This should explain AI use, safeguards and how parents can support learning. Schools would choose their own engagement methods, using resources curated by the Department for Education if desired.

  • Establish a national network of parent ambassadors, trained to provide peer-to-peer support, run tech demonstrations, host online groups and advise on home-based AI learning. Fund outreach with micro-grants, prioritising disadvantaged communities.

Pillar 4 – Infrastructure: Build the Digital Foundations for AI Readiness

  • Dramatically scale up investment in digital infrastructure, ensuring universal high-speed broadband and resilient WiFi in all schools, and equip schools with the technical capacity required to manage devices and connectivity.

  • Adopt a hybrid device-access model, with Bring Your Own Device as the default and government-funded loaner devices for pupils without access. Ensure every teacher and secondary pupil has a device, and that there is at least one device for every five pupils in primary schools. Monitor and adapt the programme to ensure it is effective and that device ratios remain suitable.

  • Explore partnerships, both with industry and within its own estate, to reclaim, refurbish and supply devices slated for disposal – thereby cutting costs, expanding access and promoting circular, sustainable use of technology.


Chapter 1

The Progressive Case for AI Proficiency in Education

AI is already transforming the labour market – and the pace and breadth of this transformation are accelerating, fundamentally changing the skills people need to thrive. This has profound implications for education policy. To remain competitive and inclusive, the UK must equip its workforce with the skills to adapt to technological disruption and harness new economic opportunities.

Most economists agree that AI will replace tasks, augment them or add new ones within roles.[_],[_] There are also some human skills that will be insulated from automation, due to a mix of social norms and current AI capabilities.

Whereas in previous technological transitions, routine manual and clerical jobs were most vulnerable to automation, today cognitive tasks performed by highly educated professionals are increasingly exposed to AI. Recent advancements with generative AI have expanded automation’s reach into domains once considered immune, such as law, design and software engineering.[_]

Not only is AI reshaping work by augmenting and automating roles; an increasing proportion of roles in the UK also require AI skills. According to a report by PwC, the share of roles requiring AI skills has doubled in eight years (between 2016 and 2024), with growth across most sectors.[_]

Looking ahead, the frequency of job disruption within and between occupations and industries is likely to increase even further as AI capabilities improve and accelerate over time. What we think should be done by AI is also likely to change. Most notably, the potential development of agentic AI – and, beyond it, artificial general intelligence capable of outperforming humans across most tasks – could further revolutionise how we live and work.

Considering these changes, preparing for the AI-era economy will need to become a core objective of England’s education system – not just by introducing coding classes but by embedding AI-era skills throughout formal education.

This does not mean rehashing the old debates that unhelpfully pit knowledge against skills. Foundational knowledge will still be critical, and the curriculum will need to reflect and protect that. Emerging evidence shows that AI tools – usually narrow AI tools – can even support the development of foundational knowledge. Pupils must acquire a secure understanding of core concepts, and this knowledge is the basis for discernment (being able to ask good questions, separate fact from misinformation and make informed decisions). This mirrors what happens in work – for instance, radiologists need deep anatomy and pathology knowledge to check, interpret and communicate findings from scans, and the demand for radiologists has increased, despite the introduction of AI.[_],[_]

But in parallel, pupils will need to become AI-proficient. What is meant by this? At minimum, every pupil in England will need to build core competence in AI literacy (how to use and critique AI) and human-centred capabilities (higher-order skills that complement AI, such as critical thinking and communication). The former can be embedded within subjects across the curriculum, while AI can be woven into subject teaching to support the latter (for instance, by asking pupils to interrogate texts, data and the connections between them, generate creative work, and hone communication and reasoning).

While all pupils will need to develop AI literacy and human-centred capabilities, only those pursuing a career in, say, building or training AI would need to develop deep technical skills (how to design, train and deploy AI). Contrary to what some critics have argued, this will be the case even though AI can increasingly undertake many technical tasks. Take coding, for instance, which AI can already undertake but which is still the most efficient way to develop computational thinking skills, understand programming and build the skills required to interact with intelligent machines.[_] Technical skills will also need to be taught earlier and more equitably – this means they cannot just be developed at university.

Figure 1

TBI’s framework to develop an AI-proficient workforce

AI literacy

  • Purpose: Understand and use AI

  • Examples of core knowledge: key AI concepts (models, data life cycle, limitations); tool awareness and terminology; AI capabilities and limitations; the impact of AI on reshaping work and human roles; societal and ethical principles

  • Applied skills: critically choose/prompt AI tools; interpret AI outputs and biases; communicate AI’s implications to others

Human-centred skills

  • Purpose: Provide human skills to flourish in an AI-augmented world

  • Examples of core knowledge: logic, reasoning, identifying bias and fallacies; vocabulary, sentence structure, arithmetic; strategies for planning and reflection

  • Applied skills: communication, creativity, critical thinking and collaboration; literacy and numeracy, science and digital skills; leadership, resilience, social influence; meta-learning/cognition

AI technical skills

  • Purpose: Design and build AI

  • Examples of core knowledge: probability and statistics; algorithms and data structures; machine-learning architectures and evaluation; human-computer interaction basics

  • Applied skills: collect and clean data; train and fine-tune models; prototype system architectures; integrate robotics/edge devices; computational thinking

Source: TBI analysis[_]

Together, these three broad sets of skills will give pupils the ability to take full advantage of an economy being reshaped by AI. But the pace of change in AI also means the education system needs to be agile enough to respond to changes in future skills demands – for instance, as more tasks become automatable and new tasks emerge requiring different skill sets. And steps will need to be taken to chart those changes reliably and use them to inform changes in the curriculum.

The knowledge and skills framework is designed not just to prepare pupils for the world of work, but to equip them for life in a society reshaped by AI. It goes beyond teaching pupils to use tools, instead emphasising the development of knowledge and human-centred skills that are becoming more important because of AI – such as the ability to think critically, communicate effectively and navigate AI systems that will increasingly shape most aspects of daily life, including news, health, finances and civic participation.

The framework is tool-agnostic, applying to both generative and narrow AI, and supports – rather than replaces – domain knowledge by helping pupils access, apply and interrogate information more effectively.

Crucially, preparing pupils is only part of the solution; AI tools themselves must be safe, ethical and used responsibly, and must be built and evaluated in ways that clearly demonstrate measurable improvements in teaching and learning – TBI will return to these questions in future work.

What Is at Stake?

The government faces a stark choice: get caught in the wake of disruption or capitalise on a generational opportunity to boost prosperity. The ability to work with and build AI systems will be a cornerstone of future employment and economic competitiveness. For the UK economy, the current disruption poses significant labour-market risks if it is not managed properly. A previous TBI paper, The Impact of AI on the Labour Market, highlighted that between 1 and 3 million jobs could be displaced by AI.

If the government does not respond with effective policies to help people upskill, the ensuing skills mismatch in the economy will reduce the productivity gains from automation and hamper the introduction of new tasks.[_] The UK would fall behind in the global AI economy, leaving swathes of the population unprepared, unemployable and disengaged, straining the welfare system and creating a public backlash against AI. But other outcomes are possible – for example, if managed well, embracing AI could plausibly raise national income by 11 per cent, according to TBI analysis.[_]

Crucially, the difference between these outcomes hinges not only on what firms decide to do, but also on the policy choices made by government. With the right policies – including a coherent approach to curriculum reform, teacher training and access to high-quality digital infrastructure – the government can unlock new sources of economic growth, for example by enabling the UK’s tech and science sectors to thrive, and can build a world of work that is more meaningful, productive and inclusive.

Avoiding Strategic Drift and Competing at the Global Frontier

Investing in an AI-ready workforce is not just a domestic imperative; it is also a strategic move to strengthen the UK’s position on the global stage. Countries around the world are racing to harness AI in education as a lever for economic and geopolitical advantage. They are making bold strides to integrate AI into their education systems, recognising the link between tech-savvy human capital and national power.

Figure 2

How some countries are setting the pace

Country

Examples of core initiatives

South Korea

By 2026, South Korea aims to train all teachers to use digital technology in classrooms, backed by around $740 million of funding between 2024 and 2026.[_]

AI education is offered to elementary, middle and high-school pupils as part of certain subjects, while high-school pupils can also choose AI-related electives. The government is considering introducing a new standalone AI subject.[_]

The government has invested $70 million in digital infrastructure to support AI-powered learning.[_]

Estonia

Estonia launched “AI Leap 2025”, a nationally coordinated public–private initiative providing free AI-powered learning tools/generative-AI models to teachers and pupils. Tools will be tailored to its national curriculum and will be designed to promote critical thinking and skill-building rather than offering direct answers.[_]

The tools will initially be available to 20,000 pupils and 3,000 teachers from autumn 2025, with a view to expanding in 2026.[_] Teachers will get targeted professional training on understanding AI and using tools effectively in the classroom.[_]

About half of Estonia’s schools have employed edtech lead teachers – specialists in digital tools who run on-site labs and train colleagues to integrate technology effectively into their teaching.[_]

Beijing/China

From autumn 2025, Beijing schools will teach courses in AI, either as standalone modules or integrated into other subjects such as information technology.[_]

Instruction will be tailored by school level: primary schools will offer introductory, hands-on experiences; junior high schools will focus on applied understanding in learning and daily life; and senior high schools will emphasise innovation.

In May 2025, China’s Ministry of Education announced a national plan for a tiered AI curriculum. Pupils will progress from basic AI literacy (for example, voice recognition, image classification) in early grades in primary school to machine learning in junior high and algorithm design in senior secondary education. To support this, AI competencies will be integrated into the national teacher-training framework.[_]

Singapore

As part of the EdTech Masterplan 2030, AI tools such as the Adaptive Learning System (customising lesson content to pupil performance), Feedback Assistants (delivering rapid, targeted feedback) and Authoring Copilot (supporting lesson planning) are being rolled out. These are delivered through the Student Learning Space, Singapore’s central online-learning hub for schools, to personalise learning and enhance teaching.[_]

From 2025, all primary and secondary students will take “AI for Fun” modules, offering hands-on, applied learning in machine learning, robotics and ethical use, building on Singapore’s earlier coding programmes.[_]

Launching in 2025, the Smart Nation Educator Fellowship will train a cohort of teacher-leaders in advanced digital and AI pedagogy, with a view to translating this expertise into curriculum design and strategies that build pupils’ digital skills.

Germany

In December 2024, Germany launched Digitalpakt Schule 2.0, a €5 billion programme to upgrade digital infrastructure in schools, train teachers, and support innovative teaching and learning methods.[_]

The initiative includes substantial investment for high-performance wireless local-area network (WLAN), modern devices and digital learning platforms in all schools, while more disadvantaged pupils will be provided with devices.[_]

It supports curriculum reform and digitally integrated pedagogy – including through a dedicated €250 million initiative to support innovative, research-based training – and places a strong emphasis on adaptive, AI-supported learning systems.

Source: TBI

These countries, and others, view AI education as a strategic investment to prepare for the AI era, and are mobilising public and private resources to that end. Failing to modernise its classrooms would leave the UK at a strategic disadvantage – with a workforce less versed in AI than international peers and an economy less able to capitalise on AI-driven growth.

Conversely, with the right policies, the government could ensure that school leavers are AI-savvy and can compete with frontier countries. This would bring economic clout and allow the country to play a global leadership role. In an AI-driven age, countries will look for guidance on how to upskill their populations. If the UK is at the forefront of equipping its citizens with future-ready skills, it would enhance its credibility when advising other governments on how they might do the same. It would also strengthen its voice in global AI-governance debates – for example, by bringing together countries to ensure that the use of AI in education is human-centric, safe and equitable.

Confronting a New Disadvantage Gap and Expanding Opportunity

The attainment gap between disadvantaged pupils and their peers is already alarmingly high. Economically disadvantaged pupils in England, for instance, are 19.2 months behind their peers by the end of secondary school – the highest level since 2012.[_],[_]

In addition to the existing divide in standard academic outcomes, a new fault line will open if all pupils are not trained to be ready for an AI-enhanced world. The disadvantage gap will increasingly hinge not only on pupils’ and teachers’ abilities in today’s core competencies; it will also depend on their ability to develop the faculties required to use AI properly and thrive more generally in an AI-enhanced world.

Those who can properly use AI tools to augment their learning, or profit from AI-enabled teacher-led approaches, will have the chance to transcend the limitations of the current paradigm (where schools lack the resources required to differentiate learning, leaving many behind).

In addition, pupils who can evaluate content and apply it robustly will profit far more from AI than those who passively absorb output with little critical application. Individuals who build higher-order skills that complement AI, such as collaborative problem-solving and ethical reasoning, will have more options in the labour market than those who do not. People who can quickly adapt to future iterations of AI (innovations in AI are occurring more quickly than ever before) will continue to benefit from new advantages while others will fall behind.[_] Pupils who receive an education that hones these aptitudes will, therefore, have a substantial advantage over those who do not.

Without a clear, purposeful strategy to democratise access to the new frontier of learning, those who already benefit most from the current system will also gain most from the AI era. Conversely, with the right focus and direction, all pupils could have the chance to build the skills needed to cope and flourish in an age of AI.


Chapter 2

Why Pupils in England Are Not Prepared for the AI Era

When it comes to AI readiness, the gaps in England’s education system are evident across three core sets of skills that pupils will need to navigate and thrive: AI literacy (how to use and critique AI), human-centred capabilities (communication, creativity and critical thinking) and AI technical skills (how to design, train and deploy AI). Each of these gaps risks leaving young people underprepared for the future.

The AI-Literacy Gap

AI literacy is fast becoming a skills requirement across the labour market. Professional occupations are the largest occupational group in the UK, and have one of the highest levels of exposure to AI.[_],[_],[_] By matching job titles found in LinkedIn job data to Occupational Codes from the International Standard Classification of Occupations (ISCO), TBI analysis found that mentions of AI literacy in professional job descriptions (excluding technology and software roles) have increased by 69 per cent since the latter half of 2024.[_],[_]
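To illustrate the method in simplified form: postings are mapped from job title to an ISCO occupation code, filtered to professional occupations, and the share of descriptions mentioning AI-literacy terms is then compared across time periods. The sketch below is illustrative only – the title-to-code lookup, the keyword list and the field names are assumptions made for the example, not TBI’s actual pipeline.

```python
# Illustrative sketch of the kind of matching described above. The title-to-ISCO
# lookup, the keyword list and the field names are assumptions for this example,
# not TBI's actual pipeline.
from collections import defaultdict

# Hypothetical lookup from normalised job titles to ISCO-08 codes; major group 2
# covers "Professionals". A real analysis would use a far larger crosswalk.
TITLE_TO_ISCO = {
    "solicitor": "2611",
    "accountant": "2411",
    "civil engineer": "2142",
    "secondary school teacher": "2330",
    "sales assistant": "5223",
}

# Hypothetical indicators of AI literacy appearing in a job description.
AI_LITERACY_TERMS = ("ai literacy", "generative ai", "prompt", "copilot", "chatgpt")


def is_professional(title: str) -> bool:
    """True if the normalised title maps to ISCO major group 2 (Professionals)."""
    return TITLE_TO_ISCO.get(title.strip().lower(), "").startswith("2")


def mentions_ai_literacy(description: str) -> bool:
    text = description.lower()
    return any(term in text for term in AI_LITERACY_TERMS)


def share_by_period(postings):
    """Share of professional postings mentioning AI literacy, keyed by period label."""
    totals, hits = defaultdict(int), defaultdict(int)
    for post in postings:
        if not is_professional(post["title"]):
            continue
        totals[post["period"]] += 1
        hits[post["period"]] += mentions_ai_literacy(post["description"])
    return {period: hits[period] / totals[period] for period in totals}


if __name__ == "__main__":
    sample = [  # a tiny made-up extract standing in for the LinkedIn data
        {"period": "2024 H2", "title": "Accountant", "description": "Prepare statutory accounts."},
        {"period": "2025 H1", "title": "Accountant", "description": "Use generative AI tools to draft client reports."},
        {"period": "2025 H1", "title": "Solicitor", "description": "Advise on commercial contracts."},
    ]
    print(share_by_period(sample))  # e.g. {'2024 H2': 0.0, '2025 H1': 0.5}
```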

Today’s pupils are not just digital natives – they are becoming AI natives. But while they are regularly turning to AI, few are learning about it in a structured way.

Since the launch of ChatGPT in November 2022, young learners have embraced generative AI tools at a pace that outstrips the general population. While surveys indicate that only about half of adults are currently using generative AI, uptake among pupils has grown dramatically – from just 14 per cent in April 2023 to more than 70 per cent by December 2024 (Figure 3).

Figure 3

A growing number of secondary-school pupils (Key Stage 3 and 4) are using generative AI tools

Source: Department for Education[_]

Educators are still finding their footing with AI. Recent research shows that only around one in four UK teachers are using AI daily.[_] In one analysis of 28 prominent education-specific AI tools, five were entirely unrecognised by teachers, 18 were known to 10 per cent or fewer and only five had recognition rates between 11 and 50 per cent.[_]

Even among those teachers who are exploring AI, its use is largely limited to a narrow set of tasks. The most common applications include generating lesson materials (used by 73 per cent of AI-engaged teachers) and supporting curriculum planning (47 per cent).[_] These functions, while helpful, represent only a small slice of what AI offers in an educational context.

Meanwhile, few pupils have the chance to learn about AI in a structured or informed way. To better understand how AI literacy is being addressed in schools, TBI commissioned a Teacher Tapp survey of teachers in July 2025. The findings highlight the scale of the challenge: only one in five state secondary-school respondents said that at least some teachers teach pupils how AI works and what it is. Just over a quarter (27 per cent) said at least some teachers support pupils to use AI in their learning. And only one in ten said at least some teachers used AI specifically within subject teaching.

Figure 4

A minority of teachers in England’s state secondary schools teach pupils about AI and help them use it

Source: TBI-commissioned polling[_]

With limited opportunities to learn about AI in a structured or supported setting, many pupils are independently exploring these tools on their own terms. By December 2024 nearly four in ten secondary-school pupils reported using generative AI tools outside any school-related context. Only a small proportion were integrating AI into their academic work: just 8 per cent said they used AI for lessons and homework, and 18 per cent for homework. Even then, it remains unclear whether this use was prompted by teachers or driven by pupils themselves.

Figure 5

Secondary-school pupils most commonly use generative AI tools outside school (Key Stage 3 and 4)

Source: Department for Education[_]

Note: Numbers may not add up to exactly 100 due to rounding.

AI Misuse and the Risk of “Cognitive Offloading”

As pupils increasingly turn to AI, many are doing so without the necessary understanding or support to use it effectively. In 2024, 22 per cent of UK secondary-school pupils reported being unsure of what AI even means while 43 per cent lacked confidence in recognising or understanding the risks associated with AI.[_] Without structured, guided opportunities to engage with these technologies in an educational setting, pupils are left to navigate AI alone.

Perhaps most worryingly, pupils are increasingly using AI to sidestep learning rather than deepen it. According to one study, 40 per cent of secondary-school pupils use it to generate a first draft which they then edit, while 28 per cent use it to improve their own writing. Nearly one in five (19 per cent) use AI to do all their schoolwork.[_] According to another study, one in five (20.9 per cent) of 13- to 18-year-olds said that they usually just copied what generative AI told them.[_]

The core risk here is that pupils become dependent on AI to solve problems for them, rather than using it as a tool to strengthen their own problem-solving abilities. For instance, a recent meta-analysis found that while use of generative AI can lead to higher engagement, improved academic outcomes and stronger higher-order thinking skills, it can also reduce “mental effort” – the cognitive work required to critically assess and tackle problems independently – when used in certain ways.[_]

This “cognitive offloading” is beginning to show tangible effects. Global studies are linking poor patterns of AI use to declining performance in formal assessments – but also show that when generative AI is embedded in a scaffolded, education-focused model that guides rather than simply provides answers, these risks are mitigated.[_],[_] Without adequate guidance, there is a danger that pupils will develop poor academic habits and fail to cultivate key cognitive skills such as critical thinking and analytical reasoning. As a result, they not only misuse AI but also miss out on its most valuable educational benefits.

The Confidence Gap Among Teachers and Families

Given that many teachers and parents/carers lack the confidence to engage with AI, it is perhaps unsurprising that pupils often struggle to use it well. Without guidance at school or support at home, young learners instead often navigate this complex and fast-evolving technology on their own.

Teachers

While many teachers are optimistic about the potential of AI in education, most do not yet feel confident enough to teach with it or about it. According to one recent study, 43 per cent rate their AI confidence at just three out of ten.[_] And as Figure 6 highlights, among teachers who have not used generative AI tools in their role, 64 per cent say this is because they do not know enough about how such tools could be used in their work.[_]

This lack of confidence has a significant impact on adoption. Research shows that teacher self-efficacy is a strong predictor of whether new technologies are embraced.[_] When teachers feel confident, they are more likely to embed AI meaningfully into their pedagogy. But when confidence is low, engagement is stifled.

Figure 6

Teachers and leaders say their lack of knowledge is a key barrier to using generative AI

Source: Department for Education[_]

Meanwhile, formal support for teachers remains scarce. The majority are not receiving any structured training in AI and – as Figure 7 illustrates – the vast majority who are experimenting with AI are doing so without backing from their institutions. It is no surprise, then, that teachers are increasingly calling for clearer guidance on safety and consistent, high-quality professional development to help to turn interest into confident practice.[_],[_]

Figure 7

The majority of teachers who have used AI tools rely on being self-taught

Source: TBI-commissioned polling[_]

Note: CPD stands for continuing professional development.

Families

Parents and carers also play a vital role in shaping how children engage with learning at home – including their interaction with AI. Yet many struggle with even basic digital skills, let alone AI-specific knowledge. Almost a quarter of UK adults struggles even with routine digital tasks.[_]

When it comes to AI itself, parental understanding is often limited and shaped more by media narratives than by real-world experience. Government research shows that while many parents have heard of AI, their knowledge is typically very shallow. Most cannot name AI tools and many associate the concept with science fiction rather than practical applications in education, unless they have first-hand exposure.[_] The debate is further muddied by a tendency to bundle together comments about screen time and mobile-phone bans with the question of using AI as a tool for learning, obscuring the distinct questions that pertain to each.

Parental attitudes towards AI are also mixed. According to government-commissioned research, 29 per cent of parents said they felt positive or somewhat positive about their child using AI for schoolwork, while 38 per cent expressed negative or somewhat negative views.[_] Common concerns include overreliance on AI, data privacy, the reliability of AI-generated content and the fear that unequal access to AI tools could widen existing educational disparities.[_]

Figure 8

Parents in England have mixed views on their children using AI tools for schoolwork

Source: Department for Education[_]

Together, these challenges underscore the urgent need for a coordinated approach to AI literacy – one that supports not only pupils, but also the adults who shape their learning environments. Without building the confidence and skills of both teachers and parents, young people will continue to use AI without the guidance they need to make the most of it.

The Human-Centred-Capabilities Gap

Rather than reducing the importance of human capabilities, AI is increasing demand for the complementary skills that allow people to use it effectively. At the forefront are critical thinking and communication – skills that remain central to success in the labour market, especially in “skills-intensive” roles.

Yet too few people are developing the capabilities needed for the AI era. According to new TBI analysis (see Figure 9), 59 per cent of all UK job vacancies – and all of the skills-intensive vacancies – list critical thinking as a key requirement (a strong or very strong requirement).

Still, employers consistently report shortages: the government’s Employer Skills Survey found that 43 per cent of UK skills-shortage vacancies were caused, at least in part, by a lack of complex analytical skills, surpassing even the gap in digital skills.[_] Meanwhile, complex analytical skills contributed to 47 per cent of the skills gaps identified among existing workers.[_]

Figure 9

Verbal communication and critical thinking are required in most current UK job vacancies

Source: TBI analysis[_]

Communication is another essential human-centred skill that enhances the effective use of AI. As Figure 9 highlights, this skill is also prized by employers: in 2025, just over half of UK job vacancies prioritise speaking and communication abilities, rising to 93 per cent in the top 50 per cent of roles based on skill intensity.

Yet many in the workforce are falling short. For instance, according to a recent study by the Chartered Institute of Personnel and Development, only 38 per cent of employers believe young workers (aged 16 to 24) are generally good communicators in the workplace.[_]

The AI-Technical-Skills Gap

As AI technologies are adopted across industries, employers are scrambling to hire specialists who can design, build and train AI systems – and demand is rapidly rising. Recent labour-market data indicate that job postings for AI-skilled roles are growing much faster than the overall job market – in fact, job postings requiring specialist AI skills (such as machine learning) have grown 3.6 times faster than all UK postings over the past decade.[_]

TBI analysis of LinkedIn job postings found 96,533 software and technology jobs posted since July 2024. In 2025, 23 per cent of these jobs required some level of AI technical skills, compared with 11 per cent in the latter half of 2024. In some cases, the need is particularly high – for instance, 50 per cent of software-developer vacancies and 77 per cent of data-scientist vacancies now require AI technical skills.

Figure 10

There has been an increase in AI skills requirements in software and technology jobs posted on LinkedIn in the UK since H2 2024

Source: TBI analysis of LinkedIn job descriptions provided by the Bright Initiative.

Note: Numbers may not add up to exactly 100 due to rounding.

Yet many employers cannot find people with these skills. In 2025, 52 per cent of UK technology leaders reported an AI-related skills shortage in their teams, more than double the share from the previous year.[_],[_] AI has become the UK’s largest and fastest-growing tech-skills gap, jumping from the fifth most scarce tech skill to the top spot in just 18 months.[_]

All the while, the STEM-skills gap means that there are not enough people who can easily transition into AI and data-science roles from other jobs. For example, 49 per cent of engineering and technology businesses in the UK report difficulty recruiting due to skills shortages.[_]

The struggle to hire AI specialists has led companies to adjust their hiring practices and incentives. Some are de-emphasising formal degrees, reflecting a shift to skills-based hiring to broaden the talent pool.[_] Many employers are paying substantial salary premiums for AI and data expertise – on average, UK jobs demanding AI skills command a 14 per cent wage premium.[_]

The pressure is now on the education system to adapt. To have more AI engineers, data scientists and machine-learning specialists in the future, more pupils today need to engage deeply with subjects such as computing. However, current participation levels in this area are worryingly low.

Although computing is officially part of England’s National Curriculum until Key Stage 4, in practice, pupils receive only minimal exposure – often as little as one hour per week in Key Stage 3. Schools under pressure to maximise exam results often sideline non-examined computing content, meaning the curriculum requirement is frequently ignored. In addition, the current curriculum emphasises basic and general computer-science principles rather than specific emerging technologies like AI.

In addition, pupils are not required to take computer science at GCSE and the majority do not. A fifth of non-selective state schools do not offer GCSE computer science at all, while just 13 per cent of the Key Stage 4 cohort take GCSE computer science.[_],[_]

An Emerging Social Divide: Who Gets to Develop the Skills Needed to Thrive?

As AI becomes increasingly central to learning and work, a new divide is emerging – not just between those who use AI and those who do not, but between those who understand how to use it well and those who are left behind. While AI literacy will be essential for almost all pupils, the opportunity to develop this is far from evenly distributed. Nor is the chance to build deeper technical knowledge, which is critical for future AI-specific careers.

Four fault lines are beginning to define this new inequality: who gets to use AI at school in the first place; access to foundational AI instruction; opportunity to pursue deeper technical pathways; and the accelerating advantage of independent schools in promoting AI proficiency.

Fault Line 1: Who Gets to Use AI at School in the First Place?

Although pupils are using AI more frequently, much of this use is happening outside the classroom, and there is a growing divide in how educational settings are integrating these tools. Teachers in schools rated “outstanding” by Ofsted are three times more likely to have received formal training, and three times more likely to report a school-wide strategy, than teachers in schools rated “requires improvement/inadequate”.[_]

Unequal access to digital infrastructure is further compounding disparities in AI readiness. According to one survey, 72 per cent of pupils in Ofsted-rated “inadequate” schools lacked access to individual devices in class, compared to 59 per cent in “outstanding” schools – significantly restricting their ability to engage meaningfully with AI tools.[_]

More broadly, a recent Teacher Tapp analysis shows that early adopters of AI in schools are disproportionately younger, male and concentrated in independent schools – meaning the early benefits of AI exposure are accruing to groups already advantaged by age, gender and institutional resources.[_]

Fault Line 2: Access to Foundational AI Instruction

While generative AI tools are already being used by most pupils, most of that engagement happens informally. Few pupils receive meaningful guidance on how to use AI critically or responsibly, and this is particularly the case in disadvantaged schools. For instance, new analysis commissioned by TBI and carried out by Teacher Tapp in July 2025 reveals that pupils in more disadvantaged schools are significantly less likely than their peers in other schools to receive structured teaching on how AI works and what it is, or how it can be applied within specific subjects.

Figure 11

Less affluent schools are less likely to focus on teaching foundational AI

Source: TBI-commissioned polling[_]

Fault Line 3: Barriers to Deeper Technical Engagement

Access to the pathways that enable deeper, technical engagement – particularly through computing education – remains highly uneven, too. For example, in the 2023/24 academic year, just 21 per cent of pupils who took GCSE computer science were girls.[_] This stark underrepresentation risks shutting most girls out of one of the most important fields of the future.

The picture is equally troubling when it comes to socioeconomic background. Schools serving the communities with the lowest uptake of free school meals are 60 per cent more likely to offer GCSE computer science than those with the highest uptake.[_]

Without access to this core subject, pupils miss out on the key technical skills that underpin more advanced study in AI and related fields. Without action to widen access, the AI workforce of tomorrow will be shaped by today’s imbalances – reinforcing inequality, limiting diversity and weakening the UK’s overall talent pipeline in a field of growing strategic importance.

Fault Line 4: The Accelerating Advantage of Independent Schools

Independent schools are offering more structured, hands-on and ambitious learning opportunities than most state-funded schools can currently provide. For instance, a recent study found that independent schools are three times more likely than state schools to have a clear school-wide strategy on using AI.[_] At one school, pupils are already training their own machine-learning models using text, image and audio inputs, and experimenting with programming languages to understand how AI systems work.[_]

State schools also trail the independent sector when it comes to accessing devices: for instance, only 1 per cent of primary and 7 per cent of secondary state schools lend devices to pupils, compared to 38 per cent and 20 per cent respectively in private schools.[_]

New analysis commissioned by TBI and conducted by Teacher Tapp highlights that pupils attending state-funded schools are markedly less likely than their peers in the independent sector to receive structured instruction on AI. This includes teaching on core AI concepts – such as what AI is and how it functions – as well as practical guidance on how to use AI to support learning and targeted teaching on how AI can be applied within specific subject areas.

For instance, independent-school teachers were 2.5 times more likely than their state-school peers to report that anyone at their school teaches pupils how to use AI in their learning. They were also almost three times more likely to report that at least some teachers teach pupils what AI is, and over 3.5 times more likely to report that anyone teaches pupils how to use AI within each subject.

Figure 12

Access to teaching of AI skills is far higher in private schools than in state schools

Source: TBI-commissioned polling[_]

Taken together, these disparities are laying the groundwork for a two-tier education system – one in which a select group of pupils are equipped to thrive in an AI-driven world, while many others are left behind with only limited or superficial exposure. This divide is not just about access to technology; it is about access to opportunity, confidence and the skills that will define success in the decades ahead. If unaddressed, it risks creating a new and enduring disadvantage gap – one that will shape pupils’ academic achievement, career prospects and ability to participate fully in civic life.


Chapter 3

A Plan to Build AI Proficiency in England’s Schools

The picture set out so far in this paper is not inevitable. With the right choices, England can lead the way in preparing young people for an AI-driven world – equipping every pupil with the skills to thrive in the economy of the future. In doing so, it can help secure the UK’s long-term prosperity, reduce inequality and strengthen its position on the global stage. Realising this opportunity will require the government to take urgent and deliberate action across four pillars – pupils, teachers, families and infrastructure.

Pillar 1 – Pupils: Make AI Proficiency a Core Outcome of Schooling in England

AI capabilities are no longer optional – they are critical life skills. Just as children today learn to read and write to navigate a world of language, so too must they learn to understand the systems of AI that increasingly shape their decisions, opportunities and social interactions. This means going beyond passive exposure to tools and instead fostering a critical understanding of what AI is, how it works, how to interpret AI-generated content, and how to use it safely, ethically and effectively. Equally, they need human-centred capabilities – such as critical thinking, communication and ethical reasoning – to use AI well. And for future innovators, clear technical pathways must ensure pupils can go beyond using AI to designing and building it.

Recommendation: Embed foundational AI literacy in the primary curriculum from Key Stage 2, using the SEAME framework (Social & Ethical, Application, Model, Engine).

England’s primary curriculum makes no mention of AI or the data-driven systems that underpin it. At Key Stages 1 and 2 (ages 5 to 11), it focuses on understanding algorithms, writing basic programs, using digital tools creatively and staying safe online. These are essential foundations but are no longer sufficient.

Government should embed AI literacy into the core primary curriculum. It should include clear, age-appropriate learning objectives related to AI, which should align with the SEAME framework – a structured model developed by the Raspberry Pi Foundation that breaks AI literacy into four components. Learning outcomes should begin in Key Stage 2 and the framework should also form the basis for progress into early secondary education, ensuring a coherent developmental pathway.[_]

As these curriculum changes are introduced, government should maintain and strengthen foundational computing skills – particularly programming in age-appropriate platforms, as well as core elements of computational thinking. These skills are not made redundant by AI; on the contrary, understanding how AI systems function requires a strong grasp of logic, abstraction and structured problem-solving.

Embedding AI into the core curriculum would give pupils the opportunity to build these critical-thinking and pattern-recognition skills early on. A light-touch, age-appropriate introduction – focused on core principles rather than technical depth – would prepare all pupils to engage critically with AI as they progress through school. And it would help demystify AI from an early age, cultivating curiosity and confidence rather than passive consumption.

Recommendation: Require all Year 7 pupils to take a standalone, age-appropriate AI-literacy module, introducing key concepts, applications and ethical considerations.

Most pupils leave school with no real understanding of AI – and are aware of this. For instance, more than two-thirds (69 per cent) of UK secondary-school pupils want more education on generative AI.[_]

Although computing is officially part of England’s National Curriculum until Key Stage 4, its delivery is weak. In practice, schools under pressure to maximise exam results often sideline non-examined computing content, meaning the curriculum requirement is frequently ignored. Secondary pupils receive only minimal exposure – often as little as one hour per week in Key Stage 3 – and the current curriculum emphasises basic and general computer-science principles rather than specific emerging technologies such as AI.

In addition, pupils are not required to take computer science at GCSE and the majority of pupils do not do so. A fifth of non-selective state schools do not offer GCSE computer science at all, while just 13 per cent of the Key Stage 4 cohort take GCSE computer science.[_],[_]

What is required at secondary level is continuity with the principles and learning that pupils would acquire as a result of TBI’s recommended changes to the primary curriculum, again based on the SEAME framework.

In the first instance, all pupils should undertake an age-appropriate, standalone AI-literacy module in Year 7, marking the transition into secondary education. Schools would receive funding to commission high-quality provision from trusted external organisations. The module would introduce key concepts such as the difference between narrow and general AI, common applications across industries, and the capabilities and limitations of AI systems. It would also explore ethical dimensions – addressing issues such as bias, data privacy, misinformation and the role of human oversight. Pupils would learn best practices for using AI tools, including how to ask questions and verify outputs.

It would offer this content in a way that is accessible, engaging and developmentally suitable for 11- to 12-year-olds. Content would be designed in line with the latest best practice on digital safety and responsible technology education, and would not involve direct use of generative AI tools, in recognition of platform age restrictions.

Recommendation: Revise the statutory computing curriculum (secondary) to cover how data are used in AI systems, alongside applied-AI competencies such as digital-media use, basic cyber-security and data literacy.

In addition, the existing statutory computing curriculum (secondary) should be revised to better reflect the digital age. As well as programming and algorithms, it would cover how data are used in AI systems, together with applied-AI competencies such as digital-media use, basic cyber-security and data literacy.[_],[_]

Recommendation: Require all pupils to complete a Certificate in Applied Computing before leaving school, combining modular, self-paced learning with hands-on projects. This would cover core programming, computational thinking and real-world digital skills such as prompting and critiquing AI, using data, basic coding and responsible tool use.

Finally, all pupils should work towards and complete a Certificate in Applied Computing by the end of Key Stage 4. This qualification would combine modular, self-paced learning with hands-on project work and digital-portfolio development. In addition to core programming and computational thinking skills, there would be a focus on real-world digital skills – such as prompting and critiquing AI systems, using data effectively, applying foundational coding, and the safe and responsible use of tools. The certificate would be flexibly timed: pupils could complete it at any point during secondary school, depending on readiness. Pupils who have not completed the certificate by the end of Key Stage 4 would be expected to continue progressing towards it, with completion required by the end of compulsory education.

Recommendation: Reform GCSE computer science to emphasise practical application, incorporating machine learning, ethics and project-based assessments such as building chatbots or analysing data sets.

A significant share of future jobs will demand deeper technical expertise. Government should take action to strengthen the pipeline of talent equipped to design, develop and deploy AI systems.

While advanced technical skills are developed in later stages of education, they rely on a strong foundation laid much earlier. The content of the current GCSE in computer science is academically narrow and outdated relative to the fast-evolving tech landscape. It focuses heavily on programming fundamentals, algorithms, computer architecture and theory, but does not include topics such as machine learning, data analytics or AI system design. It is also abstract and lacks practical application.

The GCSE in computer science should be reformed. It should be broadened to explicitly incorporate applied AI and machine learning earlier in the curriculum. This would expose pupils to core concepts such as training data, algorithms, model accuracy and the ethical implications of automated systems.

The revised curriculum should also emphasise practical, real-world problem-solving. Pupils should be given opportunities to engage in project-based learning, where they design and implement AI-driven solutions to real challenges – such as building a chatbot, creating a simple image classifier or analysing data to inform decisions. These kinds of projects mirror how AI is used in professional contexts.
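
To illustrate the level of practical work envisaged, the sketch below shows the kind of minimal, rule-based chatbot a pupil might build in Python as a starting point before contrasting it with a machine-learning approach. It is indicative only: the school-themed rules and wording are illustrative assumptions, not part of any proposed specification.

    # A minimal rule-based chatbot: a plausible starting point for a GCSE-style
    # project, before pupils contrast it with a machine-learning approach.
    # All rules and wording below are illustrative only.

    RULES = {
        "opening hours": "The library is open 08:30 to 16:00 on school days.",
        "homework": "Homework is set on Mondays and due the following Monday.",
        "lunch": "Lunch is served between 12:15 and 13:30.",
    }

    def reply(message: str) -> str:
        """Return the first canned answer whose keyword appears in the message."""
        text = message.lower()
        for keyword, answer in RULES.items():
            if keyword in text:
                return answer
        return "Sorry, I don't know about that yet."

    if __name__ == "__main__":
        print("School helper bot (type 'quit' to exit)")
        while True:
            question = input("> ")
            if question.strip().lower() == "quit":
                break
            print(reply(question))

Working through why such a hard-coded approach breaks down as questions become more varied is itself a useful route into the concepts of training data and model accuracy that the reformed GCSE would cover.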

Recommendation: Review and revise the English Baccalaureate and Progress 8 frameworks to better incentivise the uptake of computer science.

Uptake of computer science at GCSE remains low, despite growing demand for technical computing skills in the labour market. Just 13 per cent of the Key Stage 4 cohort take GCSE computer science.[_]

This is not simply a matter of pupil interest: provision itself is patchy. A fifth of non-selective state schools do not offer GCSE computer science at all.[_]

These trends point to a broader structural issue. The English Baccalaureate (EBacc) and Progress 8 accountability measures are over-prescriptive. Pupils are nudged to take a combination of subjects – English (two GCSEs), maths (one), science (up to three), a foreign language, and either history or geography. Pupils take just under eight GCSEs on average and, in practice, there is often little room for other subjects, including computer science.

The result is an imbalanced subject hierarchy that reduces pupil choice and restricts access to disciplines that are crucial for participation in the future workforce. Pupils should have greater agency to choose subjects aligned with their interests and future ambitions.

Government should, therefore, reform the EBacc framework, with a view to relaxing its subject constraints and making space for high-value technical and creative subjects, including computer science. Progress 8 weightings should also be revised to better reflect the full range of disciplines needed in a modern, AI-augmented economy.

Recommendation: Treat AI as a cross-curricular competency. Integrate AI across all subjects to enhance learning and higher-order skills, with updated subject specifications and assessable outcomes.

AI literacy should not be viewed as a standalone skill, but rather as a foundational competency that enhances learning across the curriculum. To reflect this, the national curriculum should embed an expectation that pupils will use AI as part of their learning and this integration should be accompanied by clear, assessable outcomes. Subject specifications should be updated accordingly, ensuring AI is meaningfully woven into each discipline and used to support critical inquiry.

This proposal does not reopen the counterproductive debate that pits “knowledge” against “skills”. Foundational subject knowledge remains indispensable and the curriculum should be designed to protect it. Pupils must acquire a secure understanding of core concepts, and this knowledge is the basis for discernment (being able to ask good questions, distinguish truth from misinformation and make informed decisions).

AI literacy should be developed in parallel with this knowledge base. Used in this way, AI can help pupils cultivate essential higher-order skills. For example, they can use AI to interrogate texts and data, identify patterns and relationships, generate creative outputs, and refine communication and reasoning. These are precisely the human-centred capabilities that complement AI.

To ensure meaningful integration, an appropriate assessment framework should accompany these changes. One promising approach would be to evaluate the quality of pupils’ prompts and questions when using AI systems. Many AI platforms already capture these data, making them feasible to track and assess. Rather than rewarding pupils for producing “correct” answers, assessments could instead recognise thoughtful engagement, the use of subject knowledge to evaluate AI responses and evidence of deepened understanding facilitated by AI use.
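
As a purely illustrative sketch of how prompt-quality assessment might work, the following Python snippet scores a pupil’s prompt against a simple rubric. The criteria, keyword checks and weights are hypothetical assumptions for demonstration, not a proposed marking scheme.

    # Illustrative only: a toy rubric for scoring the quality of a pupil's prompt.
    # Criteria, keyword checks and weights are hypothetical, not a marking scheme.

    CRITERIA = {
        "gives_context": (lambda p: len(p.split()) >= 12, 2),
        "asks_for_reasoning": (lambda p: "why" in p.lower() or "explain" in p.lower(), 2),
        "requests_evidence": (lambda p: "evidence" in p.lower() or "source" in p.lower(), 3),
        "avoids_answer_seeking": (lambda p: "write my" not in p.lower(), 1),
    }

    def score_prompt(prompt: str) -> dict:
        """Return per-criterion results and a weighted total for one prompt."""
        results = {name: check(prompt) for name, (check, _) in CRITERIA.items()}
        total = sum(weight for name, (_, weight) in CRITERIA.items() if results[name])
        return {"criteria": results, "score": total}

    if __name__ == "__main__":
        example = ("I think the alliance system was the main cause of the first world war. "
                   "Explain why historians disagree and what evidence each side uses.")
        print(score_prompt(example))

In practice any such rubric would need to be designed and validated by awarding bodies; the point of the sketch is simply that prompt data already captured by platforms can be assessed systematically rather than impressionistically.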

Ofqual and the DfE should work with awarding bodies to co-design new assessment models that reflect this evolving pedagogical landscape. This could include developing task-based assessments involving structured AI use or piloting new marking criteria for AI-supported work. And by embedding AI into curriculum and assessment, awarding organisations would also be incentivised to innovate.

Spotlight

How Reformed Curriculums Could Weave in AI Literacy and Support Higher-Order Skills

History

  • Core knowledge: Pupils learn to explain militarism, alliances, imperialism, nationalism and the 1914 trigger for the first world war.

  • Retrieve first: Pupils draft a mind-map of the key causes of the first world war from memory.

  • AI as probe: Pupils ask a large language model (LLM) for questions that reveal gaps in their knowledge; each suggestion is accepted, revised or rejected with justification; pupils can bring in current historians’ debates (so long as these are properly referenced).

  • Collaborative synthesis: Groups of pupils draft a three-minute script about the 1914 debate; AI can polish language but cannot add new facts.

Biology

  • Core knowledge: Pupils master reactants/products of photosynthesis and aerobic versus anaerobic respiration.

  • Retrieve first: Pupils label a blank flow diagram unaided.

  • AI as challenger: Pupils ask the LLM to spot omissions and pose “why” questions; pupils answer, then compare with AI explanations.

  • Pair project: Pairs design a one-slide pitch on photosynthesis, which they then present; AI use is confined to layout tips and gap-spotting.

Recommendation: Overhaul curriculum review by creating a statutorily independent body to lead curriculum and assessment reform, with government retaining final say. Introduce staggered subject-review cycles and build a data platform, supported by expert input and AI-augmented consultations, to integrate diverse evidence and reflect real-world needs.

While the preceding recommendations would help ensure that pupils are equipped to engage with AI in the present, England’s education system must also be restructured to remain responsive to the fast-evolving skills demands of the future.

At the heart of this challenge is the way curriculum and assessment changes are currently designed and implemented: they are irregular, politically driven and insufficiently responsive to changing demands. This makes the system vulnerable to long periods of stagnation punctuated by disruptive, top-down overhauls.

To address this, government should establish a permanent, statutorily independent body to oversee curriculum and assessment reform. This body should operate on a non-partisan basis and be tasked with conducting regular, transparent and data-informed reviews of curriculum content and subject specifications. To retain a democratic anchor, the government would have ultimate responsibility for approving any changes, but the review process itself should be independent and, in practice, the government would need to provide clear justification for rejecting its proposals.

One of the core functions of this new body would be to provide schools and teachers with clarity and certainty around when and how the curriculum will change. This could be achieved by setting out fixed review cycles for each subject area, with clear advance notice of any planned updates.

Crucially, the review schedule should be staggered across subjects to avoid overwhelming schools with simultaneous reforms. Review intervals should be long enough to allow new content to bed in, but regular enough to prevent the curriculum from becoming outdated.

This approach would allow school leaders and teachers to plan curriculum implementation, professional development and resource procurement with confidence. It would also enable pupils to benefit from a stable and coherent learning experience, rather than being caught in the churn of reform.

To ensure curriculum changes are informed by the realities of the modern world, the independent body should be supported by a comprehensive data platform consolidating the latest evidence and insights from:

  • parents and pupils

  • experts and practitioners, including educators, developmental psychologists, neuroscientists, employers and academic specialists

  • pedagogical research, ensuring reforms are grounded in what works in the classroom

  • labour-market trends, highlighting emerging sectors and in-demand skills

  • global education systems, offering lessons from international best practice

Where doing so can expedite and improve the quality and breadth of stakeholder feedback, the government should also run rapid-cycle consultations augmented by AI – for example, drawing on tools from initiatives such as the Incubator for Artificial Intelligence – to broaden participation and deepen the evidence base.

This approach would ensure that the curriculum remains agile and relevant, capable of evolving in step with technological progress, societal shifts and labour-market needs. For teachers and school leaders, it would provide predictability and reassurance, enabling them to implement changes with support and without unnecessary disruption. And it would ensure pupils are taught a curriculum that is aligned with the skills and knowledge they need to thrive.

Pillar 2 – Teachers and Leaders: Build Workforce Confidence and Capacity to Teach With and About AI

Pupils cannot build AI proficiency unless leaders and teachers are equipped to support it. While the government has rightly begun issuing guidance on AI use, this is not enough on its own. Teachers are broadly optimistic about the possibilities AI presents, but most lack the training and confidence to teach with and about AI. They also juggle an overwhelming array of responsibilities, which makes it hard to upskill without focused support.

Recommendation: To localise training and support, create a national network of AI hubs offering peer-led professional development and teacher accreditation. Introduce government-funded fast-track pathways for AI lead teachers in every school, tied to career-boosting progression and leadership routes.

A decade ago, the government introduced a successful programme to transform maths education, based on two linked vehicles: a network of “maths hubs” (to build teaching expertise and provide localised professional development) and a nationally supported “maths mastery” approach (to define a shared pedagogical vision). The logic that underpins these steps should be applied to AI.

First, there should be a network of regional “AI hubs” that act as localised centres of excellence for teacher training, curriculum development and peer-led professional learning for AI and computing more broadly. The hubs would initially be hosted by schools already experimenting with AI. Each hub would have a dedicated lead for AI pedagogy and form partnerships with satellite schools, universities, technology companies and education researchers.

AI hubs would run structured professional development sessions on AI and computing more broadly, and participating teachers would be supported to transpose what they learn to different key stages and school contexts. Small groups of teachers would engage in collaborative, inquiry-led continuing professional development over several months, rather than one-off training days. Teachers would have the chance to peer review each other’s lessons. Early-career teachers would be paired with more experienced AI educators to build confidence, and there would be opportunities to pilot new approaches, with a view to building a communal evidence base.

To drive participation, teachers who engage would receive formal accreditation. New, fast-track pathways would be created for teachers to become “AI lead teachers” in their schools, linked to opportunities for career-boosting progression and leadership roles. Funding would be available to cover participating teachers’ time – both to engage with AI-hub activity and to fulfil their roles as AI leads within their schools, where they would share their learnings and help embed change.

Spotlight

Estonia’s Approach to Diffusing Cutting-Edge AI Techniques to Teachers

Around half of the schools in Estonia have hired edtech professionals. These specialised leads do not necessarily undertake traditional subject teaching; instead, they focus on helping to bridge the gap between educators and digital tools. They run on-site learning labs and support other teachers through seminars on how to use technology in their subjects.

In 2025, the Ministry of Education and Research launched a public–private partnership, AI Leap 2025, which aims to provide free access to advanced AI-learning applications to tens of thousands of pupils and teachers, starting in September 2025.

The existing edtech network has been briefed on the latest developments in AI technology and will take part in intensive AI training to build AI literacy, embed tools into teaching and support ethical use.

Source: TBI conversations with Estonia’s Ministry of Education and Research, and Education Estonia[_]

Recommendation: Address the low uptake of GCSE computer science among girls by building on targeted initiatives, with AI hubs and school AI leads incubating new approaches and mainstreaming proven ones.

AI hubs and AI lead teachers should also play a key role in closing the gender gap in computing participation. Targeted interventions can boost girls’ engagement in STEM and these encouraging initiatives should be built on to tackle the low uptake of GCSE computer science.[_] AI hubs and leads should help incubate and test new approaches – working with partners who already adapt models for schools – and disseminate evidence-led practice by evaluating what works, sharing insights and supporting wider adoption. In this way, they can help turn promising pilots into mainstream practice and begin to close the persistent gender gap.

Recommendation: Launch an AI-mastery pedagogical framework to guide how AI is embedded in teaching and learning, evolving over time as technology advances and best practice emerges.

Second, there should be an AI-mastery framework to give schools a shared language and pedagogical structure around the foundational principles of AI. Because AI capabilities are changing quickly, mastery cannot take the form of a fixed approach. Instead, it should emphasise core principles and critical approaches, enabling teachers to help pupils understand AI foundations, interrogate outputs and reflect on their implications. Over time, a National Centre for AI in Education could codify emerging best practice, co-develop resources with early-adopting teachers and experts, and adapt guidance as the technology evolves.

Recommendation: Update the Teachers’ Standards and Early Career Framework to include AI competence.

In addition to these vehicles for diffusing specialist know-how, all teachers should be supported to reach a core level of competence and confidence in AI skills. Government should introduce a new AI teaching standard, and AI literacy should be embedded into the Early Career Framework (ECF).

The Teachers’ Standards set out the minimum level of practice expected from trainees working towards qualified teacher status (QTS) and those completing their statutory induction period. (They are also used to assess the performance of QTS-qualified teachers working in local authority-maintained schools.)

The current version of these standards does not reflect the pace of technological change or the emerging role of AI in education. In 2012, a requirement for digital literacy was removed altogether. A modernised version of that requirement, with a strong focus on AI, should be reinstated into the standards. Such a move would signal that AI competence is a core expectation of the teaching profession, not an optional extra, and would drive cultural and systemic change.

Under these changes, every teacher would be supported to plan, deliver and assess learning safely and effectively with AI tools, and to embed basic AI literacy into their teaching. The government could draw lessons from international examples (such as Singapore and Estonia, which have embedded equivalent digital-pedagogy standards) to help inform the design of the new standard.

The ECF serves as the foundation for the induction of all new teachers in England, covering their first two years of practice, and is the ideal vehicle for introducing AI fluency at scale from the outset of a teacher’s career. To prepare early-career teachers for a rapidly evolving digital landscape, the DfE should introduce an AI-literacy component into the ECF. This would align with proposed revisions to the Teachers’ Standards.

Recommendation: Introduce a new National Professional Qualification (NPQ) in Leading AI and Digital Innovation to support school leadership.

There is still widespread inaction when it comes to addressing the implications of generative AI (see Figure 13). Many school leaders report having no current plans to address the issue: 52 per cent in primary schools, 17 per cent in secondary schools and 54 per cent in special schools. Only a small minority have already made changes – just 2 per cent of primary and 10 per cent of secondary schools. While some leaders are beginning to explore their options, decisive action remains the exception, not the rule.

Figure 13

Only a small minority of schools have made changes to account for generative AI

Source: Department for Education[_]

Note: Numbers may not add up to exactly 100 due to rounding.

While some school leaders already have the technical confidence to guide AI adoption effectively, many do not. For a large proportion, the rapid pace of technological change, and the complexity of issues surrounding data protection, safeguarding and pedagogy, have understandably created uncertainty and hesitation. School leaders need structured opportunities to develop the knowledge required to navigate AI safely, strategically and in ways that align with their schools’ broader mission.

National Professional Qualifications (NPQs) are government-recognised, DfE-funded leadership and specialist development programmes for teachers and school leaders in England. They are designed to support career progression and improve school performance by offering structured, evidence-informed training in areas such as behaviour, literacy, teaching and senior leadership. Yet current NPQs do not provide a dedicated route for developing expertise in AI.

The DfE should introduce a new NPQ in Leading AI and Digital Innovation. This new NPQ would equip educators with the knowledge and skills needed to lead the development of AI-related staff training and digital strategy; address ethical, safeguarding and data-protection concerns; and promote AI literacy across the curriculum.

To avoid adding financial pressure on schools, the DfE should fund this NPQ on the same basis as it funded other NPQs in its Covid-19 recovery plan, making it free to eligible state-school staff and delivered by accredited providers. Schools may need to release participants for training time, but this would align with existing NPQ delivery models, which are structured to minimise disruption.

Pillar 3 – Families: Equip Parents and Carers to Support AI Readiness at Home

Parents and carers play a vital role in shaping how children learn at home, including how they engage with AI. However, many lack the knowledge required to support AI proficiency, and attitudes towards AI in education are mixed. There needs to be a clear, engaging strategy to bring parents into the conversation.

Government research suggests that early conversations with parents about AI have often been marked by confusion, scepticism and a limited ability to imagine how AI could realistically be used in schools.[_] Yet crucially, when parents are presented with clear, concrete examples of how AI supports teaching and learning – and when safeguards are clearly explained – they are much more open and willing to engage.

However, most schools have not yet begun communicating with parents about AI. This is largely due to the newness of the technology, as well as the many competing demands schools face with limited capacity. In one survey, 60 per cent of parents reported receiving no information from their child’s school regarding the use of generative AI in teaching.[_] In the absence of guidance, many parents have formed their views in a vacuum – according to government research, public understanding of AI is very low, leading to scepticism and widespread confusion around what AI is and how it works.[_]

Recommendation: Require every school to create a parental-engagement plan on AI, driven by the school’s new AI lead. This should explain AI use, safeguards and how parents can support learning. Schools would choose their own engagement methods, using resources curated by the Department for Education (DfE) if desired.

Instead of hearing only dystopian scenarios, parents should be exposed to AI’s potential from sources they trust. As understanding improves, scepticism can give way to informed optimism.

To support this shift, the AI leads recommended in this report should guide each school’s strategic approach. These leads should also oversee schools’ parental engagement on AI.

Each lead should publish a simple, accessible summary of how AI is used in their school. This could clearly explain, for example, that AI tools may help teachers plan lessons, that human oversight is always maintained when decisions are made about pupils’ progress, and that all data are anonymised, protected and never shared improperly. Such transparency would address common fears and demonstrate that parental concerns are being taken seriously.

Schools should be given flexibility in how they engage parents and would be able to draw from a DfE-curated suite of learning materials co-designed with parents, educators and AI experts, if they so wished. Each school could choose the approach that best fits its community – whether through online webinars, mobile app-based bite-sized learning or drop-in sessions for parents.

Other options might include scenario-based discussions exploring how AI is used in the classroom, orientation sessions for new families, or hands-on workshops where parents can try out AI tools and see teacher- or pupil-led demonstrations – such as how large language models support essay planning or how AI can assist pupils with special educational needs.

Recommendation: Establish a national network of parent ambassadors, trained to provide peer-to-peer support, run tech demonstrations, host online groups and advise on home-based AI learning. Fund outreach with micro-grants, prioritising disadvantaged communities.

Peer-led advocacy would also help to build trust in AI and bolster parents’ confidence in using it to support their children’s learning. A grassroots movement would position parents as partners and leaders in the transition to an AI-enabled education system. Having visible parent ambassadors in every school would send a signal that AI in education, properly used, is nothing to fear.

The government should, therefore, sponsor a parent-led network that would train and mobilise parents to help other parents navigate AI in education. Schools would identify a group of interested parents who are digitally savvy or simply enthusiastic to learn and lead. These volunteers would undergo training on best practices. They would then become a go-to resource for other parents.

Over time, this could evolve into a national network, where parents from different regions exchange tips and common concerns are fed to policymakers.

These parents would be a first port of call for practical questions about how to set up and navigate tools at home. They could run WhatsApp groups for other parents, where they might share bite-sized use-case clips or where parents could share relatable testimonials. They could host informal drop-in clinics to address parents’ concerns (showing, for instance, how to support a child to use AI well and think critically while doing so). And micro-grants would be available to run local events, tech try-out sessions and myth-busting campaigns in partnership with schools, PTAs, libraries and community centres.

Pillar 4 – Infrastructure: Build the Digital Foundations for AI Readiness

The digital infrastructure for AI readiness is not yet in place. Using AI depends on fast, reliable connectivity, modern devices and the technical capacity to support their effective use. Yet across England, the digital foundations needed to make this a reality remain uneven and underdeveloped – both in schools and at home.

While most schools have broadband connections, many of these connections are patchy and inconsistent, and lack the bandwidth to stream high-volume content or support large numbers of devices simultaneously. According to one study, only 68 per cent of schools report having reliable WiFi.[_]

In addition, new analysis commissioned by TBI and conducted by Teacher Tapp reveals that progress in expanding reliable WiFi coverage in schools between 2024 and 2025 has been modest at best. The most notable improvement is in secondary schools, where full-school coverage increased only slightly from 49 per cent to 53 per cent. Meanwhile, rates of partial coverage in both primary and secondary settings have remained unchanged or even declined – underscoring a persistent infrastructure gap that continues to undermine digital readiness across the education system.

Figure 14

Reliable WiFi remains out of reach for many schools

Source: TBI-commissioned polling[_]

Device availability remains a significant barrier to digital readiness. New analysis commissioned by TBI and carried out by Teacher Tapp shows that, although more schools are now providing pupils with access to laptops and tablets for in-class learning compared to 2024, overall access is still very limited. The data also reveal a sharp divide between primary and secondary schools, with secondary pupils remaining notably more underserved – particularly in relation to tablets. While primary schools recorded modest improvements in device access between 2024 and 2025, progress in secondary settings has lagged, underscoring persistent inequalities across key stages.

Figure 15

Access to laptops and tablets for in-class learning is still low

Source: TBI-commissioned polling[_]

The picture at home is equally concerning. Some 34 per cent of parents of school-aged children report that their child does not have continuous access to a device at home on which they can do their online schoolwork.[_]

Unlocking the full benefits of AI in education will depend on closing these infrastructure gaps. The DfE has set out 11 digital and technology standards for schools to meet by 2030, including secure broadband, resilient WiFi, cloud-based systems, safeguarding protocols and technical support. But current investment is not enough to deliver them at scale.

The UK risks falling behind global frontrunners that are rapidly modernising their education systems to support digital and AI readiness. Other countries have already taken bold, system-wide steps to close infrastructure gaps. For example, Japan’s GIGA project completed 97.6 per cent of its planned delivery of connected hardware devices between 2019 and 2021, in a quarter of the originally planned time.[_] In December 2024, Germany launched Digitalpakt Schule 2.0, a €5 billion programme to upgrade digital infrastructure in schools, train teachers, and support innovative teaching and learning methods.[_]

If the government fails to match the level of ambition shown by leading countries, it risks leaving the next generation of pupils – and the future workforce – underprepared for the realities of a rapidly evolving digital world. To close the digital-infrastructure gap and enable equitable, AI-powered learning in every school, the government should commit to three actions.

Recommendation: Dramatically scale up investment in digital infrastructure, ensuring universal high-speed broadband and resilient WiFi in all schools, and equip schools with the technical capacity required to manage devices and connectivity.

First, it should significantly scale up investment in high-speed broadband and resilient WiFi, ensuring that every school site benefits from reliable connectivity, and equip schools with the technical capacity required to manage devices and connectivity.

Recommendation: Adopt a hybrid device-access model, with Bring Your Own Device as the default and government-funded loaner devices for pupils without access. Ensure every teacher and secondary pupil has a device, and ensure there is at least one per five pupils in primary school. Monitor and adapt the programme to ensure it is effective, and that device ratios remain suitable.

Second, it should adopt a hybrid model for digital-device access in schools. As the default, pupils with access to suitable technology should be encouraged to bring their own device (BYOD), while those without should be provided with school-managed, government-funded loaner devices. Schools would manage these loan schemes using ring-fenced government funding to purchase and maintain devices for pupils unable to supply their own.

To ensure sufficient access, every secondary pupil and teacher should have a tablet, and in primary schools, there should be at least one device for every five pupils – a threshold consistent with Glasgow City Council’s recent rollout, which balanced learning benefits with concerns about screen time.[_] These minimum standards could be met through either BYOD or government-funded loaner devices.

The approach builds on precedents elsewhere. In Victoria and New South Wales in Australia, BYOD is paired with loaner schemes for pupils in need.[_] Estonia maintains school-based reserves alongside a national BYOD framework, and in Denmark over two-thirds of schools run BYOD programmes.[_]

A fully government-funded rollout at these ratios would cost approximately £0.6 billion annually.[_] However, integrating BYOD would substantially reduce public costs – thereby supporting inclusion without placing financial strain on schools or requiring full-scale device distribution.
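
To show how an estimate of this kind can be assembled (and how BYOD changes it), the sketch below multiplies assumed headcounts, device ratios, a unit cost and a refresh cycle. Every figure is a placeholder assumption for illustration; it is not the costing behind the £0.6 billion estimate, which is set out in the source cited above.

    # Rough, illustrative cost model for a device rollout. Every figure below is a
    # placeholder assumption; this is not the costing behind the report's estimate.

    ASSUMPTIONS = {
        "secondary_pupils": 3_700_000,  # assumed headcount
        "primary_pupils": 4_600_000,    # assumed headcount
        "teachers": 470_000,            # assumed headcount
        "primary_ratio": 5,             # one shared device per five primary pupils
        "device_cost_gbp": 400,         # assumed unit cost, including management
        "refresh_years": 4,             # assumed replacement cycle
        "byod_share": 0.5,              # assumed share of secondary pupils on BYOD
    }

    def annual_cost(a: dict) -> float:
        """Annualised device cost in pounds under the stated assumptions."""
        devices_needed = (
            a["secondary_pupils"] * (1 - a["byod_share"])  # secondary pupils not on BYOD
            + a["teachers"]                                # one device per teacher
            + a["primary_pupils"] / a["primary_ratio"]     # shared primary devices
        )
        return devices_needed * a["device_cost_gbp"] / a["refresh_years"]

    if __name__ == "__main__":
        print(f"Illustrative annual cost: £{annual_cost(ASSUMPTIONS) / 1e9:.2f}bn")

Varying the BYOD share in a model like this makes the report’s central point concrete: the higher the proportion of families able to provide a suitable device, the smaller the public subsidy required to meet the minimum ratios.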

To ensure effectiveness, the policy should include ongoing monitoring of device usage, access and outcomes, with minimum ratios reviewed regularly to ensure all pupils remain consistently and equitably supported.

Recommendation: Explore partnerships, both with industry and within its own estate, to reclaim, refurbish and supply devices slated for disposal – thereby cutting costs, expanding access and promoting circular, sustainable use of technology.

Third, the government should explore partnerships – both with industry and within its own estate – to reclaim devices slated for disposal, refurbish them and distribute them through the loaner scheme. This approach would reduce costs and support sustainable resource use. A similar initiative in Estonia, where banks have partnered with recyclers to repurpose well‑maintained corporate devices for use by schools, demonstrates how such models can foster circularity while expanding access to technology.[_]

Anything less risks entrenching a digital divide that will only widen with time. Without reliable connectivity and device access, the UK can never realistically reach a point where all pupils are truly AI-ready.


Conclusion

As this report makes clear, a bold, integrated package of reforms is needed: a reimagined curriculum that embeds AI proficiency deeply and broadly, a teaching workforce that is fully supported to lead this transformation, a national strategy that brings parents along as informed, empowered partners, and investment in reliable connectivity and digital-device access. Taken together, these reforms would not only prepare pupils to navigate an AI-driven world – they would give every young person the chance to lead, shape and thrive in it.


Acknowledgements

The Bright Initiative provided LinkedIn job postings data for this report. Bright Data is leading Data for Good projects across the globe through its commitment to empower nonprofits, researchers and public institutions to amplify their impact. The Bright Initiative powered by Bright Data supports projects from 750+ global mission-driven partners by providing cutting-edge public web data solutions to address the world’s most pressing challenges. The methodology employed by TBI is available for review here.

Footnotes

  1. https://institute.global/insights/economic-prosperity/the-impact-of-ai-on-the-labour-market
  2. https://www.pwc.co.uk/press-room/press-releases/research-commentary/artificial-intelligence--ai--exposed-sectors-see-a-fivefold-incr.html
  3. TBI/Teacher Tapp (see Figure 4)
  4. https://www.rm.com/news/2023/artificial-intelligence-in-education
  5. https://www.gov.uk/government/news/ai-revolution-to-give-teachers-more-time-with-pupils
  6. TBI/Teacher Tapp (see Figure 7)
  7. https://www.lloydsbank.com/assets/media/pdfs/banking%5Fwith%5Fus/whats-happening/lb-consumer-digital-index-2024-report.pdf
  8. https://www.kcl.ac.uk/ecs/assets/kcl-scari-computing.pdf
  9. https://consult.education.gov.uk/computing-policy-team/gcse-computer-science-subject-content-update/supporting%5Fdocuments/Proposed%20GCSE%20Computer%20Science%20subject%20content%20update%20%20Equality%20Impact%20Assessment.pdf
  10. TBI/Teacher Tapp (see Figure 11)
  11. https://www.kcl.ac.uk/ecs/assets/kcl-scari-computing.pdf
  12. https://www.suttontrust.com/wp-content/uploads/2025/07/Artificial-advantage.pdf
  13. https://www.ofcom.org.uk/siteassets/resources/documents/research-and-data/media-literacy-research/children/children-media-use-and-attitudes-2024/childrens-media-literacy-report-2024.pdf?v=368229
  14. https://www.ft.com/content/4e260abd-2528-4d34-8fa4-a21eabfd6db9
  15. https://shapingwork.mit.edu/wp-content/uploads/2023/10/acemoglu-restrepo-2019-automation-and-new-tasks-how-technology-displaces-and-reinstates-labor.pdf
  16. https://www.nytimes.com/2024/04/01/business/ai-tech-economy.html
  17. https://www.pwc.com/gx/en/issues/artificial-intelligence/job-barometer/aijb-2025-united-kingdom-analysis.pdf
  18. https://www.rcr.ac.uk/news-policy/latest-updates/scanners-shortfalls-and-stark-statistics/
  19. https://www.rcr.ac.uk/news-policy/policy-reports-initiatives/clinical-radiology-census-reports/
  20. https://www.raspberrypi.org/app/uploads/2025/06/Why-kids-still-need-to-learn-to-code-in-the-age-of-AI-2025-Raspberry-Pi-Foundation-position-paper.pdf
  21. TBI analysis using TBI, UNESCO, Evergreen, WE Forum and EU evidence
  22. https://shapingwork.mit.edu/wp-content/uploads/2023/10/acemoglu-restrepo-2019-automation-and-new-tasks-how-technology-displaces-and-reinstates-labor.pdf
  23. https://institute.global/insights/economic-prosperity/the-impact-of-ai-on-the-labour-market
  24. https://blogs.worldbank.org/en/education/teachers-are-leading-an-ai-revolution-in-korean-classrooms
  25. https://www.koreaherald.com/article/10517548
  26. https://asianews.network/south-korea-to-inject-70m-into-ai-powered-public-education/
  27. https://eesti.org.au/2025/07/27/estonias-ai-leap-chatbots-are-heading-to-the-classroom/
  28. https://e-estonia.com/ai-leap-2025-estonia-sets-ai-standard-in-education
  29. https://e-estonia.com/ai-leap-2025-estonia-sets-ai-standard-in-education/
  30. Ministry of Education, Estonia
  31. https://english.www.gov.cn/news/202503/12/content_WS67d18e9ec6d0868f4e8f0c40.html
  32. https://beijingpost.com/china-launches-comprehensive-ai-education-initiative-across-schools
  33. https://www.tech.gov.sg/technews/ai-in-education-transforming-singapore-education-system-with-student-learning-space
  34. https://mothership.sg/2024/10/classes-on-ai-to-be-offered-to-all-primary-secondary-school-students-in-spore/
  35. https://www.bmftr.bund.de/SharedDocs/Kurzmeldungen/DE/2024/12/Digitalpakt.html?nn=909712
  36. https://deutsches-schulportal.de/bildungswesen/was-hat-der-digitalpakt-schule-bislang-gebracht/
  37. “Economically disadvantaged pupils” are pupils who are currently eligible for free school meals or have been eligible for free school meals at any point in the previous six years.
  38. https://epi.org.uk/publications-and-research/annual-report-2024/
  39. https://www.sciencedirect.com/science/article/abs/pii/S1751157720301991
  40. https://www.ethnicity-facts-figures.service.gov.uk/work-pay-and-benefits/employment/employment-by-occupation/latest/
  41. https://institute.global/insights/economic-prosperity/the-impact-of-ai-on-the-labour-market
  42. https://www.ilo.org/resource/article/how-might-generative-ai-impact-different-occupations
  43. Data provided by Bright Data.
  44. Text analysis allowed us to remove any mentions of AI which related to its use in the application process. A manual analysis of the text revealed that mentions of AI are therefore related to the company itself, the tools provided for employees or required skills. All three instances would require employees to have some degree of AI literacy.
  45. Sourced from multiple surveys across different years: https://www.gov.uk/government/collections/omnibus-surveys#2024-reports. Usually, data would be compared in consistent time lengths (December to December 2023/24) to eradicate any seasonal influence in pupil participation and make the findings more consistent. This is a limitation of the analysis.
  46. https://www.twinkl.co.uk/blog/ai-in-education-survey-what-uk-and-us-educators-think-in-2025
  47. https://teachertapp.com/app/uploads/2025/01/AI-Spy-Tracking-the-AI-Tools-Teachers-Use.pdf?FIRSTNAME=Becci&LASTNAME=Peters&email=becci.peters%40bcs.uk&submissionKey=cvt-sub-key-cd7320c0af30ee733658e1c1581cf8ba9bfdbd5334241f8f86d47687c376b36b
  48. https://www.gov.uk/government/publications/school-and-college-voice-omnibus-surveys-for-2024-to-2025/school-and-college-voice-november-2024#sec-GenAI
  49. Teacher Tapp Survey of teachers in England conducted 2 July 2025. Question answered by 4,452 state secondary-school teachers (results weighted to reflect national teacher and school demographics). The comparison year (2024) only included secondary schools, which means for this figure the comparison only covers these schools. We further limited the sample to state secondary schools for this question to cover only those schools that the DfE funds. In this year, this question was answered by 4,738 teachers.
  50. https://www.gov.uk/government/publications/parent-pupil-and-learner-voice-omnibus-surveys-for-2024-to-2025
  51. https://plc.pearson.com/en-GB/news-and-insights/news/new-research-pearson-finds-over-two-thirds-uk-secondary-students-want-learn
  52. https://www.rm.com/news/2023/artificial-intelligence-in-education
  53. https://literacytrust.org.uk/research-services/research-reports/children-young-people-and-teachers-use-of-generative-ai-to-support-literacy-in-2024/
  54. https://www.sciencedirect.com/science/article/pii/S0360131524002380
  55. https://arxiv.org/abs/2404.19699
  56. https://knowledge.wharton.upenn.edu/article/without-guardrails-generative-ai-can-harm-education/
  57. https://www.gov.uk/government/news/ai-revolution-to-give-teachers-more-time-with-pupils
  58. https://www.gov.uk/government/publications/school-and-college-voice-omnibus-surveys-for-2024-to-2025/school-and-college-voice-november-2024
  59. https://bmcpsychology.biomedcentral.com/articles/10.1186/s40359-025-02620-4
  60. https://www.gov.uk/government/publications/school-and-college-voice-omnibus-surveys-for-2024-to-2025/school-and-college-voice-november-2024
  61. https://www.tes.com/magazine/news/general/majority-teachers-need-ai-training-boost-confidence
  62. https://www.gov.uk/government/news/ai-revolution-to-give-teachers-more-time-with-pupils
  63. Teacher Tapp Survey of teachers in England, including those in private schools, conducted 2 July 2025. Question answered by 7,108 teachers (results weighted to reflect national teacher and school demographics). Responses were filtered based on teachers answering that they had at some point used generative AI tools, based on the question: “When was the last time you used a *general* AI tool (such as ChatGPT, Google Gemini, Microsoft Copilot, DALL-E, Midjourney) to help you with your schoolwork?”
  64. https://www.lloydsbank.com/assets/media/pdfs/banking%5Fwith%5Fus/whats-happening/lb-consumer-digital-index-2024-report.pdf
  65. https://www.gov.uk/government/publications/research-on-parent-and-pupil-attitudes-towards-the-use-of-ai-in-education/research-on-public-attitudes-towards-the-use-of-ai-in-education
  66. https://www.gov.uk/government/publications/parent-pupil-and-learner-voice-omnibus-surveys-for-2024-to-2025/parent-pupil-and-learner-voice-december-2024
  67. https://www.gov.uk/government/publications/research-on-parent-and-pupil-attitudes-towards-the-use-of-ai-in-education/research-on-public-attitudes-towards-the-use-of-ai-in-education
  68. https://www.gov.uk/government/publications/parent-pupil-and-learner-voice-omnibus-surveys-for-2024-to-2025/parent-pupil-and-learner-voice-december-2024
  69. https://assets.publishing.service.gov.uk/media/672a2743094e4e60c466d160/Employer%5FSkills%5FSurvey%5F2022%5Fresearch%5Freport%5F%5FNov%5F2024%5F.pdf
  70. https://assets.publishing.service.gov.uk/media/672a2743094e4e60c466d160/Employer%5FSkills%5FSurvey%5F2022%5Fresearch%5Freport%5F%5FNov%5F2024%5F.pdf
  71. TBI analysis of ONS job-vacancies data across different occupations in the UK in 2025. The vacancy data were paired with the O*NET skills database that quantifies the level of different skills needed across all major occupations, enabling an estimate of the skill levels currently required in the UK labour market.
  72. https://www.cipd.org/globalassets/media/knowledge/knowledge-hub/reports/2024-pdfs/8735-changing-face-of-the-youth-labour-market-web.pdf
  73. https://www.pwc.co.uk/press-room/press-releases/research-commentary/artificial-intelligence--ai--exposed-sectors-see-a-fivefold-incr.html
  74. https://www.techmonitor.ai/leadership/52-uk-tech-leaders-face-ai-skills-gap/#:~:text=A%20new%20study%20has%20found,including%20924%20in%20the%20UK
  75. https://www.uktech.news/ai/ai-becomes-the-uks-fastest-growing-skills-gap-report-finds-20250519?
  76. https://www.uktech.news/ai/ai-becomes-the-uks-fastest-growing-skills-gap-report-finds-20250519#:~:text=Artificial%20intelligence%20has%20become%20the,years%2C%20according%20to%20new%20research
  77. https://researchbriefings.files.parliament.uk/documents/POST-PN-0746/POST-PN-0746.pdf#:~:text=49,and%20at%20various%20levels%20of
  78. https://www.ox.ac.uk/news/2025-03-04-skills-based-hiring-driving-salary-premiums-ai-sector-employers-face-talent-shortage#:~:text=Change%20www,this%20had%20dropped%20to%2031
  79. https://www.pwc.co.uk/services/technology/generative-artificial-intelligence/uk-ai-jobs-barometer.html#:~:text=Jobs%20that%20require%20AI%20specialist,premium
  80. https://www.kcl.ac.uk/ecs/assets/kcl-scari-computing.pdf
  81. https://consult.education.gov.uk/computing-policy-team/gcse-computer-science-subject-content-update/supporting%5Fdocuments/Proposed%20GCSE%20Computer%20Science%20subject%20content%20update%20%20Equality%20Impact%20Assessment.pdf
  82. https://www.suttontrust.com/wp-content/uploads/2025/07/Artificial-advantage.pdf
  83. https://ukstories.microsoft.com/features/new-report-spotlights-inadequate-access-to-technology-in-english-schools/
  84. https://teachertapp.com/app/uploads/2025/01/AI-Spy-Tracking-the-AI-Tools-Teachers-Use.pdf?FIRSTNAME=Becci&LASTNAME=Peters&email=becci.peters%40bcs.uk&submissionKey=cvt-sub-key-cd7320c0af30ee733658e1c1581cf8ba9bfdbd5334241f8f86d47687c376b36b
  85. Teacher Tapp Survey of teachers in England, including those in private schools, conducted 2 July 2025. Question answered by 7,017 teachers (results weighted to reflect national teacher and school demographics). The bands for responses based on socioeconomic status are based on the proportion of pupils in the teacher’s school with entitlements to free school meals. Teacher responses are proportioned into quartiles based on this information.
  86. https://www.kcl.ac.uk/ecs/assets/kcl-scari-computing.pdf
  87. https://www.kcl.ac.uk/ecs/assets/kcl-scari-computing.pdf
  88. https://www.suttontrust.com/wp-content/uploads/2025/07/Artificial-advantage.pdf
  89. https://www.cliftonhigh.co.uk/artificial-intelligence-in-the-clifton-high-school-classroom
  90. https://ukstories.microsoft.com/features/new-report-spotlights-inadequate-access-to-technology-in-english-schools/
  91. Teacher Tapp Survey of teachers in England, including those in private schools, conducted 2 July 2025. Question answered by 7,017 teachers (results weighted to reflect national teacher and school demographics).
  92. https://www.raspberrypi.org/blog/ai-education-resources-what-to-teach-seame-framework/
  93. https://plc.pearson.com/en-GB/news-and-insights/news/new-research-pearson-finds-over-two-thirds-uk-secondary-students-want-learn
  94. https://www.kcl.ac.uk/ecs/assets/kcl-scari-computing.pdf
  95. https://consult.education.gov.uk/computing-policy-team/gcse-computer-science-subject-content-update/supporting%5Fdocuments/Proposed%20GCSE%20Computer%20Science%20subject%20content%20update%20%20Equality%20Impact%20Assessment.pdf
  96. https://www.theguardian.com/guardian-foundation/2024/dec/02/the-guardian-foundation-call-on-the-government-to-embed-news-and-media-literacy-into-the-curriculum
  97. https://lordslibrary.parliament.uk/the-future-of-news-report-by-the-house-of-lords-communications-and-digital-committee/
  98. https://consult.education.gov.uk/computing-policy-team/gcse-computer-science-subject-content-update/supporting%5Fdocuments/Proposed%20GCSE%20Computer%20Science%20subject%20content%20update%20%20Equality%20Impact%20Assessment.pdf
  99. https://www.kcl.ac.uk/ecs/assets/kcl-scari-computing.pdf
  100. https://www.educationestonia.org/educational-technologist-estonia/
  101. See, for example: https://gap.hks.harvard.edu/improving-girls-sense-fit-science-increasing-impact-role-models; https://gap.hks.harvard.edu/does-encouragement-matter-improving-gender-imbalances-technical-fields-evidence-randomized
  102. https://www.gov.uk/government/publications/school-and-college-voice-omnibus-surveys-for-2023-to-2024/school-and-college-voice-february-2024#introduction
  103. https://www.gov.uk/government/publications/research-on-parent-and-pupil-attitudes-towards-the-use-of-ai-in-education/research-on-public-attitudes-towards-the-use-of-ai-in-education
  104. https://www.internetmatters.org/hub/press-release/ai-research-warns-schools-unprepared-artificial-intelligence/
  105. https://www.gov.uk/government/publications/research-on-parent-and-pupil-attitudes-towards-the-use-of-ai-in-education/research-on-public-attitudes-towards-the-use-of-ai-in-education
  106. https://assets.publishing.service.gov.uk/media/655f8b823d7741000d420114/Technology%5Fin%5Fschools%5Fsurvey%5F%5F2022%5Fto%5F2023.pdf
  107. Teacher Tapp Survey of teachers in England, including those in private schools, conducted 2 July 2025. Question answered by 6,968 teachers (results weighted to reflect national teacher and school demographics).
  108. Teacher Tapp Survey of teachers in England, including those in private schools, conducted 2 July 2025. Question answered by 6,968 teachers (results weighted to reflect national teacher and school demographics).
  109. https://www.ofcom.org.uk/siteassets/resources/documents/research-and-data/media-literacy-research/children/children-media-use-and-attitudes-2024/childrens-media-literacy-report-2024.pdf
  110. https://www.trade.gov/market-intelligence/japan-giga-update-edtech
  111. https://www.bmftr.bund.de/SharedDocs/Kurzmeldungen/DE/2024/12/Digitalpakt.html?nn=909712
  112. https://www.glasgow.gov.uk/article/3122/Thousands-of-Glasgow-pupils-to-benefit-from-digital-learning-strategy-pupil-iPad-roll-out-begins
  113. https://www.theguardian.com/australia-news/2024/oct/15/technology-access-the-smith-family-national-device-bank-deborah-schoolwork-shopping-centre-wifi
  114. https://fcl.eun.org/byod-europe-world
  115. https://assets.ctfassets.net/75ila1cntaeh/68U0TqFbEjrucoQVIEp3yl/3ef9a68c403ff650e27c87701ef593ea/The%5FEconomic%5FCase%5Ffor%5FAI-Enabled%5FEducation%5FFinal%5Fv2.pdf
  116. https://greendice.com/banks-used-computers-create-value-for-schools-and-individuals-across-estonia/
