Business Reporter

Building AI-ready skills

Zemina Hasham at Turnitin describes how universities can build workplace-ready AI skills to support UK innovation and competitiveness

As AI becomes central to business success and long-term competitiveness, UK organisations are under increasing pressure to build a workforce that can confidently use these technologies. Skilled graduates should be part of this pipeline. However, a growing disconnect between university training and workplace expectations is making this difficult.

A global study shows that 58% of employers believe universities are falling short in preparing graduates. And with 54% of organisations now requiring AI skills for all early-career roles, the gap is only widening. 

To better prepare students for work and strengthen a workforce already facing an AI skills shortage, universities need to ensure they are equipping students with the skills required to navigate a rapidly evolving technology landscape.

Employer expectations rising fast

Businesses are raising their expectations around AI capabilities at a time when graduates are entering the toughest job market since 2018. Conversational AI is becoming part of daily work: among employers that use AI, 39% already utilise conversational tools. Organisations are therefore increasingly seeking new hires who can use these tools thoughtfully and safely from day one.

This pressure is intensified by an emerging internal capability gap: 61% of employers report having no staff currently working with AI, and only 11% offered AI‑related training in the last year. With limited upskilling happening in-house, companies are looking to new graduates to close the gap. 

Despite this, many graduates are still leaving university with limited practical experience using AI tools or understanding their operational impact. And in the absence of structured training, seven in ten alumni report having taught themselves AI informally. Without stronger AI literacy embedded into higher education, the divide between employer expectations and graduate readiness will continue to expand.

Upskilling educators

Universities have a crucial role to play in preparing students to use AI responsibly and effectively, both in academic and professional settings. However, to support students, educators themselves need to understand how best to use AI. This means understanding which tools are appropriate, how to design learning activities that foster critical thinking, and how to model responsible AI use in the classroom.

Recent findings show that only 43% of educators feel confident using AI, rating their proficiency at just three out of ten, and over 60% ask for help applying AI to planning and support tasks. This presents a structural challenge: if educators lack the knowledge and assurance to use AI, they cannot successfully prepare students for practical workplace use.

Institutions should prioritise targeted professional development that helps staff understand how AI works, its advantages and limitations, and how to evaluate it critically. Once equipped, educators can create low‑stakes learning opportunities that help students experiment with AI as a support for problem‑solving. For example, an instructor might guide students in using an AI tool to break down a complex concept, demonstrating how AI can accelerate understanding without replacing independent thought.

Embedding AI into the student experience

Nearly half of students are afraid to use AI for fear of violating academic rules. Institutions can reduce this ‘AI detector anxiety’ by clarifying their academic integrity policies: clear guidance on ethical AI use eases students’ concerns and encourages responsible adoption.

Once policies are in place, institutions can create learning environments where students understand not only how to use AI, but when and why. This transparency helps demystify the technology and ensures students develop the ethical judgement employers increasingly expect. In practice, this means modelling responsible AI use while encouraging critical thinking and reflection. Assessment frameworks should likewise capture the entire learning journey (research, drafting and revision) rather than just the final submission.

Regular checkpoints can help reinforce this process-focused approach to learning. Tools that offer visibility into the writing and composition process allow educators to offer more targeted feedback and facilitate open conversations about AI use in assessments. These discussions help emerging talent build the skills businesses increasingly value: the ability to self-assess, apply sound judgement when using AI, and clearly articulate the rationale behind their decisions.

Closing the gap

The gap between academic preparation and business expectations is widening. But with clear policies, confident educators, and curriculum-embedded AI learning, universities can help close it.

By investing in staff development and equipping students with the skills to engage with AI critically and responsibly, institutions can produce graduates who are not only academically capable but ready to make an immediate impact in an AI-driven workplace. In doing so, universities help strengthen the talent pipeline that businesses increasingly depend on for innovation, competitiveness, and long-term growth.

Zemina Hasham is Chief Customer Experience Officer at Turnitin 

Main image courtesy of iStockPhoto.com and BlackJack3D
