Jonathan Sharp at Britannic advises organisations to build a rigorous strategy when implementing AI
If AI is implemented poorly, businesses risk employees becoming disconnected, disengaged and demotivated; without a clear strategy, that outcome is all but inevitable.
The pandemic created a culture of working from home in which employees were isolated and struggled to collaborate. AI has made this worse by creating a new culture in which employees become over-reliant on it, which erodes critical thinking and creativity. As a result, employees are less challenged and less motivated, harming both the company and the employee.
Walk before you run
Only 4 in 10 companies have an AI strategy, according to a Thomson Reuters survey. This is concerning given how quickly adoption is growing: Microsoft has revealed that the number of GenAI users has nearly doubled in the last six months, and that 75% of global knowledge workers now use it.
To maximise AI’s effectiveness, companies need to have an AI strategy in place that includes objectives. This is the step that most companies overlook. Businesses need to understand why they want to use AI, what pain points they want to resolve, and what processes they want to improve.
All of these ‘going back to basics’ questions are essential to the success of AI and its ability to deliver tangible benefits to your business. If these issues are not addressed, implementing AI will be a wasted opportunity and could cause more harm than good.
Directing AI
Employees need to lose the mentality that “AI will do my job for me”. It is the employee who needs to direct AI and not the other way round. Humans should view AI as an assistant or co-worker by directing and managing it accordingly.
AI is a superhuman brain with the ability to analyse huge quantities of data and perform tasks in seconds, surpassing human speed. Unfortunately, because of these AI superpowers, many people have lost confidence in their abilities, thinking that AI produces better results than they do.
However, employees shouldn’t trust AI to complete any task without checking its output. It may be factually incorrect. Or it may read poorly, sounding like everyone else, lacking originality and damaging the brand and its reputation.
Training and prompt engineering
For an AI implementation to be successful, a comprehensive training and development programme should be run. Research from AI Quest recently revealed that 75% of employees do not fully understand how to harness AI in their daily tasks effectively.
Prompt engineering is one skill that unlocks value from AI, and employees should be trained on it. If you submit a vague request to a GenAI tool, you will receive a wide range of vague answers in return. However, if you use critical thinking to define precisely what you want to achieve, then AI will be more likely to craft the outcome you require. Prompt engineering should be an iterative process, with employees refining their prompts until they get the outputs they are satisfied with, outputs that are factually correct, unbiased and on-brand.
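To make this concrete, below is a minimal sketch of that iterative loop in Python. The OpenAI API is used purely as an example of a GenAI tool, and the model name, prompts and the meets_standards() check are illustrative placeholders; the same pattern applies to whichever tool your organisation has approved.

```python
# A minimal sketch of iterative prompt refinement, assuming the OpenAI API as
# one example of a GenAI tool. Model, prompts and checks are placeholders.
from openai import OpenAI

client = OpenAI()  # assumes an API key is already configured in the environment


def generate(prompt: str) -> str:
    """Send a single prompt to the tool and return its text output."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


def meets_standards(draft: str) -> bool:
    """Stand-in for the human review step: factually correct, unbiased, on-brand.
    In practice this judgement belongs to the employee, not the tool."""
    return len(draft.split()) <= 120  # placeholder check only


# Start vague, then tighten the brief on each pass until the output is acceptable.
prompts = [
    "Write a product summary.",                                   # vague: invites a vague answer
    "Write a 100-word summary of our cloud telephony product "
    "for IT managers, in plain English and UK spelling.",         # precise: audience, length, tone
    "Rewrite the summary in our brand voice, avoiding jargon "
    "and unverifiable claims.",                                    # refined: brand and accuracy
]

draft = ""
for prompt in prompts:
    draft = generate(prompt)
    if meets_standards(draft):  # the employee decides when to stop, not the AI
        break

print(draft)
```

The point of the loop is not the code itself but the habit it encodes: each prompt is more specific than the last, and a person, not the tool, decides when the output is good enough.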
Avoiding silos
Working from home over the last few years has left some managers less able to manage effectively: they have grown used to working in isolation, focused on their own daily tasks rather than on their team. In addition, newly appointed managers may have missed out on coaching.
As a result, people, teams, and entire departments are more disconnected and siloed than ever. With employees who are off the radar, not accountable, not sharing work, and not collaborating, achieving objectives becomes increasingly difficult.
When AI is not governed effectively, silos will deepen as individuals use their preferred AI apps to work independently. Strong governance of AI use is necessary to ensure that employees adhere to rules, such as privacy compliance and brand guidelines, and to prevent the creation of deeper information silos.
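As an illustration only, the sketch below shows one small governance control of this kind: a pre-submission check that flags prompts containing personal data before they reach a GenAI tool. The patterns and policy are assumptions made for the example, not a complete compliance regime.

```python
import re

# Illustrative governance guardrail: scan a prompt for personal data before it
# is sent to any GenAI tool. The patterns below are examples only and would
# need to reflect the organisation's actual privacy and brand policies.
BLOCKED_PATTERNS = {
    "email address": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "UK National Insurance number": re.compile(r"\b[A-Z]{2}\s?\d{2}\s?\d{2}\s?\d{2}\s?[A-D]\b"),
}


def check_prompt(prompt: str) -> list[str]:
    """Return the names of any rules the prompt appears to break."""
    return [name for name, pattern in BLOCKED_PATTERNS.items() if pattern.search(prompt)]


violations = check_prompt("Summarise the complaint from jane.doe@example.com")
if violations:
    print("Prompt blocked - contains:", ", ".join(violations))
else:
    print("Prompt cleared for submission")
```

Routing employees’ GenAI requests through a shared, checked channel like this also makes prompts and outputs visible to the wider team rather than scattered across personal accounts, which is part of how governance keeps silos from deepening.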
Keeping creativity alive
Senior management has struggled with innovation ever since the demand for remote working increased. Now, with GenAI, there is a danger of even greater isolation, as employees work more closely with machines than with other humans. This can result in a loss of creativity as people no longer bounce ideas off their colleagues.
Instead, employees should be encouraged to share tips and ideas on how and where to utilise AI, allowing everyone to learn from one another and fostering a more collaborative and social culture.
Using human skills
One way to build confidence in the workplace is to develop the skills humans are good at, such as critical thinking, empathy, judgment and creativity. When these skills are developed and refined, organisations can use them to complement the powers of AI.
Businesses and employees need to understand how humans and AI can work together. AI can be a powerful tool that helps humans with tasks such as ideation by providing information and different options, and by analysing data at speed. However, it lacks the human experience, wisdom, and ability to understand context, nuance and the practical constraints of the real world. It lacks emotional intelligence and is unable to make ethical decisions or judgments in the same way humans do.
In other words, while AI can do many things well, there are some critical capabilities that it doesn’t have.
AI will not define us
AI is the most significant cultural and social phenomenon since the internet, and companies should not rush to implement it without a strategy and the expertise to deploy it well.
By deploying a strategy alongside employee training and strong governance, your business will be able to direct AI and not the other way round. With the right focus, AI can be used to bring employees closer together and not further apart. And with the right culture, teams across all functions will be able to discover innovative ways to use it, helping them to learn and develop their skills.
Organisations must create a culture where employees flourish while working with AI, and where they are confident to use those human skills that machines don’t have, such as critical thinking, empathy and creativity. By doing this, the looming subculture of over-reliance on AI and lazy thinking can be stamped out.
Jonathan Sharp is CEO at Britannic