Developers given new tools to boost cyber security in AI models as cyber security sector sees record growth

  • UK government introduces two codes of practice to enhance cyber security in AI and software, boosting the UK economy’s security and growth prospects
  • new measures aimed at developers will establish global standards for protecting AI models from hacking, helping businesses innovate and drive economic growth across the nation
  • launched as new figures show the cyber security sector has grown 13% in the last year and is now worth almost £12 billion

New measures which are expected to set a global standard on how to bolster protections of AI models from hacking and sabotage have been unveiled today by the UK government (15th May).

During a speech at CYBERUK, the government’s flagship cyber security conference, Technology Minister Saqib Bhatti announced two new codes of practice which will help developers improve cyber security in AI models and software, putting the UK economy on an even stronger footing to grow safely and helping the government achieve long-term growth for the British economy.

The codes set out requirements for developers to make their products resilient against tampering, hacking, and sabotage and will increase confidence in the use of AI models across most industries, helping businesses improve efficiencies, drive growth, and turbocharge innovation.

In the last 12 months, half of businesses (50%) and a third of charities (32%) reported cyber breaches or attacks, and phishing remained the most common type of breach. The codes introduced today show developers how software can be built in a secure way, with the aim of preventing attacks such as the one on the MOVEit software in 2023 which compromised sensitive data in thousands of organisations around the world.

Technology Minister Saqib Bhatti said:

We have always been clear that to harness the enormous potential of the digital economy, we need to foster a safe environment for it to grow and develop. This is precisely what we are doing with these new measures, which will help make AI models resilient from the design phase.

Today’s report shows not only are we making our economy more resilient to attacks, but also bringing prosperity and opportunities to UK citizens up and down the country. It is fantastic to see such robust growth in the industry, helping us cement the UK’s position as a global leader in cyber security as we remain committed to fostering the safe and sustainable development of the digital economy.

The new measures come as findings of a new report published today show the cyber security sector has experienced a 13% growth on the previous year and is now worth almost £12 billion, on par with sectors such as the automotive industry.

The findings come from the government’s annual Cyber Sectoral Analysis Report and show that the number of cyber security firms based in the UK rose in 2023, strengthening the UK’s resilience to attacks and propelling sustainable economic growth.

The new codes of practice will improve cyber security in AI and software, while new government action on cyber skills will help develop the cyber workforce and ensure the UK has the people it needs to protect the nation online.

NCSC CEO Felicity Oswald said:

To make the most of the technological advances which stand to transform the way we live, cyber security must be at the heart of how we develop digital systems.

The new codes of practice will help support our growing cyber security industry to develop AI models and software in a way which ensures they are resilient to malicious attacks.

Setting standards for our security will help improve our collective resilience, and I encourage organisations to follow these requirements to help keep the UK safe online.

These measures are crucial for businesses in the digital age: they demonstrate a commitment to cyber security, safeguard users’ personal data, and foster global alignment for enhanced cyber resilience.

The AI cyber security code is intended to form the basis of a future global standard.

Rosamund Powell, Research Associate at The Alan Turing Institute, said:

AI systems come with a wide range of cyber security risks which often go unaddressed as developers race to deploy new capabilities. The code of practice released today provides much-needed practical support to developers on how to implement a secure-by-design approach as part of their AI design and development process.

Plans for it to form the basis of a global standard are crucial given the central role international standards already play in addressing AI safety challenges through global consensus. Research highlights the need for inclusive and diverse working groups, accompanied by incentives and upskilling for those who need them, to ensure the success of global standards like this.

Today also marks the publication of the Capability Hardware Enhanced RISC Instructions (CHERI) report, introducing a new microprocessor technology known as “magic chip,” which integrates advanced memory protections to prevent up to 70% of current cyber-attacks.
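To illustrate the class of flaw these memory protections target, the short C sketch below (illustrative only, not taken from the CHERI report) shows a conventional out-of-bounds write: on ordinary hardware the stray bytes silently overwrite adjacent memory, whereas a CHERI-enabled processor attaches hardware-enforced bounds to the pointer, so the access faults before memory is corrupted.

    #include <stdio.h>
    #include <string.h>

    /* Illustrative example of a classic memory-safety bug: an
     * out-of-bounds write through an unchecked copy. */
    int main(void)
    {
        char secret[8] = "token";   /* nearby data an attacker might target */
        char buffer[8];             /* only 8 bytes, but the input is longer */

        /* On conventional hardware this overflow silently spills into
         * adjacent stack memory (undefined behaviour). On a CHERI-enabled
         * processor the pointer to 'buffer' carries hardware-enforced
         * bounds, so the out-of-bounds store traps instead of corrupting
         * neighbouring data. */
        strcpy(buffer, "this string is longer than eight bytes");

        printf("%s\n", secret);
        return 0;
    }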

Alongside this, Minister Bhatti this morning announced new initiatives on how the government and regulators will professionalise the cyber security sector, such as incorporating cyber roles into government recruitment and HR policies.

The minister also spoke about his intention to foster cyber skills among young people and inspire them into cyber careers, with the UK launching a campaign later this year to encourage entries to a brand new national cyber skills competition for 18–25-year-olds. The competition will give the winners the opportunity to represent the UK at international cyber competitions.

Notes to editors

This work is part of the government’s £2.6 billion National Cyber Strategy to protect and promote the UK online.

The government is also launching a consultation on scaling up the impact of the successful CyberFirst scheme, which has helped improve the tech skills of 260,000 students across 2,500 schools.