Advancing knowledge. Driving growth. Securing the future of AI.

Cutting-edge research
Real-world innovation
Public-private collaboration

The Laboratory for AI Security Research (LASR) brings together academia, industry, and government to advance UK leadership in AI security research.
Partners
LASR pillars
Academia
Academic research forms the foundation of LASR. Universities and research groups help surface emerging questions, deepen understanding, and identify novel solutions.
Industry
Industry partners, from small firms to large organisations, bring real‑world experience developing, deploying and securing AI. That experience reveals the pressures, threats and gaps that guide research focus and shape new capabilities.
Opportunity areas
Pre‑trained models and complex dependencies can introduce hidden risks. Strengthening supply chains supports safer, more trustworthy AI.
As AI systems gain more agency, they face greater exposure to misuse or attack. Strong security foundations help maintain trust in real-world use.
AI depends on secure compute, networks and access controls. Protecting these layers helps ensure AI systems operate safely, reliably and at scale.

Live opportunities
Work with leading institutions on pressing AI security challenges.
OU XXX
TURING
Working on key research & innovation?
Tell us about your technology
Provide a short overview of the system, dataset or scenario you are exploring, including the context in which it is developed or deployed. This helps us determine whether a potential collaboration aligns with LASR’s research focus.

