{"id":2053,"date":"2025-08-02T05:57:17","date_gmt":"2025-08-02T05:57:17","guid":{"rendered":"https:\/\/ft365.org\/index.php\/2025\/08\/02\/uk-leads-the-way-with-15m-ai-alignment-project\/"},"modified":"2025-08-02T05:57:17","modified_gmt":"2025-08-02T05:57:17","slug":"uk-leads-the-way-with-15m-ai-alignment-project","status":"publish","type":"post","link":"http:\/\/ft365.org\/index.php\/2025\/08\/02\/uk-leads-the-way-with-15m-ai-alignment-project\/","title":{"rendered":"UK Leads the Way with \u00a315m AI Alignment Project"},"content":{"rendered":"<div>\n<p><img decoding=\"async\" src=\"https:\/\/ft365.org\/wp-content\/uploads\/2025\/06\/localimages\/ea721ff9-8ba4-4d88-b386-57e9e1606077.jpg?width=64&#038;height=64&#038;mode=crop&#038;scale=both&#038;format=webp\" alt=\"Photo of Phil Muncaster\" loading=\"lazy\"><\/p>\n<\/div>\n<div id=\"cphContent_pnlArticleBody\" data-layout-id=\"2\" data-edit-folder-name=\"text\" data-index=\"0\">\n<p>The UK\u2019s AI Security Institute is teaming up with international partners to lead a \u00a315m project focused on researching AI alignment.<\/p>\n<p>The Alignment Project will also feature the Canadian\u00a0AI\u00a0Safety Institute, Canadian Institute for Advanced Research (CIFAR), Schmidt Sciences, Amazon Web Services, Anthropic, Halcyon Futures, the Safe\u00a0AI\u00a0Fund, UK Research and Innovation, and the Advanced Research and Invention Agency (ARIA).\u00a0<\/p>\n<p>It will pioneer new work designed to ensure AI systems always work as intended \u2013 a field that is becoming increasingly important as AI systems become more advanced and autonomous.<\/p>\n<p>Misalignment broadly means AI systems that act against the goals, policies and requirements of their developers. 
It can be intentional \u2013 i.e.,\u00a0a threat actor subverting an AI system to attack a target \u2013 or unintentional, occurring when appropriate AI guardrails haven\u2019t been put in place.<\/p>\n<p><em>Read more on AI safety: OWASP Launches Agentic AI Security Guidance<\/em><\/p>\n<p>According to Trend Micro, examples of misalignment could include:<\/p>\n<ul>\n<li>Model poisoning: Attackers inject or manipulate LLM training data, leading to biased outputs, incorrect decisions and sometimes injected backdoors<\/li>\n<li>Prompt injection: Threat actors craft a malicious prompt that overcomes the built-in guardrails of an LLM, effecting a type of system jailbreak<\/li>\n<li>Accidental disclosure: Poorly designed AI systems may inadvertently access privileged information and disclose it to users<\/li>\n<li>Runaway resource consumption: If resource consumption is not properly bounded, AI components could work on sub-problems in a self-replicating manner, potentially causing a denial of service (DoS)<\/li>\n<\/ul>\n<p>Science, Innovation and Technology Secretary Peter Kyle said advanced AI systems are already exceeding humans in some areas, making the project more urgent than ever.<\/p>\n<p>\u201cAI\u00a0alignment is all geared towards making systems behave as we want them to, so they are always acting in our best interests. 
This is at the heart of the work the institute has been leading since day one \u2013 safeguarding our national security and ensuring the British public are protected from the most serious risks\u00a0AI\u00a0could pose as the technology becomes more and more advanced,\u201d he added.<\/p>\n<p>\u201cThe responsible development of\u00a0AI\u00a0needs a co-ordinated global approach, and this fund will help us make\u00a0AI\u00a0more reliable, more trustworthy, and more capable of delivering the growth, better public services, and high-skilled jobs.\u201d<\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>The UK\u2019s AI Security Institute is teaming up with international partners to lead a \u00a315m project focused on researching AI alignment. The Alignment Project will also feature the Canadian\u00a0AI\u00a0Safety Institute, Canadian Institute for Advanced Research (CIFAR), Schmidt Sciences, Amazon Web Services, Anthropic, Halcyon Futures, the Safe\u00a0AI\u00a0Fund, UK Research and Innovation, and the Advanced Research 
and<\/p>\n","protected":false},"author":2,"featured_media":2054,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-2053","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-uncategorized"],"featured_image_urls":{"full":["http:\/\/ft365.org\/wp-content\/uploads\/2025\/08\/2053-43b63f21-541c-481f-8524-60ac82a14ab5.jpg",300,300,false],"thumbnail":["http:\/\/ft365.org\/wp-content\/uploads\/2025\/08\/2053-43b63f21-541c-481f-8524-60ac82a14ab5-150x150.jpg",150,150,true],"medium":["http:\/\/ft365.org\/wp-content\/uploads\/2025\/08\/2053-43b63f21-541c-481f-8524-60ac82a14ab5.jpg",300,300,false],"medium_large":["http:\/\/ft365.org\/wp-content\/uploads\/2025\/08\/2053-43b63f21-541c-481f-8524-60ac82a14ab5.jpg",300,300,false],"large":["http:\/\/ft365.org\/wp-content\/uploads\/2025\/08\/2053-43b63f21-541c-481f-8524-60ac82a14ab5.jpg",300,300,false],"1536x1536":["http:\/\/ft365.org\/wp-content\/uploads\/2025\/08\/2053-43b63f21-541c-481f-8524-60ac82a14ab5.jpg",300,300,false],"2048x2048":["http:\/\/ft365.org\/wp-content\/uploads\/2025\/08\/2053-43b63f21-541c-481f-8524-60ac82a14ab5.jpg",300,300,false],"morenews-featured":["http:\/\/ft365.org\/wp-content\/uploads\/2025\/08\/2053-43b63f21-541c-481f-8524-60ac82a14ab5.jpg",300,300,false],"morenews-large":["http:\/\/ft365.org\/wp-content\/uploads\/2025\/08\/2053-43b63f21-541c-481f-8524-60ac82a14ab5.jpg",300,300,false],"morenews-medium":["http:\/\/ft365.org\/wp-content\/uploads\/2025\/08\/2053-43b63f21-541c-481f-8524-60ac82a14ab5.jpg",300,300,false],"crawlomatic_preview_image":["http:\/\/ft365.org\/wp-content\/uploads\/2025\/08\/2053-43b63f21-541c-481f-8524-60ac82a14ab5-146x146.jpg",146,146,true]},"author_info":{"display_name":"henry","author_link":"http:\/\/ft365.org\/index.php\/author\/henry\/"},"category_info":"<a href=\"http:\/\/ft365.org\/index.php\/category\/uncategorized\/\" 
rel=\"category tag\">Uncategorized<\/a>","tag_info":"Uncategorized","comment_count":"0","_links":{"self":[{"href":"http:\/\/ft365.org\/index.php\/wp-json\/wp\/v2\/posts\/2053","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/ft365.org\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/ft365.org\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/ft365.org\/index.php\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"http:\/\/ft365.org\/index.php\/wp-json\/wp\/v2\/comments?post=2053"}],"version-history":[{"count":0,"href":"http:\/\/ft365.org\/index.php\/wp-json\/wp\/v2\/posts\/2053\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"http:\/\/ft365.org\/index.php\/wp-json\/wp\/v2\/media\/2054"}],"wp:attachment":[{"href":"http:\/\/ft365.org\/index.php\/wp-json\/wp\/v2\/media?parent=2053"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/ft365.org\/index.php\/wp-json\/wp\/v2\/categories?post=2053"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/ft365.org\/index.php\/wp-json\/wp\/v2\/tags?post=2053"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}