ARIA: Mathematics for Safe AI - opportunity seeds

ARIA will fund high-potential proposals, with up to £500k each, that leverage mathematics to ensure that powerful AI systems interact safely and as intended.

Opportunity Details

  • Registration opens: 15/01/2025
  • Registration closes: 11/02/2025
  • Award: £10k to £500k per project, for projects lasting up to 2 years.
  • Organisation: ARIA

Opportunity seeds support ambitious research aligned to ARIA's (Advanced Research and Invention Agency) opportunity spaces. ARIA are looking to challenge assumptions, open up new research paths, and provide steps towards new capabilities, and will fund high-potential proposals with up to £500k each.

Can we leverage mathematics – from scientific world-models to mathematical proofs – to ensure that powerful AI systems interact safely and as intended with real-world systems and populations?

ARIA are looking for:

  • Ideas that sit within the Mathematics for Safe AI opportunity space. By this, we mean your proposal should show how your idea either aligns with or challenges the assumptions of the Summary, Beliefs, or Observations in the opportunity space;
  • Ideas that could change the conversation about what is possible or valuable;
  • Ideas that range from early stage curiosity-driven research through to pre-commercial science and technology.
  • ARIA welcome applications from across the R&D ecosystem, including individuals, universities, research institutions, small, medium and large companies, charities and public sector research organisations.

    If you are an overseas applicant, you should note that our primary focus will be on funding those who are based in the UK or those willing to conduct all or part of the project from the UK. However, funding will be available to applicants outside the UK if we believe the proposed project can significantly benefit the UK.

  • ARIA provide funding of between £10k and £500k, inclusive of VAT (where applicable) and all associated costs (both direct and indirect). There is no minimum length for a proposed project; the maximum length for this call is two years.

    ARIA value speed, so if you think you could do the work faster with a greater budget, we’d prefer you to bias your proposal in that direction.

  • We don’t have preconceptions about what ideas you might send our way. You don’t need to contort your project to fit our goals: we want to hear what you really want to do.

    Examples of projects we’d be excited to fund include (but are not limited to):

    • Creating (largely) self-contained prototypes/minimum viable products of a Safeguarded AI workflow, similar to this example but pushing for incrementally more advanced environments (e.g. Atari games).
    • Analysing, developing, prototyping and/or applying AI and formal methods, to:
      • Scale up the production of verified software systems
      • Verifiably secure hardware, such as interval-bound solvers for Maxwell’s equations in semiconductor devices.
    • Developing benchmarks and evaluations of AI capabilities that accelerate or automate formal methods, e.g. in areas such as theorem proving, autoformalization, and verified software pipelines.
    • Mathematical (formal) theories for understanding AI training or internal representations.
    • Conceptual and mathematical work aiming to characterise “boundaries” of systems we want to protect from harm (e.g. Boundaries part 3a), or notions of “harm” (e.g. Quantifying Harm or MWER, which are not yet compatible).
    • Creating pedagogical materials (e.g. educational videos, interactive tutorials, high-production-value expository blog posts, etc.) relevant to research produced in the context of Safeguarded AI’s Technical Area 1 (or other parts of the programme), potentially in collaboration with programme Creators. The goal is to help upskill a broader population of engineers and scientists to be able to derive value from using the programme’s modelling framework.
    • Field-building efforts aimed at building a flourishing, creative, diverse and inclusive community around ideas related to Mathematics for Safe AI, be that in the form of events, upskilling opportunities, etc.
    • Good-faith and constructive criticisms or adversarial collaborations regarding key assumptions underlying the thesis of Mathematics for Safe AI, and/or Safeguarded AI.

    Out of scope:

    • Ideas that fall outside the scope of the ‘Mathematics for Safe AI’ opportunity space.
    • Ideas that are within scope of the ‘Safeguarded AI’ programme, as these should be submitted to programme-specific funding calls.
    • Ideas that are undifferentiated or that are likely to happen without ARIA support.
    • Commercial products, or projects for which you wish to retain exclusive IP rights.
  • Before submitting an application, please read the call for proposals in full and the general ARIA funding FAQs, where you can find further guidance on how ARIA funds research, including more information on intellectual property, budgeting, and standard grant and contracting agreements. If you have any further questions relating to the opportunity seed call for proposals, please email clarifications@aria.org.uk. Any questions or responses containing information relevant to all applicants will be provided to everyone who has started a submission within the application portal and published on ARIA’s website.

    If you would like to find a collaboration partner, you can also contact Innovate UK Business Connect’s Industrial Mathematics or Artificial Intelligence teams.
