
Policy on the use of Generative Artificial Intelligence for NLnet-funded projects

Valid as of: December 8, 2025

Version control

Date               Version       Comment
December 8, 2025   Version 1.0   Initial release

Generative Artificial Intelligence (GenAI) tools are rapidly emerging across the ecosystem. While we recognize the interest – and at times share the curiosity – surrounding these technologies, it is essential that they be used responsibly, transparently, and in full accordance with our established principles.

This policy describes how GenAI (including tools such as large language models and code assistants) may be used in applications submitted to NLnet and in all work delivered within NLnet-funded projects, covering both the preparation of proposals and the execution of funded activities. The policy concerns GenAI specifically and does not regulate other forms of automation.

Foundation of the policy

This policy is grounded in longstanding principles that apply to all NLnet-funded work. From these fundamental principles we have derived what we consider common-sense consequences with regard to the use of GenAI.

Fundamental principles

  1. FLOS licence: All projects must be free/libre/open source: all scientific outcomes must be published as open access, and any software and hardware developed must be published under a recognised free and open source licence in its entirety.
  2. No misrepresentation: Grantees and applicants must not claim work as their own if it is not. This has always been true, and GenAI does not change it.
  3. Project quality: Grantees are expected to deliver project outcomes to the best of their ability. Tools may assist but do not replace human responsibility for correctness, clarity, and reproducibility.

Use of GenAI in the application process

  • Applicants may use GenAI tools in preparing proposals, but any such use must be disclosed. This applies both to written proposals and to materials provided during interactive evaluation.
    • Explanation: Disclosure allows evaluators to understand how the proposal was produced and ensures fairness. This includes drafting, translation, or summarisation.

Use of GenAI in project development

  • Grantees must ensure that all submitted work can be legally published under a FLOS licence. This includes verifying that GenAI-assisted outputs do not reproduce copyrighted or incompatible material.
    • Example: when using a code assistant, check the assistant’s terms of use, and ensure that outputs are not reconstructed from copyrighted sources.
    • Example: Under EU law, purely AI-generated outputs without substantial human intellectual contribution are not eligible for copyright protection [Note]. In any case, outcomes generated purely by AI may not be submitted as work eligible for payment as part of the grant.

  • Grantees must not present AI-generated content as if it were their own human-authored work.
    • Explanation: When we provide a grant to a person to develop a project, we expect that person to do the work. They should not outsource the work to someone else while pretending they did it themselves. Similarly, grantees should not deliver GenAI outcomes and pretend they were their own human effort. Human contributors remain accountable for the accuracy, originality, and integration of GenAI-supported work.

  • Use of GenAI must not reduce the quality, clarity, reliability, or reproducibility of the work.
    • Explanation: Tools may assist, but human responsibility for quality remains.

  • Working on the topic of GenAI itself is allowed within the scope of a grant, but only if it is explicitly part of the approved work.
    • Explanation: We are not against GenAI, so applicants should not hesitate to submit proposals that involve work on GenAI. However, such work only falls within scope if it is part of the approved project plan.

Transparency & logging

Use of GenAI should be disclosed and transparent. For any substantive use of GenAI that materially affects outputs, a prompt provenance log must be maintained. This log should list:

  • the model used,
  • dates of prompts,
  • the prompts themselves,
  • the unedited output.

Based on the use case, this log should be made available in the following ways:

  • For project outcomes, logs should be made available publicly in a location that is easily discoverable by users.
  • For a grant application submitted through the proposal form, instructions about how to disclose are provided on the form.
  • For a grant application still under confidential evaluation, the prompt provenance log should be provided as a separate attachment to the application or emails.

Minor uses (e.g., spellchecking) do not require logging.
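As an illustration, a provenance log entry covering the four items above could be kept as one JSON record per line. The file name, field names, and helper below are assumptions for the sake of the sketch; the policy does not prescribe any particular format.

```python
import json
from datetime import date

def log_prompt(path, model, prompt, output, when=None):
    """Append one prompt-provenance entry to a JSON Lines log.

    The four fields mirror the items this policy asks for: the model
    used, the date of the prompt, the prompt itself, and the unedited
    output. The structure is illustrative, not prescribed.
    """
    entry = {
        "model": model,
        "date": (when or date.today()).isoformat(),
        "prompt": prompt,
        "unedited_output": output,
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry, ensure_ascii=False) + "\n")
    return entry

# Example use (model name and file are hypothetical):
# log_prompt("PROVENANCE.jsonl", "example-model-1",
#            "Draft a README introduction for project X.",
#            "<verbatim, unedited model output>")
```

An append-only plain-text format like this keeps the log easy to publish alongside other project outcomes.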

Non-Compliance

Failure to comply with this policy may result in rejection of the proposal or, ultimately, in termination of a running grant.

Scope of this policy

This policy explicitly deals with GenAI only (such as Large Language Models). NLnet is a strong proponent of automation and of deterministic, reproducible generation of source code, formal and symbolic proofs, and similar artefacts based on specifications and scientific and engineering rigour. Likewise, this policy does not in any way seek to prevent the use of other forms of machine learning, fuzz testing, or other beneficial use cases. However, when in doubt, please contact us.

Note

See Generative AI and Copyright, page 93, a report requested by the European Parliament's Committee on Legal Affairs. The full quote reads:
"Given this framework, it follows that purely AI-generated outputs—those created automatically by an AI system without substantial human intervention—are not eligible for copyright protection in the EU. Such outputs are considered to fall into the public domain, making them freely available for anyone to use, reproduce, or adapt without seeking permission or providing attribution. The legal and commercial implications of this are significant. For creators and companies investing in AI systems that generate music, art, or text, there is no proprietary right over the final output unless a human has contributed in a way that meets the “intellectual creation” standard."