This guidance outlines how students at the University may and may not use Artificial Intelligence (AI) tools in their academic work. It applies to all disciplines and levels of study. The goal is to support the responsible, transparent, and ethical use of AI in ways that enhance learning while maintaining academic integrity.
This guidance provides a broad overview, but all students must follow the specific instructions in their module assessment briefs, as local guidance always takes precedence.
What is Generative AI?
Generative Artificial Intelligence (AI) refers to tools that can analyse data, generate or translate text, summarise information, or provide feedback and explanations. Examples include ChatGPT, Gemini, Grammarly, Copilot, Claude, and DALL·E. AI can support your studies by helping you understand concepts, organise ideas, or improve clarity, but it must never replace your own critical thinking, analysis, or authorship.
It is your responsibility to assess any AI tool you want to use and decide whether it meets your own criteria regarding where your intellectual property is shared and what environmental impact your use of AI may have. In many instances, there are other tools you can use to achieve your aims without Generative AI.
Microsoft Copilot is the University's approved AI tool. Using Copilot while logged in with your university credentials protects you from accidentally sharing personal data or other sensitive information externally, as our enterprise licence ensures that nothing you enter is used to train public models.
N.B. Some tools that did not previously use AI have evolved in ways that now produce substantial AI-generated content. As a result, any tool with 'embedded' AI functionality should be considered carefully in the context of this guidance. When in doubt, ask.
Key principles for using AI at Royal Holloway