Table of contents
  1. Academic Integrity and Plagiarism
  2. TurnItIn
  3. Translation tools
  4. ‘AI’ tools
    1. What counts as an AI tool?
    2. If you’re using AI tools, you need to understand the risks:
      1. You are responsible for AI tool hallucination
      2. If you use AI-generated code, you will still need to understand exactly what it’s doing
      3. You will still have to do enough work to pass the unit!
      4. Everything you input into LLMs can be used by them
    3. Want to know more about the University’s AI use policies?

Academic Integrity and Plagiarism

We expect all final year students to be familiar with and correctly follow the university’s rules on academic integrity and plagiarism.

Please read the University pages on Academic Integrity, including the three linked pages on plagiarism, collusion, and contract cheating (which also covers ‘AI’ tools).

You are expected to complete online academic integrity training - find this on the Academic Integrity: School of Computer Science 2023 site on Blackboard.

Plagiarism is an offence and can lead to a punishment even if you were unfamiliar with the rules or made an honest mistake.

We will check for academic integrity and plagiarism in three ways:

  1. We run all dissertations through TurnItIn to check that work submitted is your own or correctly referenced.
  2. Your supervisor (who is your first marker) is the subject expert and will recognise text that is not correctly referenced.
  3. If you cannot explain basic parts of your dissertation in the viva, you are likely to be investigated further.

TurnItIn

We use TurnItIn to check for plagiarism and AI usage. When you submit your dissertation, you will be able to see a TurnItIn report giving a similarity score against other sources online and work submitted in previous years.

You should not look at the similarity score by itself; instead, check that large portions of your text do not overlap with other sources without proper quoting and citation. We never take the similarity score at face value: it is always checked by the Unit Directors, who are looking specifically for this kind of overlap.

If you have any questions about your similarity score, talk to your supervisor and they can advise you.

Translation tools

As per University policy, you cannot write your dissertation in another language and then translate it into English using a tool (whether that tool is AI-based or not).

The degree programme is in English, so we expect students to write their dissertations in English. Translating your dissertation into English using a tool is Academic Misconduct and can lead to severe penalties affecting your degree classification and progression.

If you are concerned about the quality of your written English, the University Study Skills Team can help you in several ways, with bookable tutorials, drop-in sessions, and support from the Royal Literary Fund Writing Fellows.

‘AI’ tools

Under Faculty Policy and University Policy, you are not allowed to write your dissertation using an AI tool. However, you are allowed to use AI tools in other ways, so long as their use is cited correctly within your dissertation.

You should do this with references throughout your report, as with any other citations, and in two specific sections of your dissertation: the AI use declaration at the start and the AI use appendix at the end.

The full Faculty policy is in the single-page Faculty guidance, which you should read (you must be signed into Blackboard to access it; you can also reach it by scrolling down to Academic Integrity in the Blackboard CS student handbook).

You will not be marked down for using AI tools, but you will be marked down if you use them without thinking about why, what value they add, and what the risks are, or if we find out you have been using them without telling us.

Using ChatGPT to write your Contextual Background because it is faster and likely to be grammatically correct, for example, would not be a good reason. Using an AI tool to help catch bugs as part of a suite of other bug-catching methods, or to speed up development so you can run more interesting experiments, and explaining this in the AI declaration and appendix, would be.

What counts as an AI tool?

Anything that uses Large Language Models (LLMs, for example ChatGPT, Claude, Copilot, DeepL) or generative AI (for example image generators such as Midjourney and DALL-E). Given how fast the sector is moving, new examples are bound to appear throughout the year, so if you are uncertain whether something counts, please ask.

If you’re using AI tools, you need to understand the risks:

You are responsible for AI tool hallucination

AI tools still hallucinate, and one of the main ways we spot their use (apart from the fact that their writing all sounds very similar) is incorrect information and imaginary papers. If there are any mistakes in your dissertation because of these tools, you are considered doubly responsible: you chose to outsource your work AND did not check it.

If you use AI-generated code, you will still need to understand exactly what it’s doing

You will have to be able to answer detailed, technical questions about your code from your supervisor and in the vivas. This could include exactly what a section of the code does, or why the code is structured in a certain way.

It is fine to say that you used a tool to help you write the code, provided you have cited it correctly and can explain how you made sure it was correct, but you will still need to understand it.

You will still have to do enough work to pass the unit!

You are expected to spend ~25 hours/week on your Individual Project. If you use AI tools to reduce the time a task takes, you will still be expected to spend the time you save elsewhere on the project.

For example, there are tools that can convert Figma outputs into React JS to make mobile apps. If you use these to save time, you will be expected to do a lot more user testing and evaluation, so you are doing enough work to pass.

Remember, one of the assessment criteria in the mark scheme is about the amount of work you’ve done, so if you find a shortcut in one area, you will be expected to push yourself in others.

Everything you input into LLMs can be used by them

Everything you enter into AI tools is immediately visible to, and usable by, the service provider, and is out of your control from that point on. This means you need to be very careful if you are using datasets or APIs with conditions of use, or other people’s work.

Want to know more about the University’s AI use policies?

If you’re interested in the University policies on ‘AI’ tools and the reasoning behind them, including some of the risks, read the Study Skills Resource.