General Policy

Use of generative artificial intelligence at UNIGE

Generative artificial intelligence (AI) tools have become widely available and democratized since late autumn 2022, with the release of ChatGPT to the general public.

The University of Geneva takes note of this development and of the likely use of these technologies in the activities of employees and students. These tools are undoubtedly of interest, and can bring significant added value to teachers, researchers, staff and students in their activities.

However, the limitations of these generative tools are real, and the ethical and deontological challenges associated with their use are far from resolved. It is the responsibility of all members of the university community to make informed, critical and responsible use of these tools. The institution has a duty to propose support measures and recommendations adapted to the technical, economic and social developments surrounding AI. This framework is, of course, evolving and will be amended in line with future developments.

The use of these tools must be accompanied by training that enables users to:

  • understand their characteristics (operation, limits, ethical issues),
  • learn how to make the most of them (formulation of prompts),
  • master the integration of content produced by these tools into the final output (methodology, citation).

Training courses are already underway to support teachers in the use of generative AI in their teaching. Support activities aimed specifically at UNIGE students and staff will be set up in the coming months.

Teaching and studying

Today we are faced with a new technology that extends an existing range of tools; its authorized use in the context of studies must be aligned with the learning objectives and skills being targeted.

The completion of academic work (homework, essays, reports, etc.) and the assessment of learning (exams, continuous assessment, dissertations, etc.) must be accompanied by clear instructions regarding the possibility of students using generative AI tools. Their use must be supervised.

Learning assessment must be designed to take account of the existence of these tools, and methods adapted accordingly. Faced with the advent of AI tools, three strategic solutions can be identified:

  • implementation of assessments focused on individual knowledge, without AI assistance (oral exams, in-person written exams, computer-based exams with restricted access);

  • implementation of assessments for which the use of generative AI loses its relevance (exams integrating practical experience, observations, commented expert videos, etc.);

  • implementation of assessments that integrate the use of AI tools as a means of analyzing students' ability to work with course knowledge and content. The assessment of learning has already been the subject of in-depth reflection, notably through the recently published work of a think tank.

It is up to faculties, centers and institutes to set the framework conditions for the use of these tools, and for teachers to provide clear instructions for the pedagogical activities and assessments they propose.

Full transparency regarding the use of AI tools in academic work is formally expected, through appropriate application of citation rules. Citations can be accompanied by a description of the methodology used to carry out the work with the support of the AI tool.


  • Train students and teachers in the use of ChatGPT and related tools.
  • Establish a clear framework at the faculty and course level.
  • Recall the notion of academic integrity and the objectives of a university education.
  • Reiterate the importance of writing skills in university training.
  • Value the notion of authorship of a scientific/academic work and the responsibility it entails.


Generative AI involves not only text generation but also image generation. The following specific recommendations apply in that case:

  • Explicitly mention the use of AI in image credits or captions.
  • Use AI-generated images for purely illustrative purposes, replacing, for example, generic images taken from commercial databases.
  • Ensure that AI-generated images produced within UNIGE do not include recognizable real people.