
Guidance on the Appropriate Use of Generative Artificial Intelligence for Graduate Academic Milestones

Overview

In response to the rapidly evolving landscape of generative artificial intelligence (AI)[1] use in academic and educational settings, this preliminary guidance has been produced to address frequently asked questions (FAQ) in the context of graduate academic milestones at the University of Toronto, including the completion of comprehensive exams, candidacy requirements, language requirements, the doctoral thesis, and the final oral examination. More detailed guidance on this topic, as well as new or updated policies, may be issued in the future, in which case this preliminary guidance will also be updated. The FAQs below outline important considerations for graduate students, supervisors, supervisory committees, and graduate units on the use of generative AI tools in graduate student research, thesis writing, and other key academic milestones while upholding the core principles of academic quality, research integrity, and transparency. The FAQs cover requirements both for approval and for documentation of the use of generative AI tools in graduate-level scholarly activities, as well as risks and other considerations in using generative AI in such activities.

Innovative and creative uses of generative AI may support scholarly activities and help facilitate high quality research, particularly in certain disciplines. However, such use may also hinder the quality of the research produced. Graduate students and faculty supervisors are expected to strive for the highest standards of academic quality and research integrity in all scholarly activities, and therefore the use of generative AI tools to support the achievement of graduate academic milestones must always take place with full transparency. This includes transparency between students, their supervisors, and graduate units, who must agree in advance how any generative AI tools will be used in the context of each academic milestone; as well as transparency between graduate students and the audiences of their work, who must be provided a clear and complete description and citation of any use of generative AI tools in creating the scholarly work.

It is recommended that students seek and document in writing unambiguous approval from their supervisor(s) and supervisory committee in advance of the use of generative AI tools in research, writing, or other scholarly activities relevant to graduate academic milestones. Unauthorized use of generative AI tools for scholarly work at the University of Toronto may be considered an offence under the Code of Behaviour on Academic Matters, and research misconduct as defined in the Policy on Ethical Conduct in Research and the Framework to Address Allegations of Research Misconduct. Furthermore, careful attention must be paid to appropriate citation and description of the use of generative AI tools that took place in the research, writing, or other scholarly processes, in line with disciplinary norms. This includes, for example, using generative AI tools in searching, designing, outlining, drafting, writing, editing or producing audio or visual content for comprehensive exams, theses or other scholarly works relevant to academic milestones, and may include other uses of generative AI. Even when engaging in authorized generative AI use, faculty and graduate students must be aware of the academic risks in using such tools, some of which are discussed below.

The School of Graduate Studies (SGS) will continue to work with faculties and graduate units to help develop specific and clear requirements or restrictions regarding the use of generative AI in some or all phases of the graduate research lifecycle. Individual graduate units may therefore issue additional guidance outlining field-specific appropriate uses of generative AI tools for graduate academic milestones. This could include, for example, guidance on use in reviewing literature, writing text, conducting analytical work, reporting results (e.g., tables or figures), writing computer code, demonstrating language competence, or completing comprehensive exams and / or the final oral examination. Graduate units issuing additional guidance should take into account the issues discussed in the FAQ below. Additional relevant guidance and further reading can be found in the FAQs and guidance on syllabi and assignments (PDF) issued by the Office of the Vice-Provost, Innovations in Undergraduate Education, and in the guidance on generative AI in the classroom from the Centre for Teaching Support & Innovation.

Whether the decision-maker is the student, the supervisor, the graduate program leader or the university leader, the advice is largely the same: be guided by foundational principles, recognize every context may be unique, understand your own changing landscape, set shared expectations early, disclose what you are doing, and be prepared to uphold the standards of high quality research and scholarly activities.


[1] In referring to generative AI in this document, we include tools that use predictive technology to produce new text, charts, images, audio, or video. For examples of such uses and more detail, please see the FAQs and guidance on syllabi and assignments issued by the Office of the Vice-Provost, Innovations in Undergraduate Education, and the guidance on generative AI in the classroom from the Centre for Teaching Support & Innovation.

Frequently asked questions (FAQ)

Who is responsible for AI-generated content used in research or other scholarly work?

Last Updated: July 4, 2023

Graduate students who make use of AI tools and include the output in their research and written work are ultimately responsible for the content. This applies to work submitted as part of degree requirements, as well as in scholarly publishing or the use of pre-print servers. Graduate students and their co-authors must understand the terms and conditions governing any submission of their work, as well as those of any tools they use, as these often hold the user responsible for the content. This means graduate students may find themselves in a position where they face allegations of perpetuating false or misleading information, infringement of intellectual property rights, violating the conditions of research ethics approval, other research misconduct, infringement of privacy rights, or other issues that carry academic, civil, or criminal penalties.

Can students use generative AI tools to research or write a doctoral thesis?

Last Updated: September 5, 2025

The School of Graduate Studies (SGS) Doctoral Thesis Guidelines state that students must produce a thesis that demonstrates academic rigour and makes a distinct contribution to the knowledge in the student’s field. The University expects that a thesis submitted to satisfy degree requirements at the doctoral level is the work of the student, carried out under the guidance of the supervisor and committee. The SGS Guidelines specify Key Criteria of the Doctoral Thesis that students must meet, in addition to the Ontario Council of Academic Vice-Presidents’ Doctoral Degree Expectations for Doctoral Students in Ontario (PDF). The Key Criteria of the Doctoral Thesis include presenting the results and analysis of original research, and demonstrating that the thesis makes an original contribution to advancing knowledge. These originality requirements may not be met by work produced using generative AI tools, which rely on existing sources and probabilistic or other predictive functions to generate content that may not be sufficiently original to meet the criteria.

The use of generative AI tools in any aspect of researching or writing of the thesis must be done with the prior approval of the supervisor(s) and supervisory committee. This is consistent with how other decisions about the thesis, including structure and format, are decided, as detailed in the SGS guidance. (See also the Guidelines for Graduate Student Supervision & Mentorship for more detail on the supervisor’s and committee’s roles in guiding students to produce research of high quality and integrity). Careful attention must be paid in the thesis to appropriately citing and describing any use of generative AI tools in the research process. It must be clear to the reader which generative AI tools were used, as well as how and why they were used. In the same way that analytical tools and specific analytical approaches are identified and described in the thesis, generative AI tools and interactions with them must be equivalently described.

When supervisors and committees approve student use of generative AI in any aspect of producing the doctoral thesis, it must be clear how the student’s versus the AI tool’s contributions will be identified, and it must be possible for the student to provide sufficient evidence that they themselves have met the Key Criteria of the Doctoral Thesis and demonstrated the doctoral level degree expectations. It must be clear to the student what evidence they need to provide to clarify their own contributions and how they made use of any AI tools, and how their work will be assessed by the supervisor and committee at each supervisory committee meeting. (Consult the Guidelines for Departmental Monitoring of the Progress of Doctoral Students and the Guidelines for Graduate Student Supervision & Mentorship for more detail on responsibilities in student evaluation and monitoring of doctoral student progress.) Students are responsible for any content generated by AI that they include in their thesis. Note also that at the University of Toronto, the outcome of the final oral examination is based not only on the submitted written thesis, but also the student’s performance in the oral examination. Students must be able to describe and defend any use of generative AI, as well as the contents of the thesis during their final oral examination.

Last Updated: July 4, 2023

Graduate units and individual supervisors who embrace the use of generative AI tools in research methods may still wish to restrict the use of such tools in other aspects of writing or editing papers or theses. There must be clear guidance for graduate students on what degree of engagement with generative AI in writing is acceptable (if any). If there are specific tools that are (un)authorized, these should be explained. Graduate students must seek and document approval from their supervisors and committees for the use of generative AI in writing even if they already have approval to use generative AI tools in their research.

Last Updated: July 4, 2023

The same principles that apply to the use of generative AI tools to produce or edit text also apply to the use of these tools to produce or edit figures, images, graphs, sound files, videos, or other audio or visual content. It should be noted, however, that some publication policies permitting the use of AI-generated text in certain contexts apply more stringent criteria to image content, in some cases completely prohibiting such content, for example, see the editorial policy on the use of AI-generated images at Nature.

Last Updated: July 4, 2023

SGS does not regulate master’s-level theses, Major Research Papers, or qualifying / comprehensive exams. However, SGS recommends that graduate units make use of the principles outlined for doctoral-level work in articulating requirements for any master’s-level research-based works.

The Code of Behaviour on Academic Matters, the Policy on Ethical Conduct in Research, and the Framework to Address Allegations of Research Misconduct also apply to master’s research, theses, Major Research Papers, and other graduate-level scholarly activities required to meet academic milestones including qualifying and comprehensive exams, candidacy requirements, language requirements, and the final oral examination. Faculties and graduate units should issue specific guidance on the use of generative AI in such graduate-level scholarly activities. The FAQs and guidance on syllabi and assignments issued by the Office of the Vice-Provost, Innovations in Undergraduate Education may be more relevant in the context of some papers or projects, and also apply to the undergraduate context.

Last Updated: September 5, 2025

AI is now integrated into many of the everyday tools we use for research and writing, including word processor software, search engines, and more. Nonetheless, different disciplinary norms are beginning to emerge around the appropriate use of generative AI in research, including in fields in which the focus of the research is not specifically the development and implementation of AI. In this rapidly evolving environment, it must be clear to faculty and students which uses of AI are acceptable, and which are not. Supervisors should seek clarification from their graduate unit if uncertain about a particular use of generative AI in graduate-level scholarly activities.

Last Updated: September 5, 2025

If a graduate unit permits the use of generative AI in research, the graduate unit should ensure discipline-specific norms regarding description of the method of use and appropriate references are clear. For example, is it adequate to include the prompts provided to a tool along with excerpts of responses? Should students save or include the full text of their interactions with AI tools in an appendix? Different citation style guides are starting to include specific information on how to cite generative AI tools, for example, see the American Psychological Association Style Blog. Links to major style guides can be found on the University of Toronto Library citation webpages and Citing Artificial Intelligence (AI) Generative Tools (including ChatGPT) resource.

Last Updated: September 5, 2025

Learning the practices of disciplinary scholarly writing is a key aspect of graduate education. The use of generative AI could hamper the development of graduate writing skills because writing capacity is highly dependent on practice. Novice scholarly writers need to write frequently, with in-depth feedback from members of their disciplinary community. Using AI to lessen the burdens of writing could undermine the development of these invaluable skills. This diminished capacity in writing could have serious consequences for graduate students, who need to be able to use writing as a vital element of their overall research process. The act of drafting and revising scholarly work often entails an essential deepening of the engagement with their research. Most writers learn about their own thinking through the iterative act of writing; if AI is doing some part of that writing, writers may be missing a crucial opportunity to cement their own scholarly expertise.

Last Updated: September 5, 2025

Generative AI may produce content that is wholly inaccurate or biased. AI tools can reproduce biases that already exist in the content they are trained on, include outdated information, and can present untrue statements as facts. Students remain responsible for the content of their thesis and other academic works (e.g., comprehensive exams), no matter what sources are used (see also Who is responsible for AI-generated content used in research or other scholarly work?). Generative AI tools have also been shown to reference scholarly works that do not exist, and to generate offensive content. Therefore, AI-generated content may not meet the academic or research integrity standards expected at the University of Toronto. Generative AI tools are also predictive, and may not generate the type of novel content expected of graduate students, nor arrange existing knowledge in such a way as to reveal the need for the novel contribution made by the research underlying a graduate student thesis. (See also the information on originality in Can students use generative AI tools to research or write a doctoral thesis?)

Last Updated: September 5, 2025

Please see the following resources on generative AI and security and privacy considerations:

Last Updated: July 4, 2023

Most major journals and scholarly publishers now have policies regarding the use of generative AI in publication. These policies vary widely, and researchers must ensure they are adhering to the specific policies of the pre-print server, journal, or publisher to which they are submitting. For example, some publishers allow use of generative AI in the research process, with appropriate description, references, and supplementary material to show the interaction with the AI tool, but do not allow the inclusion of AI-generated text. Others allow the inclusion of AI-generated text, but not images.

Emerging consensus is that generative AI tools do not meet the criteria for authorship of scholarly works, because these tools cannot take responsibility or be held accountable for submitted work. These issues are discussed in more detail in the statements on authorship from the Committee on Publication Ethics, and the World Association of Medical Editors, for example.

Graduate units, supervisors, and students must be familiar with and adhere to the requirements in their field regarding authorship and use of AI in works submitted to pre-print servers or for publication.

Last Updated: September 5, 2025

Please see the following resources on generative AI and copyright and intellectual property considerations:

Relevant policies and further reading