Teach with Generative AI
Resources for faculty
Overview
When it comes to the future of education, virtually no recent technology has sparked as much debate as generative AI (GenAI) and large language models (LLMs). Some have seen this technology as destructive, with school districts from Los Angeles to New York initially banning its use, while others have touted its transformative potential to change the game for educators and students alike.
Harvard has consistently tried to embrace new technology across our classrooms, residential and virtual. GenAI has been no different.
Faculty and students have access to a range of tools. Some are free and open to all faculty, students, and staff behind Harvard Key (the Harvard Sandbox); other tools require a license or approval. See a list of tools.
As our faculty and students have engaged with these technologies, we have invited faculty to reflect on questions such as the following:
- What is the challenge you were trying to address?
- How did you use generative AI tools to tackle it?
- What did you learn?
From these reflections, several lessons worth considering have emerged:
- GenAI tools have raised concerns about how they may compromise student assessments, promote academic dishonesty, and facilitate “lazy learning.” Our faculty colleagues who experimented with these tools were not oblivious to these concerns; indeed, many share them. At the same time, faculty are looking to understand how GenAI tools can enhance the educational experience and build more vibrant classrooms.
- Several colleagues are leveraging LLM features that everyone should keep in mind:
- Beyond text: LLMs can produce visual aids and images, code, analysis, games, simulations, and more.
- Prompt design: There’s an old saying: “garbage in, garbage out.” The output of LLMs is only as good as the input, so it’s essential to learn (and perhaps teach) how to write a prompt that works. Faculty have highlighted the critical role of deliberate prompt formulation: having students iterate on their prompts throughout the course, engaging students in debate on the ethics of AI use, and making advanced statistical concepts accessible to diverse learners. Our new System Prompt Library offers a range of effective prompts that educators can use.
- Interrogate hallucinations: Errors arise not just because of algorithmic or data limitations but, importantly, because LLMs are fundamentally probabilistic. Faculty have found that errors can be reduced through detailed prompt engineering and balanced AI-human partnership.
- Some consistent patterns emerge from how our colleagues have put GenAI to use:
- Going beyond the simple question-and-answer interface: Sal Khan popularized the idea of using LLMs to ask questions of a student, not just answer them. Several faculty colleagues take this further, illustrating how LLMs can be used to simulate any persona you want, and to ask anything of them. Examples include simulating experts, peers, graders, course designers, and personal chatbots.
- More than the “first draft”: GenAI needn’t compromise student creativity; in fact, it can augment it. Some colleagues are using it to help students refine project prototypes and polish final drafts.
- Work alongside what you already have: Many faculty used LLMs to improve different (and sometimes mundane) aspects of existing teaching and learning “workflows,” such as producing course materials, personalizing feedback, generating assignments, summarizing real-time student responses, and tutoring students.
- Identify, and overcome, hidden or invisible barriers: Students and educators sometimes confront hidden prerequisites that present barriers for teaching and learning. GenAI can assist with overcoming these skill gaps: coding for a business class, foreign languages for research, art skills for building visual aids, and even building games for class engagement.
- Reimagining the classroom: While we’re still in the early days of GenAI, some of these examples already start to surface more profound questions: What does a class with GenAI at its core look like? Ultimately, what is the role of a teacher?
- Questions also arise around GenAI’s efficacy for learning: can we use GenAI tools—specifically tutor bots—to improve the way students learn? One faculty member created a tutor bot that answered questions like course staff. Beyond such research on students’ interactions with GenAI tools, it might be helpful to imagine how it can help you and your students now in other ways as well: by increasing task efficiency, improving student engagement, increasing their confidence, or even improving learning outcomes.
- The risks of LLMs present valid concerns. While popular debates often focus on “big picture” concerns like algorithmic biases, digital divides, and fake content, some faculty explore the risks at a micro scale, within our classrooms, such as hallucinations, failed reasoning, or superficial thinking, pushing students to understand these issues more deeply.
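The persona-simulation pattern described above can be sketched in a few lines of code. The helper below is a hypothetical illustration (the persona, course details, and questions are invented, not drawn from any Harvard course); it assembles a chat payload in the system/user message format that most chat-style LLM APIs accept:

```python
# Hypothetical sketch: making an LLM play a persona (peer, expert, grader).
# The persona and course details are invented for illustration; any
# chat-style API that accepts system/user messages works the same way.

def build_persona_messages(persona, course_context, student_input):
    """Assemble a chat payload that keeps the model in character."""
    system = (
        f"You are {persona}. Stay in character at all times. "
        f"Course context: {course_context} "
        "Ask probing questions rather than giving answers outright."
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": student_input},
    ]

messages = build_persona_messages(
    persona="a skeptical peer student in an introductory statistics course",
    course_context="This week covers confidence intervals.",
    student_input="Why do we multiply by 1.96 for a 95% interval?",
)
# The payload can then be sent to any chat-completion endpoint, e.g.:
#   from openai import OpenAI
#   reply = OpenAI().chat.completions.create(model="gpt-4o", messages=messages)
```

The same structure supports any of the personas mentioned above; only the system message changes.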
Video interviews that fed these reflections are featured here in the first Harvard GenAI Library for Teaching and Learning. And across Harvard there have been, and will continue to be, convenings large and small that bring together faculty, students, staff, and administrators. As you think about what’s relevant for your course, support exists across Harvard to help you and your team experiment as well.
Frequently asked questions
The following information offers advice for educators interested in using generative AI tools in their teaching and course preparation. As this technology is constantly evolving, this page will be updated frequently with new resources and advice.
Creating materials for courses—syllabi, lesson plans, assignments—takes time. Generative AI tools aren’t useful only for broad, general prompts; as our colleagues have shown, they can also help with the preparation that happens before a student even arrives in the classroom:
- Preparing to teach: Starting a syllabus or a lesson plan from a blank page is daunting. AI can help you outline your course, create learning objectives, and suggest assignments or in-class activities, and it can tailor content to your course when you feed it specific reference materials.
- Assignments: Reusing the same assignments across multiple years can be time efficient, but it creates challenges for assessment. Some faculty have explored how AI tools can help write, modify, or create question sets. The more information you provide about the structure and concepts you want covered, the better the results will be. It can even generate a matching rubric.
- Course content: Classes often include large amounts of content, from case studies to readings to slideshow text. A prompt of just a few sentences can generate summaries of relevant readings or videos you might want to include for students, brand-new cases to discuss, and even an outline of what your lecture slides should include. Some faculty have gone a step further, inputting all of their course materials to train a teaching-assistant chatbot. Hear what this faculty member learned from students’ interactions with this “faculty copilot.”
How to engage students in the classroom is an age-old question. Some faculty have used GenAI’s real-time responses and range of outputs to offer contemporary solutions, such as a content generator, a data analyst, or a personal tutorbot.
- Activity leader: Interactive classes are easy to advocate for but hard to create. Some of our faculty have used GenAI to act like a peer student to stimulate critical thinking, perform real-time analysis of student responses to make them feel heard, or even help make games that align with class content, no coding needed!
- Personal tutor: You (or your TA) can’t be everywhere at once, but GenAI might be able to. Feeding it your syllabus, lectures, and an in-depth prompt can help make a personal tutorbot, generate unlimited practice problems, or even remind students of course-specific information.
- Custom reviewer: LLMs can be used to provide initial personalized feedback to your students, so you can focus on the big picture. Some faculty used them to quickly summarize student responses before office hours, or even point out areas in student responses that need improvement.
- Skill leveler: Classes often have hidden prerequisites: familiarity with coding, texts written in foreign languages, or even art skills. GenAI tools can be leveraged creatively to help students overcome such barriers, like teaching a business school class about data analysis without requiring every MBA student to learn to code, or analyzing trends across thousands of photographs without having to do so manually.
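As a concrete illustration of the “personal tutor” idea, the sketch below is hypothetical: the course name and syllabus text are placeholders, not an actual course. It folds a syllabus into the system message so the bot answers course-specific questions and defers when the syllabus is silent:

```python
# Hypothetical tutorbot sketch: ground the model in your own syllabus.
# The course description and syllabus below are placeholders.

SYLLABUS = (
    "Week 1: Descriptive statistics. Week 2: Probability.\n"
    "Midterm: in class, week 7. Problem sets due Fridays at 5pm."
)

def tutor_messages(question, syllabus=SYLLABUS):
    """Build a chat payload for a course-specific tutor."""
    system = (
        "You are a patient teaching assistant for an introductory "
        "statistics course. Use the syllabus below for course logistics; "
        "if it does not contain the answer, say so and refer the "
        "student to course staff.\n\nSYLLABUS:\n" + syllabus
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]

payload = tutor_messages("When are problem sets due?")
```

Swapping in your own syllabus, lecture notes, or assignment text is all it takes to specialize the bot to a different course.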
Since LLMs can write essays, respond to readings, and finish problem sets, how can one not be concerned about misuse? For you and your students, addressing this concern means first making sure we know what GenAI can and can’t do—then creating assessments that emphasize skills where AI tools fall short.
Here are some strategies faculty have found useful:
- Human-based learning: Design tasks and assessments that require creativity, practical application of concepts, and critical thinking. For example, instead of asking your students to summarize perspectives on a given issue, you may ask them to critically analyze which perspective is most convincing to them and explain why; to relate their answers to class discussions; or to assess their peers’ performance during a live problem-solving session.
- Process-based assessments: Another approach is to test intermediate steps in the learning process, instead of just the final product. (It’s easy to fudge your report card to your parents; it’s harder to fudge not having gone to school for the past two months.) Testing evidence of original thinking, planning, peer-to-peer conversations, etc. can make relying on GenAI less attractive.
- Establishing norms: Emphasize original work and academic honesty, and at a minimum provide clear guidelines about the use of AI-generated content in assignments and assessments.
Here are the most important risks to keep in mind:
- AI models can make mistakes: We’ve all had an incident (or two!) where a GenAI tool seems to have lost its mind, yielding garbled or entirely made-up answers. These are called AI hallucinations. It’s tempting to think these will get eliminated over time as technology improves. But since LLMs are fundamentally probabilistic rather than deterministic, this may not be the case.
- AI models can be biased: AI adopts the biases of the material and data it was trained on. Good AI use involves being aware of, checking for, and making efforts to correct such biases.
- AI models can violate privacy: AI is very good at doing what you want, but it is also very bad at knowing if what you want it to do is allowed. Personal data is not supposed to be fed to GenAI models. Make sure you are aware of Harvard’s data protection policies and FERPA.
- AI models can be misused: Of course, AI could be used to plagiarize assignments. Unless you are interested in grading robots, you should shift the kinds of assignments you give students (see above) as well as enforce academic honesty policies.
GenAI’s ability to meet learners where they are, both in terms of prior knowledge and learning progress, can increase students’ understanding, and AI-powered educational games can increase student motivation and engagement, particularly in STEM courses. One fascinating study showed that when students tutored by AI were pitted against students taught in a traditional classroom setting, the AI-tutored students performed as well as or better than their peers.
It is natural for instructors, particularly successful ones, to wonder: GenAI may be useful for the average educator, but my classes are great; why would I need it?
One way to think about this is in terms of the efficiency benefits of GenAI tools—they can save time, facilitate meaningful non-classroom learning experiences, and make classroom discussions more interactive. For example:
- Use thoughtful prompt engineering with GenAI to build a tutorbot that gives students an unlimited number of interactive statistical problems.
- Challenge students with DIY interactive simulation games created on short order—and without any coding prerequisite!
- Empower students to experiment with visualizations of interactions with just a few minutes of prompt engineering with DALL-E.
- Where to start experimenting: If you’re ready to jump right in, the best place to start is the Harvard AI Sandbox. Like any sandbox, it’s a great place to play around with tools; unlike any sandbox, it has five large language models to choose from and is accessible by request here.
- Where to modify images: If you’re looking to use GenAI to manipulate and create images, you can download Adobe Firefly through the Harvard Adobe Creative Cloud license. For more information, see Getting started with prompts for image-based Generative AI tools.
- Where to learn more: If you want to level up your GenAI knowledge before you start creating, check out the FAS Division of Science Generative AI Resources or the Bok Center’s Artificial Intelligence page.
- Where to find higher level tools: API access to tools like Azure OpenAI, Google Vertex, and AWS services is available by request from HUIT. If you don’t know what these are, that’s okay; teachers have made remarkable tools just using ChatGPT and a well-crafted prompt!
When using large language models (LLMs) in your classroom, it’s essential to be aware of the University’s guidelines designed to ensure responsible and effective use of these technologies and to refer to School-specific policies and resources. It is also important to remember that other existing policies, such as Harvard’s Information Security Policy, Digital Accessibility Policy, and Intellectual Property Policy, also apply to GenAI and the use of LLMs.
Across Harvard, there’s a strong emphasis on using LLMs ethically and in ways that uphold academic integrity. The use of generative AI must align with the principles of honesty, respect, and responsibility, ensuring that students’ work remains original and reflective of their understanding and skills. In crafting a response to the use of LLMs in the classroom, it’s crucial to strike a balance between leveraging these tools for educational enhancement and ensuring that they do not compromise educational objectives.
At the course level, Schools within Harvard allow for the customization of AI usage policies by individual instructors, provided these are clearly communicated. Each encourages innovative and thoughtful integration of AI in teaching and learning practices, including learning to use generative AI productively. Above all, students and faculty are encouraged to be transparent about the use of generative AI in academic work. This includes proper attribution when AI-generated content or assistance is utilized in the creation of academic materials. Consider co-creating course-specific norms around the use of generative AI with your students.
For faculty who simply wish to explore these tools, we encourage you to engage with peers and organizations at Harvard that are interested in these topics.
School-based resources
Visit your School’s website for the latest policies and guidance around using GenAI in the classroom. This list will be updated as further School-specific guidance becomes available.
- Harvard Business School
- Harvard College
- Harvard Division of Continuing Education
- Harvard Graduate School of Design
- Harvard Divinity School
- Harvard Kenneth C. Griffin Graduate School of Arts and Sciences
- Harvard John A. Paulson School of Engineering and Applied Sciences
- Harvard Kennedy School
- Harvard Medical School