Faculty Feature: Dr. Richard Colby Discusses Teaching with AI

As AI continues to transform the teaching and learning landscape, we’re reaching out to faculty and staff across campus to hear about your thoughts and experiences with AI integration. Whether you’ve already begun exploring AI tools in your classroom, have tips and tricks about how to use it well, or want to share any assignment ideas or insights, we’re eager to hear from you! If you’re interested in sharing your thoughts on AI use, please reach out to the OTL at otl@du.edu. 

In this blog post, Dr. Richard Colby, DU Teaching Professor and Director of First Year Writing, shares his thoughts on teaching with generative AI: 

How might we teach with generative AI?  

It’s no surprise that students are using generative AI (genAI) in their courses. It’s also no surprise that many instructors are still struggling to respond. Seeing writing outsourced to algorithms that assemble the most common word, phrase, and sentiment neighbors from pools of billions of words rankles many who prize the written medium as a window into intellectual engagement. Strategies abound to curb this auto-generation, from in-class writing and blue books to creating “unique” assignments. Some crafty instructors even hide secret “Trojan Horse” words in white font so that prompts copied and pasted into genAI will reveal students’ use of it. Frankenstein would approve of such shenanigans. There are even some who believe that reflective writing requires authentic responses that cannot be replicated by genAI; clearly, these idealists have never asked ChatGPT to write a reflection.  

These strategies, and many more commonly shared, run counter to the ethos of many instructors even as they reflect a longstanding tradition of surveillance and distrust of students. Faculty bring their authentic selves to the classroom, encouraging and caring about student learning, students’ voices, and their development as scholars and professionals. We trust our students because they care about these same things too, despite institutional practices such as proctoring software, Turnitin, and zero-tolerance policies for plagiarism. To be clear, I’m not criticizing faculty or students here. I’m merely saying that faculty first have to understand why students are using ChatGPT, Gemini, or Copilot to do intellectual work that faculty themselves see as instrumental to learning. 

To that core question—why—we need to listen to students. I don’t mean that as a platitude, as in merely listening in general. Instead, I mean participating in conversations with students about how they approach assignments (including their use of genAI) that we know are important to intellectual development.  

Answering “why” first requires acknowledging that genAI is now as ubiquitous as templates and design styles in computer productivity software. Personally, I have found it strange that design templates have been integrated throughout productivity software for years, without acknowledgement, ignoring the intellectual contributions of graphic designers and programmers, and that word processors have long suggested grammar changes, ignoring nonstandard dialects as well as the sources of those rules (sorry, but grammar rules are not universal, and style guides differ). Yet as soon as written expression doesn’t seem to have been human-generated, people are quick to point fingers and cry foul. 

This also requires that we, as instructors, teach students how to use genAI, which in turn requires that we learn how to use it ourselves. Learning genAI means acknowledging that it can be a collaborative partner, a concept that we also need to convey to learners. For many assignments, the first step should be to ask genAI to do the assignment first, right there in class. Students should see the prompt and understand how the genAI responds to an assignment. Such a process removes the lowest common denominator of the rushed or stressed student. Borrowing from Stuart Selber’s theory of multiliteracies, this is functional literacy: knowing how to do something. 

The next step requires helping students understand the limitations and strengths of genAI responses. This “critical literacy” gets students to see how genAI output is built from common tropes, expressions, and sentiments that often ignore alternative voices and perspectives. Contributing to this understanding also means reviewing the EULAs and privacy agreements of these tools, something perennial to technology use that often goes unaddressed (how many of us know the privacy agreements of Google Docs or Canva, two common tools students turn to?). The critical literacy stage also invites learners to engage in moments of integrative learning, such as asking not only for the assignment itself but also asking learners to explain why something works. 

The final stage of multiliterate development is rhetorical literacy: using the technology effectively, which includes acknowledging the use of genAI in the final product. While acknowledgement is important, the more important element is how it was used and in what ways. Here, I am returning to my initial call that we need to listen. Inviting learners to explain how, why, and what they are doing can help us build better student intellectual engagement both with and without genAI.  

Tips for integrating generative AI into classrooms 

  • Feed an assignment into genAI in class, in front of students. This acknowledges that you know about genAI, and students can see the results, which can lead to a discussion about the strengths and weaknesses of what is generated. 
  • On assignments or quizzes, immediately share the genAI answer and ask students to describe why the answer is correct (or incorrect). 
  • Demonstrate how to use genAI appropriately—as a collaborator rather than an automaton. That might mean teaching how to do classroom-relevant tasks. For example, on a writing assignment, walk through how to have a conversation with genAI to help generate ideas or different approaches. Rather than “write an essay about x,” ask “what are the common sentiments towards x?” or “this is a poor argument about x. Where did it come from?” 

For more resources on how to navigate the use of AI technologies in the classroom, check out the OTL’s Artificial Intelligence (AI) in the Classroom teaching resources, or schedule a consultation with an OTL staff member.