Back due to popular demand: AI Forum

Using Artificial Intelligence to Enhance Teaching & Learning

Tuesday, January 27 | 3:30 – 5:00 pm
Hunter Library 156 (CFC) or via Zoom

You’re invited to an open forum for faculty across disciplines to share ideas, challenges, and best practices focused on uses of AI in the classroom. The forum will be structured into two parts. Part 1: Faculty will share ideas regarding how they are using generative AI for teaching innovation and to ease course administration burdens. Part 2: Faculty will share ideas for setting and maintaining clear expectations regarding ethical and responsible student use of AI and the impacts of AI on student learning processes. We will also reflect on how faculty and students can set competency-based learning goals that emphasize distinctive human skills. Come ready to share your ideas and to learn from and be inspired by others.

We Recommend: Talk is cheap: why structural assessment changes are needed for a time of GenAI

Rethinking Assignment Design in the Age of GenAI

January 2026

Recommended by April Tallant, Director  

Generative AI is reshaping higher education, and our assessment practices must evolve to keep pace. Many institutions have introduced frameworks like traffic light systems, AI use scales, and mandatory declarations. These are helpful first steps because they give us language and structure while we find our bearings. But as Corbin, Dawson, and Liu (2025) argue in Talk is Cheap: Why Structural Assessment Changes Are Needed for a Time of GenAI, these approaches are limited; they rely on student compliance with unenforceable rules. The authors call these approaches discursive changes, or modifications that work through instructions without altering the tasks. Discursive changes alter the communication about the assignment, not the assessment itself. A simple example of discursive change is adding ‘GenAI use is not permitted in this assessment’ to existing assessment instructions. 

The authors argue that discursive changes to assessments are well-intentioned but flawed because they assume students understand ambiguous rules and will comply even when non-compliance is advantageous. Discursive changes also work on the assumption that compliance can be verified, but current AI-detection tools are limited. The authors state, “current detection tools are fraught with false positives and negatives, creating uncertainty and mistrust rather than clarity and accountability” (p. 1092). 

By contrast, Corbin, Dawson, and Liu argue that structural changes “create assessment environments where the desired behavior emerges naturally from the assessment design” (p. 1093). In other words, structural changes modify the tasks, not the instructions. One example of a structural change the authors provide is adding a “checkpoint in live assessment requiring tutor signoff on lab work.” Structural changes focus on the process, not the outcome. For instance, rather than submitting a final essay, students might participate in live discussions about how their ideas and thinking developed in response to feedback. Another structural change is designing assessments that connect throughout the term: students build on their earlier work, demonstrating their learning across multiple touchpoints rather than in a single task. 

The authors conclude that long-term solutions require rethinking assessment design so that validity is built into the structure, not just explained in the instructions. As GenAI advances, the challenge of assessment design will only continue; our time as educators is better spent on structural redesign that ensures assessments validly demonstrate student capabilities. 

 

Action item 1:

Have you modified your assessments with a structural approach? We’d love to hear from you! Join us for the AI Forum on Tuesday, Jan 27, 3:30 – 5:00 pm either in person or on Zoom to share your experience. 

 

Action item 2:

After reading the article, consider the following questions:  

    • The article suggests shifting from product-focused to process-focused assessment. What “authenticated checkpoints” could you realistically build into your lessons or modules to capture a student’s developmental process? 
    • Think of an assessment you believe works well. What about the task itself encourages the kind of learning you want?
    • What are barriers to making structural changes to assignments? How can we overcome them?

Do you want to chat about this article? Send me an email (atallant@wcu.edu) and I’ll stop by your office or meet you on Zoom. 

Action item 3:

Consider registering for CFC’s assignment re-design workshop in February! 

 

Corbin, T., Dawson, P., & Liu, D. (2025). Talk is cheap: Why structural assessment changes are needed for a time of GenAI. Assessment & Evaluation in Higher Education, 50(7), 1087–1097. https://doi.org/10.1080/02602938.2025.2503964 

A group of people gather around a table covered with papers, collaborating on a project.

Photo credit: Canva Pro; Monkey Business Images.

Things instructors should know about the WaLC’s AI policy

If you teach at Western, you might wonder how peer educators at the WaLC (Writing and Learning Commons) navigate conversations around AI with students seeking support. The WaLC team would like instructors to know that an internal AI policy exists and that it was developed to be as faculty-forward as possible; while the policy allows for some flexibility, it is conservative in its application.  

By default, peer educators ask students who come to the WaLC what their instructor has written in the syllabus (or assignment instructions) about the course’s AI policy. If no policy is found, peer educators will not encourage students to use AI in any way. If a statement allows for AI use, peer educators can assist “in areas provided on the syllabus, or if not stated, in areas such as brainstorming, gathering information, interpreting feedback, outlining, and quizzing” (see the AI Decision Tree, developed by the WaLC). 

Additionally, the WaLC team has open conversations with peer educators who feel uncomfortable using AI themselves (see Step 1 of the AI decision tree). If such a peer educator encounters a student whose professor requires AI use, they can refer the student to a peer educator who is familiar and comfortable with AI.  

Decision tree for determining whether peer educators should use AI.

One recommendation from both the WaLC and the CFC is that instructors have a conversation with students about what counts as acceptable AI use, in addition to including a course-specific AI policy in the syllabus. For example, some students are still unsure whether they can use certain software (Grammarly is a common example) in all of their coursework, only for some activities, or not at all.  

The WaLC’s goal is to empower students to achieve their academic goals. To that end, the WaLC’s peer educators have found some creative ways to use AI to help students. In one case, a recreational therapy major struggled to use person-first language when discussing a patient’s disability. With the help of AI, they curated a vocabulary list of words and phrases that expanded the student’s repertoire and helped them become a more confident writer. In another case, a student could not make sense of their instructor’s track changes and suggestions on a first draft because no additional comments were provided. They asked AI why the paper was marked up the way it was and concluded that much of their writing had been repetitive.

Sometimes, students do come into the WaLC with AI-generated writing. Peer educators are trained to approach those conversations much as they approach conversations about suspected plagiarism, beginning with open-ended, non-judgmental questions: “Do you know how to cite?” or “Did you forget to add attribution here?” These questions help determine whether the plagiarism occurred unintentionally, due to a lack of knowledge. Likewise, when a peer educator suspects AI use, they ask probing questions to find out if and how the student used AI. Peer educators always advise students to adhere to the instructor’s policy, and they remind students that if a peer educator notices AI use, their professor will likely notice it too.  

If you have questions about the WaLC’s AI policy, feel free to reach out to Haylee Melton at wilkieh@wcu.edu; if you would like support on how to approach your own AI policy, reach out to Coulter Faculty Commons at cfc@wcu.edu.

NEW: AI Ethics Course Module

The CFC is launching a module that instructors can import into their Canvas courses. The goal of the module is to help students learn the importance of using AI ethically in their college studies. It is self-contained and intended to be customized by each instructor to fit their teaching needs. Instructions are included in unpublished pages of the module.  

If you are interested in piloting the module, import the “AI Course Ethics Module” from Canvas Commons. For step-by-step instructions, review Importing a Resource from Canvas Commons. 

The CFC would like to extend its thanks to Haylee Melton,
Associate Director of the Writing and Learning Commons, for collaborating on this article.

Opportunities to Learn about Artificial Intelligence

AI in the WCU Classroom

Join colleagues for an open discussion on using AI in teaching and learning on Tuesday, Nov 18, 3:30-5:00 pm. This two-part program will include sharing strategies for integrating generative AI into instruction and course management, as well as approaches for setting clear expectations around ethical student use. We’ll also consider how to design learning goals that emphasize key human skills. For more information and to register, read the blog post.

UNC System Pilot: Student AI Literacy

The UNC System Office is piloting a new AI Foundational Skills program, developed collaboratively by faculty, librarians, and instructional designers across the System, in Spring 2026 to strengthen student AI literacy and workforce readiness. This collaborative initiative gives students practical AI experience and critical evaluation skills while connecting them with industry perspectives. We’re seeking faculty to integrate the training into their courses and provide feedback, which will help refine the program. Faculty will receive a stipend. Interested? Complete this interest form to receive more information. Questions? Contact Dr. Heather McCullough, Director, Learning Technology and Open Education, at hamccullough@northcarolina.edu.

Using Artificial Intelligence to Enhance Teaching & Learning

Tuesday, November 18 | 3:30 – 5:00 pm in Hunter Library 156 (CFC)

You’re invited to an open forum for faculty across disciplines to share ideas, challenges, and best practices focused on uses of AI in the classroom. The forum will be structured into two parts. Part 1: Faculty will share ideas regarding how they are using generative AI for teaching innovation and to ease course administration burdens. Part 2: Faculty will share ideas for setting and maintaining clear expectations regarding ethical and responsible student use of AI and the impacts of AI on student learning processes. We will also reflect on how faculty and students can set competency-based learning goals that emphasize distinctive human skills. Come ready to share your ideas and to learn from and be inspired by others.

Everyday Academic Integrity: Virtual Events for Educators

The International Center for Academic Integrity (ICAI) seeks to cultivate cultures of integrity in academic communities throughout the world. If you are interested in connecting and learning about academic integrity from other higher education professionals and institutions, the center is offering the following virtual events.

  • October 1: The International Day of Action for Academic Integrity Pre-Event is “designed to spark ideas and showcase global perspectives.”
  • October 15: The International Day of Action for Academic Integrity is a full-day of programming focused on academic integrity across institutions and cultures. Please note time zone differences.

Find more information and registration links to both events on the International Day of Action for Academic Integrity website.

  • October 24: The Southeastern ICAI Fall Virtual Conference is themed It Takes a Village: Holistic Education in the Era of AI. Join practitioners, students, and administrators to explore how AI is shaping education.

Find more information and the registration link for the Southeastern Conference on the International Day of Action for Academic Integrity Regional Conference website.