Educators Speak: Rethinking Assessment in Light of AI

Brett Christie, Ph.D., VP for Educational Innovations, Alchemy
Zach Justus, Ph.D., Director of Faculty Development, CSU Chico

On August 28, 2025, nearly 500 higher education professionals gathered with us for the Alchemy-hosted webinar “Rethinking Assessment in Light of AI.” With over 1,000 registrants, this session reflected widespread interest in navigating how AI is reshaping the educational assessment landscape. As the session description noted:

“AI is rapidly changing how students may complete assignments, challenging educators to rethink not just what they assess, but how they assess it.”


The webinar did more than explore emerging frameworks. It gave participants the opportunity to reflect, share, and connect over shared struggles and successes. Through a pre-webinar poll and an interactive Padlet, we gained insight not only into how far educators have come, but also the shape their journeys are taking.

Snapshot: Where Are We on the Journey?

Prior to the webinar, 709 participants responded to the poll question “Your Current Approach to Assessment in Light of AI”:

48% said they’ve made targeted updates to specific assignments
28% reported having significantly redesigned multiple assessments
16% indicated no or minimal changes yet
8% shared that AI is integrated into nearly all assessments

This distribution illustrates a clear movement away from the status quo, with 84% of participants reporting at least some level of adaptation. But numbers only tell part of the story.

Stories from the Field: What Educators Are Doing

To give participants space for reflection and sharing with one another, we offered a Padlet prompt: “To What Extent Have Your Assessment Practices Changed in Light of AI?” Their responses were honest, diverse, and often courageous. They illustrated everything from early experimentation to full course redesigns, and showed that many of these educators are not only responding to AI, but actively shaping new norms around assessment.

Following are key themes that emerged from the Padlet columns, each of which corresponded to a different level or category of change:

Minimal Change, Cautious Observation

Some faculty are still finding their footing, opting for observation and reflection rather than action. This is often due to institutional uncertainty or lack of educator time or capacity.

“I haven’t changed anything yet. I want to see what others are doing and how students are responding to AI before I make a move.”

“I’ve made small tweaks, more in my own mindset than in assignment design, but no major changes yet.”


This group sees the terrain shifting but remains in a diagnostic phase, watching and listening before engaging in redesign.

Targeted Revisions to Assignments

The most commonly reported level of change involved making selective updates to individual assignments.

“I modified my essay prompts to include more reflective components and personalized examples so AI output would be less effective on its own.”

“I now ask students to submit a process log alongside their final work to show how their thinking evolved.”


This kind of redesign reflects an emerging strategy: focus on making student thinking visible, not just the end product.

Redesigning Assessments at Scale

Others have moved well beyond tinkering and are reimagining entire assessment strategies.

“I revamped my final exam into a portfolio of iterative assignments that include checkpoints for AI use, peer feedback, and self-assessment.”

“In my business course, I’ve shifted from multiple-choice tests to real-world case analyses that include scenario role-play and reflection on ethical use of AI.”


These examples signal a paradigm shift from assessment-as-judgment to assessment as a learning process. This is yet another area where AI has exposed a system of evaluation that was already not working well, compelling us to examine the intent behind our practices rather than simply relying on what we have always done.

Fostering Critical Engagement and Future-Ready Skills

In this column, participants shared how they are no longer just adjusting around AI, but intentionally integrating it into their course design to enhance learning. Faculty described intentional uses of AI that promote transparency, reflection, and deeper engagement with course content. Examples included:

“Students must describe how they used AI and compare it to their final work.”

“I ask students to submit AI-generated work alongside their own and explain differences.”

“AI is allowed for brainstorming and outlining, but final submissions must show their independent thinking.”


Across these examples, instructors are guiding students to treat AI as a tool to engage with critically, not just use passively. The goal is to build AI literacy that extends beyond the classroom and into professional contexts, where ethical and effective AI use is increasingly essential.

These faculty aren’t waiting for policy or precedent; they’re enabling real-world readiness through principled experimentation. One of the powerful features of this webinar was its cross-institutional and international reach. So much of what we hear about AI is framed in terms of geopolitical rivalry, but this webinar was a reminder that, around the world and in every conceivable type of institution, we are all engaged in similar work, and our challenges significantly overlap.

Reframing Assessment Philosophy

A final theme emerged from educators who are shifting not just their practices, but their underlying beliefs about assessment.

“I’m realizing that AI is exposing how many of my assessments were already ineffective. The redesign is long overdue.”

“My goal now is to create assessments that can’t be completed meaningfully without personal insight, synthesis, or judgment.”


These reflections show educators using AI as a catalyst for broader pedagogical reflection, treating this moment as an opportunity to renew their courses around more clearly articulated instructional values.

Conclusion: The Journey Is Underway

The community Padlet responses reveal that the higher ed community is neither paralyzed nor panicked. In the face of limited institutional direction, many educators are engaging in iterative, collaborative, and reflective shifts in practice.

While the opening poll provided a numerical snapshot of where people are, the Padlet offered rich qualitative texture, showing that faculty are:

  • Creating more resilient assessments that emphasize process, not just product
  • Integrating AI critique and reflection into student tasks
  • Replacing high-stakes tests with low-stakes, feedback-rich activities
  • Developing shared language and rubrics across programs
  • Embracing a mindset of growth rather than surveillance and detection

The Padlet responses reflected the real-time energy of the webinar chat, where faculty not only engaged with the session content but also shared concrete strategies, tools, and questions with one another. Across both formats, the variety of contributions highlighted a common thread: faculty are not waiting for institutional directives. They are stepping forward on their own terms, redesigning assignments, trying new approaches, and reflecting on what works. This active engagement stands in clear contrast to popular narratives that often portray faculty as hesitant or reactive in the face of rapidly evolving AI tools.

Whether taking small steps or major leaps, these educators are not waiting for perfect policies or guidance. They’re building a path forward, one assignment at a time. 

Check out the full webinar recap and recording to learn more. 
