Written assignments used to feel like a window into student thinking. Today, they’re just as often a window into how well students can prompt an AI.
That tension sat at the center of our recent webinar, AI-Resilient Assessment at Scale, where Breakout co-founder Steve Walters was joined by Dr. Olav Sorenson (UCLA Anderson) and Dr. Nicole Jones Young (Franklin & Marshall College). Together, they explored a simple question with big implications: How do we assess what students actually know in an age where AI can ace the written work?
Across the hour, a pattern emerged. Instructors aren’t trying to ban AI. They’re trying to re-center assessment around live thinking, discussion, and accountability — and they’re doing it by replacing written work with oral assessments and structured small-group conversations.
If you don’t have time to watch the full webinar recording, this recap pulls out key ideas from the session — plus short clips you can watch when you only have a few minutes.
The Problem: Written Assignments Are Broken
Students are less prepared, more reliant on tools like ChatGPT, and written work no longer offers a clear signal of understanding. When students can click “create” with little thought, traditional written assignments stop being the most effective way to assess critical thinking.
What Now: From Threaded Boards to Live Oral Discussions
Most “discussion boards” have drifted into a familiar pattern: students type something once, maybe skim a peer or two, and increasingly rely on AI to do the heavy lifting. Breakout flips that model by turning the same prompts and materials into live, small-group conversations grounded in the instructor’s existing content. Instead of posting and disappearing, students have to show up, talk to each other, and demonstrate understanding in real time.
Under the Hood: Rubrics & Feedback
Breakout’s Discussion Quality Rubrics measure what actually matters in a conversation: participation, peer engagement, critical thinking, evidence use, and more. Students get targeted feedback on how they contributed, while instructors see rich aggregate and individual data that can feed directly into grading and class planning. With Advanced Insights, faculty can even query their discussions — asking questions like “Who spoke the most?” or “What were the main themes in Group 3?” — plus surface moments worth revisiting.
Case Study: How Olav Turned His Venture Capital Midterm into an Oral Practice Exam
In Dr. Olav Sorenson’s venture capital course, traditional take-home midterms and short written answers no longer felt viable in a ChatGPT world. He rebuilt part of his assessment strategy by offering a practice oral exam on Breakout, where students talk through their valuation reasoning step by step instead of just submitting a final number. Students who chose to use the practice tool performed a full standard deviation better on the actual exam.
The Power of Saying It Out Loud: Social Pressure and Deeper Prep
Speaking answers out loud changes how students prepare. One of Olav’s students described feeling “social pressure” to truly understand the material when walking through their reasoning verbally, as if talking to a real person, rather than quietly working a problem set. When students know they’ll be in a small-group discussion, they spend more time with the material — not because of points alone, but because they don’t want to be the one who shows up with nothing meaningful to contribute.
Designing Rubrics: Let AI Grade Like a Good Professor
Olav walks through how he built rubrics that don’t just check if students got the “right” answer, but credit the key steps in their reasoning. By layering process-based criteria into his existing problems and answer keys, he’s able to let AI reliably evaluate both outcome and approach, giving students clear insight into where their thinking went off track and effectively “tutoring at scale” for large classes.
Case Study: How Nicole Moved Beyond Discussion Boards
Dr. Nicole Jones Young’s courses used to rely heavily on traditional discussion boards. She shifted to Breakout by pairing multimedia content — case studies, films like Fyre Festival and American Factory, and short videos — with custom questions that students explore in small-group discussions. Pre-quizzes and Breakout modules ensure students actually engage with the material before class, and the resulting data gives her a clear sense of where to lean in during the in-person debrief.
What Students Actually Think: Engagement and Transparency
Students aren’t just tolerating the change — they’re asking to keep it. Nicole’s surveys show strong ratings for Breakout assignments, and when she asks whether they want to go back to static PDFs, the answer is a decisive no. Because students can see how they were evaluated, and even reflect on what the AI may have missed, they feel the process is transparent and fair — and many are comfortable having Breakout assessments count toward their grade.
Beyond Single Assignments: NextBook and Full Course Design
Once discussion and oral assessment are in place for a few assignments, the next question is what a whole course could look like with those elements at the center. Three patterns are emerging across institutions: swapping in Breakout modules for existing assignments, using the Breakout library of cases and simulations, and designing full “NextBook” course experiences built around dynamic pre-work and discussion. In one Trailhead-style course at Michigan State University, that approach led to meaningful jumps in engagement, comfort in group discussion, and confidence in collaboration.
Getting Started: Low-Lift Pilots and Design Support
None of this has to start with a complete course redesign. Nicole describes how she began by piloting Breakout in a single course, using materials she already had, and working with Breakout’s instructional design team to shape questions and rubrics. A single vulnerable assignment or discussion board is enough for a first step, and the Breakout team can help with the heavy lifting so faculty can focus on the pedagogy, not the plumbing.
The throughline of AI-Resilient Assessment at Scale wasn’t “AI is bad.” It was that assessment has to catch up.
When written work can be outsourced to a chatbot in seconds, the most valuable signals come from what students do in the moment: how they explain their reasoning, respond to peers, defend a position, and apply concepts out loud. That’s what oral assessments and structured discussions are built to surface.
Olav and Nicole showed that you don’t need to redesign your entire course to move in this direction. You can start with one vulnerable assignment, one discussion board, or one practice exam — and use tools like Breakout to handle the logistics, feedback, and data.
Curious about replacing a written assignment with an AI-resilient small-group discussion or oral assessment? Connect with the Breakout Learning team to see a live walkthrough!