27:12 Jackson: Miles, as we wrap up our conversation, I keep thinking about what testing and assessment might look like in the future. We've covered so many of the problems with current approaches—the bias, the anxiety, the way measurements can corrupt what they're trying to measure. But I'm curious about your vision for how we might do this better.
27:30 Miles: You know, Jackson, I'm actually pretty optimistic about where we're headed, despite all the challenges we've discussed. I think we're moving toward more holistic, authentic, and humane approaches to assessment that could address many of the problems we've talked about.
27:43 Jackson: What does that look like in practice? I mean, we still need ways to evaluate learning, make hiring decisions, and ensure quality in various fields. We can't just abandon measurement altogether.
27:54 Miles: Absolutely not! But I think the future of assessment is going to be much more personalized, continuous, and integrated into the learning and working process itself. Instead of these artificial, high-stakes testing events, we're seeing movement toward portfolio-based assessment, real-world problem-solving, and ongoing feedback loops.
28:13 Jackson: That sounds so much more aligned with how people actually learn and grow. Instead of cramming for a test and then forgetting everything afterward, you'd be constantly demonstrating and developing your capabilities through meaningful work.
28:30 Miles: Exactly! And technology is going to play a huge role in making this possible, but in a much more sophisticated way than just digitizing traditional tests. We're starting to see AI systems that can analyze complex work products, provide personalized feedback, and track learning progress over time without reducing everything to simple scores.
28:44 Jackson: That's fascinating! So instead of taking a test about project management, you might actually manage a real project and have an AI system analyze your decision-making process, communication skills, and problem-solving approach?
28:56 Miles: Right! And the beautiful thing about that approach is that it eliminates the artificial separation between learning and assessment. You're not studying for a test about project management—you're actually doing project management and getting continuous feedback that helps you improve.
29:10 Jackson: And I imagine that would be much less stressful and more engaging than traditional testing. When assessment is integrated into meaningful work, it doesn't feel like you're being judged—it feels like you're being supported in your growth.
29:28 Miles: Absolutely! And this connects to something really important about intrinsic versus extrinsic motivation. When people are working on projects they find meaningful and getting feedback that helps them improve, they're much more likely to be genuinely engaged rather than just trying to game the system or avoid punishment.
30:31 Jackson: What about the fairness and bias issues we discussed? How do these new approaches address those concerns?
30:36 Miles: That's a great question, and I think there are several ways these approaches could be more equitable. First, by focusing on actual performance rather than proxy measures, you reduce the advantage that comes from test-taking skills or cultural familiarity with specific question formats. Second, by using multiple forms of evidence over time, you get a much richer and more accurate picture of someone's capabilities.
30:57 Jackson: And presumably, if assessment is happening continuously in real-world contexts, it's less likely to be thrown off by having a bad day or being in an artificial testing environment that doesn't suit your learning style.
31:08 Miles: Exactly! Plus, when assessment is more transparent and ongoing, people have more opportunities to understand what's being measured and how they can improve. Instead of getting a mysterious score weeks after taking a test, you're getting real-time feedback that you can actually use.
31:22 Jackson: This all sounds incredibly promising, but I imagine there are still challenges to work through. How do you ensure consistency and fairness when assessments are more individualized and context-dependent?
31:32 Miles: You're absolutely right that this creates new challenges. We'll need to develop more sophisticated ways of ensuring that different assessment approaches are measuring comparable skills and that evaluators are calibrated appropriately. But I think these are solvable problems, especially as we get better at using technology to support human judgment rather than replace it.
31:50 Jackson: And what about the cultural shift that would be required? So much of our educational and professional culture is built around traditional testing approaches. How do we help people—students, teachers, employers, parents—adapt to these new ways of thinking about assessment?
32:03 Miles: That's probably the biggest challenge, honestly. Change is always difficult, especially when it involves something as fundamental as how we measure and validate learning and competence. But I think we're already seeing early adopters who are demonstrating the benefits of these approaches, and that success will gradually influence broader adoption.
32:19 Jackson: Well, Miles, this has been such a thought-provoking conversation. I feel like we've really explored the complexity of testing—both its problems and its potential. For our listeners who want to continue learning about this topic, what would you recommend?
32:32 Miles: I'd encourage people to pay attention to how assessment affects their own learning and performance, and to advocate for approaches that focus on growth and authentic demonstration of capabilities. Whether you're a student, educator, or professional, you have the power to influence how assessment happens in your own context. And I think the more we can shift toward assessment approaches that support learning rather than just measuring it, the better off we'll all be.
32:54 Jackson: Absolutely! And to everyone listening, we'd love to hear about your own experiences with testing and assessment. What approaches have worked well for you? What changes would you like to see? Thanks for joining us on this exploration of how we measure what matters, and we'll see you next time!