26:04 Jackson: Alright, Miles, we've covered a lot of ground here—from methodologies to psychology to culture. But I know our listeners are probably thinking, "This all sounds great, but where do I actually start?" Can we put together a practical playbook for someone who wants to improve their testing approach?
26:10 Miles: Absolutely! Let's break this down into actionable steps that people can actually implement, whether they're just getting started with testing or looking to level up their existing practices.
26:30 Jackson: Perfect! So where should someone begin if they're looking at their current testing situation and thinking it needs improvement?
26:37 Miles: The first step is always assessment. You need to understand where you are before you can figure out where you're going. I'd recommend starting with three key questions: What are we testing? How are we testing it? And what are we missing?
26:49 Jackson: Those seem deceptively simple, but I bet the answers can be pretty revealing.
26:59 Miles: Exactly! For the "what" question, create an inventory of all your current test cases and map them to your requirements or user stories. You might discover gaps where critical functionality isn't being tested at all, or areas where you have redundant tests that aren't adding much value.
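The inventory-and-gap check Miles describes can be sketched in a few lines of Python. Everything below is hypothetical for illustration: the story IDs, test names, and the idea of mapping tests to stories in a plain dictionary are assumptions, not a real project's data.

```python
from collections import Counter

# Each test case mapped to the user stories it covers (hypothetical data).
test_inventory = {
    "test_login_success": ["US-101"],
    "test_login_bad_password": ["US-101"],
    "test_checkout_total": ["US-203"],
}

all_stories = {"US-101", "US-203", "US-204", "US-305"}

covered = {story for stories in test_inventory.values() for story in stories}
gaps = all_stories - covered  # critical functionality with no tests at all
print("Untested stories:", sorted(gaps))

# Redundancy hint: stories with several tests may have low-value overlap.
counts = Counter(story for stories in test_inventory.values() for story in stories)
print("Tests per story:", dict(counts))
```

Even a toy mapping like this makes the two failure modes visible at once: stories with zero tests, and stories where many tests pile up on the same behavior.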
27:09 Jackson: And for the "how" question?
27:11 Miles: Look at your current mix of manual versus automated testing, the techniques you're using, and how long your testing cycles take. Are you spending too much time on repetitive manual tests that could be automated? Are you missing opportunities for exploratory testing because everything is scripted?
27:25 Jackson: What about that third question—what are we missing?
27:28 Miles: That's often the most eye-opening one! Look at your production bugs and customer complaints. What types of issues are slipping through your testing? Are they integration problems? Performance issues? Usability problems? This tells you where to focus your improvement efforts.
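Answering "what are we missing?" can be as simple as tallying escaped defects by category. The bug IDs and categories below are made up; the point is the pattern of letting production data pick your focus area.

```python
from collections import Counter

# Hypothetical log of bugs that slipped through to production.
production_bugs = [
    ("BUG-1", "integration"),
    ("BUG-2", "performance"),
    ("BUG-3", "integration"),
    ("BUG-4", "usability"),
    ("BUG-5", "integration"),
]

by_category = Counter(category for _, category in production_bugs)

# The most common escape category is where improvement effort goes first.
focus_area, escaped = by_category.most_common(1)[0]
print(f"Focus first on {focus_area} testing ({escaped} escaped bugs)")
```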
27:42 Jackson: So once you've done this assessment, what's the next step?
27:45 Miles: Pick one area to focus on first. I know it's tempting to try to fix everything at once, but that usually leads to overwhelm and abandonment. Choose the area that will give you the biggest impact with the least disruption—what we call the "low-hanging fruit."
27:58 Jackson: Can you give me an example of what that might look like?
28:05 Miles: Sure! Let's say your assessment reveals that you're spending 80% of your testing time on repetitive regression tests, and bugs are still slipping through to production. You might start by automating your most stable, frequently-run test cases. This frees up time for more exploratory testing while also improving consistency.
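A stable, frequently-run regression check is exactly the kind of test that converts cleanly into a pytest-style function. The `calculate_order_total` function here is a hypothetical stand-in for whatever production code your own regression tests exercise.

```python
def calculate_order_total(items, tax_rate):
    """Stand-in for the production function under test (hypothetical)."""
    subtotal = sum(price * qty for price, qty in items)
    return round(subtotal * (1 + tax_rate), 2)

def test_order_total_with_tax():
    # A stable, frequently-run check: a prime candidate for automation.
    items = [(10.00, 2), (5.50, 1)]  # subtotal 25.50
    assert calculate_order_total(items, tax_rate=0.08) == 27.54
```

Once a check like this runs automatically on every change, the hours it used to take manually go back into exploratory testing.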
28:18 Jackson: That makes sense. What about if someone is starting completely from scratch?
28:21 Miles: If you're starting from zero, I'd recommend beginning with test planning. Before you write a single test case, make sure you understand what you're trying to achieve. What are the biggest risks to your users? What are the most critical user journeys? What would cause the most damage if it broke?
28:36 Jackson: So you're prioritizing based on impact and risk?
28:40 Miles: Exactly! This is where that risk-based testing approach we talked about earlier comes in handy. You might create a simple matrix with probability of failure on one axis and impact of failure on the other. Focus your initial testing efforts on the high-probability, high-impact areas.
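The probability-times-impact matrix can be sketched as a ranking. The feature names and the 1-to-3 scoring scale are assumptions chosen for illustration; any consistent scale works.

```python
# Hypothetical features scored 1 (low) to 3 (high) on each axis.
features = {
    "checkout payment": {"probability": 3, "impact": 3},
    "password reset":   {"probability": 2, "impact": 3},
    "profile avatar":   {"probability": 2, "impact": 1},
}

def risk_score(scores):
    return scores["probability"] * scores["impact"]

# Highest risk score first: that's where initial testing effort goes.
ranked = sorted(features.items(), key=lambda kv: risk_score(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: risk {risk_score(scores)}")
```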
28:54 Jackson: What about tools and technology? When should someone start thinking about automation?
28:58 Miles: Great question! I usually recommend getting your manual testing process solid first. If you automate a bad process, you just get bad results faster. But once you have stable test cases that you're running repeatedly, automation starts making sense.
29:11 Jackson: Any specific advice for choosing automation tools?
29:14 Miles: Start simple and grow gradually. Don't try to boil the ocean with a complex framework right away. Pick a tool that your team can actually learn and maintain. It's better to have a simple automation suite that people use and trust than a sophisticated one that sits on the shelf because it's too complicated.
29:29 Jackson: What about the human side of this transformation? How do you get people on board?
29:33 Miles: That's crucial! Start by involving people in the assessment phase. When team members help identify problems and gaps, they're much more invested in the solutions. Also, celebrate early wins. When that first automated test catches a regression that would have taken hours to find manually, make sure everyone knows about it.
29:50 Jackson: So you're building momentum and buy-in along the way?
29:54 Miles: Exactly! And be transparent about the learning curve. Testing transformation isn't just about new tools and processes—it's about developing new skills and mindsets. Give people time and support to grow into these new approaches.
30:05 Jackson: What about measuring progress? How do you know if your transformation efforts are working?
30:09 Miles: I'd recommend tracking both leading and lagging indicators. Lagging indicators might be things like reduction in production bugs or faster time-to-market. Leading indicators might be test automation coverage, time spent on exploratory testing, or team satisfaction with the testing process.
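Tracking one leading and one lagging indicator side by side can be this simple. All the sprint numbers below are invented for illustration; automation coverage stands in for the leading indicator and escaped production bugs for the lagging one.

```python
# Hypothetical per-sprint metrics.
sprints = [
    {"name": "S1", "automated": 40,  "total_tests": 200, "prod_bugs": 12},
    {"name": "S2", "automated": 70,  "total_tests": 210, "prod_bugs": 9},
    {"name": "S3", "automated": 110, "total_tests": 220, "prod_bugs": 5},
]

for sprint in sprints:
    # Leading indicator: share of the suite that runs automatically.
    coverage = sprint["automated"] / sprint["total_tests"]
    # Lagging indicator: bugs that still escaped to production.
    print(f'{sprint["name"]}: {coverage:.0%} automated, '
          f'{sprint["prod_bugs"]} production bugs')
```

The leading number moves sprint by sprint, so the team sees progress long before the lagging number confirms it.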
30:24 Jackson: Those leading indicators seem really important for staying motivated during the transformation.
30:28 Miles: Absolutely! Transformation takes time, and you need ways to see progress before the final results show up. Plus, leading indicators can help you course-correct if something isn't working.
30:38 Jackson: Any common pitfalls people should watch out for during this transformation?
30:41 Miles: One big one is trying to change too much too fast. Another is focusing only on tools and ignoring the people and process aspects. And definitely avoid the "silver bullet" mentality—there's no single tool or technique that will solve all your testing challenges.
30:55 Jackson: What about sustaining the transformation once you've made progress?
30:58 Miles: Make it part of your regular rhythm. Build testing discussions into your planning meetings, include testing metrics in your regular reviews, and create opportunities for the team to share what they're learning. The goal is to make continuous improvement of your testing approach just part of how you work.
31:13 Jackson: This is really helpful, Miles. It sounds like the key is to be systematic but also patient with the process.
31:17 Miles: That's exactly right! Testing transformation is a journey, not a destination. The goal isn't to achieve perfect testing—it's to continuously get better at understanding and managing quality risks. And remember, every organization is different, so adapt these ideas to fit your specific context and constraints.
31:35 Jackson: I love that perspective. It takes the pressure off trying to get everything perfect right away and focuses on making steady progress.
31:39 Miles: Exactly! And the beautiful thing is that as you get better at testing, you also get better at understanding your software, your users, and your risks. It's an investment that pays dividends far beyond just finding bugs.