The Controlled Drill: Ethical Hacking the Right Way

10:55 Lena: Okay, Miles, let’s talk about the "Red Team" side—Ethical Hacking. This is the part that usually gets people excited. It’s the "cool" side of cyber, right? But I know it’s also the side where beginners can accidentally cross a line. How do we approach this "underwater" perspective without becoming the very threat we’re trying to stop?
11:15 Miles: It’s all about the "Ethical" part of the title. The biggest mistake beginners make is thinking that because they’re "learning," it’s okay to scan their school’s network or try a basic exploit on a random website. In 2026, that will get you a knock on the door from the authorities faster than you can say "SQL injection." The difference between a criminal and an ethical hacker isn't the tools—it’s permission. You only touch systems you own, like your home lab, or systems where you have explicit, written authorization.
11:44 Lena: Right, like the "supervised drill" in the pool. You’re practicing the rescue hold, but everyone knows it’s a drill. So, if I’m in my lab and I have my Kali Linux machine ready, what’s the actual workflow? It’s not just "pressing a button and you’re in," is it?
11:59 Miles: Not even close. Real ethical hacking is about 90% preparation and 10% action. We follow a very specific sequence: Reconnaissance, Scanning, Exploitation, Post-Exploitation, and Reporting. Think of it like a heist movie. You don't just run into the bank. You spend weeks watching the guards, checking the locks, finding the weak points. In cyber, that means using tools like Nmap to see which "doors" or ports are open on a server. Then you use a vulnerability scanner to see if any of those doors have "old locks" that haven't been patched.
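The "checking which doors are open" step Miles describes can be sketched in a few lines. This is a minimal TCP connect scan—the same basic idea behind Nmap's `-sT` scan—written as an illustration, not a replacement for Nmap. Only ever point it at machines you own (like `127.0.0.1` in your home lab) or have written permission to test.

```python
# Minimal sketch of the "Scanning" phase: a TCP connect scan.
# Run ONLY against hosts you own or have explicit written authorization to test.
import socket

def scan_ports(host: str, ports: list[int], timeout: float = 0.5) -> list[int]:
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    # Probe a few well-known ports on your own machine
    print(scan_ports("127.0.0.1", [22, 80, 443, 8080]))
```

A real engagement would follow this with a vulnerability scanner to check whether the services behind those open ports are running unpatched versions—the "old locks" in Miles's analogy.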
12:29 Lena: And that’s where the "Exploitation" part comes in—actually using that weakness to get inside?
12:32 Miles: Exactly. But here’s the thing that surprises most people: once you’re "inside," you don't just start deleting files. As an ethical hacker, your job is to prove the impact. You might try to "pivot" to another machine on the network to show how far an attacker could go. Or you might try to access a dummy "sensitive" file to show that the data is at risk. But you treat that data like it’s a medical record—you don't touch it, you don't share it, and you keep your exposure to a minimum.
13:03 Lena: And then the most important part—the part that actually gets you paid—is the report.
13:09 Miles: You nailed it. If you can’t explain what you found, why it matters, and how to fix it, you’re just a hobbyist. In 2026, organizations are looking for "professional literacy." They want people who can translate technical flaws into business risk. If you can tell a CEO, "I found a way to bypass your login because of a misconfigured API, and here’s a three-step plan to fix it," you are incredibly valuable. That’s why programs like Nucamp’s bootcamp or the OSCP certification put so much emphasis on the reporting side. It’s the bridge between the "underwater" world and the "lifeguard stand."
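The "translate technical flaws into business risk" step can be made concrete with a small sketch. The field names and the example finding below are purely illustrative (not from OSCP or any specific reporting standard); the point is that the technical detail, the business impact, and the fix plan travel together in every finding.

```python
# Hedged sketch of the reporting step: one finding, structured so a
# non-technical reader gets impact and remediation in the same place.
# Field names and content are illustrative, not from any real engagement.
finding = {
    "title": "Authentication bypass via misconfigured API",
    "severity": "Critical",
    "business_impact": "Any visitor can reach customer account data without logging in.",
    "remediation": [
        "Require authentication middleware on every /api/ route",
        "Rotate any credentials exposed during testing",
        "Add a regression test that unauthenticated requests return 401",
    ],
}

def render_finding(f: dict) -> str:
    """Render one finding as the short executive summary a CEO would read."""
    steps = "\n".join(f"  {i}. {step}" for i, step in enumerate(f["remediation"], 1))
    return (
        f"[{f['severity']}] {f['title']}\n"
        f"Impact: {f['business_impact']}\n"
        f"Fix plan:\n{steps}"
    )

print(render_finding(finding))
```

Notice the structure mirrors Miles's example sentence: what you found, why it matters to the business, and a numbered plan to fix it.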
13:42 Lena: It’s interesting how much these two sides—Red and Blue—actually need each other. I was reading about "Purple Teaming," which sounds like a mix of the two. Is that where the industry is heading?
13:53 Miles: It absolutely is. Purple Teaming is where the Red Team and the Blue Team stop treating each other like enemies and start working together. Instead of the Red Team doing a "secret" attack and the Blue Team trying to catch them, they sit in the same room. The Red Team says, "I’m going to try this specific technique now," and the Blue Team watches their monitors to see if it shows up. If it doesn't, they figure out why. "Oh, our firewall isn't logging this specific type of traffic—let's fix that." It’s an incredibly fast way to improve security. In 2026, being a "Purple Team" thinker—someone who understands both the attack and the defense—is a huge career differentiator.
14:32 Lena: It sounds like a much more collaborative way to work. But for someone just starting their 12-month roadmap, they probably aren't doing full Purple Team exercises yet. They’re still in that "Phase 3" of choosing a focus. Does it matter which one you pick first?
14:48 Miles: Not really, as long as you stay grounded in the fundamentals. Some people are naturally drawn to the "detective" work of the Blue Team—they love sifting through data and finding that one "needle in the haystack." Others love the "puzzle-solving" aspect of the Red Team—finding that one tiny crack in a system and seeing how far it goes. Both paths are wide open in 2026. Ransomware alone is causing billions of dollars in damage every month, and it’s projected that a ransomware attack will hit somewhere on the internet every two seconds by 2031. Whether you’re stopping those attacks or finding the holes before they’re exploited, you’re doing vital work.
15:23 Lena: The stakes are just so high. It’s not just about "tech" anymore; it’s about business survival. Which I guess explains why the salaries we were looking at earlier are so significant. If you’re the one keeping a $4.9 million data breach from happening, you’re a hero.
15:39 Miles: You really are. And that’s the "Job-Ready" part of the roadmap. Once you’ve built the skills and practiced the drills, you have to prove it to the world. That’s where certifications and your portfolio come into play. But before we get to the "hiring" part, I think we should talk about the specific tools that everyone keeps hearing about. You mentioned Kali and Wireshark—let's pull back the curtain on those.