Reviewing the Practical Network Penetration Tester (PNPT) Course Pt. 3

Previously I covered the initial sections of the Practical Ethical Hacking — The Complete Course: the Before We Begin and Introduction sections. Last week I covered three sections: Notekeeping, Network Refresher, and Setting Up Our Lab. That review touched on the importance good note-keeping has when you’re learning material, documenting findings, compiling post-test reports, and referring back to previous tests you’ve conducted. It also discussed several fundamental topics of network infrastructure such as IP addresses, the OSI Model, common TCP/UDP ports, and subnetting. Additionally, the Setting Up Our Lab section walked us through how to properly download all the necessary programs and files to run a Virtual Machine (VM) that will be used throughout the course, and ensured our VM was properly configured and updated so the practice and labs run smoothly.

Let’s dive into the Practical Ethical Hacking — The Complete Course: The Ethical Hacker Methodology and Reconnaissance sections.

This week’s material starts off by explaining the typical methodology and process an ethical hacker might follow when working through a testing engagement. For those who aren’t familiar with the penetration testing / ethical hacking process, covering the stages of a typical engagement is incredibly valuable. A typical engagement usually moves through five key phases: Reconnaissance, Scanning & Enumeration, Gaining Access, Maintaining Access, and Covering Tracks. The majority of testing engagements kick off with the Reconnaissance phase, a combination of active and passive surface-level information gathering about your target. Information is typically gathered through tools such as WHOIS and NSLookup for validating your target, Nmap and Sublist3r for finding subdomains, BuiltWith and Nmap for server/application fingerprinting, and HaveIBeenPwned and Breach Parse for finding information from previous data breaches. Whether you’re conducting a physical on-site test of a company’s security procedures or performing a test on an online application, there is always information you can learn that will make your testing experience much smoother and reduce difficulties later on.
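The course uses tools like the whois CLI directly rather than scripting them, but as a rough sketch of my own (not from the course) of what the target-validation step can look like when automated: the code below shells out to whois and pulls a couple of fields out of the raw record. The specific field labels parsed ("Registrar", "Name Server") are assumptions about typical WHOIS output and vary by registry.

```python
import subprocess

def whois_lookup(domain):
    """Shell out to the whois CLI (must be installed) and return its raw output."""
    return subprocess.run(["whois", domain], capture_output=True, text=True).stdout

def parse_whois(text):
    """Pull a few validation fields (registrar, name servers) out of raw WHOIS text."""
    registrar, name_servers = None, []
    for line in text.splitlines():
        key, _, value = line.partition(":")
        key, value = key.strip().lower(), value.strip()
        if key == "registrar" and value:
            registrar = value
        elif key == "name server" and value:
            name_servers.append(value.lower())
    return {"registrar": registrar, "name_servers": name_servers}

# Against a live, in-scope target you would run:
#   parse_whois(whois_lookup("target-domain.com"))
# Demo on a canned record so nothing touches the network:
sample = """Registrar: Example Registrar LLC
Name Server: NS1.EXAMPLE.NET
Name Server: NS2.EXAMPLE.NET"""
print(parse_whois(sample))
```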

A great quote from Heath delivered during the module was:

The better scanning and enumeration you can do, and the better information gathering you can do, the better hacker you’re going to be, and the better you’re going to be at your job.

To start putting this methodology into practice, we move into the course module dedicated to the first stage of the Ethical Hacking methodology: Information Gathering.

To make the course more relatable and easier to follow along with, Heath demonstrates the reconnaissance methodology on Tesla (via their authorized BugCrowd Bug Bounty Program). When practicing these skills outside of isolated or dedicated test environments (i.e. VMs or dedicated practice websites), it is important to follow the guidelines set forth by the company. Bug Bounty Programs are a great way to practice and hone your skills, but knowing which items are in scope and can be tested ensures you’re performing your scans and tests legally.

Starting out the module, Heath demonstrates different tools that can be utilized to perform Open-Source Intelligence (OSINT) gathering. To discover email addresses, online resources such as Hunter.io, Phonebook.cz, and EmailHippo can be used. Once you’ve identified email addresses associated with your target, you can start trying to locate passwords in a similar manner. Breach Parse, a tool designed by Heath, does an excellent job of combining and parsing email credentials contained in previous breaches such as the RockYou hack, allowing the user to search through lists of information and identify associated passwords that could be used to gain access to the application being tested. Next, we move on to gathering information for web applications.
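Breach Parse itself is a shell tool, but the core idea — filtering huge `email:password` dump files down to one target domain — is simple to sketch. Here is a minimal Python illustration of that idea (my own sketch, with made-up sample data; not the tool’s actual implementation):

```python
def find_credentials(dump_lines, domain):
    """Filter 'email:password' breach-dump lines down to a target domain,
    keeping only well-formed entries (a colon separator and a matching email)."""
    hits = {}
    suffix = "@" + domain.lower()
    for line in dump_lines:
        email, sep, password = line.strip().partition(":")
        if sep and password and email.lower().endswith(suffix):
            hits[email.lower()] = password
    return hits

# Hypothetical dump excerpt for a made-up target domain:
leaked = [
    "alice@target-corp.com:Summer2021!",
    "bob@other.org:hunter2",
    "carol@target-corp.com:P@ssw0rd",
    "malformed-line-no-colon",
]
print(find_credentials(leaked, "target-corp.com"))
```

Real dump files are far messier than this, which is exactly why a dedicated tool like Breach Parse is worth using.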

Gathering and enumerating information for these sites is useful not only for external tests, but can be incredibly useful for internally-hosted sites as well. This information can be gathered through both passive and active searching. Subdomains are incredibly valuable to find because they are often used for internal employee access, developmental versions of the site, or resources that weren’t meant to be accessed by the general public. These subdomains expand your attack surface, giving you more opportunities to find vulnerabilities. Sublist3r, a tool built into Kali Linux, is incredibly useful for finding subdomains across a large number of common search engines and online resources. Additionally, crt.sh is an online resource that can find subdomains via certificate fingerprinting, by searching for certificates that have been registered by the target. A couple of other popular tools suggested by Heath are the OWASP Amass Project and TomNomNom’s httprobe, though they do not get used within the course.

Moving on from application subdomains, we start to look at identifying what tools and technologies were used to build the target site, and a great online tool to do this with is BuiltWith. The information returned by this tool gives us a better idea of what tech is being used by the website, such as the different widgets, analytics platforms, supported languages, CDNs and CMSs, and even the underlying frameworks. An additional browser plug-in that can be useful in learning about a website’s makeup is Wappalyzer. A great feature of this plug-in is the additional information it provides, such as version numbers and more detailed, higher-confidence component information. For locally-run tools, WhatWeb can return information similar to the online tools, while potentially providing more insight. These resources can be used alone or in conjunction to help build a detailed profile of a target website and aid in the identification of vulnerabilities. Additionally, Heath covers the information that can be provided by the web proxy tool Burp Suite, which will get a larger deep-dive in a later module of the course.
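Much of what these fingerprinting tools report starts with clues a server volunteers in its HTTP response headers. As a toy illustration of that idea (my own sketch; the heuristics and sample header values below are illustrative, not how BuiltWith or Wappalyzer actually work internally):

```python
def fingerprint_headers(headers):
    """Derive very rough technology hints from HTTP response headers."""
    hints = []
    server = headers.get("Server", "")
    powered = headers.get("X-Powered-By", "")
    if server:
        hints.append(f"web server: {server}")
    if powered:
        hints.append(f"platform: {powered}")
    if "PHPSESSID" in headers.get("Set-Cookie", ""):
        hints.append("language: PHP (session cookie)")
    return hints

# Hypothetical response headers from a target site:
print(fingerprint_headers({
    "Server": "nginx/1.18.0",
    "X-Powered-By": "Express",
    "Set-Cookie": "sessionid=abc123",
}))
```

Real fingerprinting tools go much further, matching page content, script paths, and cookie names against large signature databases, which is why their results come with confidence levels.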

A typically undervalued but incredibly useful resource for information gathering is Google (I don’t think a hyperlink is necessary here). Often called Google Fu, the available search operators can be used to narrow down results, find subdomains, and identify different file types hosted on a target’s domains; it’s surprising what kinds of results you get when you apply a bit of Google Fu.
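Those operators are just strings, so the queries are easy to build programmatically. A small sketch of my own composing the two classic recon dorks mentioned above: `site:` scoped to the target (minus the main `www` site, to surface subdomains) and `filetype:` to hunt for exposed documents:

```python
def build_dork(domain, filetypes=None, exclude_www=False):
    """Compose a Google search query ("dork") scoped to a target domain."""
    parts = [f"site:{domain}"]
    if exclude_www:
        parts.append(f"-site:www.{domain}")
    if filetypes:
        parts.append("(" + " OR ".join(f"filetype:{ft}" for ft in filetypes) + ")")
    return " ".join(parts)

# Surface subdomains by excluding the main www site:
print(build_dork("example.com", exclude_www=True))
# -> site:example.com -site:www.example.com

# Hunt for exposed documents:
print(build_dork("example.com", filetypes=["pdf", "xlsx"]))
# -> site:example.com (filetype:pdf OR filetype:xlsx)
```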

Another great quote from Heath, one that I think applies to more than just this module, this course, or even the information security industry as a whole, is:

Before you ask anybody a question, no matter how complex, I challenge you to Google it first. Make sure you have done your research, then ask somebody.

Before finishing out the Reconnaissance section of the course, it makes sense to highlight the influence social media has on the world around us, and the immense amount of information it can provide. Social media can be a great way to find things like photos of badges, corporate structure information, software used by the company, and the people who work for the company, and it can be used in conjunction with other research tools to provide a more in-depth picture or additional pathways for identifying vulnerabilities and potential exploits. It’s important to remember that people are the weakest point of an organization. People are typically lazy, and it’s much easier to compromise a person, or exploit a person’s weakness (such as a weak password), than it is to compromise software or a website.

Moving on from this section, we begin the more technical aspect of the course, scanning and enumeration, and dive into real hacking. While we are moving away from the information gathering process, many of the techniques and resources we’ve covered, as well as the information we’ve obtained, will continue to be important and used throughout the remainder of the course.

Weekly Wrap-Up

As I continue working through the first of four courses to prepare for the PNPT exam, I’m incredibly impressed with the material and with how the TCM team and Heath explain what could easily become overwhelming or lose people’s interest. I look forward to exploring the next set of subjects and the next stages of the penetration testing process.

Again, keep an eye out on Twitter for day-to-day updates and information about my BuyMeACoffee giveaway goal, any upcoming Twitter Spaces or podcasts featuring other infosec community members, Cyber Security Awareness Month giveaways, as well as opportunities for you to get involved in the community.

For those new to the community or interested in joining it, throughout the week members of the InfoSec Twitter community participate in unofficial conversations and events such as #CyberMentoringMonday and #FF where you can find and connect with some incredible people and strengthen your infosec network.
