Introduction

Last week we looked at Reconnaissance, and mapping the visible elements of an application. This is our first stage in identifying potential attack surfaces and vulnerabilities.

We are going to continue with reconnaissance this week, looking at how we could discover "hidden" elements of a site. We will also see how we can use tools and automation to "fuzz" parameters to infer how user input behaves.

Getting Infrastructure Information Passively

First we will look at how we might identify information about the servers and services on a network passively, using techniques like DNS enumeration and the Google cache / Wayback Machine.
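As a concrete example of passive discovery, the Internet Archive's Wayback Machine exposes a CDX search endpoint that lists historical captures of a domain, so we can see old URLs without ever touching the target. A minimal sketch of building such a query (the helper name is ours; fetching the resulting URL is left to the reader):

```python
from urllib.parse import urlencode

# Illustrative helper: build a Wayback Machine CDX API query for a target
# domain. The endpoint and parameters below are from the Internet Archive's
# CDX server; actually fetching the URL (e.g. with urllib.request) would
# return rows of previously captured URLs for the domain.
def wayback_cdx_url(domain: str, limit: int = 50) -> str:
    params = {
        "url": f"{domain}/*",   # match every captured path under the domain
        "output": "json",       # return rows as JSON rather than plain text
        "fl": "original",       # only the originally captured URL field
        "collapse": "urlkey",   # de-duplicate repeated captures of one URL
        "limit": str(limit),
    }
    return "https://web.archive.org/cdx/search/cdx?" + urlencode(params)

print(wayback_cdx_url("example.com"))
```

Because the query is answered by the archive rather than the target, nothing in the target's logs will show our interest.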

Mapping Hidden Files

We have seen how mapping a website and identifying its endpoints and parameters is valuable when planning an attack. However, we have focused on manual scanning, and only looked at areas visible from the main site.

Often the more "interesting" elements of a site are "hidden"1 from standard users. Links to these pages may only be visible to authenticated users with the correct authorization. Sometimes they are easy enough to find, for example an admin page link commented out in the source, but usually we will have to hunt for them.

Once we have found these hidden elements we can apply the same mapping process we used on the more public elements of the site. This allows us to continue identifying endpoints and parameters, building a better picture of the attack surface.
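The hunting itself is usually automated: a directory-busting tool requests each word from a wordlist as a path and keeps the ones the server admits exist. A minimal sketch of the idea, with the HTTP call injected as a function so the loop can run without a live target (all names here are illustrative; a real run might wrap urllib.request or a tool like gobuster):

```python
from typing import Callable, Iterable

# Directory-busting sketch: for each word in a wordlist, request
# base_url/word and keep any path that does not come back 404.
# `fetch` takes a URL and returns an HTTP status code; injecting it
# keeps the sketch testable without touching the network.
def dir_bust(base_url: str, wordlist: Iterable[str],
             fetch: Callable[[str], int]) -> list[str]:
    found = []
    for word in wordlist:
        url = f"{base_url.rstrip('/')}/{word}"
        if fetch(url) != 404:        # anything but "not found" is interesting
            found.append(url)
    return found

# Usage with a stubbed server that only knows about /admin:
fake = lambda url: 200 if url.endswith("/admin") else 404
print(dir_bust("https://example.com", ["admin", "backup", "logs"], fake))
# → ['https://example.com/admin']
```

Note that redirects (301/302) and "forbidden" (403) responses are kept here on purpose: a 403 on /admin still tells us the path exists.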

Topics to Cover

  • Common Metadata
  • Fuzzing Concepts
  • Directory Busting
  • Sub Domain Enumeration
  • Fuzzing / Brute Forcing Parameters
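The last topic above, fuzzing parameters, follows the same pattern as directory busting: send many candidate inputs and look for responses the application treats differently. A sketch of the idea, grouping responses by body length so outliers stand out (the names and the stubbed target are ours, not from any particular tool):

```python
from collections import defaultdict
from typing import Callable

# Parameter-fuzzing sketch: try each payload in one query parameter and
# bucket responses by an observable signal (here, body length) to spot
# inputs the application handles differently. `send` takes a URL and
# returns the response body; injecting it keeps the sketch offline.
def fuzz_param(url: str, param: str, payloads: list[str],
               send: Callable[[str], str]) -> dict[int, list[str]]:
    buckets: dict[int, list[str]] = defaultdict(list)
    for payload in payloads:
        body = send(f"{url}?{param}={payload}")
        buckets[len(body)].append(payload)  # equal lengths = likely same page
    return dict(buckets)

# Stubbed target that only reacts to a single quote (a classic hint that
# the input reaches a SQL query unescaped):
stub = lambda u: "error" if "'" in u else "ok page"
print(fuzz_param("https://example.com/item", "id", ["1", "2", "'"], stub))
# → {7: ['1', '2'], 5: ["'"]}
```

Real fuzzers compare richer signals (status code, response time, specific error strings), but the inference step is the same: a payload that lands in its own bucket is worth investigating by hand.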

  1. And security through obscurity isn't the best idea. 
