[Web Application Hacking] Ch 4. Mapping the Application

Moved to Notion · August 14, 2023

Through Chapter 3, the book is mostly theoretical background on the underlying technologies, so I didn't feel the need to summarize much.

Starting with Chapter 4, the material becomes practical, so I intend to begin summarizing it here.




Chapter 4. Mapping the Application


The first step in the process of attacking an application is gathering and examining some key information about it to gain a better understanding of what you are up against.

Web Spidering

Web spidering is a technique for enumerating all the resources and the overall structure of the target web application. It works by recursively fetching every page linked from the main page.
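The recursive-fetch idea can be sketched in a few lines of Python. This is a minimal illustration, not a production crawler: the `fetch` argument is any callable that maps a URL to its HTML body (in practice it would wrap an HTTP client), and the in-scope check is a deliberately naive prefix match.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag

class LinkExtractor(HTMLParser):
    """Collect the absolute URLs of all <a href> links in a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    url, _frag = urldefrag(urljoin(self.base_url, value))
                    self.links.append(url)

def spider(start_url, fetch, visited=None):
    """Recursively visit every in-scope page reachable from start_url.
    `fetch` is a hypothetical callable: URL -> HTML body."""
    if visited is None:
        visited = set()
    if start_url in visited:
        return visited
    visited.add(start_url)
    parser = LinkExtractor(start_url)
    parser.feed(fetch(start_url))
    scope = start_url.rsplit("/", 1)[0]  # crude same-site check
    for link in parser.links:
        if link.startswith(scope):
            spider(link, fetch, visited)
    return visited
```

The returned `visited` set is effectively the site map: every resource the spider could reach by following links alone.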

robots.txt

The robots.txt file, located at the web root, lists the URLs that the site does not want web spiders and search engines to visit or index.
It is worth checking because it occasionally points directly to sensitive URLs.
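Python's standard library can parse a retrieved robots.txt directly. The file content below is a hypothetical example; note how the disallowed paths are exactly the ones an attacker would want to inspect first.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content fetched from the target's web root
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /old-backup/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Paths the site asks crawlers to avoid are often the most interesting ones
for path in ("/admin/", "/old-backup/", "/index.html"):
    print(path, rp.can_fetch("*", path))
```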

Automated Spidering

Automated spidering refers to the process in which a spider tool crawls the target web server entirely without human intervention.
Because this technique relies only on externally exposed links to map the server, the spider misses resources that require user interaction to reach or whose links are hidden.

User-Directed Spidering

It refers to the process in which the user navigates through the application in the normal way with a standard web browser, while the spider tool captures this traffic to build a site map of the application.
Because the user drives the interaction with the application, the blind spots of automated spidering (hidden links and interaction-gated content) do not arise.

User-Directed Spidering Process

  1. Configure the browser to use either Burp or WebScarab as a local proxy.

  2. Browse the entire application normally, attempting to visit every link/URL you discover.

  3. Review the site map generated by the proxy/spider tool.

Burp Intruder

Burp Suite's Intruder can be utilized for performing Automated Spidering.

Let's look at an example of usage. When you mark parameter positions within a request, Intruder repeatedly sends the request to the target server, substituting a payload value into those positions on each iteration:

It then collects the responses:
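The core loop behind this kind of parameter fuzzing is simple enough to sketch. This is not Burp's implementation, just the idea of Intruder's single-position ("sniper") attack; `fetch` is a hypothetical callable standing in for an HTTP client.

```python
from urllib.parse import urlencode

def fuzz_parameter(base_url, param, payloads, fetch):
    """Send one request per payload value for the chosen parameter and
    collect the responses keyed by payload, so they can be compared.
    `fetch` is any callable mapping a URL to a response."""
    responses = {}
    for value in payloads:
        url = base_url + "?" + urlencode({param: value})
        responses[value] = fetch(url)
    return responses
```

Comparing response lengths or status codes across payloads then reveals which values the server treats differently, which is the whole point of collecting the responses.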

HACK TIPS

  1. Identify any naming schemes in use. For example, if there are pages called AddDocument.jsp and ViewDocument.jsp, there may also be pages called EditDocument.jsp and RemoveDocument.jsp

  2. If there are identifiers distinguishing resources, modify them to find undiscovered resources.
    (like AnnualReport2009.pdf and AnnualReport2010.pdf)

  3. Review all client-side code, such as HTML and JavaScript, to identify clues about hidden server-side content.

  4. Search for temporary files, such as .DS_Store (a directory index created under OS X), file.php~1 (a backup created while file.php was being edited), and other .tmp files.
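Tip 1, guessing pages from a naming scheme, is easy to automate. The verb list below is an assumption for illustration; in practice you would extend it with every verb observed in the application, then probe each candidate URL and note which ones return something other than 404.

```python
# Assumed verb set; extend with verbs actually observed in the target app
VERBS = ("Add", "View", "Edit", "Remove", "Delete")

def guess_candidates(observed):
    """From observed page names such as 'AddDocument.jsp', reuse the
    noun+extension part with every other verb to enumerate likely
    hidden pages, then drop the names we already know about."""
    candidates = set()
    for name in observed:
        for verb in VERBS:
            if name.startswith(verb):
                rest = name[len(verb):]
                candidates.update(v + rest for v in VERBS)
    return candidates - set(observed)
```

The same pattern works for tip 2: given AnnualReport2009.pdf and AnnualReport2010.pdf, generate the name for every other plausible year and request each one.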

Burp Content Discovery

The Content Discovery feature of Burp Suite automates these processes.

Let's follow the menu:

Subdirectory exploration, file names, and extensions can be easily configured.

Google Dorks

Google Dorks also provide a wide range of information necessary for web server attacks:

  1. Searching for specific file extensions
    filetype:pdf site:example.com
    This query searches for PDF files uploaded to example.com

  2. Searching for access.log
    intitle:"index of" intext:"access.log" site:example.com
    This Dork would enable you to search for access log files of example.com

  3. Finding exposed directory listings, which can reveal developer-written source code
    intitle:"index of" intext:"server at" site:example.com
    This Dork finds open directory listings on example.com, which often expose source files.

  4. Discovering links to a specific website
    link:www.wahh-target.com
    This Dork would return all the pages on other websites that contain a link to the target.

These techniques help you discover the web application's server-side source code and the version of the framework in use.
