How to configure the Website Scanner

Looking for vulnerabilities in your web application? Let Website Scanner do that for you! In this article, we explain all the features and options available to help you make the finest vulnerability reports.
Written by Silvia Balan
Updated 2 weeks ago

The Website Vulnerability Scanner is a custom tool written by our team which helps you quickly assess the security of a web application. It is a full-blown web application scanner, capable of performing comprehensive security assessments against any type of web application.

We recommend you do not change the default settings. However, if you have any specific requirements, such as a very large application, or you need to exclude several parts of the application from the scan, you can configure these settings as described below.

Furthermore, if you need to run only specific checks, such as SQL Injection or XSS testing, you can run the scan with only those checks enabled.

Initial tests

These tests are recommended for all applications, but you can skip any of them depending on the type of target application. The scan duration varies with the number and complexity of the tests you select.

The resource discovery part is the most time-consuming, so we recommend running this test at a later stage, or when you can leave the scan running for a while.

You can schedule a scan for later using the scheduling feature. For more details, please check out our support article on how to schedule a scan.

Read more about each test in the dedicated article.

Engine Options

You can configure the following options to control how deep the scanner crawls the application and to set paths you want the scanner to avoid.


The Approach section tells the scanner which spidering method to use.

  • Classic Spider – Used to crawl classic websites.
  • SPA Spider – Used to crawl single-page application (JavaScript-heavy) websites. We are still working on this feature; it will be released in a later version.


By adjusting the Spidering depth, you tell the scanner how many subpath levels (‘/’) it should crawl and scan, that is, how deep into the website’s URL structure it should go.

A greater crawl depth might uncover many more injection points than a lower one, but it also increases the scan duration. We recommend keeping the default value.
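As a rough illustration of how a depth limit works, the sketch below counts the ‘/’-separated segments in a URL path. The exact counting rules are the scanner's own; this is only an assumption for illustration.

```python
from urllib.parse import urlparse

def path_depth(url: str) -> int:
    """Count the '/'-separated segments in the URL path."""
    path = urlparse(url).path
    return len([seg for seg in path.split("/") if seg])

def within_depth(url: str, max_depth: int) -> bool:
    """Return True if the URL is shallow enough to crawl."""
    return path_depth(url) <= max_depth

# With a depth limit of 2, /blog/posts is crawled
# but /blog/posts/2023 is skipped.
```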

You can also decrease the number of Requests per second that the scanner sends to the target website.
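Conceptually, a requests-per-second limit spaces out consecutive requests so they are at least 1/max_rps seconds apart. A minimal Python sketch of that idea (not the scanner's actual implementation):

```python
import time

def throttled(urls, max_rps: float):
    """Yield URLs no faster than max_rps per second."""
    interval = 1.0 / max_rps
    last = 0.0
    for url in urls:
        wait = interval - (time.monotonic() - last)
        if wait > 0:
            time.sleep(wait)
        last = time.monotonic()
        yield url

# Lowering max_rps makes the scan gentler on the
# target server, at the cost of a longer scan.
```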

Excluded URLs

Excluded URLs is a list of URLs the scanner should ignore. By default, the list is empty, meaning no paths are excluded. Enter each URL on a new line, and make sure to enter the full path of each URL.

Tip: You can resize the input box by dragging the bottom-right corner.
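To illustrate the idea, the sketch below skips any URL that falls under an excluded full path. The scanner's actual matching semantics may differ; the URLs here are hypothetical examples.

```python
def is_excluded(url: str, excluded: list[str]) -> bool:
    """Skip any URL that starts with an excluded full path."""
    return any(url.startswith(prefix) for prefix in excluded)

# Hypothetical exclusion list: logout endpoint and admin area.
excluded = [
    "https://example.com/logout",
    "https://example.com/admin/",
]

is_excluded("https://example.com/admin/users", excluded)  # True
is_excluded("https://example.com/home", excluded)         # False
```

Excluding a logout endpoint is a common choice: hitting it mid-scan would end an authenticated session.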

Attack Options

Attack Options represent the tests the scanner engine performs on every new Injection Point it detects during the scanning process. An Injection Point is a target URL paired with a unique set of parameters. It is considered validated after the scanner sends a request to it and confirms the response is valid.

For example, each target URL with a distinct set of parameters is a unique Injection Point that is checked with all the selected modules.

There are Active and Passive checks. Both types of tests use the validated Injection Points from the request engine.

The difference between them is that active checks send a large number of requests to each Injection Point with specific payloads designed to trigger behaviors that indicate whether the target is vulnerable.

Passive checks, by contrast, send no additional requests. They use the detected Injection Points directly and analyze the server’s responses for specific configurations and behaviors that prove the target is vulnerable to different attacks.

While the passive checks generate at most 20 HTTP requests to the server, the active checks are more aggressive and send up to 10,000 HTTP requests. This may trigger alarms from an IDS, but note that the scan is not destructive.


If your application requires authentication to access certain parts of the website, we highly recommend enabling authenticated scanning. This way, the scanner covers more application functionality and pages than an unauthenticated scan.

If you wish to learn more about why you should perform an authenticated scan, you can check out this dedicated article from our Learning Center.

The “Check authentication” button is optional for the first three methods below and disabled for the “Headers” method, so you can start scanning directly.

Our Website Vulnerability Scanner supports four methods for performing authenticated scans:

  1. Recorded – Recording-based Authentication
  2. Automatic – Form-based authentication
  3. Cookies – Cookie-based authentication
  4. Headers – Headers authentication
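For the Cookies and Headers methods, the scanner attaches your credentials to every request it sends. The sketch below shows what such requests look like, using Python's standard library; the cookie and token values are hypothetical placeholders.

```python
import urllib.request

# Cookies method: a hypothetical session cookie is sent with each request.
cookie_auth = urllib.request.Request(
    "https://example.com/dashboard",
    headers={"Cookie": "sessionid=abc123"},
)

# Headers method: a hypothetical bearer token is sent instead.
header_auth = urllib.request.Request(
    "https://example.com/api/profile",
    headers={"Authorization": "Bearer abc123"},
)
```

In both cases, you would copy the live cookie or token from an authenticated browser session into the scanner's configuration.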

You’ll know that the authentication was successful if the final scan report contains an additional “Authentication complete” message. Furthermore, the Spider results should contain more crawled URLs than an unauthenticated scan.


You can configure notifications for when your scan matches certain conditions (e.g., it finishes, finds a High Risk vulnerability, or discovers an open port).

You can find more details in our dedicated support article.
