Showing posts with label Penetration Testing. Show all posts

Vulnerability Analysis in Web Application using Burp Scanner

Hello friends! Today we are going to use Burp Suite Scanner, which is used for website security testing to identify certain vulnerabilities inside a website. It is the first phase of web penetration testing for every security tester.

Burp Scanner is a tool for automatically finding security vulnerabilities in web applications. It is designed to be used by security testers, and to fit in closely with your existing techniques and methodologies for performing manual and semi-automated penetration tests of web applications.

Let's start with Burp Proxy in order to intercept requests between the browser and the website. From the screenshot you can perceive that we have forwarded the intercepted data for "an active scan".

Note: Always configure your browser's proxy while using Burp Suite to intercept requests.

Through a window alert it will ask you to confirm your action for the active scan; press YES to begin the active scan on the targeted website.

Issue Activity
The Issue Activity tab contains a sequential record of the Scanner's activity in finding new issues and updating existing issues. Each entry shows:

- An index number for the item, reflecting the order in which items were added.
- The time that the activity occurred.
- The action that was performed.
- The issue type.
- The host and URL path for the issue.
- The insertion point for the issue, where applicable.
- The severity and confidence of the issue.

From the screenshot you can observe that the scan result highlights 8 types of issues found inside the website, as follows:
1. Cross-site scripting (reflected)
2. Flash cross-domain policy
3. SQL injection
4. Unencrypted communications
5. Cross-domain Referer leakage
6. Email addresses disclosed
7. Frameable response (potential Clickjacking)
8. Path-relative style sheet import

Active Scan Queue

Active scanning typically involves sending large numbers of requests to the server for each base request that is scanned, and this can be a time-consuming process. When you send requests for active scanning, these are added to the active scan queue, in which they are processed in turn. Each item in the queue shows:

- An index number for the item, reflecting the order in which items were added.
- The destination protocol, host and URL.
- The current status of the item, including percentage complete.
- The number of scan issues identified for the item.
- The number of requests made while scanning the item.
- The number of network errors encountered.
- The number of insertion points created for the item.
- The start and end times of the item's scanning.

One by one we are going to demonstrate these vulnerabilities in detail using the request and response.

Advisory on Cross-site scripting (reflected)

The advisory gives a brief description of the vulnerability and an idea of how to exploit it.
Cross-site scripting (reflected)

The value of the cat request parameter is copied into the HTML document as plain text between tags. The payload was submitted in the cat parameter. This proof-of-concept attack demonstrates that it is possible to inject arbitrary JavaScript into the application's response.

Inside the Request tab we can see the injected payload with the intercepted data, sent in order to receive a response to the generated request. In the given image you can observe that it has injected JavaScript into the URL via the cat parameter.

In the response we can see that the injected payload is reflected back by the application. It will generate an alert prompt on the screen when executed in the browser.

Let's verify it manually on the running website.
Execute the following script inside the URL with the cat parameter. As a result you will receive an alert window displaying 1.
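
The manual check above can also be sketched in code. The helper below is hypothetical (it is not part of Burp); it simply tests whether a payload string comes back verbatim in a response body, which is the signal that the application reflected it without HTML-encoding:

```python
def is_reflected_unescaped(body: str, payload: str) -> bool:
    """Return True if the raw payload appears verbatim in the body.

    If the application HTML-encodes its output, only the escaped form
    (e.g. &lt;script&gt;) appears in the body, and this returns False.
    """
    return payload in body
```

For example, a response that echoes `<script>alert(1)</script>` unchanged is flagged, while one containing only `&lt;script&gt;alert(1)&lt;/script&gt;` is not.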

Advisory on SQL injection
Similarly, test for the other vulnerabilities.
SQL injection

The cat parameter appears to be vulnerable to SQL injection attacks. The payload ' was submitted in the cat parameter, and a database error message was returned. You should review the contents of the error message, and the application's handling of other input, to confirm whether the vulnerability is present.

The database appears to be MySQL.

Under the Request tab a single quote (') is passed with the cat parameter to break the SQL statement in order to receive a database error as the response.

Under the Response tab you can read the highlighted text, which clearly points toward a SQL injection vulnerability in the database.
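
The same signal Burp keys on here, a database error string in the response body, can be approximated with a short check. This sketch is illustrative only; the signature list is tiny compared to what a real scanner matches:

```python
# Common database error fragments; a real scanner matches many more.
SQL_ERROR_SIGNATURES = [
    "you have an error in your sql syntax",  # MySQL
    "warning: mysql_",                       # PHP + MySQL
    "unclosed quotation mark",               # Microsoft SQL Server
    "ora-01756",                             # Oracle
]

def looks_like_sql_error(body: str) -> bool:
    """Return True if the response body contains a known DB error string."""
    lowered = body.lower()
    return any(sig in lowered for sig in SQL_ERROR_SIGNATURES)
```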

Advisory on Flash cross-domain policy

Flash cross-domain policy

The application publishes a Flash cross-domain policy which allows access from any domain.
Allowing access from all domains means that any domain can perform two-way interaction with this application. Unless the application consists entirely of unprotected public content, this policy is likely to present a significant security risk.

Similarly to the above, it has generated the request through the GET method for crossdomain.xml.

It has received a successful response to its GET request; in the highlighted text you can read that access to this site is allowed from any domain on any port, and secure is set to false.
In this way we can see how the Burp Suite scanner tests for security loopholes in a website.
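
The permissive crossdomain.xml described above can also be detected with a few lines of the standard library. This is a hedged sketch of the idea, not Burp's implementation; it flags a policy whose allow-access-from entry uses a wildcard domain:

```python
import xml.etree.ElementTree as ET

def allows_any_domain(policy_xml: str) -> bool:
    """Return True if a Flash cross-domain policy grants access to all domains."""
    root = ET.fromstring(policy_xml)
    for rule in root.iter("allow-access-from"):
        if rule.get("domain") == "*":
            return True
    return False
```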

3 ways to scan Eternal Blue Vulnerability in Remote PC

Hello Friends! As we all know, Microsoft Windows 7 is exploitable by Eternal Blue through SMBv1. Microsoft then patched this vulnerability by updating the SMB version. Still, there are a large number of Windows 7 users who didn't update their systems. Now if a security tester wants to separate vulnerable systems from updated ones, he requires some scanning to identify the vulnerable systems.

Eternal scanner is a network scanner for the Eternal Blue exploit (CVE-2017-0144).

Target: Windows 7
Attacker: Kali Linux

Open the terminal in your Kali Linux and type the following command to download it from GitHub.

git clone && cd eternal_scanner

Then, once it is successfully installed, you need to run the script in order to launch the scanner in the terminal by typing the following:

Once the scanner is launched inside the terminal, it will ask you to enter a target IP; you can also add a range of IPs for scanning.

We have given only a single IP as the target for scanning.

Then it will start scanning and dump those IPs which are vulnerable in the given IP range; from the screenshot you can observe it has dumped the vulnerable IP with SMB port 445 and saved the output inside /root/eternal_scanner/vulnr.txt

When you open the output file you will observe the vulnerable IP as well as the name of the exploit, "MS17-010", as shown in the given image.
Similarly, you can scan the target using NMAP and Metasploit.


Attempts to detect if a Microsoft SMBv1 server is vulnerable to a remote code execution vulnerability (ms17-010, a.k.a. EternalBlue). The vulnerability is actively exploited by WannaCry and Petya ransomware and other malware.

The script connects to the $IPC tree, executes a transaction on FID 0 and checks if the error "STATUS_INSUFF_SERVER_RESOURCES" is returned to determine if the target is not patched against ms17-010. Additionally it checks for known error codes returned by patched systems.
Tested on Windows XP, 2003, 7, 8, 8.1, 10, 2008, 2012 and 2016.

The following command will scan for the SMB vulnerability using certain built-in scripts and report according to the output result.
nmap -T4 -p445 --script vuln

You can observe from the given screenshot that port 445 is open and vulnerable. The target is exploitable via MS17-010; moreover the Risk factor is High, which means it is easily exploitable.
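
Before running the vulnerability scripts, it can be useful to confirm that SMB is even reachable. The sketch below only checks that TCP port 445 accepts a connection; it does not confirm MS17-010, which still requires the nmap script or the Metasploit module:

```python
import socket

def smb_port_open(host: str, port: int = 445, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the SMB port succeeds.

    Reachability only -- this does NOT test for the MS17-010 flaw itself.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```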

We can directly scan for the SMB vulnerability MS17-010 using the following NMAP command:

nmap -T4 -p445 --script smb-vuln-ms17-010

From the given screenshot you will observe that it has only scanned for MS17-010 and found that the target is vulnerable to it.

From both NMAP results we have concluded that the target is vulnerable due to Microsoft SMBv1.

Uses information disclosure to determine if MS17-010 has been patched or not. Specifically, it connects to the IPC$ tree and attempts a transaction on FID 0. If the status returned is "STATUS_INSUFF_SERVER_RESOURCES", the machine does not have the MS17-010 patch. If the machine is missing the MS17-010 patch, the module will check for an existing DoublePulsar (ring 0 shellcode/malware) infection. This module does not require valid SMB credentials in default server configurations. It can log on as the user "\" and connect to IPC$.

msf > use auxiliary/scanner/smb/smb_ms17_010
msf auxiliary(smb_ms17_010) > set rhosts
msf auxiliary(smb_ms17_010) > set lhost
msf auxiliary(smb_ms17_010) > set rport 445
msf auxiliary(smb_ms17_010) > exploit

From the screenshot you can perceive that the host is vulnerable to MS17-010.
Great!!! Now use MS17-010 to exploit your target.

5 Ways to Crawl a Website

A Web crawler, sometimes called a spider, is an Internet bot that systematically browses the World Wide Web, typically for the purpose of Web indexing.

A Web crawler starts with a list of URLs to visit, called the seeds. As the crawler visits these URLs, it identifies all the hyperlinks in the page and adds them to the list of URLs to visit. If the crawler is performing archiving of websites, it copies and saves the information as it goes. The archive is known as the repository and is designed to store and manage the collection of web pages. A repository is similar to any other system that stores data, like a modern-day database.
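
The seed-based loop described above can be sketched with the standard library alone. This is an illustrative toy, not any of the tools below; page fetching is passed in as a function so the traversal logic stands on its own:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect absolute URLs from the href attributes of <a> tags."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def crawl(seeds, fetch, limit=100):
    """Breadth-first crawl: visit each URL once, queue newly found links."""
    queue, visited = list(seeds), set()
    while queue and len(visited) < limit:
        url = queue.pop(0)
        if url in visited:
            continue
        visited.add(url)
        extractor = LinkExtractor(url)
        extractor.feed(fetch(url))  # fetch() returns the page HTML
        queue.extend(extractor.links)
    return visited
```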

Let’s Begin!!
This auxiliary module is a modular web crawler, to be used in conjunction with wmap (someday) or standalone.

use auxiliary/crawler/msfcrawler
msf auxiliary(msfcrawler) > set rhosts
msf auxiliary(msfcrawler) > exploit

From the screenshot you can see it has loaded the crawler in order to extract hidden files from the website, for example about.php, the jQuery contact form, HTML files and so on, which are not possible to extract manually from the website using a browser. We can use it for information gathering on any website.

HTTrack is a free and open-source Web crawler and offline browser, developed by Xavier Roche.
It allows you to download a World Wide Web site from the Internet to a local directory, recursively building all directories and getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site's relative link structure.

Type following command inside the terminal
httrack -O /root/Desktop/file

It will save the output inside the given directory /root/Desktop/file

From the given screenshot you can observe that it has dumped the website information, which consists of HTML files as well as JavaScript and jQuery.

This Web spider utility detects and displays detailed information for a user-selected Web page, and it offers other Web page tools.
BlackWidow's clean, logically tabbed interface is simple enough for intermediate users to follow but offers just enough under the hood to satisfy advanced users. Simply enter your URL of choice and press Go. BlackWidow uses multithreading to quickly download all files and test the links. The operation takes only a few minutes for small Web sites.

You can download it from here.

Enter your URL in Address field and press Go.

Click on the start button given on the left side to begin URL scanning, and select a folder to save the output file.
From the screenshot you can observe that I browsed to C:\Users\RAJ\Desktop\tptl in order to store the output file inside it.

When you open the target folder tptl you will get the entire data of the website: images, content, HTML, PHP and JavaScript files are all saved in it.


Website Ripper Copier (WRC) is an all-purpose, high-speed website downloader software to save website data. WRC can download website files to local drive for offline browsing, extract website files of a certain size and type, like image, video, picture, movie and music, retrieve a large number of files as a download manager with resumption support, and mirror sites. WRC is also a site link validator, explorer, and tabbed anti pop-up Web / offline browser.

Website Ripper Copier is the only website downloader tool that can resume broken downloads from HTTP, HTTPS and FTP connections, access password-protected sites, support Web cookies, analyze scripts, update retrieved sites or files, and launch more than fifty retrieval threads.

You can download it from here.

Choose “web sites for offline browsing” option.

Enter the website URL and click on Next.

Mention the directory path to save the output result and click Run Now.

When you open the selected folder tp you will find the fetched CSS, PHP, HTML and JS files inside it.

Burp Spider is a tool for automatically crawling web applications. While it is generally preferable to map applications manually, you can use Burp Spider to partially automate this process for very large applications, or when you are short of time.
For more detail read our previous articles from here.
From the given screenshot you can observe that I have fetched the HTTP request of http://; now send it to the Spider with the help of the Action tab.

The targeted website has been added inside the site map under the Target tab as a new scope for web crawling. From the screenshot you can see it has started web crawling of the target website, where it has collected the website information in the form of PHP, HTML and JS files.

5 ways to Banner Grabbing

A banner refers to a text message received from a host. Banners usually contain information about a service, such as the version number.
Banner grabbing is the process of collecting details about any remote PC on a network and the services running on its open ports. An attacker can make use of banner grabbing in order to discover network hosts and running services with their versions on open ports, and moreover the operating system, so that he can exploit it.

A simple banner grabber which connects to an open TCP port and prints out anything sent by the listening service within five seconds.
The banner will be shortened to fit into a single line, but an extra line may be printed for every increase in the level of verbosity requested on the command line.
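
The behaviour the script's description spells out, connect and read whatever the service volunteers within five seconds, is easy to mirror in a few lines. Here is a hedged Python equivalent, not the NSE script itself:

```python
import socket

def grab_banner(host: str, port: int, timeout: float = 5.0) -> str:
    """Connect to a TCP port and return whatever the service sends first."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.settimeout(timeout)
        try:
            return sock.recv(1024).decode(errors="replace").strip()
        except socket.timeout:
            return ""  # service expects the client to speak first
```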

Type the following command, which will fetch a banner for every open port on the remote PC.
nmap -sV --script=banner

From the screenshot you can read the services and their versions for the open ports, fetched by the NMAP banner script for the target.

The following command will grab the banner for a selected port, i.e. 80, for the HTTP service and version.
nmap -Pn -p 80 -sV --script=banner
As a result it will dump "http-server-header: Apache/2.2.8 (Ubuntu) DAV/2"

curl -I is used for HEAD, in order to show document information only; type the following command to grab the HTTP banner of the remote PC.
curl -s -I | grep -e "Server: "
As a result it will dump "Server: Apache/2.2.8 (Ubuntu) DAV/2"
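
What the grep isolates above can also be done by parsing the raw response head directly. A small hypothetical helper:

```python
def server_header(raw_head):
    """Return the value of the Server header from raw HTTP head lines, or None."""
    for line in raw_head.splitlines():
        if line.lower().startswith("server:"):
            return line.split(":", 1)[1].strip()
    return None
```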

Type the following command to grab the SSH banner of the remote PC.
telnet 22
As a result it will dump "SSH-2.0-OpenSSH_4.7p1 Debian-8ubuntu1"

Type the following command to grab the SSH banner of the remote PC.
nc -v 22
As a result it will dump "SSH-2.0-OpenSSH_4.7p1 Debian-8ubuntu1"

DMitry (Deepmagic Information Gathering Tool) is a UNIX/(GNU)Linux command-line application coded in C. DMitry has the ability to gather as much information as possible about a host. Base functionality is able to gather possible subdomains, email addresses, uptime information, TCP port scans, whois lookups, and more.

dmitry -b is used for banner grabbing on all open ports; type the following command to grab the banners of the remote PC.

dmitry -b
From the screenshot you can see it has shown banners for the open ports 21, 22, 23 and 25.
In this way an attacker can grab the services and their versions for the open ports on a remote PC.