Translation team

Members:

Thr0cyte, Gr33k, Hua Hua, MrTools, R1ght0us, 7089bAt

Preface:

Information gathering is a fascinating process, and the information you collect largely determines the size of your attack surface and your chances of a successful attack. Here I will share my own approach to information gathering. It is not the one true process; I offer it to open up your thinking, and the rest of the road you will have to walk on your own. Keep at it!

As for some of the websites mentioned below, how to reach them is something everyone will have to figure out for themselves; I won't spell it out here, you know!

Chapter 2: Reconnaissance

Introduction

2.1. Passive information gathering

2.2. Using recon-ng to gather information

2.3. Using Nmap to scan and identify services

2.4. Identifying web application firewalls

2.5. Determining HTTPS encryption parameters

2.6. Using the browser’s developer tools to analyze and alter basic behavior

2.7. Obtaining and modifying cookies

2.8. Using robots.txt

 

Introduction

Penetration testing follows the same general process whether the target is a network or a web application. Following this process increases the likelihood that we will find vulnerabilities and exploit as many of them as possible to affect the system. The phases are:

  • Information gathering

  • Asset enumeration

  • Exploitation

  • Maintaining access

  • Cleanup

 

In penetration testing, information gathering is one of the tasks a tester must always perform. It involves identifying the assets present in the network, such as firewalls, IDSs, and IPSs, and gathering as much information as possible about the company, the network, and the employees. For web penetration testing, this phase focuses on gathering information about the web application, the database, the users, and the server.

The success of a penetration test depends on the quality of the information collected. The more information we obtain, the more potential targets we have to test, and the more options we have for finding vulnerabilities and exploiting them.

 

2.1. Passive information gathering

Passive information gathering means collecting information about the target from third-party sources, such as search engines and database caches, without sending any traffic that could affect the target system.

In this section, we’ll obtain information from a number of online services that aggregate publicly available data, also known as Open Source Intelligence (OSINT). Passive information gathering gives us an overview of the target when testing public websites or applications and can uncover information that will be useful to us later in the test.

  

Getting ready

To obtain information from public sources, the Kali virtual machine needs to be connected to the Internet and configured in NAT (network address translation) mode. To do this, refer to Chapter 1, which covers setting up Kali and the target environment; use NAT mode instead of host-only mode.

  

How to do it…

As our target, we will use the domain zonetransfer.me, created by Robin Wood at digi.ninja to demonstrate the consequences of allowing public DNS zone transfers.

 

1. We first use whois to get the domain’s registration information:

# whois zonetransfer.me
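whois output is verbose; if you only want the key fields, you can filter it. A minimal sketch (field names vary between registrars, so the pattern below is only a common-case guess):

# whois zonetransfer.me | grep -iE 'registrar|name server|e-?mail|phone'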

                      

2. Another tool is dig, which queries DNS records and resolution information. Here we ask for the domain’s name servers (NS records):

# dig ns zonetransfer.me
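dig can query any record type in the same way; for example, pulling the A and MX records, with +short trimming the output down to just the answers:

# dig a zonetransfer.me +short
# dig mx zonetransfer.me +short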

 

3. Once we have the name server information, we can attempt a zone transfer attack to obtain all the host names the server resolves. Again with dig:

# dig axfr @nsztm1.digi.ninja zonetransfer.me

 

Luckily for us, this DNS server allows zone transfers, so we get a complete list of subdomains and the addresses they resolve to. From these we can pick promising targets to probe further.
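A domain usually has more than one name server, and any one of them may be misconfigured; a small sketch that retries the transfer against every NS record dig returns:

# for ns in $(dig +short ns zonetransfer.me); do dig axfr @"$ns" zonetransfer.me; done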

4. We can now use theHarvester to collect email addresses, host names, and IP addresses related to the target domain:

# theharvester -b all -d zonetransfer.me
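Note that on newer Kali versions the binary is called theHarvester; you can also restrict the data sources and save the results. A sketch, assuming the -b and -f options of current theHarvester releases:

# theHarvester -d zonetransfer.me -b bing -f zonetransfer_results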

 

5. If you prefer not to query the server directly for software version information, you can use Netcraft. Visit

https://toolbar.netcraft.com/site_report

Enter the domain name you want to query:

6. Information about previous versions of a site can also be useful in testing. The Wayback Machine, at https://archive.org/web/web.php, keeps static snapshots of websites as they appeared in the past.
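The Wayback Machine can also be queried from the command line; a minimal sketch using its public availability API (the endpoint below returns JSON describing the closest stored snapshot):

# curl 'https://archive.org/wayback/available?url=zonetransfer.me'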

 

How it works…

In this recipe, we used several different tools to gather information about the target. From the Linux command line, whois gave us the site’s registration information, including its DNS servers and the administrator’s personal details, such as email address, company name, and phone number. whois can also look up the owner of an IP address. We then used dig to obtain the target’s DNS server information and used a zone transfer to enumerate all the subdomains (the dig zone transfer only works against DNS servers that are not properly configured).
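For example, since whois also accepts an IP address, it can be chained with dig to identify the owner of the address a host resolves to (a sketch; the result depends on the live DNS records):

# whois $(dig +short zonetransfer.me | head -n 1)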

theHarvester searches public sources for email addresses, host names, and IP addresses associated with the target domain.

We then used Netcraft to learn which technologies the target site uses and its update history, which spares us from querying the real site for this information during further testing.

Finally, the Wayback Machine stores static copies of websites over time and keeps a record of them. There you can see information published in older versions of a site, and since web application updates are sometimes careless, older snapshots can expose sensitive data that is no longer visible on the live site.

 

 

There’s more…

In addition, we can use Google’s advanced search operators

(https://support.google.com/websearch/answer/2466433)

to find information about the target domain without accessing it directly. For example, a search of the form

site:site_to_look_into “target_domain”

lets us look for mentions of the target domain on sites that track recently discovered vulnerabilities, leaks, or successful attacks. The following sites may be useful to you (see the example searches after this list):

  • Openbugbounty.org: Open Bug Bounty is an independent site where security researchers report and publish vulnerabilities (cross-site scripting and cross-site request forgery only) found in public-facing websites. A Google search restricted to this site will return every mention of our target domain in its reports.

  • Pastebin.com: A site very commonly used by hackers to anonymously publish and leak information obtained during attacks.

  • Zone-h.org: Zone-h is a site where malicious hackers brag about their exploits, mainly website defacements.
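Combining Google’s site: operator with our target domain gives searches like the following (zonetransfer.me stands in for whatever domain you are testing):

site:openbugbounty.org “zonetransfer.me”

site:pastebin.com “zonetransfer.me”

site:zone-h.org “zonetransfer.me”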

As related reading, Xuanhun’s earlier piece “DNS information gathering” has been reorganized and published alongside this article; we hope it helps round out the topic, so do take a look!

—————————————————————————————–

For more exciting content, follow our WeChat subscription account xuanhun521.
