
Re: Web App Scanner - GSoC 2009


From: "Rob Nicholls" <robert () everythingeverything co uk>
Date: Sat, 28 Mar 2009 14:31:56 -0000 (UTC)

This sounds like it would make a couple of good NSE scripts.

I've attached a script of mine that only checks a few files and folders at
the moment and would need to be expanded over time. In particular, it could
use more advanced checks for servers that return 200 for non-existent files
(although that might just be a "nice to have") and for identifying folders
that allow directory listings. I thought I'd share it in case others have
suggestions or improvements in mind.
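For the 200-for-everything case, I'm imagining something along these lines.
This is a rough sketch only, not the attached script: the path list, the
random filename, and the "Index of" directory-listing check are all
illustrative.

description = [[Sketch: checks a few common directories, with a guard for
servers that return 200 for every request.]]
author = "sketch"
license = "Same as Nmap--See http://nmap.org/book/man-legal.html"
categories = {"discovery", "safe"}

require "shortport"
require "http"

portrule = shortport.http

action = function(host, port)
  -- Baseline: request a random filename that almost certainly doesn't
  -- exist. If it comes back 200, status codes alone tell us nothing on
  -- this server, so bail out rather than flood the output with false
  -- positives.
  local bogus = "/nse" .. math.random(100000, 999999) .. ".html"
  local baseline = http.get(host, port, bogus)
  if not baseline or baseline.status == 200 then
    return "Server returns 200 for non-existent files; skipping checks."
  end

  -- Illustrative path list; a real script would use a larger database.
  local paths = {"/icons/", "/images/", "/test/", "/webmail/"}
  local results = {}
  for _, path in ipairs(paths) do
    local response = http.get(host, port, path)
    if response and response.status == 200 then
      -- "Index of" in the body is the usual sign of an Apache-style
      -- directory listing.
      if response.body and string.match(response.body, "Index of") then
        table.insert(results, path .. " (directory listing enabled)")
      else
        table.insert(results, path)
      end
    end
  end

  if #results > 0 then
    return table.concat(results, "\n")
  end
end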

Here's some sample output:

Interesting ports on aurora.apache.org (192.87.106.226):
PORT   STATE SERVICE
80/tcp open  http
|  http-enum: /dev/ Possible development directory
|  /icons/ Icons directory
|  /images/ Images directory
|_ /mail/ Mail directory

Interesting ports on scanme.nmap.org (64.13.134.52):
PORT   STATE SERVICE
80/tcp open  http
|_ http-enum: /icons/ Icons directory

Interesting ports on wwwtk2test2.microsoft.com (207.46.193.254):
PORT   STATE SERVICE
80/tcp open  http
|  http-enum: /beta/ Beta directory (Access Forbidden)
|  /data/ Data directory (Access Forbidden)
|_ /test/ Test directory (Access Forbidden)

Interesting ports on xxx.xxx.xxx.xxx (xxx.xxx.xxx.xxx):
PORT    STATE SERVICE
443/tcp open  https
|_ http-enum: /webmail/ Webmail directory

I've used quite a few web application tools over the last few years, and
I'm not sure that a command-line tool like Nmap would be the best place to
add some of the suggested functionality.

Tools like Nikto certainly have their uses, and perhaps we can produce a
similar NSE script so Nmap can match most of that functionality (and
possibly license CIRT's database?). But for "serious" web application
testing I'd personally want a dedicated tool that lets me script logins,
complete forms interactively in a browser, follow JavaScript links, test
sites that use multiple servers/subdomains, and manually crawl a website
in a browser; one that doesn't produce too many false positives and (most
importantly) lets me see the request and response when I'm writing a
report.

Being able to compare the results against some form of vulnerability
database sounds good, but medium to large sites often return several
gigabytes of traffic, which might not be healthy for Nmap (or the host)
to hold in memory and could be tricky to store in XML files etc. (other
tools I've used typically store data in some form of SQL database).

A separate tool, developed alongside Nmap in the way Zenmap and Ncat have
been, might be better than trying to extend Nmap itself: results could be
managed with a GUI and the data stored in a database.

Rob

Attachment: http-enum.nse

