Bugtraq mailing list archives

RE: WebVulnCrawl searching excluded directories for hackable web servers


From: "Michael Scheidell" <scheidell () secnap net>
Date: Wed, 29 Mar 2006 07:51:19 -0500

Just a quick followup and clarification:

-----Original Message-----
From: Michael Scheidell 
Sent: Wednesday, March 15, 2006 8:38 AM
To: bugtraq () securityfocus com
Subject: WebVulnCrawl searching excluded directories for 
hackable web servers


What he is doing violates the robots exclusion standard governing
robots.txt (yes, hackers ignore that too).

An RFC for it was proposed and reviewed in 1996, but never adopted.


The robots.txt file is NOT an access control list and SHOULD NOT
be used to 'hide' directories. All directories should be
protected against directory listing.

Someone mentioned that sometimes you want directory listings.
In that case, turn off directory listing for the specific
directories you don't want listed.
(I don't know why you would put them in robots.txt.)
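On Apache, for example, that means disabling auto-generated indexes in the server configuration rather than naming the directory in robots.txt. A minimal sketch (the path is an assumed example docroot):

```
# Disable auto-generated directory indexes for the docroot.
<Directory "/var/www/html">
    Options -Indexes
</Directory>
```

With indexes off, a request for a directory without an index page returns 403 instead of a file listing, and nothing about the directory is broadcast to crawlers.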

WebVuln Blog stated he was only hitting .com sites.
I have evidence he has moved to .org sites, and in fact, has hit a US
government site as well.
I would hope the US government's IT security folks would know not to
use robots.txt as an ACL, but the web folks aren't always security
folks (web applications themselves are sometimes prone to SQL
injection, XSS attacks, and PHP coding errors), and since there is a
large gap between applications and web development, the chances of
accidentally gathering information that should not be gathered are
huge.

Every security person should review the robots.txt file on their web
site for what it discloses.
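That review can be partly automated. A minimal sketch in Python, assuming you have already fetched your site's robots.txt as text (the function name and sample paths are illustrative, not from any real site):

```python
def disallowed_paths(robots_txt: str) -> list[str]:
    """Return the Disallow paths declared in a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        # Strip trailing comments and surrounding whitespace.
        line = line.split("#", 1)[0].strip()
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow means "allow everything"
                paths.append(path)
    return paths

if __name__ == "__main__":
    sample = """\
User-agent: *
Disallow: /admin/
Disallow: /old-site/   # forgotten, but still advertised
Disallow:
"""
    for p in disallowed_paths(sample):
        print(p)
```

Every path the script prints is a path your robots.txt is announcing to the world; each one should be protected by real access controls, not by the robots file.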


Further, DShield shows them port-scanning the net as well,
looking for unpublished information on unpublished servers.
http://www.dshield.org/ipinfo.php?ip=216.179.125.69&Submit=Submit

So does mynetwatchman:

http://www.mynetwatchman.com/LID.asp?IID=178401366

-- 
Michael Scheidell, CTO
561-999-5000, ext 1131
SECNAP Network Security Corporation

