WebApp Sec mailing list archives

RE: How to list all the URLs on a web server


From: "Ofer Shezaf" <Ofer.Shezaf () breach com>
Date: Fri, 7 Jan 2005 17:28:08 -0500


The first thing that comes to mind is brute forcing. It is effective if
you have some knowledge of the pattern of the file names. 

So if you know that the files are all referenced using something like
/docs/tempXXXXX.pdf, where XXXXX is a random string, a simple Perl
script would allow you to iterate through all the options.
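To illustrate the idea (in Python rather than Perl), here is a minimal sketch that enumerates every candidate name matching the hypothetical /docs/tempXXXXX.pdf pattern and probes each URL. The hostname, charset, and pattern are illustrative assumptions, not details from the original post.

```python
# Brute-force sketch: generate every /docs/tempXXXXX.pdf candidate
# and check which ones the server actually serves.
import itertools
import string
import urllib.error
import urllib.request


def candidate_urls(base="http://example.com/docs",
                   charset=string.ascii_lowercase, length=5):
    """Yield every URL of the form <base>/temp<XXXXX>.pdf."""
    for combo in itertools.product(charset, repeat=length):
        yield f"{base}/temp{''.join(combo)}.pdf"


def probe(url, timeout=5):
    """Return True if the server answers the URL with HTTP 200."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status == 200
    except urllib.error.URLError:
        return False


if __name__ == "__main__":
    for url in candidate_urls():
        if probe(url):
            print("found:", url)
```

With five lowercase letters that is about 11.9 million requests, which is slow but entirely feasible; any pattern knowledge that shrinks the charset or length cuts the work further.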

In many cases you will find that there is logic in the file names that
makes brute forcing much faster. 

Another option is XSS. If you have an XSS vulnerability, then what you
would want to steal from other users is these filenames.

On a higher level, this problem is a lot like the more usual session
management problem. The file name is sort of a "session identifier". If
the names are very long and very random, brute forcing becomes
difficult, just as it does with long and random session identifiers.
And just as session identifiers are a prime target for XSS, so are
these file names.
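To make the analogy concrete, the search space grows exponentially with the length of the random part, so enumerability hinges entirely on how the names are generated. The charsets and lengths below are illustrative assumptions, not figures from the original mail.

```python
# Rough keyspace arithmetic for a random file-name suffix.
def keyspace(charset_size, length):
    """Number of possible names: charset_size ** length."""
    return charset_size ** length


# 5 lowercase letters: small enough to enumerate exhaustively.
short = keyspace(26, 5)

# 32 characters drawn from letters and digits: far beyond any
# practical brute-force effort.
long_ = keyspace(62, 32)

print(short)              # 11,881,376 candidates
print(long_ > 10 ** 50)   # True
```

The same arithmetic is why session-identifier guidance calls for long, unpredictable tokens; a short or patterned file name is the equivalent of a weak session ID.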

Ofer Shezaf
CTO, Breach Security

Tel: +972.9.956.0036 ext.212
Cell: +972.54.443.1119
ofershezaf () breach com
http://www.breach.com

-----Original Message-----
From: Lists [mailto:sakaba () alexandria cc]
Sent: Friday, January 07, 2005 6:35 PM
To: webappsec () securityfocus com
Subject: How to list all the URLs on a web server

Hi Everyone,

I am auditing a system where files are stored on a web server and
accessed without authentication, directly by an application that knows
each file's URL.  I don't like it, but the app owner wants me to
demonstrate that someone could guess the URLs.  I have tried a number
of spider tools, but they rely on links, so they don't find anything.

I am wondering if there is a tool or another method I could use to find
all the URLs on the web site.  The funny thing is, I saw this same
kind of system with the same explanation just the other week at another
company.  Maybe it's a new trend...

Regards,
sakaba
