Nmap Development mailing list archives

RE: NSE: http-phpself-xss - Finds PHP files with reflected cross site scripting vulns due to unsafe use of the variable $_SERVER[PHP_SELF]


From: King Thorin <kingthorin () hotmail com>
Date: Fri, 1 Jun 2012 15:11:28 -0400


To: nmap-dev () insecure org
From: Paulino Calderon <paulino () calderonpale com>
Date: Thu, 31 May 2012 01:19:56 -0500

Hi list,

Here is a script for detecting reflected XSS in PHP files that don't 
sanitize the variable $_SERVER["PHP_SELF"]:
description=[[
Crawls a web server looking for PHP files that use the variable 
$_SERVER["PHP_SELF"] unsafely.
This script crawls the web server to create a list of PHP files and then 
sends an attack vector/probe to each of them to identify PHP_SELF cross 
site scripting vulnerabilities.
PHP_SELF XSS refers to reflected cross site scripting vulnerabilities 
caused by the lack of sanitization of the variable 
<code>$_SERVER["PHP_SELF"]</code> in PHP scripts. This variable is 
commonly used in PHP scripts that display forms and whenever the current 
URI is needed.

Examples of Cross Site Scripting vulnerabilities in the variable 
$_SERVER[PHP_SELF]:
*http://www.securityfocus.com/bid/37351
*http://software-security.sans.org/blog/2011/05/02/spot-vuln-percentage
*http://websec.ca/advisories/view/xss-vulnerabilities-mantisbt-1.2.x

The attack vector/probe used is: <code>/'"/><script>alert(1)</script></code>
]]
---
-- @usage
-- nmap --script=http-phpself-xss -p80 <target>
-- nmap -sV --script http-phpself-xss <target>
-- @output
-- PORT   STATE SERVICE REASON
-- 80/tcp open  http    syn-ack
-- | http-phpself-xss:
-- |   VULNERABLE:
-- |   Unsafe use of $_SERVER["PHP_SELF"] in PHP files
-- |     State: VULNERABLE (Exploitable)
-- |     Description:
-- |       PHP files are not handling the variable $_SERVER["PHP_SELF"] safely,
-- |       causing reflected cross site scripting vulnerabilities.
-- |
-- |     Extra information:
-- |
-- |   Vulnerable files with proof of concept:
-- |     http://calder0n.com/sillyapp/three.php/%27%22/%3E%3Cscript%3Ealert(1)%3C/script%3E
-- |     http://calder0n.com/sillyapp/secret/2.php/%27%22/%3E%3Cscript%3Ealert(1)%3C/script%3E
-- |     http://calder0n.com/sillyapp/1.php/%27%22/%3E%3Cscript%3Ealert(1)%3C/script%3E
-- |     http://calder0n.com/sillyapp/secret/1.php/%27%22/%3E%3Cscript%3Ealert(1)%3C/script%3E
-- |   Spidering limited to: maxdepth=3; maxpagecount=20; withinhost=calder0n.com
-- |     References:
-- |       https://www.owasp.org/index.php/Cross-site_Scripting_(XSS)
-- |_      http://php.net/manual/en/reserved.variables.server.php
-- @args http-phpself-xss.uri URI. Default: /
-- @args http-phpself-xss.timeout Spidering timeout. Default: 10000
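
A minimal sketch of the kind of check described above, for readers who want the
idea without opening the attachment. It is illustrative only: the attached
http-phpself-xss.nse is the authoritative implementation, and the helper name
launch_probe and the exact matching logic are assumptions, not necessarily what
the attachment does. It only relies on the standard NSE http library.

local http = require "http"

-- Probe and payload taken from the description above.
local PROBE   = "/%27%22/%3E%3Cscript%3Ealert(1)%3C/script%3E"
local PAYLOAD = [['"/><script>alert(1)</script>]]

-- Append the probe to a crawled .php path and report the file as
-- vulnerable if the payload is reflected back without being escaped.
local function launch_probe(host, port, php_path)
  local response = http.get(host, port, php_path .. PROBE)
  return response and response.body and
         response.body:find(PAYLOAD, 1, true) ~= nil
end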

--
Paulino Calderón Pale
Website: http://calderonpale.com
Twitter: http://twitter.com/calderpwn

Attachment:
http-phpself-xss.nse

Would there be a way (and would it make sense) to let HTTP scripts hook into a single shared crawler and test each 
page as it is fetched, so that the same content isn't crawled/spidered over and over again for every selected HTTP 
script?
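
For context, a rough sketch of the current per-script pattern (assuming the standard NSE httpspider API; the function
name spider is illustrative, and nothing here is the actual http-phpself-xss code):

local httpspider = require "httpspider"

local function spider(host, port)
  -- Each script builds and drives its own crawler ...
  local crawler = httpspider.Crawler:new(host, port, "/",
                                         { scriptname = SCRIPT_NAME })
  crawler:set_timeout(10000)

  while true do
    local status, r = crawler:crawl()
    if not status then break end
    -- ... so r.url/r.response are seen by this one script only, and the same
    -- pages get re-fetched by every other spidering script that runs.
  end
end

A shared crawler would instead have to hand each fetched page to every registered script.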
                                                                                  
_______________________________________________
Sent through the nmap-dev mailing list
http://cgi.insecure.org/mailman/listinfo/nmap-dev
Archived at http://seclists.org/nmap-dev/

