Full Disclosure mailing list archives

Re: Re: Anyone know IBM's security address? + Google Hack


From: Jason Coombs PivX Solutions <jasonc () science org>
Date: Sat, 07 Aug 2004 11:42:33 -1000

Aaron Gray wrote:
> It turns out I was going about the process of vulnerability notification all wrong. I should have gone to the United States Computer Emergency Readiness Team to report them. The US-CERT home page provides an email address cert () cert org for

Aaron,

So you've found yet another zero-day vulnerability in a closed-source software product. If you're not going to share it with the rest of us, please consider keeping it to yourself.

Vulnerability notification is a very complex subject with no single right or wrong answer. It is, as you say, a "process".

Full-Disclosure offers a venue whereby fair, fast (depending on how close to the bottom of the subscriber list you are, perhaps less fast), and unmoderated public notification of vulnerabilities can occur. The vendor in question is a member of the public, and whether or not they are CC'ed, public full disclosure of a new vulnerability will reach them through this channel.

Considering that direct notification of vendors, or of those directly affected by a vulnerability, undertaken without the help of an attorney has in the past caused severe legal problems and jail time for the vulnerability researcher, it is a good idea to consider the alternatives before rushing headlong into any single party's preferred vulnerability disclosure "process".

You are posting to full-disclosure because you have considered the potential value of immediate public disclosure, and you are right to consider this option. Depending on the circumstances and the technical nature of the problem, it can be the best option. One virtue of full disclosure that is absent from every other vulnerability disclosure process is that it allows members of the public who are at risk to take immediate action: to defend their vulnerable systems, and to examine them forensically for evidence that the vulnerability was previously exploited by somebody who knew of it while it was still being kept secret. Full disclosure puts an end to the secrecy and triggers waves of protective incident response, similar to what is supposed to occur every time a vendor releases a patch.
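To make the forensic point concrete: once details are public, any administrator can sweep historical logs for the attack's signature and answer the "was I already hit?" question. Below is a minimal, hypothetical sketch in Python; the CGI path and pattern are invented for illustration and do not come from any real advisory.

    import re

    # Invented attack signature, for illustration only.
    SIGNATURE = re.compile(r"GET /cgi-bin/widget\.cgi\?name=[^ ]*%00")

    def scan_log(path):
        # Return every historical log line matching the published signature.
        with open(path, errors="replace") as log:
            return [line.rstrip() for line in log if SIGNATURE.search(line)]

    # Usage: for hit in scan_log("access_log.2004-07"): print(hit)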

A primary difference between incident response when a patch is released and incident response when public full disclosure occurs is that the patch creation process leaves a lengthy period of vulnerability "in secret" or vulnerability "by rumor". Full public disclosure lets everyone know at once that there is a vulnerability "in fact", and the details allow independent third-party researchers to analyze and discover the true scope and root cause of the vulnerability.

Full disclosure also allows those who are at risk to determine the extent of the risk given their unique circumstances. It avoids the common problem where the vendor and the researcher who discovered a new vulnerability both fail to consider the unique circumstances of a particular deployment scenario in which the risk posed by the vulnerability is aggravated. The only party who can identify such aggravating circumstances is the party with intimate knowledge of its own operating environment. Full disclosure gives those with that knowledge the ability to recognize circumstances that amplify their risk exposure, while patch releases and security alerts that announce the existence of risk but withhold details leave everyone guessing.

Oftentimes a vendor will misunderstand the vulnerability. Most vendors misunderstood cross-site scripting for many years and did not react properly to its threat. Windows XP Service Pack 2 contains the first truly significant resolution of certain XSS flaws in IE that I have seen delivered by Microsoft, and the difficulty the security research community has had in communicating the XSS risk to the vendor must have contributed to the number of years it took to arrive at a more adequate technical resolution, one that *should* be capable of blocking an entire class of attacks in advance. Ironically, Scob and Download.Ject may actually have helped clarify the true nature of the XSS risk.
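For readers who have not looked closely at this class of flaw, here is a minimal, hypothetical sketch (in Python, with invented page and parameter names, not code from any vendor) of a reflected XSS bug and the output-encoding fix involved:

    from html import escape

    def vulnerable_page(params):
        # BAD: attacker-controlled input is echoed straight into the HTML,
        # so ?search=<script>...</script> runs in the victim's browser, in
        # the site's security context.
        return "<p>Results for: %s</p>" % params["search"]

    def patched_page(params):
        # BETTER: encode on output. This fixes the symptom; the root cause
        # (mixing untrusted data with markup) calls for encoding everywhere.
        return "<p>Results for: %s</p>" % escape(params["search"])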

With a patch from a vendor that fixes the symptom demonstrated by a single researcher such as yourself, but no opportunity for other researchers to peer review and build upon your analysis and the vendor's, both of which are usually filtered to remove any proof of concept or detailed technical analysis of the flaw, nobody will know whether the root cause of the vulnerability has in fact been discovered and repaired. The public will know there is a patch, and will have to hope that the vendor, third-party security researchers, and organizations such as CERT have assessed the flaw correctly and have chosen the right words to communicate its severity clearly.

Incident response to the release of a security patch is thus inherently flawed by its nature. Incident response to full disclosure of vulnerability details and proof-of-concept code, by contrast, is free of the flaws that cloud a clear understanding of the security risk. The resistance that many people have to full disclosure revolves around their own business interests, which, as a security researcher, I do not find to be compelling arguments. The argument that it is better to "tell the vendor" than to tell "the public" because there are bad people with bad intent out there in "the public" is somewhat odd, and looks to me like artificial hysteria. The security researcher has no way to know who is listening to, and spreading, the communication they are having with "the vendor". We know that people talk to other people, and that keeping secrets secret is quite difficult. Many closed-source vendors can't even manage to keep their source code secret.

Not telling the public results in the release of a vague danger alert (patch + advisory) that calls attention to the area of danger, where bad people know they can go to discover the technical details of the flaw given enough effort. Binary analysis of the patched versus unpatched code will show an attacker exactly what a proof of concept would have shown the public, except that now only the bad people have a full understanding of the threat. This gives attackers an advantage over the public, because the public is always inadequately informed.
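To illustrate how little effort that binary analysis can require, here is a crude, hypothetical sketch that merely locates the byte ranges differing between the unpatched and patched files (the file names below are invented). It naively assumes the patched binary keeps a similar layout; real patch-diffing tools compare disassembly function by function, but even this pass points an attacker toward the modified code:

    def changed_regions(old_path, new_path, chunk=4096):
        # Compare two files chunk by chunk; return byte offsets of chunks
        # that differ. Differences cluster around the patched routine(s).
        regions = []
        with open(old_path, "rb") as old, open(new_path, "rb") as new:
            offset = 0
            while True:
                a, b = old.read(chunk), new.read(chunk)
                if not a and not b:
                    break
                if a != b:
                    regions.append(offset)
                offset += chunk
        return regions

    # Usage: changed_regions("mod_widget.dll", "mod_widget_patched.dll")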

Vendors of closed-source products cannot advocate full disclosure, legally, or they could be held liable for harm done by the disclosure. The vendor cannot be held liable, today, for harm done by the existence of a vulnerability, or for creating an uneven playing field where attackers always have the advantage over the public. Yet, strangely, they could be held liable for encouraging the public disclosure of vulnerability details while they simultaneously attempt to keep their "source code" secret.

Opening all source code to public inspection would solve the legal liability problem with respect to discussions and research around vulnerabilities in that source code. Until source code publication is mandatory for anyone who expects legal protection under copyright and other intellectual property law, we will continue to see irrational resistance to full disclosure. The security research community understands that the full disclosure process is superior in many respects, and that it could be made substantially better in every respect, compared to any other process of vulnerability disclosure, through additional infrastructure and some sensible changes (or just clarifications) to applicable laws. It also understands that today it is up to the individual researcher to decide how, and whether, to proceed with notification and publication, or to do what many researchers have concluded they must do for the time being: just sit and watch, and shake their heads in disbelief at the absurd comedy of errors that is modern-day computer security.

So you've found yet another zero-day vulnerability in a closed-source software product. If you're not going to share it with the rest of us, please consider keeping it to yourself.

Sincerely,

Jason Coombs
jasonc () science org

_______________________________________________
Full-Disclosure - We believe in it.
Charter: http://lists.netsys.com/full-disclosure-charter.html

