Bugtraq mailing list archives
Re: cache cookies?
From: Thomas Reinke <reinke () E-SOFTINC COM>
Date: Thu, 14 Dec 2000 02:06:48 -0500
Clover Andrew wrote:
> http://www.princeton.edu/pr/news/00/q4/1205-browser.htm
>
> or is it snake oil?
>
> Well, it *can* work. But I don't think the release's claims of being 'very reliable', 'very dangerous [to privacy]' and 'countermeasure-proof' are justified.
Actually, it *does* work. We have on our site a working demonstration of the exploit, showing whether or not you've visited one or more of more than 80 well-known sites. The URL is http://www.securityspace.com/exploit/exploit_2a.html

We've found with the demo that:

a) It is as reliable as the ability to find an image that would be cached by the browser. The timing is very accurate, but other factors can fool the mechanism. Out of the 80-odd sites we tested, we had 3 false negatives.

b) Dangerous is subjective - a malicious site CAN find out what sites you have visited. How much can they do with it? Well, that's up to the imagination. Certainly I doubt (hope?) that larger organizations would stoop to this trick, but I honestly see nothing preventing advertising organizations and the like from doing this, other than the uproar it would cause in the industry.

c) Countermeasure-proof... short of forcing all caching to be disabled, I'm not sure what can be done. The technique demonstrated used JavaScript. It could have been done with Java, and it could even have been done, to a certain extent, without any scripting at all.
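The core of the technique described above is timing how long an image from the target site takes to load: a cached image completes in a few milliseconds, a network fetch takes far longer. A minimal sketch, assuming an illustrative 50 ms threshold and hypothetical function names (the actual demo's code is not shown in this thread):

```javascript
// Classify a measured load time as a cache hit ("visited") or a network
// fetch ("not-visited"). The threshold is an assumption for illustration;
// a real probe would calibrate it per connection.
function classifyTiming(elapsedMs, thresholdMs) {
  return elapsedMs < thresholdMs ? "visited" : "not-visited";
}

// Browser-side driver (illustrative; requires a DOM to actually run).
// Load an image known to appear on the target site and time the fetch.
function probe(imageUrl, thresholdMs, report) {
  const start = Date.now();
  const img = new Image();
  img.onload = img.onerror = function () {
    report(imageUrl, classifyTiming(Date.now() - start, thresholdMs));
  };
  img.src = imageUrl;
}
```

As the post notes, false negatives happen when "other factors" (a slow disk cache, an evicted entry, a shared proxy) push a cached load over the threshold.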
> This can easily be foiled by turning off JavaScript on untrusted sites or by setting cache policy to check for newer versions of documents on every access. It is already likely to be confused by shared proxy caches and by setups where there is no local cache.
JavaScript isn't required. It can also be done with Java, and can even be done (albeit with less accuracy and much more noticeably) without any scripting at all. Forcing documents to reload on every access, however, would seriously slow down network access in many cases.
> Calling it a 'cache cookie' is overselling it a bit IMHO - it can't contain a value, only a yes/no response for each possible key (URL), and an unreliable one at that. Trawling many URLs at once would be slow, and the user would be more likely to notice it.
Cache cookie - clever terminology, I agree. But as you note, it can store yes/no responses. As for the user noticing it? Not likely. If a page limited itself to trawling 10-15 sites at a time, and only after the current page had loaded, the user might never notice at all.
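The "10-15 sites at a time" pacing above could be sketched as follows. The function and variable names are illustrative assumptions, not code from the demo:

```javascript
// Split the full list of target URLs into small batches so that no single
// burst of image requests is large enough for the user to notice.
function makeBatches(urls, batchSize) {
  const out = [];
  for (let i = 0; i < urls.length; i += batchSize) {
    out.push(urls.slice(i, i + batchSize));
  }
  return out;
}

// In a browser, probing would start only after the page's own content has
// finished loading, e.g.:
//   window.addEventListener("load", function () {
//     runBatchesSlowly(makeBatches(targetUrls, 15));
//   });
```

Spacing the batches out after the page's `load` event keeps the probe traffic from competing with (and visibly delaying) the page the user actually asked for.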
> Since the act of running the cache-bug will itself cache the target URL, it's also likely to get confused, reporting false cache hits caused by itself and possibly by other cache bugs.
That is actually trivial to bypass with a simple flag that indicates what has and has not already been checked.
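A minimal sketch of that "simple flag" idea, with illustrative names (a period-correct script would use a plain object rather than `Set`):

```javascript
// Remember which URLs this page has already probed, so a later probe
// doesn't mistake the page's own earlier fetch for a genuine cache hit.
const alreadyProbed = new Set();

function shouldProbe(url) {
  if (alreadyProbed.has(url)) {
    // We cached this URL ourselves; probing it again would report a
    // false positive, so only the first measurement counts.
    return false;
  }
  alreadyProbed.add(url);
  return true;
}
```

Only the first timing for each URL is recorded; every subsequent request for that URL is ignored, so the probe's own side effect on the cache never pollutes the results.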
> I wouldn't be too worried though. JavaScript implementation bugs and client-side-trojan problems are currently far worse in my opinion.
I agree.
> --
> Andrew Clover
> Technical Support
> 1VALUE.com AG
--
------------------------------------------------------------
Thomas Reinke                        Tel: (905) 331-2260
Director of Technology               Fax: (905) 331-2504
E-Soft Inc.                  http://www.e-softinc.com
Publishers of SecuritySpace  http://www.securityspace.com