HSTS Super Cookies (radicalresearch.co.uk)
197 points by Thomashuet on Jan 3, 2015 | hide | past | favorite | 53 comments


This privacy risk is actually documented in the official HSTS specification, section 16.9 of

https://www.rfc-editor.org/rfc/rfc6797.txt

However, the spec doesn't propose a mitigation for it. I'm afraid many new security policy mechanisms can actually be used to track users or devices this way, because you can experiment to see whether the browser has heard about a particular security policy by observing its behavior when you ask it to violate the policy. If you tell different devices about different policies, their behavior will be different (as if you told different kids who were going to visit a park about different rules for how to behave in the park, and then observed who obeyed and who violated which rules as a way of identifying individual kids).
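The encode/read cycle can be sketched in a few lines of Python. This is a simulation only: the `set` stands in for the browser's HSTS store, and the `bitN.tracker.example` subdomains are hypothetical; a real tracker would set each bit by serving a `Strict-Transport-Security` header from that subdomain, and read it back by seeing which plain-http probes get silently upgraded to https.

```python
# Hypothetical tracker subdomains: each one carries 1 bit, depending on
# whether the browser has recorded an HSTS policy for it.
SUBDOMAINS = [f"bit{i}.tracker.example" for i in range(32)]

def write_id(device_id: int, hsts_store: set) -> None:
    """Encode device_id by setting HSTS on a subset of subdomains.

    In a real attack the page would load https://bitN.tracker.example/
    for every N where the bit is 1, and that response would carry a
    Strict-Transport-Security header."""
    for i, host in enumerate(SUBDOMAINS):
        if (device_id >> i) & 1:
            hsts_store.add(host)

def read_id(hsts_store: set) -> int:
    """Recover the id by probing each subdomain over plain http.

    The browser upgrades http:// to https:// only for hosts it has an
    HSTS entry for, which the server can observe."""
    device_id = 0
    for i, host in enumerate(SUBDOMAINS):
        if host in hsts_store:  # stands in for "probe arrived as https"
            device_id |= 1 << i
    return device_id

browser_hsts = set()
write_id(0xC0FFEE, browser_hsts)
assert read_id(browser_hsts) == 0xC0FFEE
```

The same read-by-violation trick works for any per-site security policy the browser remembers, which is exactly why these mechanisms keep reappearing as tracking vectors.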

For example, you can also get tracking out of public key pinning, by selectively pinning certs for some subdomains and not others, and then seeing which subresources are successfully loaded when you present a huge number of pin violations. (I think that's also documented in the HPKP spec.)


The HPKP tracking problem is described in section 5

https://datatracker.ietf.org/doc/draft-ietf-websec-key-pinni...

which also includes another description of this HSTS problem.


We also have a pretty comprehensive discussion of this and many other vectors in:

http://www.chromium.org/Home/chromium-security/client-identi...


Thanks, I didn't know about that article.


See also the Firefox bug from 2011: https://bugzilla.mozilla.org/show_bug.cgi?id=648186


Firefox stores HSTS entries in a SQLite database, which you can query by running:

  echo "SELECT * FROM moz_hosts WHERE type='sts/use';" | sqlite3 permissions.sqlite
from inside your profile directory.

To clear HSTS entries (which the "Clear recent history" UI does not delete), you can do:

  echo "DELETE FROM moz_hosts WHERE type='sts/use';" | sqlite3 permissions.sqlite
I've been periodically monitoring this database for HSTS supercookies over the last couple years and have yet to see any in the wild.


They were recently moved to the SiteSecurityServiceState.txt file: https://bugzilla.mozilla.org/show_bug.cgi?id=775370


Good grief. They've replaced a SQLite database with a text file that's loaded into an in-memory hash table because "adoption of HSTS is not very widespread yet" so "using any kind of off-the-shelf database to store this would be inefficient and overly complex." This required a patch that took over a year to review, during which time issues were raised with the text file parser they had to write from scratch. Loading the whole table into memory has clear DoS implications, so they're limiting the table to 1024 entries (making the use of a hash table rather silly), with an eviction strategy[1] that is going to favor older entries and effectively prevent newer entries from being added.

[1] https://hg.mozilla.org/releases/mozilla-aurora/rev/b339d53f9...


Ran the above query on my Fx 34.0.5 profile, which is set to clear all history, cookies, cache, etc. on close of each session.

Among the entries was one named track.nextuser.com

"About NextUser: We believe every user should have an experience personalized for them..."

That doesn't sound very promising.


> Among the entries was one named track.nextuser.com

In order to be a supercookie they need more than one entry (since each entry only stores 1 bit of information). Do you see any other entries that look like they could be associated with this one?


Helpful. Thanks.


> The impact is that it's possible for a site to track you even if you choose to use "incognito" or "private" browsing features in an effort to avoid such tracking.

I've always thought that (despite user hopes) the point of 'private' browsing was explicitly and only to avoid leaving traces on the user's computer anyway. (For example, I used it when shopping for Christmas presents.) The Firefox new private window has a warning to this effect:

> While this computer won't have a record of your browsing history, your employer or internet service provider can still track the pages you visit.


Google employees have explicitly agreed with this in the past, indicating that Incognito is explicitly not meant to anonymize the user: https://code.google.com/p/chromium/issues/detail?id=142214#c...


Many people use it to spawn what is effectively a "guest session" in which they can log into some site with account B when they're already normally logged in with account A, without having to log out of account A. Or, similarly, to temporarily deactivate such things as Google's per-user search results personalization. Sites that leak credentials into such "guest sessions" break their usage for such purposes. (In fact, since in Chrome incognito windows are just a special case of user profile switching, HSTS probably leaks credentials between user profiles as well, completely mooting the point of them.)


I don't think HSTS completely moots user profiles. Most "good" sites are not going to be storing session data encoded as HSTS timestamps.


>However, unlike cookies, existing HSTS flags are still shared with sites when using "incognito" or "private" windows.

fwiw, though Firefox is listed in there as "leaks across private mode", I get an entirely new ID when I open a private window. v34.0.5


again FWIW, Chrome v39.0.2171.95 on OSX leaks


34.0 and no leak here either


This has been known for quite a while. I managed to find a case where HSTS allowed information leakage between private/non-private frames within the same browser in Firefox, but I think that's been fixed.

In general, the browser vendors seem to think that HSTS is worth the potential privacy leak. I've also heard some people say they're monitoring to see if anyone does it and will respond if it becomes a problem.


I'm actually pretty irritated that this researcher makes it out to be an iOS-only thing; it feels like he/she just didn't care to try it on anything other than the device they had in front of them.

Chrome on Android behaves the same way the researcher described (fingerprinting works in Incognito tabs), but Chrome, Opera, Firefox, and IE on Windows all get different IDs.


Chrome 39.0.2171.71 on OSX 10.9.4 is also vulnerable.


I got the same ID on Chrome on Windows.


Uh-oh, looks like I made a mistake. I get the same behavior on Chrome on Windows too.


There's a nice survey paper from 2012 that lists dozens of supercookie vectors, including HSTS.

https://cyberlaw.stanford.edu/files/publication/files/tracki...

FTA: "A website can encode a globally unique pseudonymous device identifier into any stateful web technology so long as it persists at least log2 n bits, where n is the number of Internet-connected devices (presently roughly 5 billion, requiring 33 bits)."
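The paper's arithmetic checks out, and since each HSTS entry stores one bit, it translates directly into a subdomain count:

```python
import math

devices = 5_000_000_000  # the paper's estimate of Internet-connected devices
bits_needed = math.ceil(math.log2(devices))
print(bits_needed)  # 33

# With HSTS, one bit per subdomain: ~33 tracker subdomains are enough
# to give every device on the Internet a unique identifier.
```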


I have Chromium set to delete browsing data on close. The HSTS Cookie survives that. Manually deleting browsing data kills it.


That would somehow suggest to me (without having looked at the code) that it was implemented this way on purpose.


The basic background problem is that in the "normal" case, HSTS is a security and privacy protection rather than a tracking mechanism. That's why one would typically want it to persist as much as possible. But it has the potential for tracking effects too (as this project demonstrates). I guess the current browser behavior is an attempt to respect likely user intent given that tension.


My point was the subtle difference between automated data clean at end of browsing session and the reported manual action. Nothing else.


Seems Chrome has addressed the issue with incognito mode - if you open the page in incognito mode you get a different code.


Not for me. Mac OS X 10.10.1, Chrome 39.0.2171.95. I get the same code even in incognito mode.


Well that's interesting. This may shed light on some of the seemingly conflicting results. On Mac OS X 10.10.1, Chrome 41.0.2264.2 (Canary):

Steps:

1. Open browser, open [1] in new tab. Get code X.

2. Open [1] in new incognito window. Get code Y.

3. Reload that incognito window. Get code Y.

4. Close incognito window.

5. Open [1] in second new incognito window. Get code X.

Subsequent iterations of opening/closing regular and incognito windows and/or restarting Chrome all yielded code X.

[1] http://www.radicalresearch.co.uk/lab/hstssupercookies/


Perhaps you started (2) before (1) finished?


Same here, got same code on OSX 10.9.5 Chrome 39.0.2171.95


Chrome has different SSL validation on each operating system - on Linux it uses its own NSS instance, on OS X and Windows it uses the OS-level certificate validation routines. Not sure how HSTS plays into that.


Not on my Chromebook Pixel.


Confirmed for me on Mac OS X 10.9.5, Chrome 39.0.2171.95


Not for me running Chrome 39.0.2171.95 on Fedora 21.


Same code for Chrome 39.0.2171.95 on Windows...


I'm not quite sure if I wouldn't _expect_ the incognito mode to respect HSTS. I'd think that you would use incognito mode for ~sensitive~ tasks.

Defaulting to https due to a known HSTS flag seems good in this case, otherwise every incognito session would start out blank, right? (I'm ignoring the white list from the browser vendor)


It sounds like HTTPS Everywhere is overlapping functionality with HSTS. Is there some way that HTTPS Everywhere could just inject HSTS rules rather than looking up every URL and rewriting it before sending a request?


HSTS can only change the protocol, while HTTPS everywhere can do more complex rewritings. So what you propose could only work in a very limited set of use cases among those handled by the plugin.
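A rough sketch of the difference (hostnames and the rewrite pattern are made up for illustration): HSTS can only flip the scheme, while an HTTPS Everywhere-style rule can rewrite the host and path too.

```python
import re
from urllib.parse import urlsplit, urlunsplit

def hsts_upgrade(url: str) -> str:
    """All HSTS can express: http -> https for a known host."""
    parts = urlsplit(url)
    return urlunsplit(("https",) + tuple(parts[1:]))

def https_everywhere_rule(url: str) -> str:
    """An HTTPS Everywhere-style rule can also rewrite the host
    (the from/to pattern here is hypothetical)."""
    return re.sub(r"^http://(?:www\.)?example\.com/",
                  "https://secure.example.com/", url)

print(hsts_upgrade("http://example.com/page"))
# https://example.com/page
print(https_everywhere_rule("http://www.example.com/page"))
# https://secure.example.com/page
```

So injecting HSTS rules could only replace the plugin's simplest rulesets, the pure scheme-upgrade ones.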


It seems like the main use case for HSTS is with the site being requested by the user in the URI bar, for protecting cookies and login credentials associated with that domain.

There does not seem to be a major use case for secondary resources - the images, CSS, JavaScript, etc. loaded by the page itself, which serve as the vector in this attack. Such resources must already be requested via https on an https page anyway.

So, wouldn't it be better to just restrict the usage of HSTS protocol overrides to just the main domain being requested by the user in the URI bar?


In Firefox 34.0 on OS X, both Incognito Mode and clearing history from the past hour appeared to defeat this attack.


Worked a charm on my iPad between regular and incognito mode. Fascinating, and a bit unfortunate.


Am I right to understand that this can even be a server side cookie, i.e. that it can't even be killed by disabling javascript (since the server can tell if there was a redirect)?
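Yes - no JavaScript is needed for the read-back. The page can embed plain `http://` subresources (e.g. tracking pixels) and the server simply checks which scheme each request actually arrived over. A minimal sketch of that server-side check (the hostname in the comment is hypothetical):

```python
def read_bit(request_scheme: str) -> int:
    """The page embeds e.g. <img src="http://bitN.tracker.example/px.gif">.
    If the browser has an HSTS entry for that host, it silently rewrites
    the request to https before sending it - so the scheme the server
    observes *is* the stored bit."""
    return 1 if request_scheme == "https" else 0

assert read_bit("https") == 1  # HSTS entry present: browser upgraded
assert read_bit("http") == 0   # no HSTS entry: request came in plain
```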


Wow, can't get it to go away on Firefox 33.0 on Ubuntu. I was able to clear it by manually deleting the info from the permissions.sqlite database as described by agwa.

Very clever!


Under History > Clear Recent History, make sure "Site Preferences" is also checked.


Using Firefox 34.0 on Ubuntu, I get a different code in the private browsing window. No add-ons are enabled.


It's targeted to iOS devices, so that is not very surprising.


imho it's not reasonable to perform dozens of HTTP requests just to create a device fingerprint; on mobile networks especially, it will take quite a while for all the requests to complete.


If it happens out of band of the page doing anything else, the time required won't matter much, especially given the value of a truly persistent tracking cookie.


Poses no problem to Firefox 34 on GNU.


firefox private browsing breaks it



