[tor-dev] Improving Private Browsing Mode/Tor Browser

Mike Perry mikeperry at fscked.org
Thu Jun 23 16:55:45 UTC 2011


Thus spake Georg Koppen (g.koppen at jondos.de):

> >> And why have add-ons again that can be toggled on/off and are thus
> >> more error-prone than just having a, say, Tor anon mode?
> >> Or is this already included in the Tor anon mode and only separated
> >> in the blog post for explanatory purposes?
> > 
> > If we operate by upgrading private browsing mode, we'll effectively
> > have the "toggle" in a place where users have already been trained by
> > the UI to go for privacy. Torbutton would become an addon that is only
> > active in private browsing mode. 
> 
> Okay. That means there is no additional toggling of Torbutton in this
> enhanced private mode. The user just enters it, Torbutton is running
> and doing its job, and if the user does not want it anymore she does
> not toggle anything but simply leaves this enhanced private browsing
> mode, and that's it, right?

That's correct. If the user wants their regular private browsing mode
back, they would presumably uninstall the extension.

> >> If one user requests
> >> google.com, mail.google.com and other Google services within the
> >> 10-minute interval (I am simplifying here a bit) without deploying
> >> TLS, the exit is still able to link the whole activity and "sees"
> >> which services that particular user is requesting/using. Even worse,
> >> if the browser session is quite long there is a chance of recognizing
> >> that user again if she happens to get the same exit mix more than
> >> once. Thus, I do not see how that helps avoid linkability for users
> >> that need/want strong anonymity while surfing the web. It would be
> >> good to get that explained in some detail. Or maybe I am missing a
> >> point here.
> > 
> > We also hope to provide a "New Identity" functionality to address the
> > persistent state issue, but perhaps this should also be an explicit
> > responsibility of the mode rather than the addon.
> 
> Hmmm... If that is the answer to my questions, then the concept offered
> in the blog post does nothing to prevent tracking by exit mixes. Okay.

That is not entirely true. Because identifiers would be linked to the
top-level urlbar domain, gone are the days when exits could insert an
iframe or web-bug into an arbitrary page and use that to track the
user for the duration of the session, across page views.

Instead, they would be pushed back to doing some sort of top-level
redirect (which we hope would be way more visible), or maybe not even
that, depending on how we define redirects with respect to
"top-level".

So no, we are not completely abandoning exits as an adversary with
this threat model. If I'm wrong about something, or you think there
are still attacks exits can perform that we should address somehow,
let me know.

> How should the "New Identity" functionality work? Is
> that identity generated automatically after a certain amount of time has
> passed or does a user have to click manually on a button every time?

I don't know the answer here. This may vary by browser and use case.
For a communications-suite-style use case, I think we probably want to
detect inactivity and ask the user if they want to clear state,
because communications suites are heavy and a pain to relaunch (hence,
once opened, they will probably stay open).

For something lighter, like Chrome's Incognito, we may just rely on
the user to leave the mode. This divergence is one of the reasons I
didn't mention the feature in the blog post. 

If you want to track what solution we ultimately deploy for TBB, here
is the ticket you should follow:
https://trac.torproject.org/projects/tor/ticket/523
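
For illustration only, here is a rough TypeScript sketch of the
inactivity-detection idea for the communications-suite case. The
names and the 30-minute threshold are my own assumptions, not
anything decided in that ticket:

    // Hypothetical sketch: after IDLE_LIMIT_MS without user
    // activity, ask whether to clear browsing state ("New
    // Identity"). Not the actual TBB implementation.
    const IDLE_LIMIT_MS = 30 * 60 * 1000; // assumed threshold

    let idleTimer: ReturnType<typeof setTimeout> | undefined;

    function clearBrowsingState(): void {
      // Placeholder: drop cookies, cache, window.name, etc.
    }

    function onUserActivity(): void {
      if (idleTimer !== undefined) clearTimeout(idleTimer);
      idleTimer = setTimeout(() => {
        // A real browser would use a non-modal prompt here.
        if (confirm("You have been idle. Clear your identity?")) {
          clearBrowsingState();
        }
      }, IDLE_LIMIT_MS);
    }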
 
> >> Assuming I understood TorButton's
> >> Smart-Spoofing option properly: Why is it not applied to the
> >> referer/window.name anymore? In other words: Why is the referer (and
> >> window.name) not kept if the user surfs within one domain (let's say
> >> from example.com to foo.example.com and then to foo.bar.example.com)?
> > 
> > I don't really understand this question. The referer should be kept in
> > these cases.
> 
> That sounds good. Then we probably had just different concepts of SOP in
> mind. I was thinking about
> http://tools.ietf.org/html/draft-abarth-origin-09 (see: section 3 and
> 4). That would treat http://example.com, http://foo.example.com and
> http://foo.bar.example.com as different origins (let alone mixing
> "http://" and "https://" and having different ports).

Yeah. The reality is we're basically picking an arbitrary heuristic
for squelching this information channel to find some sweet spot that
minimizes breakage for maximal gain. True same-origin policy may or
may not be relevant here.
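
For reference, the strict comparison from that draft treats two URLs
as same-origin only if scheme, host, and port all match exactly. A
minimal sketch using the standard URL parser (not Torbutton code):

    // Strict origin comparison in the style of draft-abarth-origin:
    // http://example.com and http://foo.example.com are different
    // origins, as are http:// and https:// on the same host.
    function sameOrigin(a: string, b: string): boolean {
      const ua = new URL(a);
      const ub = new URL(b);
      return ua.protocol === ub.protocol &&
             ua.hostname === ub.hostname &&
             ua.port === ub.port;
    }

    sameOrigin("http://example.com/", "http://foo.example.com/"); // false
    sameOrigin("http://example.com/", "https://example.com/");    // false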

Since I personally believe any heuristic squelch is futile against bad
actors, I haven't thought terribly hard about the best "sweet spot"
policy. I just took what Kory Kirk came up with for a GSoC project and
tweaked it slightly to make it symmetric:
https://trac.torproject.org/projects/tor/ticket/2148
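
As a hypothetical sketch of what such a symmetric base-domain
heuristic might look like (the helper names are mine, and a real
implementation would need the public-suffix list rather than this
naive "last two labels" rule):

    // Keep the referer when source and destination share a base
    // domain (example.com covers foo.example.com and
    // foo.bar.example.com); strip it otherwise. Symmetric: the
    // same rule applies in both directions.
    function baseDomain(hostname: string): string {
      return hostname.split(".").slice(-2).join(".");
    }

    function refererFor(fromUrl: string, toUrl: string): string | null {
      const from = new URL(fromUrl);
      const to = new URL(toUrl);
      return baseDomain(from.hostname) === baseDomain(to.hostname)
        ? fromUrl   // same site: keep the referer
        : null;     // cross-site: squelch the channel
    }

    refererFor("http://foo.example.com/a", "http://bar.example.com/b");
    // -> "http://foo.example.com/a" (kept: same base domain)
    refererFor("http://example.com/a", "http://tracker.net/b");
    // -> null (stripped)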

This policy will appear as a non-default option in 1.4.0 (it is
already in 1.3.x-alpha), but I think we should make a real decision
about the behavior soon, because having an option just creates a
fingerprinting opportunity as I said in the blog post. I believe the
fingerprinting effect of an option to be worse than doing nothing at
all to referer, since it is global linkability, not just an
information channel between two parties:
https://trac.torproject.org/projects/tor/ticket/3100

I'm still pretty convinced the best solution for Tor is "leave referer
alone", at least until someone shows me the breakage results of a
thorough crawl (which would include web-app site use).

> > Hence it doesn't make sense to "clear" them like cookies. Instead, it
> > makes more sense to prohibit information transmission through them in
> > certain cases.
> 
> I am not sure about that, as "clearing" them for *certain contexts*
> seems a good means to prohibit information transmission *in these
> contexts*: if there isn't any information, it cannot be transmitted
> (at least not by referer or window.name).

Perhaps. Right before I abandoned the toggle model for Torbutton, one
of the last fixes I made to it was to "clear" window.name on toggle:
https://trac.torproject.org/projects/tor/ticket/1968

I now believe that is the wrong way to think about things.

I think the better solution is to "clear" window.name when the user
enters a new url in the urlbar, which gets covered by making
window.name behave like referer in all cases:
https://trac.torproject.org/projects/tor/ticket/3414
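
Continuing the earlier sketch (same caveats: hypothetical names, naive
base-domain rule), making window.name behave like referer might look
like this:

    // Preserve window.name for same-site navigation, clear it when
    // the user enters a cross-site url in the urlbar.
    const baseDomain = (h: string): string =>
      h.split(".").slice(-2).join(".");

    function windowNameAfterNavigation(currentName: string,
                                       fromUrl: string,
                                       toUrl: string): string {
      const from = new URL(fromUrl);
      const to = new URL(toUrl);
      return baseDomain(from.hostname) === baseDomain(to.hostname)
        ? currentName // same site: keep, as with referer
        : "";         // cross-site: clear the channel
    }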

> Another question came to my mind: You seem to be at pains not to break
> parts of the web even in the anon mode, even if that boils down to not
> implementing features that would better fit the needs of people looking
> for unlinkability (one thing that comes to my mind here would be
> additionally making a context tab-dependent). Why? Why not say: "This
> is Tor's anon mode. It is meant for people who strive for unlinkability
> and might break some functionality. You may still use Tor in normal or
> private browsing mode though (providing no or less unlinkability on the
> browser level)." Do you think that's not worth the effort, as Tor's IP
> unlinkability is enough here (especially combined with the things you
> suggested in the blog post)? I do not know Tor's user base very well,
> but could imagine that it contains a lot of users who would like to
> have more unlinkability than the "we do not want to break any (or
> almost any) part of the web for better anonymity" faction.

I wish I had better science to give you here on the trade-off we're
going for, but the reality is that we're best-guessing over a very
complex cost/benefit landscape.

We do know for a fact that the easier Tor is to use (which includes
installation, configuration, overall intuitiveness/"familiarity",
compatibility, and performance), the more people will use it
regularly.

We also know for a fact that the more people use Tor, the better the
baseline privacy, anonymity, and censorship resistance properties all
become.

Hence, I tend to make decisions in favor of the usability direction
over minor details, especially ones that don't really prevent bad
actors/adversaries from accomplishing their goals.

The need for science comes in especially in the fingerprinting arena.
Some fingerprinting opportunities may not actually be appealing to
adversaries. Some may even appear appealing in theory, but in practice
would be noticeable to the user, too noisy, and/or too error-prone.
Hence I called for more Panopticlick-style studies, especially of
JavaScript features, in the blog post.

-- 
Mike Perry
Mad Computer Scientist
fscked.org evil labs