[tor-dev] Improving Private Browsing Mode/Tor Browser

Georg Koppen g.koppen at jondos.de
Sun Jul 10 14:20:24 UTC 2011


>> Hmmm... If that is the answer to my questions, then the concept offered
>> in the blog post does nothing to avoid getting tracked by exit mixes.
>> Okay.
> 
> That is not entirely true. Because identifiers would be linked to the
> top-level urlbar domain, gone are the days when exits could insert an
> iframe or web bug into any arbitrary page and use it to track the
> user for the duration of the session, regardless of page view.
> 
> Instead, they would be pushed back to doing some sort of top-level
> redirect (which we hope would be way more visible), or maybe not even
> that, depending on how we define redirects with respect to
> "top-level".
> 
> So no, we are not completely abandoning exits as an adversary with
> this threat model. If I'm wrong about something, or you think there
> are still attacks exits can perform that we should address somehow,
> let me know.

See my last mail.
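To make the double-keying idea from the quote above concrete, here is a rough sketch (Python, purely illustrative; the class and method names are mine, not Firefox or Torbutton internals) of identifier storage keyed by the top-level urlbar domain in addition to the setting origin:

```python
# Illustrative sketch of first-party ("double-keyed") identifier storage.
# An identifier set by tracker.example inside an iframe on site-a.example
# lives under a different key than the same identifier set on
# site-b.example, so an exit injecting a web bug into arbitrary pages
# cannot link the two visits.

class DoubleKeyedCookieJar:
    def __init__(self):
        # Keys are (top_level_domain, origin) pairs, not just origin.
        self._jars = {}

    def set_cookie(self, top_level, origin, name, value):
        self._jars.setdefault((top_level, origin), {})[name] = value

    def get_cookie(self, top_level, origin, name):
        return self._jars.get((top_level, origin), {}).get(name)


jar = DoubleKeyedCookieJar()
# The same third-party origin sets an identifier on two different sites:
jar.set_cookie("site-a.example", "tracker.example", "id", "abc123")
# On site-b.example the tracker sees no identifier -- the visits stay
# unlinkable:
assert jar.get_cookie("site-b.example", "tracker.example", "id") is None
assert jar.get_cookie("site-a.example", "tracker.example", "id") == "abc123"
```

With this keying, the tracking that remains for an exit is roughly what Mike describes: forcing a visible top-level redirect rather than silently piggybacking on embedded content.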

>> Another question came to my mind: You seem to be at pains not to break
>> parts of the web even in the anon mode, even if that boils down to not
>> implementing features that would better fit the needs of people looking
>> for unlinkability (one thing that comes to mind here would be
>> additionally making the context tab-dependent). Why? Why not say: "This
>> is Tor's anon mode. It is meant for people who strive for unlinkability,
>> and it might break some functionality. You may still use Tor in normal or
>> private browsing mode, though (providing no or less unlinkability on the
>> browser level)." Do you think that is not worth the effort, as Tor's IP
>> unlinkability is enough here (especially combined with the things you
>> suggested in the blog post)? I do not know Tor's user base very well, but
>> I could imagine that it contains a lot of users who would like to have
>> more unlinkability than the "we do not want to break any (or almost any)
>> part of the web for better anonymity" faction.
> 
> I wish I had better science to give you here on the trade-off we're
> going for, but the reality is that we're best-guessing over a very
> complex cost/benefit landscape.

That's true.

> We do know for a fact that the easier Tor is to use (which includes
> installation, configuration, overall intuitiveness/"familiarity",
> compatibility, and performance), the more people will use it
> regularly.

That seems to hold for every piece of software, I guess.

> We also know for a fact that the more people use Tor, the better the
> baseline privacy, anonymity, and censorship resistance properties all
> become.

Agreed.

> Hence, I tend to make decisions in favor of the usability direction
> over minor details, especially ones that don't really prevent bad
> actors/adversaries from accomplishing their goals.

That is definitely a good approach. But maybe there is research to be
done here as well. Here is a rough (and partly research) idea I had in
mind while asking you the question above: what if we first examined the
different services offered on the web to see whether they can be used
anonymously *at all*? More precisely: whether they can be used in a way
that there is either no linkability at all, or the linkability is not
strong enough to endanger the user. That alone would be worth some
research, I guess.

We would probably find some services where we had to say: "Well, there
is no way to use them anonymously, due to their nature and the power of
the companies and/or owners behind them." Facebook comes to mind as a
candidate, and the Google universe as well, given the power Google has.
Should we nevertheless make the Tor anon mode compatible with these
services (for usability reasons) and abandon stronger anonymity
measures? I would say no. Not at all. Rather, we should be honest and
say: "Dear user, surfing anonymously AND using Facebook does not work.
You may still use the Tor anon mode for that purpose, but there is a
high probability that it breaks functionality." The idea of attracting
more users by not being too strict here might be appealing, but it is
not the right decision in the end. One has to realize that there are
services on the web that are *designed* such that one may EITHER use
them OR use anonymity services.

Sure, the devil is in the details. For example, there are probably a
lot of services that can be used anonymously but then come with a
certain lack of usability. What about them? Should we decide against
usability again, or should we loosen our means of providing
unlinkability here? Still, that does not mean there is no way to find a
good solution. In short (and still roughly): I would like to start from
having all means available to surf the web anonymously and then
downgrade them piece by piece to reach a trade-off between anonymity
and usability. Services that cannot be used anonymously at all would
not trigger such a painful downgrade ("painful" because one usually
tries first to hack around the existing problems, encountering
unbelievable design issues and bugs, and finally has to concede that it
is in the user's interest to exclude the feature (again)).

> The need for science especially comes in on the fingerprinting arena.
> Some fingerprinting opportunities may not actually be appealing to
> adversaries. Some may even appear appealing in theory, but in practice
> would be noticeable to the user, too noisy, and/or too error-prone.
> Hence I called for more panopticlick-style studies, especially of
> Javascript features, in the blog post.

Yes, that is definitely a good idea, though I tend to want to close all
of these vectors even if currently no adversary is using them
(especially if no usability issue is at stake). First, no one knows
whether we have simply missed an attacker already using such a vector;
second, getting rid of attack vectors is a good thing per se.
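As an aside, the panopticlick-style measurement mentioned above boils down to estimating how many bits of identifying information a browser attribute reveals, from how rare the observed value is in a surveyed population. A minimal sketch (Python; the survey numbers are made up for illustration):

```python
import math
from collections import Counter

def surprisal_bits(observed_value, population):
    """Bits of identifying information revealed by one attribute value,
    estimated from the value's frequency in a surveyed population."""
    counts = Counter(population)
    p = counts[observed_value] / len(population)
    return -math.log2(p)

# Made-up survey of one JS-visible attribute (e.g. a plugin-list hash):
survey = ["common"] * 96 + ["rare"] * 4

# A value shared by 96% of users reveals little...
print(round(surprisal_bits("common", survey), 2))  # 0.06
# ...while a value held by only 4% singles the user out far more.
print(round(surprisal_bits("rare", survey), 2))    # 4.64
```

Summing such per-attribute estimates (assuming rough independence) is what makes a handful of JavaScript features add up to a near-unique fingerprint, which is why studies of individual features would be so useful.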

Georg
