[tor-project] Ethics Guidelines; crawling .onion

Tim Wilson-Brown - teor teor2345 at gmail.com
Thu Jul 7 05:44:21 UTC 2016


> On 7 Jul 2016, at 15:24, Virgil Griffith <i at virgil.gr> wrote:
> 
> > How do you make sure that Tor2web users are anonymised (as possible) when accessing hidden services?
> 
> I make a good faith effort not to wantonly reveal personally identifying information.  But in short, it's hard.  I urge people to think of tor2web nodes as closer to Twitter where they record what links you click.  I wholly support having the "where is Tor2web in regards to user privacy" discussion (hopefully could even make some improvements to it!), but it is orthogonal to the "robots.txt on .onion" discussion.  Let's address the robots.txt issue and then we can return to Tor2web user-privacy.

Well, as a separate issue, you might want to remove the client IP address (the X-Forwarded-For header) from the HTTP headers your caching proxies send to hidden services, and work out whether any of the other headers are sensitive.
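
For example, a minimal Python sketch (not Tor2web's actual code; the function and the header list are my own illustration) of stripping identifying headers before forwarding a request upstream:

    # Hypothetical sketch: strip headers that can identify the client
    # before a caching proxy forwards a request to a hidden service.
    SENSITIVE_HEADERS = {
        "x-forwarded-for",   # client IP address
        "x-real-ip",         # client IP address (another common proxy header)
        "forwarded",         # RFC 7239 equivalent
        "via",               # reveals the proxy chain
        "cookie",            # may carry identifying session state
    }

    def sanitize_headers(headers):
        """Return a copy of the request headers that is safe to send upstream."""
        return {name: value for name, value in headers.items()
                if name.lower() not in SENSITIVE_HEADERS}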

> On 7 Jul 2016, at 14:40, Virgil Griffith <i at virgil.gr> wrote:
> 
> So now we have *three* different positions among respected members of the Tor community.
> 
> (A) isis et al: robots.txt is insufficient
> --- "Consent is not the absence of saying 'no' — it is explicitly saying 'yes'."
> 
> (B) onionlink/ahmia/notevil/grams: we respect robots.txt
> --- "Default is yes, but you can always opt-out."

Is the opt-out permanent, or does your server re-check robots.txt every time it connects?
I can imagine issues with either model: one involves storing a list of opted-out services, the other involves making regular connections to re-check them.
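
To make the trade-off concrete, here is a Python sketch of both models (all names and the TTL are hypothetical):

    import time

    OPT_OUT_LIST = set()         # model 1: permanent opt-out, stored forever
    ROBOTS_CACHE = {}            # model 2: cached decision, re-checked later
    ROBOTS_TTL = 24 * 60 * 60    # seconds before the next re-check connection

    def is_opted_out(onion, fetch_disallowed):
        """fetch_disallowed(onion) makes one connection and returns True/False."""
        if onion in OPT_OUT_LIST:                       # model 1: no new connection
            return True
        cached = ROBOTS_CACHE.get(onion)
        if cached is None or time.time() - cached[1] > ROBOTS_TTL:
            disallowed = fetch_disallowed(onion)        # model 2: one more connection
            ROBOTS_CACHE[onion] = (disallowed, time.time())
            return disallowed
        return cached[0]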

> (C) onionstats/memex: we ignore robots.txt
> --- "Don't care even if you opt-out." (see https://onionscan.org/reports/may2016.html)
> 
> 
> Isis did a good job arguing for (A) by claiming that representing (B) and (C) are "blatant and disgusting workaround[s] to the trust and expectations which onion service operators place in the network." https://lists.torproject.org/pipermail/tor-project/2016-May/000356.html
> 
> This is me arguing for (B): https://lists.torproject.org/pipermail/tor-project/2016-May/000411.html
> 
> I have no link arguing for (C).

I am disappointed that we have a Tor2web design where Tor2web needs to connect to a hidden service first, and only then check whether it has given permission for Tor2web to connect to it at all. I am also disappointed that this check only works for HTTP onion services on the default port 80.
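
For reference, the check roughly amounts to the following Python sketch (hypothetical code, assuming the requests library with SOCKS support and a local Tor SOCKS port on 127.0.0.1:9050); note that it has to make a connection just to ask permission, and only makes sense for HTTP on port 80:

    import requests

    TOR_PROXY = {"http": "socks5h://127.0.0.1:9050"}

    def tor2web_allowed(onion_address):
        """Connect to the hidden service, fetch robots.txt, look for an opt-out."""
        url = "http://%s/robots.txt" % onion_address
        try:
            response = requests.get(url, proxies=TOR_PROXY, timeout=60)
        except requests.RequestException:
            return True   # unreachable or not serving HTTP: no opt-out was seen
        # crude opt-out test; a real crawler would use a proper robots.txt parser
        return "Disallow: /" not in response.text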

I would like to see a much better design for this.

I am also concerned about threat models where a single unwanted connection, or a small number of unwanted connections, is itself a security risk.
For example:
Imagine there is an (unknown) attack which can determine 1 bit of the hidden service's 1024-bit RSA key per connection.
(Some known attacks on broken cryptosystems work like this, as do some side channels.)
Or imagine there is an attack which can determine 1 bit of the server's IPv4 address per connection.

For security, a hidden service operator decides to allow only 10 connections before rolling over their hidden service to a new key and server.

There are at least 10 connections to known .onion addresses every week, because there are at least 10 Tor2web, memex, or onionstats instances on the web.
Therefore, every week, the operator must roll over their hidden service and arrange to notify users of the new address in a secure fashion. Alternatively, they must keep the address secret, even from the HSDir hash ring, which is not possible.
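
The arithmetic behind this example, as a Python sketch (illustrative numbers only):

    KEY_BITS = 1024          # bits leaked at one bit per connection, in this example
    MAX_CONNECTIONS = 10     # the operator's rollover threshold
    UNWANTED_PER_WEEK = 10   # known crawler/proxy instances connecting each week

    # The unwanted connections alone exhaust the threshold every week,
    # forcing a weekly rollover ...
    rollovers_per_week = UNWANTED_PER_WEEK / float(MAX_CONNECTIONS)   # 1.0

    # ... even though, without rollover, leaking the whole key this way
    # would take roughly two years of such connections.
    weeks_to_leak_key = KEY_BITS / float(UNWANTED_PER_WEEK)           # 102.4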

Is there an alternative to position (A) that supports threat models like this?

I believe that a technical solution to this threat model is hidden service client authentication (and the next-generation hidden service protocol, when it is available).
However, there is also the possibility of applying social pressure to discourage people from running servers that continually connect to Tor hidden services.
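
For the client authentication option, here is a minimal torrc sketch of the existing (v2) mechanism; the directory path, port, client name, onion address, and cookie are placeholders, not a recommended deployment:

    # Hidden service side: only clients holding an auth cookie can connect.
    HiddenServiceDir /var/lib/tor/hidden_service/
    HiddenServicePort 80 127.0.0.1:8080
    HiddenServiceAuthorizeClient stealth client1

    # Client side: supply the cookie Tor wrote into the service's hostname file.
    HidServAuth <onion-address>.onion <auth-cookie from the hostname file>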

Tim

Tim Wilson-Brown (teor)

teor2345 at gmail dot com
PGP C855 6CED 5D90 A0C5 29F6 4D43 450C BA7F 968F 094B
ricochet:ekmygaiu4rzgsk6n