[tbb-bugs] #18361 [Tor Browser]: Issues with corporate censorship and mass surveillance
Tor Bug Tracker & Wiki
blackhole at torproject.org
Sun Mar 6 22:01:35 UTC 2016
#18361: Issues with corporate censorship and mass surveillance
Reporter: ioerror | Owner: tbb-team
Type: enhancement | Status: new
Priority: High | Milestone:
Component: Tor Browser | Version:
Severity: Critical | Resolution:
Keywords: security, privacy, anonymity | Actual Points:
Parent ID: | Points:
Comment (by samlanning):
I've been thinking over this problem for a number of days now, and think I
may have come to a solution that is somewhat of a compromise.
(I've written this up in more detail as a blog post over at
https://samlanning.com/blog/the_tor_cloudflare_problem/ that I'd love
feedback on.)
But here's the important bit:
This idea requires work from both the Tor developers (specifically those
who work on TBB), and the CloudFlare developers.
== The User Experience ==
For non-Tor users, or Tor users using an older TBB, the experience is
unchanged. Older Tor users will still have to use a Captcha, which will
grant them full access to a website, as is currently done. For users
using the latest TBB, upon landing on a website protected by CloudFlare,
they will see something like this:
''Note: the wording in this screenshot is by no means final.''
Now the user can choose to either ignore the warning, dismiss it, or click
"Prove You're Human". Ignoring the warning will allow the user to continue
using the site in a read-only mode; here I think the most appropriate
implementation would be to use cached-only pages (not sending any requests
on to the server). For any cache misses it can display the Captcha.
Now when a user submits a form, the page will remain in a "loading" state
while a new tab is opened and focused for the user to complete a Captcha.
(We could optionally have the same warning displayed on this page, but
without the button or dismiss icon). Once the user has completed the
captcha, the tab will close and the existing (paused) tab will continue
(actually make the request).
A similar thing would happen for any AJAX or WebSocket requests: the
request would be paused until a Captcha is completed in a separate tab or
window.
This would allow for, I think, the minimum amount of friction for
performing any particular task on a website, requiring a Captcha only when
necessary, and indicating to a user that they are viewing a reduced-
functionality version of a website.
== A Technical Implementation ==
On the TBB side, the browser would need to indicate that it supports this
"prove human" functionality by way of either User-Agent, or by specifying
a particular header. For example, along with the request, it could send an
`X-Human-Proof` header.
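As a rough sketch (not real Tor Browser code), the client side of this could look like the following; the `X-Human-Proof` header name is taken from the description below, and the value `"1"` is purely a placeholder, since an eventual spec would define the actual syntax:

```python
import urllib.request

def build_request(url):
    """Build a request advertising (hypothetical) human-proof support."""
    req = urllib.request.Request(url)
    # "1" is a placeholder value; a real spec would define the syntax.
    req.add_header("X-Human-Proof", "1")
    return req
```

A server that doesn't know the header simply ignores it, so the scheme degrades gracefully for sites that never opt in.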
The CloudFlare server, upon receiving a request, if:
* The threat level has been determined as "CAPTCHA"
* The user agent supports the "Human Proof" feature (i.e. has the
appropriate `X-Human-Proof` header).
* There is no cookie set for the Captcha (no existing proof-of-human).
* The request is a `GET`.
* The requested URL is cached.
Then return the cached contents, along with a header like `X-Human-Proof-
Required: <some URL to visit for Captcha>`. In any other situation, behave
as normal. ''(Note: the URL will need to be for the same domain as the
request, so site-relative probably will make most sense, i.e. starting
with a `/`.)''
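The edge-side decision above could be sketched roughly like this. All names here (the function, the `captcha_clearance` cookie, and the challenge path) are illustrative assumptions, not CloudFlare's actual implementation:

```python
CHALLENGE_PATH = "/challenge"  # assumed site-relative challenge URL

def maybe_serve_cached(method, headers, cookies, threat_level, cache, path):
    """Return (status, extra_headers, body) for a cached read-only response,
    or None to fall through to the normal (Captcha) behaviour."""
    if (threat_level == "CAPTCHA"
            and "X-Human-Proof" in headers          # client supports the feature
            and "captcha_clearance" not in cookies  # no existing proof-of-human
            and method == "GET"
            and path in cache):                     # cache hit
        return 200, {"X-Human-Proof-Required": CHALLENGE_PATH}, cache[path]
    return None
```

Note that all four conditions must hold; a `POST`, a cache miss, or a client without the header falls straight through to today's Captcha interstitial.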
The TBB, upon seeing a response with the header `X-Human-Proof-Required`,
will mark any domains that return this as "requiring human proof" (for the
given session), and for any pages whose URL contains a domain in this
list, display the bar shown in the screenshot (unless it's already been
dismissed).
Now when any non-`GET` request is made to a domain marked as "requiring
human proof" (whether AJAX, WebSocket or otherwise), pause the request,
and open a new tab to the URL required (given in the `X-Human-Proof-
Required` header). Wait for a response from the given domain that '''does
not''' contain the `X-Human-Proof-Required` header, then continue the
paused request (actually send the request to the server).
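The client-side bookkeeping described in the last two paragraphs might be modelled as follows. This is a simplified sketch with hypothetical names, not anything resembling real browser internals:

```python
class HumanProofSession:
    """Per-session tracking of domains that require proof-of-human."""

    def __init__(self):
        self.flagged = set()  # domains that returned X-Human-Proof-Required
        self.paused = []      # queued (domain, request) pairs awaiting proof

    def on_response(self, domain, headers):
        """Record the header state; return any requests that may now proceed."""
        if "X-Human-Proof-Required" in headers:
            self.flagged.add(domain)
            return []
        # A response without the header means the proof has been supplied:
        # unflag the domain and release its queued requests.
        self.flagged.discard(domain)
        released = [req for d, req in self.paused if d == domain]
        self.paused = [(d, req) for d, req in self.paused if d != domain]
        return released

    def on_request(self, domain, method, request):
        """Return True to send now; False if the request must be paused
        while a challenge tab is opened."""
        if method != "GET" and domain in self.flagged:
            self.paused.append((domain, request))
            return False
        return True
```

The key design point is that the challenge tab needs no special channel back to the paused tab: completing the Captcha simply causes the next response from that domain to arrive without the header, which is the release signal.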
== Future Improvements ==
This would give us a good foundation for building on iterative UX
improvements, and improving mechanisms for how user agents prove to
servers that they are being operated by humans. From here we could:
* Submit an RFC for these headers, and try to make an official spec for
this behaviour.
* Upstream these changes in the client (handling of headers, pausing
requests, opening the challenge in a new tab, etc.), and bring them to
other browsers.
* Iteratively improve the UI, such as displaying a blocking-dialog on any
pages that are waiting on a captcha (or other challenge) to be completed.
* Encourage websites that don't use CloudFlare, but block Tor exit nodes,
to instead behave in this manner.
== Potential Issues ==
The biggest issue I see with this solution is that it would require some
non-trivial engineering effort from the Tor developers. For CloudFlare, I
feel the engineering effort would be comparatively small. But I honestly
feel it would pay off.
Another thing I did think of is that this mechanism may encourage website
operators to more eagerly block Tor traffic and require "proof-of-
humanness" to use a website to its full capacity, but I'm unsure about how
likely that is.
After having given this idea some thought for a couple of days, other than
the above points, I have yet to come up with any significant issues. Please
let me know if you can think of any and I'll update this post.
I look forward to seeing if this idea can get us any further towards
finding a solution.
Ticket URL: <https://trac.torproject.org/projects/tor/ticket/18361#comment:194>