tor-commits

28 Apr '14
commit a7b44b6f813146d2f0fc4a138d8b33208291df67
Author: Mike Perry <mikeperry-git(a)torproject.org>
Date: Mon Apr 28 17:16:29 2014 +0200
Relocate some directories.
---
design-doc/CookieManagers.png | Bin 0 -> 113551 bytes
design-doc/Firefox17-TODO | 82 ++
design-doc/NewCookieManager.png | Bin 0 -> 27939 bytes
design-doc/build.sh | 1 +
design-doc/design.xml | 2733 ++++++++++++++++++++++++++++++++++++++
design-doc/outline.txt | 52 +
docs/audits/FF17_FEATURE_AUDIT | 19 -
docs/audits/FF17_NETWORK_AUDIT | 84 --
docs/audits/FF3.5_AUDIT | 195 ---
docs/audits/FF4_AUDIT | 50 -
docs/design/CookieManagers.png | Bin 113551 -> 0 bytes
docs/design/Firefox17-TODO | 82 --
docs/design/NewCookieManager.png | Bin 27939 -> 0 bytes
docs/design/build.sh | 1 -
docs/design/design.xml | 2733 --------------------------------------
docs/design/outline.txt | 52 -
16 files changed, 2868 insertions(+), 3216 deletions(-)
diff --git a/design-doc/CookieManagers.png b/design-doc/CookieManagers.png
new file mode 100644
index 0000000..0fc3e64
Binary files /dev/null and b/design-doc/CookieManagers.png differ
diff --git a/design-doc/Firefox17-TODO b/design-doc/Firefox17-TODO
new file mode 100644
index 0000000..41ef38e
--- /dev/null
+++ b/design-doc/Firefox17-TODO
@@ -0,0 +1,82 @@
++ Cleanups
+ + We specify browser.cache.memory.enable under disk avoidance. That's
+ wrong. We don't even set it at all. Torbutton relic?
+ + Disk leak documentation
+ + Firefox 17 will mess up all patch links
+
+- Heavy Writing by section
+ + Intro:
+ + We target Firefox ESR
+ + Component description
+ + Deprecation List/Future Philosophy:
+ + Linkability Transparency from
+ https://trac.torproject.org/projects/tor/ticket/5273#comment:12
+ + Adversary Goals
+ + Describe how each adversary attack violates design goals
+ + "Correlate activity across multiple site visits" as one of the adversary
+ goals. This is the primary goal of the ad networks, though. We need to
+ explicitly mention it in the Adversary Goals section for completeness.
+ + Misc implementation
+ + Link to prefs.js and describe omni.ja and extension-overrides hacks
+ + document the environment variables and settings used to provide a non-grey "New Identity" button.
+ + Mockup privacy UI
+ + Identifier Linkability
+ + Image cache jail
+ + DOM storage jail
+ + 3.5.8 is not clear that what we're trying to limit is non-click
+ driven/non-interactive linkability rather than linkability in all cases.
+ Other sections may have this problem, too.
+ + This is a subtlety that arises from both the impossibility of satisfying
+ unlinkability due to covert channels in GET/POST, as well as the desire
+ to avoid breaking things like consensual federated login.
+ - Fingerprinting
+ + @font-face exemption and preference
+ + Canvas prompt
+ + describe our resolution defenses
+ + Limit CSS media queries
+ + System colors + fonts
+ + Explain why panopticlick is weirdsauce
+ + We report our useragent as 17.0
+ + Click-to-play WebGL
+ + We should perhaps be more vocal about the fingerprinting issues with
+ some or all of http://www.w3.org/TR/navigation-timing/. I think I agree.
+ - provide an entropy count estimate for fingerprinting defenses
+ + Disk avoidance
+ + Private browsing + pref changes
+ + He reminded me about documenting disabling IndexedDB, but that is just one
+ of the many prefs.js changes we need to document.
+ - Testing
+ - Explain why panopticlick is weirdsauce
+ - Sync with QA pages
+ - Many are out of date
+ - http://www.stayinvisible.com/
+ - Evercookie test page, and perhaps also
+ http://jeremiahgrossman.blogspot.de/2007/04/tracking-users-without-cookies.…
+
+- Misc changes:
+ + Plugin handling
+ + All-but-flash patch
+ + Plugin manager manipulation
+ + We use Firefox's click-to-play
+ + Addons
+ + PDF.js inclusion
+ + List links to design violations/enhancements:
+ + https://trac.torproject.org/projects/tor/query?keywords=~tbb-linkability
+ + https://trac.torproject.org/projects/tor/query?keywords=~tbb-fingerprinting
+ - Update notification/version checking?
+ - Create a deprecation list and link to it:
+ - Referer Header
+ - Window.name
+ - We should only preserve window.name if the url bar domain remains the
+ same. I could be convinced of this, but it's going to be trickier to
+ implement and I think it's not really possible to remove linkability for user
+ clicks in general.
+ - Torbutton Security Settings
+
+- Packaging
+ - Pref changes
+ - Socks ports
+ - Torbutton does not update
+
+
+
diff --git a/design-doc/NewCookieManager.png b/design-doc/NewCookieManager.png
new file mode 100644
index 0000000..97a0b40
Binary files /dev/null and b/design-doc/NewCookieManager.png differ
diff --git a/design-doc/build.sh b/design-doc/build.sh
new file mode 100755
index 0000000..5ffb650
--- /dev/null
+++ b/design-doc/build.sh
@@ -0,0 +1 @@
+xsltproc --output index.html.en -stringparam chunker.output.encoding UTF-8 --stringparam section.autolabel.max.depth 2 -stringparam section.label.includes.component.label 1 --stringparam section.autolabel 1 /usr/share/xml/docbook/stylesheet/docbook-xsl/xhtml/docbook.xsl design.xml
diff --git a/design-doc/design.xml b/design-doc/design.xml
new file mode 100644
index 0000000..d1cdf0f
--- /dev/null
+++ b/design-doc/design.xml
@@ -0,0 +1,2733 @@
+<?xml version="1.0" encoding="ISO-8859-1"?>
+<!DOCTYPE article PUBLIC "-//OASIS//DTD DocBook XML V4.4//EN"
+ "file:///usr/share/sgml/docbook/xml-dtd-4.4-1.0-30.1/docbookx.dtd">
+
+<article id="design">
+ <articleinfo>
+ <title>The Design and Implementation of the Tor Browser [DRAFT]</title>
+ <author>
+ <firstname>Mike</firstname><surname>Perry</surname>
+ <affiliation>
+ <address><email>mikeperry#torproject org</email></address>
+ </affiliation>
+ </author>
+ <author>
+ <firstname>Erinn</firstname><surname>Clark</surname>
+ <affiliation>
+ <address><email>erinn#torproject org</email></address>
+ </affiliation>
+ </author>
+ <author>
+ <firstname>Steven</firstname><surname>Murdoch</surname>
+ <affiliation>
+ <address><email>sjmurdoch#torproject org</email></address>
+ </affiliation>
+ </author>
+ <pubdate>March 15, 2013</pubdate>
+ </articleinfo>
+
+<!--
+- Introduction and Threat model: [Mostly Torbutton]
+ - [Remove the security requirements section]
+-->
+
+<sect1>
+ <title>Introduction</title>
+ <para>
+
+This document describes the <link linkend="adversary">adversary model</link>,
+<link linkend="DesignRequirements">design requirements</link>, and <link
+linkend="Implementation">implementation</link> <!-- <link
+linkend="Packaging">packaging</link> and <link linkend="Testing">testing
+procedures</link> --> of the Tor Browser. It is current as of Tor Browser
+2.3.25-5 and Torbutton 1.5.1.
+
+ </para>
+ <para>
+
+This document is also meant to serve as a set of design requirements and to
+describe a reference implementation of a Private Browsing Mode that defends
+against active network adversaries, in addition to the passive forensic local
+adversary currently addressed by the major browsers.
+
+ </para>
+ <sect2 id="components">
+ <title>Browser Component Overview</title>
+ <para>
+
+The Tor Browser is based on <ulink
+url="https://www.mozilla.org/en-US/firefox/organizations/">Mozilla's Extended
+Support Release (ESR) Firefox branch</ulink>. We have a <link
+linkend="firefox-patches">series of patches</link> against this browser to
+enhance privacy and security. Browser behavior is additionally augmented
+through the <ulink
+url="https://gitweb.torproject.org/torbutton.git/tree/master">Torbutton
+extension</ulink>, though we are in the process of moving this
+functionality into direct Firefox patches. We also <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/HEAD:/build-scripts/confi…">change
+a number of Firefox preferences</ulink> from their defaults.
+
+ </para>
+ <para>
+
+To help protect against potential Tor Exit Node eavesdroppers, we include
+<ulink url="https://www.eff.org/https-everywhere">HTTPS-Everywhere</ulink>. To
+provide users with optional defense-in-depth against Javascript and other
+potential exploit vectors, we also include <ulink
+url="http://noscript.net/">NoScript</ulink>. To protect against
+PDF-based Tor proxy bypass and to improve usability, we include the <ulink
+url="https://addons.mozilla.org/en-us/firefox/addon/pdfjs/">PDF.JS</ulink>
+extension. We also modify <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/HEAD:/build-scripts/confi…">several
+extension preferences</ulink> from their defaults.
+
+ </para>
+ </sect2>
+</sect1>
+
+<!--
+- Design overview and philosophy
+ - Security requirements [Torbutton]
+ + local leaks?
+ - state issues
+ - Privacy Requirements [Mostly blog post]
+ - Avoid Cross-Domain Linkability
+ - Identifiers
+ - Fingerprinting
+ - 100% self-contained
+ - Does not share state with other modes/browsers
+ - Easy to remove + wipe with external tools
+ - click-to-play for "troublesome" features
+ - Philosophy
+ - No filters
+-->
+
+
+<sect1 id="DesignRequirements">
+ <title>Design Requirements and Philosophy</title>
+ <para>
+
+The Tor Browser Design Requirements are meant to describe the properties of a
+Private Browsing Mode that defends against both network and local forensic
+adversaries.
+
+ </para>
+ <para>
+
+There are two main categories of requirements: <link
+linkend="security">Security Requirements</link>, and <link
+linkend="privacy">Privacy Requirements</link>. Security Requirements are the
+minimum properties in order for a browser to be able to support Tor and
+similar privacy proxies safely. Privacy requirements are the set of properties
+that cause us to prefer one browser over another.
+
+ </para>
+ <para>
+
+While we will endorse the use of browsers that meet the security requirements,
+it is primarily the privacy requirements that cause us to maintain our own
+browser distribution.
+
+ </para>
+ <para>
+
+ The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL
+ NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and
+ "OPTIONAL" in this document are to be interpreted as described in
+ <ulink url="https://www.ietf.org/rfc/rfc2119.txt">RFC 2119</ulink>.
+
+ </para>
+
+ <sect2 id="security">
+ <title>Security Requirements</title>
+ <para>
+
+The security requirements are primarily concerned with ensuring the safe use
+of Tor. Violations in these properties typically result in serious risk for
+the user in terms of immediate deanonymization and/or observability. With
+respect to browser support, security requirements are the minimum properties
+in order for Tor to support the use of a particular browser.
+
+ </para>
+
+<orderedlist>
+ <listitem><link linkend="proxy-obedience"><command>Proxy
+Obedience</command></link>
+ <para>The browser
+MUST NOT bypass Tor proxy settings for any content.</para></listitem>
+
+ <listitem><link linkend="state-separation"><command>State
+Separation</command></link>
+
+ <para>
+
+The browser MUST NOT provide the content window with any state from any other
+browsers or any non-Tor browsing modes. This includes shared state from
+independent plugins, and shared state from Operating System implementations of
+TLS and other support libraries.
+
+</para></listitem>
+
+ <listitem><link linkend="disk-avoidance"><command>Disk
+Avoidance</command></link>
+
+<para>
+
+The browser MUST NOT write any information that is derived from or that
+reveals browsing activity to the disk, or store it in memory beyond the
+duration of one browsing session, unless the user has explicitly opted to
+store their browsing history information to disk.
+
+</para></listitem>
+ <listitem><link linkend="app-data-isolation"><command>Application Data
+Isolation</command></link>
+
+<para>
+
+The components involved in providing private browsing MUST be self-contained,
+or MUST provide a mechanism for rapid, complete removal of all evidence of the
+use of the mode. In other words, the browser MUST NOT write or cause the
+operating system to write <emphasis>any information</emphasis> about the use
+of private browsing to disk outside of the application's control. The user
+must be able to ensure that secure deletion of the software is sufficient to
+remove evidence of the use of the software. All exceptions and shortcomings
+due to operating system behavior MUST be wiped by an uninstaller. However, due
+to permissions issues with access to swap, implementations MAY choose to leave
+it out of scope, and/or leave it to the Operating System/platform to implement
+ephemeral-keyed encrypted swap.
+
+</para></listitem>
+
+<!--
+ <listitem><link linkend="update-safety"><command>Update
+Safety</command></link>
+
+<para>The browser SHOULD NOT perform unsafe updates or upgrades.</para></listitem>
+-->
+</orderedlist>
+
+ </sect2>
+
+ <sect2 id="privacy">
+ <title>Privacy Requirements</title>
+ <para>
+
+The privacy requirements are primarily concerned with reducing linkability:
+the ability for a user's activity on one site to be linked with their activity
+on another site without their knowledge or explicit consent. With respect to
+browser support, privacy requirements are the set of properties that cause us
+to prefer one browser over another.
+
+ </para>
+
+ <para>
+
+For the purposes of the unlinkability requirements of this section as well as
+the descriptions in the <link linkend="Implementation">implementation
+section</link>, a <command>url bar origin</command> means at least the
+second-level DNS name. For example, for mail.google.com, the origin would be
+google.com. Implementations MAY, at their option, restrict the url bar origin
+to be the entire fully qualified domain name.
+
+ </para>
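The "url bar origin" reduction above can be sketched in a few lines. This is a toy illustration, not Tor Browser code: a real implementation would need the Public Suffix List (so that, e.g., a name under co.uk is not collapsed to co.uk itself), which this sketch deliberately ignores.

```python
# Toy "url bar origin" computation: keep at most the last two DNS
# labels, per the second-level-name rule described above. A real
# implementation needs the Public Suffix List; this sketch does not.
from urllib.parse import urlsplit

def url_bar_origin(url: str) -> str:
    host = urlsplit(url).hostname or ""
    labels = host.split(".")
    return ".".join(labels[-2:]) if len(labels) >= 2 else host

print(url_bar_origin("https://mail.google.com/mail/"))  # google.com
```

An implementation choosing the stricter option in the text would simply return the full hostname instead of the last two labels.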
+
+<orderedlist>
+ <listitem><link linkend="identifier-linkability"><command>Cross-Origin
+Identifier Unlinkability</command></link>
+ <para>
+
+User activity on one url bar origin MUST NOT be linkable to their activity in
+any other url bar origin by any third party automatically or without user
+interaction or approval. This requirement specifically applies to linkability
+from stored browser identifiers, authentication tokens, and shared state. The
+requirement does not apply to linkable information the user manually submits
+to sites, or due to information submitted during manual link traversal. This
+functionality SHOULD NOT interfere with interactive, click-driven federated
+login in a substantial way.
+
+ </para>
+ </listitem>
+ <listitem><link linkend="fingerprinting-linkability"><command>Cross-Origin
+Fingerprinting Unlinkability</command></link>
+ <para>
+
+User activity on one url bar origin MUST NOT be linkable to their activity in
+any other url bar origin by any third party. This property specifically applies to
+linkability from fingerprinting browser behavior.
+
+ </para>
+ </listitem>
+ <listitem><link linkend="new-identity"><command>Long-Term
+Unlinkability</command></link>
+ <para>
+
+The browser MUST provide an obvious, easy way for the user to remove all of
+its authentication tokens and browser state and obtain a fresh identity.
+Additionally, the browser SHOULD clear linkable state by default automatically
+upon browser restart, except at user option.
+
+ </para>
+ </listitem>
+</orderedlist>
+
+ </sect2>
+ <sect2 id="philosophy">
+ <title>Philosophy</title>
+ <para>
+
+In addition to the above design requirements, the technology decisions about
+Tor Browser are also guided by some philosophical positions about technology.
+
+ </para>
+ <orderedlist>
+ <listitem><command>Preserve existing user model</command>
+ <para>
+
+The existing way that the user expects to use a browser must be preserved. If
+the user has to maintain a different mental model of how the sites they are
+using behave depending on tab, browser state, or anything else that would not
+normally be what they experience in their default browser, the user will
+inevitably be confused. They will make mistakes and reduce their privacy as a
+result. Worse, they may just stop using the browser, assuming it is broken.
+
+ </para>
+ <para>
+
+User model breakage was one of the <ulink
+url="https://blog.torproject.org/blog/toggle-or-not-toggle-end-torbutton">failures
+of Torbutton</ulink>: Even if users managed to install everything properly,
+the toggle model was too hard for the average user to understand, especially
+in the face of accumulating tabs from multiple states crossed with the current
+Tor-state of the browser.
+
+ </para>
+ </listitem>
+ <listitem><command>Favor the implementation mechanism least likely to
+break sites</command>
+ <para>
+
+In general, we try to find solutions to privacy issues that will not induce
+site breakage, though this is not always possible.
+
+ </para>
+ </listitem>
+ <listitem><command>Plugins must be restricted</command>
+ <para>
+
+Even if plugins always properly used the browser proxy settings (which none of
+them do) and could not be induced to bypass them (which all of them can), the
+activities of closed-source plugins are very difficult to audit and control.
+They can obtain and transmit all manner of system information to websites,
+often have their own identifier storage for tracking users, and also
+contribute to fingerprinting.
+
+ </para>
+ <para>
+
+Therefore, if plugins are to be enabled in private browsing modes, they must
+be restricted from running automatically on every page (via click-to-play
+placeholders), and/or be sandboxed to restrict the types of system calls they
+can execute. If the user agent allows the user to craft an exemption to allow
+a plugin to be used automatically, it must only apply to the top level url bar
+domain, and not to all sites, to reduce cross-origin fingerprinting
+linkability.
+
+ </para>
+ </listitem>
+ <listitem><command>Minimize Global Privacy Options</command>
+ <para>
+
+<ulink url="https://trac.torproject.org/projects/tor/ticket/3100">Another
+failure of Torbutton</ulink> was the options panel. Each option
+that detectably alters browser behavior can be used as a fingerprinting tool.
+Similarly, all extensions <ulink
+url="http://blog.chromium.org/2010/06/extensions-in-incognito.html">should be
+disabled in the mode</ulink> except as an opt-in basis. We should not load
+system-wide and/or Operating System provided addons or plugins.
+
+ </para>
+ <para>
+Instead of global browser privacy options, privacy decisions should be made
+<ulink
+url="https://wiki.mozilla.org/Privacy/Features/Site-based_data_management_UI">per
+url bar origin</ulink> to eliminate the possibility of linkability
+between domains. For example, when a plugin object (or a Javascript access of
+window.plugins) is present in a page, the user should be given the choice of
+allowing that plugin object for that url bar origin only. The same
+goes for exemptions to third party cookie policy, geo-location, and any other
+privacy permissions.
+ </para>
+ <para>
+If the user has indicated they wish to record local history storage, these
+permissions can be written to disk. Otherwise, they should remain memory-only.
+ </para>
+ </listitem>
+ <listitem><command>No filters</command>
+ <para>
+
+Site-specific or filter-based addons such as <ulink
+url="https://addons.mozilla.org/en-US/firefox/addon/adblock-plus/">AdBlock
+Plus</ulink>, <ulink url="http://requestpolicy.com/">Request Policy</ulink>,
+<ulink url="http://www.ghostery.com/about">Ghostery</ulink>, <ulink
+url="http://priv3.icsi.berkeley.edu/">Priv3</ulink>, and <ulink
+url="http://sharemenot.cs.washington.edu/">Sharemenot</ulink> are to be
+avoided. We believe that these addons do not add any real privacy to a proper
+<link linkend="Implementation">implementation</link> of the above <link
+linkend="privacy">privacy requirements</link>, and that development efforts
+should be focused on general solutions that prevent tracking by all
+third parties, rather than a list of specific URLs or hosts.
+ </para>
+ <para>
+Filter-based addons can also introduce strange breakage and cause usability
+nightmares, and will also fail to do their job if an adversary simply
+registers a new domain or creates a new url path. Worse still, the unique
+filter sets that each user creates or installs will provide a wealth of
+fingerprinting targets.
+ </para>
+ <para>
+
+As a general matter, we are also generally opposed to shipping an always-on Ad
+blocker with Tor Browser. We feel that this would damage our credibility in
+terms of demonstrating that we are providing privacy through a sound design
+alone, as well as damage the acceptance of Tor users by sites that support
+themselves through advertising revenue.
+
+ </para>
+ <para>
+Users are free to install these addons if they wish, but doing
+so is not recommended, as it will alter the browser request fingerprint.
+ </para>
+ </listitem>
+ <listitem><command>Stay Current</command>
+ <para>
+We believe that if we do not stay current with the support of new web
+technologies, we cannot hope to substantially influence or be involved in
+their proper deployment or privacy realization. However, we will likely disable
+high-risk features pending analysis, audit, and mitigation.
+ </para>
+ </listitem>
+<!--
+ <listitem><command>Transparency in Navigation Tracking</command>
+ <para>
+
+While we believe it is possible to restrict third party tracking with only
+minimal site breakage, it is our long-term goal to further reduce cross-origin
+click navigation tracking to mechanisms that are detectable by experts and
+attentive users, so they can alert the general public if cross-origin
+click navigation tracking is happening where it should not be.
+
+ </para>
+ <para>
+
+However, the entrenched nature of certain archaic web features make it
+impossible for us to achieve this wider goal by ourselves without substantial
+site breakage. So, instead we maintain a <link linkend="deprecate">Deprecation
+Wishlist</link> of archaic web technologies that are currently being (ab)used
+to facilitate federated login and other legitimate click-driven cross-domain
+activity but that can one day be replaced with more privacy friendly,
+auditable alternatives.
+
+ </para>
+ </listitem>
+-->
+ </orderedlist>
+ </sect2>
+</sect1>
+
+<!--
+- Implementation
+ - Section Template
+ - Sub Section
+ - "Design Goal":
+ - "Implementation Status"
+ - Local Privacy
+ - Linkability
+ - Stored State
+ - Cookies
+ - Cache
+ - DOM Storage
+ - HTTP Auth
+ - SSL state
+ - Plugins
+ - Fingerprinting
+ - Location + timezone is part of this
+ - Patches?
+-->
+ <sect1 id="adversary">
+ <title>Adversary Model</title>
+ <para>
+
+A Tor web browser adversary has a number of goals, capabilities, and attack
+types that can be used to illustrate the design requirements for the
+Tor Browser. Let's start with the goals.
+
+ </para>
+ <sect2 id="adversary-goals">
+ <title>Adversary Goals</title>
+ <orderedlist>
+<!-- These aren't really commands.. But it's the closest I could find in an
+acceptable style.. Don't really want to make my own stylesheet -->
+ <listitem><command>Bypassing proxy settings</command>
+ <para>The adversary's primary goal is direct compromise and bypass of
+Tor, causing the user to directly connect to an IP of the adversary's
+choosing.</para>
+ </listitem>
+ <listitem><command>Correlation of Tor vs Non-Tor Activity</command>
+ <para>If direct proxy bypass is not possible, the adversary will likely
+happily settle for the ability to correlate something a user did via Tor with
+their non-Tor activity. This can be done with cookies, cache identifiers,
+javascript events, and even CSS. Sometimes the fact that a user uses Tor may
+be enough for some authorities.</para>
+ </listitem>
+ <listitem><command>History disclosure</command>
+ <para>
+The adversary may also be interested in history disclosure: the ability to
+query a user's history to see if they have issued certain censored search
+queries, or visited censored sites.
+ </para>
+ </listitem>
+ <listitem><command>Correlate activity across multiple sites</command>
+ <para>
+
+The primary goal of the advertising networks is to know that the user who
+visited siteX.com is the same user that visited siteY.com to serve them
+targeted ads. The advertising networks become our adversary insofar as they
+attempt to perform this correlation without the user's explicit consent.
+
+ </para>
+ </listitem>
+ <listitem><command>Fingerprinting/anonymity set reduction</command>
+ <para>
+
+Fingerprinting (more generally: "anonymity set reduction") is used to attempt
+to gather identifying information on a particular individual without the use
+of tracking identifiers. If the dissident or whistleblower's timezone is
+available, and they are using a rare build of Firefox for an obscure operating
+system, and they have a specific display resolution only used on one type of
+laptop, this can be very useful information for tracking them down, or at
+least <link linkend="fingerprinting">tracking their activities</link>.
+
+ </para>
+ </listitem>
+ <listitem><command>History records and other on-disk
+information</command>
+ <para>
+In some cases, the adversary may opt for a heavy-handed approach, such as
+seizing the computers of all Tor users in an area (especially after narrowing
+the field by the above two pieces of information). History records and cache
+data are the primary goals here.
+ </para>
+ </listitem>
+ </orderedlist>
+ </sect2>
+
+ <sect2 id="adversary-positioning">
+ <title>Adversary Capabilities - Positioning</title>
+ <para>
+The adversary can position themselves at a number of different locations in
+order to execute their attacks.
+ </para>
+ <orderedlist>
+ <listitem><command>Exit Node or Upstream Router</command>
+ <para>
+The adversary can run exit nodes, or alternatively, they may control routers
+upstream of exit nodes. Both of these scenarios have been observed in the
+wild.
+ </para>
+ </listitem>
+ <listitem><command>Ad servers and/or Malicious Websites</command>
+ <para>
+The adversary can also run websites, or more likely, they can contract out
+ad space from a number of different ad servers and inject content that way. For
+some users, the adversary may be the ad servers themselves. It is not
+inconceivable that ad servers may try to subvert or reduce a user's anonymity
+through Tor for marketing purposes.
+ </para>
+ </listitem>
+ <listitem><command>Local Network/ISP/Upstream Router</command>
+ <para>
+The adversary can also inject malicious content at the user's upstream router
+when they have Tor disabled, in an attempt to correlate their Tor and Non-Tor
+activity.
+ </para>
+ <para>
+
+Additionally, at this position the adversary can block Tor, or attempt to
+recognize the traffic patterns of specific web pages at the entrance to the Tor
+network.
+
+ </para>
+ </listitem>
+ <listitem><command>Physical Access</command>
+ <para>
+Some users face adversaries with intermittent or constant physical access.
+Users in Internet cafes, for example, face such a threat. In addition, in
+countries where simply using tools like Tor is illegal, users may face
+confiscation of their computer equipment for excessive Tor usage or just
+general suspicion.
+ </para>
+ </listitem>
+ </orderedlist>
+ </sect2>
+
+ <sect2 id="attacks">
+ <title>Adversary Capabilities - Attacks</title>
+ <para>
+
+The adversary can perform the following attacks from a number of different
+positions to accomplish various aspects of their goals. It should be noted
+that many of these attacks (especially those involving IP address leakage) are
+often performed by accident by websites that simply have Javascript, dynamic
+CSS elements, and plugins. Others are performed by ad servers seeking to
+correlate users' activity across different IP addresses, and still others are
+performed by malicious agents on the Tor network and at national firewalls.
+
+ </para>
+ <orderedlist>
+ <listitem><command>Read and insert identifiers</command>
+ <para>
+
+The browser contains multiple facilities for storing identifiers that the
+adversary creates for the purposes of tracking users. These identifiers are
+most obviously cookies, but also include HTTP auth, DOM storage, cached
+scripts and other elements with embedded identifiers, client certificates, and
+even TLS Session IDs.
+
+ </para>
+ <para>
+
+An adversary in a position to perform MITM content alteration can inject
+document content elements to both read and inject cookies for arbitrary
+domains. In fact, even many "SSL secured" websites are vulnerable to this sort of
+<ulink url="http://seclists.org/bugtraq/2007/Aug/0070.html">active
+sidejacking</ulink>. In addition, the ad networks of course perform tracking
+with cookies as well.
+
+ </para>
+ <para>
+
+These types of attacks are attempts at subverting our <link
+linkend="identifier-linkability">Cross-Origin Identifier Unlinkability</link> and <link
+linkend="new-identity">Long-Term Unlinkability</link> design requirements.
+
+ </para>
+ </listitem>
+ <listitem id="fingerprinting"><command>Fingerprint users based on browser
+attributes</command>
+<para>
+
+There is an absurd amount of information available to websites via attributes
+of the browser. This information can be used to reduce the anonymity set, or even
+uniquely fingerprint individual users. Attacks of this nature are typically
+aimed at tracking users across sites without their consent, in an attempt to
+subvert our <link linkend="fingerprinting-linkability">Cross-Origin
+Fingerprinting Unlinkability</link> and <link
+linkend="new-identity">Long-Term Unlinkability</link> design requirements.
+
+</para>
+
+<para>
+
+Fingerprinting is an intimidating
+problem to attempt to tackle, especially without a metric to determine or at
+least intuitively understand and estimate which features will most contribute
+to linkability between visits.
+
+</para>
+
+<para>
+
+The <ulink url="https://panopticlick.eff.org/about.php">Panopticlick study
+done</ulink> by the EFF uses the <ulink
+url="https://en.wikipedia.org/wiki/Entropy_%28information_theory%29">Shannon
+entropy</ulink> - the number of identifying bits of information encoded in
+browser properties - as this metric. Their <ulink
+url="https://wiki.mozilla.org/Fingerprinting#Data">result data</ulink> is
+definitely useful, and the metric is probably the appropriate one for
+determining how identifying a particular browser property is. However, some
+quirks of their study mean that they do not extract as much information as
+they could from display information: they only use desktop resolution and do
+not attempt to infer the size of toolbars. In the other direction, they may be
+over-counting in some areas, as they did not compute joint entropy over
+multiple attributes that may exhibit a high degree of correlation. Also, new
+browser features are added regularly, so the data should not be taken as
+final.
+
+ </para>
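As an illustration of this metric (with made-up counts, not the EFF's data), the identifying information carried by one browser attribute is the Shannon entropy of its observed distribution:

```javascript
// Shannon entropy (in bits) of one browser attribute's observed
// value distribution. The counts below are illustrative only.
function shannonBits(counts) {
  const total = counts.reduce((a, b) => a + b, 0);
  return counts
    .map((c) => c / total)
    .filter((p) => p > 0)
    .reduce((h, p) => h - p * Math.log2(p), 0);
}

// Four equally common user agent strings carry log2(4) = 2 bits:
console.log(shannonBits([25, 25, 25, 25])); // 2
```

Joint entropy over correlated attributes (the over-counting issue noted above) would require computing this over the joint distribution, not summing per-attribute values.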
+ <para>
+
+Despite the uncertainty, all fingerprinting attacks leverage the following
+attack vectors:
+
+ </para>
+ <orderedlist>
+ <listitem><command>Observing Request Behavior</command>
+ <para>
+
+Properties of the user's request behavior comprise the bulk of low-hanging
+fingerprinting targets. These include: User agent, Accept-* headers, pipeline
+usage, and request ordering. Additionally, custom filters such as
+AdBlock and other privacy filters can be used to fingerprint request patterns
+(as an extreme example).
+
+ </para>
+ </listitem>
+
+ <listitem><command>Inserting Javascript</command>
+ <para>
+
+Javascript can reveal a lot of fingerprinting information. It provides DOM
+objects such as window.screen and window.navigator that can be used to
+extract information about the user agent.
+
+Also, Javascript can be used to query the user's timezone via the
+<function>Date()</function> object, <ulink
+url="https://www.khronos.org/registry/webgl/specs/1.0/#5.13">WebGL</ulink> can
+reveal information about the video card in use, and high precision timing
+information can be used to <ulink
+url="http://w2spconf.com/2011/papers/jspriv.pdf">fingerprint the CPU and
+interpreter speed</ulink>. In the future, new JavaScript features such as
+<ulink url="http://w3c-test.org/webperf/specs/ResourceTiming/">Resource
+Timing</ulink> may leak an unknown amount of network timing related
+information.
+
+<!-- FIXME: resource-timing stuff? -->
+
+ </para>
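A few of these channels are visible from a short script. This is a sketch meant for a browser console; the guards merely let it run outside a browser, where window and navigator may be absent:

```javascript
// Sketch of fingerprinting data a page script can read. Outside a
// browser, only the timezone channel is available.
function fingerprintSample() {
  return {
    // Date() leaks the timezone even when no other API does.
    timezoneOffsetMinutes: new Date().getTimezoneOffset(),
    userAgent: typeof navigator !== "undefined" ? navigator.userAgent : null,
    screen:
      typeof window !== "undefined" && window.screen
        ? { width: window.screen.width, height: window.screen.height }
        : null,
  };
}

console.log(fingerprintSample());
```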
+ </listitem>
+
+ <listitem><command>Inserting Plugins</command>
+ <para>
+
+The Panopticlick project found that the mere list of installed plugins (in
+navigator.plugins) was sufficient to provide a large degree of
+fingerprintability. Additionally, plugins are capable of extracting font lists,
+interface addresses, and other machine information that is beyond what the
+browser would normally provide to content. In addition, plugins can be used to
+store unique identifiers that are more difficult to clear than standard
+cookies. <ulink url="http://epic.org/privacy/cookies/flash.html">Flash-based
+cookies</ulink> fall into this category, but there are likely numerous other
+examples. Beyond fingerprinting, plugins are also abysmal at obeying the proxy
+settings of the browser.
+
+
+ </para>
+ </listitem>
+ <listitem><command>Inserting CSS</command>
+ <para>
+
+<ulink url="https://developer.mozilla.org/En/CSS/Media_queries">CSS media
+queries</ulink> can be inserted to gather information about the desktop size,
+widget size, display type, DPI, user agent type, and other information that
+was formerly available only to Javascript.
+
+ </para>
+ </listitem>
+ </orderedlist>
+ </listitem>
+ <listitem id="website-traffic-fingerprinting"><command>Website traffic fingerprinting</command>
+ <para>
+
+Website traffic fingerprinting is an attempt by the adversary to recognize the
+encrypted traffic patterns of specific websites. In the case of Tor, this
+attack would take place between the user and the Guard node, or at the Guard
+node itself.
+ </para>
+
+ <para> The most comprehensive study of the statistical properties of this
+attack against Tor was done by <ulink
+url="http://lorre.uni.lu/~andriy/papers/acmccs-wpes11-fingerprinting.pdf">Panchenko
+et al</ulink>. Unfortunately, publication bias in academia has encouraged
+the production of a number of follow-on attack papers claiming "improved"
+success rates, in some cases even claiming to completely invalidate any
+attempt at defense. These "improvements" are actually enabled primarily by
+taking a number of shortcuts (such as classifying only very small numbers of
+web pages, neglecting to publish ROC curves or at least false positive rates,
+and/or omitting the effects of dataset size on their results). Despite these
+subsequent "improvements", we are skeptical of the efficacy of this attack in
+a real world scenario, <emphasis>especially</emphasis> in the face of any
+defenses.
+
+ </para>
+ <para>
+
+In general, with machine learning, as you increase the <ulink
+url="https://en.wikipedia.org/wiki/VC_dimension">number and/or complexity of
+categories to classify</ulink> while maintaining a limit on reliable feature
+information you can extract, you eventually run out of descriptive feature
+information, and either true positive accuracy goes down or the false positive
+rate goes up. This error is called the <ulink
+url="http://www.cs.washington.edu/education/courses/csep573/98sp/lectures/lectur…">bias
+in your hypothesis space</ulink>. In fact, even for unbiased hypothesis
+spaces, the number of training examples required to achieve a reasonable error
+bound is <ulink
+url="https://en.wikipedia.org/wiki/Probably_approximately_correct_learning#Equiv…">a
+function of the complexity of the categories</ulink> you need to classify.
+
+ </para>
+ <para>
+
+
+In the case of this attack, the key factors that increase the classification
+complexity (and thus hinder a real world adversary who attempts this attack)
+are large numbers of dynamically generated pages, partially cached content,
+and the non-web activity of the entire Tor network. This yields an effective
+number of "web pages" many orders of magnitude larger than even <ulink
+url="http://lorre.uni.lu/~andriy/papers/acmccs-wpes11-fingerprinting.pdf">Panchenko's
+"Open World" scenario</ulink>, which suffered a continuous, near-constant decline
+in the true positive rate as the "Open World" size grew (see figure 4). This
+large level of classification complexity is further confounded by a noisy and
+low resolution featureset - one which is also relatively easy for the defender
+to manipulate at low cost.
+
+ </para>
+ <para>
+
+To make matters worse for a real-world adversary, the ocean of Tor Internet
+activity (at least, when compared to a lab setting) makes it a certainty that
+an adversary attempting to examine large amounts of Tor traffic will ultimately
+be overwhelmed by false positives (even after making heavy tradeoffs on the
+ROC curve to minimize false positives to below 0.01%). This problem is known
+in the IDS literature as the <ulink
+url="http://www.raid-symposium.org/raid99/PAPERS/Axelsson.pdf">Base Rate
+Fallacy</ulink>, and it is the primary reason that anomaly and activity
+classification-based IDS and antivirus systems have failed to materialize in
+the marketplace (despite early success in academic literature).
+
+ </para>
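The arithmetic behind the Base Rate Fallacy is plain Bayes' rule. Even with hypothetical numbers that favor the attacker, false positives dominate once the target's base rate is low:

```javascript
// P(target page | classifier alarm) via Bayes' rule. All numbers
// below are hypothetical, chosen only to illustrate the effect.
function posterior(prior, truePositiveRate, falsePositiveRate) {
  const hit = truePositiveRate * prior;
  const falseAlarm = falsePositiveRate * (1 - prior);
  return hit / (hit + falseAlarm);
}

// Monitored page is 1 in a million Tor page loads; classifier has a
// 99% true positive rate and a 0.01% false positive rate:
console.log(posterior(1e-6, 0.99, 1e-4)); // ~0.0098: >99% of alarms are false
```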
+ <para>
+
+Still, we do not believe that these issues are enough to dismiss the attack
+outright. But we do believe these factors make it both worthwhile and
+effective to <link linkend="traffic-fingerprinting-defenses">deploy
+light-weight defenses</link> that reduce the accuracy of this attack by
+further contributing noise to hinder successful feature extraction.
+
+ </para>
+ </listitem>
+ <listitem><command>Remotely or locally exploit browser and/or
+OS</command>
+ <para>
+
+Last, but definitely not least, the adversary can exploit either general
+browser vulnerabilities, plugin vulnerabilities, or OS vulnerabilities to
+install malware and surveillance software. An adversary with physical access
+can perform similar actions.
+
+ </para>
+ <para>
+
+For the purposes of the browser itself, we limit the scope of this adversary
+to one that has passive forensic access to the disk after browsing activity
+has taken place. This adversary motivates our
+<link linkend="disk-avoidance">Disk Avoidance</link> defenses.
+
+ </para>
+ <para>
+
+An adversary with arbitrary code execution typically has more power, though,
+and it is quite hard to meaningfully limit the capabilities of such an
+adversary. <ulink
+url="http://tails.boum.org/contribute/design/">The Tails system</ulink> can
+provide some defense against this adversary through the use of read-only media
+and frequent reboots, but even this can be circumvented on machines without
+Secure Boot through the use of BIOS rootkits.
+
+ </para>
+ </listitem>
+ </orderedlist>
+ </sect2>
+
+</sect1>
+
+<sect1 id="Implementation">
+ <title>Implementation</title>
+ <para>
+
+The Implementation section is divided into subsections, each of which
+corresponds to a <link linkend="DesignRequirements">Design Requirement</link>.
+Each subsection is divided into specific web technologies or properties. The
+implementation is then described for that property.
+
+ </para>
+ <para>
+
+In some cases, the implementation meets the design requirements in a non-ideal
+way (for example, by disabling features). In rare cases, there may be no
+implementation at all. Both of these cases are denoted by differentiating
+between the <command>Design Goal</command> and the <command>Implementation
+Status</command> for each property. Corresponding bugs in the <ulink
+url="https://trac.torproject.org/projects/tor/report">Tor bug tracker</ulink>
+are typically linked for these cases.
+
+ </para>
+ <sect2 id="proxy-obedience">
+ <title>Proxy Obedience</title>
+ <para>
+
+Proxy obedience is assured through the following:
+ </para>
+<orderedlist>
+ <listitem>Firefox proxy settings, patches, and build flags
+ <para>
+Our <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/HEAD:/build-scripts/confi…">Firefox
+preferences file</ulink> sets the Firefox proxy settings to use Tor directly as a
+SOCKS proxy. It sets <command>network.proxy.socks_remote_dns</command>,
+<command>network.proxy.socks_version</command>,
+<command>network.proxy.socks_port</command>, and
+<command>network.dns.disablePrefetch</command>.
+ </para>
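In user.js form, the relevant lines look roughly like the following. This is a sketch: the proxy type, host, and port values shown are illustrative assumptions, and the authoritative values live in the preferences file linked above:

```javascript
// Illustrative user.js fragment; exact values are in the linked
// preferences file. The host and port shown here are assumptions.
user_pref("network.proxy.type", 1);                // manual proxy settings
user_pref("network.proxy.socks", "127.0.0.1");     // local Tor client
user_pref("network.proxy.socks_port", 9150);
user_pref("network.proxy.socks_version", 5);
user_pref("network.proxy.socks_remote_dns", true); // resolve names via Tor
user_pref("network.dns.disablePrefetch", true);    // no speculative DNS
```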
+ <para>
+
+We also patch Firefox in order to <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">prevent
+a DNS leak due to a WebSocket rate-limiting check</ulink>. As stated in the
+patch, we believe the direct DNS resolution performed by this check is in
+violation of the W3C standard, but <ulink
+url="https://bugzilla.mozilla.org/show_bug.cgi?id=751465">this DNS proxy leak
+remains present in stock Firefox releases</ulink>.
+
+ </para>
+ <para>
+
+During the transition to Firefox 17-ESR, a code audit was undertaken to verify
+that there were no system calls or XPCOM activity in the source tree that did
+not use the browser proxy settings. The only violation we found was that
+WebRTC was capable of creating UDP sockets and was compiled in by default. We
+subsequently disabled it using the Firefox build option
+<command>--disable-webrtc</command>.
+
+ </para>
+ <para>
+
+We have verified that these settings and patches properly proxy HTTPS, OCSP,
+HTTP, FTP, gopher (now defunct), DNS, SafeBrowsing Queries, all Javascript
+activity, including HTML5 audio and video objects, addon updates, wifi
+geolocation queries, searchbox queries, XPCOM addon HTTPS/HTTP activity,
+WebSockets, and live bookmark updates. We have also verified that IPv6
+connections are not attempted, through the proxy or otherwise (Tor does not
+yet support IPv6). We have also verified that external protocol helpers, such
+as smb urls and other custom protocol handlers, are all blocked.
+
+ </para>
+ <para>
+
+Numerous other third parties have also reviewed and tested the proxy settings
+and have provided test cases based on their work. See in particular <ulink
+url="http://decloak.net/">decloak.net</ulink>.
+
+ </para>
+</listitem>
+
+ <listitem>Disabling plugins
+
+ <para>Plugins have the ability to make arbitrary OS system calls and <ulink
+url="http://decloak.net/">bypass proxy settings</ulink>. This includes
+the ability to make UDP sockets and send arbitrary data independent of the
+browser proxy settings.
+ </para>
+ <para>
+Torbutton disables plugins by using the
+<command>@mozilla.org/plugin/host;1</command> service to mark the plugin tags
+as disabled. This block can be undone through both the Torbutton Security UI,
+and the Firefox Plugin Preferences.
+ </para>
+ <para>
+If the user does enable plugins in this way, plugin-handled objects are still
+restricted from automatic load through Firefox's click-to-play preference
+<command>plugins.click_to_play</command>.
+ </para>
+ <para>
+In addition, to reduce any unproxied activity by arbitrary plugins at load
+time, and to reduce the fingerprintability of the installed plugin list, we
+also patch the Firefox source code to <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">prevent the load of any plugins except
+for Flash and Gnash</ulink>.
+
+ </para>
+ </listitem>
+ <listitem>External App Blocking and Drag Event Filtering
+ <para>
+
+External apps can be induced to load files that perform network activity.
+Unfortunately, there are cases where such apps can be launched automatically
+with little to no user input. In order to prevent this, Torbutton installs a
+component to <ulink
+url="https://gitweb.torproject.org/torbutton.git/blob_plain/HEAD:/src/components…">
+provide the user with a popup</ulink> whenever the browser attempts to launch
+a helper app.
+
+ </para>
+ <para>
+
+Additionally, modern desktops now pre-emptively fetch any URLs in Drag and
+Drop events as soon as the drag is initiated. This download happens
+independent of the browser's Tor settings, and can be triggered by something
+as simple as holding the mouse button down for slightly too long while
+clicking on an image link. We had to patch Firefox to <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">emit
+an observer event during dragging</ulink> to allow us to filter the drag
+events from Torbutton before the OS downloads the URLs the events contained.
+
+ </para>
+ </listitem>
+ <listitem>Disabling system extensions and clearing the addon whitelist
+ <para>
+
+Firefox addons can perform arbitrary activity on your computer, including
+bypassing Tor. It is for this reason that we disable the addon whitelist
+(<command>xpinstall.whitelist.add</command>), so that users are prompted
+before installing addons regardless of the source. We also exclude
+system-level addons from the browser through the use of
+<command>extensions.enabledScopes</command> and
+<command>extensions.autoDisableScopes</command>.
+
+ </para>
+ </listitem>
+ </orderedlist>
+ </sect2>
+ <sect2 id="state-separation">
+ <title>State Separation</title>
+ <para>
+
+Tor Browser State is separated from existing browser state through use of a
+custom Firefox profile, and by setting the $HOME environment variable to the
+root of the bundle's directory. The browser also does not load any
+system-wide extensions (through the use of
+<command>extensions.enabledScopes</command> and
+<command>extensions.autoDisableScopes</command>). Furthermore, plugins are
+disabled, which prevents Flash cookies from leaking from a pre-existing Flash
+directory.
+
+ </para>
+ </sect2>
+ <sect2 id="disk-avoidance">
+ <title>Disk Avoidance</title>
+ <sect3>
+ <title>Design Goal:</title>
+ <blockquote>
+
+The User Agent MUST (at user option) prevent all disk records of browser activity.
+The user should be able to optionally enable URL history and other history
+features if they so desire.
+
+ </blockquote>
+ </sect3>
+ <sect3>
+ <title>Implementation Status:</title>
+ <blockquote>
+
+We achieve this goal through several mechanisms. First, we set the Firefox
+Private Browsing preference
+<command>browser.privatebrowsing.autostart</command>. In addition, four
+Firefox patches are needed to prevent disk writes even when Private Browsing
+Mode is enabled. We need to
+
+<ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">prevent
+the permissions manager from recording HTTPS STS state</ulink>,
+<ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">prevent
+intermediate SSL certificates from being recorded</ulink>,
+<ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">prevent
+download history from being recorded</ulink>, and
+<ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">prevent
+the content preferences service from recording site zoom</ulink>.
+
+For more details on these patches, <link linkend="firefox-patches">see the
+Firefox Patches section</link>.
+
+ </blockquote>
+ <blockquote>
+
+As an additional defense-in-depth measure, we set the following preferences:
+<command>browser.cache.disk.enable</command>,
+<command>browser.cache.offline.enable</command>,
+<command>dom.indexedDB.enabled</command>,
+<command>network.cookie.lifetimePolicy</command>,
+<command>signon.rememberSignons</command>,
+<command>browser.formfill.enable</command>,
+<command>browser.download.manager.retention</command>,
+and <command>browser.sessionstore.privacy_level</command>. Many of these
+preferences are likely redundant with
+<command>browser.privatebrowsing.autostart</command>, but we have not done the
+auditing work to ensure that yet.
+
+ </blockquote>
+ <blockquote>
+
+Torbutton also <ulink
+url="https://gitweb.torproject.org/torbutton.git/blob/HEAD:/src/components/tbSes…">contains
+code</ulink> to prevent the Firefox session store from writing to disk.
+ </blockquote>
+ <blockquote>
+
+For more details on disk leak bugs and enhancements, see the <ulink
+url="https://trac.torproject.org/projects/tor/query?keywords=~tbb-disk-leak&…">tbb-disk-leak tag in our bugtracker</ulink>.
+ </blockquote>
+ </sect3>
+ </sect2>
+ <sect2 id="app-data-isolation">
+ <title>Application Data Isolation</title>
+ <para>
+
+The Tor Browser Bundle MUST NOT cause any information to be written outside of the
+bundle directory. This is to ensure that the user is able to completely and
+safely remove the bundle without leaving other traces of Tor usage on their
+computer.
+
+ </para>
+ <para>
+
+To ensure TBB directory isolation, we set
+<command>browser.download.useDownloadDir</command>,
+<command>browser.shell.checkDefaultBrowser</command>, and
+<command>browser.download.manager.addToRecentDocs</command>. We also set the
+$HOME environment variable to be the TBB extraction directory.
+ </para>
+
+ </sect2>
+<!-- FIXME: Write me...
+ <sect2 id="update-safety">
+ <title>Update Safety</title>
+ <para>FIXME: Write me..
+ </para>
+ </sect2>
+-->
+ <sect2 id="identifier-linkability">
+ <title>Cross-Origin Identifier Unlinkability</title>
+ <!-- FIXME: Mention web-send?? -->
+ <para>
+
+The Tor Browser MUST prevent a user's activity on one site from being linked
+to their activity on another site. When this goal cannot yet be met with an
+existing web technology, that technology or functionality is disabled. Our
+<link linkend="privacy">design goal</link> is to ultimately eliminate the need to disable arbitrary
+technologies, and instead simply alter them in ways that allow them to
+function in a backwards-compatible way while avoiding linkability. Users
+should be able to use federated login of various kinds to explicitly inform
+sites who they are, but that information should not transparently allow a
+third party to record their activity from site to site without their prior
+consent.
+
+ </para>
+ <para>
+
+The benefit of this approach comes not only in the form of reduced
+linkability, but also in terms of simplified privacy UI. If all stored browser
+state and permissions become associated with the url bar origin, the six or
+seven different pieces of privacy UI governing these identifiers and
+permissions can become just one piece of UI: for instance, a window that lists
+the url bar origins for which browser state exists, possibly with a
+context-menu option to drill down into specific types of state or permissions.
+An example of this simplification can be seen in Figure 1.
+
+ </para>
+ <figure><title>Improving the Privacy UI</title>
+ <mediaobject>
+ <imageobject>
+ <imagedata align="center" fileref="NewCookieManager.png"/>
+ </imageobject>
+ </mediaobject>
+ <caption> <para/>
+
+This example UI is a mock-up of how isolating identifiers to the URL bar
+origin can simplify the privacy UI for all data - not just cookies. Once
+browser identifiers and site permissions operate on a url bar basis, the same
+privacy window can represent browsing history, DOM Storage, HTTP Auth, search
+form history, login values, and so on within a context menu for each site.
+
+</caption>
+ </figure>
+ <orderedlist>
+ <listitem>Cookies
+ <para><command>Design Goal:</command>
+
+All cookies MUST be double-keyed to the url bar origin and third-party
+origin. There exists a <ulink
+url="https://bugzilla.mozilla.org/show_bug.cgi?id=565965">Mozilla bug</ulink>
+that contains a prototype patch, but it lacks UI, and does not apply to modern
+Firefoxes.
+
+ </para>
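The double-keying idea itself is straightforward. A minimal sketch (hypothetical class and method names, not Firefox's cookie service API):

```javascript
// Cookies stored and looked up under the pair (url bar origin,
// request origin): a tracker embedded on two first-party sites gets
// two unrelated cookie jars, so it cannot link visits across them.
class DoubleKeyedJar {
  constructor() { this.jars = new Map(); }
  key(firstParty, requestOrigin) { return `${firstParty}|${requestOrigin}`; }
  set(firstParty, requestOrigin, name, value) {
    const k = this.key(firstParty, requestOrigin);
    if (!this.jars.has(k)) this.jars.set(k, new Map());
    this.jars.get(k).set(name, value);
  }
  get(firstParty, requestOrigin, name) {
    const jar = this.jars.get(this.key(firstParty, requestOrigin));
    return jar ? jar.get(name) : undefined;
  }
}

const jar = new DoubleKeyedJar();
jar.set("news.example", "tracker.example", "id", "abc123");
// The same tracker embedded on another site sees nothing:
console.log(jar.get("blog.example", "tracker.example", "id")); // undefined
```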
+ <para><command>Implementation Status:</command>
+
+As a stopgap to satisfy our design requirement of unlinkability, we currently
+entirely disable 3rd party cookies by setting
+<command>network.cookie.cookieBehavior</command> to 1. We would prefer that
+third party content continue to function, but we believe the requirement for
+unlinkability trumps that desire.
+
+ </para>
+ </listitem>
+ <listitem>Cache
+ <para>
+
+Cache is isolated to the url bar origin by using a technique pioneered by
+Colin Jackson et al, via their work on <ulink
+url="http://www.safecache.com/">SafeCache</ulink>. The technique re-uses the
+<ulink
+url="https://developer.mozilla.org/en/XPCOM_Interface_Reference/nsICachingChannel">nsICachingChannel.cacheKey</ulink>
+attribute that Firefox uses internally to prevent improper caching and reuse
+of HTTP POST data.
+
+ </para>
+ <para>
+
+However, to <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3666">increase the
+security of the isolation</ulink> and to <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3754">solve conflicts
+with OCSP relying on the cacheKey property for reuse of POST requests</ulink>, we
+had to <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">patch
+Firefox to provide a cacheDomain cache attribute</ulink>. We use the fully
+qualified url bar domain as input to this field, to avoid the complexities
+of heuristically determining the second-level DNS name.
+
+ </para>
+ <para>
+
+<!-- FIXME: This could use a few more specifics.. Maybe. The Chrome folks
+won't care, but the Mozilla folks might. --> Furthermore, we chose a different
+isolation scheme than the Stanford implementation. First, we decoupled the
+cache isolation from the third party cookie attribute. Second, we use several
+mechanisms to attempt to determine the actual location attribute of the
+top-level window (to obtain the url bar FQDN) used to load the page, as
+opposed to relying solely on the Referer property.
+
+ </para>
+ <para>
+
+Therefore, <ulink
+url="http://crypto.stanford.edu/sameorigin/safecachetest.html">the original
+Stanford test cases</ulink> are expected to fail. Functionality can still be
+verified by navigating to <ulink url="about:cache">about:cache</ulink> and
+viewing the key used for each cache entry. Each third party element should
+have an additional "domain=string" property prepended, which will list the
+FQDN that was used to source the third party element.
+
+ </para>
+ <para>
+
+Additionally, because the image cache is a separate entity from the content
+cache, we had to patch Firefox to also <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">isolate
+this cache per url bar domain</ulink>.
+
+ </para>
+ </listitem>
+ <listitem>HTTP Auth
+ <para>
+
+HTTP authentication tokens are removed for third party elements using the
+<ulink
+url="https://developer.mozilla.org/en/Setting_HTTP_request_headers#Observers">http-on-modify-request
+observer</ulink> to remove the Authorization headers to prevent <ulink
+url="http://jeremiahgrossman.blogspot.com/2007/04/tracking-users-without-cookies…">silent
+linkability between domains</ulink>.
+ </para>
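The observer's effect can be modeled in a few lines. This is a sketch of the logic only, not the XPCOM observer API:

```javascript
// Drop the Authorization header when a request's host differs from
// the url bar host, so stored HTTP auth cannot tag the user across sites.
function filterAuthHeader(urlBarHost, requestHost, headers) {
  const filtered = { ...headers };
  if (requestHost !== urlBarHost) delete filtered.Authorization;
  return filtered;
}

const headers = { Authorization: "Basic dXNlcjpodW50ZXIy", Accept: "*/*" };
// Third-party request: header stripped, other headers untouched.
console.log(filterAuthHeader("site.example", "tracker.example", headers));
```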
+ </listitem>
+ <listitem>DOM Storage
+ <para>
+
+DOM storage for third party domains MUST be isolated to the url bar origin,
+to prevent linkability between sites. This functionality is provided through a
+<ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">patch
+to Firefox</ulink>.
+
+ </para>
+ </listitem>
+ <listitem>Flash cookies
+ <para><command>Design Goal:</command>
+
+Users should be able to click-to-play Flash objects from trusted sites. To
+make this behavior unlinkable, we wish to ship a settings file for all
+platforms that disables Flash cookies using the <ulink
+url="http://www.macromedia.com/support/documentation/en/flashplayer/help/setting…">Flash
+settings manager</ulink>.
+
+ </para>
+ <para><command>Implementation Status:</command>
+
+We are currently <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3974">having
+difficulties</ulink> causing Flash player to use this settings
+file on Windows, so Flash remains difficult to enable.
+
+ </para>
+ </listitem>
+ <listitem>SSL+TLS session resumption, HTTP Keep-Alive and SPDY
+ <para><command>Design Goal:</command>
+
+TLS session resumption tickets and SSL Session IDs MUST be limited to the url
+bar origin. HTTP Keep-Alive connections from a third party in one url bar
+origin MUST NOT be reused for that same third party in another url bar origin.
+
+ </para>
+ <para><command>Implementation Status:</command>
+
+We currently clear SSL Session IDs upon <link linkend="new-identity">New
+Identity</link>, and we disable TLS Session Tickets via the Firefox Pref
+<command>security.enable_tls_session_tickets</command>. We disable SSL Session
+IDs via a <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">patch
+to Firefox</ulink>. To compensate for the increased round trip latency from disabling
+these performance optimizations, we also enable
+<ulink url="https://tools.ietf.org/html/draft-bmoeller-tls-falsestart-00">TLS
+False Start</ulink> via the Firefox Pref
+<command>security.ssl.enable_false_start</command>.
+ </para>
+ <para>
+
+Because of the extreme performance benefits of HTTP Keep-Alive for interactive
+web apps, and because of the difficulties of conveying urlbar origin
+information down into the Firefox HTTP layer, as a compromise we currently
+merely reduce the HTTP Keep-Alive timeout to 20 seconds (which is measured
+from the last packet read on the connection) using the Firefox preference
+<command>network.http.keep-alive.timeout</command>.
+
+ </para>
+ <para>
+However, because SPDY can store identifiers and has extremely long keepalive
+duration, it is disabled through the Firefox preference
+<command>network.http.spdy.enabled</command>.
+ </para>
+ </listitem>
+ <listitem>Automated cross-origin redirects MUST NOT store identifiers
+ <para><command>Design Goal:</command>
+
+To prevent attacks aimed at subverting the Cross-Origin Identifier
+Unlinkability <link linkend="privacy">privacy requirement</link>, the browser
+MUST NOT store any identifiers (cookies, cache, DOM storage, HTTP auth, etc)
+for cross-origin redirect intermediaries that do not prompt for user input.
+For example, if a user clicks on a bit.ly url that redirects to a
+doubleclick.net url that finally redirects to a cnn.com url, only cookies from
+cnn.com should be retained after the redirect chain completes.
+
+ </para>
+ <para>
+
+Non-automated redirect chains that require user input at some step (such as
+federated login systems) SHOULD still allow identifiers to persist.
+
+ </para>
+ <para><command>Implementation status:</command>
+
+There are numerous ways for the user to be redirected, and the Firefox API
+support to detect each of them is poor. We have a <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3600">trac bug
+open</ulink> to implement what we can.
+
+ </para>
+ </listitem>
+ <listitem>window.name
+ <para>
+
+<ulink
+url="https://developer.mozilla.org/En/DOM/Window.name">window.name</ulink> is
+a magical DOM property that for some reason is allowed to retain a persistent value
+for the lifespan of a browser tab. It is possible to utilize this property for
+<ulink url="http://www.thomasfrank.se/sessionvars.html">identifier
+storage</ulink>.
+
+ </para>
+ <para>
+
+In order to eliminate non-consensual linkability but still allow for sites
+that utilize this property to function, we reset the window.name property of
+tabs in Torbutton every time we encounter a blank Referer. This behavior
+allows window.name to persist for the duration of a click-driven navigation
+session, but as soon as the user enters a new URL or navigates between
+https/http schemes, the property is cleared.
+
+ </para>
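The rule reduces to one decision per navigation. A sketch of the policy, not Torbutton's actual code:

```javascript
// Keep window.name across click-driven navigation (non-blank referrer);
// clear it when the user starts a fresh session (blank referrer).
function nextWindowName(currentName, referrer) {
  return referrer ? currentName : "";
}

console.log(nextWindowName("tracker-id-42", "https://site.example/a")); // kept
console.log(nextWindowName("tracker-id-42", ""));                       // ""
```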
+ </listitem>
+ <listitem>Auto form-fill
+ <para>
+
+We disable the password saving functionality in the browser as part of our
+<link linkend="disk-avoidance">Disk Avoidance</link> requirement. However,
+since users may decide to re-enable disk history records and password saving,
+we also set the <ulink
+url="http://kb.mozillazine.org/Signon.autofillForms">signon.autofillForms</ulink>
+preference to false to prevent saved values from immediately populating
+fields upon page load. Since Javascript can read these values as soon as they
+appear, setting this preference prevents automatic linkability from stored passwords.
+
+ </para>
+ </listitem>
+ <listitem>HSTS supercookies
+ <para>
+
+An extreme (but not impossible) attack to mount is the creation of <ulink
+url="http://www.leviathansecurity.com/blog/archives/12-The-Double-Edged-Sword-of…">HSTS
+supercookies</ulink>. Since HSTS effectively stores one bit of information per domain
+name, an adversary in possession of numerous domains can use them to construct
+cookies based on stored HSTS state.
+
+ </para>
+ <para><command>Design Goal:</command>
+
+There appear to be three options: 1. Disable HSTS entirely, and rely
+instead on HTTPS-Everywhere to crawl and ship rules for HSTS sites. 2.
+Restrict the number of HSTS-enabled third parties allowed per url bar origin.
+3. Prevent third parties from storing HSTS rules. We have not yet decided upon
+the best approach.
+
+ </para>
+ <para><command>Implementation Status:</command> Currently, HSTS state is
+cleared by <link linkend="new-identity">New Identity</link>, but we don't
+defend against the creation of these cookies between <command>New
+Identity</command> invocations.
+ </para>
+ </listitem>
+ <listitem>Exit node usage
+ <para><command>Design Goal:</command>
+
+Every distinct navigation session (as defined by a non-blank Referer header)
+MUST exit through a fresh Tor circuit in Tor Browser to prevent exit node
+observers from linking concurrent browsing activity.
+
+ </para>
+ <para><command>Implementation Status:</command>
+
+The Tor feature that supports this ability only exists in the 0.2.3.x-alpha
+series. <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3455">Ticket
+#3455</ulink> is the Torbutton ticket to make use of the new Tor
+functionality.
+
+ </para>
+ </listitem>
+ </orderedlist>
+ <para>
+For more details on identifier linkability bugs and enhancements, see the <ulink
+url="https://trac.torproject.org/projects/tor/query?keywords=~tbb-linkability&am…">tbb-linkability tag in our bugtracker</ulink>.
+ </para>
+ </sect2>
+ <sect2 id="fingerprinting-linkability">
+ <title>Cross-Origin Fingerprinting Unlinkability</title>
+ <para>
+
+In order to properly address the fingerprinting adversary on a technical
+level, we need a metric to measure linkability of the various browser
+properties beyond any stored origin-related state. <ulink
+url="https://panopticlick.eff.org/about.php">The Panopticlick Project</ulink>
+by the EFF provides us with a prototype of such a metric. The researchers
+conducted a survey of volunteers who were asked to visit an experiment page
+that harvested many of the above components. They then computed the Shannon
+Entropy of the resulting distribution of each of several key attributes to
+determine how many bits of identifying information each attribute provided.
+
+ </para>
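The per-attribute calculation can be sketched as follows. This is a simplified illustration of the Shannon Entropy metric over a survey of observed attribute values, not the EFF's actual analysis code:

```python
import math
from collections import Counter

def shannon_entropy_bits(samples):
    """Bits of identifying information carried by the observed
    distribution of a single browser attribute."""
    counts = Counter(samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

For example, an attribute uniformly distributed over 8 values among the surveyed population carries 3 bits, while an attribute that is identical for all users carries 0 bits.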
+ <para>
+
+Many browser features have been added since the EFF first ran their experiment
+and collected their data. To avoid an infinite sinkhole, we limit our
+fingerprinting resistance efforts to reducing the fingerprintable
+differences <emphasis>among</emphasis> Tor Browser users. We
+do not believe it is possible to solve cross-browser fingerprinting issues.
+
+ </para>
+ <para>
+
+Unfortunately, the unsolvable nature of the cross-browser fingerprinting
+problem means that the Panopticlick test website itself is not useful for
+evaluating the actual effectiveness of our defenses, or the fingerprinting
+defenses of any other web browser. Because the Panopticlick dataset is based
+on browser data spanning a number of widely deployed browsers over a number of
+years, any fingerprinting defenses attempted by browsers today are very likely
+to cause Panopticlick to report an <emphasis>increase</emphasis> in
+fingerprintability and entropy, because those defenses will stand out in sharp
+contrast to historical data. We have been <ulink
+url="https://trac.torproject.org/projects/tor/ticket/6119">working to convince
+the EFF</ulink> that it is worthwhile to release the source code to
+Panopticlick to allow us to run our own version for this reason.
+
+ </para>
+ <sect3 id="fingerprinting-defenses">
+ <title>Fingerprinting defenses in the Tor Browser</title>
+
+ <orderedlist>
+ <listitem>Plugins
+ <para>
+
+Plugins add to fingerprinting risk via two main vectors: their mere presence in
+window.navigator.plugins, as well as their internal functionality.
+
+ </para>
+ <para><command>Design Goal:</command>
+
+All plugins that have not been specifically audited or sandboxed MUST be
+disabled. To reduce linkability potential, even sandboxed plugins should not
+be allowed to load objects until the user has clicked through a click-to-play
+barrier. Additionally, version information should be reduced or obfuscated
+until the plugin object is loaded. For flash, we wish to <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3974">provide a
+settings.sol file</ulink> to disable Flash cookies, and to restrict P2P
+features that are likely to bypass proxy settings.
+
+ </para>
+ <para><command>Implementation Status:</command>
+
+Currently, we entirely disable all plugins in Tor Browser. However, as a
+compromise due to the popularity of Flash, we allow users to re-enable Flash,
+and Flash objects are blocked behind a click-to-play barrier that is available
+only after the user has specifically enabled plugins. Flash is the only plugin
+available; the rest are <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">entirely
+blocked from loading by a Firefox patch</ulink>. We also set the Firefox
+preference <command>plugin.expose_full_path</command> to false, to avoid
+leaking plugin installation information.
+
+ </para>
+ </listitem>
+ <listitem>HTML5 Canvas Image Extraction
+ <para>
+
+The <ulink url="https://developer.mozilla.org/en-US/docs/HTML/Canvas">HTML5
+Canvas</ulink> is a feature that has been added to major browsers after the
+EFF developed their Panopticlick study. After plugins and plugin-provided
+information, we believe that the HTML5 Canvas is the single largest
+fingerprinting threat browsers face today. <ulink
+url="http://www.w2spconf.com/2012/papers/w2sp12-final4.pdf">Initial
+studies</ulink> show that the Canvas can provide an easy-access fingerprinting
+target: The adversary simply renders WebGL, font, and named color data to a
+Canvas element, extracts the image buffer, and computes a hash of that image
+data. Subtle differences in the video card, font packs, and even font and
+graphics library versions allow the adversary to produce a stable, simple,
+high-entropy fingerprint of a computer. In fact, the hash of the rendered
+image can be used almost identically to a tracking cookie by the web server.
+
+ </para>
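A minimal sketch of the adversary's side, assuming the canvas pixel buffer has already been extracted: any stable hash of the rendered bytes serves as the identifier, and even a single subpixel difference between machines yields a distinct value.

```python
import hashlib

def canvas_fingerprint(pixel_bytes):
    """Hash of extracted canvas pixels. Because rendering differences
    are stable per machine, the digest behaves like a tracking cookie."""
    return hashlib.sha256(pixel_bytes).hexdigest()
```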
+ <para>
+
+To reduce the threat from this vector, we have patched Firefox to <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">prompt
+before returning valid image data</ulink> to the Canvas APIs. If the user
+hasn't previously allowed the site in the URL bar to access Canvas image data,
+pure white image data is returned to the Javascript APIs.
+
+ </para>
+ </listitem>
+ <listitem>WebGL
+ <para>
+
+WebGL is fingerprintable both through information that is exposed about the
+underlying driver and optimizations, as well as through performance
+fingerprinting.
+
+ </para>
+ <para>
+
+Because of the large number of potential fingerprinting vectors and the <ulink
+url="http://www.contextis.com/resources/blog/webgl/">previously unexposed
+vulnerability surface</ulink>, we deploy a similar strategy against WebGL as
+for plugins. First, WebGL Canvases have click-to-play placeholders (provided
+by NoScript), and do not run until authorized by the user. Second, we
+obfuscate driver information by setting the Firefox preferences
+<command>webgl.disable-extensions</command> and
+<command>webgl.min_capability_mode</command>, which reduce the information
+provided by the following WebGL API calls: <command>getParameter()</command>,
+<command>getSupportedExtensions()</command>, and
+<command>getExtension()</command>.
+
+ </para>
+ </listitem>
+ <listitem>Fonts
+ <para>
+
+According to the Panopticlick study, fonts provide the most linkability when
+they are provided as an enumerable list in filesystem order, via either the
+Flash or Java plugins. However, it is still possible to use CSS and/or
+Javascript to query for the existence of specific fonts. With a large enough
+pre-built list to query, a large amount of fingerprintable information may
+still be available.
+
+ </para>
+ <para>
+
+The sure-fire way to address font linkability is to ship the browser with a
+font for every language, typeface, and style in use in the world, and to only
+use those fonts to the exclusion of system fonts. However, this set may be
+impractically large. It is possible that a smaller <ulink
+url="https://secure.wikimedia.org/wikipedia/en/wiki/Unicode_typeface#List_of_Uni…">common
+subset</ulink> may be found that provides total coverage. However, we believe
+that with strong url bar origin identifier isolation, a simpler approach can reduce the
+number of bits available to the adversary while avoiding the rendering and
+language issues of supporting a global font set.
+
+ </para>
+ <para><command>Implementation Status:</command>
+
+We disable plugins, which prevents font enumeration. Additionally, we limit
+both the number of font queries from CSS, as well as the total number of
+fonts that can be used in a document <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">with
+a Firefox patch</ulink>. We create two prefs,
+<command>browser.display.max_font_attempts</command> and
+<command>browser.display.max_font_count</command> for this purpose. Once these
+limits are reached, the browser behaves as if
+<command>browser.display.use_document_fonts</command> were set to 0. We are
+still working to determine optimal values for these prefs.
+
+ </para>
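A simplified model of the limiting logic follows. The limit values are hypothetical, since (as noted above) the optimal settings are still undetermined:

```python
MAX_FONT_ATTEMPTS = 10  # hypothetical value for browser.display.max_font_attempts
MAX_FONT_COUNT = 5      # hypothetical value for browser.display.max_font_count

class FontLimiter:
    """Sketch of per-document font limiting: once either limit is hit,
    fall back to the default font, as if document fonts were disabled."""
    def __init__(self):
        self.attempts = 0
        self.used = set()

    def resolve(self, requested_family, default_font="Arial"):
        self.attempts += 1
        over_attempts = self.attempts > MAX_FONT_ATTEMPTS
        over_count = (requested_family not in self.used
                      and len(self.used) >= MAX_FONT_COUNT)
        if over_attempts or over_count:
            return default_font
        self.used.add(requested_family)
        return requested_family
```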
+ <para>
+
+To improve rendering, we exempt remote <ulink
+url="https://developer.mozilla.org/en-US/docs/CSS/@font-face">@font-face
+fonts</ulink> from these counts, and if a font-family CSS rule lists a remote
+font (in any order), we use that font instead of any of the named local fonts.
+
+ </para>
+ </listitem>
+ <listitem>Desktop resolution, CSS Media Queries, and System Colors
+ <para>
+
+Both CSS and Javascript have access to a lot of information about the screen
+resolution, usable desktop size, OS widget size, toolbar size, title bar size,
+system theme colors, and other desktop features that are not at all relevant
+to rendering and serve only to provide information for fingerprinting.
+
+ </para>
+ <para><command>Design Goal:</command>
+
+Our design goal here is to reduce the resolution information down to the bare
+minimum required for properly rendering inside a content window. We intend to
+report all rendering information correctly with respect to the size and
+properties of the content window, but report an effective size of 0 for all
+border material, and also report that the desktop is only as big as the
+inner content window. Additionally, new browser windows are sized such that
+their content windows are one of a few fixed sizes based on the user's
+desktop resolution.
+
+ </para>
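The sizing policy can be sketched as a snap-to-grid function: round the available desktop area down to a coarse grid so that all users with similar desktops report one of a few fixed content window sizes. The step and cap values here are illustrative, not the exact shipped numbers:

```python
def pick_window_size(desktop_w, desktop_h, step_w=200, step_h=100,
                     max_w=1000, max_h=1000):
    """Snap the new content window to a coarse grid, capped at a
    maximum size (step and cap values are illustrative)."""
    w = min((desktop_w // step_w) * step_w, max_w)
    h = min((desktop_h // step_h) * step_h, max_h)
    return w, h
```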
+ <para><command>Implementation Status:</command>
+
+We have implemented the above strategy using a window observer to <ulink
+url="https://gitweb.torproject.org/torbutton.git/blob/HEAD:/src/chrome/content/t…">resize
+new windows based on desktop resolution</ulink>. Additionally, we patch
+Firefox to use the client content window size <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">for
+window.screen</ulink> and <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">for
+CSS Media Queries</ulink>. Similarly, we <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">patch
+DOM events to return content window relative points</ulink>. We also patch
+Firefox to <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">report
+a fixed set of system colors to content window CSS</ulink>.
+
+ </para>
+ <para>
+
+To further reduce resolution-based fingerprinting, we are <ulink
+url="https://trac.torproject.org/projects/tor/ticket/7256">investigating
+zoom/viewport-based mechanisms</ulink> that might allow us to always report
+the same desktop resolution regardless of the actual size of the content
+window, and simply scale to make up the difference. However, the complexity
+and rendering impact of such a change is not yet known.
+
+ </para>
+ </listitem>
+ <listitem>User Agent and HTTP Headers
+ <para><command>Design Goal:</command>
+
+All Tor Browser users MUST provide websites with an identical user agent and
+HTTP header set for a given request type. We omit the Firefox minor revision,
+and report a popular Windows platform. If the software is kept up to date,
+these headers should remain identical across the population even when updated.
+
+ </para>
+ <para><command>Implementation Status:</command>
+
+Firefox provides several options for controlling the browser user agent string
+which we leverage. We also set similar prefs for controlling the
+Accept-Language and Accept-Charset headers, which we spoof to English by default. Additionally, we
+<ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">remove
+content script access</ulink> to Components.interfaces, which <ulink
+url="http://pseudo-flaw.net/tor/torbutton/fingerprint-firefox.html">can be
+used</ulink> to fingerprint OS, platform, and Firefox minor version. </para>
+
+ </listitem>
+ <listitem>Timezone and clock offset
+ <para><command>Design Goal:</command>
+
+All Tor Browser users MUST report the same timezone to websites. Currently, we
+choose UTC for this purpose, although an equally valid argument could be made
+for EDT/EST due to the large English-speaking population density (coupled with
+the fact that we spoof a US English user agent). Additionally, the Tor
+software should detect if the user's clock is significantly divergent from the
+clocks of the relays that it connects to, and use this to reset the clock
+values used in Tor Browser to something reasonably accurate.
+
+ </para>
+ <para><command>Implementation Status:</command>
+
+We set the timezone using the TZ environment variable, which is supported on
+all platforms. Additionally, we plan to <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3652">obtain a clock
+offset from Tor</ulink>, but this won't be available until Tor 0.2.3.x is in
+use.
+
+ </para>
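For illustration, the effect of the TZ variable on a POSIX process: once TZ is set to UTC before anything consults the local time, local time and UTC coincide.

```python
import os
import time

# Force the process-wide timezone to UTC before the browser is launched,
# mirroring Tor Browser's use of the TZ environment variable.
os.environ["TZ"] = "UTC"
time.tzset()  # POSIX-only; on Windows, TZ is set before the process spawns

now = time.time()
local, utc = time.localtime(now), time.gmtime(now)
```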
+ </listitem>
+ <listitem>Javascript performance fingerprinting
+ <para>
+
+<ulink url="http://w2spconf.com/2011/papers/jspriv.pdf">Javascript performance
+fingerprinting</ulink> is the act of profiling the performance
+of various Javascript functions for the purpose of fingerprinting the
+Javascript engine and the CPU.
+
+ </para>
+ <para><command>Design Goal:</command>
+
+We have <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3059">several potential
+mitigation approaches</ulink> to reduce the accuracy of performance
+fingerprinting without risking too much damage to functionality. Our current
+favorite is to reduce the resolution of the Event.timeStamp and the Javascript
+Date() object, while also introducing jitter. Our goal is to increase the
+amount of time it takes to mount a successful attack. <ulink
+url="http://w2spconf.com/2011/papers/jspriv.pdf">Mowery et al.</ulink> found that
+even with the default precision in most browsers, they required up to 120
+seconds of amortization and repeated trials to get stable results from their
+feature set. We intend to work with the research community to establish the
+optimum trade-off between quantization+jitter and amortization time.
+
+
+ </para>
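The quantization-plus-jitter idea can be sketched as follows, with a hypothetical 100 ms quantum (the optimal value being exactly the open trade-off described above):

```python
import random

QUANTUM_MS = 100  # hypothetical resolution; the optimum is an open question
_rng = random.Random(0)

def fuzzed_timestamp(real_ms):
    """Quantize a millisecond timestamp to QUANTUM_MS and add bounded
    jitter, so micro-benchmarks need long amortization of repeated
    trials to recover true timings."""
    quantized = (real_ms // QUANTUM_MS) * QUANTUM_MS
    return quantized + _rng.randint(0, QUANTUM_MS - 1)
```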
+ <para><command>Implementation Status:</command>
+
+Currently, the only mitigation against performance fingerprinting is to
+disable <ulink url="http://www.w3.org/TR/navigation-timing/">Navigation
+Timing</ulink> through the Firefox preference
+<command>dom.enable_performance</command>.
+
+ </para>
+ </listitem>
+ <listitem>Non-Uniform HTML5 API Implementations
+ <para>
+
+At least two HTML5 features have different implementation status across the
+major OS vendors: the <ulink
+url="https://developer.mozilla.org/en-US/docs/DOM/window.navigator.battery">Battery
+API</ulink> and the <ulink
+url="https://developer.mozilla.org/en-US/docs/DOM/window.navigator.connection">Network
+Connection API</ulink>. We disable these APIs
+through the Firefox preferences <command>dom.battery.enabled</command> and
+<command>dom.network.enabled</command>.
+
+ </para>
+ </listitem>
+ <listitem>Keystroke fingerprinting
+ <para>
+
+Keystroke fingerprinting is the act of measuring key strike time and key
+flight time. It is seeing increasing use as a biometric.
+
+ </para>
+ <para><command>Design Goal:</command>
+
+We intend to rely on the same mechanisms for defeating Javascript performance
+fingerprinting: timestamp quantization and jitter.
+
+ </para>
+ <para><command>Implementation Status:</command>
+We have no implementation as of yet.
+ </para>
+ </listitem>
+ </orderedlist>
+ </sect3>
+ <para>
+For more details on identifier linkability bugs and enhancements, see the <ulink
+url="https://trac.torproject.org/projects/tor/query?keywords=~tbb-fingerprinting…">tbb-fingerprinting tag in our bugtracker</ulink>
+ </para>
+ </sect2>
+ <sect2 id="new-identity">
+ <title>Long-Term Unlinkability via "New Identity" button</title>
+ <para>
+
+In order to avoid long-term linkability, we provide a "New Identity" context
+menu option in Torbutton. This context menu option is active if Torbutton can
+read the environment variables $TOR_CONTROL_PASSWD and $TOR_CONTROL_PORT.
+
+ </para>
+
+ <sect3>
+ <title>Design Goal:</title>
+ <blockquote>
+
+All linkable identifiers and browser state MUST be cleared by this feature.
+
+ </blockquote>
+ </sect3>
+
+ <sect3>
+ <title>Implementation Status:</title>
+ <blockquote>
+ <para>
+
+First, Torbutton disables Javascript in all open tabs and windows by using
+both the <ulink
+url="https://developer.mozilla.org/en-US/docs/XPCOM_Interface_Reference/nsIDocSh…">browser.docShell.allowJavascript</ulink>
+attribute as well as <ulink
+url="https://developer.mozilla.org/en-US/docs/XPCOM_Interface_Reference/nsIDOMWi…">nsIDOMWindowUtil.suppressEventHandling()</ulink>.
+We then stop all page activity for each tab using <ulink
+url="https://developer.mozilla.org/en-US/docs/XPCOM_Interface_Reference/nsIWebNa…">browser.webNavigation.stop(nsIWebNavigation.STOP_ALL)</ulink>.
+We then clear the site-specific Zoom by temporarily disabling the preference
+<command>browser.zoom.siteSpecific</command>, and clear the GeoIP wifi token URL
+<command>geo.wifi.access_token</command> and the last opened URL prefs (if
+they exist). Each tab is then closed.
+
+ </para>
+ <para>
+
+After closing all tabs, we then emit "<ulink
+url="https://developer.mozilla.org/en-US/docs/Supporting_private_browsing_mode#P…">browser:purge-session-history</ulink>"
+(which instructs addons and various Firefox components to clear their session
+state), and then manually clear the following state: searchbox and findbox
+text, HTTP auth, SSL state, OCSP state, site-specific content preferences
+(including HSTS state), content and image cache, offline cache, Cookies, DOM
+storage, DOM local storage, the safe browsing key, and the Google wifi geolocation
+token (if it exists).
+
+ </para>
+ <para>
+
+After the state is cleared, we then close all remaining HTTP keep-alive
+connections and then send the NEWNYM signal to the Tor control port to cause a
+new circuit to be created.
+ </para>
+ <para>
+Finally, a fresh browser window is opened, and the current browser window is
+closed (this does not spawn a new Firefox process, only a new window).
+ </para>
+ </blockquote>
+ <blockquote>
+If the user chose to "protect" any cookies by using the Torbutton Cookie
+Protections UI, those cookies are not cleared as part of the above.
+ </blockquote>
+ </sect3>
+ </sect2>
+<!--
+ <sect2 id="click-to-play">
+ <title>Click-to-play for plugins and invasive content</title>
+ <para>
+Some content types are too invasive and/or too opaque for us to properly
+eliminate their linkability properties. For these content types, we use
+NoScript to provide click-to-play placeholders that do not activate the
+content until the user clicks on it. This will eliminate the ability for an
+adversary to use such content types to link users in a dragnet fashion across
+arbitrary sites.
+ </para>
+ <para>
+Currently, the content types isolated in this way include Flash, WebGL, and
+audio and video objects.
+ </para>
+ </sect2>
+-->
+ <sect2 id="other-security">
+ <title>Other Security Measures</title>
+ <para>
+
+In addition to the above mechanisms that are devoted to preserving privacy
+while browsing, we also have a number of technical mechanisms to address other
+privacy and security issues.
+
+ </para>
+ <orderedlist>
+ <listitem id="traffic-fingerprinting-defenses"><command>Website Traffic Fingerprinting Defenses</command>
+ <para>
+
+<link linkend="website-traffic-fingerprinting">Website Traffic
+Fingerprinting</link> is a statistical attack to attempt to recognize specific
+encrypted website activity.
+
+ </para>
+ <sect3>
+ <title>Design Goal:</title>
+ <blockquote>
+ <para>
+
+We want to deploy a mechanism that reduces the accuracy of <ulink
+url="https://en.wikipedia.org/wiki/Feature_selection">useful features</ulink> available
+for classification. This mechanism would either impact the true and false
+positive accuracy rates, <emphasis>or</emphasis> reduce the number of webpages
+that could be classified at a given accuracy rate.
+
+ </para>
+ <para>
+
+Ideally, this mechanism would be as light-weight as possible, and would be
+tunable in terms of overhead. We suspect that it may even be possible to
+deploy a mechanism that reduces feature extraction resolution without any
+network overhead. In the no-overhead category, we have <ulink
+url="http://freehaven.net/anonbib/cache/LZCLCP_NDSS11.pdf">HTTPOS</ulink> and
+<ulink
+url="https://blog.torproject.org/blog/experimental-defense-website-traffic-finge…">better
+use of HTTP pipelining and/or SPDY</ulink>.
+In the tunable/low-overhead
+category, we have <ulink
+url="http://freehaven.net/anonbib/cache/ShWa-Timing06.pdf">Adaptive
+Padding</ulink> and <ulink url="http://www.cs.sunysb.edu/~xcai/fp.pdf">
+Congestion-Sensitive BUFLO</ulink>. It may be also possible to <ulink
+url="https://trac.torproject.org/projects/tor/ticket/7028">tune such
+defenses</ulink> so that they only use existing spare Guard bandwidth capacity in the Tor
+network, making them also effectively no-overhead.
+
+ </para>
+ </blockquote>
+ </sect3>
+ <sect3>
+ <title>Implementation Status:</title>
+ <blockquote>
+ <para>
+Currently, we patch Firefox to <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">randomize
+pipeline order and depth</ulink>. Unfortunately, pipelining is very fragile.
+Many sites do not support it, and even sites that advertise support for
+pipelining may simply return error codes for successive requests, effectively
+forcing the browser into non-pipelined behavior. Firefox also has code to back
+off and reduce or eliminate the pipeline if this happens. These
+shortcomings and fallback behaviors are the primary reason that Google
+developed SPDY as opposed to simply extending HTTP to improve pipelining. It
+turns out that we could actually deploy exit-side proxies that allow us to
+<ulink
+url="https://gitweb.torproject.org/torspec.git/blob/HEAD:/proposals/ideas/xxx-us…">use
+SPDY from the client to the exit node</ulink>. This would make our defense not
+only free, but one that actually <emphasis>improves</emphasis> performance.
+
+ </para>
+ <para>
+
+Knowing this, we created this defense as an <ulink
+url="https://blog.torproject.org/blog/experimental-defense-website-traffic-finge…">experimental
+research prototype</ulink> to help evaluate what could be done in the best
+case with full server support. Unfortunately, the bias in favor of compelling
+attack papers has caused academia to ignore this request thus far, instead
+publishing only cursory (yet "devastating") evaluations that fail to provide
+even simple statistics such as the rates of actual pipeline utilization during
+their evaluations, in addition to the other shortcomings and shortcuts <link
+linkend="website-traffic-fingerprinting">mentioned earlier</link>. We can
+accept that our defense might fail to work as well as others (in fact we
+expect it), but unfortunately the very same shortcuts that provide excellent
+attack results also allow the conclusion that all defenses are broken forever.
+So sadly, we are still left in the dark on this point.
+
+ </para>
+ </blockquote>
+ </sect3>
+ </listitem>
+ <listitem><command>Privacy-preserving update notification</command>
+ <para>
+
+In order to inform the user when their Tor Browser is out of date, we perform a
+privacy-preserving update check asynchronously in the background. The
+check uses Tor to download the file <ulink
+url="https://check.torproject.org/RecommendedTBBVersions">https://check.torproject.org/RecommendedTBBVersions</ulink>
+and searches that version list for the current value for the local preference
+<command>torbrowser.version</command>. If the value from our preference is
+present in the recommended version list, the check is considered to have
+succeeded and the user is up to date. If not, it is considered to have failed
+and an update is needed. The check is triggered upon browser launch, new
+window, and new tab, but is rate limited so as to happen no more frequently
+than once every 1.5 hours.
+
+ </para>
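The version comparison itself is a simple membership test. A sketch, assuming the downloaded file is a JSON list of version strings (the Tor-ified fetch is omitted):

```python
import json

def up_to_date(local_version, recommended_json):
    """True if the torbrowser.version value appears in the
    RecommendedTBBVersions list fetched over Tor."""
    recommended = json.loads(recommended_json)
    return local_version in recommended
```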
+ <para>
+
+If the check fails, we cache this fact, and update the Torbutton graphic to
+display a flashing warning icon and insert a menu option that provides a link
+to our download page. Additionally, we reset the value for the browser
+homepage to point to a <ulink
+url="https://check.torproject.org/?lang=en-US&small=1&uptodate=0">page that
+informs the user</ulink> that their browser is out of
+date.
+
+ </para>
+ </listitem>
+
+ </orderedlist>
+ </sect2>
+ <sect2 id="firefox-patches">
+ <title>Description of Firefox Patches</title>
+ <para>
+
+The set of patches we have against Firefox can be found in the <ulink
+url="https://gitweb.torproject.org/torbrowser.git/tree/maint-2.4:/src/current-pa…">current-patches directory of the torbrowser git repository</ulink>. They are:
+
+ </para>
+ <orderedlist>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Block
+Components.interfaces</ulink>
+ <para>
+
+In order to reduce fingerprinting, we block access to this interface from
+content script. Components.interfaces can be used for fingerprinting the
+platform, OS, and Firefox version, but not much else.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Make
+Permissions Manager memory only</ulink>
+ <para>
+
+This patch exposes a pref 'permissions.memory_only' that isolates the
+permissions manager (which is responsible for all user-specified site
+permissions, as well as stored <ulink
+url="https://secure.wikimedia.org/wikipedia/en/wiki/HTTP_Strict_Transport_Securi…">HSTS</ulink>
+policy from visited sites) to memory.
+
+The pref does successfully clear the permissions manager memory if toggled. It
+does not need to be set in prefs.js, and can be handled by Torbutton.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Make
+Intermediate Cert Store memory-only</ulink>
+ <para>
+
+The intermediate certificate store records the intermediate SSL certificates
+the browser has seen to date. Because these intermediate certificates are used
+by a limited number of domains (and in some cases, only a single domain),
+the intermediate certificate store can serve as a low-resolution record of
+browsing history.
+
+ </para>
+ <!-- FIXME: Should this be a <note> tag too? -->
+ <para><command>Design Goal:</command>
+
+As an additional design goal, we would like to later alter this patch to allow this
+information to be cleared from memory. The implementation does not currently
+allow this.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Add
+a string-based cacheKey property for domain isolation</ulink>
+ <para>
+
+To <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3666">increase the
+security of cache isolation</ulink> and to <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3754">solve strange and
+unknown conflicts with OCSP</ulink>, we had to patch
+Firefox to provide a cacheDomain cache attribute. We use the url bar
+FQDN as input to this field.
+
+ </para>
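The isolation property this attribute provides can be shown with a toy model: entries are keyed on the pair (first-party FQDN, resource URL), so two sites embedding the same third-party resource get independent cache entries and the cache cannot be probed to link them.

```python
class IsolatedCache:
    """Toy model of first-party cache isolation via a composite key,
    sketching the effect of the cacheDomain attribute."""
    def __init__(self):
        self._store = {}

    def put(self, first_party, url, body):
        self._store[(first_party, url)] = body

    def get(self, first_party, url):
        return self._store.get((first_party, url))
```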
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Block
+all plugins except flash</ulink>
+ <para>
+We cannot use the <ulink
+url="http://www.oxymoronical.com/experiments/xpcomref/applications/Firefox/3.5/c…">
+@mozilla.org/extensions/blocklist;1</ulink> service, because we
+actually want to stop plugins from ever entering the browser's process space
+and/or executing code (for example, AV plugins that collect statistics/analyze
+URLs, magical toolbars that phone home or "help" the user, Skype buttons that
+ruin our day, and censorship filters). Hence we rolled our own.
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Make content-prefs service memory only</ulink>
+ <para>
+This patch prevents random URLs from being inserted into content-prefs.sqlite in
+the profile directory as content prefs change (such as site-specific zoom and
+possibly other site preferences).
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Make Tor Browser exit when not launched from Vidalia</ulink>
+ <para>
+
+It turns out that on Windows 7 and later systems, the Taskbar attempts to
+automatically learn the most frequent apps used by the user, and it recognizes
+Tor Browser as a separate app from Vidalia. This can cause users to try to
+launch Tor Browser without Vidalia or a Tor instance running. Worse, the Tor
+Browser will automatically find the user's default Firefox profile, and
+connect directly without using Tor. This patch is a simple hack to cause Tor
+Browser to immediately exit in this case.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Disable SSL Session ID tracking</ulink>
+ <para>
+
+This patch is a simple 1-line hack to prevent SSL connections from caching
+(and then later transmitting) their Session IDs. There was no preference to
+govern this behavior, so we had to hack it by altering the SSL new connection
+defaults.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Provide an observer event to close persistent connections</ulink>
+ <para>
+
+This patch creates an observer event in the HTTP connection manager to close
+all keep-alive connections that still happen to be open. This event is emitted
+by the <link linkend="new-identity">New Identity</link> button.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Limit Device and System Specific Media Queries</ulink>
+ <para>
+
+<ulink url="https://developer.mozilla.org/en-US/docs/CSS/Media_queries">CSS
+Media Queries</ulink> have a fingerprinting capability approaching that of
+Javascript. This patch causes such Media Queries to evaluate as if the device
+resolution was equal to the content window resolution.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Limit the number of fonts per document</ulink>
+ <para>
+
+Font availability can be <ulink url="http://flippingtypical.com/">queried by
+CSS and Javascript</ulink> and is a fingerprinting vector. This patch limits
+the number of times CSS and Javascript can cause font-family rules to
+evaluate. Remote @font-face fonts are exempt from the limits imposed by this
+patch, and remote fonts are given priority over local fonts whenever both
+appear in the same font-family rule. We do this by explicitly altering the
+nsRuleNode rule representation itself to remove the local font families before
+the rule hits the font renderer.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Rebrand Firefox to Tor Browser</ulink>
+ <para>
+
+This patch updates our branding in compliance with Mozilla's trademark policy.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Make Download Manager Memory Only</ulink>
+ <para>
+
+This patch prevents disk leaks from the download manager. The original
+behavior is to write the download history to disk and then delete it, even if
+you disable download history from your Firefox preferences.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Add DDG and StartPage to Omnibox</ulink>
+ <para>
+
+This patch adds DuckDuckGo and StartPage to the Search Box, and sets our
+default search engine to StartPage. We deployed this patch due to excessive
+Captchas and complete 403 bans from Google.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Make nsICacheService.EvictEntries() Synchronous</ulink>
+ <para>
+
+This patch eliminates a race condition with "New Identity". Without it,
+cache-based Evercookies survive for up to a minute after clearing the cache
+on some platforms.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Prevent WebSockets DNS Leak</ulink>
+ <para>
+
+This patch prevents a DNS leak when using WebSockets. It also prevents other
+similar types of DNS leaks.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Randomize HTTP pipeline order and depth</ulink>
+ <para>
+As an
+<ulink
+url="https://blog.torproject.org/blog/experimental-defense-website-traffic-finge…">experimental
+defense against Website Traffic Fingerprinting</ulink>, we patch the standard
+HTTP pipelining code to randomize the number of requests in a
+pipeline, as well as their order.
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Emit
+an observer event to filter the Drag and Drop URL list</ulink>
+ <para>
+
+This patch allows us to block external Drag and Drop events from Torbutton.
+We need to block Drag and Drop because Mac OS and Ubuntu both immediately load
+any URLs they find in your drag buffer before you even drop them (without
+using your browser's proxy settings, of course). This can lead to proxy bypass
+during user activity that is as basic as holding down the mouse button for
+slightly too long while clicking on an image link.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Add mozIThirdPartyUtil.getFirstPartyURI() API</ulink>
+ <para>
+
+This patch provides an API that allows us to more easily isolate identifiers
+to the URL bar domain.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Add canvas image extraction prompt</ulink>
+ <para>
+
+This patch prompts the user before returning canvas image data. Canvas image
+data can be used to create an extremely stable, high-entropy fingerprint based
+on the unique rendering behavior of video cards, OpenGL behavior,
+system fonts, and supporting library versions.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Return client window coordinates for mouse events</ulink>
+ <para>
+
+This patch causes mouse events to return coordinates relative to the content
+window instead of the desktop.
+
+ </para>
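The leak this patch closes can be sketched with a fake mouse event (the coordinate values are invented): on an unpatched browser, `screenX`/`screenY` are desktop-relative while `clientX`/`clientY` are content-window-relative, so their difference reveals the window's position on the desktop.

```javascript
// Recover the content window's desktop offset from a single mouse event.
// On an unpatched browser this exposes window placement plus browser
// chrome height, both of which contribute fingerprint entropy.
function desktopOffset(evt) {
  return { x: evt.screenX - evt.clientX, y: evt.screenY - evt.clientY };
}

// Hypothetical event values for illustration.
const evt = { screenX: 412, screenY: 388, clientX: 12, clientY: 250 };
console.log(desktopOffset(evt)); // { x: 400, y: 138 }

// With the patch, screenX === clientX and screenY === clientY,
// so the computed offset is always zero.
```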
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Do not expose physical screen info to window.screen</ulink>
+ <para>
+
+This patch causes window.screen to return the display resolution size of the
+content window instead of the desktop resolution size.
+
+ </para>
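The entropy this patch removes can be sketched as below. The objects are stand-ins with invented values: `screen` for `window.screen` and `win` for the content window's inner dimensions.

```javascript
// Unpatched values: window.screen reports the physical desktop...
const screen = { width: 1920, height: 1200 };
// ...while the content window is whatever size the user resized it to.
const win = { innerWidth: 1000, innerHeight: 800 };

// Any mismatch leaks monitor size, taskbar layout, and window placement.
const leaks = screen.width !== win.innerWidth ||
              screen.height !== win.innerHeight;
console.log(leaks); // true on an unpatched browser

// The patch makes window.screen report the content window's dimensions,
// so the two sides always agree and the comparison yields no entropy.
```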
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Do not expose system colors to CSS or canvas</ulink>
+ <para>
+
+This patch prevents CSS and Javascript from discovering your desktop color
+scheme and/or theme.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Isolate the Image Cache per url bar domain</ulink>
+ <para>
+
+This patch prevents cached images from being used to store third party tracking
+identifiers.
+
+ </para>
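A sketch of why the cache must be keyed per url bar domain follows. The `Map` is a stand-in for the browser's image cache, and the site and tracker names are invented; the point is that adding the first-party domain to the cache key prevents one cached tracker image from linking visits across sites.

```javascript
// Image cache keyed by (first party, url), as the patch effectively does.
function makeCache() {
  const store = new Map();
  return {
    fetch(firstParty, url, network) {
      const key = firstParty + "|" + url; // without firstParty, one entry
      if (!store.has(key)) store.set(key, network());
      return store.get(key);
    },
  };
}

// The tracker mints a fresh identifier on every real network fetch.
let counter = 0;
const network = () => "visitor-" + (++counter);

const cache = makeCache();
const onSiteA = cache.fetch("site-a.example", "https://tracker.example/t.png", network);
const onSiteB = cache.fetch("site-b.example", "https://tracker.example/t.png", network);
console.log(onSiteA !== onSiteB); // true: the two visits cannot be linked
```

With a single shared cache key, both sites would have read back `visitor-1` and the tracker could correlate the user across them.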
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">nsIHTTPChannel.redirectTo() API</ulink>
+ <para>
+
+This patch provides HTTPS-Everywhere with an API to perform redirections more
+securely and without addon conflicts.
+
+ </para>
+ </listitem>
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Isolate DOM Storage to first party URI</ulink>
+ <para>
+
+This patch prevents DOM Storage from being used to store third party tracking
+identifiers.
+
+ </para>
+ </listitem>
+
+ <listitem><ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Remove
+"This plugin is disabled" barrier</ulink>
+
+ <para>
+
+This patch removes a barrier that was informing users that plugins were
+disabled and providing them with a link to enable them. We felt this was a poor
+user experience, especially since the barrier was displayed even for sites
+with dual Flash+HTML5 video players, such as YouTube.
+
+ </para>
+ </listitem>
+
+ </orderedlist>
+ </sect2>
+
+</sect1>
+
+<!--
+- Packaging
+ - Build Process Security
+ - External Addons
+ - Included
+ - HTTPS-E
+ - NoScript
+ - Torbutton
+ - Deliberately excluded
+ - Request Policy, AdblockPlus, etc
+ - Desired
+ - Perspectives/Convergence/etc
+ - Pref Changes
+ - Caused by Torbutton
+ - Set manually in profile
+ - Update security
+ - Thandy
+
+<sect1 id="Packaging">
+ <title>Packaging</title>
+ <para> </para>
+ <sect2 id="build-security">
+ <title>Build Process Security</title>
+ <para> </para>
+ </sect2>
+ <sect2 id="addons">
+ <title>External Addons</title>
+ <para> </para>
+ <sect3>
+ <title>Included Addons</title>
+ </sect3>
+ <sect3>
+ <title>Excluded Addons</title>
+ </sect3>
+ <sect3>
+ <title>Dangerous Addons</title>
+ </sect3>
+ </sect2>
+ <sect2 id="prefs">
+ <title>Pref Changes</title>
+ <para> </para>
+ </sect2>
+ <sect2 id="update-mechanism">
+ <title>Update Security</title>
+ <para> </para>
+ </sect2>
+</sect1>
+-->
+
+<!--
+<sect1 id="Testing">
+ <title>Testing</title>
+ <para>
+
+The purpose of this section is to cover all the known ways that Tor browser
+security can be subverted from a penetration testing perspective. The hope
+is that it will be useful both for creating a "Tor Safety Check"
+page, and for developing novel tests and actively attacking Torbutton with the
+goal of finding vulnerabilities in either it or the Mozilla components,
+interfaces and settings upon which it relies.
+
+ </para>
+ <sect2 id="SingleStateTesting">
+ <title>Single state testing</title>
+ <para>
+
+Torbutton is a complicated piece of software. During development, changes to
+one component can affect a whole slew of unrelated features. A number of
+aggregated test suites exist that can be used to test for regressions in
+Torbutton and to help aid in the development of Torbutton-like addons and
+other privacy modifications of other browsers. Some of these test suites exist
+as a single automated page, while others are a series of pages you must visit
+individually. They are provided here for reference and future regression
+testing, and also in the hope that some brave soul will one day decide to
+combine them into a comprehensive automated test suite.
+
+ <orderedlist>
+ <listitem><ulink url="http://decloak.net/">Decloak.net</ulink>
+ <para>
+
+Decloak.net is the canonical source of plugin and external-application based
+proxy-bypass exploits. It is a fully automated test suite maintained by <ulink
+url="http://digitaloffense.net/">HD Moore</ulink> as a service for people to
+use to test their anonymity systems.
+
+ </para>
+ </listitem>
+ <listitem><ulink url="http://deanonymizer.com/">Deanonymizer.com</ulink>
+ <para>
+
+Deanonymizer.com is another automated test suite that tests for proxy bypass
+and other information disclosure vulnerabilities. It is maintained by Kyle
+Williams, the author of <ulink url="http://www.janusvm.com/">JanusVM</ulink>
+and <ulink url="http://www.januspa.com/">JanusPA</ulink>.
+
+ </para>
+ </listitem>
+ <listitem><ulink url="https://ip-check.info">JonDos
+AnonTest</ulink>
+ <para>
+
+The <ulink url="https://anonymous-proxy-servers.net/">JonDos people</ulink> also provide an
+anonymity tester. It is more focused on HTTP headers and behaviors than plugin bypass, and
+points out a couple of headers Torbutton could do a better job of
+obfuscating.
+
+ </para>
+ </listitem>
+ <listitem><ulink url="http://browserspy.dk">Browserspy.dk</ulink>
+ <para>
+
+Browserspy.dk provides a tremendous collection of browser fingerprinting and
+general privacy tests. Unfortunately they are only available one page at a
+time, and there is no solid feedback on good versus bad behavior in
+the test results.
+
+ </para>
+ </listitem>
+ <listitem><ulink url="http://analyze.privacy.net/">Privacy
+Analyzer</ulink>
+ <para>
+
+The Privacy Analyzer provides a dump of all sorts of browser attributes and
+settings that it detects, including some information on your original IP
+address. Its page layout and lack of good-versus-bad test result feedback make
+it less useful as a user-facing testing tool, but it does provide some
+interesting checks in a single page.
+
+ </para>
+ </listitem>
+ <listitem><ulink url="http://ha.ckers.org/mr-t/">Mr. T</ulink>
+ <para>
+
+Mr. T is a collection of browser fingerprinting and deanonymization exploits
+discovered by the <ulink url="http://ha.ckers.org">ha.ckers.org</ulink> crew
+and others. It is also not as user friendly as some of the above tests, but it
+is a useful collection.
+
+ </para>
+ </listitem>
+ <listitem>Gregory Fleischer's <ulink
+url="http://pseudo-flaw.net/content/tor/torbutton/">Torbutton</ulink> and
+<ulink
+url="http://pseudo-flaw.net/content/defcon/dc-17-demos/d.html">Defcon
+17</ulink> Test Cases
+ <para>
+
+Gregory Fleischer has been hacking and testing Firefox and Torbutton privacy
+issues for the past 2 years. He has an excellent collection of all his test
+cases that can be used for regression testing. In his Defcon work, he
+demonstrates ways to infer Firefox version based on arcane browser properties.
+We are still trying to determine the best way to address some of those test
+cases.
+
+ </para>
+ </listitem>
+ <listitem><ulink url="https://torcheck.xenobite.eu/index.php">Xenobite's
+TorCheck Page</ulink>
+ <para>
+
+This page checks to ensure you are using a valid Tor exit node and checks for
+some basic browser properties related to privacy. It is not very fine-grained
+or complete, but it is automated and could be turned into something useful
+with a bit of work.
+
+ </para>
+ </listitem>
+ </orderedlist>
+ </para>
+ </sect2>
+-->
+<!--
+ <sect2>
+ <title>Multi-state testing</title>
+ <para>
+
+The tests in this section are geared towards a page that would instruct the
+user to toggle their Tor state after the fetch and perform some operations:
+mouseovers, stray clicks, and potentially reloads.
+
+ </para>
+ <sect3>
+ <title>Cookies and Cache Correlation</title>
+ <para>
+The most obvious test is to set a cookie, ask the user to toggle Tor, and then
+have them reload the page. The cookie should no longer be set if they are
+using the default Torbutton settings. In addition, it is possible to leverage
+the cache to <ulink
+url="http://crypto.stanford.edu/sameorigin/safecachetest.html">store unique
+identifiers</ulink>. The default settings of Torbutton should also prevent
+these from persisting across a Tor toggle.
+
+ </para>
+ </sect3>
+ <sect3>
+ <title>Javascript timers and event handlers</title>
+ <para>
+
+Javascript can set timers and register event handlers in the hopes of fetching
+URLs after the user has toggled Torbutton.
+ </para>
+ </sect3>
+ <sect3>
+ <title>CSS Popups and non-script Dynamic Content</title>
+ <para>
+
+Even if Javascript is disabled, CSS is still able to
+<ulink url="http://www.tjkdesign.com/articles/css%20pop%20ups/">create popup-like
+windows</ulink>
+via the ':hover' pseudo-class, which can cause arbitrary browser
+activity as soon as the mouse enters into the content window. It is also
+possible for meta-refresh tags to set timers long enough to make it likely
+that the user has toggled Tor before fetching content.
+
+ </para>
+ </sect3>
+ </sect2>
+ <sect2 id="HackTorbutton">
+ <title>Active testing (aka How to Hack Torbutton)</title>
+ <para>
+
+The idea behind active testing is to discover vulnerabilities in Torbutton to
+bypass proxy settings, run script in an opposite Tor state, store unique
+identifiers, leak location information, or otherwise violate <link
+linkend="requirements">its requirements</link>. Torbutton has ventured out
+into a strange and new security landscape. It depends on Firefox mechanisms
+that haven't necessarily been audited for security, certainly not for the
+threat model that Torbutton seeks to address. As such, it and the interfaces
+it depends upon still need a 'trial by fire' typical of new technologies. This
+section of the document was written with the intention of making that period
+as fast as possible. Please help us get through this period by considering
+these attacks, playing with them, and reporting what you find (and potentially
+submitting the test cases back to be run in the standard batch of Torbutton
+tests).
+
+ </para>
+ <sect3>
+ <title>Some suggested vectors to investigate</title>
+ <para>
+ <itemizedlist>
+ <listitem>Strange ways to register Javascript <ulink
+url="http://en.wikipedia.org/wiki/DOM_Events">events</ulink> and <ulink
+url="http://www.devshed.com/c/a/JavaScript/Using-Timers-in-JavaScript/">timeouts</ulink> should
+be verified to actually be ineffective after Tor has been toggled.</listitem>
+ <listitem>Other ways to cause Javascript to be executed after
+<command>javascript.enabled</command> has been toggled off.</listitem>
+ <listitem>Odd ways to attempt to load plugins. Kyle Williams has had
+some success with direct loads/meta-refreshes of plugin-handled URLs.</listitem>
+ <listitem>The Date and Timezone hooks should be verified to work with
+crazy combinations of iframes, nested iframes, iframes in frames, frames in
+iframes, and popups being loaded and
+reloaded in rapid succession, and/or from one another. Think race conditions and deep,
+parallel nesting, involving iframes from both <ulink
+url="http://en.wikipedia.org/wiki/Same_origin_policy">same-origin and
+non-same-origin</ulink> domains.</listitem>
+ <listitem>In addition, there may be alternate ways and other
+methods to query the timezone, or otherwise use some of the Date object's
+methods in combination to deduce the timezone offset. Of course, the author
+tried his best to cover all the methods he could foresee, but it's always good
+to have another set of eyes try it out.</listitem>
+ <listitem>Similarly, is there any way to confuse the <link
+linkend="contentpolicy">content policy</link>
+mentioned above to cause it to allow certain types of page fetches? For
+example, it was recently discovered that favicons are not fetched by the
+content, but by the chrome itself, hence the content policy did not look up the
+correct window to determine the current Tor tag for the favicon fetch. Are
+there other things that can do this? Popups? Bookmarklets? Active bookmarks? </listitem>
+ <listitem>Alternate ways to store and fetch unique identifiers. For example, <ulink
+url="http://developer.mozilla.org/en/docs/DOM:Storage">DOM Storage</ulink>
+caught us off guard.
+It was
+also discovered by <ulink url="http://pseudo-flaw.net">Gregory
+Fleischer</ulink> that <ulink
+url="http://pseudo-flaw.net/content/tor/torbutton/">content window access to
+chrome</ulink> can be used to build <link linkend="fingerprinting">unique
+identifiers</link>.
+Are there any other
+arcane or experimental ways that Firefox provides to create and store unique
+identifiers? Or perhaps unique identifiers can be queried or derived from
+properties of the machine/browser that Javascript has access to? How unique
+can these identifiers be?
+ </listitem>
+ <listitem>Is it possible to get the browser to write some history to disk
+(aside from swap) that can be retrieved later? By default, Torbutton should
+write no history, cookie, or other browsing activity information to the
+harddisk.</listitem>
+ <listitem>Do popup windows make it easier to break any of the above
+behavior? Are javascript events still canceled in popups? What about recursive
+popups from Javascript, data, and other funky URL types? What about CSS
+popups? Are they still blocked after Tor is toggled?</listitem>
+ <listitem>Chrome-escalation attacks. The interaction between the
+Torbutton chrome Javascript and the client content window javascript is pretty
+well-defined and carefully constructed, but perhaps there is a way to smuggle
+javascript back in a return value, or otherwise inject network-loaded
+javascript into the chrome (and thus gain complete control of the browser).
+</listitem>
+</itemizedlist>
+
+ </para>
+ </sect3>
+ </sect2>
+</sect1>
+-->
+<appendix id="Transparency">
+<title>Towards Transparency in Navigation Tracking</title>
+<para>
+
+The <link linkend="privacy">privacy properties</link> of Tor Browser are based
+upon the assumption that link-click navigation indicates user consent to
+tracking between the linking site and the destination site. While this
+definition is sufficient to allow us to eliminate cross-site third party
+tracking with only minimal site breakage, it is our long-term goal to further
+reduce cross-origin click navigation tracking to mechanisms that are
+detectable by attentive users, so they can alert the general public if
+cross-origin click navigation tracking is happening where it should not be.
+
+</para>
+<para>
+
+In an ideal world, the mechanisms of tracking that can be employed during a
+link click would be limited to the contents of URL parameters and other
+properties that are fully visible to the user before they click. However, the
+entrenched nature of certain archaic web features makes it impossible for us to
+achieve this transparency goal by ourselves without substantial site breakage.
+So, instead we maintain a <link linkend="deprecate">Deprecation
+Wishlist</link> of archaic web technologies that are currently being (ab)used
+to facilitate federated login and other legitimate click-driven cross-domain
+activity but that can one day be replaced with more privacy friendly,
+auditable alternatives.
+
+</para>
+<para>
+
+Because the total elimination of side channels during cross-origin navigation
+will undoubtedly break federated login as well as destroy ad revenue, we
+also describe auditable alternatives and promising web draft standards that would
+preserve this functionality while still providing transparency when tracking is
+occurring.
+
+</para>
+
+<sect1 id="deprecate">
+ <title>Deprecation Wishlist</title>
+ <orderedlist>
+ <listitem>The Referer Header
+ <para>
+
+We haven't disabled or restricted the Referer ourselves because of the
+non-trivial number of sites that rely on the Referer header to "authenticate"
+image requests and deep-link navigation on their sites. Furthermore, there
+seems to be no real privacy benefit to taking this action by itself in a
+vacuum, because many sites have begun encoding Referer URL information into
+GET parameters when they need it to cross http-to-https scheme transitions.
+Google's +1 buttons are the best example of this activity.
+
+ </para>
+ <para>
+
+Because of the availability of these other explicit vectors, we believe the
+main risk of the Referer header is through inadvertent and/or covert data
+leakage. In fact, <ulink
+url="http://www2.research.att.com/~bala/papers/wosn09.pdf">a great deal of
+personal data</ulink> is inadvertently leaked to third parties through the
+source URL parameters.
+
+ </para>
+ <para>
+
+We believe the Referer header should be made explicit. If a site wishes to
+transmit its URL to third party content elements during load or during
+link-click, it should have to specify this as a property of the associated HTML
+tag. With an explicit property, it would then be possible for the user agent to
+inform the user if they are about to click on a link that will transmit Referer
+information (perhaps through something as subtle as a different color in the
+lower toolbar for the destination URL). This same UI notification can also be
+used for links with the <ulink
+url="https://developer.mozilla.org/en-US/docs/HTML/Element/a#Attributes">"ping"</ulink>
+attribute.
+
+ </para>
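The explicit Referer-in-GET-parameter pattern mentioned above can be sketched as follows. The widget host and parameter name are invented for illustration; the point is that the embedding site passes its own URL explicitly, so the third party learns it even where the Referer header would be stripped.

```javascript
// The embedding page encodes its own URL into the widget request, making
// the "referrer" an explicit, user-visible part of the fetched URL.
function widgetSrc(embeddingPage) {
  return "https://widgets.example/plusone?url=" +
         encodeURIComponent(embeddingPage);
}

console.log(widgetSrc("http://news.example/article?id=42"));
// https://widgets.example/plusone?url=http%3A%2F%2Fnews.example%2Farticle%3Fid%3D42
```

Because the transmission is visible in the markup and the request URL, a user agent can audit or surface it, which is exactly the transparency property argued for here.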
+ </listitem>
+ <listitem>window.name
+ <para>
+<ulink
+url="https://developer.mozilla.org/En/DOM/Window.name">window.name</ulink> is
+a DOM property that for some reason is allowed to retain a persistent value
+for the lifespan of a browser tab. It is possible to utilize this property for
+<ulink url="http://www.thomasfrank.se/sessionvars.html">identifier
+storage</ulink> during click navigation. This is sometimes used for additional
+XSRF protection and federated login.
+ </para>
+ <para>
+
+It's our opinion that the contents of window.name should not be preserved for
+cross-origin navigation, but doing so may break federated login for some sites.
+
+ </para>
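The identifier-storage pattern can be sketched as below. `win` is a stand-in for the tab's window object, and the identifier value is invented; the key property is that `window.name`, unlike document state, survives cross-origin navigation within the same tab.

```javascript
// Stand-in for the tab's window object.
const win = { name: "" };

// site-a.example writes an identifier before linking off-site...
win.name = JSON.stringify({ id: "visitor-8f3a" });

// ...navigation to site-b.example happens; the document is replaced but
// win.name is not cleared, so the destination reads the identifier back:
const carried = JSON.parse(win.name);
console.log(carried.id); // "visitor-8f3a"
```

Clearing `window.name` on cross-origin navigation breaks this channel, at the cost (noted above) of breaking some federated login flows that rely on it.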
+ </listitem>
+ <listitem>Javascript link rewriting
+ <para>
+
+In general, it should not be possible for onclick handlers to alter the
+navigation destination of 'a' tags, silently transform them into POST
+requests, or otherwise create situations where a user believes they are
+clicking on a link leading to one URL but ends up at another. This
+functionality is deceptive and is frequently a vector for malware and phishing
+attacks. Unfortunately, many legitimate sites also employ such transparent
+link rewriting, and blanket disabling this functionality ourselves will simply
+cause Tor Browser to fail to navigate properly on these sites.
+
+ </para>
+ <para>
+
+Automated cross-origin redirects are one form of this behavior that is
+possible for us to <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3600">address
+ourselves</ulink>, as they are comparatively rare and can be handled with site
+permissions.
+
+ </para>
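The rewriting behavior described above can be sketched as follows. The `link` object is a stand-in for an `<a>` element with an onclick handler, and the tracker URL is invented: what the user sees on hover differs from where the click actually goes.

```javascript
const link = {
  // Shown in the status bar when the user hovers over the link.
  href: "https://destination.example/article",
  onclick(evt) {
    evt.preventDefault();
    // Silently swapped at click time, e.g. for click tracking: the user
    // is routed through a redirector they were never shown.
    this.href = "https://tracker.example/redirect?to=" +
                encodeURIComponent("https://destination.example/article");
    return this.href;
  },
};

const actual = link.onclick({ preventDefault() {} });
console.log(actual.startsWith("https://tracker.example/")); // true
```

Blanket-disabling this would break the many legitimate sites that use the same mechanism, which is why the text above treats it as a deprecation wish rather than a patch.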
+ </listitem>
+ </orderedlist>
+</sect1>
+<sect1>
+ <title>Promising Standards</title>
+ <orderedlist>
+ <listitem><ulink url="http://web-send.org">Web-Send Introducer</ulink>
+ <para>
+
+Web-Send is a browser-based link sharing and federated login widget that is
+designed to operate without relying on third-party tracking or abusing other
+cross-origin link-click side channels. It has a compelling list of <ulink
+url="http://web-send.org/features.html">privacy and security features</ulink>,
+especially if used as a "Like button" replacement.
+
+ </para>
+ </listitem>
+ <listitem><ulink url="https://developer.mozilla.org/en-US/docs/Persona">Mozilla Persona</ulink>
+ <para>
+
+Mozilla's Persona is designed to provide decentralized, cryptographically
+authenticated federated login in a way that does not expose the user to third
+party tracking or require browser redirects or side channels. While it does
+not directly provide the link sharing capabilities that Web-Send does, it is a
+better solution to the privacy issues associated with federated login than
+Web-Send is.
+
+ </para>
+ </listitem>
+ </orderedlist>
+</sect1>
+</appendix>
+</article>
diff --git a/design-doc/outline.txt b/design-doc/outline.txt
new file mode 100644
index 0000000..f7aa5ec
--- /dev/null
+++ b/design-doc/outline.txt
@@ -0,0 +1,52 @@
+- Threat model: [Mostly Torbutton]
+ - [Remove the security requirements section]
+
+- Design overview and philosophy
+ - Security requirements [Torbutton]
+ + local leaks?
+ - state issues
+ - Privacy Requirements [Mostly blog post]
+ - Make local privacy optional
+ - Avoid Cross-Domain Linkability
+ - Identifiers
+ - Fingerprinting
+ - 100% self-contained
+ - Does not share state with other modes/browsers
+ - Easy to remove + wipe with external tools
+ - No filters
+
+- Implementation
+ - Section Template
+ - Sub Section
+ - "Design Goal":
+ - "Implementation Status"
+ - Local Privacy Optional
+ - Linkability
+ - Stored State
+ - Cookies
+ - Cache
+ - DOM Storage
+ - HTTP Auth
+ - SSL state
+ - Plugins
+ - Fingerprinting
+ - Patches
+
+- Packaging
+ - Build Process Security
+ - External Addons
+ - Included
+ - HTTPS-E
+ - NoScript
+ - Torbutton
+ - Deliberately excluded
+ - Request Policy, AdblockPlus, etc
+ - Desired
+ - Perspectives/Convergence/etc
+ - Pref Changes
+ - Caused by Torbutton
+ - Set manually in profile
+ - Update security
+ - Thandy
+
+
diff --git a/docs/audits/FF17_FEATURE_AUDIT b/docs/audits/FF17_FEATURE_AUDIT
deleted file mode 100644
index b135c0a..0000000
--- a/docs/audits/FF17_FEATURE_AUDIT
+++ /dev/null
@@ -1,19 +0,0 @@
-- Can calc() accept device-width/height?
- - No. Numbers only.
- - Can viewport/meta viewport accept them?
- - nsContentUtils::GetViewportInfo() might be the only place..
- - nsContentUtils::ProcessViewportInfo()
- - Viewport seems unused on all but mobile
- - Maybe worth testing..
-
-- currentColor: OK
-- scrollMax: OK
-- IdleAPI: FirefoxOS only, but still present w/ pref??
- - Throws "The operation is insecure" exception
- - Disabled for content. WebApp only.
-
-- Web Activities
- - Seems unfinished and unexported to content
-
-- RegisterContent/ProtocolHandler -> Third party supercookie??
- - asks for confirmation
diff --git a/docs/audits/FF17_NETWORK_AUDIT b/docs/audits/FF17_NETWORK_AUDIT
deleted file mode 100644
index 8ec25ba..0000000
--- a/docs/audits/FF17_NETWORK_AUDIT
+++ /dev/null
@@ -1,84 +0,0 @@
-
-Lowest level resolver calls:
- - PR_GetHostByName
- + ./profile/dirserviceprovider/src/nsProfileLock.cpp
- + nsProfileLock::LockWithSymlink
- + ./security/nss/lib/libpkix/pkix_pl_nss/module/pkix_pl_socket.c
- - pkix_pl_Socket_CreateByHostAndPort()
- - pkix_pl_Socket_CreateByName()
- - ./security/nss/lib/certhigh/ocsp.c
- - ocsp_ConnectToHost()
- + ./security/nss/cmd/libpkix/pkix_pl/module/test_socket.c
- + ./security/nss/cmd/vfyserv/vfyserv.c
- - ./media/webrtc/trunk/src/modules/udp_transport/source/udp_transport_impl.cc (!!!)
- --disable-webrtc!!!
- + PR_GetAddrInfoByName
- + ./security/nss/cmd/ssltap/ssltap.c
- + ./security/nss/cmd/tstclnt/tstclnt.c
- + ./security/nss/cmd/strsclnt/strsclnt.c
-
-Direct paths to DNS resolution:
- + nsDNSService::Resolve
- + nsDNSService::AsyncResolve
- + nsHostResolver::ResolveHost
-
-Misc UDP (SOCK_DGRAM, PR_DESC_SOCKET_UDP):
- + ./nsprpub/pr/src/io/prsocket.c
- + PR_NewUDPSocket
- + PR_OpenUDPSocket
- + PR_Socket
- + ./nsprpub/pr/src/pthreads/ptio.c
- + ./netwerk/socket/nsUDPSocketProvider.cpp
-
-Misc TCP (SOCK_STREAM, PR_DESC_SOCKET_TCP):
- + ./nsprpub/pr/src/pthreads/ptio.c
- - ./nsprpub/pr/src/io/prsocket.c
- - PR_NewTCPSocket
- - PR_Socket
- - PR_OpenTCPSocket
- + ./nsprpub/pr/src/misc/prnetdb.c
- + TCPSocket:
- + ./security/manager/ssl/src/nsNSSIOLayer.cpp
- + nsSSLIOLayerNewSocket()
- + ./security/nss/lib/certhigh/ocsp.c
- + ocsp_SendEncodedRequest
- + ./security/nss/lib/libpkix/pkix_pl_nss/module/pkix_pl_socket.c
- + pkix_pl_Socket_CreateClient
- + pkix_pl_Socket_CreateServer
-
-Misc PR_Socket:
- + ./nsprpub/pr/src/cplus/rcnetio.cpp
- + RCNetStreamIO::RCNetStreamIO
-
-Misc XPCOM:
- - *SocketProvider
- + newSocket
- + ./netwerk/base/src/nsSocketTransport2.cpp:
- + addToSocket
- + @mozilla.org/network/socket:
- + createTransport()
- + ./netwerk/protocol/http/nsHttpConnectionMgr.cpp
- + ./netwerk/protocol/ftp/nsFtpConnectionThread.cpp:
- + ./netwerk/protocol/ftp/nsFtpControlConnection.cpp:
- + ./dom/network/src/TCPSocket.js
- + open()
-
-Android may have DNS leaks..
- - ./mobile/android/base/httpclientandroidlib/impl/conn/DefaultClientConnectionOperator.java
-
-nsDNSService/nsPIDNSService/nsIDNSService
- + calls nsHostResolver::ResolveHost
- + used by:
- + DNS prefetch (disabled)
- + ./netwerk/base/src/nsIOService.cpp (offline mode only)
- + ./netwerk/build/nsNetModule.cpp
- + ./netwerk/protocol/websocket/WebSocketChannel.cpp
- + ./netwerk/build/nsNetCID.h
- + ./netwerk/socket/nsSOCKSIOLayer.cpp (proxy lookup only)
-
-netwerk/base/src/nsSocketTransport2.cpp
- + nsSocketTransport::ResolveHost() has proper remote dns checks
- + Resolution is done by using hostname as sockaddr
- + PROXY_RESOLVES_HOST
-
-
diff --git a/docs/audits/FF3.5_AUDIT b/docs/audits/FF3.5_AUDIT
deleted file mode 100644
index 35a9fbf..0000000
--- a/docs/audits/FF3.5_AUDIT
+++ /dev/null
@@ -1,195 +0,0 @@
-First pass: Quick Review of Firefox Features
-- Video Tag
- - Docs:
- - https://developer.mozilla.org/En/HTML/Element/Audio
- - https://developer.mozilla.org/En/HTML/Element/Video
- - https://developer.mozilla.org/En/HTML/Element/Source
- - https://developer.mozilla.org/En/Manipulating_video_using_canvas
- - https://developer.mozilla.org/En/nsIDOMHTMLMediaElement
- - https://developer.mozilla.org/En/Media_formats_supported_by_the_audio_and_v…
- - http://en.flossmanuals.net/TheoraCookbook
- - nsIContentPolicy is checked on load
- - Uses NSIChannels for initial load
- - Wrapped in nsHTMLMediaElement::mDecoder
- - is nsOggDecoder() or nsWaveDecoder()
- - liboggplay
- - Governed by media.* prefs
- - Preliminary audit shows they do not use the liboggplay tcp functions
-- Geolocation
- - Wifi:
- - https://developer.mozilla.org/En/Monitoring_WiFi_access_points
- - Requires security policy to allow. Then still prompted
- - navigator.geolocation
- - Governed by geo.enabled
- - "2 week access token" is set
- - geo.wifi.access_token.. Clearing is prob a good idea
- - http://mxr.mozilla.org/mozilla1.9.1/source/dom/src/geolocation/NetworkGeolo…
- - https://developer.mozilla.org/En/Using_geolocation
-- DNS prefetching after toggle
- - prefetch pref? Always disable for now?
- - network.dns.disablePrefetch
- - Also disabled in netwerk/dns/src/nsDNSService2.cpp when manual proxies
- are set..
- - This should prevent prefetching of non-tor urls in tor mode..
- - But the reverse is unclear.
- - DocShell attribute!!1 YAY
- - http://www.oxymoronical.com/experiments/apidocs/interface/nsIDocShell
- - "Takes effect for the NEXT document loaded...."
- - Do we win this race? hrmm.. If we do, the tor->nontor direction
- should also be safe.
- - Content policy called?
- - No. See content/html/content/src/nsHTMLDNSPrefetch.cpp
-- Storage
- - https://developer.mozilla.org/en/Storage
- - "It is available to trusted callers, meaning extensions and Firefox
- components only."
-- New content policy
- - Content Security Policy. Addon-only
-- "Offline resources"
- - https://developer.mozilla.org/en/Offline_resources_in_Firefox
- - https://developer.mozilla.org/en/nsIApplicationCache
- - browser.cache.offline.enable toggles
- - browser.cache.disk.enable does not apply. Seperate "device".
- - Does our normal cache clearing mechanism apply?
- - We call nsICacheService.evictEntries()
- - May need: nsOfflineCacheDevice::EvictEntries(NULL)
- - Code is smart enough to behave cleanly if we simply set
- browser.cache.offline.enable or enable private browsing.
-- Mouse gesture and other new DOM events
-- Fonts
- - Remote fonts obey content policy. Good.
- - XXX: Are they cached independent of regular cache? Prob not.
- - Hrmm can probe for installed fonts:
- http://remysharp.com/2008/07/08/how-to-detect-if-a-font-is-installed-only-u…
- http://www.lalit.org/lab/javascript-css-font-detect
- http://www.ajaxupdates.com/cssjavascript-font-detector/
- http://code.google.com/p/jquery-fontavailable/
-- Drag and drop
- - https://developer.mozilla.org/En/DragDrop/Drag_and_Drop
- - https://developer.mozilla.org/En/DragDrop/Drag_Operations
- - https://developer.mozilla.org/En/DragDrop/Dragging_and_Dropping_Multiple_It…
- - https://developer.mozilla.org/En/DragDrop/Recommended_Drag_Types
- - https://developer.mozilla.org/En/DragDrop/DataTransfer
- - Should be no different than normal url handling..
-- Local Storage
- - https://developer.mozilla.org/en/DOM/Storage#localStorage
- - Disabled by dom storage pref..
- - Private browsing mode has its own DB
- - Memory only?
- - Disk Avoidance of gStorage and local storage:
- - mSessionOnly set via nsDOMStorage::CanUseStorage()
- - Seems to be set to true if cookies are session-only or private
- browsing mode
- - Our cookies are NOT session-only with dual cookie jars
- - but this is ok if we clear the session storage..
- - XXX: Technically clearing session storage may break
- sites if cookies remain though
- - nsDOMStoragePersistentDB not used if mSessionOnly
- - Can clear with nsDOMStorage::ClearAll() or nsIDOMStorage2::clear()?
- - These only work for a particular storage. There's both global now
- and per-origin storage instances
- - Each docshell has tons of storages for each origin contained in it
- - Toggling dom.storage.enabled does not clear existing storage
- - Oh HOT! cookie-changed to clear cookies clears all storages!
- - happens for both ff3.0 and 3.5 in dom/src/storage/nsDOMStorage.cpp
- - Conclusion:
- - can safely enable dom storage
- - May have minor buggy usability issues unless we preserve it
- when user is preserving cookies..
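
The cookie-clearing cascade noted above (a "cookie-changed" notification with data "cleared" wiping every DOM storage) can be sketched schematically. This is plain JavaScript for illustration only; StorageManager and its method names are invented and do not correspond to the actual Mozilla classes in dom/src/storage/nsDOMStorage.cpp:

```javascript
// Schematic model of the behavior noted above: when the cookie service
// broadcasts "cookie-changed" with data "cleared", the storage manager
// wipes every per-origin DOM storage. Invented names, not Mozilla APIs.
class StorageManager {
  constructor() {
    this.storagesByOrigin = new Map(); // origin -> Map of key/value pairs
  }
  getStorage(origin) {
    if (!this.storagesByOrigin.has(origin)) {
      this.storagesByOrigin.set(origin, new Map());
    }
    return this.storagesByOrigin.get(origin);
  }
  // Analogous to the observer in dom/src/storage/nsDOMStorage.cpp.
  observe(topic, data) {
    if (topic === "cookie-changed" && data === "cleared") {
      // Clearing cookies clears all DOM storages too.
      this.storagesByOrigin.clear();
    }
  }
}

const mgr = new StorageManager();
mgr.getStorage("example.com").set("id", "abc123");
mgr.getStorage("example.org").set("id", "xyz789");
mgr.observe("cookie-changed", "cleared");
console.log(mgr.storagesByOrigin.size); // 0: all per-origin storages gone
```

This mirrors the conclusion above: DOM storage can be enabled safely as long as the cookie-clearing path also empties the storage tables.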
-
-Second Pass: Verification of all Torbutton Assumptions
-- "Better privacy controls"
- - Basically UI stuff for prefs we set already
- - address bar search disable option is interesting, but not
- torbutton's job to toggle. Users will hate us.
-- Private browsing
- - https://developer.mozilla.org/En/Supporting_private_browsing_mode
- - We should consider an option (off by default) to enable PBM during
- toggle
- - It is a good idea because it will let our users use DOM storage
- safely and also may cause their plugins and other addons to be
- safe
- - Doing it always will cause the user to lose fine-grained control
- of many settings
- - Also we'll need to prevent them from leaving without toggling tor
- - Stuff the emit does (grep for NS_PRIVATE_BROWSING_SWITCH_TOPIC and
- "private-browsing")
- - XXX: clear mozilla.org/security/sdr;1. We should too! Wtf is it??
- - Neg. Best to let them handle this. Users will be annoyed
- at having to re-enter their passwords..
- - They also clear the console service..
- - Recommend watching private-browsing-cancel-vote and blocking if
- we are performing a db operation
- - Maybe we want to block transitions during our toggle for safety
- - XXX: They also clear general.open_location.last_url
- - XXX: mozilla.org/permissionmanager
- - XXX: mozilla.org/content-pref/service
- - XXX: Sets browser.zoom.siteSpecific to false
- - Interesting.. They clear their titles.. I wonder if some
- window managers log titles.. But that level of surveillance is
- unbeatable..
- - XXX: Unless there is some way for flash or script to read titles?
- - They empty the clipboard..
- - Can js access the clipboard?? ...
- - Yes, but needs special pref+confirmation box
- - http://www.dynamic-tools.net/toolbox/copyToClipboard/
- - They clear cache..
- - Cookies:
- - Use in-memory table that is different than their default
- - This could fuck up our cookie storage options
- - We could maybe prevent them from getting this
- event by wrapping nsCookieService::Observe(). Lullz..
- - NavHistory:
- - XXX: nsNavHistory::AutoCompleteFeedback() doesn't track
- awesomebar choices for feedback.. Is this done on disk?
- - Don't add history entries
- - We should block this observe event too if we can..
- - The session store stops storing tabs
- - We could block this observe
- - XXX: They expunge private temporary files on exit from PMB
- - This is not done normally until browser exit or
- "on-profile-change"
- - emits browser:purge-domain-data.. Mostly just for session
- editing it appears
- - Direct component query for pbs.privateBrowsingEnabled
- - This is where we have no ability to provide certain option
- control
- - browser.js seems to prevent user from allowing blocked
- popups?
- - Some items in some places context menu get blocked:
- - Can't delete items from history? placesContext_deleteHost
- - nsCookiePermission::InPrivateBrowsing() calls direct
- - but is irrelevant
- - Form history cannot be saved while in PBM.. :(
- - User won't be prompted for adding login passwords..
- - Can't remember prefs on content types
- - Many components read this value upon init:
- - This fucks up our observer game if tor starts enabled
- - NavHistory and cookie and dl manager
- - We could just wrap the bool on startup and lie
- and emit later... :/
- - Or! emit an exit and an enter always at startup if tor is
- enabled.
- - Read iSec report
- - Compare to Chrome
- - API use cases
-- SessionStore
- - Has been reworked with observers and write methods. Should use those.
-- security.enable_ssl2 to clear session id
- - Still cleared
-- browser.sessionstore.max_tabs_undo
- - Yep.
-- SafeBrowsing Update Key removed on cookie clear still?
- - Yep.
-- Livemark updates have kill events now
-- Test if nsICertStore is still buggy...
-
-Third Pass: Exploit Auditing
-- Remote fonts
-- SVG with HTML
-- Javascript threads+locking
-- Ogg theora and vorbis codecs
-- SQLite
-
-
-- https://developer.mozilla.org/en/Firefox_3_for_developers
diff --git a/docs/audits/FF4_AUDIT b/docs/audits/FF4_AUDIT
deleted file mode 100644
index 7830eb3..0000000
--- a/docs/audits/FF4_AUDIT
+++ /dev/null
@@ -1,50 +0,0 @@
-- Review of https://developer.mozilla.org/en/Firefox_4_for_developers
- - Potential proxy issues
- - DocShell and plugins inside createHTMLDocument?
- - https://developer.mozilla.org/en/DOM/DOMImplementation.createHTMLDocument
- - WebSockets?
- - Media attributes?
- - "buffered"
- - "preload"
- - new codecs?
- - What the hell is a blob url?
- - https://developer.mozilla.org/en/DOM/window.createBlobURL
- - https://developer.mozilla.org/en/DOM/window.revokeBlobURL
- - Seems only relevant to FS injection..
- - WebThreads are OK:
- - https://developer.mozilla.org/En/Using_web_workers
- - Network activity blocked by content policy
- - Fingerprinting issues:
- - New screen attributes
- - https://developer.mozilla.org/en/DOM/window.mozInnerScreenX, Y
- - High Res Animation Timers:
- - https://developer.mozilla.org/en/DOM/window.mozAnimationStartTime
- - https://developer.mozilla.org/en/DOM/Animations_using_MozBeforePaint
- - 50-60Hz max.. Can we leverage this?
- - timeStamps on keystroke events
- - https://developer.mozilla.org/en/DOM/event.timeStamp
- - Bounding rectangles -> window sizes?
- - Maybe not display sizes, but seems possible to fingerprint rendered
- content size.. ugh.
- - https://developer.mozilla.org/en/DOM/element.getBoundingClientRect
- - https://developer.mozilla.org/en/dom:range
- - CSS resize, media queries, etc..
- - WebGL may also expose screen properties and video card properties:
- - https://developer.mozilla.org/en/WebGL
- - https://www.khronos.org/registry/webgl/specs/1.0/#5.2
- - https://www.khronos.org/registry/webgl/specs/1.0/#5.11
- - SVG needs auditing. It may also expose absolute coords, but appears OK
- - https://developer.mozilla.org/en/SVG/SVG_animation_with_SMIL
- - Mouse events reveal desktop coordinates
- - https://bugzilla.mozilla.org/show_bug.cgi?id=503943
- - https://developer.mozilla.org/en/DOM/Event/UIEvent/MouseEvent
- - Actual screen dimensions not exposed
- - Identifier Storage
- - Content Security Properties may need clearing:
- - https://developer.mozilla.org/en/Security/CSP
- - STS cache needs clearing
- - New window.history functions may allow state smuggling
- - https://developer.mozilla.org/en/DOM/Manipulating_the_browser_history
-
-- New Javascript hooking options may help improve Date() hooks:
- - https://developer.mozilla.org/en/JavaScript/New_in_JavaScript/1.8.5
diff --git a/docs/design/CookieManagers.png b/docs/design/CookieManagers.png
deleted file mode 100644
index 0fc3e64..0000000
Binary files a/docs/design/CookieManagers.png and /dev/null differ
diff --git a/docs/design/Firefox17-TODO b/docs/design/Firefox17-TODO
deleted file mode 100644
index 41ef38e..0000000
--- a/docs/design/Firefox17-TODO
+++ /dev/null
@@ -1,82 +0,0 @@
-+ Cleanups
- + We specify browser.cache.memory.enable under disk avoidance. That's
- wrong. We don't even set it at all. Torbutton relic?
- + Disk leak documentation
- + Firefox 17 will mess up all patch links
-
-- Heavy Writing by section
- + Intro:
- + We target Firefox ESR
- + Component description
- + Deprecation List/Future Philosophy:
- + Linkability Transparency from
- https://trac.torproject.org/projects/tor/ticket/5273#comment:12
- + Adversary Goals
- + Describe how each adversary attack violates design goals
- + "Correlate activity across multiple site visits" as one of the adversary
- goals. This is the primary goal of the ad networks, though. We need to
- explicitly mention it in the Adversary Goals section for completeness.
- + Misc implementation
- + Link to prefs.js and describe omni.ja and extension-overrides hacks
- + document the environment variables and settings used to provide a non-grey "New Identity" button.
- + Mockup privacy UI
- + Identifier Linkability
- + Image cache jail
- + DOM storage jail
- + Section 3.5.8 does not make clear that what we're trying to limit is
- non-click-driven/non-interactive linkability rather than linkability in
- all cases. Other sections may have this problem, too.
- + This is a subtlety that arises both from the impossibility of
- satisfying unlinkability due to covert channels in GET/POST and from
- the desire to avoid breaking things like consensual federated login.
- - Fingerprinting
- + @font-face exemption and preference
- + Canvas prompt
- + describe our resolution defenses
- + Limit CSS media queries
- + System colors + fonts
- + Explain why panopticlick is weirdsauce
- + We report our useragent as 17.0
- + Click-to-play WebGL
- + We should perhaps be more vocal about the fingerprinting issues with
- some or all of http://www.w3.org/TR/navigation-timing/. I think I agree.
- - provide an entropy count estimate for fingerprinting defenses
- + Disk avoidance
- + Private browsing + pref changes
- + He reminded me about documenting disabling IndexedDB, but that is just one
- of the many prefs.js changes we need to document.
- - Testing
- - Explain why panopticlick is weirdsauce
- - Sync with QA pages
- - Many are out of date
- - http://www.stayinvisible.com/
- - Evercookie test page, and perhaps also
- http://jeremiahgrossman.blogspot.de/2007/04/tracking-users-without-cookies.…
-
-- Misc changes:
- + Plugin handling
- + All-but-flash patch
- + Plugin manager manipulation
- + We use Firefox's click-to-play
- + Addons
- + PDF.js inclusion
- + List links to design violations/enhancements:
- + https://trac.torproject.org/projects/tor/query?keywords=~tbb-linkability
- + https://trac.torproject.org/projects/tor/query?keywords=~tbb-fingerprinting
- - Update notification/version checking?
- - Create a deprecation list and link to it:
- - Referer Header
- - Window.name
- - We should only preserve window.name if the url bar domain remains the
- same. I could be convinced of this, but it's going to be trickier to
- implement and I think it's not really possible to remove linkability for user
- clicks in general.
- - Torbutton Security Settings
-
-- Packaging
- - Pref changes
- - Socks ports
- - Torbutton does not update
-
-
-
diff --git a/docs/design/NewCookieManager.png b/docs/design/NewCookieManager.png
deleted file mode 100644
index 97a0b40..0000000
Binary files a/docs/design/NewCookieManager.png and /dev/null differ
diff --git a/docs/design/build.sh b/docs/design/build.sh
deleted file mode 100755
index 5ffb650..0000000
--- a/docs/design/build.sh
+++ /dev/null
@@ -1 +0,0 @@
-xsltproc --output index.html.en -stringparam chunker.output.encoding UTF-8 --stringparam section.autolabel.max.depth 2 -stringparam section.label.includes.component.label 1 --stringparam section.autolabel 1 /usr/share/xml/docbook/stylesheet/docbook-xsl/xhtml/docbook.xsl design.xml
diff --git a/docs/design/design.xml b/docs/design/design.xml
deleted file mode 100644
index d1cdf0f..0000000
--- a/docs/design/design.xml
+++ /dev/null
@@ -1,2733 +0,0 @@
-<?xml version="1.0" encoding="ISO-8859-1"?>
-<!DOCTYPE article PUBLIC "-//OASIS//DTD DocBook XML V4.4//EN"
- "file:///usr/share/sgml/docbook/xml-dtd-4.4-1.0-30.1/docbookx.dtd">
-
-<article id="design">
- <articleinfo>
- <title>The Design and Implementation of the Tor Browser [DRAFT]</title>
- <author>
- <firstname>Mike</firstname><surname>Perry</surname>
- <affiliation>
- <address><email>mikeperry#torproject org</email></address>
- </affiliation>
- </author>
- <author>
- <firstname>Erinn</firstname><surname>Clark</surname>
- <affiliation>
- <address><email>erinn#torproject org</email></address>
- </affiliation>
- </author>
- <author>
- <firstname>Steven</firstname><surname>Murdoch</surname>
- <affiliation>
- <address><email>sjmurdoch#torproject org</email></address>
- </affiliation>
- </author>
- <pubdate>March 15, 2013</pubdate>
- </articleinfo>
-
-<!--
-- Introduction and Threat model: [Mostly Torbutton]
- - [Remove the security requirements section]
--->
-
-<sect1>
- <title>Introduction</title>
- <para>
-
-This document describes the <link linkend="adversary">adversary model</link>,
-<link linkend="DesignRequirements">design requirements</link>, and <link
-linkend="Implementation">implementation</link> <!-- <link
-linkend="Packaging">packaging</link> and <link linkend="Testing">testing
-procedures</link> --> of the Tor Browser. It is current as of Tor Browser
-2.3.25-5 and Torbutton 1.5.1.
-
- </para>
- <para>
-
-This document is also meant to serve as a set of design requirements and to
-describe a reference implementation of a Private Browsing Mode that defends
-against active network adversaries, in addition to the passive forensic local
-adversary currently addressed by the major browsers.
-
- </para>
- <sect2 id="components">
- <title>Browser Component Overview</title>
- <para>
-
-The Tor Browser is based on <ulink
-url="https://www.mozilla.org/en-US/firefox/organizations/">Mozilla's Extended
-Support Release (ESR) Firefox branch</ulink>. We have a <link
-linkend="firefox-patches">series of patches</link> against this browser to
-enhance privacy and security. Browser behavior is additionally augmented
-through the <ulink
-url="https://gitweb.torproject.org/torbutton.git/tree/master">Torbutton
-extension</ulink>, though we are in the process of moving this
-functionality into direct Firefox patches. We also <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/HEAD:/build-scripts/confi…">change
-a number of Firefox preferences</ulink> from their defaults.
-
- </para>
- <para>
-
-To help protect against potential Tor Exit Node eavesdroppers, we include
-<ulink url="https://www.eff.org/https-everywhere">HTTPS-Everywhere</ulink>. To
-provide users with optional defense-in-depth against Javascript and other
-potential exploit vectors, we also include <ulink
-url="http://noscript.net/">NoScript</ulink>. To protect against
-PDF-based Tor proxy bypass and to improve usability, we include the <ulink
-url="https://addons.mozilla.org/en-us/firefox/addon/pdfjs/">PDF.JS</ulink>
-extension. We also modify <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/HEAD:/build-scripts/confi…">several
-extension preferences</ulink> from their defaults.
-
- </para>
- </sect2>
-</sect1>
-
-<!--
-- Design overview and philosophy
- - Security requirements [Torbutton]
- + local leaks?
- - state issues
- - Privacy Requirements [Mostly blog post]
- - Avoid Cross-Domain Linkability
- - Identifiers
- - Fingerprinting
- - 100% self-contained
- - Does not share state with other modes/browsers
- - Easy to remove + wipe with external tools
- - click-to-play for "troublesome" features
- - Philosophy
- - No filters
--->
-
-
-<sect1 id="DesignRequirements">
- <title>Design Requirements and Philosophy</title>
- <para>
-
-The Tor Browser Design Requirements are meant to describe the properties of a
-Private Browsing Mode that defends against both network and local forensic
-adversaries.
-
- </para>
- <para>
-
-There are two main categories of requirements: <link
-linkend="security">Security Requirements</link>, and <link
-linkend="privacy">Privacy Requirements</link>. Security Requirements are the
-minimum properties required for a browser to support Tor and
-similar privacy proxies safely. Privacy requirements are the set of properties
-that cause us to prefer one browser over another.
-
- </para>
- <para>
-
-While we will endorse the use of browsers that meet the security requirements,
-it is primarily the privacy requirements that cause us to maintain our own
-browser distribution.
-
- </para>
- <para>
-
- The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL
- NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and
- "OPTIONAL" in this document are to be interpreted as described in
- <ulink url="https://www.ietf.org/rfc/rfc2119.txt">RFC 2119</ulink>.
-
- </para>
-
- <sect2 id="security">
- <title>Security Requirements</title>
- <para>
-
-The security requirements are primarily concerned with ensuring the safe use
-of Tor. Violations in these properties typically result in serious risk for
-the user in terms of immediate deanonymization and/or observability. With
-respect to browser support, security requirements are the minimum properties
-required for Tor to support the use of a particular browser.
-
- </para>
-
-<orderedlist>
- <listitem><link linkend="proxy-obedience"><command>Proxy
-Obedience</command></link>
- <para>The browser
-MUST NOT bypass Tor proxy settings for any content.</para></listitem>
-
- <listitem><link linkend="state-separation"><command>State
-Separation</command></link>
-
- <para>
-
-The browser MUST NOT provide the content window with any state from any other
-browsers or any non-Tor browsing modes. This includes shared state from
-independent plugins, and shared state from Operating System implementations of
-TLS and other support libraries.
-
-</para></listitem>
-
- <listitem><link linkend="disk-avoidance"><command>Disk
-Avoidance</command></link>
-
-<para>
-
-The browser MUST NOT write any information that is derived from or that
-reveals browsing activity to the disk, or store it in memory beyond the
-duration of one browsing session, unless the user has explicitly opted to
-store their browsing history information to disk.
-
-</para></listitem>
- <listitem><link linkend="app-data-isolation"><command>Application Data
-Isolation</command></link>
-
-<para>
-
-The components involved in providing private browsing MUST be self-contained,
-or MUST provide a mechanism for rapid, complete removal of all evidence of the
-use of the mode. In other words, the browser MUST NOT write or cause the
-operating system to write <emphasis>any information</emphasis> about the use
-of private browsing to disk outside of the application's control. The user
-must be able to ensure that secure deletion of the software is sufficient to
-remove evidence of the use of the software. All exceptions and shortcomings
-due to operating system behavior MUST be wiped by an uninstaller. However, due
-to permissions issues with access to swap, implementations MAY choose to leave
-it out of scope, and/or leave it to the Operating System/platform to implement
-ephemeral-keyed encrypted swap.
-
-</para></listitem>
-
-<!--
- <listitem><link linkend="update-safety"><command>Update
-Safety</command></link>
-
-<para>The browser SHOULD NOT perform unsafe updates or upgrades.</para></listitem>
--->
-</orderedlist>
-
- </sect2>
-
- <sect2 id="privacy">
- <title>Privacy Requirements</title>
- <para>
-
-The privacy requirements are primarily concerned with reducing linkability:
-the ability for a user's activity on one site to be linked with their activity
-on another site without their knowledge or explicit consent. With respect to
-browser support, privacy requirements are the set of properties that cause us
-to prefer one browser over another.
-
- </para>
-
- <para>
-
-For the purposes of the unlinkability requirements of this section as well as
-the descriptions in the <link linkend="Implementation">implementation
-section</link>, a <command>url bar origin</command> means at least the
-second-level DNS name. For example, for mail.google.com, the origin would be
-google.com. Implementations MAY, at their option, restrict the url bar origin
-to be the entire fully qualified domain name.
-
- </para>
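
The url bar origin reduction described above can be illustrated with a minimal sketch (plain JavaScript, not part of the Tor Browser codebase). It is deliberately naive: a real implementation must consult the Public Suffix List, since multi-label suffixes like co.uk would otherwise be reduced incorrectly:

```javascript
// Naive sketch of the "url bar origin" reduction described above: keep only
// the last two DNS labels. A real implementation must consult the Public
// Suffix List, since e.g. "example.co.uk" would wrongly reduce to "co.uk".
function urlBarOrigin(hostname) {
  const labels = hostname.split(".");
  if (labels.length <= 2) return hostname;
  return labels.slice(-2).join(".");
}

console.log(urlBarOrigin("mail.google.com")); // "google.com"
console.log(urlBarOrigin("google.com"));      // "google.com"
```

An implementation taking the MAY option above would instead return the full fully qualified domain name unchanged.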
-
-<orderedlist>
- <listitem><link linkend="identifier-linkability"><command>Cross-Origin
-Identifier Unlinkability</command></link>
- <para>
-
-User activity on one url bar origin MUST NOT be linkable to their activity in
-any other url bar origin by any third party automatically or without user
-interaction or approval. This requirement specifically applies to linkability
-from stored browser identifiers, authentication tokens, and shared state. The
-requirement does not apply to linkable information the user manually submits
-to sites, or due to information submitted during manual link traversal. This
-functionality SHOULD NOT interfere with interactive, click-driven federated
-login in a substantial way.
-
- </para>
- </listitem>
- <listitem><link linkend="fingerprinting-linkability"><command>Cross-Origin
-Fingerprinting Unlinkability</command></link>
- <para>
-
-User activity on one url bar origin MUST NOT be linkable to their activity in
-any other url bar origin by any third party. This property specifically applies to
-linkability from fingerprinting browser behavior.
-
- </para>
- </listitem>
- <listitem><link linkend="new-identity"><command>Long-Term
-Unlinkability</command></link>
- <para>
-
-The browser MUST provide an obvious, easy way for the user to remove all of
-its authentication tokens and browser state and obtain a fresh identity.
-Additionally, the browser SHOULD clear linkable state by default automatically
-upon browser restart, except at user option.
-
- </para>
- </listitem>
-</orderedlist>
-
- </sect2>
- <sect2 id="philosophy">
- <title>Philosophy</title>
- <para>
-
-In addition to the above design requirements, the technology decisions about
-Tor Browser are also guided by some philosophical positions about technology.
-
- </para>
- <orderedlist>
- <listitem><command>Preserve existing user model</command>
- <para>
-
-The existing way that the user expects to use a browser must be preserved. If
-the user has to maintain a different mental model of how the sites they are
-using behave depending on tab, browser state, or anything else that would not
-normally be what they experience in their default browser, the user will
-inevitably be confused. They will make mistakes and reduce their privacy as a
-result. Worse, they may just stop using the browser, assuming it is broken.
-
- </para>
- <para>
-
-User model breakage was one of the <ulink
-url="https://blog.torproject.org/blog/toggle-or-not-toggle-end-torbutton">failures
-of Torbutton</ulink>: Even if users managed to install everything properly,
-the toggle model was too hard for the average user to understand, especially
-in the face of accumulating tabs from multiple states crossed with the current
-Tor-state of the browser.
-
- </para>
- </listitem>
- <listitem><command>Favor the implementation mechanism least likely to
-break sites</command>
- <para>
-
-In general, we try to find solutions to privacy issues that will not induce
-site breakage, though this is not always possible.
-
- </para>
- </listitem>
- <listitem><command>Plugins must be restricted</command>
- <para>
-
-Even if plugins always properly used the browser proxy settings (which none of
-them do) and could not be induced to bypass them (which all of them can), the
-activities of closed-source plugins are very difficult to audit and control.
-They can obtain and transmit all manner of system information to websites,
-often have their own identifier storage for tracking users, and also
-contribute to fingerprinting.
-
- </para>
- <para>
-
-Therefore, if plugins are to be enabled in private browsing modes, they must
-be restricted from running automatically on every page (via click-to-play
-placeholders), and/or be sandboxed to restrict the types of system calls they
-can execute. If the user agent allows the user to craft an exemption to allow
-a plugin to be used automatically, it must only apply to the top level url bar
-domain, and not to all sites, to reduce cross-origin fingerprinting
-linkability.
-
- </para>
- </listitem>
- <listitem><command>Minimize Global Privacy Options</command>
- <para>
-
-<ulink url="https://trac.torproject.org/projects/tor/ticket/3100">Another
-failure of Torbutton</ulink> was the options panel. Each option
-that detectably alters browser behavior can be used as a fingerprinting tool.
-Similarly, all extensions <ulink
-url="http://blog.chromium.org/2010/06/extensions-in-incognito.html">should be
-disabled in the mode</ulink> except as an opt-in basis. We should not load
-system-wide and/or Operating System provided addons or plugins.
-
- </para>
- <para>
-Instead of global browser privacy options, privacy decisions should be made
-<ulink
-url="https://wiki.mozilla.org/Privacy/Features/Site-based_data_management_UI">per
-url bar origin</ulink> to eliminate the possibility of linkability
-between domains. For example, when a plugin object (or a Javascript access of
-window.plugins) is present in a page, the user should be given the choice of
-allowing that plugin object for that url bar origin only. The same
-goes for exemptions to third party cookie policy, geo-location, and any other
-privacy permissions.
- </para>
- <para>
-If the user has indicated they wish to record local history storage, these
-permissions can be written to disk. Otherwise, they should remain memory-only.
- </para>
- </listitem>
- <listitem><command>No filters</command>
- <para>
-
-Site-specific or filter-based addons such as <ulink
-url="https://addons.mozilla.org/en-US/firefox/addon/adblock-plus/">AdBlock
-Plus</ulink>, <ulink url="http://requestpolicy.com/">Request Policy</ulink>,
-<ulink url="http://www.ghostery.com/about">Ghostery</ulink>, <ulink
-url="http://priv3.icsi.berkeley.edu/">Priv3</ulink>, and <ulink
-url="http://sharemenot.cs.washington.edu/">Sharemenot</ulink> are to be
-avoided. We believe that these addons do not add any real privacy to a proper
-<link linkend="Implementation">implementation</link> of the above <link
-linkend="privacy">privacy requirements</link>, and that development efforts
-should be focused on general solutions that prevent tracking by all
-third parties, rather than a list of specific URLs or hosts.
- </para>
- <para>
-Filter-based addons can also introduce strange breakage and cause usability
-nightmares, and will also fail to do their job if an adversary simply
-registers a new domain or creates a new url path. Worse still, the unique
-filter sets that each user creates or installs will provide a wealth of
-fingerprinting targets.
- </para>
- <para>
-
-As a general matter, we are also opposed to shipping an always-on ad
-blocker with Tor Browser. We feel that this would damage our credibility in
-terms of demonstrating that we are providing privacy through a sound design
-alone, as well as damage the acceptance of Tor users by sites that support
-themselves through advertising revenue.
-
- </para>
- <para>
-Users are free to install these addons if they wish, but doing
-so is not recommended, as it will alter the browser request fingerprint.
- </para>
- </listitem>
- <listitem><command>Stay Current</command>
- <para>
-We believe that if we do not stay current with the support of new web
-technologies, we cannot hope to substantially influence or be involved in
-their proper deployment or privacy realization. However, we will likely disable
-high-risk features pending analysis, audit, and mitigation.
- </para>
- </listitem>
-<!--
- <listitem><command>Transparency in Navigation Tracking</command>
- <para>
-
-While we believe it is possible to restrict third party tracking with only
-minimal site breakage, it is our long-term goal to further reduce cross-origin
-click navigation tracking to mechanisms that are detectable by experts and
-attentive users, so they can alert the general public if cross-origin
-click navigation tracking is happening where it should not be.
-
- </para>
- <para>
-
-However, the entrenched nature of certain archaic web features make it
-impossible for us to achieve this wider goal by ourselves without substantial
-site breakage. So, instead we maintain a <link linkend="deprecate">Deprecation
-Wishlist</link> of archaic web technologies that are currently being (ab)used
-to facilitate federated login and other legitimate click-driven cross-domain
-activity but that can one day be replaced with more privacy friendly,
-auditable alternatives.
-
- </para>
- </listitem>
--->
- </orderedlist>
- </sect2>
-</sect1>
-
-<!--
-- Implementation
- - Section Template
- - Sub Section
- - "Design Goal":
- - "Implementation Status"
- - Local Privacy
- - Linkability
- - Stored State
- - Cookies
- - Cache
- - DOM Storage
- - HTTP Auth
- - SSL state
- - Plugins
- - Fingerprinting
- - Location + timezone is part of this
- - Patches?
--->
- <sect1 id="adversary">
- <title>Adversary Model</title>
- <para>
-
-A Tor web browser adversary has a number of goals, capabilities, and attack
-types that can be used to illustrate the design requirements for the
-Tor Browser. Let's start with the goals.
-
- </para>
- <sect2 id="adversary-goals">
- <title>Adversary Goals</title>
- <orderedlist>
-<!-- These aren't really commands.. But it's the closest I could find in an
-acceptable style.. Don't really want to make my own stylesheet -->
- <listitem><command>Bypassing proxy settings</command>
- <para>The adversary's primary goal is direct compromise and bypass of
-Tor, causing the user to directly connect to an IP of the adversary's
-choosing.</para>
- </listitem>
- <listitem><command>Correlation of Tor vs Non-Tor Activity</command>
- <para>If direct proxy bypass is not possible, the adversary will likely
-happily settle for the ability to correlate something a user did via Tor with
-their non-Tor activity. This can be done with cookies, cache identifiers,
-javascript events, and even CSS. Sometimes the fact that a user uses Tor may
-be enough for some authorities.</para>
- </listitem>
- <listitem><command>History disclosure</command>
- <para>
-The adversary may also be interested in history disclosure: the ability to
-query a user's history to see if they have issued certain censored search
-queries, or visited censored sites.
- </para>
- </listitem>
- <listitem><command>Correlate activity across multiple sites</command>
- <para>
-
-The primary goal of the advertising networks is to know that the user who
-visited siteX.com is the same user that visited siteY.com to serve them
-targeted ads. The advertising networks become our adversary insofar as they
-attempt to perform this correlation without the user's explicit consent.
-
- </para>
- </listitem>
- <listitem><command>Fingerprinting/anonymity set reduction</command>
- <para>
-
-Fingerprinting (more generally: "anonymity set reduction") is used to attempt
-to gather identifying information on a particular individual without the use
-of tracking identifiers. If the dissident or whistleblower's timezone is
-available, and they are using a rare build of Firefox for an obscure operating
-system, and they have a specific display resolution only used on one type of
-laptop, this can be very useful information for tracking them down, or at
-least <link linkend="fingerprinting">tracking their activities</link>.
-
- </para>
- </listitem>
- <listitem><command>History records and other on-disk
-information</command>
- <para>
-In some cases, the adversary may opt for a heavy-handed approach, such as
-seizing the computers of all Tor users in an area (especially after narrowing
-the field by the above two pieces of information). History records and cache
-data are the primary goals here.
- </para>
- </listitem>
- </orderedlist>
- </sect2>
-
- <sect2 id="adversary-positioning">
- <title>Adversary Capabilities - Positioning</title>
- <para>
-The adversary can position themselves at a number of different locations in
-order to execute their attacks.
- </para>
- <orderedlist>
- <listitem><command>Exit Node or Upstream Router</command>
- <para>
-The adversary can run exit nodes, or alternatively, they may control routers
-upstream of exit nodes. Both of these scenarios have been observed in the
-wild.
- </para>
- </listitem>
- <listitem><command>Ad servers and/or Malicious Websites</command>
- <para>
-The adversary can also run websites, or more likely, they can contract out
-ad space from a number of different ad servers and inject content that way. For
-some users, the adversary may be the ad servers themselves. It is not
-inconceivable that ad servers may try to subvert or reduce a user's anonymity
-through Tor for marketing purposes.
- </para>
- </listitem>
- <listitem><command>Local Network/ISP/Upstream Router</command>
- <para>
-The adversary can also inject malicious content at the user's upstream router
-when they have Tor disabled, in an attempt to correlate their Tor and Non-Tor
-activity.
- </para>
- <para>
-
-Additionally, at this position the adversary can block Tor, or attempt to
-recognize the traffic patterns of specific web pages at the entrance to the Tor
-network.
-
- </para>
- </listitem>
- <listitem><command>Physical Access</command>
- <para>
-Some users face adversaries with intermittent or constant physical access.
-Users in Internet cafes, for example, face such a threat. In addition, in
-countries where simply using tools like Tor is illegal, users may face
-confiscation of their computer equipment for excessive Tor usage or just
-general suspicion.
- </para>
- </listitem>
- </orderedlist>
- </sect2>
-
- <sect2 id="attacks">
- <title>Adversary Capabilities - Attacks</title>
- <para>
-
-The adversary can perform the following attacks from a number of different
-positions to accomplish various aspects of their goals. It should be noted
-that many of these attacks (especially those involving IP address leakage) are
-often performed by accident by websites that simply have Javascript, dynamic
-CSS elements, and plugins. Others are performed by ad servers seeking to
-correlate users' activity across different IP addresses, and still others are
-performed by malicious agents on the Tor network and at national firewalls.
-
- </para>
- <orderedlist>
- <listitem><command>Read and insert identifiers</command>
- <para>
-
-The browser contains multiple facilities for storing identifiers that the
-adversary creates for the purposes of tracking users. These identifiers are
-most obviously cookies, but also include HTTP auth, DOM storage, cached
-scripts and other elements with embedded identifiers, client certificates, and
-even TLS Session IDs.
-
- </para>
- <para>
-
-An adversary in a position to perform MITM content alteration can inject
-document content elements to both read and inject cookies for arbitrary
-domains. In fact, even many "SSL secured" websites are vulnerable to this sort of
-<ulink url="http://seclists.org/bugtraq/2007/Aug/0070.html">active
-sidejacking</ulink>. In addition, the ad networks of course perform tracking
-with cookies as well.
-
- </para>
- <para>
-
-These types of attacks are attempts at subverting our <link
-linkend="identifier-linkability">Cross-Origin Identifier Unlinkability</link> and <link
-linkend="new-identity">Long-Term Unlinkability</link> design requirements.
-
- </para>
- </listitem>
- <listitem id="fingerprinting"><command>Fingerprint users based on browser
-attributes</command>
-<para>
-
-There is an absurd amount of information available to websites via attributes
-of the browser. This information can be used to reduce the anonymity set, or even
-uniquely fingerprint individual users. Attacks of this nature are typically
-aimed at tracking users across sites without their consent, in an attempt to
-subvert our <link linkend="fingerprinting-linkability">Cross-Origin
-Fingerprinting Unlinkability</link> and <link
-linkend="new-identity">Long-Term Unlinkability</link> design requirements.
-
-</para>
-
-<para>
-
-Fingerprinting is an intimidating
-problem to attempt to tackle, especially without a metric to determine or at
-least intuitively understand and estimate which features will most contribute
-to linkability between visits.
-
-</para>
-
-<para>
-
-The <ulink url="https://panopticlick.eff.org/about.php">Panopticlick study
-done</ulink> by the EFF uses the <ulink
-url="https://en.wikipedia.org/wiki/Entropy_%28information_theory%29">Shannon
-entropy</ulink> - the number of identifying bits of information encoded in
-browser properties - as this metric. Their <ulink
-url="https://wiki.mozilla.org/Fingerprinting#Data">result data</ulink> is
-definitely useful, and the metric is probably the appropriate one for
-determining how identifying a particular browser property is. However, some
-quirks of their study mean that they do not extract as much information as
-they could from display information: they only use desktop resolution and do
-not attempt to infer the size of toolbars. In the other direction, they may be
-over-counting in some areas, as they did not compute joint entropy over
-multiple attributes that may exhibit a high degree of correlation. Also, new
-browser features are added regularly, so the data should not be taken as
-final.
-
- </para>
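The Shannon-entropy metric used by Panopticlick can be made concrete with a short sketch. The frequency counts below are made up for illustration and are not Panopticlick's actual data; the functions simply compute the expected identifying bits of a property, and the surprisal of observing one rare value:

```python
import math

def surprisal(p):
    """Identifying bits revealed by observing a value seen with probability p."""
    return -math.log2(p)

def shannon_entropy(freqs):
    """Expected identifying bits of a browser property, given value frequencies."""
    total = sum(freqs.values())
    return sum((n / total) * -math.log2(n / total) for n in freqs.values())

# Hypothetical user-agent distribution (made-up counts, for illustration only):
ua_counts = {"Firefox 17/Win": 500, "Firefox 17/Linux": 50, "rare build": 1}

# A common value reveals little; the one rare build reveals ~9 bits by itself,
# which is the sense in which rare configurations aid anonymity set reduction.
common_bits = surprisal(500 / 551)
rare_bits = surprisal(1 / 551)
```

Note that summing per-attribute entropies over-counts exactly when attributes are correlated (e.g. OS and font list); joint entropy over the combined value distribution is the correct measure, which is the over-counting caveat noted above.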
- <para>
-
-Despite the uncertainty, all fingerprinting attacks leverage the following
-attack vectors:
-
- </para>
- <orderedlist>
- <listitem><command>Observing Request Behavior</command>
- <para>
-
-Properties of the user's request behavior comprise the bulk of low-hanging
-fingerprinting targets. These include: User agent, Accept-* headers, pipeline
-usage, and request ordering. Additionally, the use of custom filters such as
-AdBlock and other privacy filters can be used to fingerprint request patterns
-(as an extreme example).
-
- </para>
- </listitem>
-
- <listitem><command>Inserting Javascript</command>
- <para>
-
-Javascript can reveal a lot of fingerprinting information. It provides DOM
-objects such as window.screen and window.navigator to extract information
-about the useragent.
-
-Also, Javascript can be used to query the user's timezone via the
-<function>Date()</function> object, <ulink
-url="https://www.khronos.org/registry/webgl/specs/1.0/#5.13">WebGL</ulink> can
-reveal information about the video card in use, and high precision timing
-information can be used to <ulink
-url="http://w2spconf.com/2011/papers/jspriv.pdf">fingerprint the CPU and
-interpreter speed</ulink>. In the future, new JavaScript features such as
-<ulink url="http://w3c-test.org/webperf/specs/ResourceTiming/">Resource
-Timing</ulink> may leak an unknown amount of network timing related
-information.
-
-<!-- FIXME: resource-timing stuff? -->
-
- </para>
- </listitem>
-
- <listitem><command>Inserting Plugins</command>
- <para>
-
-The Panopticlick project found that the mere list of installed plugins (in
-navigator.plugins) was sufficient to provide a large degree of
-fingerprintability. Additionally, plugins are capable of extracting font lists,
-interface addresses, and other machine information that is beyond what the
-browser would normally provide to content. In addition, plugins can be used to
-store unique identifiers that are more difficult to clear than standard
-cookies. <ulink url="http://epic.org/privacy/cookies/flash.html">Flash-based
-cookies</ulink> fall into this category, but there are likely numerous other
-examples. Beyond fingerprinting, plugins are also abysmal at obeying the proxy
-settings of the browser.
-
-
- </para>
- </listitem>
- <listitem><command>Inserting CSS</command>
- <para>
-
-<ulink url="https://developer.mozilla.org/En/CSS/Media_queries">CSS media
-queries</ulink> can be inserted to gather information about the desktop size,
-widget size, display type, DPI, user agent type, and other information that
-was formerly available only to Javascript.
-
- </para>
- </listitem>
- </orderedlist>
- </listitem>
- <listitem id="website-traffic-fingerprinting"><command>Website traffic fingerprinting</command>
- <para>
-
-Website traffic fingerprinting is an attempt by the adversary to recognize the
-encrypted traffic patterns of specific websites. In the case of Tor, this
-attack would take place between the user and the Guard node, or at the Guard
-node itself.
- </para>
-
- <para> The most comprehensive study of the statistical properties of this
-attack against Tor was done by <ulink
-url="http://lorre.uni.lu/~andriy/papers/acmccs-wpes11-fingerprinting.pdf">Panchenko
-et al</ulink>. Unfortunately, the publication bias in academia has encouraged
-the production of a number of follow-on attack papers claiming "improved"
-success rates, in some cases even claiming to completely invalidate any
-attempt at defense. These "improvements" are actually enabled primarily by
-taking a number of shortcuts (such as classifying only very small numbers of
-web pages, neglecting to publish ROC curves or at least false positive rates,
-and/or omitting the effects of dataset size on their results). Despite these
-subsequent "improvements", we are skeptical of the efficacy of this attack in
-a real world scenario, <emphasis>especially</emphasis> in the face of any
-defenses.
-
- </para>
- <para>
-
-In general, with machine learning, as you increase the <ulink
-url="https://en.wikipedia.org/wiki/VC_dimension">number and/or complexity of
-categories to classify</ulink> while maintaining a limit on reliable feature
-information you can extract, you eventually run out of descriptive feature
-information, and either true positive accuracy goes down or the false positive
-rate goes up. This error is called the <ulink
-url="http://www.cs.washington.edu/education/courses/csep573/98sp/lectures/lectur…">bias
-in your hypothesis space</ulink>. In fact, even for unbiased hypothesis
-spaces, the number of training examples required to achieve a reasonable error
-bound is <ulink
-url="https://en.wikipedia.org/wiki/Probably_approximately_correct_learning#Equiv…">a
-function of the complexity of the categories</ulink> you need to classify.
-
- </para>
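The sample-complexity claim can be illustrated with the standard PAC bound for a finite hypothesis space: m ≥ (1/ε)(ln|H| + ln(1/δ)) examples suffice for a consistent learner to reach true error at most ε with probability at least 1−δ. The parameters below are illustrative only, not a model of any published website-fingerprinting classifier:

```python
import math

def pac_sample_bound(hypothesis_count, epsilon, delta):
    """Training examples sufficient for a consistent learner over a finite
    hypothesis space to reach true error <= epsilon with prob. >= 1 - delta."""
    return math.ceil((1 / epsilon) * (math.log(hypothesis_count) + math.log(1 / delta)))

# Illustrative: a small lab-sized page set vs. a web-scale "open world".
small = pac_sample_bound(hypothesis_count=10**3, epsilon=0.05, delta=0.05)
large = pac_sample_bound(hypothesis_count=10**9, epsilon=0.05, delta=0.05)
assert large > small  # more categories to classify -> more training data required
```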
- <para>
-
-
-In the case of this attack, the key factors that increase the classification
-complexity (and thus hinder a real world adversary who attempts this attack)
-are large numbers of dynamically generated pages, partially cached content,
-and also the non-web activity of the entire Tor network. This yields an effective
-number of "web pages" many orders of magnitude larger than even <ulink
-url="http://lorre.uni.lu/~andriy/papers/acmccs-wpes11-fingerprinting.pdf">Panchenko's
-"Open World" scenario</ulink>, which suffered a continuous, near-constant decline
-in the true positive rate as the "Open World" size grew (see figure 4). This
-large level of classification complexity is further confounded by a noisy and
-low resolution featureset - one which is also relatively easy for the defender
-to manipulate at low cost.
-
- </para>
- <para>
-
-To make matters worse for a real-world adversary, the ocean of Tor Internet
-activity (at least, when compared to a lab setting) makes it a certainty that
-an adversary attempting to examine large amounts of Tor traffic will ultimately
-be overwhelmed by false positives (even after making heavy tradeoffs on the
-ROC curve to minimize false positives to below 0.01%). This problem is known
-in the IDS literature as the <ulink
-url="http://www.raid-symposium.org/raid99/PAPERS/Axelsson.pdf">Base Rate
-Fallacy</ulink>, and it is the primary reason that anomaly and activity
-classification-based IDS and antivirus systems have failed to materialize in
-the marketplace (despite early success in academic literature).
-
- </para>
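The Base Rate Fallacy is easy to quantify with Bayes' rule. The numbers below are purely illustrative rather than measurements of any real classifier, but they show how a target page that is a tiny fraction of observed traffic yields mostly false alarms even at a 0.01% false positive rate:

```python
def precision(tpr, fpr, base_rate):
    """P(visit was really the target page | classifier fired), via Bayes' rule."""
    true_alarms = tpr * base_rate
    false_alarms = fpr * (1 - base_rate)
    return true_alarms / (true_alarms + false_alarms)

# Illustrative: 90% true positive rate, 0.01% false positive rate, and a
# target page accounting for 1 in 100,000 of all observed Tor traffic.
p = precision(tpr=0.90, fpr=0.0001, base_rate=1e-5)
# Despite the strong-looking classifier, under 10% of alerts are genuine.
assert p < 0.1
```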
- <para>
-
-Still, we do not believe that these issues are enough to dismiss the attack
-outright. But we do believe these factors make it both worthwhile and
-effective to <link linkend="traffic-fingerprinting-defenses">deploy
-light-weight defenses</link> that reduce the accuracy of this attack by
-further contributing noise to hinder successful feature extraction.
-
- </para>
- </listitem>
- <listitem><command>Remotely or locally exploit browser and/or
-OS</command>
- <para>
-
-Last, but definitely not least, the adversary can exploit either general
-browser vulnerabilities, plugin vulnerabilities, or OS vulnerabilities to
-install malware and surveillance software. An adversary with physical access
-can perform similar actions.
-
- </para>
- <para>
-
-For the purposes of the browser itself, we limit the scope of this adversary
-to one that has passive forensic access to the disk after browsing activity
-has taken place. This adversary motivates our
-<link linkend="disk-avoidance">Disk Avoidance</link> defenses.
-
- </para>
- <para>
-
-An adversary with arbitrary code execution typically has more power, though.
-It can be quite hard to meaningfully limit the capabilities of such an
-adversary. <ulink
-url="http://tails.boum.org/contribute/design/">The Tails system</ulink> can
-provide some defense against this adversary through the use of readonly media
-and frequent reboots, but even this can be circumvented on machines without
-Secure Boot through the use of BIOS rootkits.
-
- </para>
- </listitem>
- </orderedlist>
- </sect2>
-
-</sect1>
-
-<sect1 id="Implementation">
- <title>Implementation</title>
- <para>
-
-The Implementation section is divided into subsections, each of which
-corresponds to a <link linkend="DesignRequirements">Design Requirement</link>.
-Each subsection is divided into specific web technologies or properties. The
-implementation is then described for that property.
-
- </para>
- <para>
-
-In some cases, the implementation meets the design requirements in a non-ideal
-way (for example, by disabling features). In rare cases, there may be no
-implementation at all. Both of these cases are denoted by differentiating
-between the <command>Design Goal</command> and the <command>Implementation
-Status</command> for each property. Corresponding bugs in the <ulink
-url="https://trac.torproject.org/projects/tor/report">Tor bug tracker</ulink>
-are typically linked for these cases.
-
- </para>
- <sect2 id="proxy-obedience">
- <title>Proxy Obedience</title>
- <para>
-
-Proxy obedience is assured through the following:
- </para>
-<orderedlist>
- <listitem>Firefox proxy settings, patches, and build flags
- <para>
-Our <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/HEAD:/build-scripts/confi…">Firefox
-preferences file</ulink> sets the Firefox proxy settings to use Tor directly as a
-SOCKS proxy. It sets <command>network.proxy.socks_remote_dns</command>,
-<command>network.proxy.socks_version</command>,
-<command>network.proxy.socks_port</command>, and
-<command>network.dns.disablePrefetch</command>.
- </para>
- <para>
-
-We also patch Firefox in order to <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">prevent
-a DNS leak due to a WebSocket rate-limiting check</ulink>. As stated in the
-patch, we believe the direct DNS resolution performed by this check is in
-violation of the W3C standard, but <ulink
-url="https://bugzilla.mozilla.org/show_bug.cgi?id=751465">this DNS proxy leak
-remains present in stock Firefox releases</ulink>.
-
- </para>
- <para>
-
-During the transition to Firefox 17-ESR, a code audit was undertaken to verify
-that there were no system calls or XPCOM activity in the source tree that did
-not use the browser proxy settings. The only violation we found was that
-WebRTC was capable of creating UDP sockets and was compiled in by default. We
-subsequently disabled it using the Firefox build option
-<command>--disable-webrtc</command>.
-
- </para>
- <para>
-
-We have verified that these settings and patches properly proxy HTTPS, OCSP,
-HTTP, FTP, gopher (now defunct), DNS, SafeBrowsing Queries, all javascript
-activity, including HTML5 audio and video objects, addon updates, wifi
-geolocation queries, searchbox queries, XPCOM addon HTTPS/HTTP activity,
-WebSockets, and live bookmark updates. We have also verified that IPv6
-connections are not attempted, through the proxy or otherwise (Tor does not
-yet support IPv6). We have also verified that external protocol helpers, such
-as smb URLs and other custom protocol handlers, are all blocked.
-
- </para>
- <para>
-
-Numerous other third parties have also reviewed and tested the proxy settings
-and have provided test cases based on their work. See in particular <ulink
-url="http://decloak.net/">decloak.net</ulink>.
-
- </para>
-</listitem>
-
- <listitem>Disabling plugins
-
- <para>Plugins have the ability to make arbitrary OS system calls and <ulink
-url="http://decloak.net/">bypass proxy settings</ulink>. This includes
-the ability to make UDP sockets and send arbitrary data independent of the
-browser proxy settings.
- </para>
- <para>
-Torbutton disables plugins by using the
-<command>@mozilla.org/plugin/host;1</command> service to mark the plugin tags
-as disabled. This block can be undone through both the Torbutton Security UI,
-and the Firefox Plugin Preferences.
- </para>
- <para>
-If the user does enable plugins in this way, plugin-handled objects are still
-restricted from automatic load through Firefox's click-to-play preference
-<command>plugins.click_to_play</command>.
- </para>
- <para>
-In addition, to reduce any unproxied activity by arbitrary plugins at load
-time, and to reduce the fingerprintability of the installed plugin list, we
-also patch the Firefox source code to <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">prevent the load of any plugins except
-for Flash and Gnash</ulink>.
-
- </para>
- </listitem>
- <listitem>External App Blocking and Drag Event Filtering
- <para>
-
-External apps can be induced to load files that perform network activity.
-Unfortunately, there are cases where such apps can be launched automatically
-with little to no user input. In order to prevent this, Torbutton installs a
-component to <ulink
-url="https://gitweb.torproject.org/torbutton.git/blob_plain/HEAD:/src/components…">
-provide the user with a popup</ulink> whenever the browser attempts to launch
-a helper app.
-
- </para>
- <para>
-
-Additionally, modern desktops now pre-emptively fetch any URLs in Drag and
-Drop events as soon as the drag is initiated. This download happens
-independent of the browser's Tor settings, and can be triggered by something
-as simple as holding the mouse button down for slightly too long while
-clicking on an image link. We had to patch Firefox to <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">emit
-an observer event during dragging</ulink> to allow us to filter the drag
-events from Torbutton before the OS downloads the URLs the events contained.
-
- </para>
- </listitem>
- <listitem>Disabling system extensions and clearing the addon whitelist
- <para>
-
-Firefox addons can perform arbitrary activity on your computer, including
-bypassing Tor. It is for this reason we disable the addon whitelist
-(<command>xpinstall.whitelist.add</command>), so that users are prompted
-before installing addons regardless of the source. We also exclude
-system-level addons from the browser through the use of
-<command>extensions.enabledScopes</command> and
-<command>extensions.autoDisableScopes</command>.
-
- </para>
- </listitem>
- </orderedlist>
- </sect2>
- <sect2 id="state-separation">
- <title>State Separation</title>
- <para>
-
-Tor Browser State is separated from existing browser state through use of a
-custom Firefox profile, and by setting the $HOME environment variable to the
-root of the bundle's directory. The browser also does not load any
-system-wide extensions (through the use of
-<command>extensions.enabledScopes</command> and
-<command>extensions.autoDisableScopes</command>). Furthermore, plugins are
-disabled, which prevents Flash cookies from leaking from a pre-existing Flash
-directory.
-
- </para>
- </sect2>
- <sect2 id="disk-avoidance">
- <title>Disk Avoidance</title>
- <sect3>
- <title>Design Goal:</title>
- <blockquote>
-
-The User Agent MUST (at user option) prevent all disk records of browser activity.
-The user should be able to optionally enable URL history and other history
-features if they so desire.
-
- </blockquote>
- </sect3>
- <sect3>
- <title>Implementation Status:</title>
- <blockquote>
-
-We achieve this goal through several mechanisms. First, we set the Firefox
-Private Browsing preference
-<command>browser.privatebrowsing.autostart</command>. In addition, four Firefox patches are needed to prevent disk writes, even if
-Private Browsing Mode is enabled. We need to
-
-<ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">prevent
-the permissions manager from recording HTTPS STS state</ulink>,
-<ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">prevent
-intermediate SSL certificates from being recorded</ulink>,
-<ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">prevent
-download history from being recorded</ulink>, and
-<ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">prevent
-the content preferences service from recording site zoom</ulink>.
-
-For more details on these patches, <link linkend="firefox-patches">see the
-Firefox Patches section</link>.
-
- </blockquote>
- <blockquote>
-
-As an additional defense-in-depth measure, we set the following preferences:
-<command>browser.cache.disk.enable</command>,
-<command>browser.cache.offline.enable</command>,
-<command>dom.indexedDB.enabled</command>,
-<command>network.cookie.lifetimePolicy</command>,
-<command>signon.rememberSignons</command>,
-<command>browser.formfill.enable</command>,
-<command>browser.download.manager.retention</command>,
-and <command>browser.sessionstore.privacy_level</command>. Many of these
-preferences are likely redundant with
-<command>browser.privatebrowsing.autostart</command>, but we have not done the
-auditing work to ensure that yet.
-
- </blockquote>
- <blockquote>
-
-Torbutton also <ulink
-url="https://gitweb.torproject.org/torbutton.git/blob/HEAD:/src/components/tbSes…">contains
-code</ulink> to prevent the Firefox session store from writing to disk.
- </blockquote>
- <blockquote>
-
-For more details on disk leak bugs and enhancements, see the <ulink
-url="https://trac.torproject.org/projects/tor/query?keywords=~tbb-disk-leak&…">tbb-disk-leak tag in our bugtracker</ulink>.
- </blockquote>
- </sect3>
- </sect2>
- <sect2 id="app-data-isolation">
- <title>Application Data Isolation</title>
- <para>
-
-Tor Browser Bundle MUST NOT cause any information to be written outside of the
-bundle directory. This is to ensure that the user is able to completely and
-safely remove the bundle without leaving other traces of Tor usage on their
-computer.
-
- </para>
- <para>
-
-To ensure TBB directory isolation, we set
-<command>browser.download.useDownloadDir</command>,
-<command>browser.shell.checkDefaultBrowser</command>, and
-<command>browser.download.manager.addToRecentDocs</command>. We also set the
-$HOME environment variable to be the TBB extraction directory.
- </para>
-
- </sect2>
-<!-- FIXME: Write me...
- <sect2 id="update-safety">
- <title>Update Safety</title>
- <para>FIXME: Write me..
- </para>
- </sect2>
--->
- <sect2 id="identifier-linkability">
- <title>Cross-Origin Identifier Unlinkability</title>
- <!-- FIXME: Mention web-send?? -->
- <para>
-
-The Tor Browser MUST prevent a user's activity on one site from being linked
-to their activity on another site. When this goal cannot yet be met with an
-existing web technology, that technology or functionality is disabled. Our
-<link linkend="privacy">design goal</link> is to ultimately eliminate the need to disable arbitrary
-technologies, and instead simply alter them in ways that allow them to
-function in a backwards-compatible way while avoiding linkability. Users
-should be able to use federated login of various kinds to explicitly inform
-sites who they are, but that information should not transparently allow a
-third party to record their activity from site to site without their prior
-consent.
-
- </para>
- <para>
-
-The benefit of this approach comes not only in the form of reduced
-linkability, but also in terms of simplified privacy UI. If all stored browser
-state and permissions become associated with the url bar origin, the six or
-seven different pieces of privacy UI governing these identifiers and
-permissions can become just one piece of UI. For instance, a window that lists
-the url bar origins for which browser state exists, possibly with a
-context-menu option to drill down into specific types of state or permissions.
-An example of this simplification can be seen in Figure 1.
-
- </para>
- <figure><title>Improving the Privacy UI</title>
- <mediaobject>
- <imageobject>
- <imagedata align="center" fileref="NewCookieManager.png"/>
- </imageobject>
- </mediaobject>
- <caption> <para/>
-
-This example UI is a mock-up of how isolating identifiers to the URL bar
-origin can simplify the privacy UI for all data - not just cookies. Once
-browser identifiers and site permissions operate on a url bar basis, the same
-privacy window can represent browsing history, DOM Storage, HTTP Auth, search
-form history, login values, and so on within a context menu for each site.
-
-</caption>
- </figure>
- <orderedlist>
- <listitem>Cookies
- <para><command>Design Goal:</command>
-
-All cookies MUST be double-keyed to the url bar origin and third-party
-origin. There exists a <ulink
-url="https://bugzilla.mozilla.org/show_bug.cgi?id=565965">Mozilla bug</ulink>
-that contains a prototype patch, but it lacks UI, and does not apply to modern
-Firefoxes.
-
- </para>
- <para><command>Implementation Status:</command>
-
-As a stopgap to satisfy our design requirement of unlinkability, we currently
-entirely disable 3rd party cookies by setting
-<command>network.cookie.cookieBehavior</command> to 1. We would prefer that
-third party content continue to function, but we believe the requirement for
-unlinkability trumps that desire.
-
- </para>
- </listitem>
- <listitem>Cache
- <para>
-
-Cache is isolated to the url bar origin by using a technique pioneered by
-Colin Jackson et al, via their work on <ulink
-url="http://www.safecache.com/">SafeCache</ulink>. The technique re-uses the
-<ulink
-url="https://developer.mozilla.org/en/XPCOM_Interface_Reference/nsICachingChannel">nsICachingChannel.cacheKey</ulink>
-attribute that Firefox uses internally to prevent improper caching and reuse
-of HTTP POST data.
-
- </para>
- <para>
-
-However, to <ulink
-url="https://trac.torproject.org/projects/tor/ticket/3666">increase the
-security of the isolation</ulink> and to <ulink
-url="https://trac.torproject.org/projects/tor/ticket/3754">solve conflicts
-with OCSP relying on the cacheKey property for reuse of POST requests</ulink>, we
-had to <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">patch
-Firefox to provide a cacheDomain cache attribute</ulink>. We use the fully
-qualified url bar domain as input to this field, to avoid the complexities
-of heuristically determining the second-level DNS name.
-
- </para>
- <para>
-
-<!-- FIXME: This could use a few more specifics.. Maybe. The Chrome folks
-won't care, but the Mozilla folks might. --> Furthermore, we chose a different
-isolation scheme than the Stanford implementation. First, we decoupled the
-cache isolation from the third party cookie attribute. Second, we use several
-mechanisms to attempt to determine the actual location attribute of the
-top-level window (to obtain the url bar FQDN) used to load the page, as
-opposed to relying solely on the Referer property.
-
- </para>
- <para>
-
-Therefore, <ulink
-url="http://crypto.stanford.edu/sameorigin/safecachetest.html">the original
-Stanford test cases</ulink> are expected to fail. Functionality can still be
-verified by navigating to <ulink url="about:cache">about:cache</ulink> and
-viewing the key used for each cache entry. Each third party element should
-have an additional "domain=string" property prepended, which will list the
-FQDN that was used to source the third party element.
-
- </para>
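The resulting key scheme can be sketched as follows. This is a simplified model of the behavior described above, not the actual Firefox implementation, and the key format is only an approximation of what about:cache displays:

```python
def isolated_cache_key(urlbar_fqdn, resource_url, first_party):
    """Model of per-url-bar-domain cache isolation: third-party elements get
    a 'domain=' prefix naming the FQDN of the page that sourced them."""
    if first_party:
        return resource_url
    return "domain=%s:%s" % (urlbar_fqdn, resource_url)

# The same third-party resource loaded from two different sites maps to two
# distinct cache entries, so cache hits cannot link the two visits:
k1 = isolated_cache_key("siteX.com", "http://ads.example/pixel.png", first_party=False)
k2 = isolated_cache_key("siteY.com", "http://ads.example/pixel.png", first_party=False)
assert k1 != k2
```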
- <para>
-
-Additionally, because the image cache is a separate entity from the content
-cache, we had to patch Firefox to also <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">isolate
-this cache per url bar domain</ulink>.
-
- </para>
- </listitem>
- <listitem>HTTP Auth
- <para>
-
-HTTP authentication tokens are removed for third party elements using the
-<ulink
-url="https://developer.mozilla.org/en/Setting_HTTP_request_headers#Observers">http-on-modify-request
-observer</ulink> to remove the Authorization headers to prevent <ulink
-url="http://jeremiahgrossman.blogspot.com/2007/04/tracking-users-without-cookies…">silent
-linkability between domains</ulink>.
- </para>
- </listitem>
- <listitem>DOM Storage
- <para>
-
-DOM storage for third party domains MUST be isolated to the url bar origin,
-to prevent linkability between sites. This functionality is provided through a
-<ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">patch
-to Firefox</ulink>.
-
- </para>
- </listitem>
- <listitem>Flash cookies
- <para><command>Design Goal:</command>
-
-Users should be able to click-to-play flash objects from trusted sites. To
-make this behavior unlinkable, we wish to include a settings file for all platforms that disables flash
-cookies using the <ulink
-url="http://www.macromedia.com/support/documentation/en/flashplayer/help/setting…">Flash
-settings manager</ulink>.
-
- </para>
- <para><command>Implementation Status:</command>
-
-We are currently <ulink
-url="https://trac.torproject.org/projects/tor/ticket/3974">having
-difficulties</ulink> causing Flash player to use this settings
-file on Windows, so Flash remains difficult to enable.
-
- </para>
- </listitem>
- <listitem>SSL+TLS session resumption, HTTP Keep-Alive and SPDY
- <para><command>Design Goal:</command>
-
-TLS session resumption tickets and SSL Session IDs MUST be limited to the url
-bar origin. HTTP Keep-Alive connections from a third party in one url bar
-origin MUST NOT be reused for that same third party in another url bar origin.
-
- </para>
- <para><command>Implementation Status:</command>
-
-We currently clear SSL Session IDs upon <link linkend="new-identity">New
-Identity</link>, and we disable TLS Session Tickets via the Firefox Pref
-<command>security.enable_tls_session_tickets</command>. We disable SSL Session
-IDs via a <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">patch
-to Firefox</ulink>. To compensate for the increased round trip latency from disabling
-these performance optimizations, we also enable
-<ulink url="https://tools.ietf.org/html/draft-bmoeller-tls-falsestart-00">TLS
-False Start</ulink> via the Firefox Pref
-<command>security.ssl.enable_false_start</command>.
- </para>
- <para>
-
-Because of the extreme performance benefits of HTTP Keep-Alive for interactive
-web apps, and because of the difficulties of conveying urlbar origin
-information down into the Firefox HTTP layer, as a compromise we currently
-merely reduce the HTTP Keep-Alive timeout to 20 seconds (which is measured
-from the last packet read on the connection) using the Firefox preference
-<command>network.http.keep-alive.timeout</command>.
-
- </para>
- <para>
-However, because SPDY can store identifiers and has extremely long keepalive
-duration, it is disabled through the Firefox preference
-<command>network.http.spdy.enabled</command>.
- </para>
- </listitem>
- <listitem>Automated cross-origin redirects MUST NOT store identifiers
- <para><command>Design Goal:</command>
-
-To prevent attacks aimed at subverting the Cross-Origin Identifier
-Unlinkability <link linkend="privacy">privacy requirement</link>, the browser
-MUST NOT store any identifiers (cookies, cache, DOM storage, HTTP auth, etc)
-for cross-origin redirect intermediaries that do not prompt for user input.
-For example, if a user clicks on a bit.ly url that redirects to a
-doubleclick.net url that finally redirects to a cnn.com url, only cookies from
-cnn.com should be retained after the redirect chain completes.
-
- </para>
- <para>
-
-Non-automated redirect chains that require user input at some step (such as
-federated login systems) SHOULD still allow identifiers to persist.
-
- </para>
- <para><command>Implementation status:</command>
-
-There are numerous ways for the user to be redirected, and the Firefox API
-support to detect each of them is poor. We have a <ulink
-url="https://trac.torproject.org/projects/tor/ticket/3600">trac bug
-open</ulink> to implement what we can.
-
- </para>
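- <para>
-
-The intended policy can be sketched as follows. This is our illustration of
-the design goal, not the Firefox implementation; the function and field names
-are ours. Only the final origin in an automated redirect chain may retain
-identifiers, while hops that required user input are exempt:
-
- </para>

```javascript
// Sketch of the redirect-identifier policy described above (names are ours).
// chain: hops in redirect order, e.g. [{origin: "bit.ly", userInput: false}, ...]
function originsAllowedToStoreIdentifiers(chain) {
  const last = chain[chain.length - 1];
  return chain
    .filter(hop => hop === last || hop.userInput) // final hop, or user-driven hop
    .map(hop => hop.origin);
}

const chain = [
  { origin: "bit.ly",          userInput: false },
  { origin: "doubleclick.net", userInput: false },
  { origin: "cnn.com",         userInput: false },
];
console.log(originsAllowedToStoreIdentifiers(chain)); // [ 'cnn.com' ]
```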
- </listitem>
- <listitem>window.name
- <para>
-
-<ulink
-url="https://developer.mozilla.org/En/DOM/Window.name">window.name</ulink> is
-a magical DOM property that for some reason is allowed to retain a persistent value
-for the lifespan of a browser tab. It is possible to utilize this property for
-<ulink url="http://www.thomasfrank.se/sessionvars.html">identifier
-storage</ulink>.
-
- </para>
- <para>
-
-In order to eliminate non-consensual linkability but still allow for sites
-that utilize this property to function, we reset the window.name property of
-tabs in Torbutton every time we encounter a blank Referer. This behavior
-allows window.name to persist for the duration of a click-driven navigation
-session, but as soon as the user enters a new URL or navigates between
-https/http schemes, the property is cleared.
-
- </para>
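- <para>
-
-The reset rule can be sketched as below. This is an illustration of the
-behavior, not the Torbutton source (which hooks Firefox's progress
-listeners); the function and object names are ours:
-
- </para>

```javascript
// A blank Referer marks the start of a new navigation session, so the tab's
// stored window.name value is cleared at that point (illustrative sketch).
function onPageLoad(tab, referer) {
  if (!referer) {
    tab.windowName = ""; // new session: drop any identifier stored here
  }
}

const tab = { windowName: "tracker-id-1234" };
onPageLoad(tab, "https://example.com/page1"); // click-driven: name persists
onPageLoad(tab, "");                          // typed URL: name is cleared
```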
- </listitem>
- <listitem>Auto form-fill
- <para>
-
-We disable the password saving functionality in the browser as part of our
-<link linkend="disk-avoidance">Disk Avoidance</link> requirement. However,
-since users may decide to re-enable disk history records and password saving,
-we also set the <ulink
-url="http://kb.mozillazine.org/Signon.autofillForms">signon.autofillForms</ulink>
-preference to false to prevent saved values from immediately populating
-fields upon page load. Since Javascript can read these values as soon as they
-appear, setting this preference prevents automatic linkability from stored passwords.
-
- </para>
- </listitem>
- <listitem>HSTS supercookies
- <para>
-
-An extreme (but not impossible) attack to mount is the creation of <ulink
-url="http://www.leviathansecurity.com/blog/archives/12-The-Double-Edged-Sword-of…">HSTS
-supercookies</ulink>. Since HSTS effectively stores one bit of information per domain
-name, an adversary in possession of numerous domains can use them to construct
-cookies based on stored HSTS state.
-
- </para>
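- <para>
-
-The attack can be simulated in a few lines. The simulation below is ours, for
-illustration: the adversary sets HSTS only on the domains whose identifier
-bit is 1, and later reads each bit back by observing whether the browser
-upgrades that domain to HTTPS:
-
- </para>

```javascript
// Simulation of an HSTS supercookie: N attacker domains encode an N-bit ID.
const domains = ["d0.evil.com", "d1.evil.com", "d2.evil.com", "d3.evil.com"];

function writeId(hstsStore, id) {
  domains.forEach((d, i) => {
    if (id & (1 << i)) hstsStore.add(d); // 1-bit: pin this domain
  });
}

function readId(hstsStore) {
  // A real attacker probes http:// URLs and watches for HTTPS upgrades.
  return domains.reduce((id, d, i) => id | (hstsStore.has(d) ? 1 << i : 0), 0);
}

const hsts = new Set(); // stands in for the browser's HSTS state
writeId(hsts, 10);      // assign this visitor ID 10
console.log(readId(hsts)); // → 10, recovered on a later visit
```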
- <para><command>Design Goal:</command>
-
-There appear to be three options for us: 1. Disable HSTS entirely, and rely
-instead on HTTPS-Everywhere to crawl and ship rules for HSTS sites. 2.
-Restrict the number of HSTS-enabled third parties allowed per url bar origin.
-3. Prevent third parties from storing HSTS rules. We have not yet decided upon
-the best approach.
-
- </para>
- <para><command>Implementation Status:</command> Currently, HSTS state is
-cleared by <link linkend="new-identity">New Identity</link>, but we don't
-defend against the creation of these cookies between <command>New
-Identity</command> invocations.
- </para>
- </listitem>
- <listitem>Exit node usage
- <para><command>Design Goal:</command>
-
-Every distinct navigation session (as defined by a non-blank Referer header)
-MUST exit through a fresh Tor circuit in Tor Browser to prevent exit node
-observers from linking concurrent browsing activity.
-
- </para>
- <para><command>Implementation Status:</command>
-
-The Tor feature that supports this ability only exists in the 0.2.3.x-alpha
-series. <ulink
-url="https://trac.torproject.org/projects/tor/ticket/3455">Ticket
-#3455</ulink> is the Torbutton ticket to make use of the new Tor
-functionality.
-
- </para>
- </listitem>
- </orderedlist>
- <para>
-For more details on identifier linkability bugs and enhancements, see the <ulink
-url="https://trac.torproject.org/projects/tor/query?keywords=~tbb-linkability&am…">tbb-linkability tag in our bugtracker</ulink>
- </para>
- </sect2>
- <sect2 id="fingerprinting-linkability">
- <title>Cross-Origin Fingerprinting Unlinkability</title>
- <para>
-
-In order to properly address the fingerprinting adversary on a technical
-level, we need a metric to measure linkability of the various browser
-properties beyond any stored origin-related state. <ulink
-url="https://panopticlick.eff.org/about.php">The Panopticlick Project</ulink>
-by the EFF provides us with a prototype of such a metric. The researchers
-conducted a survey of volunteers who were asked to visit an experiment page
-that harvested many of the above components. They then computed the Shannon
-Entropy of the resulting distribution of each of several key attributes to
-determine how many bits of identifying information each attribute provided.
-
- </para>
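- <para>
-
-The metric itself is straightforward to compute. The sketch below (ours)
-shows the Shannon entropy of an attribute's observed distribution, which
-gives the bits of identifying information that attribute leaks:
-
- </para>

```javascript
// Shannon entropy H = -sum(p * log2(p)) over an attribute's value counts.
function shannonEntropyBits(counts) {
  const total = counts.reduce((a, b) => a + b, 0);
  return counts.reduce((h, c) => {
    if (c === 0) return h;
    const p = c / total;
    return h - p * Math.log2(p);
  }, 0);
}

// Four equally common attribute values → 2 bits of identifying information.
console.log(shannonEntropyBits([25, 25, 25, 25])); // → 2
```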
- <para>
-
-Many browser features have been added since the EFF first ran their experiment
-and collected their data. To avoid an infinite sinkhole, we limit our
-fingerprinting resistance efforts to reducing the fingerprintable differences
-<emphasis>among</emphasis> Tor Browser users. We
-do not believe it is possible to solve cross-browser fingerprinting issues.
-
- </para>
- <para>
-
-Unfortunately, the unsolvable nature of the cross-browser fingerprinting
-problem means that the Panopticlick test website itself is not useful for
-evaluating the actual effectiveness of our defenses, or the fingerprinting
-defenses of any other web browser. Because the Panopticlick dataset is based
-on browser data spanning a number of widely deployed browsers over a number of
-years, any fingerprinting defenses attempted by browsers today are very likely
-to cause Panopticlick to report an <emphasis>increase</emphasis> in
-fingerprintability and entropy, because those defenses will stand out in sharp
-contrast to historical data. We have been <ulink
-url="https://trac.torproject.org/projects/tor/ticket/6119">working to convince
-the EFF</ulink> that it is worthwhile to release the source code to
-Panopticlick to allow us to run our own version for this reason.
-
- </para>
- <sect3 id="fingerprinting-defenses">
- <title>Fingerprinting defenses in the Tor Browser</title>
-
- <orderedlist>
- <listitem>Plugins
- <para>
-
-Plugins add to fingerprinting risk via two main vectors: their mere presence in
-window.navigator.plugins, as well as their internal functionality.
-
- </para>
- <para><command>Design Goal:</command>
-
-All plugins that have not been specifically audited or sandboxed MUST be
-disabled. To reduce linkability potential, even sandboxed plugins should not
-be allowed to load objects until the user has clicked through a click-to-play
-barrier. Additionally, version information should be reduced or obfuscated
-until the plugin object is loaded. For Flash, we wish to <ulink
-url="https://trac.torproject.org/projects/tor/ticket/3974">provide a
-settings.sol file</ulink> to disable Flash cookies, and to restrict P2P
-features that are likely to bypass proxy settings.
-
- </para>
- <para><command>Implementation Status:</command>
-
-Currently, we entirely disable all plugins in Tor Browser. However, as a
-compromise due to the popularity of Flash, we allow users to re-enable Flash,
-and Flash objects are blocked behind a click-to-play barrier that is available
-only after the user has specifically enabled plugins. Flash is the only plugin
-available; the rest are <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">entirely
-blocked from loading by a Firefox patch</ulink>. We also set the Firefox
-preference <command>plugin.expose_full_path</command> to false, to avoid
-leaking plugin installation information.
-
- </para>
- </listitem>
- <listitem>HTML5 Canvas Image Extraction
- <para>
-
-The <ulink url="https://developer.mozilla.org/en-US/docs/HTML/Canvas">HTML5
-Canvas</ulink> is a feature that has been added to major browsers after the
-EFF developed their Panopticlick study. After plugins and plugin-provided
-information, we believe that the HTML5 Canvas is the single largest
-fingerprinting threat browsers face today. <ulink
-url="http://www.w2spconf.com/2012/papers/w2sp12-final4.pdf">Initial
-studies</ulink> show that the Canvas can provide an easy-access fingerprinting
-target: The adversary simply renders WebGL, font, and named color data to a
-Canvas element, extracts the image buffer, and computes a hash of that image
-data. Subtle differences in the video card, font packs, and even font and
-graphics library versions allow the adversary to produce a stable, simple,
-high-entropy fingerprint of a computer. In fact, the hash of the rendered
-image can be used almost identically to a tracking cookie by the web server.
-
- </para>
- <para>
-
-To reduce the threat from this vector, we have patched Firefox to <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">prompt
-before returning valid image data</ulink> to the Canvas APIs. If the user
-hasn't previously allowed the site in the URL bar to access Canvas image data,
-pure white image data is returned to the Javascript APIs.
-
- </para>
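- <para>
-
-The effect of the defense can be sketched as below. This is our illustration
-of the behavior, not the actual Firefox patch; the function names are ours.
-Unless the user has allowed the url bar site, every user's Canvas read hashes
-to the same value:
-
- </para>

```javascript
// Sketch: Canvas reads return uniform white pixels unless the first-party
// site was explicitly allowed by the user.
function readCanvasPixels(realPixels, firstParty, allowedSites) {
  if (allowedSites.has(firstParty)) {
    return realPixels; // user granted this site access to real image data
  }
  return new Uint8Array(realPixels.length).fill(255); // pure white RGBA
}

const real = new Uint8Array([12, 34, 56, 255]); // one fingerprintable pixel
const allowed = new Set(["example.com"]);
console.log(readCanvasPixels(real, "tracker.net", allowed)); // all 255s
console.log(readCanvasPixels(real, "example.com", allowed)); // real data
```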
- </listitem>
- <listitem>WebGL
- <para>
-
-WebGL is fingerprintable both through information that is exposed about the
-underlying driver and optimizations, as well as through performance
-fingerprinting.
-
- </para>
- <para>
-
-Because of the large amount of potential fingerprinting vectors and the <ulink
-url="http://www.contextis.com/resources/blog/webgl/">previously unexposed
-vulnerability surface</ulink>, we deploy a similar strategy against WebGL as
-for plugins. First, WebGL Canvases have click-to-play placeholders (provided
-by NoScript), and do not run until authorized by the user. Second, we
-obfuscate driver information by setting the Firefox preferences
-<command>webgl.disable-extensions</command> and
-<command>webgl.min_capability_mode</command>, which reduce the information
-provided by the following WebGL API calls: <command>getParameter()</command>,
-<command>getSupportedExtensions()</command>, and
-<command>getExtension()</command>.
-
- </para>
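- <para>
-
-As a prefs.js fragment, the driver-obfuscation settings above are:
-
- </para>

```javascript
// prefs.js sketch of the WebGL settings described above.
user_pref("webgl.disable-extensions", true);  // hide the extension list
user_pref("webgl.min_capability_mode", true); // report only minimum capabilities
```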
- </listitem>
- <listitem>Fonts
- <para>
-
-According to the Panopticlick study, fonts provide the most linkability when
-they are provided as an enumerable list in filesystem order, via either the
-Flash or Java plugins. However, it is still possible to use CSS and/or
-Javascript to query for the existence of specific fonts. With a large enough
-pre-built list to query, a large amount of fingerprintable information may
-still be available.
-
- </para>
- <para>
-
-The sure-fire way to address font linkability is to ship the browser with a
-font for every language, typeface, and style in use in the world, and to only
-use those fonts to the exclusion of system fonts. However, this set may be
-impractically large. It is possible that a smaller <ulink
-url="https://secure.wikimedia.org/wikipedia/en/wiki/Unicode_typeface#List_of_Uni…">common
-subset</ulink> may be found that provides total coverage. However, we believe
-that with strong url bar origin identifier isolation, a simpler approach can reduce the
-number of bits available to the adversary while avoiding the rendering and
-language issues of supporting a global font set.
-
- </para>
- <para><command>Implementation Status:</command>
-
-We disable plugins, which prevents font enumeration. Additionally, we limit
-both the number of font queries from CSS, as well as the total number of
-fonts that can be used in a document <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">with
-a Firefox patch</ulink>. We create two prefs,
-<command>browser.display.max_font_attempts</command> and
-<command>browser.display.max_font_count</command> for this purpose. Once these
-limits are reached, the browser behaves as if
-<command>browser.display.use_document_fonts</command> was set. We are
-still working to determine optimal values for these prefs.
-
- </para>
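- <para>
-
-The limiting behavior can be simulated as below. This sketch is ours, for
-illustration: after a fixed number of distinct font queries, further queries
-resolve to a default font, as if document fonts were disabled:
-
- </para>

```javascript
// Simulation of max_font_attempts-style limiting (names and values are ours).
function makeFontResolver(installedFonts, maxAttempts, defaultFont) {
  const tried = new Set();
  return function resolve(requested) {
    if (!tried.has(requested) && tried.size >= maxAttempts) {
      return defaultFont; // limit reached: stop leaking font presence
    }
    tried.add(requested);
    return installedFonts.has(requested) ? requested : defaultFont;
  };
}

const resolve = makeFontResolver(new Set(["Arial", "Courier"]), 2, "DejaVu");
console.log(resolve("Arial"));     // → Arial   (1st distinct query)
console.log(resolve("Courier"));   // → Courier (2nd distinct query)
console.log(resolve("Wingdings")); // → DejaVu  (limit reached)
```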
- <para>
-
-To improve rendering, we exempt remote <ulink
-url="https://developer.mozilla.org/en-US/docs/CSS/@font-face">@font-face
-fonts</ulink> from these counts, and if a font-family CSS rule lists a remote
-font (in any order), we use that font instead of any of the named local fonts.
-
- </para>
- </listitem>
- <listitem>Desktop resolution, CSS Media Queries, and System Colors
- <para>
-
-Both CSS and Javascript have access to a lot of information about the screen
-resolution, usable desktop size, OS widget size, toolbar size, title bar size,
-system theme colors, and other desktop features that are not at all relevant
-to rendering and serve only to provide information for fingerprinting.
-
- </para>
- <para><command>Design Goal:</command>
-
-Our design goal here is to reduce the resolution information down to the bare
-minimum required for properly rendering inside a content window. We intend to
-report all rendering information correctly with respect to the size and
-properties of the content window, but report an effective size of 0 for all
-border material, and also report that the desktop is only as big as the
-inner content window. Additionally, new browser windows are sized such that
-their content windows are one of a few fixed sizes based on the user's
-desktop resolution.
-
- </para>
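- <para>
-
-The sizing strategy can be sketched as below. The 200x100 granularity is
-illustrative only; the actual Torbutton bucket sizes may differ. Rounding the
-content window down to a coarse grid ensures many users report identical
-sizes:
-
- </para>

```javascript
// Sketch: round the new content window down to a coarse bucket of the
// desktop resolution (granularity here is our illustrative choice).
function contentWindowSize(desktopW, desktopH) {
  const width = Math.max(200, Math.floor(desktopW / 200) * 200);
  const height = Math.max(100, Math.floor(desktopH / 100) * 100);
  return { width, height };
}

console.log(contentWindowSize(1366, 768)); // → { width: 1200, height: 700 }
```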
- <para><command>Implementation Status:</command>
-
-We have implemented the above strategy using a window observer to <ulink
-url="https://gitweb.torproject.org/torbutton.git/blob/HEAD:/src/chrome/content/t…">resize
-new windows based on desktop resolution</ulink>. Additionally, we patch
-Firefox to use the client content window size <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">for
-window.screen</ulink> and <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">for
-CSS Media Queries</ulink>. Similarly, we <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">patch
-DOM events to return content window relative points</ulink>. We also patch
-Firefox to <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">report
-a fixed set of system colors to content window CSS</ulink>.
-
- </para>
- <para>
-
-To further reduce resolution-based fingerprinting, we are <ulink
-url="https://trac.torproject.org/projects/tor/ticket/7256">investigating
-zoom/viewport-based mechanisms</ulink> that might allow us to always report
-the same desktop resolution regardless of the actual size of the content
-window, and simply scale to make up the difference. However, the complexity
-and rendering impact of such a change is not yet known.
-
- </para>
- </listitem>
- <listitem>User Agent and HTTP Headers
- <para><command>Design Goal:</command>
-
-All Tor Browser users MUST provide websites with an identical user agent and
-HTTP header set for a given request type. We omit the Firefox minor revision,
-and report a popular Windows platform. If the software is kept up to date,
-these headers should remain identical across the population even when updated.
-
- </para>
- <para><command>Implementation Status:</command>
-
-Firefox provides several options for controlling the browser user agent string
-which we leverage. We also set similar prefs for controlling the
-Accept-Language and Accept-Charset headers, which we spoof to English by default. Additionally, we
-<ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">remove
-content script access</ulink> to Components.interfaces, which <ulink
-url="http://pseudo-flaw.net/tor/torbutton/fingerprint-firefox.html">can be
-used</ulink> to fingerprint OS, platform, and Firefox minor version. </para>
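- <para>
-
-As a prefs.js fragment, this header spoofing looks roughly like the
-following. The pref names are standard Firefox preferences; the UA string
-shown is an example of the spoofed-Windows pattern described above, not
-necessarily the exact shipped value:
-
- </para>

```javascript
// prefs.js sketch of the header-uniformity settings described above.
user_pref("general.useragent.override",
          "Mozilla/5.0 (Windows NT 6.1; rv:17.0) Gecko/17.0 Firefox/17.0");
user_pref("intl.accept_languages", "en-us, en"); // spoof English Accept-Language
```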
-
- </listitem>
- <listitem>Timezone and clock offset
- <para><command>Design Goal:</command>
-
-All Tor Browser users MUST report the same timezone to websites. Currently, we
-choose UTC for this purpose, although an equally valid argument could be made
-for EDT/EST due to the large English-speaking population density (coupled with
-the fact that we spoof a US English user agent). Additionally, the Tor
-software should detect if the user's clock is significantly divergent from the
-clocks of the relays that it connects to, and use this to reset the clock
-values used in Tor Browser to something reasonably accurate.
-
- </para>
- <para><command>Implementation Status:</command>
-
-We set the timezone using the TZ environment variable, which is supported on
-all platforms. Additionally, we plan to <ulink
-url="https://trac.torproject.org/projects/tor/ticket/3652">obtain a clock
-offset from Tor</ulink>, but this won't be available until Tor 0.2.3.x is in
-use.
-
- </para>
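- <para>
-
-The TZ mechanism is easy to demonstrate. The sketch below is runnable in
-Node on POSIX systems, where setting TZ makes the process report UTC to any
-code that asks for the local time:
-
- </para>

```javascript
// Setting TZ before date operations normalizes the reported timezone to UTC,
// which is what content Javascript would observe via Date.
process.env.TZ = "UTC";
const offsetMinutes = new Date().getTimezoneOffset();
console.log(offsetMinutes); // → 0 (UTC reports no offset from UTC)
```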
- </listitem>
- <listitem>Javascript performance fingerprinting
- <para>
-
-<ulink url="http://w2spconf.com/2011/papers/jspriv.pdf">Javascript performance
-fingerprinting</ulink> is the act of profiling the performance
-of various Javascript functions for the purpose of fingerprinting the
-Javascript engine and the CPU.
-
- </para>
- <para><command>Design Goal:</command>
-
-We have <ulink
-url="https://trac.torproject.org/projects/tor/ticket/3059">several potential
-mitigation approaches</ulink> to reduce the accuracy of performance
-fingerprinting without risking too much damage to functionality. Our current
-favorite is to reduce the resolution of the Event.timeStamp and the Javascript
-Date() object, while also introducing jitter. Our goal is to increase the
-amount of time it takes to mount a successful attack. <ulink
-url="http://w2spconf.com/2011/papers/jspriv.pdf">Mowery et al</ulink> found that
-even with the default precision in most browsers, they required up to 120
-seconds of amortization and repeated trials to get stable results from their
-feature set. We intend to work with the research community to establish the
-optimum trade-off between quantization+jitter and amortization time.
-
-
- </para>
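- <para>
-
-The quantization+jitter idea can be sketched as below. The parameters are
-ours, chosen only for illustration: timestamps are clamped to a coarse grid,
-then bounded random jitter is added so that an attacker needs many repeated
-trials to recover finer resolution:
-
- </para>

```javascript
// Degrade a millisecond timestamp: quantize to a grid, then add jitter.
function degradeTimestamp(ms, quantumMs = 100, jitterMs = 10) {
  const quantized = Math.floor(ms / quantumMs) * quantumMs;
  return quantized + Math.floor(Math.random() * jitterMs);
}

// Any input in [1234500, 1234600) maps to 1234500 plus up to 9ms of jitter.
console.log(degradeTimestamp(1234567));
```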
- <para><command>Implementation Status:</command>
-
-Currently, the only mitigation against performance fingerprinting is to
-disable <ulink url="http://www.w3.org/TR/navigation-timing/">Navigation
-Timing</ulink> through the Firefox preference
-<command>dom.enable_performance</command>.
-
- </para>
- </listitem>
- <listitem>Non-Uniform HTML5 API Implementations
- <para>
-
-At least two HTML5 features have different implementation status across the
-major OS vendors: the <ulink
-url="https://developer.mozilla.org/en-US/docs/DOM/window.navigator.battery">Battery
-API</ulink> and the <ulink
-url="https://developer.mozilla.org/en-US/docs/DOM/window.navigator.connection">Network
-Connection API</ulink>. We disable these APIs
-through the Firefox preferences <command>dom.battery.enabled</command> and
-<command>dom.network.enabled</command>.
-
- </para>
- </listitem>
- <listitem>Keystroke fingerprinting
- <para>
-
-Keystroke fingerprinting is the act of measuring key strike time and key
-flight time. It is seeing increasing use as a biometric.
-
- </para>
- <para><command>Design Goal:</command>
-
-We intend to rely on the same mechanisms for defeating Javascript performance
-fingerprinting: timestamp quantization and jitter.
-
- </para>
- <para><command>Implementation Status:</command>
-We have no implementation as of yet.
- </para>
- </listitem>
- </orderedlist>
- </sect3>
- <para>
-For more details on fingerprinting bugs and enhancements, see the <ulink
-url="https://trac.torproject.org/projects/tor/query?keywords=~tbb-fingerprinting…">tbb-fingerprinting tag in our bugtracker</ulink>
- </para>
- </sect2>
- <sect2 id="new-identity">
- <title>Long-Term Unlinkability via "New Identity" button</title>
- <para>
-
-In order to avoid long-term linkability, we provide a "New Identity" context
-menu option in Torbutton. This context menu option is active if Torbutton can
-read the environment variables $TOR_CONTROL_PASSWD and $TOR_CONTROL_PORT.
-
- </para>
-
- <sect3>
- <title>Design Goal:</title>
- <blockquote>
-
-All linkable identifiers and browser state MUST be cleared by this feature.
-
- </blockquote>
- </sect3>
-
- <sect3>
- <title>Implementation Status:</title>
- <blockquote>
- <para>
-
-First, Torbutton disables Javascript in all open tabs and windows by using
-both the <ulink
-url="https://developer.mozilla.org/en-US/docs/XPCOM_Interface_Reference/nsIDocSh…">browser.docShell.allowJavascript</ulink>
-attribute as well as <ulink
-url="https://developer.mozilla.org/en-US/docs/XPCOM_Interface_Reference/nsIDOMWi…">nsIDOMWindowUtil.suppressEventHandling()</ulink>.
-We then stop all page activity for each tab using <ulink
-url="https://developer.mozilla.org/en-US/docs/XPCOM_Interface_Reference/nsIWebNa…">browser.webNavigation.stop(nsIWebNavigation.STOP_ALL)</ulink>.
-We then clear the site-specific Zoom by temporarily disabling the preference
-<command>browser.zoom.siteSpecific</command>, and clear the GeoIP wifi token URL
-<command>geo.wifi.access_token</command> and the last opened URL prefs (if
-they exist). Each tab is then closed.
-
- </para>
- <para>
-
-After closing all tabs, we then emit "<ulink
-url="https://developer.mozilla.org/en-US/docs/Supporting_private_browsing_mode#P…">browser:purge-session-history</ulink>"
-(which instructs addons and various Firefox components to clear their session
-state), and then manually clear the following state: searchbox and findbox
-text, HTTP auth, SSL state, OCSP state, site-specific content preferences
-(including HSTS state), content and image cache, offline cache, Cookies, DOM
-storage, DOM local storage, the safe browsing key, and the Google wifi geolocation
-token (if it exists).
-
- </para>
- <para>
-
-After the state is cleared, we then close all remaining HTTP keep-alive
-connections and then send the NEWNYM signal to the Tor control port to cause a
-new circuit to be created.
- </para>
- <para>
-Finally, a fresh browser window is opened, and the current browser window is
-closed (this does not spawn a new Firefox process, only a new window).
- </para>
- </blockquote>
- <blockquote>
-If the user chose to "protect" any cookies by using the Torbutton Cookie
-Protections UI, those cookies are not cleared as part of the above.
- </blockquote>
- </sect3>
- </sect2>
-<!--
- <sect2 id="click-to-play">
- <title>Click-to-play for plugins and invasive content</title>
- <para>
-Some content types are too invasive and/or too opaque for us to properly
-eliminate their linkability properties. For these content types, we use
-NoScript to provide click-to-play placeholders that do not activate the
-content until the user clicks on it. This will eliminate the ability for an
-adversary to use such content types to link users in a dragnet fashion across
-arbitrary sites.
- </para>
- <para>
-Currently, the content types isolated in this way include Flash, WebGL, and
-audio and video objects.
- </para>
- </sect2>
--->
- <sect2 id="other-security">
- <title>Other Security Measures</title>
- <para>
-
-In addition to the above mechanisms that are devoted to preserving privacy
-while browsing, we also have a number of technical mechanisms to address other
-privacy and security issues.
-
- </para>
- <orderedlist>
- <listitem id="traffic-fingerprinting-defenses"><command>Website Traffic Fingerprinting Defenses</command>
- <para>
-
-<link linkend="website-traffic-fingerprinting">Website Traffic
-Fingerprinting</link> is a statistical attack to attempt to recognize specific
-encrypted website activity.
-
- </para>
- <sect3>
- <title>Design Goal:</title>
- <blockquote>
- <para>
-
-We want to deploy a mechanism that reduces the accuracy of <ulink
-url="https://en.wikipedia.org/wiki/Feature_selection">useful features</ulink> available
-for classification. This mechanism would either impact the true and false
-positive accuracy rates, <emphasis>or</emphasis> reduce the number of webpages
-that could be classified at a given accuracy rate.
-
- </para>
- <para>
-
-Ideally, this mechanism would be as light-weight as possible, and would be
-tunable in terms of overhead. We suspect that it may even be possible to
-deploy a mechanism that reduces feature extraction resolution without any
-network overhead. In the no-overhead category, we have <ulink
-url="http://freehaven.net/anonbib/cache/LZCLCP_NDSS11.pdf">HTTPOS</ulink> and
-<ulink
-url="https://blog.torproject.org/blog/experimental-defense-website-traffic-finge…">better
-use of HTTP pipelining and/or SPDY</ulink>.
-In the tunable/low-overhead
-category, we have <ulink
-url="http://freehaven.net/anonbib/cache/ShWa-Timing06.pdf">Adaptive
-Padding</ulink> and <ulink url="http://www.cs.sunysb.edu/~xcai/fp.pdf">
-Congestion-Sensitive BUFLO</ulink>. It may be also possible to <ulink
-url="https://trac.torproject.org/projects/tor/ticket/7028">tune such
-defenses</ulink> such that they only use existing spare Guard bandwidth capacity in the Tor
-network, making them also effectively no-overhead.
-
- </para>
- </blockquote>
- </sect3>
- <sect3>
- <title>Implementation Status:</title>
- <blockquote>
- <para>
-Currently, we patch Firefox to <ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">randomize
-pipeline order and depth</ulink>. Unfortunately, pipelining is very fragile.
-Many sites do not support it, and even sites that advertise support for
-pipelining may simply return error codes for successive requests, effectively
-forcing the browser into non-pipelined behavior. Firefox also has code to back
-off and reduce or eliminate the pipeline if this happens. These
-shortcomings and fallback behaviors are the primary reason that Google
-developed SPDY as opposed to simply extending HTTP to improve pipelining. It
-turns out that we could actually deploy exit-side proxies that allow us to
-<ulink
-url="https://gitweb.torproject.org/torspec.git/blob/HEAD:/proposals/ideas/xxx-us…">use
-SPDY from the client to the exit node</ulink>. This would make our defense not
-only free, but one that actually <emphasis>improves</emphasis> performance.
-
- </para>
- <para>
-
-Knowing this, we created this defense as an <ulink
-url="https://blog.torproject.org/blog/experimental-defense-website-traffic-finge…">experimental
-research prototype</ulink> to help evaluate what could be done in the best
-case with full server support. Unfortunately, the bias in favor of compelling
-attack papers has caused academia to ignore this request thus far, instead
-publishing only cursory (yet "devastating") evaluations that fail to provide
-even simple statistics such as the rates of actual pipeline utilization during
-their evaluations, in addition to the other shortcomings and shortcuts <link
-linkend="website-traffic-fingerprinting">mentioned earlier</link>. We can
-accept that our defense might fail to work as well as others (in fact we
-expect it), but unfortunately the very same shortcuts that provide excellent
-attack results also allow the conclusion that all defenses are broken forever.
-So sadly, we are still left in the dark on this point.
-
- </para>
- </blockquote>
- </sect3>
- </listitem>
- <listitem><command>Privacy-preserving update notification</command>
- <para>
-
-In order to inform the user when their Tor Browser is out of date, we perform a
-privacy-preserving update check asynchronously in the background. The
-check uses Tor to download the file <ulink
-url="https://check.torproject.org/RecommendedTBBVersions">https://check.torproject.org/RecommendedTBBVersions</ulink>
-and searches that version list for the current value for the local preference
-<command>torbrowser.version</command>. If the value from our preference is
-present in the recommended version list, the check is considered to have
-succeeded and the user is up to date. If not, it is considered to have failed
-and an update is needed. The check is triggered upon browser launch, new
-window, and new tab, but is rate limited so as to happen no more frequently
-than once every 1.5 hours.
-
- </para>
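- <para>
-
-The check itself is a simple membership test, sketched below with the
-downloaded version list replaced by a literal array (the real check fetches
-RecommendedTBBVersions over Tor; the version strings shown are illustrative):
-
- </para>

```javascript
// The local torbrowser.version value must appear in the recommended list.
function isUpToDate(recommendedVersions, localVersion) {
  return recommendedVersions.includes(localVersion);
}

const recommended = ["2.3.25-15", "2.4.16-beta-1"]; // illustrative values
console.log(isUpToDate(recommended, "2.3.25-15")); // → true  (up to date)
console.log(isUpToDate(recommended, "2.3.24-1"));  // → false (update needed)
```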
- <para>
-
-If the check fails, we cache this fact, and update the Torbutton graphic to
-display a flashing warning icon and insert a menu option that provides a link
-to our download page. Additionally, we reset the value for the browser
-homepage to point to a <ulink
-url="https://check.torproject.org/?lang=en-US&small=1&uptodate=0">page that
-informs the user</ulink> that their browser is out of
-date.
-
- </para>
- </listitem>
-
- </orderedlist>
- </sect2>
- <sect2 id="firefox-patches">
- <title>Description of Firefox Patches</title>
- <para>
-
-The set of patches we have against Firefox can be found in the <ulink
-url="https://gitweb.torproject.org/torbrowser.git/tree/maint-2.4:/src/current-pa…">current-patches directory of the torbrowser git repository</ulink>. They are:
-
- </para>
- <orderedlist>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Block
-Components.interfaces</ulink>
- <para>
-
-In order to reduce fingerprinting, we block access to this interface from
-content script. Components.interfaces can be used for fingerprinting the
-platform, OS, and Firefox version, but not much else.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Make
-Permissions Manager memory only</ulink>
- <para>
-
-This patch exposes a pref 'permissions.memory_only' that properly isolates the
-permissions manager to memory. The permissions manager is responsible for all
-user-specified site permissions, as well as stored <ulink
-url="https://secure.wikimedia.org/wikipedia/en/wiki/HTTP_Strict_Transport_Securi…">HSTS</ulink>
-policy from visited sites.
-
-The pref does successfully clear the permissions manager memory if toggled. It
-does not need to be set in prefs.js, and can be handled by Torbutton.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Make
-Intermediate Cert Store memory-only</ulink>
- <para>
-
-The intermediate certificate store records the intermediate SSL certificates
-the browser has seen to date. Because these intermediate certificates are used
-by a limited number of domains (and in some cases, only a single domain),
-the intermediate certificate store can serve as a low-resolution record of
-browsing history.
-
- </para>
- <!-- FIXME: Should this be a <note> tag too? -->
- <para><command>Design Goal:</command>
-
-As an additional design goal, we would like to later alter this patch to allow this
-information to be cleared from memory. The implementation does not currently
-allow this.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Add
-a string-based cacheKey property for domain isolation</ulink>
- <para>
-
-To <ulink
-url="https://trac.torproject.org/projects/tor/ticket/3666">increase the
-security of cache isolation</ulink> and to <ulink
-url="https://trac.torproject.org/projects/tor/ticket/3754">solve strange and
-unknown conflicts with OCSP</ulink>, we had to patch
-Firefox to provide a cacheDomain cache attribute. We use the url bar
-FQDN as input to this field.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Block
-all plugins except flash</ulink>
- <para>
-We cannot use the <ulink
-url="http://www.oxymoronical.com/experiments/xpcomref/applications/Firefox/3.5/c…">
-@mozilla.org/extensions/blocklist;1</ulink> service, because we
-actually want to stop plugins from ever entering the browser's process space
-and/or executing code (for example, AV plugins that collect statistics/analyze
-URLs, magical toolbars that phone home or "help" the user, Skype buttons that
-ruin our day, and censorship filters). Hence we rolled our own.
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Make content-prefs service memory only</ulink>
- <para>
-This patch prevents random URLs from being inserted into content-prefs.sqlite in
-the profile directory as content prefs change (includes site-zoom and perhaps
-other site prefs?).
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Make Tor Browser exit when not launched from Vidalia</ulink>
- <para>
-
-It turns out that on Windows 7 and later systems, the Taskbar attempts to
-automatically learn the most frequent apps used by the user, and it recognizes
-Tor Browser as a separate app from Vidalia. This can cause users to try to
-launch Tor Browser without Vidalia or a Tor instance running. Worse, the Tor
-Browser will automatically find their default Firefox profile, and properly
-connect directly without using Tor. This patch is a simple hack to cause Tor
-Browser to immediately exit in this case.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Disable SSL Session ID tracking</ulink>
- <para>
-
-This patch is a simple 1-line hack to prevent SSL connections from caching
-(and then later transmitting) their Session IDs. There was no preference to
-govern this behavior, so we had to hack it by altering the SSL new connection
-defaults.
-
- </para>
- </listitem>
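[Editor's sketch] A toy model of why cached session IDs are a linkability risk (illustrative Python, not NSS internals): a client that re-presents a cached session ID lets the server link its visits, while a client that never offers one does not.

```python
# Sketch of SSL session ID linkability; all classes are hypothetical
# simplifications for illustration.
import secrets

class Server:
    def __init__(self):
        self.seen = {}

    def handshake(self, offered_id=None):
        if offered_id in self.seen:
            return offered_id            # resumption: visits now linkable
        new_id = secrets.token_hex(16)
        self.seen[new_id] = True
        return new_id

class CachingClient:
    """Caches and re-offers its session ID (the default Firefox did)."""
    def __init__(self):
        self.cached = None
    def connect(self, server):
        self.cached = server.handshake(self.cached)
        return self.cached

class NonCachingClient:
    """Never offers a cached ID (the behavior the patch enforces)."""
    def connect(self, server):
        return server.handshake(None)

srv = Server()
caching = CachingClient()
id1, id2 = caching.connect(srv), caching.connect(srv)

no_cache = NonCachingClient()
id3, id4 = no_cache.connect(srv), no_cache.connect(srv)
```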
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Provide an observer event to close persistent connections</ulink>
- <para>
-
-This patch creates an observer event in the HTTP connection manager to close
-all keep-alive connections that still happen to be open. This event is emitted
-by the <link linkend="new-identity">New Identity</link> button.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Limit Device and System Specific Media Queries</ulink>
- <para>
-
-<ulink url="https://developer.mozilla.org/en-US/docs/CSS/Media_queries">CSS
-Media Queries</ulink> have a fingerprinting capability approaching that of
-Javascript. This patch causes such Media Queries to evaluate as if the device
-resolution was equal to the content window resolution.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Limit the number of fonts per document</ulink>
- <para>
-
-Font availability can be <ulink url="http://flippingtypical.com/">queried by
-CSS and Javascript</ulink> and is a fingerprinting vector. This patch limits
-the number of times CSS and Javascript can cause font-family rules to
-evaluate. Remote @font-face fonts are exempt from the limits imposed by this
-patch, and remote fonts are given priority over local fonts whenever both
-appear in the same font-family rule. We do this by explicitly altering the
-nsRuleNode rule representation itself to remove the local font families before
-the rule hits the font renderer.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Rebrand Firefox to Tor Browser</ulink>
- <para>
-
-This patch updates our branding in compliance with Mozilla's trademark policy.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Make Download Manager Memory Only</ulink>
- <para>
-
-This patch prevents disk leaks from the download manager. The original
-behavior is to write the download history to disk and then delete it, even if
-you disable download history from your Firefox preferences.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Add DDG and StartPage to Omnibox</ulink>
- <para>
-
-This patch adds DuckDuckGo and StartPage to the Search Box, and sets our
-default search engine to StartPage. We deployed this patch due to excessive
-Captchas and complete 403 bans from Google.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Make nsICacheService.EvictEntries() Synchronous</ulink>
- <para>
-
-This patch eliminates a race condition with "New Identity". Without it,
-cache-based Evercookies survive for up to a minute after clearing the cache
-on some platforms.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Prevent WebSockets DNS Leak</ulink>
- <para>
-
-This patch prevents a DNS leak when using WebSockets. It also prevents other
-similar types of DNS leaks.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Randomize HTTP pipeline order and depth</ulink>
- <para>
-As an
-<ulink
-url="https://blog.torproject.org/blog/experimental-defense-website-traffic-finge…">experimental
-defense against Website Traffic Fingerprinting</ulink>, we patch the standard
-HTTP pipelining code to randomize the number of requests in a
-pipeline, as well as their order.
- </para>
- </listitem>
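[Editor's sketch] The randomization can be sketched as splitting the request queue into pipelines of random depth and shuffling each batch. Illustrative Python only; the depth range here is an assumption, not the values used in the actual patch.

```python
# Sketch of randomized HTTP pipelining: random batch depth, random
# order within each batch.
import random

def randomized_pipeline(requests, rng, min_depth=4, max_depth=12):
    """Split requests into pipelines of random depth, each shuffled."""
    pipelines = []
    i = 0
    while i < len(requests):
        depth = rng.randint(min_depth, max_depth)
        batch = requests[i:i + depth]
        rng.shuffle(batch)
        pipelines.append(batch)
        i += depth
    return pipelines

rng = random.Random(7)  # seeded for reproducibility in this sketch
reqs = ["/r%d" % n for n in range(20)]
batches = randomized_pipeline(reqs, rng)
```

Every request is still issued exactly once; only the batching and ordering that a traffic-fingerprinting observer sees are perturbed.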
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Emit
-an observer event to filter the Drag and Drop URL list</ulink>
- <para>
-
-This patch allows us to block external Drag and Drop events from Torbutton.
-We need to block Drag and Drop because Mac OS and Ubuntu both immediately load
-any URLs they find in your drag buffer before you even drop them (without
-using your browser's proxy settings, of course). This can lead to proxy bypass
-during user activity that is as basic as holding down the mouse button for
-slightly too long while clicking on an image link.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Add mozIThirdPartyUtil.getFirstPartyURI() API</ulink>
- <para>
-
-This patch provides an API that allows us to more easily isolate identifiers
-to the URL bar domain.
-
- </para>
- </listitem>
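[Editor's sketch] The idea of keying state on the URL-bar origin can be sketched as below. The real mozIThirdPartyUtil.getFirstPartyURI walks the document hierarchy in C++; this Python sketch just derives a host-based key, and a real implementation would reduce it to eTLD+1 via the public suffix list.

```python
# Hypothetical illustration of first-party keyed storage; names are
# made up for this sketch.
from urllib.parse import urlparse

def first_party_key(url_bar_url):
    host = urlparse(url_bar_url).hostname or ""
    # NOTE: plain hostname is a simplification; production code would
    # reduce this to the registrable domain (eTLD+1).
    return host.lower()

def isolated_set(storage, first_party, name, value):
    storage[(first_party_key(first_party), name)] = value

def isolated_get(storage, first_party, name):
    return storage.get((first_party_key(first_party), name))

store = {}
isolated_set(store, "https://Example.COM/page", "tracker-id", "abc123")
```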
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Add canvas image extraction prompt</ulink>
- <para>
-
-This patch prompts the user before returning canvas image data. Canvas image
-data can be used to create an extremely stable, high-entropy fingerprint based
-on the unique rendering behavior of video cards, OpenGL behavior,
-system fonts, and supporting library versions.
-
- </para>
- </listitem>
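[Editor's sketch] Why extracted canvas data is such a strong fingerprint can be shown with a toy model: any system-specific difference in rendering changes every bit of the extracted hash, yet the hash is perfectly stable on one machine. The "font metric" parameter below is a hypothetical stand-in for real GPU/font/library variation.

```python
# Toy canvas-fingerprint model: pixels depend on a system-specific
# rendering parameter, and the site hashes whatever it reads back.
import hashlib

def render(text, font_metric):
    # Stand-in for actual text rendering; the output bytes vary with a
    # (hypothetical) system-specific metric.
    return bytes((ord(c) * font_metric) % 256 for c in text)

def canvas_fingerprint(font_metric):
    pixels = render("Canvas probe, Tor Browser", font_metric)
    return hashlib.sha256(pixels).hexdigest()

fp_system_a = canvas_fingerprint(font_metric=131)
fp_system_b = canvas_fingerprint(font_metric=137)   # slightly different system
fp_system_a_again = canvas_fingerprint(font_metric=131)
```

Prompting before image extraction blocks the read-back step, which is the only point at which this hash can leave the browser.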
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Return client window coordinates for mouse events</ulink>
- <para>
-
-This patch causes mouse events to return coordinates relative to the content
-window instead of the desktop.
-
- </para>
- </listitem>
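[Editor's sketch] The coordinate change is a simple translation, sketched below in Python with made-up numbers: subtract the content window's desktop origin so a page never learns where on the physical screen the window sits.

```python
# Illustrative coordinate translation for mouse events.
def to_client_coords(screen_x, screen_y, content_origin_x, content_origin_y):
    """Translate desktop (screen) coordinates into coordinates relative
    to the content window's top-left corner."""
    return screen_x - content_origin_x, screen_y - content_origin_y

# A click at desktop (843, 412) in a content window whose top-left
# corner sits at desktop (100, 150):
client = to_client_coords(843, 412, 100, 150)
```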
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Do not expose physical screen info to window.screen</ulink>
- <para>
-
-This patch causes window.screen to return the display resolution size of the
-content window instead of the desktop resolution size.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Do not expose system colors to CSS or canvas</ulink>
- <para>
-
-This patch prevents CSS and Javascript from discovering your desktop color
-scheme and/or theme.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Isolate the Image Cache per url bar domain</ulink>
- <para>
-
-This patch prevents cached images from being used to store third party tracking
-identifiers.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">nsIHTTPChannel.redirectTo() API</ulink>
- <para>
-
-This patch provides HTTPS-Everywhere with an API to perform redirections more
-securely and without addon conflicts.
-
- </para>
- </listitem>
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Isolate DOM Storage to first party URI</ulink>
- <para>
-
-This patch prevents DOM Storage from being used to store third party tracking
-identifiers.
-
- </para>
- </listitem>
-
- <listitem><ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/maint-2.4:/src/current-pa…">Remove
-"This plugin is disabled" barrier</ulink>
-
- <para>
-
-This patch removes a barrier that was informing users that plugins were
-disabled and providing them with a link to enable them. We felt this was poor
-user experience, especially since the barrier was displayed even for sites
-with dual Flash+HTML5 video players, such as YouTube.
-
- </para>
- </listitem>
-
- </orderedlist>
- </sect2>
-
-</sect1>
-
-<!--
-- Packaging
- - Build Process Security
- - External Addons
- - Included
- - HTTPS-E
- - NoScript
- - Torbutton
- - Deliberately excluded
- - Request Policy, AdblockPlus, etc
- - Desired
- - Perspectives/Convergence/etc
- - Pref Changes
- - Caused by Torbutton
- - Set manually in profile
- - Update security
- - Thandy
-
-<sect1 id="Packaging">
- <title>Packaging</title>
- <para> </para>
- <sect2 id="build-security">
- <title>Build Process Security</title>
- <para> </para>
- </sect2>
- <sect2 id="addons">
- <title>External Addons</title>
- <para> </para>
- <sect3>
- <title>Included Addons</title>
- </sect3>
- <sect3>
- <title>Excluded Addons</title>
- </sect3>
- <sect3>
- <title>Dangerous Addons</title>
- </sect3>
- </sect2>
- <sect2 id="prefs">
- <title>Pref Changes</title>
- <para> </para>
- </sect2>
- <sect2 id="update-mechanism">
- <title>Update Security</title>
- <para> </para>
- </sect2>
-</sect1>
--->
-
-<!--
-<sect1 id="Testing">
- <title>Testing</title>
- <para>
-
-The purpose of this section is to cover all the known ways that Tor browser
-security can be subverted from a penetration testing perspective. The hope
-is that it will be useful both for creating a "Tor Safety Check"
-page, and for developing novel tests and actively attacking Torbutton with the
-goal of finding vulnerabilities in either it or the Mozilla components,
-interfaces and settings upon which it relies.
-
- </para>
- <sect2 id="SingleStateTesting">
- <title>Single state testing</title>
- <para>
-
-Torbutton is a complicated piece of software. During development, changes to
-one component can affect a whole slew of unrelated features. A number of
-aggregated test suites exist that can be used to test for regressions in
-Torbutton and to help aid in the development of Torbutton-like addons and
-other privacy modifications of other browsers. Some of these test suites exist
-as a single automated page, while others are a series of pages you must visit
-individually. They are provided here for reference and future regression
-testing, and also in the hope that some brave soul will one day decide to
-combine them into a comprehensive automated test suite.
-
- <orderedlist>
- <listitem><ulink url="http://decloak.net/">Decloak.net</ulink>
- <para>
-
-Decloak.net is the canonical source of plugin and external-application based
-proxy-bypass exploits. It is a fully automated test suite maintained by <ulink
-url="http://digitaloffense.net/">HD Moore</ulink> as a service for people to
-use to test their anonymity systems.
-
- </para>
- </listitem>
- <listitem><ulink url="http://deanonymizer.com/">Deanonymizer.com</ulink>
- <para>
-
-Deanonymizer.com is another automated test suite that tests for proxy bypass
-and other information disclosure vulnerabilities. It is maintained by Kyle
-Williams, the author of <ulink url="http://www.janusvm.com/">JanusVM</ulink>
-and <ulink url="http://www.januspa.com/">JanusPA</ulink>.
-
- </para>
- </listitem>
- <listitem><ulink url="https://ip-check.info">JonDos
-AnonTest</ulink>
- <para>
-
-The <ulink url="https://anonymous-proxy-servers.net/">JonDos people</ulink> also provide an
-anonymity tester. It is more focused on HTTP headers and behaviors than plugin bypass, and
-points out a couple of headers Torbutton could do a better job with
-obfuscating.
-
- </para>
- </listitem>
- <listitem><ulink url="http://browserspy.dk">Browserspy.dk</ulink>
- <para>
-
-Browserspy.dk provides a tremendous collection of browser fingerprinting and
-general privacy tests. Unfortunately they are only available one page at a
-time, and there is no solid feedback on good vs bad behavior in
-the test results.
-
- </para>
- </listitem>
- <listitem><ulink url="http://analyze.privacy.net/">Privacy
-Analyzer</ulink>
- <para>
-
-The Privacy Analyzer provides a dump of all sorts of browser attributes and
-settings that it detects, including some information on your original IP
-address. Its page layout and lack of good vs bad test result feedback make it
-less useful as a user-facing testing tool, but it does provide some
-interesting checks in a single page.
-
- </para>
- </listitem>
- <listitem><ulink url="http://ha.ckers.org/mr-t/">Mr. T</ulink>
- <para>
-
-Mr. T is a collection of browser fingerprinting and deanonymization exploits
-discovered by the <ulink url="http://ha.ckers.org">ha.ckers.org</ulink> crew
-and others. It is also not as user friendly as some of the above tests, but it
-is a useful collection.
-
- </para>
- </listitem>
- <listitem>Gregory Fleischer's <ulink
-url="http://pseudo-flaw.net/content/tor/torbutton/">Torbutton</ulink> and
-<ulink
-url="http://pseudo-flaw.net/content/defcon/dc-17-demos/d.html">Defcon
-17</ulink> Test Cases
- <para>
-
-Gregory Fleischer has been hacking and testing Firefox and Torbutton privacy
-issues for the past 2 years. He has an excellent collection of all his test
-cases that can be used for regression testing. In his Defcon work, he
-demonstrates ways to infer Firefox version based on arcane browser properties.
-We are still trying to determine the best way to address some of those test
-cases.
-
- </para>
- </listitem>
- <listitem><ulink url="https://torcheck.xenobite.eu/index.php">Xenobite's
-TorCheck Page</ulink>
- <para>
-
-This page checks to ensure you are using a valid Tor exit node and checks for
-some basic browser properties related to privacy. It is not very fine-grained
-or complete, but it is automated and could be turned into something useful
-with a bit of work.
-
- </para>
- </listitem>
- </orderedlist>
- </para>
- </sect2>
--->
-<!--
- <sect2>
- <title>Multi-state testing</title>
- <para>
-
-The tests in this section are geared towards a page that would instruct the
-user to toggle their Tor state after the fetch and perform some operations:
-mouseovers, stray clicks, and potentially reloads.
-
- </para>
- <sect3>
- <title>Cookies and Cache Correlation</title>
- <para>
-The most obvious test is to set a cookie, ask the user to toggle tor, and then
-have them reload the page. The cookie should no longer be set if they are
-using the default Torbutton settings. In addition, it is possible to leverage
-the cache to <ulink
-url="http://crypto.stanford.edu/sameorigin/safecachetest.html">store unique
-identifiers</ulink>. The default settings of Torbutton should also prevent
-these from persisting across Tor Toggle.
-
- </para>
- </sect3>
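[Editor's sketch] The cache-based identifier attack referenced above (the "safecache" test) can be modeled with an ETag round trip: the browser echoes back whatever opaque validator the server handed it, re-identifying itself. Illustrative Python only; classes and values are made up.

```python
# Toy model of ETag-based tracking and why clearing the cache on
# toggle defeats it.
import secrets

class TrackingServer:
    def respond(self, if_none_match=None):
        if if_none_match is not None:
            # Revalidation: the echoed ETag re-identifies the browser.
            return 304, if_none_match
        return 200, secrets.token_hex(8)   # fresh ETag = new identifier

class BrowserCache:
    def __init__(self):
        self.etag = None
    def fetch(self, server):
        status, etag = server.respond(self.etag)
        self.etag = etag
        return status, etag
    def clear(self):
        self.etag = None   # what a Tor toggle / New Identity must do

srv = TrackingServer()
cache = BrowserCache()
_, first_id = cache.fetch(srv)
status2, second_id = cache.fetch(srv)   # revalidates: visits linkable
cache.clear()
status3, third_id = cache.fetch(srv)    # new identifier after clearing
```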
- <sect3>
- <title>Javascript timers and event handlers</title>
- <para>
-
-Javascript can set timers and register event handlers in the hopes of fetching
-URLs after the user has toggled Torbutton.
- </para>
- </sect3>
- <sect3>
- <title>CSS Popups and non-script Dynamic Content</title>
- <para>
-
-Even if Javascript is disabled, CSS is still able to
-<ulink url="http://www.tjkdesign.com/articles/css%20pop%20ups/">create popup-like
-windows</ulink>
-via the 'onmouseover' CSS attribute, which can cause arbitrary browser
-activity as soon as the mouse enters into the content window. It is also
-possible for meta-refresh tags to set timers long enough to make it likely
-that the user has toggled Tor before fetching content.
-
- </para>
- </sect3>
- </sect2>
- <sect2 id="HackTorbutton">
- <title>Active testing (aka How to Hack Torbutton)</title>
- <para>
-
-The idea behind active testing is to discover vulnerabilities in Torbutton to
-bypass proxy settings, run script in an opposite Tor state, store unique
-identifiers, leak location information, or otherwise violate <link
-linkend="requirements">its requirements</link>. Torbutton has ventured out
-into a strange and new security landscape. It depends on Firefox mechanisms
-that haven't necessarily been audited for security, certainly not for the
-threat model that Torbutton seeks to address. As such, it and the interfaces
-it depends upon still need a 'trial by fire' typical of new technologies. This
-section of the document was written with the intention of making that period
-as fast as possible. Please help us get through this period by considering
-these attacks, playing with them, and reporting what you find (and potentially
-submitting the test cases back to be run in the standard batch of Torbutton
-tests).
-
- </para>
- <sect3>
- <title>Some suggested vectors to investigate</title>
- <para>
- <itemizedlist>
- <listitem>Strange ways to register Javascript <ulink
-url="http://en.wikipedia.org/wiki/DOM_Events">events</ulink> and <ulink
-url="http://www.devshed.com/c/a/JavaScript/Using-Timers-in-JavaScript/">timeouts</ulink> should
-be verified to actually be ineffective after Tor has been toggled.</listitem>
- <listitem>Other ways to cause Javascript to be executed after
-<command>javascript.enabled</command> has been toggled off.</listitem>
- <listitem>Odd ways to attempt to load plugins. Kyle Williams has had
-some success with direct loads/meta-refreshes of plugin-handled URLs.</listitem>
- <listitem>The Date and Timezone hooks should be verified to work with
-crazy combinations of iframes, nested iframes, iframes in frames, frames in
-iframes, and popups being loaded and
-reloaded in rapid succession, and/or from one another. Think race conditions and deep,
-parallel nesting, involving iframes from both <ulink
-url="http://en.wikipedia.org/wiki/Same_origin_policy">same-origin and
-non-same-origin</ulink> domains.</listitem>
- <listitem>In addition, there may be alternate ways and other
-methods to query the timezone, or otherwise use some of the Date object's
-methods in combination to deduce the timezone offset. Of course, the author
-tried his best to cover all the methods he could foresee, but it's always good
-to have another set of eyes try it out.</listitem>
- <listitem>Similarly, is there any way to confuse the <link
-linkend="contentpolicy">content policy</link>
-mentioned above to cause it to allow certain types of page fetches? For
-example, it was recently discovered that favicons are not fetched by the
-content, but by the chrome itself; hence the content policy did not look up the
-correct window to determine the current Tor tag for the favicon fetch. Are
-there other things that can do this? Popups? Bookmarklets? Active bookmarks? </listitem>
- <listitem>Alternate ways to store and fetch unique identifiers. For example, <ulink
-url="http://developer.mozilla.org/en/docs/DOM:Storage">DOM Storage</ulink>
-caught us off guard.
-It was
-also discovered by <ulink url="http://pseudo-flaw.net">Gregory
-Fleischer</ulink> that <ulink
-url="http://pseudo-flaw.net/content/tor/torbutton/">content window access to
-chrome</ulink> can be used to build <link linkend="fingerprinting">unique
-identifiers</link>.
-Are there any other
-arcane or experimental ways that Firefox provides to create and store unique
-identifiers? Or perhaps unique identifiers can be queried or derived from
-properties of the machine/browser that Javascript has access to? How unique
-can these identifiers be?
- </listitem>
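[Editor's sketch] "How unique can these identifiers be?" admits a back-of-the-envelope answer: log2(N) bits of entropy suffice to distinguish N users, and independent attributes add their bits. The attribute values below are hypothetical placeholders, not measurements.

```python
# Uniqueness arithmetic for fingerprint identifiers.
import math

def bits_to_distinguish(population):
    """Bits of entropy needed to single out one member of a population."""
    return math.log2(population)

def combined_bits(attribute_bits):
    # Assumes the attributes are statistically independent, which is
    # usually optimistic for real browser attributes.
    return sum(attribute_bits)

# Distinguishing one user among ~8 billion people needs ~33 bits:
needed = bits_to_distinguish(8_000_000_000)

# A hypothetical attribute budget (placeholder values):
total = combined_bits([10.0, 13.9, 6.1, 4.8])
```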
- <listitem>Is it possible to get the browser to write some history to disk
-(aside from swap) that can be retrieved later? By default, Torbutton should
-write no history, cookie, or other browsing activity information to the
-harddisk.</listitem>
- <listitem>Do popup windows make it easier to break any of the above
-behavior? Are javascript events still canceled in popups? What about recursive
-popups from Javascript, data, and other funky URL types? What about CSS
-popups? Are they still blocked after Tor is toggled?</listitem>
- <listitem>Chrome-escalation attacks. The interaction between the
-Torbutton chrome Javascript and the client content window javascript is pretty
-well-defined and carefully constructed, but perhaps there is a way to smuggle
-javascript back in a return value, or otherwise inject network-loaded
-javascript into the chrome (and thus gain complete control of the browser).
-</listitem>
-</itemizedlist>
-
- </para>
- </sect3>
- </sect2>
-</sect1>
--->
-<appendix id="Transparency">
-<title>Towards Transparency in Navigation Tracking</title>
-<para>
-
-The <link linkend="privacy">privacy properties</link> of Tor Browser are based
-upon the assumption that link-click navigation indicates user consent to
-tracking between the linking site and the destination site. While this
-definition is sufficient to allow us to eliminate cross-site third party
-tracking with only minimal site breakage, it is our long-term goal to further
-reduce cross-origin click navigation tracking to mechanisms that are
-detectable by attentive users, so they can alert the general public if
-cross-origin click navigation tracking is happening where it should not be.
-
-</para>
-<para>
-
-In an ideal world, the mechanisms of tracking that can be employed during a
-link click would be limited to the contents of URL parameters and other
-properties that are fully visible to the user before they click. However, the
-entrenched nature of certain archaic web features makes it impossible for us to
-achieve this transparency goal by ourselves without substantial site breakage.
-So, instead we maintain a <link linkend="deprecate">Deprecation
-Wishlist</link> of archaic web technologies that are currently being (ab)used
-to facilitate federated login and other legitimate click-driven cross-domain
-activity but that can one day be replaced with more privacy friendly,
-auditable alternatives.
-
-</para>
-<para>
-
-Because the total elimination of side channels during cross-origin navigation
-will undoubtedly break federated login as well as destroy ad revenue, we
-also describe auditable alternatives and promising web draft standards that would
-preserve this functionality while still providing transparency when tracking is
-occurring.
-
-</para>
-
-<sect1 id="deprecate">
- <title>Deprecation Wishlist</title>
- <orderedlist>
- <listitem>The Referer Header
- <para>
-
-We haven't disabled or restricted the Referer ourselves because of the
-non-trivial number of sites that rely on the Referer header to "authenticate"
-image requests and deep-link navigation on their sites. Furthermore, there
-seems to be no real privacy benefit to taking this action by itself in a
-vacuum, because many sites have begun encoding Referer URL information into
-GET parameters when they need it to cross http-to-https scheme transitions.
-Google's +1 buttons are the best example of this activity.
-
- </para>
- <para>
-
-Because of the availability of these other explicit vectors, we believe the
-main risk of the Referer header is through inadvertent and/or covert data
-leakage. In fact, <ulink
-url="http://www2.research.att.com/~bala/papers/wosn09.pdf">a great deal of
-personal data</ulink> is inadvertently leaked to third parties through the
-source URL parameters.
-
- </para>
- <para>
-
-We believe the Referer header should be made explicit. If a site wishes to
-transmit its URL to third party content elements during load or during
-link-click, it should have to specify this as a property of the associated HTML
-tag. With an explicit property, it would then be possible for the user agent to
-inform the user if they are about to click on a link that will transmit Referer
-information (perhaps through something as subtle as a different color in the
-lower toolbar for the destination URL). This same UI notification can also be
-used for links with the <ulink
-url="https://developer.mozilla.org/en-US/docs/HTML/Element/a#Attributes">"ping"</ulink>
-attribute.
-
- </para>
- </listitem>
- <listitem>window.name
- <para>
-<ulink
-url="https://developer.mozilla.org/En/DOM/Window.name">window.name</ulink> is
-a DOM property that for some reason is allowed to retain a persistent value
-for the lifespan of a browser tab. It is possible to utilize this property for
-<ulink url="http://www.thomasfrank.se/sessionvars.html">identifier
-storage</ulink> during click navigation. This is sometimes used for additional
-XSRF protection and federated login.
- </para>
- <para>
-
-It's our opinion that the contents of window.name should not be preserved for
-cross-origin navigation, but doing so may break federated login for some sites.
-
- </para>
- </listitem>
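[Editor's sketch] The policy suggested above can be sketched as a navigation model in which window.name survives same-origin navigation but is dropped across origins. Illustrative Python, not Firefox code; same-origin is approximated here by hostname equality.

```python
# Toy model of clearing window.name on cross-origin navigation.
from urllib.parse import urlparse

class Tab:
    def __init__(self, url):
        self.url = url
        self.window_name = ""

    def navigate(self, new_url):
        same_origin = (urlparse(self.url).hostname ==
                       urlparse(new_url).hostname)
        if not same_origin:
            self.window_name = ""   # drop the identifier across origins
        self.url = new_url

tab = Tab("https://site-a.example/page1")
tab.window_name = "user-7f3a"              # site A stores an identifier
tab.navigate("https://site-a.example/page2")
same_site_value = tab.window_name          # preserved within the origin
tab.navigate("https://site-b.example/")
cross_site_value = tab.window_name         # cleared across origins
```

This is exactly the trade-off the text notes: the cross-origin clear is what may break federated login flows that pass tokens through window.name.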
- <listitem>Javascript link rewriting
- <para>
-
-In general, it should not be possible for onclick handlers to alter the
-navigation destination of 'a' tags, silently transform them into POST
-requests, or otherwise create situations where a user believes they are
-clicking on a link leading to one URL that ends up on another. This
-functionality is deceptive and is frequently a vector for malware and phishing
-attacks. Unfortunately, many legitimate sites also employ such transparent
-link rewriting, and blanket disabling this functionality ourselves will simply
-cause Tor Browser to fail to navigate properly on these sites.
-
- </para>
- <para>
-
-Automated cross-origin redirects are one form of this behavior that is
-possible for us to <ulink
-url="https://trac.torproject.org/projects/tor/ticket/3600">address
-ourselves</ulink>, as they are comparatively rare and can be handled with site
-permissions.
-
- </para>
- </listitem>
- </orderedlist>
-</sect1>
-<sect1>
- <title>Promising Standards</title>
- <orderedlist>
- <listitem><ulink url="http://web-send.org">Web-Send Introducer</ulink>
- <para>
-
-Web-Send is a browser-based link sharing and federated login widget that is
-designed to operate without relying on third-party tracking or abusing other
-cross-origin link-click side channels. It has a compelling list of <ulink
-url="http://web-send.org/features.html">privacy and security features</ulink>,
-especially if used as a "Like button" replacement.
-
- </para>
- </listitem>
- <listitem><ulink url="https://developer.mozilla.org/en-US/docs/Persona">Mozilla Persona</ulink>
- <para>
-
-Mozilla's Persona is designed to provide decentralized, cryptographically
-authenticated federated login in a way that does not expose the user to third
-party tracking or require browser redirects or side channels. While it does
-not directly provide the link sharing capabilities that Web-Send does, it is a
-better solution to the privacy issues associated with federated login than
-Web-Send is.
-
- </para>
- </listitem>
- </orderedlist>
-</sect1>
-</appendix>
-</article>
diff --git a/docs/design/outline.txt b/docs/design/outline.txt
deleted file mode 100644
index f7aa5ec..0000000
--- a/docs/design/outline.txt
+++ /dev/null
@@ -1,52 +0,0 @@
-- Threat model: [Mostly Torbutton]
- - [Remove the security requirements section]
-
-- Design overview and philosophy
- - Security requirements [Torbutton]
- + local leaks?
- - state issues
- - Privacy Requirements [Mostly blog post]
- - Make local privacy optional
- - Avoid Cross-Domain Linkability
- - Identifiers
- - Fingerprinting
- - 100% self-contained
- - Does not share state with other modes/browsers
- - Easy to remove + wipe with external tools
- - No filters
-
-- Implementation
- - Section Template
- - Sub Section
- - "Design Goal":
- - "Implementation Status"
- - Local Privacy Optional
- - Linkability
- - Stored State
- - Cookies
- - Cache
- - DOM Storage
- - HTTP Auth
- - SSL state
- - Plugins
- - Fingerprinting
- - Patches
-
-- Packaging
- - Build Process Security
- - External Addons
- - Included
- - HTTPS-E
- - NoScript
- - Torbutton
- - Deliberately excluded
- - Request Policy, AdblockPlus, etc
- - Desired
- - Perspectives/Convergence/etc
- - Pref Changes
- - Caused by Torbutton
- - Set manually in profile
- - Update security
- - Thandy
-
-
commit fcd5119235c3b3711fb100e3c5ef6eee4fa0cfdc
Author: Mike Perry <mikeperry-git(a)fscked.org>
Date: Fri Sep 16 00:51:45 2011 -0700
Add design outline.
---
docs/design/outline.txt | 52 +++++++++++++++++++++++++++++++++++++++++++++++
1 file changed, 52 insertions(+)
diff --git a/docs/design/outline.txt b/docs/design/outline.txt
new file mode 100644
index 0000000..f7aa5ec
--- /dev/null
+++ b/docs/design/outline.txt
@@ -0,0 +1,52 @@
+- Threat model: [Mostly Torbutton]
+ - [Remove the security requirements section]
+
+- Design overview and philosophy
+ - Security requirements [Torbutton]
+ + local leaks?
+ - state issues
+ - Privacy Requirements [Mostly blog post]
+ - Make local privacy optional
+ - Avoid Cross-Domain Linkability
+ - Identifiers
+ - Fingerprinting
+ - 100% self-contained
+ - Does not share state with other modes/browsers
+ - Easy to remove + wipe with external tools
+ - No filters
+
+- Implementation
+ - Section Template
+ - Sub Section
+ - "Design Goal":
+ - "Implementation Status"
+ - Local Privacy Optional
+ - Linkability
+ - Stored State
+ - Cookies
+ - Cache
+ - DOM Storage
+ - HTTP Auth
+ - SSL state
+ - Plugins
+ - Fingerprinting
+ - Patches
+
+- Packaging
+ - Build Process Security
+ - External Addons
+ - Included
+ - HTTPS-E
+ - NoScript
+ - Torbutton
+ - Deliberately excluded
+ - Request Policy, AdblockPlus, etc
+ - Desired
+ - Perspectives/Convergence/etc
+ - Pref Changes
+ - Caused by Torbutton
+ - Set manually in profile
+ - Update security
+ - Thandy
+
+

[tor-browser-spec/master] Add initial xml from Torbotton design doc.
by mikeperry@torproject.org 28 Apr '14
commit 9978adbd64d808eea2a2490655ab53632d14bd93
Author: Mike Perry <mikeperry-git(a)fscked.org>
Date: Fri Sep 16 00:51:53 2011 -0700
Add initial xml from Torbotton design doc.
---
docs/design/build.sh | 1 +
docs/design/design.xml | 824 ++++++++++++++++++++++++++++++++++++++++++++++++
2 files changed, 825 insertions(+)
diff --git a/docs/design/build.sh b/docs/design/build.sh
new file mode 100755
index 0000000..6531077
--- /dev/null
+++ b/docs/design/build.sh
@@ -0,0 +1 @@
+xsltproc --output index.html.en --stringparam section.autolabel.max.depth 2 --stringparam section.autolabel 1 /usr/share/sgml/docbook/xsl-stylesheets-1.75.2/xhtml/docbook.xsl design.xml
diff --git a/docs/design/design.xml b/docs/design/design.xml
new file mode 100644
index 0000000..419143a
--- /dev/null
+++ b/docs/design/design.xml
@@ -0,0 +1,824 @@
+<?xml version="1.0" encoding="ISO-8859-1"?>
+<!DOCTYPE article PUBLIC "-//OASIS//DTD DocBook XML V4.4//EN"
+ "file:///usr/share/sgml/docbook/xml-dtd-4.4-1.0-30.1/docbookx.dtd">
+
+<article id="design">
+ <articleinfo>
+ <title>The Design and Implementation of the Tor Browser</title>
+ <author>
+ <firstname>Mike</firstname><surname>Perry</surname>
+ <affiliation>
+ <address><email>mikeperry#torproject org</email></address>
+ </affiliation>
+ </author>
+ <author>
+ <firstname>Erinn</firstname><surname>Clark</surname>
+ <affiliation>
+ <address><email>erinn_torproject\org</email></address>
+ </affiliation>
+ </author>
+ <author>
+ <firstname>Steven</firstname><surname>Murdoch</surname>
+ <affiliation>
+ <address><email>sjmurdoch#torproject\org</email></address>
+ </affiliation>
+ </author>
+ <pubdate>Sep 15 2011</pubdate>
+ </articleinfo>
+
+<!--
+- Introduction and Threat model: [Mostly Torbutton]
+ - [Remove the security requirements section]
+-->
+
+<sect1>
+ <title>Introduction</title>
+ <para>
+
+<!-- XXX:
+This document describes the goals, operation, and testing procedures of the
+Torbutton Firefox extension. It is current as of Torbutton 1.3.2.
+-->
+
+ </para>
+ <sect2 id="adversary">
+ <title>Adversary Model</title>
+ <para>
+
+A Tor web browser adversary has a number of goals, capabilities, and attack
+types that can be used to guide us towards a set of requirements for the
+Torbutton extension. Let's start with the goals.
+
+ </para>
+ <sect3 id="adversarygoals">
+ <title>Adversary Goals</title>
+ <orderedlist>
+<!-- These aren't really commands.. But it's the closest I could find in an
+acceptable style.. Don't really want to make my own stylesheet -->
+ <listitem><command>Bypassing proxy settings</command>
+ <para>The adversary's primary goal is direct compromise and bypass of
+Tor, causing the user to directly connect to an IP of the adversary's
+choosing.</para>
+ </listitem>
+ <listitem><command>Correlation of Tor vs Non-Tor Activity</command>
+ <para>If direct proxy bypass is not possible, the adversary will likely
+happily settle for the ability to correlate something a user did via Tor with
+their non-Tor activity. This can be done with cookies, cache identifiers,
+javascript events, and even CSS. Sometimes the fact that a user uses Tor may
+be enough for some authorities.</para>
+ </listitem>
+ <listitem><command>History disclosure</command>
+ <para>
+The adversary may also be interested in history disclosure: the ability to
+query a user's history to see if they have issued certain censored search
+queries, or visited censored sites.
+ </para>
+ </listitem>
+ <listitem><command>Location information</command>
+ <para>
+
+Location information such as timezone and locality can be useful for the
+adversary to determine if a user is in fact originating from one of the
+regions they are attempting to control, or to zero in on the geographical
+location of a particular dissident or whistleblower.
+
+ </para>
+ </listitem>
+ <listitem><command>Miscellaneous anonymity set reduction</command>
+ <para>
+
+Anonymity set reduction is also useful in attempting to zero in on a
+particular individual. If the dissident or whistleblower is using a rare build
+of Firefox for an obscure operating system, this can be very useful
+information for tracking them down, or at least <link
+linkend="fingerprinting">tracking their activities</link>.
+
+ </para>
+ </listitem>
+ <listitem><command>History records and other on-disk
+information</command>
+ <para>
+In some cases, the adversary may opt for a heavy-handed approach, such as
+seizing the computers of all Tor users in an area (especially after narrowing
+the field by the above two pieces of information). History records and cache
+data are the primary goals here.
+ </para>
+ </listitem>
+ </orderedlist>
+ </sect3>
+
+ <sect3 id="adversarypositioning">
+ <title>Adversary Capabilities - Positioning</title>
+ <para>
+The adversary can position themselves at a number of different locations in
+order to execute their attacks.
+ </para>
+ <orderedlist>
+ <listitem><command>Exit Node or Upstream Router</command>
+ <para>
+The adversary can run exit nodes, or alternatively, they may control routers
+upstream of exit nodes. Both of these scenarios have been observed in the
+wild.
+ </para>
+ </listitem>
+ <listitem><command>Adservers and/or Malicious Websites</command>
+ <para>
+The adversary can also run websites, or more likely, they can contract out
+ad space from a number of different adservers and inject content that way. For
+some users, the adversary may be the adservers themselves. It is not
+inconceivable that adservers may try to subvert or reduce a user's anonymity
+through Tor for marketing purposes.
+ </para>
+ </listitem>
+ <listitem><command>Local Network/ISP/Upstream Router</command>
+ <para>
+The adversary can also inject malicious content at the user's upstream router
+when they have Tor disabled, in an attempt to correlate their Tor and Non-Tor
+activity.
+ </para>
+ </listitem>
+ <listitem><command>Physical Access</command>
+ <para>
+Some users face adversaries with intermittent or constant physical access.
+Users in Internet cafes, for example, face such a threat. In addition, in
+countries where simply using tools like Tor is illegal, users may face
+confiscation of their computer equipment for excessive Tor usage or just
+general suspicion.
+ </para>
+ </listitem>
+ </orderedlist>
+ </sect3>
+
+ <sect3 id="attacks">
+ <title>Adversary Capabilities - Attacks</title>
+ <para>
+
+The adversary can perform the following attacks from a number of different
+positions to accomplish various aspects of their goals. It should be noted
+that many of these attacks (especially those involving IP address leakage) are
+often performed by accident by websites that simply have Javascript, dynamic
+CSS elements, and plugins. Others are performed by adservers seeking to
+correlate users' activity across different IP addresses, and still others are
+performed by malicious agents on the Tor network and at national firewalls.
+
+ </para>
+ <orderedlist>
+ <listitem><command>Inserting Javascript</command>
+ <para>
+If not properly disabled, Javascript event handlers and timers
+can cause the browser to perform network activity after Tor has been disabled,
+thus allowing the adversary to correlate Tor and Non-Tor activity and reveal
+a user's non-Tor IP address. Javascript
+also allows the adversary to execute <ulink
+url="http://whattheinternetknowsaboutyou.com/">history disclosure attacks</ulink>:
+to query the history via the different attributes of 'visited' links to search
+for particular Google queries, sites, or even to <ulink
+url="http://www.mikeonads.com/2008/07/13/using-your-browser-url-history-estimate…">profile
+users based on gender and other classifications</ulink>. Finally,
+Javascript can be used to query the user's timezone via the
+<function>Date()</function> object, and to reduce the anonymity set by querying
+the <function>navigator</function> object for operating system, CPU, locale,
+and user agent information.
+ </para>
+ </listitem>
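As a concrete illustration of the last point, a page script can gather this information in a few lines. This is a sketch, not Torbutton code; the stub fallback exists only so the snippet also runs outside a browser, where `navigator` may be absent.

```javascript
// Information a content script can read to reduce the anonymity set.
// The stub is a non-browser fallback; in a real page, navigator is native.
const nav = (typeof navigator !== "undefined") ? navigator
          : { platform: "stub", language: "stub", userAgent: "stub" };
const fingerprint = {
  tzOffsetMinutes: new Date().getTimezoneOffset(), // timezone via Date()
  platform: nav.platform,     // operating system / CPU hints
  language: nav.language,     // locale
  userAgent: nav.userAgent,   // browser and version
};
console.log(JSON.stringify(fingerprint));
```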
+
+ <listitem><command>Inserting Plugins</command>
+ <para>
+
+Plugins are abysmal at obeying the proxy settings of the browser. Every plugin
+capable of performing network activity that the author has
+investigated is also capable of performing network activity independent of
+browser proxy settings - and often independent of its own proxy settings.
+Sites that have plugin content don't even have to be malicious to obtain a
+user's
+Non-Tor IP (it usually leaks by itself), though <ulink
+url="http://decloak.net">plenty of active
+exploits</ulink> are possible as well. In addition, plugins can be used to store unique identifiers that are more
+difficult to clear than standard cookies.
+<ulink url="http://epic.org/privacy/cookies/flash.html">Flash-based
+cookies</ulink> fall into this category, but there are likely numerous other
+examples.
+
+ </para>
+ </listitem>
+ <listitem><command>Inserting CSS</command>
+ <para>
+
+CSS can also be used to correlate Tor and Non-Tor activity and reveal a user's
+Non-Tor IP address, via the usage of
+<ulink url="http://www.tjkdesign.com/articles/css%20pop%20ups/">CSS
+popups</ulink> - essentially CSS-based event handlers that fetch content via
+CSS's onmouseover attribute. If these popups are allowed to perform network
+activity in a different Tor state than they were loaded in, they can easily
+correlate Tor and Non-Tor activity and reveal a user's IP address. In
+addition, CSS can also be used without Javascript to perform <ulink
+url="http://ha.ckers.org/weird/CSS-history.cgi">CSS-only history disclosure
+attacks</ulink>.
+ </para>
+ </listitem>
+ <listitem><command>Read and insert cookies</command>
+ <para>
+
+An adversary in a position to perform MITM content alteration can inject
+document content elements to both read and inject cookies for
+arbitrary domains. In fact, many "SSL secured" websites are vulnerable to this
+sort of <ulink url="http://seclists.org/bugtraq/2007/Aug/0070.html">active
+sidejacking</ulink>.
+
+ </para>
+ </listitem>
+ <listitem><command>Create arbitrary cached content</command>
+ <para>
+
+Likewise, the browser cache can also be used to <ulink
+url="http://crypto.stanford.edu/sameorigin/safecachetest.html">store unique
+identifiers</ulink>. Since by default the cache has no same-origin policy,
+these identifiers can be read by any domain, making them an ideal target for
+adserver-class adversaries.
+
+ </para>
+ </listitem>
+
+ <listitem id="fingerprinting"><command>Fingerprint users based on browser
+attributes</command>
+<para>
+
+There is an absurd amount of information available to websites via attributes
+of the browser. This information can be used to reduce anonymity set, or even
+<ulink url="http://mandark.fr/0x000000/articles/Total_Recall_On_Firefox..html">uniquely
+fingerprint individual users</ulink>. </para>
+<para>
+For illustration, let's perform a
+back-of-the-envelope calculation on the number of anonymity sets for just the
+resolution information available in the <ulink
+url="http://developer.mozilla.org/en/docs/DOM:window">window</ulink> and
+<ulink
+url="http://developer.mozilla.org/en/docs/DOM:window.screen">window.screen</ulink>
+objects.
+
+
+
+Browser window resolution information provides something like
+(1280-640)*(1024-480)=348160 different anonymity sets. Desktop resolution
+information contributes about another factor of 5 (for about 5 resolutions in
+typical use). In addition, the dimensions and position of the desktop taskbar
+are available, which can reveal hints on OS information. This boosts the count
+by a factor of 5 (for each of the major desktop taskbars - Windows, OSX, KDE
+and Gnome, and None). Subtracting the browser content window
+size from the browser outer window size provides yet more information.
+Firefox toolbar presence gives about a factor of 8 (3 toolbars on/off give
+2<superscript>3</superscript>=8). Interface effects such as title bar font size
+and window manager settings give a factor of about 9 (say 3 common font sizes
+for the title bar and 3 common sizes for browser GUI element fonts).
+Multiply this all out, and you have (1280-640)*(1024-480)*5*5*8*9 ~=
+2<superscript>29</superscript>, or a 29 bit identifier based on resolution
+information alone. </para>
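The arithmetic above can be checked directly; the factors are the text's own rough estimates, not measured values.

```javascript
// Back-of-the-envelope anonymity set count from the estimates above.
const sets = (1280 - 640) * (1024 - 480) // browser window resolutions
           * 5   // desktop resolutions in typical use
           * 5   // taskbar variants (Windows, OSX, KDE, Gnome, none)
           * 8   // 3 toolbars on/off = 2^3
           * 9;  // 3 title bar font sizes * 3 GUI element font sizes
const bits = Math.log2(sets);
console.log(sets, bits.toFixed(1)); // prints: 626688000 29.2
```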
+
+<para>
+
+Of course, this space is non-uniform in user density and prone to incremental
+changes. The <ulink
+url="https://wiki.mozilla.org/Fingerprinting#Data">Panopticlick study
+done</ulink> by the EFF attempts to measure the actual entropy - the number of
+identifying bits of information encoded in browser properties. Their result
+data is definitely useful, and the metric is probably the appropriate one for
+determining how identifying a particular browser property is. However, some
+quirks of their study mean that they do not extract as much information as
+they could from display information: they only use desktop resolution (which
+Torbutton reports as the window resolution) and do not attempt to infer the
+size of toolbars.
+
+</para>
+<!--
+FIXME: This is no longer true. Only certain addons are now discoverable, and
+only if they want to be:
+http://webdevwonders.com/detecting-firefox-add-ons/
+https://developer.mozilla.org/en/Updating_web_applications_for_Firefox_3#section_7
+
+<para>
+
+To add insult to injury, <ulink
+url="http://pseudo-flaw.net/content/tor/torbutton/">chrome URL disclosure
+attacks</ulink> mean that each and every extension on <ulink
+url="https://addons.mozilla.org">addons.mozilla.org</ulink> adds another bit
+to that 2<superscript>29</superscript>. With hundreds of popular extensions
+and thousands of extensions total, it is easy to see that this sort of
+information is an impressively powerful identifier if used properly by a
+competent and determined adversary such as an ad network. Again, a
+nearest-neighbor bit vector space approach here would also gracefully handle
+incremental changes to installed extensions.
+
+</para>
+-->
+ </listitem>
+ <listitem><command>Remotely or locally exploit browser and/or
+OS</command>
+ <para>
+Last, but definitely not least, the adversary can exploit either general
+browser vulnerabilities, plugin vulnerabilities, or OS vulnerabilities to
+install malware and surveillance software. An adversary with physical access
+can perform similar actions. Regrettably, this last attack capability is
+outside of Torbutton's ability to defend against, but it is worth mentioning
+for completeness.
+ </para>
+ </listitem>
+ </orderedlist>
+ </sect3>
+
+ </sect2>
+</sect1>
+
+<!--
+- Design overview and philosophy
+ - Security requirements [Torbutton]
+ + local leaks?
+ - state issues
+ - Privacy Requirements [Mostly blog post]
+ - Avoid Cross-Domain Linkability
+ - Indentifiers
+ - Fingerprinting
+ - 100% self-contained
+ - Does not share state with other modes/browsers
+ - Easy to remove + wipe with external tools
+ - click-to-play for "troublesome" features
+ - No filters
+-->
+
+<sect1 id="Design">
+ <title>Design and Philosophy</title>
+ <para>
+
+The Tor Browser is meant to serve as a specification and a reference
+implementation of a Private Browsing Mode that defends against both Network
+and Local adversaries.
+
+ </para>
+ <para>
+
+There are two main categories of requirements: Security Requirements, and
+Privacy Requirements. Security Requirements are the minimum properties a web
+client platform must have in order to support Tor. Privacy
+requirements are the set of properties that cause us to prefer one platform
+over another.
+
+We will maintain an alternate distribution of the web client in order to
+maintain and/or restore privacy properties to our users.
+
+ </para>
+ <sect2 id="security">
+ <title>Security Requirements</title>
+ <para>
+
+ </para>
+
+<orderedlist>
+ <listitem><command>Proxy Obedience</command>
+ <para>The browser
+MUST NOT bypass Tor proxy settings for any content.</para></listitem>
+
+ <listitem><command>State Separation</command>
+ <para>The browser MUST NOT provide any stored state to the content window
+from other browsing modes, including shared state from plugins, machine
+identifiers, and TLS session state.
+</para></listitem>
+
+ <listitem><command>Disk Avoidance</command><para>The
+browser SHOULD NOT write any browsing history information to disk, or store it
+in memory beyond the duration of one Tor session, unless the user has
+explicitly opted to store their browsing history information to
+disk.</para></listitem>
+
+ <listitem><command>Disk Isolation</command><para>The Tor
+components of the browser MUST NOT write or cause the Operating System to
+write <emphasis>any information</emphasis> to disk outside of the application
+directory. All exceptions and shortcomings due to Operating System behavior
+MUST be documented.
+
+</para></listitem>
+ <listitem><command>Update Safety</command><para>The browser SHOULD NOT perform unsafe updates or upgrades.</para></listitem>
+</orderedlist>
+ </sect2>
+
+ <sect2 id="Privacy">
+ <title>Privacy Requirements</title>
+ <para>
+
+ </para>
+
+<orderedlist>
+ <listitem><command>Cross-Domain Identifier Unlinkability</command>
+ <para>
+
+User activity on one url bar domain MUST NOT be linkable to their activity in
+any other domain by any third party. This property specifically applies to
+linkability from stored browser identifiers, authentication tokens, and shared
+state.
+
+ </para>
+ </listitem>
+ <listitem><command>Cross-Domain Fingerprinting Unlinkability</command>
+ <para>
+
+User activity on one url bar domain MUST NOT be linkable to their activity in
+any other domain by any third party. This property specifically applies to
+linkability from fingerprinting browser behavior.
+
+ </para>
+ </listitem>
+<!--
+ <listitem id="click-to-play"><command>Click-to-play for plugins and invasive content</command>
+ <para>
+
+XXX: Generalize+clarify
+
+Certain activities are inherently fingerprintable. For example, even if
+properly proxied, the activities of closed-source plugins are very difficult to
+control. Other browser features, such as WebGL, GeoLocation, and user-allowed
+exemptions to the identifier policy MUST NOT run until the user has clicked to
+explicitly allow that object or action. If the user decides to craft an
+exemption, it MUST ONLY apply to the top level urlbar domain, and not to all
+sites, to reduce linkability.
+
+ </para>
+ </listitem>
+-->
+</orderedlist>
+
+ </sect2>
+
+</sect1>
+
+<!--
+- Implementation
+ - Section Template
+ - Sub Section
+ - "Design Goal":
+ - "Implementation Status"
+ - Local Privacy
+ - Linkability
+ - Stored State
+ - Cookies
+ - Cache
+ - DOM Storage
+ - HTTP Auth
+ - SSL state
+ - Plugins
+ - Fingerprinting
+ - Location + timezone is part of this
+ - Patches?
+-->
+
+<sect1 id="Implementation">
+ <title>Implementation</title>
+ <para>
+ </para>
+ <sect2 id="proxy-obedience">
+ <title>Proxy Obedience</title>
+ <para>
+
+Proxy obedience is assured through the following:
+
+ </para>
+ <orderedlist>
+ <listitem><para>Proxy settings</para></listitem>
+ <listitem><para>Blocking plugins</para></listitem>
+ <listitem><para>External app blocking</para></listitem>
+ </orderedlist>
+ </sect2>
+ <sect2 id="state-separation">
+ <title>State Separation</title>
+ <para>
+Tor Browser State is separated from existing browser state through use of a
+custom Firefox profile.
+ </para>
+ </sect2>
+ <sect2 id="disk-avoidance">
+ <title>Disk Avoidance</title>
+ <para>
+<!-- XXX: Settings involved -->
+
+ </para>
+ </sect2>
+ <sect2 id="disk-isolation">
+ <title>Disk Isolation</title>
+ <para>
+ </para>
+ </sect2>
+ <sect2 id="update-safety">
+ <title>Update Safety</title>
+ <para> </para>
+ </sect2>
+ <sect2 id="identifier-linkability">
+ <title>Cross-Domain Identifier Unlinkability</title>
+ <para> </para>
+ </sect2>
+ <sect2 id="fingerprinting-linkability">
+ <title>Cross-Domain Fingerprinting Unlinkability</title>
+ <para> </para>
+ </sect2>
+ <sect2 id="click-to-play">
+ <title>Click-to-play for plugins and invasive content</title>
+ <para> </para>
+ </sect2>
+
+</sect1>
+
+<!--
+- Packaging
+ - Build Process Security
+ - External Addons
+ - Included
+ - HTTPS-E
+ - NoScript
+ - Torbutton
+ - Deliberately excluded
+ - Request Policy, AdblockPlus, etc
+ - Desired
+ - Perspectives/Convergence/etc
+ - Pref Changes
+ - Caused by Torbutton
+ - Set manually in profile
+ - Update security
+ - Thandy
+-->
+
+<sect1 id="Packaging">
+ <title>Packaging</title>
+ <para> </para>
+ <sect2 id="build-security">
+ <title>Build Process Security</title>
+ <para> </para>
+ </sect2>
+ <sect2 id="addons">
+ <title>External Addons</title>
+ <para> </para>
+ <sect3>
+ <title>Included Addons</title>
+ </sect3>
+ <sect3>
+ <title>Excluded Addons</title>
+ </sect3>
+ <sect3>
+ <title>Dangerous Addons</title>
+ </sect3>
+ </sect2>
+ <sect2 id="prefs">
+ <title>Pref Changes</title>
+ <para> </para>
+ </sect2>
+ <sect2 id="update-mechanism">
+ <title>Update Security</title>
+ <para> </para>
+ </sect2>
+</sect1>
+
+<sect1 id="TestPlan">
+ <title>Testing</title>
+ <para>
+
+The purpose of this section is to cover all the known ways that Tor browser
+security can be subverted from a penetration testing perspective. The hope
+is that it will be useful both for creating a "Tor Safety Check"
+page, and for developing novel tests and actively attacking Torbutton with the
+goal of finding vulnerabilities in either it or the Mozilla components,
+interfaces and settings upon which it relies.
+
+ </para>
+ <sect2 id="SingleStateTesting">
+ <title>Single state testing</title>
+ <para>
+
+Torbutton is a complicated piece of software. During development, changes to
+one component can affect a whole slew of unrelated features. A number of
+aggregated test suites exist that can be used to test for regressions in
+Torbutton and to aid in the development of Torbutton-like addons and
+other privacy modifications of other browsers. Some of these test suites exist
+as a single automated page, while others are a series of pages you must visit
+individually. They are provided here for reference and future regression
+testing, and also in the hope that some brave soul will one day decide to
+combine them into a comprehensive automated test suite.
+
+ <orderedlist>
+ <listitem><ulink url="http://decloak.net/">Decloak.net</ulink>
+ <para>
+
+Decloak.net is the canonical source of plugin and external-application based
+proxy-bypass exploits. It is a fully automated test suite maintained by <ulink
+url="http://digitaloffense.net/">HD Moore</ulink> as a service for people to
+use to test their anonymity systems.
+
+ </para>
+ </listitem>
+ <listitem><ulink url="http://deanonymizer.com/">Deanonymizer.com</ulink>
+ <para>
+
+Deanonymizer.com is another automated test suite that tests for proxy bypass
+and other information disclosure vulnerabilities. It is maintained by Kyle
+Williams, the author of <ulink url="http://www.janusvm.com/">JanusVM</ulink>
+and <ulink url="http://www.januspa.com/">JanusPA</ulink>.
+
+ </para>
+ </listitem>
+ <listitem><ulink url="https://www.jondos.de/en/anontest">JonDos
+AnonTest</ulink>
+ <para>
+
+The <ulink url="https://www.jondos.de">JonDos people</ulink> also provide an
+anonymity tester. It is more focused on HTTP headers than plugin bypass, and
+points out a couple of headers Torbutton could do a better job of
+obfuscating.
+
+ </para>
+ </listitem>
+ <listitem><ulink url="http://browserspy.dk">Browserspy.dk</ulink>
+ <para>
+
+Browserspy.dk provides a tremendous collection of browser fingerprinting and
+general privacy tests. Unfortunately they are only available one page at a
+time, and there is no clear feedback on good vs bad behavior in
+the test results.
+
+ </para>
+ </listitem>
+ <listitem><ulink url="http://analyze.privacy.net/">Privacy
+Analyzer</ulink>
+ <para>
+
+The Privacy Analyzer provides a dump of all sorts of browser attributes and
+settings that it detects, including some information on your origin IP
+address. Its page layout and lack of good vs bad test result feedback make it
+not as useful as a user-facing testing tool, but it does provide some
+interesting checks in a single page.
+
+ </para>
+ </listitem>
+ <listitem><ulink url="http://ha.ckers.org/mr-t/">Mr. T</ulink>
+ <para>
+
+Mr. T is a collection of browser fingerprinting and deanonymization exploits
+discovered by the <ulink url="http://ha.ckers.org">ha.ckers.org</ulink> crew
+and others. It is also not as user friendly as some of the above tests, but it
+is a useful collection.
+
+ </para>
+ </listitem>
+ <listitem>Gregory Fleischer's <ulink
+url="http://pseudo-flaw.net/content/tor/torbutton/">Torbutton</ulink> and
+<ulink
+url="http://pseudo-flaw.net/content/defcon/dc-17-demos/d.html">Defcon
+17</ulink> Test Cases
+ <para>
+
+Gregory Fleischer has been hacking and testing Firefox and Torbutton privacy
+issues for the past 2 years. He has an excellent collection of all his test
+cases that can be used for regression testing. In his Defcon work, he
+demonstrates ways to infer Firefox version based on arcane browser properties.
+We are still trying to determine the best way to address some of those test
+cases.
+
+ </para>
+ </listitem>
+ <listitem><ulink url="https://torcheck.xenobite.eu/index.php">Xenobite's
+TorCheck Page</ulink>
+ <para>
+
+This page checks to ensure you are using a valid Tor exit node and checks for
+some basic browser properties related to privacy. It is not very fine-grained
+or complete, but it is automated and could be turned into something useful
+with a bit of work.
+
+ </para>
+ </listitem>
+ </orderedlist>
+ </para>
+ </sect2>
+ <sect2>
+ <title>Multi-state testing</title>
+ <para>
+
+The tests in this section are geared towards a page that would instruct the
+user to toggle their Tor state after the fetch and perform some operations:
+mouseovers, stray clicks, and potentially reloads.
+
+ </para>
+ <sect3>
+ <title>Cookies and Cache Correlation</title>
+ <para>
+The most obvious test is to set a cookie, ask the user to toggle Tor, and then
+have them reload the page. The cookie should no longer be set if they are
+using the default Torbutton settings. In addition, it is possible to leverage
+the cache to <ulink
+url="http://crypto.stanford.edu/sameorigin/safecachetest.html">store unique
+identifiers</ulink>. The default settings of Torbutton should also prevent
+these from persisting across a Tor toggle.
+
+ </para>
+ </sect3>
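The cookie half of this test can be sketched as follows. The cookie name `tortest` is an assumption, and the stub document lets the snippet run outside a browser.

```javascript
// Sketch of a cookie-correlation test page. In a browser, doc is document.
const doc = (typeof document !== "undefined") ? document : { cookie: "" };
function setMarker() { doc.cookie = "tortest=1"; }
function markerPresent() { return /(^|;\s*)tortest=1/.test(doc.cookie); }
setMarker();
// After a Tor toggle and a reload, markerPresent() should return false
// under default Torbutton settings.
console.log(markerPresent()); // true before the toggle
```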
+ <sect3>
+ <title>Javascript timers and event handlers</title>
+ <para>
+
+Javascript can set timers and register event handlers in the hopes of fetching
+URLs after the user has toggled Torbutton.
+ </para>
+ </sect3>
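A minimal sketch of this vector is below; the tracker URL is hypothetical, and the fetch function is injected so the snippet performs no real network activity here.

```javascript
// The beacon carries an identifier linking activity across Tor states.
function beacon(url, fetchFn) { return fetchFn(url).catch(() => {}); }
// Registered at page load; fires long after, hopefully in the opposite state.
function scheduleBeacon(url, delayMs, fetchFn) {
  return setTimeout(() => beacon(url, fetchFn), delayMs);
}
const hits = [];
beacon("http://tracker.example/b?id=12345",
       (u) => { hits.push(u); return Promise.resolve(); });
console.log(hits.length); // 1
```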
+ <sect3>
+ <title>CSS Popups and non-script Dynamic Content</title>
+ <para>
+
+Even if Javascript is disabled, CSS is still able to
+<ulink url="http://www.tjkdesign.com/articles/css%20pop%20ups/">create popup-like
+windows</ulink>
+via the 'onmouseover' CSS attribute, which can cause arbitrary browser
+activity as soon as the mouse enters into the content window. It is also
+possible for meta-refresh tags to set timers long enough to make it likely
+that the user has toggled Tor before fetching content.
+
+ </para>
+ </sect3>
+ </sect2>
+ <sect2 id="HackTorbutton">
+ <title>Active testing (aka How to Hack Torbutton)</title>
+ <para>
+
+The idea behind active testing is to discover vulnerabilities in Torbutton to
+bypass proxy settings, run script in an opposite Tor state, store unique
+identifiers, leak location information, or otherwise violate <link
+linkend="requirements">its requirements</link>. Torbutton has ventured out
+into a strange and new security landscape. It depends on Firefox mechanisms
+that haven't necessarily been audited for security, certainly not for the
+threat model that Torbutton seeks to address. As such, it and the interfaces
+it depends upon still need a 'trial by fire' typical of new technologies. This
+section of the document was written with the intention of making that period
+as fast as possible. Please help us get through this period by considering
+these attacks, playing with them, and reporting what you find (and potentially
+submitting the test cases back to be run in the standard batch of Torbutton
+tests).
+
+ </para>
+ <sect3>
+ <title>Some suggested vectors to investigate</title>
+ <para>
+ <itemizedlist>
+ <listitem>Strange ways to register Javascript <ulink
+url="http://en.wikipedia.org/wiki/DOM_Events">events</ulink> and <ulink
+url="http://www.devshed.com/c/a/JavaScript/Using-Timers-in-JavaScript/">timeouts</ulink> should
+be verified to actually be ineffective after Tor has been toggled.</listitem>
+ <listitem>Other ways to cause Javascript to be executed after
+<command>javascript.enabled</command> has been toggled off.</listitem>
+ <listitem>Odd ways to attempt to load plugins. Kyle Williams has had
+some success with direct loads/meta-refreshes of plugin-handled URLs.</listitem>
+ <listitem>The Date and Timezone hooks should be verified to work with
+crazy combinations of iframes, nested iframes, iframes in frames, frames in
+iframes, and popups being loaded and
+reloaded in rapid succession, and/or from one another. Think race conditions and deep,
+parallel nesting, involving iframes from both <ulink
+url="http://en.wikipedia.org/wiki/Same_origin_policy">same-origin and
+non-same-origin</ulink> domains.</listitem>
+ <listitem>In addition, there may be alternate ways and other
+methods to query the timezone, or otherwise use some of the Date object's
+methods in combination to deduce the timezone offset. Of course, the author
+tried his best to cover all the methods he could foresee, but it's always good
+to have another set of eyes try it out.</listitem>
+ <listitem>Similarly, is there any way to confuse the <link
+linkend="contentpolicy">content policy</link>
+mentioned above to cause it to allow certain types of page fetches? For
+example, it was recently discovered that favicons are not fetched by the
+content, but the chrome itself, hence the content policy did not look up the
+correct window to determine the current Tor tag for the favicon fetch. Are
+there other things that can do this? Popups? Bookmarklets? Active bookmarks? </listitem>
+ <listitem>Alternate ways to store and fetch unique identifiers. For example, <ulink
+url="http://developer.mozilla.org/en/docs/DOM:Storage">DOM Storage</ulink>
+caught us off guard.
+It was
+also discovered by <ulink url="http://pseudo-flaw.net">Gregory
+Fleischer</ulink> that <ulink
+url="http://pseudo-flaw.net/content/tor/torbutton/">content window access to
+chrome</ulink> can be used to build <link linkend="fingerprinting">unique
+identifiers</link>.
+Are there any other
+arcane or experimental ways that Firefox provides to create and store unique
+identifiers? Or perhaps unique identifiers can be queried or derived from
+properties of the machine/browser that Javascript has access to? How unique
+can these identifiers be?
+ </listitem>
+ <listitem>Is it possible to get the browser to write some history to disk
+(aside from swap) that can be retrieved later? By default, Torbutton should
+write no history, cookie, or other browsing activity information to the
+harddisk.</listitem>
+ <listitem>Do popup windows make it easier to break any of the above
+behavior? Are javascript events still canceled in popups? What about recursive
+popups from Javascript, data, and other funky URL types? What about CSS
+popups? Are they still blocked after Tor is toggled?</listitem>
+ <listitem>Chrome-escalation attacks. The interaction between the
+Torbutton chrome Javascript and the client content window javascript is pretty
+well-defined and carefully constructed, but perhaps there is a way to smuggle
+javascript back in a return value, or otherwise inject network-loaded
+javascript into the chrome (and thus gain complete control of the browser).
+</listitem>
+</itemizedlist>
+
+ </para>
+ </sect3>
+ </sect2>
+</sect1>
+</article>

28 Apr '14
commit e6c051cd7ae5caf544eac03b2eefa8dd646ea192
Author: Mike Perry <mikeperry-git(a)fscked.org>
Date: Sat Sep 24 05:04:38 2011 -0700
Document Firefox patches.
Also provide a sketch for the New Identity section.
---
docs/design/design.xml | 118 +++++++++++++++++++++++++++++++++++++++++++-----
1 file changed, 106 insertions(+), 12 deletions(-)
diff --git a/docs/design/design.xml b/docs/design/design.xml
index 619f76d..0d3d385 100644
--- a/docs/design/design.xml
+++ b/docs/design/design.xml
@@ -601,7 +601,6 @@ and/or what additional work or auditing needs to be done.
</sect2>
<sect2 id="identifier-linkability">
<title>Cross-Domain Identifier Unlinkability</title>
- <!-- XXX: Design goals vs implementation status -->
<para>
The Tor Browser MUST prevent a user's activity on one site from being linked
@@ -686,9 +685,10 @@ cases</ulink> are expected to fail.
<listitem>HTTP Auth
<para>
-HTTP authentication tokens are removed for third parties
-on-modify-request observer to remove the heads. However, we also needed to
-<ulink
+HTTP authentication tokens for third parties are removed by an on-modify-request
+observer, which strips the headers to prevent <ulink
+url="http://jeremiahgrossman.blogspot.com/2007/04/tracking-users-without-cookies…">silent
+linkability between domains</ulink>. We also needed to <ulink
url="https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src…">patch
Firefox to cause the headers to get added early enough</ulink> to allow the
+observer to modify them.
@@ -755,6 +755,25 @@ functionality.
<para>
</para>
</sect2>
+ <sect2 id="new-identity">
+ <title>Provide "New Identity" button to purge all state</title>
+ <para>
+XXX: make this prettier
+ 0. Disables all open tabs and windows.
+ 1. Closes all tabs and windows
+ 2. Clears state:
+ a. OCSP
+ b. Cache
+ c. Site-specific zoom
+ d. Cookies+DOM Storage+safe browsing key
+ e. google wifi geolocation token
+ f. http auth
+ g. SSL Session IDs
+ h. last open location url
+ i. clear content prefs
+ 3. Sends tor the NEWNYM signal to get a new circuit
+ </para>
+ </sect2>
<sect2 id="click-to-play">
<title>Click-to-play for plugins and invasive content</title>
<para>
@@ -774,34 +793,109 @@ audio and video objects.
<sect2 id="firefox-patches">
<title>Description of Firefox Patches</title>
<para>
-https://gitweb.torproject.org/torbrowser.git/tree/refs/heads/maint-2.2:/src/current-patches
+The set of patches we have against Firefox can be found in the <ulink
+url="https://gitweb.torproject.org/torbrowser.git/tree/refs/heads/maint-2.2:/src…">current-patches
+directory of the torbrowser git repository</ulink>
</para>
<orderedlist>
<listitem>Block Components.interfaces and Components.lookupMethod
- <para> </para>
+ <para>
+
+In order to reduce fingerprinting, we block access to these two interfaces
+from content script. Components.lookupMethod can undo our javascript hooks,
+and Components.interfaces is useful for fingerprinting the platform, OS, and
+Firefox version.
+
+ </para>
</listitem>
<listitem>Make Permissions Manager memory only
- <para> </para>
+ <para>
+
+This patch exposes a pref, 'permissions.memory_only', that properly isolates
+the permissions manager to memory. The permissions manager is responsible for
+all user-specified site permissions, as well as HTTPS STS policy stored from
+visited sites.
+
+The pref does successfully clear the permissions manager memory if toggled. It
+does not need to be set in prefs.js, and can be handled by Torbutton.
+
+ </para>
+ <para><command>Design Goal:</command>
+
+As an additional design goal, we would like to later alter this patch to
+allow this information to be cleared from memory. The implementation does not
+currently allow this.
+
+ </para>
</listitem>
<listitem>Make Intermediate Cert Store memory-only
- <para> </para>
+ <para>
+
+The intermediate certificate store holds information about SSL certificates
+that may only be used by a limited number of domains, in some cases
+effectively recording on disk the fact that a website owned by a certain
+organization was viewed.
+
+ </para>
+ <!-- FIXME: Should these design goals be <note> tags? -->
+ <para><command>Design Goal:</command>
+
+As an additional design goal, we would like to later alter this patch to
+allow this information to be cleared from memory. The implementation does not
+currently allow this.
+
+ </para>
</listitem>
<listitem>Add HTTP auth headers before on-modify-request fires
- <para> </para>
+ <para>
+
+This patch provides a trivial modification that allows us to properly remove
+HTTP auth for third parties, defending against an adversary
+attempting to use <ulink
+url="http://jeremiahgrossman.blogspot.com/2007/04/tracking-users-without-cookies…">HTTP
+auth to silently track users between domains</ulink>.
+
+ </para>
</listitem>
<listitem>Add a string-based cacheKey property for domain isolation
- <para> </para>
+ <para>
+
+To <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3666">increase the
+security of cache isolation</ulink> and to <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3754">solve strange and
+unknown conflicts with OCSP</ulink>, we had to <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src…">patch
+Firefox to provide a cacheDomain cache attribute</ulink>. We use the full
+url bar domain as input to this field.
+
+ </para>
</listitem>
<listitem>Randomize HTTP pipeline order and depth
<para>
-https://blog.torproject.org/blog/experimental-defense-website-traffic-fingerprinting
+As an
+<ulink
+url="https://blog.torproject.org/blog/experimental-defense-website-traffic-finge…">experimental
+defense against Website Traffic Fingerprinting</ulink>, we patch the standard
+HTTP pipelining code to randomize the number of requests in a
+pipeline, as well as their order.
</para>
</listitem>
<listitem>Block all plugins except flash
- <para> </para>
+ <para>
+<!-- XXX: Why allow flash at all?? Justify w/ a design goal describing a
+happy, safe-flash future... But here, or in some other section?? -->
+We cannot use the @mozilla.org/extensions/blocklist;1 service, because we
+actually want to stop plugins from ever entering the browser's process space
+and/or executing code (for example, AV plugins that collect statistics/analyse
+urls, magical toolbars that phone home or "help" the user, skype buttons that
+ruin our day, and censorship filters). Hence we rolled our own.
+ </para>
</listitem>
<listitem>Make content-prefs service memory only
<para>
+This patch prevents random urls from being inserted into content-prefs.sqlite
+in the profile directory as content prefs change (this includes site zoom and
+perhaps other site prefs?).
</para>
</listitem>
</orderedlist>
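The "New Identity" sequence sketched in the commit above can be illustrated in miniature. This is a hypothetical model only: the `browser` and `tor_control` objects and their method names are illustrative stand-ins, not Torbutton's actual API, and the state labels simply mirror the list in the document.

```python
# Hypothetical sketch of the "New Identity" sequence described above.
# Object and method names are illustrative, not Torbutton's real API.

STATE_TO_CLEAR = [
    "ocsp", "cache", "site-zoom", "cookies-and-dom-storage",
    "safe-browsing-key", "geolocation-token", "http-auth",
    "ssl-session-ids", "last-open-location", "content-prefs",
]

def new_identity(browser, tor_control):
    browser.block_all_tabs()           # 0. disable open tabs and windows first
    browser.close_all_tabs()           # 1. then close them
    for item in STATE_TO_CLEAR:        # 2. purge each piece of browser state
        browser.clear_state(item)
    tor_control.send("SIGNAL NEWNYM")  # 3. ask tor for fresh circuits
```

The ordering matters: tabs are blocked before they are closed so no page can race the state-clearing steps, and the NEWNYM signal comes last so new circuits are only requested once the old identity's state is gone.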
[tor-browser-spec/master] Add some implementation information.
by mikeperry@torproject.org 28 Apr '14
28 Apr '14
commit 51c178fe236daad4929eeb42df3f15638c6e42e4
Author: Mike Perry <mikeperry-git(a)fscked.org>
Date: Fri Sep 23 12:34:38 2011 -0700
Add some implementation information.
---
docs/design/design.xml | 216 +++++++++++++++++++++++++++++++++++++++++++++---
1 file changed, 205 insertions(+), 11 deletions(-)
diff --git a/docs/design/design.xml b/docs/design/design.xml
index 419143a..586184c 100644
--- a/docs/design/design.xml
+++ b/docs/design/design.xml
@@ -477,47 +477,241 @@ sites, to reduce linkability.
<para>
Proxy obedience is assured through the following:
-
-1. Proxy settings
-2. Blocking Plugins
-3. External App Blocking
-
</para>
+<orderedlist>
+ <listitem>Firefox Proxy settings
+ <para>
+ The Torbutton xpi sets the Firefox proxy settings to use Tor directly as a
+SOCKS proxy. It sets <command>network.proxy.socks_remote_dns</command>,
+<command>network.proxy.socks_version</command>, and
+<command>network.proxy.socks_port</command>.
+ </para>
+</listitem>
+
+ <listitem>Disabling plugins
+ <para>
+ Plugins have the ability to make arbitrary OS system calls. This includes
+the ability to open UDP sockets and send arbitrary data independently of the
+browser proxy settings.
+ </para>
+ <para>
+Torbutton disables plugins by using the
+<command>@mozilla.org/plugin/host;1</command> service to mark the plugin tags
+as disabled. Additionally, we set
+<command>plugin.disable_full_page_plugin_for_types</command> to the list of
+supported mime types for all currently installed plugins.
+ </para>
+ <para>
+In addition, to prevent any unproxied activity by plugins at load time, we
+also patch the Firefox source code to <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src…">prevent the load of any plugins except
+for Flash and Gnash</ulink>.
+
+ </para>
+ </listitem>
+ <listitem>External App Blocking
+ <para>
+External apps, if launched automatically, can be induced to load files that
+perform network activity. In order to prevent this, Torbutton installs a
+component to
+<ulink
+url="https://gitweb.torproject.org/torbutton.git/blob_plain/HEAD:/src/components…">
+provide the user with a popup</ulink> whenever the browser attempts to
+launch a helper app.
+ </para>
+ </listitem>
+ </orderedlist>
</sect2>
<sect2 id="state-separation">
<title>State Separation</title>
<para>
Tor Browser State is separated from existing browser state through use of a
-custom Firefox profile.
+custom Firefox profile. Furthermore, plugins are disabled, which prevents
+Flash cookies from leaking from a pre-existing Flash directory.
</para>
</sect2>
<sect2 id="disk-avoidance">
<title>Disk Avoidance</title>
<para>
-<!-- XXX: Settings involved -->
+
+<!-- XXX: http auth on disk??? -->
+
+dom.storage.enabled
+browser.cache.memory.enable
+network.http.use-cache
+browser.cache.disk.enable
+browser.cache.offline.enable
+general.open_location.last_url
+places.history.enabled
+browser.formfill.enable
+signon.rememberSignons
+browser.download.manager.retention <!-- XXX: needs patch -->
+network.cookie.lifetimePolicy = 2
+
+https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src/current-patches/0002-Firefox6-Make-Permissions-Manager-memory-only.patch
+https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src/current-patches/0003-Firefox6-Make-Intermediate-Cert-Store-memory-only.patch
+https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src/current-patches/0008-Make-content-pref-service-memory-only-clearable.patch
</para>
</sect2>
<sect2 id="disk-isolation">
<title>Disk Isolation</title>
<para>
+<!-- XXX: sjmurdoch, Erinn -->
</para>
</sect2>
<sect2 id="update-safety">
<title>Update Safety</title>
- <para> </para>
+ <para>
+<!-- XXX: Design goal -->
+ </para>
</sect2>
<sect2 id="identifier-linkability">
<title>Cross-Domain Identifier Unlinkability</title>
- <para> </para>
+ <para>
+
+The Tor Browser MUST prevent a user's activity on one site from being
+linked to their activity on another site.
+
+<!-- XXX: Explain Why. UI simplification link -->
+
+ </para>
+ <orderedlist>
+ <listitem>Cookies
+ <para><command>Design Goal:</command>
+
+All cookies should be double-keyed to the top-level domain. There exists a
+<ulink
+url="https://bugzilla.mozilla.org/show_bug.cgi?id=565965">Mozilla
+bug</ulink> that contains a prototype patch, but it lacks UI, and does not
+apply to modern Firefoxes.
+
+ </para>
+ <para><command>Implementation Status:</command>
+
+As a stopgap to satisfy our design requirement of unlinkability, we currently
+entirely disable 3rd party cookies by setting
+<command>network.cookie.cookieBehavior</command> to 1. We would prefer that
+third party content continue to funtion , but we believe unlinkability.
+
+ </para>
+ </listitem>
+ <listitem>Cache
+ <para>
+Cache is isolated to the top-level url bar domain by using a technique
+pioneered by Colin Jackson et al, via their work on <ulink
+url="http://www.safecache.com/">SafeCache</ulink>. The technique re-uses the
+<ulink
+url="https://developer.mozilla.org/en/XPCOM_Interface_Reference/nsICachingChannel">nsICachingChannel.cacheKey</ulink>
+attribute that Firefox uses internally to prevent improper caching of HTTP POST data.
+ </para>
+ <para>
+However, to <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3666">increase the
+security of the isolation</ulink> and to <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3754">solve strange and
+unknown conflicts with OCSP</ulink>, we had to <ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src…">patch
+Firefox to provide a cacheDomain cache attribute</ulink>. We use the full
+url bar domain as input to this field.
+ </para>
+ <para>
+
+<!-- FIXME: This could use a few more specifics.. Maybe. The Chrome folks
+won't care, but the Mozilla folks might. -->
+Furthermore, we chose a different isolation scheme than the Stanford
+implementation. First, we decoupled the cache isolation from the third party
+cookie attribute. Second, we use several mechanisms to attempt to determine
+the actual location attribute of the top-level window (the url bar domain)
+used to load the page, as opposed to relying solely on the referer property.
+ </para>
+ <para>
+Therefore, <ulink
+url="http://crypto.stanford.edu/sameorigin/safecachetest.html">the original
+Stanford test
+cases</ulink> are expected to fail.
+ </para>
+ </listitem>
+ <listitem>HTTP Auth
+ <para>
+
+HTTP authentication tokens are removed for third parties
+on-modify-request observer to remove the heads. However, we also needed to
+<ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src…">patch
+Firefox to cause the headers to get added early enough</ulink> to allow the
+observer to modify it.
+
+ </para>
+ </listitem>
+ <listitem>DOM Storage
+ <para><command>Design Goal:</command>
+
+DOM storage for third party domains MUST BE isolated to the url bar domain,
+to prevent linkability between sites.
+
+ </para>
+ <para><command>Implementation Status:</command>
+
+Because it is isolated to the third party domain as opposed to the top-level
+url bar domain, we entirely disable DOM storage as a stopgap to ensure unlinkability.
+
+ </para>
+ </listitem>
+ <listitem>window.name
+ <para>
+
+<ulink
+url="https://developer.mozilla.org/En/DOM/Window.name">window.name</ulink> is
+a magical DOM property that for some reason is allowed to retain a persistent value
+for the lifespan of a browser tab. It is possible to utilize this property for
+<ulink url="http://www.thomasfrank.se/sessionvars.html">identifier
+storage</ulink>.
+
+ </para>
+ <para>
+
+In order to eliminate linkability but still allow for sites that utilize this
+property to function, we reset the window.name property of tabs in Torbutton every
+time we encounter a blank referer. This behavior allows window.name to persist
+for the duration of a link-driven navigation session, but as soon as the user
+enters a new URL or navigates between https/http schemes, the property is cleared.
+
+ </para>
+ </listitem>
+ <listitem>Exit node usage
+ <para><command>Design Goal:</command>
+
+Every distinct navigation session (as defined by a non-blank referer header)
+MUST exit through a fresh Tor circuit in Tor Browser to prevent exit node
+observers from linking concurrent browsing activity.
+
+ </para>
+ <para><command>Implementation Status:</command>
+
+The Tor feature that supports this ability only exists in the 0.2.3.x-alpha
+series. <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3455">Ticket
+#3455</ulink> is the Torbutton ticket.
+
+ </para>
+ </listitem>
+ </orderedlist>
</sect2>
<sect2 id="fingerprinting-linkability">
<title>Cross-Domain Fingerprinting Unlinkability</title>
- <para> </para>
+ <para>
+ </para>
</sect2>
<sect2 id="click-to-play">
<title>Click-to-play for plugins and invasive content</title>
- <para> </para>
+ <para>
+ </para>
+ </sect2>
+ <sect2 id="firefox-patches">
+ <title>Description of Firefox Patches</title>
+ <para>
+ </para>
</sect2>
</sect1>
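The window.name behavior described in the commit above reduces to a simple rule: the value persists across link-driven navigation but is dropped whenever the referer is blank (a typed URL or a scheme change). A minimal sketch of that rule, with an illustrative function name that is not Torbutton's actual code:

```python
# Sketch of the window.name clearing rule described above: keep the value
# across link-driven navigation, clear it whenever the referer is blank.
# Illustrative model only, not Torbutton's implementation.

def next_window_name(current_name, referer):
    """Return the window.name a tab should carry into the next page."""
    if not referer:        # blank referer => user-entered URL or scheme change
        return ""          # drop any identifier a site stashed here
    return current_name    # same link-driven session: let the value persist
```

This preserves legitimate uses of window.name within a browsing session while preventing it from acting as cross-session identifier storage.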
28 Apr '14
commit 594385e416fd3b6ee8fb90705cc52f329e31d3bf
Author: Mike Perry <mikeperry-git(a)fscked.org>
Date: Fri Sep 23 22:26:35 2011 -0700
Enumerate the firefox patches.
Also add some prose.
---
docs/design/design.xml | 139 ++++++++++++++++++++++++++++++++++++++++--------
1 file changed, 116 insertions(+), 23 deletions(-)
diff --git a/docs/design/design.xml b/docs/design/design.xml
index 586184c..619f76d 100644
--- a/docs/design/design.xml
+++ b/docs/design/design.xml
@@ -532,48 +532,98 @@ Flash cookies from leaking from a pre-existing Flash directory.
</sect2>
<sect2 id="disk-avoidance">
<title>Disk Avoidance</title>
- <para>
+ <para><command>Design Goal:</command>
+
+Tor Browser should optionally prevent all disk records of browser activity.
+The user should be able to optionally enable URL history and other history
+features if they so desire. Once we <ulink
+url="https://trac.torproject.org/projects/tor/ticket/3100">simplify the
+preferences interface</ulink>, we will likely just enable Private Browsing
+mode by default to handle this goal.
+ </para>
+ <para><command>Implementation Status:</command>
+
+For now, Tor Browser blocks write access to the disk through Torbutton
+using several Firefox preferences.
<!-- XXX: http auth on disk??? -->
-dom.storage.enabled
-browser.cache.memory.enable
-network.http.use-cache
-browser.cache.disk.enable
-browser.cache.offline.enable
-general.open_location.last_url
-places.history.enabled
-browser.formfill.enable
-signon.rememberSignons
-browser.download.manager.retention <!-- XXX: needs patch -->
-network.cookie.lifetimePolicy = 2
-
-https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src/current-patches/0002-Firefox6-Make-Permissions-Manager-memory-only.patch
-https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src/current-patches/0003-Firefox6-Make-Intermediate-Cert-Store-memory-only.patch
-https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src/current-patches/0008-Make-content-pref-service-memory-only-clearable.patch
+The set of prefs is:
+<command>dom.storage.enabled</command>,
+<command>browser.cache.memory.enable</command>,
+<command>network.http.use-cache</command>,
+<command>browser.cache.disk.enable</command>,
+<command>browser.cache.offline.enable</command>,
+<command>general.open_location.last_url</command>,
+<command>places.history.enabled</command>,
+<command>browser.formfill.enable</command>,
+<command>signon.rememberSignons</command>,
+<command>browser.download.manager.retention <!-- XXX: needs patch --></command>,
+and <command>network.cookie.lifetimePolicy</command>.
+ </para>
+ <para>
+In addition, three Firefox patches are needed to prevent disk writes, even if
+Private Browsing Mode is enabled. We need to
+
+<ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src…">prevent
+the permissions manager from recording HTTPS STS state</ulink>,
+<ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src…">prevent
+intermediate SSL certificates from being recorded</ulink>, and
+<ulink
+url="https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src…">prevent
+the content preferences service from recording site zoom</ulink>.
+
+For more details on these patches, <link linkend="firefox-patches">see the
+Firefox Patches section</link>.
</para>
</sect2>
<sect2 id="disk-isolation">
<title>Disk Isolation</title>
<para>
-<!-- XXX: sjmurdoch, Erinn -->
+
+Tor Browser Bundle MUST NOT cause any information to be written outside of the
+bundle directory. This is to ensure that the user is able to completely and
+safely remove the bundle without leaving other traces of Tor usage on their
+computer.
+
</para>
+ <para>XXX: sjmurdoch, Erinn: explain what magic we do to satisfy this,
+and/or what additional work or auditing needs to be done.
</sect2>
<sect2 id="update-safety">
<title>Update Safety</title>
<para>
-<!-- XXX: Design goal -->
+<!-- XXX: Design goal vs implementation status -->
</para>
</sect2>
<sect2 id="identifier-linkability">
<title>Cross-Domain Identifier Unlinkability</title>
+ <!-- XXX: Design goals vs implementation status -->
+ <para>
+
+The Tor Browser MUST prevent a user's activity on one site from being linked
+to their activity on another site. When this goal cannot yet be met with an
+existing web technology, that technology or functionality is disabled. Our
+design goal is to ultimately eliminate the need to disable arbitrary
+technologies, and instead simply alter them in ways that allows them to
+function in a backwards-compatible way while avoiding linkability.
+
+ </para>
<para>
-The Tor Browser MUST prevent a user's activity on one site from being
-linked to their activity on another site.
+The benefit of this approach comes not only in the form of reduced
+linkability, but also in terms of simplified privacy UI. If all stored browser
+state and permissions become associated with the top-level url-bar domain, the
+six or seven different pieces of privacy UI governing these identifiers and
+permissions can become just one piece of UI. For instance, a window that lists
+the top-level url bar domains for which browser state exists with the ability
+to clear and/or block them, possibly with a context-menu option to drill down
+into specific types of state.
-<!-- XXX: Explain Why. UI simplification link -->
+<!-- XXX: Include graphic as a 'Design Goal' -->
</para>
<orderedlist>
@@ -592,7 +642,8 @@ apply to modern Firefoxes.
As a stopgap to satisfy our design requirement of unlinkability, we currently
entirely disable 3rd party cookies by setting
<command>network.cookie.cookieBehavior</command> to 1. We would prefer that
-third party content continue to funtion , but we believe unlinkability.
+third party content continue to function, but we believe the requirement for
+unlinkability trumps that desire.
</para>
</listitem>
@@ -692,7 +743,8 @@ observers from linking concurrent browsing activity.
The Tor feature that supports this ability only exists in the 0.2.3.x-alpha
series. <ulink
url="https://trac.torproject.org/projects/tor/ticket/3455">Ticket
-#3455</ulink> is the Torbutton ticket.
+#3455</ulink> is the Torbutton ticket to make use of the new Tor
+functionality.
</para>
</listitem>
@@ -706,12 +758,53 @@ url="https://trac.torproject.org/projects/tor/ticket/3455">Ticket
<sect2 id="click-to-play">
<title>Click-to-play for plugins and invasive content</title>
<para>
+Some content types are too invasive and/or too opaque for us to properly
+eliminate their linkability properties. For these content types, we use
+NoScript to provide click-to-play placeholders that do not activate the
+content until the user clicks on it. This will eliminate the ability for an
+adversary to use such content types to link users in a dragnet fashion across
+arbitrary sites.
+ </para>
+ <para>
+<!-- XXX: Where do we discuss our plans w/ flash -->
+Currently, the content types isolated in this way include Flash, WebGL, and
+audio and video objects.
</para>
</sect2>
<sect2 id="firefox-patches">
<title>Description of Firefox Patches</title>
<para>
+https://gitweb.torproject.org/torbrowser.git/tree/refs/heads/maint-2.2:/src/current-patches
</para>
+ <orderedlist>
+ <listitem>Block Components.interfaces and Components.lookupMethod
+ <para> </para>
+ </listitem>
+ <listitem>Make Permissions Manager memory only
+ <para> </para>
+ </listitem>
+ <listitem>Make Intermediate Cert Store memory-only
+ <para> </para>
+ </listitem>
+ <listitem>Add HTTP auth headers before on-modify-request fires
+ <para> </para>
+ </listitem>
+ <listitem>Add a string-based cacheKey property for domain isolation
+ <para> </para>
+ </listitem>
+ <listitem>Randomize HTTP pipeline order and depth
+ <para>
+https://blog.torproject.org/blog/experimental-defense-website-traffic-fingerprinting
+ </para>
+ </listitem>
+ <listitem>Block all plugins except flash
+ <para> </para>
+ </listitem>
+ <listitem>Make content-prefs service memory only
+ <para>
+ </para>
+ </listitem>
+ </orderedlist>
</sect2>
</sect1>
commit 5f4a6e32a430140c53385501abd7c0d30a443054
Author: Mike Perry <mikeperry-git(a)fscked.org>
Date: Sat Sep 24 05:11:30 2011 -0700
Fix some build errors.
---
docs/design/design.xml | 8 ++++++--
1 file changed, 6 insertions(+), 2 deletions(-)
diff --git a/docs/design/design.xml b/docs/design/design.xml
index 0d3d385..0030fa5 100644
--- a/docs/design/design.xml
+++ b/docs/design/design.xml
@@ -35,9 +35,9 @@
<title>Introduction</title>
<para>
-<!-- XXX:
+<!-- XXX: intro + version
This document describes the goals, operation, and testing procedures of the
-Torbutton Firefox extension. It is current as of Torbutton 1.3.2.
+Torbutton Firefox extension. It is current as of Tor Browser 2.2.32-4.
-->
</para>
@@ -592,6 +592,7 @@ computer.
</para>
<para>XXX: sjmurdoch, Erinn: explain what magic we do to satisfy this,
and/or what additional work or auditing needs to be done.
+ </para>
</sect2>
<sect2 id="update-safety">
<title>Update Safety</title>
@@ -978,6 +979,7 @@ individually. They are provided here for reference and future regression
testing, and also in the hope that some brave soul will one day decide to
combine them into a comprehensive automated test suite.
+ <!-- XXX: ip-check.info? -->
<orderedlist>
<listitem><ulink url="http://decloak.net/">Decloak.net</ulink>
<para>
@@ -1072,6 +1074,7 @@ with a bit of work.
</orderedlist>
</para>
</sect2>
+<!--
<sect2>
<title>Multi-state testing</title>
<para>
@@ -1201,5 +1204,6 @@ javascript into the chrome (and thus gain complete control of the browser).
</para>
</sect3>
</sect2>
+-->
</sect1>
</article>
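The disk-avoidance prefs enumerated in the commit above can be rendered as a user.js fragment. The pref names come from the document itself; the values are guesses implied by the prose (only network.cookie.lifetimePolicy is given explicitly, as 2), and the string-valued and patch-dependent prefs (general.open_location.last_url, browser.download.manager.retention) are omitted from this sketch.

```python
# Sketch: render the document's disk-avoidance pref list as user.js lines.
# Values other than lifetimePolicy are assumptions, not verified settings.

DISK_AVOIDANCE_PREFS = {
    "dom.storage.enabled": False,
    "browser.cache.memory.enable": True,   # keep caching, but in memory only
    "network.http.use-cache": False,
    "browser.cache.disk.enable": False,
    "browser.cache.offline.enable": False,
    "places.history.enabled": False,
    "browser.formfill.enable": False,
    "signon.rememberSignons": False,
    "network.cookie.lifetimePolicy": 2,    # value stated in the document
}

def to_user_js(prefs):
    """Render a pref dict as Firefox user.js lines."""
    def lit(v):
        # booleans become true/false; ints are emitted as-is
        return str(v).lower() if isinstance(v, bool) else repr(v)
    return "\n".join(
        'user_pref("%s", %s);' % (k, lit(v)) for k, v in sorted(prefs.items())
    )
```

Emitting the prefs this way keeps the memory-only policy in one auditable place rather than scattered across the profile.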
28 Apr '14
commit 656e124e0d9fe519fab001385a5b4045974ef452
Author: Mike Perry <mikeperry-git(a)fscked.org>
Date: Sun Sep 25 19:15:39 2011 -0700
Add some philosophy material.
---
docs/design/design.xml | 136 +++++++++++++++++++++++++++---------------------
1 file changed, 77 insertions(+), 59 deletions(-)
diff --git a/docs/design/design.xml b/docs/design/design.xml
index 2b493d2..2fd3dc0 100644
--- a/docs/design/design.xml
+++ b/docs/design/design.xml
@@ -292,27 +292,6 @@ size of toolbars.
 <!-- XXX: Also, new browser features are added regularly. -->
</para>
-<!--
-FIXME: This is no longer true. Only certain addons are now discoverable, and
-only if they want to be:
-http://webdevwonders.com/detecting-firefox-add-ons/
-https://developer.mozilla.org/en/Updating_web_applications_for_Firefox_3#section_7
-
-<para>
-
-To add insult to injury, <ulink
-url="http://pseudo-flaw.net/content/tor/torbutton/">chrome URL disclosure
-attacks</ulink> mean that each and every extension on <ulink
-url="https://addons.mozilla.org">addons.mozilla.org</ulink> adds another bit
-to that 2<superscript>29</superscript>. With hundreds of popular extensions
-and thousands of extensions total, it is easy to see that this sort of
-information is an impressively powerful identifier if used properly by a
-competent and determined adversary such as an ad network. Again, a
-nearest-neighbor bit vector space approach here would also gracefully handle
-incremental changes to installed extensions.
-
-</para>
--->
</listitem>
<listitem><command>Remotely or locally exploit browser and/or
OS</command>
@@ -462,67 +441,102 @@ Tor Browser are also guided by some philosophical positions about technology.
</para>
<orderedlist>
- <listitem>Preserve existing user model
+ <listitem><command>Preserve existing user model</command>
<para>
The existing way that the user expects to use a browser must be preserved. If
the user has to maintain a different mental model of how the sites they are
using behave depending on tab, browser state, or anything else that would not
normally be what they experience in their default browser, the user will
-inevitably be confused. They will become confused, make mistakes, and reduce
-their privacy as a result. Worse, they may just stop using the browser,
-assuming it is broken.
+inevitably be confused. They will make mistakes and reduce their privacy as a
+result. Worse, they may just stop using the browser, assuming it is broken.
</para>
<para>
-User model breakage was one of the failures of Torbutton: Even if users
-managed to install everything properly, the toggle model was too hard for the
-average user to understand, especially in the face of accumulating tabs from
-multiple states crossed with the current tor-state of the browser.
+User model breakage was one of the <ulink
+url="https://blog.torproject.org/blog/toggle-or-not-toggle-end-torbutton">failures
+of Torbutton</ulink>: Even if users managed to install everything properly,
+the toggle model was too hard for the average user to understand, especially
+in the face of accumulating tabs from multiple states crossed with the current
+tor-state of the browser.
</para>
</listitem>
- <listitem>Minimal breakage to support requirements
+ <listitem><command>Minimal breakage to support requirements</command>
<para>
-Minimal
-
+In general, we try to find solutions to privacy issues that will not induce
+site breakage, though this is not always possible.
</para>
</listitem>
- <listitem>Plugins must be restricted
-<!--
- <listitem id="click-to-play"><command>Click-to-play for plugins and invasive content</command>
- <para>
+ <listitem><command>Plugins must be restricted</command>
+ <para>
+Even if plugins always properly used the browser proxy settings (which none of
+them do) and could not be induced to bypass them (which all of them can), the
+activities of closed-source plugins are very difficult to audit and control.
+They can obtain and transmit all manner of system information to websites, and
+often have their own identifier storage for tracking users.
+ </para>
+ <para>
+
+Therefore, if plugins are to be enabled in private browsing modes, they must
+be restricted from running automatically on every page (via click-to-play
+placeholders), and/or be sandboxed to restrict the types of system calls they
+can execute. If the user decides to craft an exemption, it MUST ONLY apply to
+the top level urlbar domain, and not to all sites, to reduce linkability.
-XXX: Generalize+clarify
+ </para>
+ </listitem>
+ <listitem><command>Minimize Global Privacy Options</command>
+ <para>
-Certain activities are inherently fingerprintable. For example, even if
-properly proxied, the activies of closed-source plugins are very difficult to
-control. Other browser features, such as WebGL, GeoLocation, and user-allowed
-exemptions to the identifier policy MUST NOT run until the user has clicked to
-explicitly allow that object or action. If the user decides to craft an
-exemption, it MUST ONLY apply to the top level urlbar domain, and not to all
-sites, to reduce linkability.
+<ulink url="https://trac.torproject.org/projects/tor/ticket/3100">Another
+failure of Torbutton</ulink> was (and still is) the options panel. Each option
+that detectably alters browser behavior can be used as a fingerprinting tool
+on the part of ad networks. Similarly, all extensions <ulink
+url="http://blog.chromium.org/2010/06/extensions-in-incognito.html">should be
+disabled in the mode</ulink> except as an opt-in basis. We should not load
+system addons or plugins.
+ </para>
+ <para>
+Instead of global browser privacy options, privacy decisions should be made
+<ulink
+url="https://wiki.mozilla.org/Privacy/Features/Site-based_data_management_UI">per
+top-level url-bar domain</ulink> to eliminate the possibility of linkability
+between domains. For example, when a plugin object (or a JavaScript access of
+window.plugins) is present in a page, the user should be given the choice of
+allowing that plugin object for that top-level url-bar domain only. The same
+goes for exemptions to third party cookie policy, geo-location, and any other
+privacy permissions.
+ </para>
+ <para>
+If the user has indicated they do not care about local history storage, these
+permissions can be written to disk. Otherwise, they should remain memory-only.
+ </para>
</para>
</listitem>
--->
-
<para>
</para>
</listitem>
- <listitem>No filters</listitem>
- <para>
- </para>
- <listitem>Stay Current</listitem>
- <para>
+ <listitem><command>No filters</command>
+ <para>
+
+<!-- XXX: Might want to briefly explain why -->
+We don't need no stinking filters.
+
+ </para>
+ </listitem>
+ <listitem><command>Stay Current</command>
+ <para>
We believe that if we do not stay current with the support of new web
technologies, we cannot hope to substantially influence or be involved in
their proper deployment or realization. However, we will likely disable
certain new features (where possible) pending analysis and audit.
- </para>
+ </para>
+ </listitem>
</orderedlist>
</sect2>
</sect1>
@@ -611,17 +625,20 @@ Flash cookies from leaking from a pre-existing Flash directory.
</sect2>
<sect2 id="disk-avoidance">
<title>Disk Avoidance</title>
- <para><command>Design Goal:</command>
-
+ <sect3>
+ <title>Design Goal:</title>
+ <para>
Tor Browser should optionally prevent all disk records of browser activity.
The user should be able to optionally enable URL history and other history
features if they so desire. Once we <ulink
url="https://trac.torproject.org/projects/tor/ticket/3100">simplify the
preferences interface</ulink>, we will likely just enable Private Browsing
mode by default to handle this goal.
- </para>
- <para><command>Implementation Status:</command>
-
+ </para>
+ </sect3>
+ <sect3>
+ <title>Implementation Status:</title>
+ <para>
For now, Tor Browser blocks write access to the disk through Torbutton
using several Firefox preferences.
@@ -641,6 +658,7 @@ The set of prefs is:
<command>browser.download.manager.retention <!-- XXX: needs patch --></command>,
and <command>network.cookie.lifetimePolicy</command>.
</para>
+ </sect3>
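The preferences named above (and their siblings) can be captured in a user.js fragment. This is a sketch only: the pref names beyond the two quoted in the text, and all values shown, are assumptions rather than verified Torbutton defaults.

```javascript
// Sketch of a disk-avoidance user.js fragment (illustrative values only).
user_pref("browser.cache.disk.enable", false);      // assumed sibling pref
user_pref("browser.cache.offline.enable", false);   // assumed sibling pref
user_pref("browser.download.manager.retention", 0); // forget download history
user_pref("network.cookie.lifetimePolicy", 2);      // session-only cookies
```

Each user_pref() line overrides one about:config entry at profile startup; the actual set Torbutton manages is the one enumerated in the paragraph above.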
<para>
In addition, three Firefox patches are needed to prevent disk writes, even if
Private Browsing Mode is enabled. We need to
@@ -867,6 +885,7 @@ In order to avoid long-term linkability, we provide a "New Identity" context
menu option in Torbutton.
</para>
+<!-- XXX: Note tag? -->
<para> <command>Implementation Status:</command> First, Torbutton disables
all open tabs and windows via nsIContentPolicy blocking, and then closes each
tab and window. The extra step for blocking tabs is done as a precaution to
@@ -896,7 +915,6 @@ adversary to use such content types to link users in a dragnet fashion across
arbitrary sites.
</para>
<para>
-<!-- XXX: Where do we discuss our plans w/ flash -->
Currently, the content types isolated in this way include Flash, WebGL, and
audio and video objects.
</para>
@@ -947,7 +965,7 @@ effectively recording on disk the fact that a website owned by a certain
organization was viewed.
</para>
- <!-- FIXME: Should these design goals be <note> tags? -->
+ <!-- FIXME: Should this be a <note> tag too? -->
<para><command>Design Goal:</command>
As an additional design goal, we would like to later alter this patch to allow this
commit 17f79ed7e4f51d29adae2181aaf7219fc4fb1b6e
Author: Mike Perry <mikeperry-git(a)fscked.org>
Date: Wed Sep 28 13:50:35 2011 -0700
Update patch urls.
---
docs/design/design.xml | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/docs/design/design.xml b/docs/design/design.xml
index bfffb2b..b0e261f 100644
--- a/docs/design/design.xml
+++ b/docs/design/design.xml
@@ -700,10 +700,10 @@ In addition, three Firefox patches are needed to prevent disk writes, even if
Private Browsing Mode is enabled. We need to
<ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src…">prevent
+url="https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src…">prevent
the permissions manager from recording HTTPS STS state</ulink>,
<ulink
-url="https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src…">prevent
+url="https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src…">prevent
intermediate SSL certificates from being recorded</ulink>, and
<ulink
url="https://gitweb.torproject.org/torbrowser.git/blob/refs/heads/maint-2.2:/src…">prevent
commit 5e822bfefbac8621b7fcedfd7c42fdf6af163bb1
Author: Mike Perry <mikeperry-git(a)fscked.org>
Date: Wed Sep 28 13:11:46 2011 -0700
Minor changes.
---
docs/design/design.xml | 22 ++++++++++++++++++----
1 file changed, 18 insertions(+), 4 deletions(-)
diff --git a/docs/design/design.xml b/docs/design/design.xml
index e3870e6..bfffb2b 100644
--- a/docs/design/design.xml
+++ b/docs/design/design.xml
@@ -728,13 +728,14 @@ computer.
and/or what additional work or auditing needs to be done.
</para>
</sect2>
+<!-- XXX: Write me...
<sect2 id="update-safety">
<title>Update Safety</title>
<para>
-<!-- XXX: Design goal vs implementation status -->
XXX: Write me..
</para>
</sect2>
+-->
<sect2 id="identifier-linkability">
<title>Cross-Domain Identifier Unlinkability</title>
<!-- XXX: Mention web-send?? -->
@@ -915,9 +916,9 @@ functionality.
<title>Cross-Domain Fingerprinting Unlinkability</title>
<para>
-In order to properly address the network adversary on a technical level, we
-need a metric to measure linkability of the various browser properties that
-extend beyond any stored origin-related state. <ulink
+In order to properly address the fingerprinting adversary on a technical
+level, we need a metric to measure linkability of the various browser
+properties that extend beyond any stored origin-related state. <ulink
url="https://panopticlick.eff.org/about.php">The Panopticlick Project</ulink>
by the EFF provides us with exactly this metric. The researchers conducted a
survey of volunteers who were asked to visit an experiment page that harvested
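The metric Panopticlick provides is Shannon surprisal: an attribute value observed in a fraction p of the browser population contributes log2(1/p) bits toward a fingerprint. A minimal sketch, with made-up population frequencies rather than Panopticlick's measured data:

```python
import math

def surprisal_bits(p: float) -> float:
    """Bits of identifying information carried by an attribute value
    observed with probability p in the browser population."""
    return -math.log2(p)

# Illustrative (invented) frequencies for one browser's attribute values.
observed = {
    "user_agent": 1 / 1500,   # a fairly rare UA string
    "timezone": 1 / 8,        # a common timezone
    "plugin_list": 1 / 500,   # a distinctive plugin inventory
}

# Assuming the attributes are independent, their bits simply add up;
# a browser carrying n bits is unique among roughly 2**n browsers.
total_bits = sum(surprisal_bits(p) for p in observed.values())
```

The design consequence is that reducing any single attribute's rarity (making its value common) directly lowers the fingerprint's total information content.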
@@ -947,12 +948,25 @@ fingerprinting issues, at least not at this stage.
</para>
<orderedlist>
<listitem>Plugins
+ <para>
+
+Plugins add to fingerprinting risk via two main vectors: their mere presence in
+window.navigator.plugins, as well as their internal functionality.
+
+ </para>
<para><command>Design Goal:</command>
+All plugins that have not been specifically audited or sandboxed must be
+disabled. Additionally, version information should be obfuscated until the
+plugin object is loaded... <!-- XXX: finish -->
</para>
<para><command>Implementation Status:</command>
</para>
</listitem>
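The "mere presence" vector above can be sketched briefly: the (name, version) inventory a page reads out of window.navigator.plugins is itself a stable identifier, before any plugin code even runs. All plugin names and versions below are hypothetical:

```python
import hashlib

# Hypothetical plugin inventories, as a page script might enumerate them
# from window.navigator.plugins.
browser_a = [("Shockwave Flash", "10.3.183.10"), ("QuickTime", "7.6.9")]
browser_b = [("Shockwave Flash", "10.3.183.10")]

def plugin_fingerprint(plugins):
    """Collapse a plugin inventory into a stable identifier.
    Sorting makes the result order-independent, so the same
    installed set always hashes to the same value."""
    canonical = ";".join(f"{name}/{ver}" for name, ver in sorted(plugins))
    return hashlib.sha1(canonical.encode()).hexdigest()
```

This is why the design goal disables unaudited plugins outright: hiding internal functionality is not enough when the inventory alone distinguishes browsers.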
<listitem>Fonts
+ <para>
+
+
+ </para>
<para><command>Design Goal:</command>
</para>
<para><command>Implementation Status:</command>