[tbb-dev] TBB vs mandatory extension signing

Georg Koppen gk at torproject.org
Wed Apr 12 09:40:00 UTC 2017


anonym:
> [This is a repost of my full original post. It contains a line consisting only of "--" which makes some MUA:s think "ah, the rest is a signature, let's hide it".]
> 
> Hi,
> 
> In Tails we've been wondering what to do about Firefox's mandatory extension signing [1] in FF52ESR since the opt-out preference that we have been using for FF45ESR will be removed from released versions. Today I found the Tor Browser's solution for tor-browser-52.0.2esr-7.0-2 in commit 584c17c1b3e07239b8dd195e8b91eefb2ff9b2f4; here's an extract:
> 
>      function isCorrectlySigned(aAddon) {
>        // Add-ons without an "isCorrectlySigned" property are correctly signed as
>        // they aren't the correct type for signing.
>     +  if (aAddon.id == "torbutton@torproject.org" ||
>     +      aAddon.id == "tor-launcher@torproject.org" ||
>     +      aAddon.id == "https-everywhere-eff@eff.org") {
>     +    return true;
>     +  }
>        return aAddon.isCorrectlySigned !== false;
>      }
> 
> So it's a list of exceptions. In Tails we install two additional extensions we'd like exceptions for:
> 
> * a localization-related extension that we generate dynamically *during build*, so signing will be impossible.
> * uBlock Origin, which we install from Debian, and whose signature is missing.
> 
> Can this list of exceptions be moved to an external file that we in Tails can modify instead? I think I have exhausted all options we have on our side. [2]

Hm. I have not thought about having a different method for setting up a
whitelist, so, maybe? Regarding your [2]: the binary files are just
optimizations. You can delete the respective file(s) and the
uncompressed one(s) will be used instead. That has a small performance
cost, but it is probably negligible in your case.

> ~~
> 
> I'd also like to ask whether you have analysed the security implications of introducing this exception list, since I couldn't find any such discussion on the relevant ticket [3]. So, have you? Personally, I reacted to the fact that it is a simple match against the extension's id, i.e. something we should consider attacker-controlled. I haven't looked at the code closely, but I'd expect attackers can deliver their malicious code in extensions that only need to have the same id as some extension with an exception to completely bypass the code-signing check. Think, for instance, of an "upgraded" Torbutton.
> 
> Is my above hunch true? If so, I think this approach is a bit dangerous: without the "mandatory" part, this code signing adds a false sense of security. IMHO this approach then either has to be improved (e.g. by also matching on a cryptographic hash of the XPI) or dropped and replaced with a different solution. I think it can be argued that just flipping the build switch that disables the extension code-signing checks is preferable.

We have looked into that but have not written anything down that can be
found easily. I guess a good place for that would be our design
document. In fact, I have just opened #21XXX to get that spec updated,
thanks.

Regarding your question, it is first important to see that we need
extensions that are not signed by Mozilla. We could have a discussion
about whether we should get Torbutton and TorLauncher signed by Mozilla
and just require XPI signing everywhere. But that would be a different
discussion than the one in this mail, and I am not convinced the
benefits outweigh the risks.

So, what do we do? It seems to me we have three possible ways to deal with this:

1) Disable the XPI-signing requirement globally
2) Ship a narrow whitelist
3) Deploy our own extension signing infrastructure

Option 3) seemed pretty invasive to us back then, but I am fine with
thinking harder about it if we come to the conclusion that no other
option strikes a proper benefit/risk balance.

Before discussing 2) and 1), let's differentiate between a local
attacker who can do things on your machine and an attacker on the
network (I am not sure whether you had both or just one of them in mind
when you used "attacker" above).

A local attacker is out of scope here, I think, because extension
signing would not help against one: they could simply modify your
Firefox binary (and you are hosed), or deploy temporary extensions,
which do not require signing (at least not in the ESR45 series).

That leaves the network attacker. Here is what I wrote to some Mozilla
folks who claimed back then that our code-signing exemption does not buy
us anything:

"""
First of all, I think it is not true that we get no benefit at all from
the way the exemption is written. The main benefit is that users can't
download and install any [random, G.K.] unsigned extension. They can't
even load one from the hard disk or any other local storage device [by
just dragging it over to Tor Browser, G.K.]. Note, this includes *the
extensions we exempted from the signing requirement as well* [emphasis
mine, G.K.]. A second benefit is that the patch is pretty small, which
minimizes possible disruption in a sensitive area.

As to the social engineering risks: yes, there have always been social
engineering risks around malicious extensions, and while the patch
reduces them drastically, it does not completely eliminate them. An
attacker needs to convince a user to overwrite one of our extensions
manually in order to win, which is a highly unusual request, and many
users probably don't even know where their extensions folder is. But,
yes, as said, there is still a risk left.
"""

Note that you are worse off if you just flip the preference. Oh, and
there are other extensions that have already poked a hole in the
extension signing requirement: temporary and system extensions (for the
former see e.g.
https://blog.mozilla.org/addons/2015/12/23/loading-temporary-add-ons/).
It seems to me that the social engineering risks that still apply to our
approach apply to those kinds of extensions as well.

Georg

> Cheers!
> 
> [1] https://labs.riseup.net/code/issues/11419
> [2] I tried something really ugly: I adjusted our build script to unpack the two affected omni.ja files and patch in our exceptions, but there are binary versions (e.g. XPIProvider.jsm) generated from the patched files, so this doesn't work (or is there an easy way to re-generate the .jsm files?).
> [3] https://trac.torproject.org/projects/tor/ticket/14970
> 

