[tbb-dev] Reproducible Builds Summit

Santiago Torres-Arias santiago at archlinux.org
Tue Dec 18 20:50:33 UTC 2018


On Tue, Dec 18, 2018 at 07:32:00PM +0000, Georg Koppen wrote:
> > *SNIP*
> > They have implemented some prototype for Debian.
> > 
> > Here is how it works:
> >  - Some independent organisations run "rebuilders". These rebuilders
> >    fetch new Debian source packages and rebuild them. When a build
> >    finishes, they publish JSON files containing information about the
> >    source package, the result of their build, and their signature.
> >  - apt is modified (using an apt-transport) so that, before installing
> >    a new package, it connects to known rebuilders and fetches their
> >    build information. If the build has been reproduced by enough
> >    rebuilders, the package is installed.
> > 
> > I think it might be possible to implement something similar in Tor Browser,
> > maybe reusing some parts of their framework.
> 
> How does that deal with content signing? If you look, e.g., at the
> .dmg files we produce, there is currently no direct link between the
> build X rebuilders reproduced and the bundle that finally gets
> shipped. The same holds for Windows, although dealing with the
> Authenticode signature is easier there.

Hi, full disclosure: I'm part of the in-toto project linked above, and
I'm also one of the people pushing for Debian rebuilders. I'll try to
be brief about how it works:

- A layout file is shipped that contains the keys of trusted rebuilder
  entities. This layout file is signed by a Debian developer, although
  specific namespacing can be handled on a per-application basis.
- All the rebuilders create a signed JSON file that attests to the
  sources they used and the resulting hash of each artifact (e.g., the
  .dmg you pointed out). A threshold of these files can be used to
  verify that the sources were adequately reproduced by a threshold
  number of trusted entities. There is work to be done for the sake of
  hardening, adaptive thresholds, and so on, but this is strong enough
  scaffolding to ensure compliance and security on the r-b side.
- The apt-transport basically contains a map of the rebuilders and a
  signed layout file (the layout should be signed by people from the
  Debian org, as noted above). When fetching a package, the attestations
  from the rebuilders (we call these "link metadata" in in-toto lingo)
  are fetched as well, and verification is run. You can think of this
  transport as https plus threshold verification of reproducible builds
  (in-toto is way more powerful than this, but this is a good first
  step).
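To make the threshold step concrete, here is a minimal, hypothetical
sketch in Python. The layout structure, field names, and the HMAC
stand-in for real signatures are all simplifications of my own, not the
actual in-toto layout or link-metadata formats:

```python
import hashlib
import hmac

# Hypothetical "layout": trusted rebuilder keys plus the required
# threshold of agreeing attestations. Real in-toto layouts differ.
LAYOUT = {
    "threshold": 2,
    "rebuilder_keys": {
        "rebuilder-a": b"key-a-secret",
        "rebuilder-b": b"key-b-secret",
        "rebuilder-c": b"key-c-secret",
    },
}

def sign(key, payload):
    # Stand-in for a real signature scheme (e.g. Ed25519): HMAC-SHA256.
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def make_attestation(rebuilder_id, key, source_hash, artifact_hash):
    # What a rebuilder publishes after a rebuild: the source it used,
    # the artifact hash it produced, and a signature over both.
    payload = f"{source_hash}:{artifact_hash}".encode()
    return {
        "rebuilder": rebuilder_id,
        "source": source_hash,
        "artifact": artifact_hash,
        "sig": sign(key, payload),
    }

def verify_threshold(layout, attestations, source_hash, artifact_hash):
    # Count validly signed attestations from distinct trusted
    # rebuilders that agree on the exact source/artifact pair the
    # client is about to install.
    payload = f"{source_hash}:{artifact_hash}".encode()
    agreeing = set()
    for att in attestations:
        key = layout["rebuilder_keys"].get(att["rebuilder"])
        if key is None:
            continue  # unknown rebuilder: ignore
        if att["source"] != source_hash or att["artifact"] != artifact_hash:
            continue  # rebuilt something else, or got a different result
        if hmac.compare_digest(att["sig"], sign(key, payload)):
            agreeing.add(att["rebuilder"])
    return len(agreeing) >= layout["threshold"]
```

In this sketch the transport would call verify_threshold() before
handing the package to apt, and refuse installation if fewer than
"threshold" distinct trusted rebuilders reproduced the same artifact
hash from the same sources.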

> 
> I had the same doubts when reading, e.g., the CHAINIAC paper about
> proactive software-update transparency (with verified builds)[1], but
> did not have time to follow up with its authors to check whether they
> actually solved that part.

Don't worry about it: CHAINIAC is basically the brainchild of the
in-toto team. Granted, it has some more ideas sprinkled on top, but the
core idea stems from the same requirement of end-to-end software supply
chain verification.

Let me know if this answers your questions :)
-Santiago.


