[tor-commits] [tech-reports/master] Add feedback from Sathya.

karsten at torproject.org
Wed Oct 30 18:44:39 UTC 2013


commit d6cba8aa8383555d9e12e0e791c1466b055b8e45
Author: Karsten Loesing <karsten.loesing at gmx.net>
Date:   Tue Sep 24 12:00:45 2013 +0200

    Add feedback from Sathya.
---
 2013/torperf2/torperf2.tex |   14 ++++++++++----
 1 file changed, 10 insertions(+), 4 deletions(-)

diff --git a/2013/torperf2/torperf2.tex b/2013/torperf2/torperf2.tex
index 9a0b080..1fe89ec 100644
--- a/2013/torperf2/torperf2.tex
+++ b/2013/torperf2/torperf2.tex
@@ -148,6 +148,12 @@ it as soon as practical without additional user intervention.
 % speak Torperf's programming language.  If people want to add a simple
 % experiment, they can write a wrapper for using their tool in Torperf.
 
+This requirement is still subject to discussion, because it is unclear
+how it will work when users just \verb+apt-get install torperf+.
+Ideally, if someone writes a good experiment, they would submit their
+patches upstream to be merged; we would then update Torperf to include
+the new tests, and users would simply update Torperf via their package
+managers.
+
 \subsubsection{User-defined tor version or binary}
 
 A key part of measurements is the tor software version or binary used to
@@ -169,7 +175,7 @@ across tor releases. Note that the tor version should be contained in the
 results.
 
 It might be beneficial to provide a mechanism to download and verify the
-signature of new tor versions as they are released. The user could speficy
+signature of new tor versions as they are released. The user could specify
 if they plan to test stable, beta or alpha versions of tor with their
 Torperf instance.
 
@@ -439,7 +445,7 @@ set up and tear down hidden services.
 should take care of the entire experiment's lifecycle and reserve an
 exclusive tor instance for its lifetime. Finally, collect the results
 and post-process them (if applicable) before saving the data to the
-results database. If the experiment fails it should be possible to track
+results data store. If the experiment fails it should be possible to track
 down the reason for failure by looking at the experiment results.
 \item[request scheduler] Start new requests following a previously
 configured schedule.
@@ -459,8 +465,8 @@ median web page, accept POST requests, provide measurement results via
 RESTful API, present measurement results on web page.
 \item[Alexa top-X web pages updater] Periodically retrieve list of top-X
 web pages.
-\item[results database] Store request details, retrieve results,
-periodically delete old results if configured.
+\item[results data store] Store request details, retrieve results,
+periodically delete old results if configured, possibly using a database.
 \item[results accumulator] Periodically collect results from other Torperf
 instances, warn if they're out of date.
 \item[analysis scripts] Command-line tools to process measurement results
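
As an illustration of the download-and-verify mechanism discussed in the
second hunk, a minimal Python sketch could look like the following. This
is not part of the patch: the dist URL layout, the fetch_and_verify()
helper, and the example version string are assumptions, and the sketch
presumes GnuPG is installed with the tor release signing key already in
the local keyring.

    import subprocess
    import urllib.request

    DIST_URL = "https://dist.torproject.org/"

    def fetch_and_verify(version, workdir="."):
        """Download a tor source tarball plus its detached signature,
        then check the signature with gpg before reporting success."""
        tarball = "tor-%s.tar.gz" % version
        sig = tarball + ".asc"
        for name in (tarball, sig):
            urllib.request.urlretrieve(DIST_URL + name,
                                       workdir + "/" + name)
        # gpg exits non-zero if the signature does not match, so a
        # failed verification raises CalledProcessError here.
        subprocess.check_call(["gpg", "--verify",
                               workdir + "/" + sig,
                               workdir + "/" + tarball])
        return workdir + "/" + tarball

A real implementation would also have to discover newly released
versions for the channel the user configured (stable, beta, or alpha)
rather than take a hard-coded version string.

Similarly, the request scheduler component ("start new requests
following a previously configured schedule") could, under the same
caveats, start from something as small as the sketch below; the fixed
interval and the run_schedule() name are again assumptions:

    import sched
    import time

    def run_schedule(experiments, interval_seconds):
        """Start each configured experiment callback at a fixed
        interval; real schedules would be configurable per
        experiment."""
        scheduler = sched.scheduler(time.time, time.sleep)
        for i, experiment in enumerate(experiments):
            scheduler.enter(i * interval_seconds, 1, experiment)
        scheduler.run()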




