tor-commits
July 2016
- 21 participants
- 1271 discussions

[translation/https_everywhere] Update translations for https_everywhere
by translation@torproject.org 22 Jul '16
commit abd5dc1cd0bcf18b0c9cea5c5dc6543d7afbeebf
Author: Translation commit bot <translation@torproject.org>
Date: Fri Jul 22 09:15:27 2016 +0000
Update translations for https_everywhere
---
lt/ssl-observatory.dtd | 6 +++---
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/lt/ssl-observatory.dtd b/lt/ssl-observatory.dtd
index 83abb7b..fe9748d 100644
--- a/lt/ssl-observatory.dtd
+++ b/lt/ssl-observatory.dtd
@@ -30,7 +30,7 @@ korporatyvinį tinklą:">
"Siųsti ir tikrinti sertifikatus, pasirašytus nestandartinių sertifikavimo centrų">
<!ENTITY ssl-observatory.prefs.alt_roots_tooltip
-"Įjungti šią parinktį yra saugu (ir tai yra gera mintis), nebent naudojate ribojamą korporatyvinį tinklą, kuris stebi jūsų naršymą su TLS įgaliotuoju serveriu ir privačiu sertifikavimo centru. Ši parinktis, jeigu bus įjungta tokiame tinkle, dėl jos kuriamų unikalių sertifikatų, gali palikti informaciją apie tai, kurios https:// svetainės, naudojant šį įgaliotąjį serverį, buvo aplankytos. Todėl pagal nutylėjimą tai išjungiame.">
+"Įjungti šį parametrą yra saugu (ir tai yra gera mintis), nebent naudojate ribojamą korporatyvinį tinklą, kuris stebi jūsų naršymą su TLS įgaliotuoju serveriu ir privačiu sertifikavimo centru. Šis parametras, jeigu bus įjungtas tokiame tinkle, dėl jo kuriamų unikalių sertifikatų, gali palikti informaciją apie tai, kurios https:// svetainės, naudojant šį įgaliotąjį serverį, buvo aplankytos. Todėl pagal nutylėjimą tai išjungiame.">
<!ENTITY ssl-observatory.prefs.anonymous "Tikrinti sertifikatus naudojant Tor tinklą anonimiškumui">
<!ENTITY ssl-observatory.prefs.anonymous_unavailable
@@ -45,10 +45,10 @@ korporatyvinį tinklą:">
"Gaus ir išsiųs jūsų tinklo "autonominį sistemos numerį". Tai padės mums aptikti atakas nukreiptas prieš HTTPS ir nustatyti, ar mes turime duomenų apie tinklus iš tokių vietų kaip Iranas ar Sirija, kur atakos yra gana dažnos.">
<!ENTITY ssl-observatory.prefs.show_cert_warning
-"Įspėti, kai jei Observatory aptiks Jūsų naršyklės neaptiktą sertifikatą, kurio galiojimas atšauktas.">
+"Įspėti, kai Observatorija aptiks jūsų naršyklės neaptiktą sertifikatą, kurio galiojimas panaikintas.">
<!ENTITY ssl-observatory.prefs.show_cert_warning_tooltip
-"Pateikti sertifikatai bus patikrinti Atšauktų Sertifikatų Sąraše. Deja, negalime garantuoti, kad bus nustatyti visi atšaukti sertifikatai, tačiau jei pamatysite įspėjimą, žinosite, kad tikriausiai kažkas yra negerai.">
+"Pateikti sertifikatai bus patikrinti Panaikintų Sertifikatų Sąraše. Deja, negalime garantuoti, kad bus nustatyti visi panaikinti sertifikatai, tačiau jei pamatysite įspėjimą, žinosite, kad tikriausiai kažkas yra negerai.">
<!ENTITY ssl-observatory.prefs.done "Atlikta">
commit 0c83055974f168d3646a8f1f13ec4ac6f2c03eba
Author: Sebastian Hahn <sebastian@torproject.org>
Date: Fri Jul 22 10:50:18 2016 +0200
Fix typo (brigde)
Thanks Andrew McGlashan for the report
---
docs/en/pluggable-transports.wml | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/en/pluggable-transports.wml b/docs/en/pluggable-transports.wml
index d729353..cb487c3 100644
--- a/docs/en/pluggable-transports.wml
+++ b/docs/en/pluggable-transports.wml
@@ -66,7 +66,7 @@
<hr>
- <h2 id="operator">Become a PT brigde operator:</h2>
+ <h2 id="operator">Become a PT bridge operator:</h2>
<h3>How to run PTs to help censored users</h3>

22 Jul '16
commit 323b2b9dc49ecc285249aaf48599d7e8f6e2ebd6
Author: Karsten Loesing <karsten.loesing@gmx.net>
Date: Wed Jul 20 20:12:39 2016 +0200
Resolve or suppress checkstyle warnings.
Implements part of #19614.
---
.../src/org/torproject/metrics/advbwdist/Main.java | 46 +++--
.../src/org/torproject/metrics/clients/Main.java | 96 ++++-----
.../org/torproject/metrics/collectdescs/Main.java | 28 +--
.../org/torproject/metrics/connbidirect/Main.java | 38 ++--
.../torproject/metrics/connbidirect/MainTest.java | 38 ++--
.../org/torproject/metrics/disagreement/Main.java | 116 ++++++-----
.../org/torproject/metrics/hidserv/Aggregator.java | 61 +++---
.../metrics/hidserv/ComputedNetworkFractions.java | 63 +++---
.../torproject/metrics/hidserv/DateTimeHelper.java | 58 +++---
.../org/torproject/metrics/hidserv/Document.java | 25 ++-
.../torproject/metrics/hidserv/DocumentStore.java | 38 ++--
.../metrics/hidserv/ExtrapolatedHidServStats.java | 62 +++---
.../torproject/metrics/hidserv/Extrapolator.java | 37 ++--
.../src/org/torproject/metrics/hidserv/Main.java | 28 +--
.../src/org/torproject/metrics/hidserv/Parser.java | 131 ++++++------
.../metrics/hidserv/ReportedHidServStats.java | 23 ++-
.../org/torproject/metrics/hidserv/Simulate.java | 85 ++++----
.../org/torproject/ernie/cron/Configuration.java | 37 +++-
.../src/org/torproject/ernie/cron/LockFile.java | 11 +-
.../ernie/cron/LoggingConfiguration.java | 32 +--
.../legacy/src/org/torproject/ernie/cron/Main.java | 35 ++--
.../cron/RelayDescriptorDatabaseImporter.java | 131 +++++++-----
.../cron/network/ConsensusStatsFileHandler.java | 65 +++---
.../ernie/cron/performance/TorperfProcessor.java | 41 ++--
shared/.gitignore | 4 +
shared/build.xml | 39 ++++
shared/resources/metrics_checks.xml | 221 +++++++++++++++++++++
.../org/torproject/metrics/web/AboutServlet.java | 4 +-
.../org/torproject/metrics/web/DataServlet.java | 16 +-
.../org/torproject/metrics/web/GraphServlet.java | 63 +++---
.../org/torproject/metrics/web/IndexServlet.java | 105 +++++-----
.../org/torproject/metrics/web/LinkServlet.java | 16 +-
website/src/org/torproject/metrics/web/Metric.java | 30 +++
.../org/torproject/metrics/web/MetricServlet.java | 2 +
.../torproject/metrics/web/MetricsProvider.java | 7 +-
.../torproject/metrics/web/RedirectServlet.java | 5 +-
.../org/torproject/metrics/web/TableServlet.java | 44 ++--
.../metrics/web/graphs/BubblesServlet.java | 4 +-
.../torproject/metrics/web/graphs/Countries.java | 35 ++--
.../metrics/web/graphs/GraphImageServlet.java | 27 +--
.../metrics/web/graphs/GraphParameterChecker.java | 62 +++---
.../org/torproject/metrics/web/graphs/RObject.java | 11 +-
.../metrics/web/graphs/RObjectGenerator.java | 170 +++++++++-------
.../metrics/web/graphs/TableParameterChecker.java | 31 +--
.../metrics/web/research/ResearchStatsServlet.java | 10 +-
45 files changed, 1414 insertions(+), 817 deletions(-)
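The per-file hunks that follow repeat a small set of mechanical rewrites: copyright headers, project imports moved ahead of java.* imports, one field per declaration with modifier order "static final", Javadoc on public members, and wrapped expressions that start the continuation line with the operator. As a reading aid, a short hypothetical fragment illustrating those conventions (a sketch for orientation only, not an excerpt from the metrics_checks.xml added by this commit):

/* Copyright 2016 The Tor Project
 * See LICENSE for licensing information */

package org.torproject.metrics.example;  /* hypothetical package */

import java.util.ArrayList;
import java.util.List;

public class StyleExample {

  /* One constant per declaration, with modifier order "static final",
   * instead of one comma-separated "final static" declaration. */
  private static final long ONE_HOUR_MILLIS = 60L * 60L * 1000L;

  private static final long ONE_DAY_MILLIS = 24L * ONE_HOUR_MILLIS;

  /** Public members get Javadoc comments rather than block comments. */
  public static List<Long> dayBoundaries(long fromMillis, long toMillis) {
    List<Long> boundaries = new ArrayList<Long>();
    /* Wrapped expressions put the operator at the start of the
     * continuation line, not at the end of the previous one. */
    for (long millis = ((fromMillis / ONE_DAY_MILLIS) + 1)
        * ONE_DAY_MILLIS; millis < toMillis; millis += ONE_DAY_MILLIS) {
      boundaries.add(millis);
    }
    return boundaries;
  }
}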
diff --git a/modules/advbwdist/src/org/torproject/metrics/advbwdist/Main.java b/modules/advbwdist/src/org/torproject/metrics/advbwdist/Main.java
index 9ac2bbb..8dc6bc5 100644
--- a/modules/advbwdist/src/org/torproject/metrics/advbwdist/Main.java
+++ b/modules/advbwdist/src/org/torproject/metrics/advbwdist/Main.java
@@ -1,4 +1,16 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
package org.torproject.metrics.advbwdist;
+
+import org.torproject.descriptor.Descriptor;
+import org.torproject.descriptor.DescriptorFile;
+import org.torproject.descriptor.DescriptorReader;
+import org.torproject.descriptor.DescriptorSourceFactory;
+import org.torproject.descriptor.NetworkStatusEntry;
+import org.torproject.descriptor.RelayNetworkStatusConsensus;
+import org.torproject.descriptor.ServerDescriptor;
+
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
@@ -12,15 +24,9 @@ import java.util.List;
import java.util.Map;
import java.util.TimeZone;
-import org.torproject.descriptor.Descriptor;
-import org.torproject.descriptor.DescriptorFile;
-import org.torproject.descriptor.DescriptorReader;
-import org.torproject.descriptor.DescriptorSourceFactory;
-import org.torproject.descriptor.NetworkStatusEntry;
-import org.torproject.descriptor.RelayNetworkStatusConsensus;
-import org.torproject.descriptor.ServerDescriptor;
-
public class Main {
+
+ /** Executes this data-processing module. */
public static void main(String[] args) throws IOException {
/* Parse server descriptors, not keeping a parse history, and memorize
@@ -81,23 +87,23 @@ public class Main {
(RelayNetworkStatusConsensus) descriptor;
String validAfter = dateTimeFormat.format(
consensus.getValidAfterMillis());
- List<Long> advertisedBandwidthsAllRelays = new ArrayList<Long>(),
- advertisedBandwidthsExitsOnly = new ArrayList<Long>();
- for (NetworkStatusEntry relay :
- consensus.getStatusEntries().values()) {
+ List<Long> advertisedBandwidthsAllRelays = new ArrayList<Long>();
+ List<Long> advertisedBandwidthsExitsOnly = new ArrayList<Long>();
+ for (NetworkStatusEntry relay
+ : consensus.getStatusEntries().values()) {
if (!relay.getFlags().contains("Running")) {
continue;
}
- String serverDescriptorDigest = relay.getDescriptor().
- toUpperCase();
+ String serverDescriptorDigest = relay.getDescriptor()
+ .toUpperCase();
if (!serverDescriptors.containsKey(serverDescriptorDigest)) {
continue;
}
long advertisedBandwidth = serverDescriptors.get(
serverDescriptorDigest);
advertisedBandwidthsAllRelays.add(advertisedBandwidth);
- if (relay.getFlags().contains("Exit") &&
- !relay.getFlags().contains("BadExit")) {
+ if (relay.getFlags().contains("Exit")
+ && !relay.getFlags().contains("BadExit")) {
advertisedBandwidthsExitsOnly.add(advertisedBandwidth);
}
}
@@ -133,16 +139,16 @@ public class Main {
for (int percentile : percentiles) {
bw.write(String.format("%s,,,%d,%d%n", validAfter,
percentile, advertisedBandwidthsAllRelays.get(
- ((advertisedBandwidthsAllRelays.size() - 1) *
- percentile) / 100)));
+ ((advertisedBandwidthsAllRelays.size() - 1)
+ * percentile) / 100)));
}
}
if (!advertisedBandwidthsExitsOnly.isEmpty()) {
for (int percentile : percentiles) {
bw.write(String.format("%s,TRUE,,%d,%d%n", validAfter,
percentile, advertisedBandwidthsExitsOnly.get(
- ((advertisedBandwidthsExitsOnly.size() - 1) *
- percentile) / 100)));
+ ((advertisedBandwidthsExitsOnly.size() - 1)
+ * percentile) / 100)));
}
}
}
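The percentile loop above is a nearest-rank lookup: for a sorted list of n values, percentile p maps to index ((n - 1) * p) / 100 with integer division. A self-contained sketch of that lookup (PercentileSketch is a hypothetical name; the sketch assumes the bandwidth lists are sorted ascending, which happens outside the hunks shown here):

import java.util.Arrays;
import java.util.List;

public class PercentileSketch {

  /** Nearest-rank percentile lookup matching the index arithmetic in the
   * hunk above; expects sortedValues to be sorted ascending. */
  static long percentileValue(List<Long> sortedValues, int percentile) {
    return sortedValues.get(
        ((sortedValues.size() - 1) * percentile) / 100);
  }

  public static void main(String[] args) {
    List<Long> bandwidths = Arrays.asList(100L, 200L, 300L, 400L, 500L);
    /* (5 - 1) * 50 / 100 = 2, so the 50th percentile is the third value. */
    System.out.println(percentileValue(bandwidths, 50));  /* 300 */
    /* (5 - 1) * 75 / 100 = 3. */
    System.out.println(percentileValue(bandwidths, 75));  /* 400 */
  }
}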
diff --git a/modules/clients/src/org/torproject/metrics/clients/Main.java b/modules/clients/src/org/torproject/metrics/clients/Main.java
index 63a3681..89faf56 100644
--- a/modules/clients/src/org/torproject/metrics/clients/Main.java
+++ b/modules/clients/src/org/torproject/metrics/clients/Main.java
@@ -1,7 +1,18 @@
-/* Copyright 2013 The Tor Project
+/* Copyright 2013--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.clients;
+import org.torproject.descriptor.BandwidthHistory;
+import org.torproject.descriptor.BridgeNetworkStatus;
+import org.torproject.descriptor.Descriptor;
+import org.torproject.descriptor.DescriptorFile;
+import org.torproject.descriptor.DescriptorReader;
+import org.torproject.descriptor.DescriptorSourceFactory;
+import org.torproject.descriptor.ExtraInfoDescriptor;
+import org.torproject.descriptor.NetworkStatusEntry;
+import org.torproject.descriptor.RelayNetworkStatusConsensus;
+
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
@@ -14,18 +25,9 @@ import java.util.SortedMap;
import java.util.TimeZone;
import java.util.TreeMap;
-import org.torproject.descriptor.BandwidthHistory;
-import org.torproject.descriptor.BridgeNetworkStatus;
-import org.torproject.descriptor.Descriptor;
-import org.torproject.descriptor.DescriptorFile;
-import org.torproject.descriptor.DescriptorReader;
-import org.torproject.descriptor.DescriptorSourceFactory;
-import org.torproject.descriptor.ExtraInfoDescriptor;
-import org.torproject.descriptor.NetworkStatusEntry;
-import org.torproject.descriptor.RelayNetworkStatusConsensus;
-
public class Main {
+ /** Executes this data-processing module. */
public static void main(String[] args) throws Exception {
parseArgs(args);
parseRelayDescriptors();
@@ -52,9 +54,11 @@ public class Main {
}
}
- private static final long ONE_HOUR_MILLIS = 60L * 60L * 1000L,
- ONE_DAY_MILLIS = 24L * ONE_HOUR_MILLIS,
- ONE_WEEK_MILLIS = 7L * ONE_DAY_MILLIS;
+ private static final long ONE_HOUR_MILLIS = 60L * 60L * 1000L;
+
+ private static final long ONE_DAY_MILLIS = 24L * ONE_HOUR_MILLIS;
+
+ private static final long ONE_WEEK_MILLIS = 7L * ONE_DAY_MILLIS;
private static void parseRelayDescriptors() throws Exception {
DescriptorReader descriptorReader =
@@ -87,8 +91,8 @@ public class Main {
private static void parseRelayExtraInfoDescriptor(
ExtraInfoDescriptor descriptor) throws IOException {
long publishedMillis = descriptor.getPublishedMillis();
- String fingerprint = descriptor.getFingerprint().
- toUpperCase();
+ String fingerprint = descriptor.getFingerprint()
+ .toUpperCase();
long dirreqStatsEndMillis = descriptor.getDirreqStatsEndMillis();
long dirreqStatsIntervalLengthMillis =
descriptor.getDirreqStatsIntervalLength() * 1000L;
@@ -105,9 +109,9 @@ public class Main {
long publishedMillis, long dirreqStatsEndMillis,
long dirreqStatsIntervalLengthMillis,
SortedMap<String, Integer> requests) throws IOException {
- if (requests == null ||
- publishedMillis - dirreqStatsEndMillis > ONE_WEEK_MILLIS ||
- dirreqStatsIntervalLengthMillis != ONE_DAY_MILLIS) {
+ if (requests == null
+ || publishedMillis - dirreqStatsEndMillis > ONE_WEEK_MILLIS
+ || dirreqStatsIntervalLengthMillis != ONE_DAY_MILLIS) {
/* Cut off all observations that are one week older than
* the descriptor publication time, or we'll have to update
* weeks of aggregate values every hour. */
@@ -144,8 +148,8 @@ public class Main {
private static void parseRelayDirreqWriteHistory(String fingerprint,
long publishedMillis, BandwidthHistory dirreqWriteHistory)
throws IOException {
- if (dirreqWriteHistory == null ||
- publishedMillis - dirreqWriteHistory.getHistoryEndMillis()
+ if (dirreqWriteHistory == null
+ || publishedMillis - dirreqWriteHistory.getHistoryEndMillis()
> ONE_WEEK_MILLIS) {
return;
/* Cut off all observations that are one week older than
@@ -154,8 +158,8 @@ public class Main {
}
long intervalLengthMillis =
dirreqWriteHistory.getIntervalLength() * 1000L;
- for (Map.Entry<Long, Long> e :
- dirreqWriteHistory.getBandwidthValues().entrySet()) {
+ for (Map.Entry<Long, Long> e
+ : dirreqWriteHistory.getBandwidthValues().entrySet()) {
long intervalEndMillis = e.getKey();
long intervalStartMillis =
intervalEndMillis - intervalLengthMillis;
@@ -163,8 +167,8 @@ public class Main {
long fromMillis = intervalStartMillis;
long toMillis = intervalEndMillis;
double writtenBytes = (double) e.getValue();
- if (intervalStartMillis / ONE_DAY_MILLIS <
- intervalEndMillis / ONE_DAY_MILLIS) {
+ if (intervalStartMillis / ONE_DAY_MILLIS
+ < intervalEndMillis / ONE_DAY_MILLIS) {
long utcBreakMillis = (intervalEndMillis
/ ONE_DAY_MILLIS) * ONE_DAY_MILLIS;
if (i == 0) {
@@ -188,10 +192,10 @@ public class Main {
RelayNetworkStatusConsensus consensus) throws IOException {
long fromMillis = consensus.getValidAfterMillis();
long toMillis = consensus.getFreshUntilMillis();
- for (NetworkStatusEntry statusEntry :
- consensus.getStatusEntries().values()) {
- String fingerprint = statusEntry.getFingerprint().
- toUpperCase();
+ for (NetworkStatusEntry statusEntry
+ : consensus.getStatusEntries().values()) {
+ String fingerprint = statusEntry.getFingerprint()
+ .toUpperCase();
if (statusEntry.getFlags().contains("Running")) {
writeOutputLine(fingerprint, "relay", "status", "", "", "",
fromMillis, toMillis, 0.0, fromMillis);
@@ -248,9 +252,9 @@ public class Main {
SortedMap<String, Integer> bridgeIps,
SortedMap<String, Integer> bridgeIpTransports,
SortedMap<String, Integer> bridgeIpVersions) throws IOException {
- if (responses == null ||
- publishedMillis - dirreqStatsEndMillis > ONE_WEEK_MILLIS ||
- dirreqStatsIntervalLengthMillis != ONE_DAY_MILLIS) {
+ if (responses == null
+ || publishedMillis - dirreqStatsEndMillis > ONE_WEEK_MILLIS
+ || dirreqStatsIntervalLengthMillis != ONE_DAY_MILLIS) {
/* Cut off all observations that are one week older than
* the descriptor publication time, or we'll have to update
* weeks of aggregate values every hour. */
@@ -300,9 +304,9 @@ public class Main {
if (e.getValue() < 4.0) {
continue;
}
- double r = ((double) e.getValue()) - 4.0;
- frequenciesCopy.put(e.getKey(), r);
- total += r;
+ double frequency = ((double) e.getValue()) - 4.0;
+ frequenciesCopy.put(e.getKey(), frequency);
+ total += frequency;
}
}
/* If we're not told any frequencies, or at least none of them are
@@ -338,8 +342,8 @@ public class Main {
private static void parseBridgeDirreqWriteHistory(String fingerprint,
long publishedMillis, BandwidthHistory dirreqWriteHistory)
throws IOException {
- if (dirreqWriteHistory == null ||
- publishedMillis - dirreqWriteHistory.getHistoryEndMillis()
+ if (dirreqWriteHistory == null
+ || publishedMillis - dirreqWriteHistory.getHistoryEndMillis()
> ONE_WEEK_MILLIS) {
/* Cut off all observations that are one week older than
* the descriptor publication time, or we'll have to update
@@ -348,8 +352,8 @@ public class Main {
}
long intervalLengthMillis =
dirreqWriteHistory.getIntervalLength() * 1000L;
- for (Map.Entry<Long, Long> e :
- dirreqWriteHistory.getBandwidthValues().entrySet()) {
+ for (Map.Entry<Long, Long> e
+ : dirreqWriteHistory.getBandwidthValues().entrySet()) {
long intervalEndMillis = e.getKey();
long intervalStartMillis =
intervalEndMillis - intervalLengthMillis;
@@ -357,8 +361,8 @@ public class Main {
long fromMillis = intervalStartMillis;
long toMillis = intervalEndMillis;
double writtenBytes = (double) e.getValue();
- if (intervalStartMillis / ONE_DAY_MILLIS <
- intervalEndMillis / ONE_DAY_MILLIS) {
+ if (intervalStartMillis / ONE_DAY_MILLIS
+ < intervalEndMillis / ONE_DAY_MILLIS) {
long utcBreakMillis = (intervalEndMillis
/ ONE_DAY_MILLIS) * ONE_DAY_MILLIS;
if (i == 0) {
@@ -384,10 +388,10 @@ public class Main {
long fromMillis = (publishedMillis / ONE_HOUR_MILLIS)
* ONE_HOUR_MILLIS;
long toMillis = fromMillis + ONE_HOUR_MILLIS;
- for (NetworkStatusEntry statusEntry :
- status.getStatusEntries().values()) {
- String fingerprint = statusEntry.getFingerprint().
- toUpperCase();
+ for (NetworkStatusEntry statusEntry
+ : status.getStatusEntries().values()) {
+ String fingerprint = statusEntry.getFingerprint()
+ .toUpperCase();
if (statusEntry.getFlags().contains("Running")) {
writeOutputLine(fingerprint, "bridge", "status", "", "", "",
fromMillis, toMillis, 0.0, publishedMillis);
@@ -397,6 +401,7 @@ public class Main {
private static Map<String, BufferedWriter> openOutputFiles =
new HashMap<String, BufferedWriter>();
+
private static void writeOutputLine(String fingerprint, String node,
String metric, String country, String transport, String version,
long fromMillis, long toMillis, double val, long publishedMillis)
@@ -413,6 +418,7 @@ public class Main {
}
private static SimpleDateFormat dateTimeFormat = null;
+
private static String formatDateTimeMillis(long millis) {
if (dateTimeFormat == null) {
dateTimeFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
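Several hunks above (parseRelayDirreqWriteHistory and its bridge counterpart) check whether a bandwidth-history interval crosses a UTC day boundary before attributing written bytes to daily buckets. The break-point computation is visible in the hunks; the apportioning around it is not, so the proportional split below is an assumption for illustration only (hypothetical class name, made-up timestamps):

public class DayBreakSketch {

  private static final long ONE_HOUR_MILLIS = 60L * 60L * 1000L;

  private static final long ONE_DAY_MILLIS = 24L * ONE_HOUR_MILLIS;

  public static void main(String[] args) {
    /* A two-hour interval from 23:00 on day 0 to 01:00 on day 1 (epoch
     * timestamps), with 7200 bytes written over the whole interval. */
    long fromMillis = 23L * ONE_HOUR_MILLIS;
    long toMillis = 25L * ONE_HOUR_MILLIS;
    double writtenBytes = 7200.0;

    if (fromMillis / ONE_DAY_MILLIS < toMillis / ONE_DAY_MILLIS) {
      /* Same break-point computation as in the hunks above: the start of
       * the UTC day that contains the interval end. */
      long utcBreakMillis = (toMillis / ONE_DAY_MILLIS) * ONE_DAY_MILLIS;
      /* Assumed apportioning: split the bytes proportionally to the time
       * spent on either side of the day break. */
      double firstPart = writtenBytes
          * (utcBreakMillis - fromMillis) / (toMillis - fromMillis);
      double secondPart = writtenBytes - firstPart;
      System.out.printf("day 0: %.0f bytes, day 1: %.0f bytes%n",
          firstPart, secondPart);  /* 3600 and 3600 */
    }
  }
}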
diff --git a/modules/collectdescs/src/org/torproject/metrics/collectdescs/Main.java b/modules/collectdescs/src/org/torproject/metrics/collectdescs/Main.java
index dc6ef82..88955b4 100644
--- a/modules/collectdescs/src/org/torproject/metrics/collectdescs/Main.java
+++ b/modules/collectdescs/src/org/torproject/metrics/collectdescs/Main.java
@@ -1,28 +1,32 @@
-/* Copyright 2015 The Tor Project
+/* Copyright 2015--2016 The Tor Project
* See LICENSE for licensing information */
-package org.torproject.metrics.collectdescs;
-import java.io.File;
+package org.torproject.metrics.collectdescs;
import org.torproject.descriptor.DescriptorCollector;
import org.torproject.descriptor.DescriptorSourceFactory;
+import java.io.File;
+
public class Main {
+
+ /** Executes this data-processing module. */
public static void main(String[] args) {
/* Fetch recent descriptors from CollecTor. */
DescriptorCollector collector =
DescriptorSourceFactory.createDescriptorCollector();
collector.collectDescriptors(
"https://collector.torproject.org", new String[] {
- "/recent/bridge-descriptors/extra-infos/",
- "/recent/bridge-descriptors/server-descriptors/",
- "/recent/bridge-descriptors/statuses/",
- "/recent/exit-lists/",
- "/recent/relay-descriptors/consensuses/",
- "/recent/relay-descriptors/extra-infos/",
- "/recent/relay-descriptors/server-descriptors/",
- "/recent/relay-descriptors/votes/",
- "/recent/torperf/" }, 0L, new File("../../shared/in"), true);
+ "/recent/bridge-descriptors/extra-infos/",
+ "/recent/bridge-descriptors/server-descriptors/",
+ "/recent/bridge-descriptors/statuses/",
+ "/recent/exit-lists/",
+ "/recent/relay-descriptors/consensuses/",
+ "/recent/relay-descriptors/extra-infos/",
+ "/recent/relay-descriptors/server-descriptors/",
+ "/recent/relay-descriptors/votes/",
+ "/recent/torperf/"
+ }, 0L, new File("../../shared/in"), true);
}
}
diff --git a/modules/connbidirect/src/main/java/org/torproject/metrics/connbidirect/Main.java b/modules/connbidirect/src/main/java/org/torproject/metrics/connbidirect/Main.java
index 190b3df..579ef6b 100644
--- a/modules/connbidirect/src/main/java/org/torproject/metrics/connbidirect/Main.java
+++ b/modules/connbidirect/src/main/java/org/torproject/metrics/connbidirect/Main.java
@@ -1,7 +1,14 @@
-/* Copyright 2015 The Tor Project
+/* Copyright 2015--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.connbidirect;
+import org.torproject.descriptor.Descriptor;
+import org.torproject.descriptor.DescriptorFile;
+import org.torproject.descriptor.DescriptorReader;
+import org.torproject.descriptor.DescriptorSourceFactory;
+import org.torproject.descriptor.ExtraInfoDescriptor;
+
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
@@ -23,12 +30,6 @@ import java.util.TimeZone;
import java.util.TreeMap;
import java.util.TreeSet;
-import org.torproject.descriptor.Descriptor;
-import org.torproject.descriptor.DescriptorFile;
-import org.torproject.descriptor.DescriptorReader;
-import org.torproject.descriptor.DescriptorSourceFactory;
-import org.torproject.descriptor.ExtraInfoDescriptor;
-
public class Main {
static class RawStat implements Comparable<RawStat> {
@@ -117,17 +118,17 @@ public class Main {
return false;
}
RawStat other = (RawStat) otherObject;
- return this.dateDays == other.dateDays &&
- this.fingerprint.equals(other.fingerprint);
+ return this.dateDays == other.dateDays
+ && this.fingerprint.equals(other.fingerprint);
}
}
static final long ONE_DAY_IN_MILLIS = 86400000L;
+ /** Executes this data-processing module. */
public static void main(String[] args) throws IOException {
File parseHistoryFile = new File("stats/parse-history");
File aggregateStatsFile = new File("stats/connbidirect2.csv");
- File rawStatsFile = new File("stats/raw-stats");
File[] descriptorsDirectories = new File[] {
new File("../../shared/in/archive/relay-descriptors/extra-infos"),
new File("../../shared/in/recent/relay-descriptors/extra-infos")};
@@ -156,6 +157,7 @@ public class Main {
+ "leave out those descriptors in future runs.");
return;
}
+ File rawStatsFile = new File("stats/raw-stats");
SortedSet<RawStat> rawStats = parseRawStats(
readStringFromFile(rawStatsFile));
if (rawStats == null) {
@@ -388,10 +390,10 @@ public class Main {
if (extraInfo.getConnBiDirectStatsEndMillis() <= 0L) {
return null;
}
- int below = extraInfo.getConnBiDirectBelow(),
- read = extraInfo.getConnBiDirectRead(),
- write = extraInfo.getConnBiDirectWrite(),
- both = extraInfo.getConnBiDirectBoth();
+ int below = extraInfo.getConnBiDirectBelow();
+ int read = extraInfo.getConnBiDirectRead();
+ int write = extraInfo.getConnBiDirectWrite();
+ int both = extraInfo.getConnBiDirectBoth();
if (below < 0 || read < 0 || write < 0 || both < 0) {
System.err.println("Could not parse incomplete conn-bi-direct "
+ "statistics. Skipping descriptor.");
@@ -420,8 +422,8 @@ public class Main {
static SortedSet<Long> mergeRawStats(
SortedSet<RawStat> rawStats, SortedSet<RawStat> newRawStats) {
rawStats.addAll(newRawStats);
- SortedSet<Long> discardedRawStats = new TreeSet<Long>(),
- availableRawStats = new TreeSet<Long>();
+ SortedSet<Long> discardedRawStats = new TreeSet<Long>();
+ SortedSet<Long> availableRawStats = new TreeSet<Long>();
for (RawStat rawStat : rawStats) {
if (rawStat.fingerprint != null) {
availableRawStats.add(rawStat.dateDays);
@@ -461,8 +463,8 @@ public class Main {
}
final String[] quantiles = new String[] { "0.25", "0.5", "0.75" };
final int[] centiles = new int[] { 25, 50, 75 };
- for (Map.Entry<String, List<Short>> e :
- fractionsByDateAndDirection.entrySet()) {
+ for (Map.Entry<String, List<Short>> e
+ : fractionsByDateAndDirection.entrySet()) {
String dateAndDirection = e.getKey();
List<Short> fractions = e.getValue();
Collections.sort(fractions);
diff --git a/modules/connbidirect/src/test/java/org/torproject/metrics/connbidirect/MainTest.java b/modules/connbidirect/src/test/java/org/torproject/metrics/connbidirect/MainTest.java
index a490dd2..106443e 100644
--- a/modules/connbidirect/src/test/java/org/torproject/metrics/connbidirect/MainTest.java
+++ b/modules/connbidirect/src/test/java/org/torproject/metrics/connbidirect/MainTest.java
@@ -1,9 +1,14 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
package org.torproject.metrics.connbidirect;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNull;
import static org.junit.Assert.assertSame;
+import org.junit.Test;
+
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;
import java.util.SortedMap;
@@ -11,8 +16,6 @@ import java.util.SortedSet;
import java.util.TreeMap;
import java.util.TreeSet;
-import org.junit.Test;
-
public class MainTest {
private void assertParseHistoryCanBeSerializedAndDeserialized(
@@ -30,22 +33,26 @@ public class MainTest {
new TreeMap<String, Long>());
}
- private final String PATH_A = "a", PATH_B = "/b";
+ private final String pathA = "a";
- private final long LASTMOD_A = 1L, LASTMOD_B = 2L;
+ private final String pathB = "/b";
+
+ private final long lastmodA = 1L;
+
+ private final long lastmodB = 2L;
@Test
public void testParseHistoryOneEntry() {
SortedMap<String, Long> parseHistory = new TreeMap<String, Long>();
- parseHistory.put(PATH_A, LASTMOD_A);
+ parseHistory.put(pathA, lastmodA);
assertParseHistoryCanBeSerializedAndDeserialized(parseHistory);
}
@Test
public void testParseHistoryTwoEntries() {
SortedMap<String, Long> parseHistory = new TreeMap<String, Long>();
- parseHistory.put(PATH_A, LASTMOD_A);
- parseHistory.put(PATH_B, LASTMOD_B);
+ parseHistory.put(pathA, lastmodA);
+ parseHistory.put(pathB, lastmodB);
assertParseHistoryCanBeSerializedAndDeserialized(parseHistory);
}
@@ -61,13 +68,13 @@ public class MainTest {
@Test
public void testParseHistoryNoLastModifiedTime() {
- assertParseHistoryCannotBeDeserialized(String.format("%s%n", PATH_A));
+ assertParseHistoryCannotBeDeserialized(String.format("%s%n", pathA));
}
@Test
public void testParseHistoryLastModifiedTimeNoNumber() {
assertParseHistoryCannotBeDeserialized(String.format("%s%s%n",
- PATH_A, PATH_B));
+ pathA, pathB));
}
private void assertAggregateStatsCanBeSerializedAndDeserialized(
@@ -119,12 +126,15 @@ public class MainTest {
new TreeSet<Main.RawStat>());
}
- private static final long DATE_A = 16665, /* 2015-08-18 */
- DATE_B = 16680; /* 2015-09-02 */
+ private static final long DATE_A = 16665; /* 2015-08-18 */
+
+ private static final long DATE_B = 16680; /* 2015-09-02 */
+
+ private static final String FPR_A =
+ "1234567890123456789012345678901234567890";
- private static final String
- FPR_A = "1234567890123456789012345678901234567890",
- FPR_B = "2345678901234567890123456789012345678901";
+ private static final String FPR_B =
+ "2345678901234567890123456789012345678901";
@Test
public void testRawStatsOneEntry() {
diff --git a/modules/disagreement/src/main/java/org/torproject/metrics/disagreement/Main.java b/modules/disagreement/src/main/java/org/torproject/metrics/disagreement/Main.java
index b612c43..05159a3 100644
--- a/modules/disagreement/src/main/java/org/torproject/metrics/disagreement/Main.java
+++ b/modules/disagreement/src/main/java/org/torproject/metrics/disagreement/Main.java
@@ -1,5 +1,15 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
package org.torproject.metrics.disagreement;
+import org.torproject.descriptor.Descriptor;
+import org.torproject.descriptor.DescriptorFile;
+import org.torproject.descriptor.DescriptorReader;
+import org.torproject.descriptor.DescriptorSourceFactory;
+import org.torproject.descriptor.NetworkStatusEntry;
+import org.torproject.descriptor.RelayNetworkStatusVote;
+
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
@@ -22,13 +32,6 @@ import java.util.SortedMap;
import java.util.TimeZone;
import java.util.TreeMap;
-import org.torproject.descriptor.Descriptor;
-import org.torproject.descriptor.DescriptorFile;
-import org.torproject.descriptor.DescriptorReader;
-import org.torproject.descriptor.DescriptorSourceFactory;
-import org.torproject.descriptor.NetworkStatusEntry;
-import org.torproject.descriptor.RelayNetworkStatusVote;
-
/* Read all relay network status votes from the in/ subdirectory with a
* valid-after time of 12:00:00, extract attributes like relay flags or
* bandwidth measurements that the directory authorities assigned to
@@ -43,10 +46,13 @@ import org.torproject.descriptor.RelayNetworkStatusVote;
* in each execution. */
public class Main {
+ /** Creates a new instance of this class that executes this
+ * data-processing module. */
public static void main(String[] args) throws Exception {
new Main().run();
}
+ /** Executes this data-processing module. */
public void run() throws Exception {
readResults();
readDescriptors();
@@ -73,6 +79,7 @@ public class Main {
new HashMap<String, Integer>();
private List<String> attributeStrings = new ArrayList<String>();
+ /** Initializes this class. */
public Main() {
/* Initialize maps from strings to integers and back by adding the
@@ -168,18 +175,33 @@ public class Main {
* valid-after hour in December 2015 required to keep 420,000 long
* values in memory, which is roughly 3.2 MiB plus list overhead. */
protected List<Long> assignments = new ArrayList<Long>();
- protected final static int VALIDAFTER_LEN = 22, ATTRIBUTE_LEN = 8,
- FINGERPRINT_LEN = 24, AUTHORITY_LEN = 6;
- protected final static int
- VALIDAFTER_SHIFT = ATTRIBUTE_LEN + FINGERPRINT_LEN + AUTHORITY_LEN,
- ATTRIBUTE_SHIFT = FINGERPRINT_LEN + AUTHORITY_LEN,
- FINGERPRINT_SHIFT = AUTHORITY_LEN,
- AUTHORITY_SHIFT = 0;
+
+ protected static final int VALIDAFTER_LEN = 22;
+
+ protected static final int ATTRIBUTE_LEN = 8;
+
+ protected static final int FINGERPRINT_LEN = 24;
+
+ protected static final int AUTHORITY_LEN = 6;
+
+ protected static final int VALIDAFTER_SHIFT = ATTRIBUTE_LEN
+ + FINGERPRINT_LEN + AUTHORITY_LEN;
+
+ protected static final int ATTRIBUTE_SHIFT = FINGERPRINT_LEN
+ + AUTHORITY_LEN;
+
+ protected static final int FINGERPRINT_SHIFT = AUTHORITY_LEN;
+
+ protected static final int AUTHORITY_SHIFT = 0;
/* Define some constants for timestamp math. */
- protected final static long HALF_HOUR = 30L * 60L * 1000L,
- ONE_HOUR = 2L * HALF_HOUR, HALF_DAY = 12L * ONE_HOUR,
- ONE_DAY = 2L * HALF_DAY;
+ protected static final long HALF_HOUR = 30L * 60L * 1000L;
+
+ protected static final long ONE_HOUR = 2L * HALF_HOUR;
+
+ protected static final long HALF_DAY = 12L * ONE_HOUR;
+
+ protected static final long ONE_DAY = 2L * HALF_DAY;
/* Convert the given valid-after time in milliseconds, attribute index,
* fingerprint index, and authority index to a long integer following
@@ -188,15 +210,15 @@ public class Main {
protected static long convertToLongValue(long validAfterMillis,
int attributeIndex, int fingerprintIndex, int authorityIndex) {
long validAfterHalfHours = validAfterMillis / HALF_HOUR;
- if (validAfterHalfHours < 0L ||
- validAfterHalfHours >= (1L << VALIDAFTER_LEN)) {
+ if (validAfterHalfHours < 0L
+ || validAfterHalfHours >= (1L << VALIDAFTER_LEN)) {
return -1;
}
if (attributeIndex < 0 || attributeIndex >= (1 << ATTRIBUTE_LEN)) {
return -1;
}
- if (fingerprintIndex < 0 ||
- fingerprintIndex >= (1 << FINGERPRINT_LEN)) {
+ if (fingerprintIndex < 0
+ || fingerprintIndex >= (1 << FINGERPRINT_LEN)) {
return -1;
}
if (authorityIndex < 0 || authorityIndex >= (1 << AUTHORITY_LEN)) {
@@ -225,8 +247,8 @@ public class Main {
/* Extract the fingerprint index from the given long integer value. */
protected static int extractFingerprintIndexFromLongValue(
long longValue) {
- return (int) ((longValue >> FINGERPRINT_SHIFT) %
- (1 << FINGERPRINT_LEN));
+ return (int) ((longValue >> FINGERPRINT_SHIFT)
+ % (1 << FINGERPRINT_LEN));
}
/* Extract the authority index from the given long integer value. */
@@ -267,8 +289,8 @@ public class Main {
LineNumberReader lnr = new LineNumberReader(new BufferedReader(
new FileReader(this.resultsFile)));
String line;
- if ((line = lnr.readLine()) == null ||
- !line.equals(this.resultsHeaderLine)) {
+ if ((line = lnr.readLine()) == null
+ || !line.equals(this.resultsHeaderLine)) {
lnr.close();
throw new IOException("Unexpected line " + lnr.getLineNumber()
+ " in " + this.resultsFile + ".");
@@ -338,8 +360,9 @@ public class Main {
}
}
- private static final String LISTED_ATTRIBUTE = "Listed",
- MEASURED_ATTRIBUTE = "Measured";
+ private static final String LISTED_ATTRIBUTE = "Listed";
+
+ private static final String MEASURED_ATTRIBUTE = "Measured";
/* Process a single relay network status vote. */
private void processVote(RelayNetworkStatusVote vote) throws Exception {
@@ -363,8 +386,8 @@ public class Main {
/* Go through all status entries in this vote and remember which
* attributes this authority assigns to which relays. */
- for (NetworkStatusEntry entry :
- vote.getStatusEntries().values()) {
+ for (NetworkStatusEntry entry
+ : vote.getStatusEntries().values()) {
/* Use the relay's fingerprint to distinguish relays. */
String fingerprint = entry.getFingerprint();
@@ -475,8 +498,10 @@ public class Main {
/* Remember long value and some of its components from the last
* iteration. */
- long lastLongValue = -1L, lastValidAfterMillis = -1L;
- int lastAttributeIndex = -1, lastFingerprintIndex = -1;
+ long lastLongValue = -1L;
+ long lastValidAfterMillis = -1L;
+ int lastAttributeIndex = -1;
+ int lastFingerprintIndex = -1;
/* Keep a list of all output lines for a single valid-after time. */
List<String> outputLines = new ArrayList<String>();
@@ -484,8 +509,9 @@ public class Main {
/* Keep counters for the number of fingerprints seen at a valid-after
* time, the number of authorities voting on an attribute, and the
* number of votes that a relay received for a given attribute. */
- int knownFingerprintsByAllAuthorities = 0,
- authoritiesVotingOnAttribute = 0, votesForAttribute = 0;
+ int knownFingerprintsByAllAuthorities = 0;
+ int authoritiesVotingOnAttribute = 0;
+ int votesForAttribute = 0;
/* Keep counters of relays receiving a given number of votes on an
* attribute. The number at element i is the number of relays
@@ -505,8 +531,8 @@ public class Main {
* results for the last fingerprint before moving on. */
int fingerprintIndex = extractFingerprintIndexFromLongValue(
longValue);
- if (lastAttributeIndex > 0 && lastFingerprintIndex > 0 &&
- lastFingerprintIndex != fingerprintIndex) {
+ if (lastAttributeIndex > 0 && lastFingerprintIndex > 0
+ && lastFingerprintIndex != fingerprintIndex) {
/* This relay received at least one vote for the given attribute,
* or otherwise it wouldn't be contained in the list of long
@@ -522,8 +548,8 @@ public class Main {
* attribute before moving on. And if we just moved to the first
* attribute, initialize counters. */
int attributeIndex = extractAttributeIndexFromLongValue(longValue);
- if (lastAttributeIndex >= 0 &&
- lastAttributeIndex != attributeIndex) {
+ if (lastAttributeIndex >= 0
+ && lastAttributeIndex != attributeIndex) {
/* If we just finished a non-zero attribute, wrap it up.
* Determine the number of votes required for getting into the
@@ -555,8 +581,8 @@ public class Main {
* on. */
long validAfterMillis = extractValidAfterMillisFromLongValue(
longValue);
- if (lastValidAfterMillis >= 0L &&
- lastValidAfterMillis < validAfterMillis) {
+ if (lastValidAfterMillis >= 0L
+ && lastValidAfterMillis < validAfterMillis) {
/* Check if results already contain lines for this valid-after
* time. If so, only replace them with new results lines if there
@@ -565,9 +591,9 @@ public class Main {
* include as many votes as possible in the aggregation. */
String validAfterString = dateTimeFormat.format(
lastValidAfterMillis);
- if (!this.results.containsKey(validAfterString) ||
- this.results.get(validAfterString).size() <
- outputLines.size()) {
+ if (!this.results.containsKey(validAfterString)
+ || this.results.get(validAfterString).size()
+ < outputLines.size()) {
/* Sort results lines, and then put them in. */
Collections.sort(outputLines);
@@ -591,19 +617,17 @@ public class Main {
* valid-after time. */
if (attributeIndex == 0) {
knownFingerprintsByAllAuthorities++;
- }
/* Otherwise, if this value doesn't contain a fingerprint index, it
* was put in for counting authorities voting on a given attribute
* at the current valid-after time. */
- else if (fingerprintIndex == 0) {
+ } else if (fingerprintIndex == 0) {
authoritiesVotingOnAttribute++;
- }
/* Otherwise, if both indexes are non-zero, this value was put in to
* count how many authorities assign the attribute to this relay at
* this valid-after time. */
- else {
+ } else {
votesForAttribute++;
}
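The Main.java hunks above split up the constants of a bit-packing scheme: each vote observation is encoded into a single long using 22 bits of valid-after half-hours, 8 bits of attribute index, 24 bits of fingerprint index, and 6 bits of authority index. The composition step itself lies outside the hunks shown, so the pack() method below is an assumption consistent with the shift constants and the extraction code (hypothetical class name, arbitrary example values):

public class PackingSketch {

  /* Bit widths and shifts copied from the constants in the hunks above
   * (22 + 8 + 24 + 6 = 60 bits, which fits in a long). */
  static final int VALIDAFTER_LEN = 22;
  static final int ATTRIBUTE_LEN = 8;
  static final int FINGERPRINT_LEN = 24;
  static final int AUTHORITY_LEN = 6;

  static final int VALIDAFTER_SHIFT =
      ATTRIBUTE_LEN + FINGERPRINT_LEN + AUTHORITY_LEN;                /* 38 */
  static final int ATTRIBUTE_SHIFT = FINGERPRINT_LEN + AUTHORITY_LEN; /* 30 */
  static final int FINGERPRINT_SHIFT = AUTHORITY_LEN;                 /* 6 */

  static final long HALF_HOUR = 30L * 60L * 1000L;

  /* Assumed composition step (not shown in the hunks above). */
  static long pack(long validAfterMillis, int attributeIndex,
      int fingerprintIndex, int authorityIndex) {
    long validAfterHalfHours = validAfterMillis / HALF_HOUR;
    return (validAfterHalfHours << VALIDAFTER_SHIFT)
        | ((long) attributeIndex << ATTRIBUTE_SHIFT)
        | ((long) fingerprintIndex << FINGERPRINT_SHIFT)
        | authorityIndex;
  }

  /* Mirrors extractFingerprintIndexFromLongValue in the patch. */
  static int extractFingerprintIndex(long longValue) {
    return (int) ((longValue >> FINGERPRINT_SHIFT) % (1 << FINGERPRINT_LEN));
  }

  public static void main(String[] args) {
    /* Arbitrary example: a valid-after time in December 2015, attribute
     * index 3, fingerprint index 123456, authority index 5. */
    long packed = pack(1450526400000L, 3, 123456, 5);
    System.out.println(extractFingerprintIndex(packed));  /* prints 123456 */
  }
}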
diff --git a/modules/hidserv/src/org/torproject/metrics/hidserv/Aggregator.java b/modules/hidserv/src/org/torproject/metrics/hidserv/Aggregator.java
index 192a342..11be1d2 100644
--- a/modules/hidserv/src/org/torproject/metrics/hidserv/Aggregator.java
+++ b/modules/hidserv/src/org/torproject/metrics/hidserv/Aggregator.java
@@ -1,3 +1,6 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
package org.torproject.metrics.hidserv;
import java.io.BufferedWriter;
@@ -13,24 +16,24 @@ import java.util.Set;
import java.util.SortedMap;
import java.util.TreeMap;
-/* Aggregate extrapolated network totals of hidden-service statistics by
+/** Aggregate extrapolated network totals of hidden-service statistics by
* calculating statistics like the daily weighted interquartile mean.
* Also calculate simpler statistics like the number of reported
* statistics and the total network fraction of reporting relays. */
public class Aggregator {
- /* Document file containing extrapolated hidden-service statistics. */
+ /** Document file containing extrapolated hidden-service statistics. */
private File extrapolatedHidServStatsFile;
- /* Document store for storing and retrieving extrapolated hidden-service
+ /** Document store for storing and retrieving extrapolated hidden-service
* statistics. */
private DocumentStore<ExtrapolatedHidServStats>
extrapolatedHidServStatsStore;
- /* Output file for writing aggregated statistics. */
+ /** Output file for writing aggregated statistics. */
private File hidservStatsCsvFile;
- /* Initialize a new aggregator object using the given directory,
+ /** Initializes a new aggregator object using the given directory,
* document store, and output file for results. */
public Aggregator(File statusDirectory,
DocumentStore<ExtrapolatedHidServStats>
@@ -46,8 +49,8 @@ public class Aggregator {
this.hidservStatsCsvFile = hidservStatsCsvFile;
}
- /* Calculate aggregates for all extrapolated hidden-service statistics
- * and write them to the output file. */
+ /** Calculates aggregates for all extrapolated hidden-service statistics
+ * and writes them to the output file. */
public void aggregateHidServStats() {
/* Retrieve previously extrapolated network totals. */
@@ -67,9 +70,10 @@ public class Aggregator {
* dates, map values are double[] arrays with the extrapolated network
* total as first element and the corresponding computed network
* fraction as second element. */
- SortedMap<String, List<double[]>>
- extrapolatedCells = new TreeMap<String, List<double[]>>(),
- extrapolatedOnions = new TreeMap<String, List<double[]>>();
+ SortedMap<String, List<double[]>> extrapolatedCells =
+ new TreeMap<String, List<double[]>>();
+ SortedMap<String, List<double[]>> extrapolatedOnions =
+ new TreeMap<String, List<double[]>>();
for (ExtrapolatedHidServStats extrapolated : extrapolatedStats) {
String date = DateTimeHelper.format(
extrapolated.getStatsDateMillis(),
@@ -108,21 +112,22 @@ public class Aggregator {
? extrapolatedCells : extrapolatedOnions;
/* Go through all dates. */
- for (Map.Entry<String, List<double[]>> e :
- extrapolated.entrySet()) {
- String date = e.getKey();
+ for (Map.Entry<String, List<double[]>> e
+ : extrapolated.entrySet()) {
List<double[]> weightedValues = e.getValue();
- int numStats = weightedValues.size();
/* Sort extrapolated network totals contained in the first array
* element. (The second array element contains the computed
* network fraction as weight.) */
Collections.sort(weightedValues,
new Comparator<double[]>() {
- public int compare(double[] o1, double[] o2) {
- return o1[0] < o2[0] ? -1 : o1[0] > o2[0] ? 1 : 0;
- }
- });
+ public int compare(double[] first, double[] second) {
+ return first[0] < second[0] ? -1
+ : first[0] > second[0] ? 1
+ : 0;
+ }
+ }
+ );
/* For the weighted mean, sum up all previously extrapolated
* values weighted with their network fractions (which happens to
@@ -130,7 +135,8 @@ public class Aggregator {
* fractions. Once we have those two sums, we can divide the sum
* of weighted extrapolated values by the sum of network fractions
* to obtain the weighted mean of extrapolated values. */
- double sumReported = 0.0, sumFraction = 0.0;
+ double sumReported = 0.0;
+ double sumFraction = 0.0;
for (double[] d : weightedValues) {
sumReported += d[0] * d[1];
sumFraction += d[1];
@@ -146,18 +152,19 @@ public class Aggregator {
* 75% range and later compute the weighted mean of those. */
double weightIntervalEnd = 0.0;
Double weightedMedian = null;
- double sumFractionInterquartile = 0.0,
- sumReportedInterquartile = 0.0;
+ double sumFractionInterquartile = 0.0;
+ double sumReportedInterquartile = 0.0;
for (double[] d : weightedValues) {
- double extrapolatedValue = d[0], computedFraction = d[1];
+ double extrapolatedValue = d[0];
+ double computedFraction = d[1];
double weightIntervalStart = weightIntervalEnd;
weightIntervalEnd += computedFraction;
- if (weightedMedian == null &&
- weightIntervalEnd > sumFraction * 0.5) {
+ if (weightedMedian == null
+ && weightIntervalEnd > sumFraction * 0.5) {
weightedMedian = extrapolatedValue;
}
- if (weightIntervalEnd >= sumFraction * 0.25 &&
- weightIntervalStart <= sumFraction * 0.75) {
+ if (weightIntervalEnd >= sumFraction * 0.25
+ && weightIntervalStart <= sumFraction * 0.75) {
double fractionBetweenQuartiles =
Math.min(weightIntervalEnd, sumFraction * 0.75)
- Math.max(weightIntervalStart, sumFraction * 0.25);
@@ -170,6 +177,8 @@ public class Aggregator {
sumReportedInterquartile / sumFractionInterquartile;
/* Put together all aggregated values in a single line. */
+ String date = e.getKey();
+ int numStats = weightedValues.size();
sb.append(String.format("%s,%s,%.0f,%.0f,%.0f,%.8f,%d%n", date,
type, weightedMean, weightedMedian, weightedInterquartileMean,
sumFraction, numStats));
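The Aggregator hunk above computes, per date and statistic, a weighted mean, a weighted median, and a weighted interquartile mean over pairs of (extrapolated network total, computed network fraction). The same arithmetic, reassembled into a small stand-alone sketch with made-up numbers (hypothetical class name):

import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

public class WeightedStatsSketch {

  public static void main(String[] args) {
    /* Each element is { extrapolated value, network fraction }; the
     * numbers are made up for illustration. */
    List<double[]> weightedValues = new ArrayList<double[]>();
    weightedValues.add(new double[] { 300.0, 0.3 });
    weightedValues.add(new double[] { 100.0, 0.1 });
    weightedValues.add(new double[] { 200.0, 0.2 });

    /* Sort by extrapolated value, like the Comparator in the patch. */
    Collections.sort(weightedValues, new Comparator<double[]>() {
      public int compare(double[] first, double[] second) {
        return first[0] < second[0] ? -1 : first[0] > second[0] ? 1 : 0;
      }
    });

    /* Weighted mean: sum of value * weight divided by sum of weights. */
    double sumReported = 0.0;
    double sumFraction = 0.0;
    for (double[] d : weightedValues) {
      sumReported += d[0] * d[1];
      sumFraction += d[1];
    }
    double weightedMean = sumReported / sumFraction;

    /* Weighted median and interquartile mean: walk the sorted values,
     * tracking the cumulative weight interval covered by each value. */
    double weightIntervalEnd = 0.0;
    Double weightedMedian = null;
    double sumFractionInterquartile = 0.0;
    double sumReportedInterquartile = 0.0;
    for (double[] d : weightedValues) {
      double weightIntervalStart = weightIntervalEnd;
      weightIntervalEnd += d[1];
      if (weightedMedian == null && weightIntervalEnd > sumFraction * 0.5) {
        weightedMedian = d[0];
      }
      if (weightIntervalEnd >= sumFraction * 0.25
          && weightIntervalStart <= sumFraction * 0.75) {
        double fractionBetweenQuartiles =
            Math.min(weightIntervalEnd, sumFraction * 0.75)
            - Math.max(weightIntervalStart, sumFraction * 0.25);
        sumFractionInterquartile += fractionBetweenQuartiles;
        sumReportedInterquartile += d[0] * fractionBetweenQuartiles;
      }
    }
    double weightedInterquartileMean =
        sumReportedInterquartile / sumFractionInterquartile;

    System.out.printf("mean=%.1f median=%.1f iqm=%.1f%n",
        weightedMean, weightedMedian, weightedInterquartileMean);
  }
}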
diff --git a/modules/hidserv/src/org/torproject/metrics/hidserv/ComputedNetworkFractions.java b/modules/hidserv/src/org/torproject/metrics/hidserv/ComputedNetworkFractions.java
index cda6b67..9eeea78 100644
--- a/modules/hidserv/src/org/torproject/metrics/hidserv/ComputedNetworkFractions.java
+++ b/modules/hidserv/src/org/torproject/metrics/hidserv/ComputedNetworkFractions.java
@@ -1,51 +1,60 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
package org.torproject.metrics.hidserv;
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
-/* Computed fraction of hidden-service activity that a single relay is
+/** Computed fraction of hidden-service activity that a single relay is
* assumed to observe in the network. These fractions are computed from
* status entries and bandwidth weights in a network status consensus. */
public class ComputedNetworkFractions implements Document {
- /* Relay fingerprint consisting of 40 upper-case hex characters. */
+ /** Relay fingerprint consisting of 40 upper-case hex characters. */
private String fingerprint;
+
public String getFingerprint() {
return this.fingerprint;
}
- /* Valid-after timestamp of the consensus in milliseconds. */
+ /** Valid-after timestamp of the consensus in milliseconds. */
private long validAfterMillis;
+
public long getValidAfterMillis() {
return this.validAfterMillis;
}
- /* Fraction of cells on rendezvous circuits that this relay is assumed
+ /** Fraction of cells on rendezvous circuits that this relay is assumed
* to observe in the network. */
private double fractionRendRelayedCells;
+
public void setFractionRendRelayedCells(
double fractionRendRelayedCells) {
this.fractionRendRelayedCells = fractionRendRelayedCells;
}
+
public double getFractionRendRelayedCells() {
return this.fractionRendRelayedCells;
}
- /* Fraction of descriptors that this relay is assumed to observe in the
+ /** Fraction of descriptors that this relay is assumed to observe in the
* network. This is calculated as the fraction of descriptors
* identifiers that this relay was responsible for, divided by 3,
* because each descriptor that is published to this directory is also
* published to two other directories. */
private double fractionDirOnionsSeen;
+
public void setFractionDirOnionsSeen(double fractionDirOnionsSeen) {
this.fractionDirOnionsSeen = fractionDirOnionsSeen;
}
+
public double getFractionDirOnionsSeen() {
return this.fractionDirOnionsSeen;
}
- /* Instantiate a new fractions object using fingerprint and consensus
+ /** Instantiates a new fractions object using fingerprint and consensus
* valid-after time which together uniquely identify the object. */
public ComputedNetworkFractions(String fingerprint,
long validAfterMillis) {
@@ -53,7 +62,7 @@ public class ComputedNetworkFractions implements Document {
this.validAfterMillis = validAfterMillis;
}
- /* Return whether this object contains the same fingerprint and
+ /** Returns whether this object contains the same fingerprint and
* consensus valid-after time as the passed object. */
@Override
public boolean equals(Object otherObject) {
@@ -62,22 +71,22 @@ public class ComputedNetworkFractions implements Document {
}
ComputedNetworkFractions other =
(ComputedNetworkFractions) otherObject;
- return this.fingerprint.equals(other.fingerprint) &&
- this.validAfterMillis == other.validAfterMillis;
+ return this.fingerprint.equals(other.fingerprint)
+ && this.validAfterMillis == other.validAfterMillis;
}
- /* Return a (hopefully unique) hash code based on this object's
+ /** Returns a (hopefully unique) hash code based on this object's
* fingerprint and consensus valid-after time. */
@Override
public int hashCode() {
- return this.fingerprint.hashCode() +
- (int) this.validAfterMillis;
+ return this.fingerprint.hashCode()
+ + (int) this.validAfterMillis;
}
private static Map<Long, String> previouslyFormattedDates =
Collections.synchronizedMap(new HashMap<Long, String>());
- /* Return a string representation of this object, consisting of two
+ /** Returns a string representation of this object, consisting of two
* strings: the first string contains fingerprint and valid-after date,
* the second string contains the concatenation of all other
* attributes. */
@@ -107,7 +116,7 @@ public class ComputedNetworkFractions implements Document {
return new String[] { first, second };
}
- /* Instantiate an empty fractions object that will be initialized more
+ /** Instantiates an empty fractions object that will be initialized more
* by the parse method. */
ComputedNetworkFractions() {
}
@@ -115,9 +124,9 @@ public class ComputedNetworkFractions implements Document {
private static Map<String, Long> previouslyParsedDates =
Collections.synchronizedMap(new HashMap<String, Long>());
- /* Initialize this fractions object using the two provided strings that
- * have been produced by the format method earlier. Return whether this
- * operation was successful. */
+ /** Initializes this fractions object using the two provided strings
+ * that have been produced by the format method earlier and returns
+ * whether this operation was successful. */
@Override
public boolean parse(String[] formattedStrings) {
if (formattedStrings.length != 2) {
@@ -138,8 +147,8 @@ public class ComputedNetworkFractions implements Document {
+ "Skipping.%n");
return false;
}
- String validAfterDate = firstParts[1],
- validAfterHour = secondParts[0];
+ String validAfterDate = firstParts[1];
+ String validAfterHour = secondParts[0];
long validAfterDateMillis;
if (previouslyParsedDates.containsKey(validAfterDate)) {
validAfterDateMillis = previouslyParsedDates.get(validAfterDate);
@@ -150,22 +159,20 @@ public class ComputedNetworkFractions implements Document {
}
long validAfterTimeMillis = Long.parseLong(validAfterHour)
* DateTimeHelper.ONE_HOUR;
- if (validAfterDateMillis == DateTimeHelper.NO_TIME_AVAILABLE ||
- validAfterTimeMillis < 0L ||
- validAfterTimeMillis >= DateTimeHelper.ONE_DAY) {
+ if (validAfterDateMillis == DateTimeHelper.NO_TIME_AVAILABLE
+ || validAfterTimeMillis < 0L
+ || validAfterTimeMillis >= DateTimeHelper.ONE_DAY) {
System.err.printf("Invalid date/hour format. Skipping.%n");
return false;
}
long validAfterMillis = validAfterDateMillis + validAfterTimeMillis;
try {
- double fractionRendRelayedCells = secondParts[1].equals("")
- ? 0.0 : Double.parseDouble(secondParts[1]);
- double fractionDirOnionsSeen = secondParts[2].equals("")
- ? 0.0 : Double.parseDouble(secondParts[2]);
this.fingerprint = fingerprint;
this.validAfterMillis = validAfterMillis;
- this.fractionRendRelayedCells = fractionRendRelayedCells;
- this.fractionDirOnionsSeen = fractionDirOnionsSeen;
+ this.fractionRendRelayedCells = secondParts[1].equals("")
+ ? 0.0 : Double.parseDouble(secondParts[1]);
+ this.fractionDirOnionsSeen = secondParts[2].equals("")
+ ? 0.0 : Double.parseDouble(secondParts[2]);
return true;
} catch (NumberFormatException e) {
System.err.printf("Invalid number format. Skipping.%n");
diff --git a/modules/hidserv/src/org/torproject/metrics/hidserv/DateTimeHelper.java b/modules/hidserv/src/org/torproject/metrics/hidserv/DateTimeHelper.java
index c33a50d..d5cf847 100644
--- a/modules/hidserv/src/org/torproject/metrics/hidserv/DateTimeHelper.java
+++ b/modules/hidserv/src/org/torproject/metrics/hidserv/DateTimeHelper.java
@@ -1,3 +1,6 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
package org.torproject.metrics.hidserv;
import java.text.DateFormat;
@@ -7,49 +10,57 @@ import java.util.HashMap;
import java.util.Map;
import java.util.TimeZone;
-/* Utility class to format and parse dates and timestamps. */
+/** Utility class to format and parse dates and timestamps. */
public class DateTimeHelper {
- /* This class is not supposed to be instantiated, which is why its
+ /** This class is not supposed to be instantiated, which is why its
* constructor has private visibility. */
private DateTimeHelper() {
}
/* Some useful time constant. */
- public static final long
- ONE_SECOND = 1000L,
- ONE_MINUTE = 60L * ONE_SECOND,
- ONE_HOUR = 60L * ONE_MINUTE,
- ONE_DAY = 24L * ONE_HOUR;
+ public static final long ONE_SECOND = 1000L;
+
+ public static final long ONE_MINUTE = 60L * ONE_SECOND;
+
+ public static final long ONE_HOUR = 60L * ONE_MINUTE;
+
+ public static final long ONE_DAY = 24L * ONE_HOUR;
/* Some useful date/time formats. */
- public static final String
- ISO_DATETIME_FORMAT = "yyyy-MM-dd HH:mm:ss",
- ISO_DATE_HOUR_FORMAT = "yyyy-MM-dd HH",
- ISO_DATE_FORMAT = "yyyy-MM-dd",
- ISO_HOUR_FORMAT = "HH";
+ public static final String ISO_DATETIME_FORMAT = "yyyy-MM-dd HH:mm:ss";
+
+ public static final String ISO_DATE_HOUR_FORMAT = "yyyy-MM-dd HH";
+
+ public static final String ISO_DATE_FORMAT = "yyyy-MM-dd";
- /* Map of DateFormat instances for parsing and formatting dates and
+ public static final String ISO_HOUR_FORMAT = "HH";
+
+ /** Map of DateFormat instances for parsing and formatting dates and
* timestamps, protected using ThreadLocal to ensure that each thread
* uses its own instances. */
private static ThreadLocal<Map<String, DateFormat>> dateFormats =
- new ThreadLocal<Map<String, DateFormat>> () {
+ new ThreadLocal<Map<String, DateFormat>>() {
+
public Map<String, DateFormat> get() {
return super.get();
}
+
protected Map<String, DateFormat> initialValue() {
return new HashMap<String, DateFormat>();
}
+
public void remove() {
super.remove();
}
+
public void set(Map<String, DateFormat> value) {
super.set(value);
}
};
- /* Return an instance of DateFormat for the given format. If no such
- * instance exists, create one and put it in the map. */
+ /** Returns an instance of DateFormat for the given format, and if no
+ * such instance exists, creates one and puts it in the map. */
private static DateFormat getDateFormat(String format) {
Map<String, DateFormat> threadDateFormats = dateFormats.get();
if (!threadDateFormats.containsKey(format)) {
@@ -61,21 +72,22 @@ public class DateTimeHelper {
return threadDateFormats.get(format);
}
- /* Format the given time in milliseconds using the given format. */
+ /** Formats the given time in milliseconds using the given format. */
public static String format(long millis, String format) {
return getDateFormat(format).format(millis);
}
- /* Format the given time in milliseconds using ISO date/time format. */
+ /** Formats the given time in milliseconds using ISO date/time
+ * format. */
public static String format(long millis) {
return format(millis, ISO_DATETIME_FORMAT);
}
- /* Default result of the parse methods if the provided time could not be
- * parsed. */
- public final static long NO_TIME_AVAILABLE = -1L;
+ /** Default result of the parse methods if the provided time could not
+ * be parsed. */
+ public static final long NO_TIME_AVAILABLE = -1L;
- /* Parse the given string using the given format. */
+ /** Parses the given string using the given format. */
public static long parse(String string, String format) {
if (null == string) {
return NO_TIME_AVAILABLE;
@@ -87,7 +99,7 @@ public class DateTimeHelper {
}
}
- /* Parse the given string using ISO date/time format. */
+ /** Parses the given string using ISO date/time format. */
public static long parse(String string) {
return parse(string, ISO_DATETIME_FORMAT);
}
diff --git a/modules/hidserv/src/org/torproject/metrics/hidserv/Document.java b/modules/hidserv/src/org/torproject/metrics/hidserv/Document.java
index 47614f3..0ac2aa3 100644
--- a/modules/hidserv/src/org/torproject/metrics/hidserv/Document.java
+++ b/modules/hidserv/src/org/torproject/metrics/hidserv/Document.java
@@ -1,19 +1,26 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
package org.torproject.metrics.hidserv;
-/* Common interface of documents that are supposed to be serialized and
+/** Common interface of documents that are supposed to be serialized and
* stored in document files and later retrieved and de-serialized. */
public interface Document {
- /* Return an array of two strings with a string representation of this
- * document. The first string will be used to start a group of
- * documents, the second string will be used to represent a single
- * document in that group. Ideally, the first string is equivalent for
- * many documents stored in the same file, and the second string is
- * different for those documents. */
+ /** Returns an array of two strings with a string representation of this
+ * document.
+ *
+ * <p>The first string will be used to start a group of documents, the
+ * second string will be used to represent a single document in that
+ * group. Ideally, the first string is equivalent for many documents
+ * stored in the same file, and the second string is different for those
+ * documents.</p> */
public String[] format();
- /* Initialize an object using the given array of two strings. These are
- * the same two strings that the format method provides. */
+ /** Initializes an object using the given array of two strings.
+ *
+ * <p>These are the same two strings that the format method
+ * provides.</p> */
public boolean parse(String[] formattedStrings);
}
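
To make the format()/parse() contract above more concrete, here is a deliberately trivial Document implementation. It is only an illustration, not part of this change, and the class and field names are made up.

package org.torproject.metrics.hidserv;

/* Illustration only: a minimal Document implementation. */
public class ExampleDocument implements Document {

  /* First formatted string, ideally shared by many documents stored in
   * the same file. */
  private String group;

  /* Second formatted string, different for each document in a group. */
  private String member;

  ExampleDocument(String group, String member) {
    this.group = group;
    this.member = member;
  }

  /* No-argument constructor for use before parse(). */
  ExampleDocument() {
  }

  @Override
  public String[] format() {
    return new String[] { this.group, this.member };
  }

  @Override
  public boolean parse(String[] formattedStrings) {
    if (formattedStrings == null || formattedStrings.length != 2) {
      return false;
    }
    this.group = formattedStrings[0];
    this.member = formattedStrings[1];
    return true;
  }
}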
diff --git a/modules/hidserv/src/org/torproject/metrics/hidserv/DocumentStore.java b/modules/hidserv/src/org/torproject/metrics/hidserv/DocumentStore.java
index e7ef0aa..a257f08 100644
--- a/modules/hidserv/src/org/torproject/metrics/hidserv/DocumentStore.java
+++ b/modules/hidserv/src/org/torproject/metrics/hidserv/DocumentStore.java
@@ -1,3 +1,6 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
package org.torproject.metrics.hidserv;
import java.io.BufferedReader;
@@ -15,23 +18,24 @@ import java.util.SortedSet;
import java.util.TreeMap;
import java.util.TreeSet;
-/* Utility class to store serialized objects implementing the Document
+/** Utility class to store serialized objects implementing the Document
* interface to a file and later to retrieve them. */
public class DocumentStore<T extends Document> {
- /* Document class, needed to create new instances when retrieving
+ /** Document class, needed to create new instances when retrieving
* documents. */
private Class<T> clazz;
- /* Initialize a new store object for the given type of documents. */
+ /** Initializes a new store object for the given type of documents. */
DocumentStore(Class<T> clazz) {
this.clazz = clazz;
}
- /* Store the provided documents in the given file and return whether the
- * storage operation was successful. If the file already existed and if
- * it contains documents, merge the new documents with the existing
- * ones. */
+ /** Stores the provided documents in the given file and returns whether
+ * the storage operation was successful.
+ *
+ * <p>If the file already exists and contains documents, merge the new
+ * documents with the existing ones.</p> */
public boolean store(File documentFile, Set<T> documentsToStore) {
/* Retrieve existing documents. */
@@ -75,8 +79,8 @@ public class DocumentStore<T extends Document> {
documentTempFile.getParentFile().mkdirs();
BufferedWriter bw = new BufferedWriter(new FileWriter(
documentTempFile));
- for (Map.Entry<String, SortedSet<String>> e :
- formattedDocuments.entrySet()) {
+ for (Map.Entry<String, SortedSet<String>> e
+ : formattedDocuments.entrySet()) {
bw.write(e.getKey() + "\n");
for (String s : e.getValue()) {
bw.write(" " + s + "\n");
@@ -95,12 +99,12 @@ public class DocumentStore<T extends Document> {
return true;
}
- /* Retrieve all previously stored documents from the given file. */
+ /** Retrieves all previously stored documents from the given file. */
public Set<T> retrieve(File documentFile) {
return this.retrieve(documentFile, "");
}
- /* Retrieve previously stored documents from the given file that start
+ /** Retrieves previously stored documents from the given file that start
* with the given prefix. */
public Set<T> retrieve(File documentFile, String prefix) {
@@ -116,7 +120,8 @@ public class DocumentStore<T extends Document> {
try {
LineNumberReader lnr = new LineNumberReader(new BufferedReader(
new FileReader(documentFile)));
- String line, formattedString0 = null;
+ String line;
+ String formattedString0 = null;
while ((line = lnr.readLine()) != null) {
if (!line.startsWith(" ")) {
formattedString0 = line;
@@ -126,12 +131,13 @@ public class DocumentStore<T extends Document> {
+ "documents.%n", documentFile.getAbsolutePath());
lnr.close();
return null;
- } else if (prefix.length() > formattedString0.length() &&
- !(formattedString0 + line.substring(1)).startsWith(prefix)) {
+ } else if (prefix.length() > formattedString0.length()
+ && !(formattedString0 + line.substring(1))
+ .startsWith(prefix)) {
/* Skip combined line not starting with prefix. */
continue;
- } else if (prefix.length() > 0 &&
- !formattedString0.startsWith(prefix)) {
+ } else if (prefix.length() > 0
+ && !formattedString0.startsWith(prefix)) {
/* Skip line not starting with prefix. */
continue;
} else {
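
A hypothetical usage sketch of the store/retrieve pair above may help; the same package is assumed because the constructor is package-private, the file path and fingerprint are placeholders, and error handling is omitted.

package org.torproject.metrics.hidserv;

import java.io.File;
import java.util.HashSet;
import java.util.Set;

public class DocumentStoreUsageSketch {

  public static void main(String[] args) {
    DocumentStore<ExtrapolatedHidServStats> store =
        new DocumentStore<ExtrapolatedHidServStats>(
        ExtrapolatedHidServStats.class);
    File documentFile = new File("status/extrapolated-hidserv-stats");

    /* store() merges the new documents with any documents already
     * contained in the file. */
    Set<ExtrapolatedHidServStats> toStore =
        new HashSet<ExtrapolatedHidServStats>();
    toStore.add(new ExtrapolatedHidServStats(
        DateTimeHelper.parse("2016-07-22 00:00:00"),
        "A0B1C2D3E4F5A0B1C2D3E4F5A0B1C2D3E4F5A0B1"));
    store.store(documentFile, toStore);

    /* retrieve() without a prefix returns all stored documents; with a
     * prefix it only returns documents whose formatted representation
     * starts with that prefix, here an ISO date prefix. */
    Set<ExtrapolatedHidServStats> all = store.retrieve(documentFile);
    Set<ExtrapolatedHidServStats> july =
        store.retrieve(documentFile, "2016-07");
    System.out.println(all.size() + " total, " + july.size()
        + " from July 2016.");
  }
}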
diff --git a/modules/hidserv/src/org/torproject/metrics/hidserv/ExtrapolatedHidServStats.java b/modules/hidserv/src/org/torproject/metrics/hidserv/ExtrapolatedHidServStats.java
index 52357d4..26c3dde 100644
--- a/modules/hidserv/src/org/torproject/metrics/hidserv/ExtrapolatedHidServStats.java
+++ b/modules/hidserv/src/org/torproject/metrics/hidserv/ExtrapolatedHidServStats.java
@@ -1,66 +1,79 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
package org.torproject.metrics.hidserv;
-/* Extrapolated network totals of hidden-service statistics reported by a
+/** Extrapolated network totals of hidden-service statistics reported by a
* single relay. Extrapolated values are based on reported statistics and
* computed network fractions in the statistics interval. */
public class ExtrapolatedHidServStats implements Document {
- /* Date of statistics interval end in milliseconds. */
+ /** Date of statistics interval end in milliseconds. */
private long statsDateMillis;
+
public long getStatsDateMillis() {
return this.statsDateMillis;
}
- /* Relay fingerprint consisting of 40 upper-case hex characters. */
+ /** Relay fingerprint consisting of 40 upper-case hex characters. */
private String fingerprint;
+
public String getFingerprint() {
return this.fingerprint;
}
- /* Extrapolated number of cells on rendezvous circuits in the
+ /** Extrapolated number of cells on rendezvous circuits in the
* network. */
private double extrapolatedRendRelayedCells;
+
public void setExtrapolatedRendRelayedCells(
double extrapolatedRendRelayedCells) {
this.extrapolatedRendRelayedCells = extrapolatedRendRelayedCells;
}
+
public double getExtrapolatedRendRelayedCells() {
return this.extrapolatedRendRelayedCells;
}
- /* Computed fraction of observed cells on rendezvous circuits in the
+ /** Computed fraction of observed cells on rendezvous circuits in the
* network, used to weight this relay's extrapolated network total in
* the aggregation step. */
private double fractionRendRelayedCells;
+
public void setFractionRendRelayedCells(
double fractionRendRelayedCells) {
this.fractionRendRelayedCells = fractionRendRelayedCells;
}
+
public double getFractionRendRelayedCells() {
return this.fractionRendRelayedCells;
}
- /* Extrapolated number of .onions in the network. */
+ /** Extrapolated number of .onions in the network. */
private double extrapolatedDirOnionsSeen;
+
public void setExtrapolatedDirOnionsSeen(
double extrapolatedDirOnionsSeen) {
this.extrapolatedDirOnionsSeen = extrapolatedDirOnionsSeen;
}
+
public double getExtrapolatedDirOnionsSeen() {
return this.extrapolatedDirOnionsSeen;
}
- /* Computed fraction of observed .onions in the network, used to weight
+ /** Computed fraction of observed .onions in the network, used to weight
* this relay's extrapolated network total in the aggregation step. */
private double fractionDirOnionsSeen;
+
public void setFractionDirOnionsSeen(double fractionDirOnionsSeen) {
this.fractionDirOnionsSeen = fractionDirOnionsSeen;
}
+
public double getFractionDirOnionsSeen() {
return this.fractionDirOnionsSeen;
}
- /* Instantiate a new stats object using fingerprint and statistics
+ /** Instantiates a new stats object using fingerprint and statistics
* interval end date which together uniquely identify the object. */
public ExtrapolatedHidServStats(long statsDateMillis,
String fingerprint) {
@@ -68,7 +81,7 @@ public class ExtrapolatedHidServStats implements Document {
this.fingerprint = fingerprint;
}
- /* Return whether this object contains the same fingerprint and
+ /** Returns whether this object contains the same fingerprint and
* statistics interval end date as the passed object. */
@Override
public boolean equals(Object otherObject) {
@@ -77,42 +90,42 @@ public class ExtrapolatedHidServStats implements Document {
}
ExtrapolatedHidServStats other =
(ExtrapolatedHidServStats) otherObject;
- return this.fingerprint.equals(other.fingerprint) &&
- this.statsDateMillis == other.statsDateMillis;
+ return this.fingerprint.equals(other.fingerprint)
+ && this.statsDateMillis == other.statsDateMillis;
}
- /* Return a (hopefully unique) hash code based on this object's
+ /** Returns a (hopefully unique) hash code based on this object's
* fingerprint and statistics interval end date. */
@Override
public int hashCode() {
return this.fingerprint.hashCode() + (int) this.statsDateMillis;
}
- /* Return a string representation of this object, consisting of the
+ /** Returns a string representation of this object, consisting of the
* statistics interval end date and the concatenation of all other
* attributes. */
@Override
public String[] format() {
String first = DateTimeHelper.format(this.statsDateMillis,
DateTimeHelper.ISO_DATE_FORMAT);
- String second = this.fingerprint +
- (this.fractionRendRelayedCells == 0.0 ? ",,"
+ String second = this.fingerprint
+ + (this.fractionRendRelayedCells == 0.0 ? ",,"
: String.format(",%.0f,%f", this.extrapolatedRendRelayedCells,
- this.fractionRendRelayedCells)) +
- (this.fractionDirOnionsSeen == 0.0 ? ",,"
+ this.fractionRendRelayedCells))
+ + (this.fractionDirOnionsSeen == 0.0 ? ",,"
: String.format(",%.0f,%f", this.extrapolatedDirOnionsSeen,
this.fractionDirOnionsSeen));
return new String[] { first, second };
}
- /* Instantiate an empty stats object that will be initialized more by
+ /** Instantiates an empty stats object that will be initialized further by
* the parse method. */
ExtrapolatedHidServStats() {
}
- /* Initialize this stats object using the two provided strings that have
- * been produced by the format method earlier. Return whether this
- * operation was successful. */
+ /** Initializes this stats object using the two provided strings that
+ * have been produced by the format method earlier and returns whether
+ * this operation was successful. */
@Override
public boolean parse(String[] formattedStrings) {
if (formattedStrings.length != 2) {
@@ -129,9 +142,10 @@ public class ExtrapolatedHidServStats implements Document {
return false;
}
String fingerprint = secondParts[0];
- double extrapolatedRendRelayedCells = 0.0,
- fractionRendRelayedCells = 0.0, extrapolatedDirOnionsSeen = 0.0,
- fractionDirOnionsSeen = 0.0;
+ double extrapolatedRendRelayedCells = 0.0;
+ double fractionRendRelayedCells = 0.0;
+ double extrapolatedDirOnionsSeen = 0.0;
+ double fractionDirOnionsSeen = 0.0;
try {
extrapolatedRendRelayedCells = secondParts[1].equals("") ? 0.0
: Double.parseDouble(secondParts[1]);
diff --git a/modules/hidserv/src/org/torproject/metrics/hidserv/Extrapolator.java b/modules/hidserv/src/org/torproject/metrics/hidserv/Extrapolator.java
index e926154..8dec411 100644
--- a/modules/hidserv/src/org/torproject/metrics/hidserv/Extrapolator.java
+++ b/modules/hidserv/src/org/torproject/metrics/hidserv/Extrapolator.java
@@ -1,3 +1,6 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
package org.torproject.metrics.hidserv;
import java.io.File;
@@ -9,37 +12,37 @@ import java.util.SortedSet;
import java.util.TreeMap;
import java.util.TreeSet;
-/* Extrapolate hidden-service statistics reported by single relays by
+/** Extrapolate hidden-service statistics reported by single relays by
* dividing them by the computed fraction of hidden-service activity
* observed by the relay. */
public class Extrapolator {
- /* Document file containing previously parsed reported hidden-service
+ /** Document file containing previously parsed reported hidden-service
* statistics. */
private File reportedHidServStatsFile;
- /* Document store for storing and retrieving reported hidden-service
+ /** Document store for storing and retrieving reported hidden-service
* statistics. */
private DocumentStore<ReportedHidServStats> reportedHidServStatsStore;
- /* Directory containing document files with previously computed network
+ /** Directory containing document files with previously computed network
* fractions. */
private File computedNetworkFractionsDirectory;
- /* Document store for storing and retrieving computed network
+ /** Document store for storing and retrieving computed network
* fractions. */
private DocumentStore<ComputedNetworkFractions>
computedNetworkFractionsStore;
- /* Document file containing extrapolated hidden-service statistics. */
+ /** Document file containing extrapolated hidden-service statistics. */
private File extrapolatedHidServStatsFile;
- /* Document store for storing and retrieving extrapolated hidden-service
+ /** Document store for storing and retrieving extrapolated hidden-service
* statistics. */
private DocumentStore<ExtrapolatedHidServStats>
extrapolatedHidServStatsStore;
- /* Initialize a new extrapolator object using the given directory and
+ /** Initializes a new extrapolator object using the given directory and
* document stores. */
public Extrapolator(File statusDirectory,
DocumentStore<ReportedHidServStats> reportedHidServStatsStore,
@@ -63,7 +66,7 @@ public class Extrapolator {
this.extrapolatedHidServStatsStore = extrapolatedHidServStatsStore;
}
- /* Iterate over all reported stats and extrapolate network totals for
+ /** Iterates over all reported stats and extrapolates network totals for
* those that have not been extrapolated before. */
public boolean extrapolateHidServStats() {
@@ -100,8 +103,8 @@ public class Extrapolator {
}
/* Go through reported stats by fingerprint. */
- for (Map.Entry<String, Set<ReportedHidServStats>> e :
- parsedStatsByFingerprint.entrySet()) {
+ for (Map.Entry<String, Set<ReportedHidServStats>> e
+ : parsedStatsByFingerprint.entrySet()) {
String fingerprint = e.getKey();
/* Iterate over all stats reported by this relay and make a list of
@@ -176,12 +179,12 @@ public class Extrapolator {
/* Sum up computed network fractions and count known consensus in
* the relevant interval, so that we can later compute means of
* network fractions. */
- double sumFractionRendRelayedCells = 0.0,
- sumFractionDirOnionsSeen = 0.0;
+ double sumFractionRendRelayedCells = 0.0;
+ double sumFractionDirOnionsSeen = 0.0;
int consensuses = 0;
for (long validAfterMillis : knownConsensuses) {
- if (statsStartMillis <= validAfterMillis &&
- validAfterMillis < statsEndMillis) {
+ if (statsStartMillis <= validAfterMillis
+ && validAfterMillis < statsEndMillis) {
if (computedNetworkFractions.containsKey(validAfterMillis)) {
ComputedNetworkFractions frac =
computedNetworkFractions.get(validAfterMillis);
@@ -208,8 +211,8 @@ public class Extrapolator {
/* If at least one fraction is positive, extrapolate network
* totals. */
- if (fractionRendRelayedCells > 0.0 ||
- fractionDirOnionsSeen > 0.0) {
+ if (fractionRendRelayedCells > 0.0
+ || fractionDirOnionsSeen > 0.0) {
ExtrapolatedHidServStats extrapolated =
new ExtrapolatedHidServStats(
statsDateMillis, fingerprint);
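
The core of the extrapolation step above is a simple division: a value reported by one relay is divided by that relay's mean network fraction over the statistics interval, and the fraction itself is later used as the weight in the aggregation step. A stand-alone sketch of just that arithmetic follows; names and numbers are illustrative, not taken from the module.

public class ExtrapolationSketch {

  /* Returns the extrapolated network total for a single relay, or -1.0
   * if the relay observed no fraction of the network at all. */
  static double extrapolate(long reportedValue, double networkFraction) {
    if (networkFraction <= 0.0) {
      return -1.0;
    }
    return reportedValue / networkFraction;
  }

  public static void main(String[] args) {
    /* A relay that relayed 1,200,000 cells while observing 2 percent of
     * all rendezvous traffic suggests roughly 60,000,000 cells in the
     * whole network. */
    System.out.println(extrapolate(1200000L, 0.02));
  }
}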
diff --git a/modules/hidserv/src/org/torproject/metrics/hidserv/Main.java b/modules/hidserv/src/org/torproject/metrics/hidserv/Main.java
index 7405b78..f29c868 100644
--- a/modules/hidserv/src/org/torproject/metrics/hidserv/Main.java
+++ b/modules/hidserv/src/org/torproject/metrics/hidserv/Main.java
@@ -1,17 +1,20 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
package org.torproject.metrics.hidserv;
import java.io.File;
import java.util.HashSet;
import java.util.Set;
-/* Main class for updating extrapolated network totals of hidden-service
+/** Main class for updating extrapolated network totals of hidden-service
* statistics. The main method of this class can be executed as often as
* new statistics are needed, though callers must ensure that executions
* do not overlap. */
public class Main {
- /* Parse new descriptors, extrapolate contained statistics using
- * computed network fractions, aggregate results, and write results to
+ /** Parses new descriptors, extrapolates contained statistics using
+ * computed network fractions, aggregates results, and writes results to
* disk. */
public static void main(String[] args) {
@@ -22,10 +25,11 @@ public class Main {
inDirectories.add(
new File("../../shared/in/recent/relay-descriptors/extra-infos"));
File statusDirectory = new File("status");
- File hidservStatsExtrapolatedCsvFile = new File("stats/hidserv.csv");
- /* Initialize document stores that will handle writing documents to
- * files. */
+ /* Initialize parser and read parse history to avoid parsing
+ * descriptor files that haven't changed since the last execution. */
+ System.out.println("Initializing parser and reading parse "
+ + "history...");
DocumentStore<ReportedHidServStats> reportedHidServStatsStore =
new DocumentStore<ReportedHidServStats>(
ReportedHidServStats.class);
@@ -33,14 +37,6 @@ public class Main {
computedNetworkFractionsStore =
new DocumentStore<ComputedNetworkFractions>(
ComputedNetworkFractions.class);
- DocumentStore<ExtrapolatedHidServStats> extrapolatedHidServStatsStore
- = new DocumentStore<ExtrapolatedHidServStats>(
- ExtrapolatedHidServStats.class);
-
- /* Initialize parser and read parse history to avoid parsing
- * descriptor files that haven't changed since the last execution. */
- System.out.println("Initializing parser and reading parse "
- + "history...");
Parser parser = new Parser(inDirectories, statusDirectory,
reportedHidServStatsStore, computedNetworkFractionsStore);
parser.readParseHistory();
@@ -66,6 +62,9 @@ public class Main {
* a single file with extrapolated network totals based on reports by
* single relays. */
System.out.println("Extrapolating statistics...");
+ DocumentStore<ExtrapolatedHidServStats> extrapolatedHidServStatsStore
+ = new DocumentStore<ExtrapolatedHidServStats>(
+ ExtrapolatedHidServStats.class);
Extrapolator extrapolator = new Extrapolator(statusDirectory,
reportedHidServStatsStore, computedNetworkFractionsStore,
extrapolatedHidServStatsStore);
@@ -80,6 +79,7 @@ public class Main {
* other statistics. Write the result to a .csv file that can be
* processed by other tools. */
System.out.println("Aggregating statistics...");
+ File hidservStatsExtrapolatedCsvFile = new File("stats/hidserv.csv");
Aggregator aggregator = new Aggregator(statusDirectory,
extrapolatedHidServStatsStore, hidservStatsExtrapolatedCsvFile);
aggregator.aggregateHidServStats();
diff --git a/modules/hidserv/src/org/torproject/metrics/hidserv/Parser.java b/modules/hidserv/src/org/torproject/metrics/hidserv/Parser.java
index 85f7d91..0acdb17 100644
--- a/modules/hidserv/src/org/torproject/metrics/hidserv/Parser.java
+++ b/modules/hidserv/src/org/torproject/metrics/hidserv/Parser.java
@@ -1,5 +1,16 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
package org.torproject.metrics.hidserv;
+import org.torproject.descriptor.Descriptor;
+import org.torproject.descriptor.DescriptorFile;
+import org.torproject.descriptor.DescriptorReader;
+import org.torproject.descriptor.DescriptorSourceFactory;
+import org.torproject.descriptor.ExtraInfoDescriptor;
+import org.torproject.descriptor.NetworkStatusEntry;
+import org.torproject.descriptor.RelayNetworkStatusConsensus;
+
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.ByteArrayInputStream;
@@ -20,45 +31,37 @@ import java.util.SortedSet;
import java.util.TreeMap;
import java.util.TreeSet;
-import org.torproject.descriptor.Descriptor;
-import org.torproject.descriptor.DescriptorFile;
-import org.torproject.descriptor.DescriptorReader;
-import org.torproject.descriptor.DescriptorSourceFactory;
-import org.torproject.descriptor.ExtraInfoDescriptor;
-import org.torproject.descriptor.NetworkStatusEntry;
-import org.torproject.descriptor.RelayNetworkStatusConsensus;
-
-/* Parse hidden-service statistics from extra-info descriptors, compute
+/** Parse hidden-service statistics from extra-info descriptors, compute
* network fractions from consensuses, and write parsed contents to
* document files for later use. */
public class Parser {
- /* File containing tuples of last-modified times and file names of
+ /** File containing tuples of last-modified times and file names of
* descriptor files parsed in the previous execution. */
private File parseHistoryFile;
- /* Descriptor reader to provide parsed extra-info descriptors and
+ /** Descriptor reader to provide parsed extra-info descriptors and
* consensuses. */
private DescriptorReader descriptorReader;
- /* Document file containing previously parsed reported hidden-service
+ /** Document file containing previously parsed reported hidden-service
* statistics. */
private File reportedHidServStatsFile;
- /* Document store for storing and retrieving reported hidden-service
+ /** Document store for storing and retrieving reported hidden-service
* statistics. */
private DocumentStore<ReportedHidServStats> reportedHidServStatsStore;
- /* Directory containing document files with previously computed network
+ /** Directory containing document files with previously computed network
* fractions. */
private File computedNetworkFractionsDirectory;
- /* Document store for storing and retrieving computed network
+ /** Document store for storing and retrieving computed network
* fractions. */
private DocumentStore<ComputedNetworkFractions>
computedNetworkFractionsStore;
- /* Initialize a new parser object using the given directories and
+ /** Initializes a new parser object using the given directories and
* document stores. */
public Parser(Set<File> inDirectories, File statusDirectory,
DocumentStore<ReportedHidServStats> reportedHidServStatsStore,
@@ -90,11 +93,11 @@ public class Parser {
this.computedNetworkFractionsStore = computedNetworkFractionsStore;
}
- /* Read the parse history file to avoid parsing descriptor files that
+ /** Reads the parse history file to avoid parsing descriptor files that
* have not changed since the previous execution. */
public void readParseHistory() {
- if (this.parseHistoryFile.exists() &&
- this.parseHistoryFile.isFile()) {
+ if (this.parseHistoryFile.exists()
+ && this.parseHistoryFile.isFile()) {
SortedMap<String, Long> excludedFiles =
new TreeMap<String, Long>();
try {
@@ -125,9 +128,9 @@ public class Parser {
}
}
- /* Write parsed or skipped descriptor files with last-modified times and
- * absolute paths to the parse history file to avoid parsing these files
- * again, unless they change until the next execution. */
+ /** Writes parsed or skipped descriptor files with last-modified times
+ * and absolute paths to the parse history file to avoid parsing these
+ * files again, unless they change before the next execution. */
public void writeParseHistory() {
/* Obtain the list of descriptor files that were either parsed now or
@@ -141,8 +144,8 @@ public class Parser {
this.parseHistoryFile.getParentFile().mkdirs();
BufferedWriter bw = new BufferedWriter(new FileWriter(
this.parseHistoryFile));
- for (Map.Entry<String, Long> e :
- excludedAndParsedFiles.entrySet()) {
+ for (Map.Entry<String, Long> e
+ : excludedAndParsedFiles.entrySet()) {
/* Each line starts with the last-modified time of the descriptor
* file, followed by its absolute path. */
String absolutePath = e.getKey();
@@ -158,16 +161,17 @@ public class Parser {
}
}
- /* Set of all reported hidden-service statistics. To date, these
- * objects are small, and keeping them all in memory is easy. But if
- * this ever changes, e.g., when more and more statistics are added,
- * this may not scale. */
+ /** Set of all reported hidden-service statistics.
+ *
+ * <p>To date, these objects are small, and keeping them all in memory
+ * is easy. But if this ever changes, e.g., when more and more
+ * statistics are added, this may not scale.</p> */
private Set<ReportedHidServStats> reportedHidServStats =
new HashSet<ReportedHidServStats>();
- /* Instruct the descriptor reader to parse descriptor files, and handle
- * the resulting parsed descriptors if they are either extra-info
- * descriptors or consensuses. */
+ /** Instructs the descriptor reader to parse descriptor files, and
+ * handles the resulting parsed descriptors if they are either
+ * extra-info descriptors or consensuses. */
public boolean parseDescriptors() {
Iterator<DescriptorFile> descriptorFiles =
this.descriptorReader.readDescriptors();
@@ -194,10 +198,11 @@ public class Parser {
this.reportedHidServStatsFile, this.reportedHidServStats);
}
- /* Parse the given extra-info descriptor by extracting its fingerprint
- * and contained hidserv-* lines. If a valid set of hidserv-stats can
- * be extracted, create a new stats object that will later be stored to
- * a document file. */
+ /** Parses the given extra-info descriptor by extracting its fingerprint
+ * and contained hidserv-* lines.
+ *
+ * <p>If a valid set of hidserv-stats can be extracted, create a new
+ * stats object that will later be stored to a document file.</p> */
private void parseExtraInfoDescriptor(
ExtraInfoDescriptor extraInfoDescriptor) {
@@ -209,9 +214,12 @@ public class Parser {
* descriptor-parsing library. */
Scanner scanner = new Scanner(new ByteArrayInputStream(
extraInfoDescriptor.getRawDescriptorBytes()));
- Long statsEndMillis = null, statsIntervalSeconds = null,
- rendRelayedCells = null, rendRelayedCellsBinSize = null,
- dirOnionsSeen = null, dirOnionsSeenBinSize = null;
+ Long statsEndMillis = null;
+ Long statsIntervalSeconds = null;
+ Long rendRelayedCells = null;
+ Long rendRelayedCellsBinSize = null;
+ Long dirOnionsSeen = null;
+ Long dirOnionsSeenBinSize = null;
try {
while (scanner.hasNext()) {
String line = scanner.nextLine();
@@ -219,8 +227,8 @@ public class Parser {
String[] parts = line.split(" ");
if (parts[0].equals("hidserv-stats-end")) {
/* Parse statistics end and statistics interval length. */
- if (parts.length != 5 || !parts[3].startsWith("(") ||
- !parts[4].equals("s)")) {
+ if (parts.length != 5 || !parts[3].startsWith("(")
+ || !parts[4].equals("s)")) {
/* Will warn below, because statsEndMillis is still null. */
continue;
}
@@ -231,8 +239,8 @@ public class Parser {
/* Parse the reported number of cells on rendezvous circuits
* and the bin size used by the relay to obfuscate that
* number. */
- if (parts.length != 5 ||
- !parts[4].startsWith("bin_size=")) {
+ if (parts.length != 5
+ || !parts[4].startsWith("bin_size=")) {
/* Will warn below, because rendRelayedCells is still
* null. */
continue;
@@ -243,8 +251,8 @@ public class Parser {
} else if (parts[0].equals("hidserv-dir-onions-seen")) {
/* Parse the reported number of distinct .onion addresses and
* the bin size used by the relay to obfuscate that number. */
- if (parts.length != 5 ||
- !parts[4].startsWith("bin_size=")) {
+ if (parts.length != 5
+ || !parts[4].startsWith("bin_size=")) {
/* Will warn below, because dirOnionsSeen is still null. */
continue;
}
@@ -262,17 +270,17 @@ public class Parser {
* lines, don't do anything. This applies to the majority of
* descriptors, at least as long as only a minority of relays reports
* these statistics. */
- if (statsEndMillis == null && rendRelayedCells == null &&
- dirOnionsSeen == null) {
+ if (statsEndMillis == null && rendRelayedCells == null
+ && dirOnionsSeen == null) {
return;
/* If the descriptor contained all expected hidserv-* lines, create a
* new stats object and put it in the local map, so that it will later
* be written to a document file. */
- } else if (statsEndMillis != null &&
- statsEndMillis != DateTimeHelper.NO_TIME_AVAILABLE &&
- statsIntervalSeconds != null && rendRelayedCells != null &&
- dirOnionsSeen != null) {
+ } else if (statsEndMillis != null
+ && statsEndMillis != DateTimeHelper.NO_TIME_AVAILABLE
+ && statsIntervalSeconds != null && rendRelayedCells != null
+ && dirOnionsSeen != null) {
ReportedHidServStats reportedStats = new ReportedHidServStats(
fingerprint, statsEndMillis);
reportedStats.setStatsIntervalSeconds(statsIntervalSeconds);
@@ -292,7 +300,7 @@ public class Parser {
}
}
- /* Remove noise from a reported stats value by rounding to the nearest
+ /** Removes noise from a reported stats value by rounding to the nearest
* right side of a bin and subtracting half of the bin size. */
private long removeNoise(long reportedNumber, long binSize) {
long roundedToNearestRightSideOfTheBin =
@@ -302,6 +310,7 @@ public class Parser {
return subtractedHalfOfBinSize;
}
+ /** Parses the given consensus. */
public boolean parseRelayNetworkStatusConsensus(
RelayNetworkStatusConsensus consensus) {
@@ -345,8 +354,8 @@ public class Parser {
new TreeMap<String, Double>();
/* Go through all status entries contained in the consensus. */
- for (Map.Entry<String, NetworkStatusEntry> e :
- consensus.getStatusEntries().entrySet()) {
+ for (Map.Entry<String, NetworkStatusEntry> e
+ : consensus.getStatusEntries().entrySet()) {
String fingerprint = e.getKey();
NetworkStatusEntry statusEntry = e.getValue();
SortedSet<String> flags = statusEntry.getFlags();
@@ -399,18 +408,18 @@ public class Parser {
/* Define the total ring size to compute fractions below. This is
* 16^40 or 2^160. */
- final double RING_SIZE = new BigInteger(
+ final double ringSize = new BigInteger(
"10000000000000000000000000000000000000000",
16).doubleValue();
/* Go through all status entries again, this time computing network
* fractions. */
- for (Map.Entry<String, NetworkStatusEntry> e :
- consensus.getStatusEntries().entrySet()) {
+ for (Map.Entry<String, NetworkStatusEntry> e
+ : consensus.getStatusEntries().entrySet()) {
String fingerprint = e.getKey();
NetworkStatusEntry statusEntry = e.getValue();
- double fractionRendRelayedCells = 0.0,
- fractionDirOnionsSeen = 0.0;
+ double fractionRendRelayedCells = 0.0;
+ double fractionDirOnionsSeen = 0.0;
if (statusEntry != null) {
/* Check if the relay is a hidden-service directory by looking up
@@ -424,8 +433,8 @@ public class Parser {
* this directory by three positions. */
String startResponsible = fingerprint;
int positionsToGo = 3;
- for (String hsDirFingerprint :
- hsDirs.tailSet(fingerprintPrecededByOne)) {
+ for (String hsDirFingerprint
+ : hsDirs.tailSet(fingerprintPrecededByOne)) {
startResponsible = hsDirFingerprint;
if (positionsToGo-- <= 0) {
break;
@@ -438,7 +447,7 @@ public class Parser {
fractionDirOnionsSeen =
new BigInteger(fingerprintPrecededByOne, 16).subtract(
new BigInteger(startResponsible, 16)).doubleValue()
- / RING_SIZE;
+ / ringSize;
/* Divide this fraction by three to obtain the fraction of
* descriptors that this directory has seen. This step is
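
The noise-removal comment earlier in this file compresses two steps into one sentence. Spelled out as arithmetic it looks like the sketch below; this matches the description ("round to the nearest right side of a bin, subtract half of the bin size"), while the module's own removeNoise() implementation is only partly visible in this hunk.

public class RemoveNoiseSketch {

  /* Round the reported value to the nearest right side of a bin, then
   * subtract half of the bin size as the expected amount of added
   * noise. */
  static long removeNoise(long reportedNumber, long binSize) {
    long roundedToNearestRightSideOfTheBin =
        ((reportedNumber + binSize / 2) / binSize) * binSize;
    return roundedToNearestRightSideOfTheBin - binSize / 2;
  }

  public static void main(String[] args) {
    /* With a bin size of 1024, a reported value of 4301 is rounded to
     * 4096 and adjusted to 3584. */
    System.out.println(removeNoise(4301L, 1024L));
  }
}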
diff --git a/modules/hidserv/src/org/torproject/metrics/hidserv/ReportedHidServStats.java b/modules/hidserv/src/org/torproject/metrics/hidserv/ReportedHidServStats.java
index 996a70a..932c945 100644
--- a/modules/hidserv/src/org/torproject/metrics/hidserv/ReportedHidServStats.java
+++ b/modules/hidserv/src/org/torproject/metrics/hidserv/ReportedHidServStats.java
@@ -1,3 +1,6 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
package org.torproject.metrics.hidserv;
/* Hidden-service statistics reported by a single relay covering a single
@@ -7,21 +10,25 @@ public class ReportedHidServStats implements Document {
/* Relay fingerprint consisting of 40 upper-case hex characters. */
private String fingerprint;
+
public String getFingerprint() {
return this.fingerprint;
}
/* Hidden-service statistics end timestamp in milliseconds. */
private long statsEndMillis;
+
public long getStatsEndMillis() {
return this.statsEndMillis;
}
/* Statistics interval length in seconds. */
private long statsIntervalSeconds;
+
public void setStatsIntervalSeconds(long statsIntervalSeconds) {
this.statsIntervalSeconds = statsIntervalSeconds;
}
+
public long getStatsIntervalSeconds() {
return this.statsIntervalSeconds;
}
@@ -30,9 +37,11 @@ public class ReportedHidServStats implements Document {
* relay and adjusted by rounding to the nearest right side of a bin and
* subtracting half of the bin size. */
private long rendRelayedCells;
+
public void setRendRelayedCells(long rendRelayedCells) {
this.rendRelayedCells = rendRelayedCells;
}
+
public long getRendRelayedCells() {
return this.rendRelayedCells;
}
@@ -41,9 +50,11 @@ public class ReportedHidServStats implements Document {
* adjusted by rounding to the nearest right side of a bin and
* subtracting half of the bin size. */
private long dirOnionsSeen;
+
public void setDirOnionsSeen(long dirOnionsSeen) {
this.dirOnionsSeen = dirOnionsSeen;
}
+
public long getDirOnionsSeen() {
return this.dirOnionsSeen;
}
@@ -63,8 +74,8 @@ public class ReportedHidServStats implements Document {
return false;
}
ReportedHidServStats other = (ReportedHidServStats) otherObject;
- return this.fingerprint.equals(other.fingerprint) &&
- this.statsEndMillis == other.statsEndMillis;
+ return this.fingerprint.equals(other.fingerprint)
+ && this.statsEndMillis == other.statsEndMillis;
}
/* Return a (hopefully unique) hash code based on this object's
@@ -101,7 +112,6 @@ public class ReportedHidServStats implements Document {
+ "Skipping.%n", formattedStrings.length);
return false;
}
- String fingerprint = formattedStrings[0];
String[] secondParts = formattedStrings[1].split(",", 4);
if (secondParts.length != 4) {
return false;
@@ -110,8 +120,9 @@ public class ReportedHidServStats implements Document {
if (statsEndMillis == DateTimeHelper.NO_TIME_AVAILABLE) {
return false;
}
- long statsIntervalSeconds = -1L, rendRelayedCells = -1L,
- dirOnionsSeen = -1L;
+ long statsIntervalSeconds = -1L;
+ long rendRelayedCells = -1L;
+ long dirOnionsSeen = -1L;
try {
statsIntervalSeconds = Long.parseLong(secondParts[1]);
rendRelayedCells = Long.parseLong(secondParts[2]);
@@ -119,7 +130,7 @@ public class ReportedHidServStats implements Document {
} catch (NumberFormatException e) {
return false;
}
- this.fingerprint = fingerprint;
+ this.fingerprint = formattedStrings[0];
this.statsEndMillis = statsEndMillis;
this.statsIntervalSeconds = statsIntervalSeconds;
this.rendRelayedCells = rendRelayedCells;
diff --git a/modules/hidserv/src/org/torproject/metrics/hidserv/Simulate.java b/modules/hidserv/src/org/torproject/metrics/hidserv/Simulate.java
index db7d065..d699ca5 100644
--- a/modules/hidserv/src/org/torproject/metrics/hidserv/Simulate.java
+++ b/modules/hidserv/src/org/torproject/metrics/hidserv/Simulate.java
@@ -1,3 +1,6 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
package org.torproject.metrics.hidserv;
import java.io.BufferedWriter;
@@ -23,6 +26,7 @@ public class Simulate {
private static File simOnionsCsvFile =
new File("out/csv/sim-onions.csv");
+ /** Runs two simulations to evaluate this data-processing module. */
public static void main(String[] args) throws Exception {
System.out.print("Simulating extrapolation of rendezvous cells");
simulateManyCells();
@@ -108,9 +112,9 @@ public class Simulate {
for (int i = 0; i < numberRendPoints; i++) {
long observed = observedCells[i];
long afterBinning = ((observed + binSize - 1L) / binSize) * binSize;
- double p = rnd.nextDouble();
- double laplaceNoise = -b * (p > 0.5 ? 1.0 : -1.0) *
- Math.log(1.0 - 2.0 * Math.abs(p - 0.5));
+ double randomDouble = rnd.nextDouble();
+ double laplaceNoise = -b * (randomDouble > 0.5 ? 1.0 : -1.0)
+ * Math.log(1.0 - 2.0 * Math.abs(randomDouble - 0.5));
long reported = afterBinning + (long) laplaceNoise;
reportedCells[i] = reported;
long roundedToNearestRightSideOfTheBin =
@@ -166,27 +170,29 @@ public class Simulate {
reportingRelays.remove(removeRelay);
nonReportingRelays.add(removeRelay);
}
- } while (totalReportingProbability < fraction - 0.001 ||
- totalReportingProbability > fraction + 0.001);
+ } while (totalReportingProbability < fraction - 0.001
+ || totalReportingProbability > fraction + 0.001);
Collections.sort(singleRelayExtrapolations,
new Comparator<double[]>() {
- public int compare(double[] o1, double[] o2) {
- return o1[0] < o2[0] ? -1 : o1[0] > o2[0] ? 1 : 0;
- }
- });
- double totalProbability = 0.0, totalValues = 0.0;
- double totalInterquartileProbability = 0.0,
- totalInterquartileValues = 0.0;
+ public int compare(double[] o1, double[] o2) {
+ return o1[0] < o2[0] ? -1 : o1[0] > o2[0] ? 1 : 0;
+ }
+ }
+ );
+ double totalProbability = 0.0;
+ double totalValues = 0.0;
+ double totalInterquartileProbability = 0.0;
+ double totalInterquartileValues = 0.0;
Double weightedMedian = null;
for (double[] extrapolation : singleRelayExtrapolations) {
totalValues += extrapolation[1];
totalProbability += extrapolation[2];
- if (weightedMedian == null &&
- totalProbability > totalReportingProbability * 0.5) {
+ if (weightedMedian == null
+ && totalProbability > totalReportingProbability * 0.5) {
weightedMedian = extrapolation[0];
}
- if (totalProbability > totalReportingProbability * 0.25 &&
- totalProbability < totalReportingProbability * 0.75) {
+ if (totalProbability > totalReportingProbability * 0.25
+ && totalProbability < totalReportingProbability * 0.75) {
totalInterquartileValues += extrapolation[1];
totalInterquartileProbability += extrapolation[2];
}
@@ -240,8 +246,8 @@ public class Simulate {
for (int i = 0; i < numberOnions; i++) {
for (int j = 0; j < replicas; j++) {
int leftToStore = storeOnDirs;
- for (double fingerprint :
- hsDirFingerprints.tailSet(rnd.nextDouble())) {
+ for (double fingerprint
+ : hsDirFingerprints.tailSet(rnd.nextDouble())) {
storedDescs.get(fingerprint).add(i);
if (--leftToStore <= 0) {
break;
@@ -262,16 +268,17 @@ public class Simulate {
* to remove noise again. */
final long binSize = 8L;
final double b = 8.0 / 0.3;
- SortedMap<Double, Long> reportedOnions = new TreeMap<Double, Long>(),
- removedNoiseOnions = new TreeMap<Double, Long>();
- for (Map.Entry<Double, SortedSet<Integer>> e :
- storedDescs.entrySet()) {
+ SortedMap<Double, Long> reportedOnions = new TreeMap<Double, Long>();
+ SortedMap<Double, Long> removedNoiseOnions =
+ new TreeMap<Double, Long>();
+ for (Map.Entry<Double, SortedSet<Integer>> e
+ : storedDescs.entrySet()) {
double fingerprint = e.getKey();
long observed = (long) e.getValue().size();
long afterBinning = ((observed + binSize - 1L) / binSize) * binSize;
- double p = rnd.nextDouble();
- double laplaceNoise = -b * (p > 0.5 ? 1.0 : -1.0) *
- Math.log(1.0 - 2.0 * Math.abs(p - 0.5));
+ double randomDouble = rnd.nextDouble();
+ double laplaceNoise = -b * (randomDouble > 0.5 ? 1.0 : -1.0)
+ * Math.log(1.0 - 2.0 * Math.abs(randomDouble - 0.5));
long reported = afterBinning + (long) laplaceNoise;
reportedOnions.put(fingerprint, reported);
long roundedToNearestRightSideOfTheBin =
@@ -326,27 +333,29 @@ public class Simulate {
reportingRelays.remove(removeRelay);
nonReportingRelays.add(removeRelay);
}
- } while (totalReportingProbability < fraction - 0.001 ||
- totalReportingProbability > fraction + 0.001);
+ } while (totalReportingProbability < fraction - 0.001
+ || totalReportingProbability > fraction + 0.001);
Collections.sort(singleRelayExtrapolations,
new Comparator<double[]>() {
- public int compare(double[] o1, double[] o2) {
- return o1[0] < o2[0] ? -1 : o1[0] > o2[0] ? 1 : 0;
- }
- });
- double totalProbability = 0.0, totalValues = 0.0;
- double totalInterquartileProbability = 0.0,
- totalInterquartileValues = 0.0;
+ public int compare(double[] first, double[] second) {
+ return first[0] < second[0] ? -1 : first[0] > second[0] ? 1 : 0;
+ }
+ }
+ );
+ double totalProbability = 0.0;
+ double totalValues = 0.0;
+ double totalInterquartileProbability = 0.0;
+ double totalInterquartileValues = 0.0;
Double weightedMedian = null;
for (double[] extrapolation : singleRelayExtrapolations) {
totalValues += extrapolation[1];
totalProbability += extrapolation[2];
- if (weightedMedian == null &&
- totalProbability > totalReportingProbability * 0.5) {
+ if (weightedMedian == null
+ && totalProbability > totalReportingProbability * 0.5) {
weightedMedian = extrapolation[0];
}
- if (totalProbability > totalReportingProbability * 0.25 &&
- totalProbability < totalReportingProbability * 0.75) {
+ if (totalProbability > totalReportingProbability * 0.25
+ && totalProbability < totalReportingProbability * 0.75) {
totalInterquartileValues += extrapolation[1];
totalInterquartileProbability += extrapolation[2];
}
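
The two renamed blocks above both sample Laplace noise by inverting the Laplace CDF at a uniform random value. As a stand-alone sketch, with the scale 8.0 / 0.3 being the one used for the .onion statistic above and everything else illustrative:

import java.util.Random;

public class LaplaceNoiseSketch {

  /* Draws one sample from a Laplace distribution with mean 0 and scale
   * b by inverting the CDF at a uniform random value. */
  static double sampleLaplace(Random rnd, double b) {
    double randomDouble = rnd.nextDouble();
    return -b * (randomDouble > 0.5 ? 1.0 : -1.0)
        * Math.log(1.0 - 2.0 * Math.abs(randomDouble - 0.5));
  }

  public static void main(String[] args) {
    Random rnd = new Random();
    double b = 8.0 / 0.3;
    for (int i = 0; i < 5; i++) {
      System.out.println(sampleLaplace(rnd, b));
    }
  }
}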
diff --git a/modules/legacy/src/org/torproject/ernie/cron/Configuration.java b/modules/legacy/src/org/torproject/ernie/cron/Configuration.java
index 86d5d10..1bc2af7 100644
--- a/modules/legacy/src/org/torproject/ernie/cron/Configuration.java
+++ b/modules/legacy/src/org/torproject/ernie/cron/Configuration.java
@@ -1,5 +1,6 @@
-/* Copyright 2011, 2012 The Tor Project
+/* Copyright 2011--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.ernie.cron;
import java.io.BufferedReader;
@@ -19,25 +20,41 @@ import java.util.logging.Logger;
* configuration.
*/
public class Configuration {
+
private boolean importDirectoryArchives = false;
+
private List<String> directoryArchivesDirectories =
new ArrayList<String>();
+
private boolean keepDirectoryArchiveImportHistory = false;
+
private boolean importSanitizedBridges = false;
+
private String sanitizedBridgesDirectory = "in/bridge-descriptors/";
+
private boolean keepSanitizedBridgesImportHistory = false;
+
private boolean writeRelayDescriptorDatabase = false;
+
private String relayDescriptorDatabaseJdbc =
"jdbc:postgresql://localhost/tordir?user=metrics&password=password";
+
private boolean writeRelayDescriptorsRawFiles = false;
+
private String relayDescriptorRawFilesDirectory = "pg-import/";
+
private boolean writeBridgeStats = false;
+
private boolean importWriteTorperfStats = false;
+
private String torperfDirectory = "in/torperf/";
+
private String exoneraTorDatabaseJdbc = "jdbc:postgresql:"
+ "//localhost/exonerator?user=metrics&password=password";
+
private String exoneraTorImportDirectory = "exonerator-import/";
+ /** Initializes this configuration class. */
public Configuration() {
/* Initialize logger. */
@@ -118,9 +135,12 @@ public class Configuration {
System.exit(1);
}
}
+
public boolean getImportDirectoryArchives() {
return this.importDirectoryArchives;
}
+
+ /** Returns directories containing archived descriptors. */
public List<String> getDirectoryArchivesDirectories() {
if (this.directoryArchivesDirectories.isEmpty()) {
String prefix = "../../shared/in/recent/relay-descriptors/";
@@ -131,42 +151,55 @@ public class Configuration {
return this.directoryArchivesDirectories;
}
}
+
public boolean getKeepDirectoryArchiveImportHistory() {
return this.keepDirectoryArchiveImportHistory;
}
+
public boolean getWriteRelayDescriptorDatabase() {
return this.writeRelayDescriptorDatabase;
}
+
public boolean getImportSanitizedBridges() {
return this.importSanitizedBridges;
}
+
public String getSanitizedBridgesDirectory() {
return this.sanitizedBridgesDirectory;
}
+
public boolean getKeepSanitizedBridgesImportHistory() {
return this.keepSanitizedBridgesImportHistory;
}
- public String getRelayDescriptorDatabaseJDBC() {
+
+ public String getRelayDescriptorDatabaseJdbc() {
return this.relayDescriptorDatabaseJdbc;
}
+
public boolean getWriteRelayDescriptorsRawFiles() {
return this.writeRelayDescriptorsRawFiles;
}
+
public String getRelayDescriptorRawFilesDirectory() {
return this.relayDescriptorRawFilesDirectory;
}
+
public boolean getWriteBridgeStats() {
return this.writeBridgeStats;
}
+
public boolean getImportWriteTorperfStats() {
return this.importWriteTorperfStats;
}
+
public String getTorperfDirectory() {
return this.torperfDirectory;
}
+
public String getExoneraTorDatabaseJdbc() {
return this.exoneraTorDatabaseJdbc;
}
+
public String getExoneraTorImportDirectory() {
return this.exoneraTorImportDirectory;
}
diff --git a/modules/legacy/src/org/torproject/ernie/cron/LockFile.java b/modules/legacy/src/org/torproject/ernie/cron/LockFile.java
index 1a18504..bc79fad 100644
--- a/modules/legacy/src/org/torproject/ernie/cron/LockFile.java
+++ b/modules/legacy/src/org/torproject/ernie/cron/LockFile.java
@@ -1,5 +1,6 @@
-/* Copyright 2011, 2012 The Tor Project
+/* Copyright 2011--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.ernie.cron;
import java.io.BufferedReader;
@@ -20,6 +21,9 @@ public class LockFile {
this.logger = Logger.getLogger(LockFile.class.getName());
}
+ /** Acquires the lock by checking whether a lock file already exists,
+ * and if not, by creating one with the current system time as
+ * content. */
public boolean acquireLock() {
this.logger.fine("Trying to acquire lock...");
try {
@@ -27,8 +31,8 @@ public class LockFile {
BufferedReader br = new BufferedReader(new FileReader("lock"));
long runStarted = Long.parseLong(br.readLine());
br.close();
- if (System.currentTimeMillis() - runStarted <
- 23L * 60L * 60L * 1000L) {
+ if (System.currentTimeMillis() - runStarted
+ < 23L * 60L * 60L * 1000L) {
return false;
}
}
@@ -44,6 +48,7 @@ public class LockFile {
}
}
+ /** Releases the lock by deleting the lock file, if present. */
public void releaseLock() {
this.logger.fine("Releasing lock...");
this.lockFile.delete();
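
For reference, the lock-file protocol documented above fits into a few lines. This sketch simplifies the original (hard-coded file name, exceptions propagated, no logging) and is not meant to replace it.

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;

public class LockFileSketch {

  private static final File LOCK_FILE = new File("lock");

  /* Acquires the lock unless a lock file exists whose recorded start
   * time is less than 23 hours old; otherwise writes the current time
   * into a new lock file. */
  static boolean acquireLock() throws IOException {
    if (LOCK_FILE.exists()) {
      BufferedReader br = new BufferedReader(new FileReader(LOCK_FILE));
      long runStarted = Long.parseLong(br.readLine());
      br.close();
      if (System.currentTimeMillis() - runStarted
          < 23L * 60L * 60L * 1000L) {
        return false;
      }
    }
    BufferedWriter bw = new BufferedWriter(new FileWriter(LOCK_FILE));
    bw.append(String.valueOf(System.currentTimeMillis()) + "\n");
    bw.close();
    return true;
  }

  /* Releases the lock by deleting the lock file, if present. */
  static void releaseLock() {
    LOCK_FILE.delete();
  }

  public static void main(String[] args) throws IOException {
    if (acquireLock()) {
      try {
        System.out.println("Lock acquired, doing work...");
      } finally {
        releaseLock();
      }
    } else {
      System.out.println("Another run appears to be in progress.");
    }
  }
}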
diff --git a/modules/legacy/src/org/torproject/ernie/cron/LoggingConfiguration.java b/modules/legacy/src/org/torproject/ernie/cron/LoggingConfiguration.java
index c261d95..f6749cb 100644
--- a/modules/legacy/src/org/torproject/ernie/cron/LoggingConfiguration.java
+++ b/modules/legacy/src/org/torproject/ernie/cron/LoggingConfiguration.java
@@ -1,5 +1,6 @@
-/* Copyright 2011, 2012 The Tor Project
+/* Copyright 2011--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.ernie.cron;
import java.io.IOException;
@@ -17,22 +18,27 @@ import java.util.logging.Logger;
/**
* Initialize logging configuration.
*
- * Log levels used by ERNIE:
+ * <p>Log levels used by ERNIE:</p>
*
- * - SEVERE: An event made it impossible to continue program execution.
- * - WARNING: A potential problem occurred that requires the operator to
- * look after the otherwise unattended setup
- * - INFO: Messages on INFO level are meant to help the operator in making
- * sure that operation works as expected.
- * - FINE: Debug messages that are used to identify problems and which are
- * turned on by default.
- * - FINER: More detailed debug messages to investigate problems in more
- * detail. Not turned on by default. Increase log file limit when using
- * FINER.
- * - FINEST: Most detailed debug messages. Not used.
+ * <p>
+ * <ul>
+ * <li>SEVERE: An event made it impossible to continue program
+ * execution.</li>
+ * <li>WARNING: A potential problem occurred that requires the operator
+ * to look after the otherwise unattended setup.</li>
+ * <li>INFO: Messages on INFO level are meant to help the operator in
+ * making sure that operation works as expected.</li>
+ * <li>FINE: Debug messages that are used to identify problems and which
+ * are turned on by default.</li>
+ * <li>FINER: More detailed debug messages to investigate problems in more
+ * detail. Not turned on by default. Increase log file limit when
+ * using FINER.</li>
+ * <li>FINEST: Most detailed debug messages. Not used.</li>
+ * </ul>
+ * </p>
*/
public class LoggingConfiguration {
+ /** Initializes the logging configuration. */
public LoggingConfiguration() {
/* Remove default console handler. */
diff --git a/modules/legacy/src/org/torproject/ernie/cron/Main.java b/modules/legacy/src/org/torproject/ernie/cron/Main.java
index 7319efa..b004476 100644
--- a/modules/legacy/src/org/torproject/ernie/cron/Main.java
+++ b/modules/legacy/src/org/torproject/ernie/cron/Main.java
@@ -1,18 +1,21 @@
-/* Copyright 2011, 2012 The Tor Project
+/* Copyright 2011--2016 The Tor Project
* See LICENSE for licensing information */
-package org.torproject.ernie.cron;
-import java.io.File;
-import java.util.logging.Logger;
+package org.torproject.ernie.cron;
import org.torproject.ernie.cron.network.ConsensusStatsFileHandler;
import org.torproject.ernie.cron.performance.TorperfProcessor;
+import java.io.File;
+import java.util.logging.Logger;
+
/**
* Coordinate downloading and parsing of descriptors and extraction of
* statistically relevant data for later processing with R.
*/
public class Main {
+
+ /** Executes this data-processing module. */
public static void main(String[] args) {
/* Initialize logging configuration. */
@@ -38,13 +41,13 @@ public class Main {
// Import relay descriptors
if (config.getImportDirectoryArchives()) {
RelayDescriptorDatabaseImporter rddi =
- config.getWriteRelayDescriptorDatabase() ||
- config.getWriteRelayDescriptorsRawFiles() ?
- new RelayDescriptorDatabaseImporter(
- config.getWriteRelayDescriptorDatabase() ?
- config.getRelayDescriptorDatabaseJDBC() : null,
- config.getWriteRelayDescriptorsRawFiles() ?
- config.getRelayDescriptorRawFilesDirectory() : null,
+ config.getWriteRelayDescriptorDatabase()
+ || config.getWriteRelayDescriptorsRawFiles()
+ ? new RelayDescriptorDatabaseImporter(
+ config.getWriteRelayDescriptorDatabase()
+ ? config.getRelayDescriptorDatabaseJdbc() : null,
+ config.getWriteRelayDescriptorsRawFiles()
+ ? config.getRelayDescriptorRawFilesDirectory() : null,
config.getDirectoryArchivesDirectories(),
statsDirectory,
config.getKeepDirectoryArchiveImportHistory()) : null;
@@ -56,12 +59,12 @@ public class Main {
// Prepare consensus stats file handler (used for stats on running
// bridges only)
- ConsensusStatsFileHandler csfh = config.getWriteBridgeStats() ?
- new ConsensusStatsFileHandler(
- config.getRelayDescriptorDatabaseJDBC(),
+ ConsensusStatsFileHandler csfh = config.getWriteBridgeStats()
+ ? new ConsensusStatsFileHandler(
+ config.getRelayDescriptorDatabaseJdbc(),
new File(config.getSanitizedBridgesDirectory()),
- statsDirectory, config.getKeepSanitizedBridgesImportHistory()) :
- null;
+ statsDirectory, config.getKeepSanitizedBridgesImportHistory())
+ : null;
// Import sanitized bridges and write updated stats files to disk
if (csfh != null) {
diff --git a/modules/legacy/src/org/torproject/ernie/cron/RelayDescriptorDatabaseImporter.java b/modules/legacy/src/org/torproject/ernie/cron/RelayDescriptorDatabaseImporter.java
index 35128e7..d80b400 100644
--- a/modules/legacy/src/org/torproject/ernie/cron/RelayDescriptorDatabaseImporter.java
+++ b/modules/legacy/src/org/torproject/ernie/cron/RelayDescriptorDatabaseImporter.java
@@ -1,7 +1,19 @@
-/* Copyright 2011, 2012 The Tor Project
+/* Copyright 2011--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.ernie.cron;
+import org.torproject.descriptor.Descriptor;
+import org.torproject.descriptor.DescriptorFile;
+import org.torproject.descriptor.DescriptorReader;
+import org.torproject.descriptor.DescriptorSourceFactory;
+import org.torproject.descriptor.ExtraInfoDescriptor;
+import org.torproject.descriptor.NetworkStatusEntry;
+import org.torproject.descriptor.RelayNetworkStatusConsensus;
+import org.torproject.descriptor.ServerDescriptor;
+
+import org.postgresql.util.PGbytea;
+
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
@@ -29,16 +41,6 @@ import java.util.TreeSet;
import java.util.logging.Level;
import java.util.logging.Logger;
-import org.postgresql.util.PGbytea;
-import org.torproject.descriptor.Descriptor;
-import org.torproject.descriptor.DescriptorFile;
-import org.torproject.descriptor.DescriptorReader;
-import org.torproject.descriptor.DescriptorSourceFactory;
-import org.torproject.descriptor.ExtraInfoDescriptor;
-import org.torproject.descriptor.NetworkStatusEntry;
-import org.torproject.descriptor.RelayNetworkStatusConsensus;
-import org.torproject.descriptor.ServerDescriptor;
-
/**
* Parse directory data.
*/
@@ -53,15 +55,21 @@ public final class RelayDescriptorDatabaseImporter {
*/
private final long autoCommitCount = 500;
- /**
- * Keep track of the number of records committed before each transaction
- */
+ /* Counters to keep track of the number of records committed before
+ * each transaction. */
+
private int rdsCount = 0;
+
private int resCount = 0;
+
private int rhsCount = 0;
+
private int rrsCount = 0;
+
private int rcsCount = 0;
+
private int rvsCount = 0;
+
private int rqsCount = 0;
/**
@@ -171,22 +179,24 @@ public final class RelayDescriptorDatabaseImporter {
private Set<String> insertedStatusEntries = new HashSet<String>();
private boolean importIntoDatabase;
+
private boolean writeRawImportFiles;
private List<String> archivesDirectories;
+
private File statsDirectory;
+
private boolean keepImportHistory;
/**
* Initialize database importer by connecting to the database and
* preparing statements.
*/
- public RelayDescriptorDatabaseImporter(String connectionURL,
+ public RelayDescriptorDatabaseImporter(String connectionUrl,
String rawFilesDirectory, List<String> archivesDirectories,
File statsDirectory, boolean keepImportHistory) {
- if (archivesDirectories == null ||
- statsDirectory == null) {
+ if (archivesDirectories == null || statsDirectory == null) {
throw new IllegalArgumentException();
}
this.archivesDirectories = archivesDirectories;
@@ -197,10 +207,10 @@ public final class RelayDescriptorDatabaseImporter {
this.logger = Logger.getLogger(
RelayDescriptorDatabaseImporter.class.getName());
- if (connectionURL != null) {
+ if (connectionUrl != null) {
try {
/* Connect to database. */
- this.conn = DriverManager.getConnection(connectionURL);
+ this.conn = DriverManager.getConnection(connectionUrl);
/* Turn autocommit off */
this.conn.setAutoCommit(false);
@@ -275,7 +285,7 @@ public final class RelayDescriptorDatabaseImporter {
/**
* Insert network status consensus entry into database.
*/
- public void addStatusEntry(long validAfter, String nickname,
+ public void addStatusEntryContents(long validAfter, String nickname,
String fingerprint, String descriptor, long published,
String address, long orPort, long dirPort,
SortedSet<String> flags, String version, long bandwidth,
@@ -375,8 +385,8 @@ public final class RelayDescriptorDatabaseImporter {
+ (version != null ? version : "\\N") + "\t"
+ (bandwidth >= 0 ? bandwidth : "\\N") + "\t"
+ (ports != null ? ports : "\\N") + "\t");
- this.statusentryOut.write(PGbytea.toPGString(rawDescriptor).
- replaceAll("\\\\", "\\\\\\\\") + "\n");
+ this.statusentryOut.write(PGbytea.toPGString(rawDescriptor)
+ .replaceAll("\\\\", "\\\\\\\\") + "\n");
} catch (SQLException e) {
this.logger.log(Level.WARNING, "Could not write network status "
+ "consensus entry to raw database import file. We won't "
@@ -396,11 +406,11 @@ public final class RelayDescriptorDatabaseImporter {
/**
* Insert server descriptor into database.
*/
- public void addServerDescriptor(String descriptor, String nickname,
- String address, int orPort, int dirPort, String relayIdentifier,
- long bandwidthAvg, long bandwidthBurst, long bandwidthObserved,
- String platform, long published, long uptime,
- String extraInfoDigest) {
+ public void addServerDescriptorContents(String descriptor,
+ String nickname, String address, int orPort, int dirPort,
+ String relayIdentifier, long bandwidthAvg, long bandwidthBurst,
+ long bandwidthObserved, String platform, long published,
+ long uptime, String extraInfoDigest) {
if (this.importIntoDatabase) {
try {
this.addDateToScheduledUpdates(published);
@@ -481,7 +491,7 @@ public final class RelayDescriptorDatabaseImporter {
/**
* Insert extra-info descriptor into database.
*/
- public void addExtraInfoDescriptor(String extraInfoDigest,
+ public void addExtraInfoDescriptorContents(String extraInfoDigest,
String nickname, String fingerprint, long published,
List<String> bandwidthHistoryLines) {
if (!bandwidthHistoryLines.isEmpty()) {
@@ -520,37 +530,47 @@ public final class RelayDescriptorDatabaseImporter {
public void free() {
throw new UnsupportedOperationException();
}
+
public Object getArray() {
throw new UnsupportedOperationException();
}
+
public Object getArray(long index, int count) {
throw new UnsupportedOperationException();
}
+
public Object getArray(long index, int count,
Map<String, Class<?>> map) {
throw new UnsupportedOperationException();
}
+
public Object getArray(Map<String, Class<?>> map) {
throw new UnsupportedOperationException();
}
+
public int getBaseType() {
throw new UnsupportedOperationException();
}
+
public ResultSet getResultSet() {
throw new UnsupportedOperationException();
}
+
public ResultSet getResultSet(long index, int count) {
throw new UnsupportedOperationException();
}
+
public ResultSet getResultSet(long index, int count,
Map<String, Class<?>> map) {
throw new UnsupportedOperationException();
}
+
public ResultSet getResultSet(Map<String, Class<?>> map) {
throw new UnsupportedOperationException();
}
}
+ /** Inserts a bandwidth history into the database. */
public void addBandwidthHistory(String fingerprint, long published,
List<String> bandwidthHistoryStrings) {
@@ -600,18 +620,19 @@ public final class RelayDescriptorDatabaseImporter {
}
String type = parts[0];
String intervalEndTime = parts[1] + " " + parts[2];
- long intervalEnd, dateStart;
+ long intervalEnd;
+ long dateStart;
try {
intervalEnd = dateTimeFormat.parse(intervalEndTime).getTime();
- dateStart = dateTimeFormat.parse(parts[1] + " 00:00:00").
- getTime();
+ dateStart = dateTimeFormat.parse(parts[1] + " 00:00:00")
+ .getTime();
} catch (ParseException e) {
this.logger.fine("Parse exception while parsing timestamp in "
+ "bandwidth history line. Ignoring this line.");
continue;
}
- if (Math.abs(published - intervalEnd) >
- 7L * 24L * 60L * 60L * 1000L) {
+ if (Math.abs(published - intervalEnd)
+ > 7L * 24L * 60L * 60L * 1000L) {
this.logger.fine("Extra-info descriptor publication time "
+ dateTimeFormat.format(published) + " and last interval "
+ "time " + intervalEndTime + " in " + type + " line differ "
@@ -651,15 +672,19 @@ public final class RelayDescriptorDatabaseImporter {
/* Add split history lines to database. */
String lastDate = null;
historyLinesByDate.add("EOL");
- long[] readArray = null, writtenArray = null, dirreadArray = null,
- dirwrittenArray = null;
- int readOffset = 0, writtenOffset = 0, dirreadOffset = 0,
- dirwrittenOffset = 0;
+ long[] readArray = null;
+ long[] writtenArray = null;
+ long[] dirreadArray = null;
+ long[] dirwrittenArray = null;
+ int readOffset = 0;
+ int writtenOffset = 0;
+ int dirreadOffset = 0;
+ int dirwrittenOffset = 0;
for (String historyLine : historyLinesByDate) {
String[] parts = historyLine.split(" ");
String currentDate = parts[0];
- if (lastDate != null && (historyLine.equals("EOL") ||
- !currentDate.equals(lastDate))) {
+ if (lastDate != null && (historyLine.equals("EOL")
+ || !currentDate.equals(lastDate))) {
BigIntArray readIntArray = new BigIntArray(readArray,
readOffset);
BigIntArray writtenIntArray = new BigIntArray(writtenArray,
@@ -804,6 +829,7 @@ public final class RelayDescriptorDatabaseImporter {
}
}
+ /** Imports relay descriptors into the database. */
public void importRelayDescriptors() {
logger.fine("Importing files in directories " + archivesDirectories
+ "/...");
@@ -845,9 +871,9 @@ public final class RelayDescriptorDatabaseImporter {
private void addRelayNetworkStatusConsensus(
RelayNetworkStatusConsensus consensus) {
- for (NetworkStatusEntry statusEntry :
- consensus.getStatusEntries().values()) {
- this.addStatusEntry(consensus.getValidAfterMillis(),
+ for (NetworkStatusEntry statusEntry
+ : consensus.getStatusEntries().values()) {
+ this.addStatusEntryContents(consensus.getValidAfterMillis(),
statusEntry.getNickname(),
statusEntry.getFingerprint().toLowerCase(),
statusEntry.getDescriptor().toLowerCase(),
@@ -861,13 +887,14 @@ public final class RelayDescriptorDatabaseImporter {
}
private void addServerDescriptor(ServerDescriptor descriptor) {
- this.addServerDescriptor(descriptor.getServerDescriptorDigest(),
- descriptor.getNickname(), descriptor.getAddress(),
- descriptor.getOrPort(), descriptor.getDirPort(),
- descriptor.getFingerprint(), descriptor.getBandwidthRate(),
- descriptor.getBandwidthBurst(), descriptor.getBandwidthObserved(),
- descriptor.getPlatform(), descriptor.getPublishedMillis(),
- descriptor.getUptime(), descriptor.getExtraInfoDigest());
+ this.addServerDescriptorContents(
+ descriptor.getServerDescriptorDigest(), descriptor.getNickname(),
+ descriptor.getAddress(), descriptor.getOrPort(),
+ descriptor.getDirPort(), descriptor.getFingerprint(),
+ descriptor.getBandwidthRate(), descriptor.getBandwidthBurst(),
+ descriptor.getBandwidthObserved(), descriptor.getPlatform(),
+ descriptor.getPublishedMillis(), descriptor.getUptime(),
+ descriptor.getExtraInfoDigest());
}
private void addExtraInfoDescriptor(ExtraInfoDescriptor descriptor) {
@@ -886,7 +913,7 @@ public final class RelayDescriptorDatabaseImporter {
bandwidthHistoryLines.add(
descriptor.getDirreqReadHistory().getLine());
}
- this.addExtraInfoDescriptor(descriptor.getExtraInfoDigest(),
+ this.addExtraInfoDescriptorContents(descriptor.getExtraInfoDigest(),
descriptor.getNickname(),
descriptor.getFingerprint().toLowerCase(),
descriptor.getPublishedMillis(), bandwidthHistoryLines);
@@ -930,8 +957,8 @@ public final class RelayDescriptorDatabaseImporter {
this.conn.commit();
} catch (SQLException e) {
- this.logger.log(Level.WARNING, "Could not commit final records to "
- + "database", e);
+ this.logger.log(Level.WARNING, "Could not commit final records "
+ + "to database", e);
}
try {
this.conn.close();
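
For orientation, here is a minimal sketch of how the pieces renamed above fit together; the constructor signature and importRelayDescriptors() are taken from this diff, while the JDBC URL, the directory paths, and the importer's package (not visible in this hunk) are placeholder assumptions, not the project's actual configuration.

    import org.torproject.ernie.cron.RelayDescriptorDatabaseImporter;  // package assumed

    import java.io.File;
    import java.util.Arrays;
    import java.util.List;

    public class RelayImportSketch {
      public static void main(String[] args) {
        List<String> archivesDirectories = Arrays.asList("in/relay-descriptors");  // placeholder
        RelayDescriptorDatabaseImporter importer = new RelayDescriptorDatabaseImporter(
            "jdbc:postgresql://localhost/tordir",  // connectionUrl; a null value skips database import
            null,                                  // rawFilesDirectory; assumed optional here
            archivesDirectories,                   // must not be null, or the constructor throws
            new File("stats"),                     // statsDirectory; must not be null either
            true);                                 // keepImportHistory
        importer.importRelayDescriptors();
      }
    }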
diff --git a/modules/legacy/src/org/torproject/ernie/cron/network/ConsensusStatsFileHandler.java b/modules/legacy/src/org/torproject/ernie/cron/network/ConsensusStatsFileHandler.java
index d5cae37..6222859 100644
--- a/modules/legacy/src/org/torproject/ernie/cron/network/ConsensusStatsFileHandler.java
+++ b/modules/legacy/src/org/torproject/ernie/cron/network/ConsensusStatsFileHandler.java
@@ -1,7 +1,15 @@
-/* Copyright 2011, 2012 The Tor Project
+/* Copyright 2011--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.ernie.cron.network;
+import org.torproject.descriptor.BridgeNetworkStatus;
+import org.torproject.descriptor.Descriptor;
+import org.torproject.descriptor.DescriptorFile;
+import org.torproject.descriptor.DescriptorReader;
+import org.torproject.descriptor.DescriptorSourceFactory;
+import org.torproject.descriptor.NetworkStatusEntry;
+
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
@@ -25,13 +33,6 @@ import java.util.TreeMap;
import java.util.logging.Level;
import java.util.logging.Logger;
-import org.torproject.descriptor.BridgeNetworkStatus;
-import org.torproject.descriptor.Descriptor;
-import org.torproject.descriptor.DescriptorFile;
-import org.torproject.descriptor.DescriptorReader;
-import org.torproject.descriptor.DescriptorSourceFactory;
-import org.torproject.descriptor.NetworkStatusEntry;
-
/**
* Generates statistics on the average number of relays and bridges per
* day. Accepts parse results from <code>RelayDescriptorParser</code> and
@@ -71,7 +72,7 @@ public class ConsensusStatsFileHandler {
private int bridgeResultsAdded = 0;
/* Database connection string. */
- private String connectionURL = null;
+ private String connectionUrl = null;
private SimpleDateFormat dateTimeFormat;
@@ -81,13 +82,13 @@ public class ConsensusStatsFileHandler {
private boolean keepImportHistory;
- /**
- * Initializes this class, including reading in intermediate results
- * files <code>stats/consensus-stats-raw</code> and
- * <code>stats/bridge-consensus-stats-raw</code> and final results file
- * <code>stats/consensus-stats</code>.
- */
- public ConsensusStatsFileHandler(String connectionURL,
+ /**
+ * Initializes this class, including reading in intermediate results
+ * files <code>stats/consensus-stats-raw</code> and
+ * <code>stats/bridge-consensus-stats-raw</code> and final results file
+ * <code>stats/consensus-stats</code>.
+ */
+ public ConsensusStatsFileHandler(String connectionUrl,
File bridgesDir, File statsDirectory,
boolean keepImportHistory) {
@@ -108,7 +109,7 @@ public class ConsensusStatsFileHandler {
"stats/bridge-consensus-stats-raw");
/* Initialize database connection string. */
- this.connectionURL = connectionURL;
+ this.connectionUrl = connectionUrl;
this.dateTimeFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
this.dateTimeFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
@@ -168,13 +169,14 @@ public class ConsensusStatsFileHandler {
this.bridgeResultsAdded++;
} else if (!line.equals(this.bridgesRaw.get(published))) {
this.logger.warning("The numbers of running bridges we were just "
- + "given (" + line + ") are different from what we learned "
- + "before (" + this.bridgesRaw.get(published) + ")! "
- + "Overwriting!");
+ + "given (" + line + ") are different from what we learned "
+ + "before (" + this.bridgesRaw.get(published) + ")! "
+ + "Overwriting!");
this.bridgesRaw.put(published, line);
}
}
+ /** Imports sanitized bridge descriptors. */
public void importSanitizedBridges() {
if (bridgesDir.exists()) {
logger.fine("Importing files in directory " + bridgesDir + "/...");
@@ -202,9 +204,10 @@ public class ConsensusStatsFileHandler {
}
private void addBridgeNetworkStatus(BridgeNetworkStatus status) {
- int runningBridges = 0, runningEc2Bridges = 0;
- for (NetworkStatusEntry statusEntry :
- status.getStatusEntries().values()) {
+ int runningBridges = 0;
+ int runningEc2Bridges = 0;
+ for (NetworkStatusEntry statusEntry
+ : status.getStatusEntries().values()) {
if (statusEntry.getFlags().contains("Running")) {
runningBridges++;
if (statusEntry.getNickname().startsWith("ec2bridge")) {
@@ -227,7 +230,9 @@ public class ConsensusStatsFileHandler {
* final results. */
if (!this.bridgesRaw.isEmpty()) {
String tempDate = null;
- int brunning = 0, brunningEc2 = 0, statuses = 0;
+ int brunning = 0;
+ int brunningEc2 = 0;
+ int statuses = 0;
Iterator<String> it = this.bridgesRaw.values().iterator();
boolean haveWrittenFinalLine = false;
while (it.hasNext() || !haveWrittenFinalLine) {
@@ -287,12 +292,12 @@ public class ConsensusStatsFileHandler {
}
/* Add average number of bridges per day to the database. */
- if (connectionURL != null) {
+ if (connectionUrl != null) {
try {
- Map<String, String> insertRows = new HashMap<String, String>(),
- updateRows = new HashMap<String, String>();
+ Map<String, String> insertRows = new HashMap<String, String>();
+ Map<String, String> updateRows = new HashMap<String, String>();
insertRows.putAll(this.bridgesPerDay);
- Connection conn = DriverManager.getConnection(connectionURL);
+ Connection conn = DriverManager.getConnection(connectionUrl);
conn.setAutoCommit(false);
Statement statement = conn.createStatement();
ResultSet rs = statement.executeQuery(
@@ -307,8 +312,8 @@ public class ConsensusStatsFileHandler {
long newAvgRunningEc2 = Long.parseLong(parts[1]);
long oldAvgRunning = rs.getLong(2);
long oldAvgRunningEc2 = rs.getLong(3);
- if (newAvgRunning != oldAvgRunning ||
- newAvgRunningEc2 != oldAvgRunningEc2) {
+ if (newAvgRunning != oldAvgRunning
+ || newAvgRunningEc2 != oldAvgRunningEc2) {
updateRows.put(date, insertRow);
}
}
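
As a reading aid, a minimal usage sketch of this handler, based only on the constructor and importSanitizedBridges() shown above; the connection string and directories are placeholder assumptions, and the later step that writes the aggregated consensus-stats files is omitted because it lies outside this hunk.

    import org.torproject.ernie.cron.network.ConsensusStatsFileHandler;

    import java.io.File;

    public class BridgeStatsSketch {
      public static void main(String[] args) {
        ConsensusStatsFileHandler handler = new ConsensusStatsFileHandler(
            "jdbc:postgresql://localhost/tordir",  // connectionUrl; a null value skips the database
            new File("in/bridge-descriptors"),     // bridgesDir (placeholder path)
            new File("stats"),                     // statsDirectory (placeholder path)
            true);                                 // keepImportHistory
        handler.importSanitizedBridges();
      }
    }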
diff --git a/modules/legacy/src/org/torproject/ernie/cron/performance/TorperfProcessor.java b/modules/legacy/src/org/torproject/ernie/cron/performance/TorperfProcessor.java
index ed3a0af..b59345f 100644
--- a/modules/legacy/src/org/torproject/ernie/cron/performance/TorperfProcessor.java
+++ b/modules/legacy/src/org/torproject/ernie/cron/performance/TorperfProcessor.java
@@ -1,7 +1,14 @@
-/* Copyright 2011, 2012 The Tor Project
+/* Copyright 2011--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.ernie.cron.performance;
+import org.torproject.descriptor.Descriptor;
+import org.torproject.descriptor.DescriptorFile;
+import org.torproject.descriptor.DescriptorReader;
+import org.torproject.descriptor.DescriptorSourceFactory;
+import org.torproject.descriptor.TorperfResult;
+
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
@@ -20,13 +27,10 @@ import java.util.TreeMap;
import java.util.logging.Level;
import java.util.logging.Logger;
-import org.torproject.descriptor.Descriptor;
-import org.torproject.descriptor.DescriptorFile;
-import org.torproject.descriptor.DescriptorReader;
-import org.torproject.descriptor.DescriptorSourceFactory;
-import org.torproject.descriptor.TorperfResult;
-
public class TorperfProcessor {
+
+ /** Processes Torperf data from the given directory and writes
+ * aggregate statistics to the given stats directory. */
public TorperfProcessor(File torperfDirectory, File statsDirectory) {
if (torperfDirectory == null || statsDirectory == null) {
@@ -114,9 +118,9 @@ public class TorperfProcessor {
- result.getStartMillis();
String key = source + "," + dateTime;
String value = key;
- if ((result.didTimeout() == null &&
- result.getDataCompleteMillis() < 1) ||
- (result.didTimeout() != null && result.didTimeout())) {
+ if ((result.didTimeout() == null
+ && result.getDataCompleteMillis() < 1)
+ || (result.didTimeout() != null && result.didTimeout())) {
value += ",-2"; // -2 for timeout
} else if (result.getReadBytes() < fileSize) {
value += ",-1"; // -1 for failure
@@ -146,9 +150,12 @@ public class TorperfProcessor {
new TreeMap<String, List<Long>>();
SortedMap<String, long[]> statusesAllSources =
new TreeMap<String, long[]>();
- long failures = 0, timeouts = 0, requests = 0;
+ long failures = 0;
+ long timeouts = 0;
+ long requests = 0;
while (it.hasNext() || !haveWrittenFinalLine) {
- Map.Entry<String, String> next = it.hasNext() ? it.next() : null;
+ Map.Entry<String, String> next =
+ it.hasNext() ? it.next() : null;
if (tempSourceDate != null
&& (next == null || !(next.getValue().split(",")[0] + ","
+ next.getValue().split(",")[1]).equals(tempSourceDate))) {
@@ -211,18 +218,18 @@ public class TorperfProcessor {
}
}
bw.close();
- for (Map.Entry<String, List<Long>> e :
- dlTimesAllSources.entrySet()) {
+ for (Map.Entry<String, List<Long>> e
+ : dlTimesAllSources.entrySet()) {
String allDateSizeSource = e.getKey();
dlTimes = e.getValue();
Collections.sort(dlTimes);
- long q1 = dlTimes.get(dlTimes.size() / 4 - 1);
- long md = dlTimes.get(dlTimes.size() / 2 - 1);
- long q3 = dlTimes.get(dlTimes.size() * 3 / 4 - 1);
long[] status = statusesAllSources.get(allDateSizeSource);
timeouts = status[0];
failures = status[1];
requests = status[2];
+ long q1 = dlTimes.get(dlTimes.size() / 4 - 1);
+ long md = dlTimes.get(dlTimes.size() / 2 - 1);
+ long q3 = dlTimes.get(dlTimes.size() * 3 / 4 - 1);
stats.put(allDateSizeSource,
String.format("%s,%s,%s,%s,%s,%s,%s",
allDateSizeSource, q1, md, q3, timeouts, failures,
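
The constructor Javadoc added above describes the whole workflow, so a hedged usage sketch is correspondingly short; both directory paths are placeholder assumptions.

    import org.torproject.ernie.cron.performance.TorperfProcessor;

    import java.io.File;

    public class TorperfSketch {
      public static void main(String[] args) {
        /* Parsing and writing of aggregate statistics happen inside the
         * constructor, per the Javadoc; the directories are examples only. */
        new TorperfProcessor(new File("in/torperf"), new File("stats"));
      }
    }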
diff --git a/shared/.gitignore b/shared/.gitignore
new file mode 100644
index 0000000..c3e32d5
--- /dev/null
+++ b/shared/.gitignore
@@ -0,0 +1,4 @@
+/generated/
+/lib/
+/stats/
+
diff --git a/shared/build.xml b/shared/build.xml
new file mode 100644
index 0000000..ae4d292
--- /dev/null
+++ b/shared/build.xml
@@ -0,0 +1,39 @@
+<project default="checks" name="metrics-web" basedir=".">
+ <property name="libs" value="lib"/>
+ <property name="checks" value="resources/metrics_checks.xml"/>
+ <property name="generated" value="generated/"/>
+ <property name="report" value="${generated}/checkstyle_report.txt"/>
+ <path id="checkstyle.classpath" >
+ <fileset dir="${libs}">
+ <include name="checkstyle-6.17-all.jar" />
+ </fileset>
+ </path>
+ <target name="init">
+ <mkdir dir="${generated}"/>
+ </target>
+ <target name="clean">
+ <delete includeEmptyDirs="true" quiet="true">
+ <fileset dir="${generated}" defaultexcludes="false" includes="**" />
+ </delete>
+ </target>
+ <taskdef resource="com/puppycrawl/tools/checkstyle/ant/checkstyle-ant-task.properties">
+ <classpath refid="checkstyle.classpath" />
+ </taskdef>
+ <target name="checks" depends="init">
+ <checkstyle config="${checks}">
+ <fileset dir="../website/src" includes="**/*.java"/>
+ <fileset dir="../modules/advbwdist/src" includes="**/*.java"/>
+ <fileset dir="../modules/clients/src" includes="**/*.java"/>
+ <fileset dir="../modules/collectdescs/src" includes="**/*.java"/>
+ <fileset dir="../modules/connbidirect/src" includes="**/*.java"/>
+ <fileset dir="../modules/disagreement/src" includes="**/*.java"/>
+ <fileset dir="../modules/hidserv/src" includes="**/*.java"/>
+ <fileset dir="../modules/legacy/src" includes="**/*.java"/>
+ <classpath>
+ <path refid="checkstyle.classpath" />
+ </classpath>
+ <formatter type="plain" toFile="${report}" />
+ </checkstyle>
+ </target>
+</project>
+
diff --git a/shared/resources/metrics_checks.xml b/shared/resources/metrics_checks.xml
new file mode 100644
index 0000000..2df2f2a
--- /dev/null
+++ b/shared/resources/metrics_checks.xml
@@ -0,0 +1,221 @@
+<?xml version="1.0"?>
+<!DOCTYPE module PUBLIC
+ "-//Puppy Crawl//DTD Check Configuration 1.3//EN"
+ "http://www.puppycrawl.com/dtds/configuration_1_3.dtd">
+
+<!--
+ Checkstyle configuration that checks the Google coding conventions from Google Java Style
+ that can be found at https://google.github.io/styleguide/javaguide.html with the following
+ modifications:
+
+ - Replaced com.google with org.torproject in import statement ordering
+ [CustomImportOrder].
+
+ - Relaxed requirement that catch parameters must be at least two
+ characters long [CatchParameterName].
+
+ - Enabled suppression of warnings using annotations.
+
+ Checkstyle is very configurable. Be sure to read the documentation at
+ http://checkstyle.sf.net (or in your downloaded distribution).
+
+ To completely disable a check, just comment it out or delete it from the file.
+
+ Authors: Max Vetrenko, Ruslan Diachenko, Roman Ivanov.
+ -->
+
+<module name = "Checker">
+ <property name="charset" value="UTF-8"/>
+
+ <property name="severity" value="warning"/>
+
+ <property name="fileExtensions" value="java, properties, xml"/>
+ <!-- Checks for whitespace -->
+ <!-- See http://checkstyle.sf.net/config_whitespace.html -->
+ <module name="FileTabCharacter">
+ <property name="eachLine" value="true"/>
+ </module>
+
+ <module name="SuppressWarningsFilter" />
+ <module name="TreeWalker">
+ <module name="OuterTypeFilename"/>
+ <module name="IllegalTokenText">
+ <property name="tokens" value="STRING_LITERAL, CHAR_LITERAL"/>
+ <property name="format" value="\\u00(08|09|0(a|A)|0(c|C)|0(d|D)|22|27|5(C|c))|\\(0(10|11|12|14|15|42|47)|134)"/>
+ <property name="message" value="Avoid using corresponding octal or Unicode escape."/>
+ </module>
+ <module name="AvoidEscapedUnicodeCharacters">
+ <property name="allowEscapesForControlCharacters" value="true"/>
+ <property name="allowByTailComment" value="true"/>
+ <property name="allowNonPrintableEscapes" value="true"/>
+ </module>
+ <module name="LineLength">
+ <property name="max" value="100"/>
+ <property name="ignorePattern" value="^package.*|^import.*|a href|href|http://|https://|ftp://"/>
+ </module>
+ <module name="AvoidStarImport"/>
+ <module name="OneTopLevelClass"/>
+ <module name="NoLineWrap"/>
+ <module name="EmptyBlock">
+ <property name="option" value="TEXT"/>
+ <property name="tokens" value="LITERAL_TRY, LITERAL_FINALLY, LITERAL_IF, LITERAL_ELSE, LITERAL_SWITCH"/>
+ </module>
+ <module name="NeedBraces"/>
+ <module name="LeftCurly">
+ <property name="maxLineLength" value="100"/>
+ </module>
+ <module name="RightCurly"/>
+ <module name="RightCurly">
+ <property name="option" value="alone"/>
+ <property name="tokens" value="CLASS_DEF, METHOD_DEF, CTOR_DEF, LITERAL_FOR, LITERAL_WHILE, LITERAL_DO, STATIC_INIT, INSTANCE_INIT"/>
+ </module>
+ <module name="WhitespaceAround">
+ <property name="allowEmptyConstructors" value="true"/>
+ <property name="allowEmptyMethods" value="true"/>
+ <property name="allowEmptyTypes" value="true"/>
+ <property name="allowEmptyLoops" value="true"/>
+ <message key="ws.notFollowed"
+ value="WhitespaceAround: ''{0}'' is not followed by whitespace. Empty blocks may only be represented as '{}' when not part of a multi-block statement (4.1.3)"/>
+ <message key="ws.notPreceded"
+ value="WhitespaceAround: ''{0}'' is not preceded with whitespace."/>
+ </module>
+ <module name="OneStatementPerLine"/>
+ <module name="MultipleVariableDeclarations"/>
+ <module name="ArrayTypeStyle"/>
+ <module name="MissingSwitchDefault"/>
+ <module name="FallThrough"/>
+ <module name="UpperEll"/>
+ <module name="ModifierOrder"/>
+ <module name="EmptyLineSeparator">
+ <property name="allowNoEmptyLineBetweenFields" value="true"/>
+ </module>
+ <module name="SeparatorWrap">
+ <property name="tokens" value="DOT"/>
+ <property name="option" value="nl"/>
+ </module>
+ <module name="SeparatorWrap">
+ <property name="tokens" value="COMMA"/>
+ <property name="option" value="EOL"/>
+ </module>
+ <module name="PackageName">
+ <property name="format" value="^[a-z]+(\.[a-z][a-z0-9]*)*$"/>
+ <message key="name.invalidPattern"
+ value="Package name ''{0}'' must match pattern ''{1}''."/>
+ </module>
+ <module name="TypeName">
+ <message key="name.invalidPattern"
+ value="Type name ''{0}'' must match pattern ''{1}''."/>
+ </module>
+ <module name="MemberName">
+ <property name="format" value="^[a-z][a-z0-9][a-zA-Z0-9]*$"/>
+ <message key="name.invalidPattern"
+ value="Member name ''{0}'' must match pattern ''{1}''."/>
+ </module>
+ <module name="ParameterName">
+ <property name="format" value="^[a-z][a-z0-9][a-zA-Z0-9]*$"/>
+ <message key="name.invalidPattern"
+ value="Parameter name ''{0}'' must match pattern ''{1}''."/>
+ </module>
+ <module name="CatchParameterName">
+ <property name="format" value="^[a-z][a-zA-Z0-9]*$"/>
+ <message key="name.invalidPattern"
+ value="Catch parameter name ''{0}'' must match pattern ''{1}''."/>
+ </module>
+ <module name="LocalVariableName">
+ <property name="tokens" value="VARIABLE_DEF"/>
+ <property name="format" value="^[a-z][a-z0-9][a-zA-Z0-9]*$"/>
+ <property name="allowOneCharVarInForLoop" value="true"/>
+ <message key="name.invalidPattern"
+ value="Local variable name ''{0}'' must match pattern ''{1}''."/>
+ </module>
+ <module name="ClassTypeParameterName">
+ <property name="format" value="(^[A-Z][0-9]?)$|([A-Z][a-zA-Z0-9]*[T]$)"/>
+ <message key="name.invalidPattern"
+ value="Class type name ''{0}'' must match pattern ''{1}''."/>
+ </module>
+ <module name="MethodTypeParameterName">
+ <property name="format" value="(^[A-Z][0-9]?)$|([A-Z][a-zA-Z0-9]*[T]$)"/>
+ <message key="name.invalidPattern"
+ value="Method type name ''{0}'' must match pattern ''{1}''."/>
+ </module>
+ <module name="InterfaceTypeParameterName">
+ <property name="format" value="(^[A-Z][0-9]?)$|([A-Z][a-zA-Z0-9]*[T]$)"/>
+ <message key="name.invalidPattern"
+ value="Interface type name ''{0}'' must match pattern ''{1}''."/>
+ </module>
+ <module name="NoFinalizer"/>
+ <module name="GenericWhitespace">
+ <message key="ws.followed"
+ value="GenericWhitespace ''{0}'' is followed by whitespace."/>
+ <message key="ws.preceded"
+ value="GenericWhitespace ''{0}'' is preceded with whitespace."/>
+ <message key="ws.illegalFollow"
+ value="GenericWhitespace ''{0}'' should followed by whitespace."/>
+ <message key="ws.notPreceded"
+ value="GenericWhitespace ''{0}'' is not preceded with whitespace."/>
+ </module>
+ <module name="Indentation">
+ <property name="basicOffset" value="2"/>
+ <property name="braceAdjustment" value="0"/>
+ <property name="caseIndent" value="2"/>
+ <property name="throwsIndent" value="4"/>
+ <property name="lineWrappingIndentation" value="4"/>
+ <property name="arrayInitIndent" value="2"/>
+ </module>
+ <module name="AbbreviationAsWordInName">
+ <property name="ignoreFinal" value="false"/>
+ <property name="allowedAbbreviationLength" value="1"/>
+ </module>
+ <module name="OverloadMethodsDeclarationOrder"/>
+ <module name="VariableDeclarationUsageDistance"/>
+ <module name="CustomImportOrder">
+ <property name="specialImportsRegExp" value="org.torproject"/>
+ <property name="sortImportsInGroupAlphabetically" value="true"/>
+ <property name="customImportOrderRules" value="STATIC###SPECIAL_IMPORTS###THIRD_PARTY_PACKAGE###STANDARD_JAVA_PACKAGE"/>
+ </module>
+ <module name="MethodParamPad"/>
+ <module name="OperatorWrap">
+ <property name="option" value="NL"/>
+ <property name="tokens" value="BAND, BOR, BSR, BXOR, DIV, EQUAL, GE, GT, LAND, LE, LITERAL_INSTANCEOF, LOR, LT, MINUS, MOD, NOT_EQUAL, PLUS, QUESTION, SL, SR, STAR "/>
+ </module>
+ <module name="AnnotationLocation">
+ <property name="tokens" value="CLASS_DEF, INTERFACE_DEF, ENUM_DEF, METHOD_DEF, CTOR_DEF"/>
+ </module>
+ <module name="AnnotationLocation">
+ <property name="tokens" value="VARIABLE_DEF"/>
+ <property name="allowSamelineMultipleAnnotations" value="true"/>
+ </module>
+ <module name="NonEmptyAtclauseDescription"/>
+ <module name="JavadocTagContinuationIndentation"/>
+ <module name="SummaryJavadoc">
+ <property name="forbiddenSummaryFragments" value="^@return the *|^This method returns |^A [{]@code [a-zA-Z0-9]+[}]( is a )"/>
+ </module>
+ <module name="JavadocParagraph"/>
+ <module name="AtclauseOrder">
+ <property name="tagOrder" value="@param, @return, @throws, @deprecated"/>
+ <property name="target" value="CLASS_DEF, INTERFACE_DEF, ENUM_DEF, METHOD_DEF, CTOR_DEF, VARIABLE_DEF"/>
+ </module>
+ <module name="JavadocMethod">
+ <property name="scope" value="public"/>
+ <property name="allowMissingParamTags" value="true"/>
+ <property name="allowMissingThrowsTags" value="true"/>
+ <property name="allowMissingReturnTag" value="true"/>
+ <property name="minLineCount" value="2"/>
+ <property name="allowedAnnotations" value="Override, Test"/>
+ <property name="allowThrowsTagsForSubclasses" value="true"/>
+ </module>
+ <module name="MethodName">
+ <property name="format" value="^[a-z][a-z0-9][a-zA-Z0-9_]*$"/>
+ <message key="name.invalidPattern"
+ value="Method name ''{0}'' must match pattern ''{1}''."/>
+ </module>
+ <module name="SingleLineJavadoc">
+ <property name="ignoreInlineTags" value="false"/>
+ </module>
+ <module name="EmptyCatchBlock">
+ <property name="exceptionVariableName" value="expected"/>
+ </module>
+ <module name="CommentsIndentation"/>
+ <module name="SuppressWarningsHolder" />
+ </module>
+</module>
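
The comment at the top of this new configuration names the deviations from the stock Google checks. The CustomImportOrder rule near the end is what produces the import reshuffling seen throughout this commit (org.torproject imports first, then other third-party packages, then standard Java packages, each group sorted alphabetically), and AbbreviationAsWordInName with allowedAbbreviationLength=1 is what drives renames such as connectionURL to connectionUrl and requestURI to requestUri. A compliant skeleton, with illustrative class and field names only, looks roughly like this:

    import org.torproject.descriptor.Descriptor;         // special imports: org.torproject

    import org.apache.commons.lang.text.StrSubstitutor;  // other third-party packages

    import java.io.File;                                 // standard Java packages
    import java.util.List;

    /** Illustrative class; only the import grouping and naming matter here. */
    public class ImportOrderSketch {

      /* The MemberName pattern requires the second character to be lower case
       * or a digit, which also rules out names like rObjectGenerator. */
      private String connectionUrl;

      private List<Descriptor> descriptors;

      private File statsDirectory;

      private StrSubstitutor substitutor;

      /** Returns the connection URL. */
      public String getConnectionUrl() {
        return this.connectionUrl;
      }
    }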
diff --git a/website/src/org/torproject/metrics/web/AboutServlet.java b/website/src/org/torproject/metrics/web/AboutServlet.java
index 5a59ce0..3e377c1 100644
--- a/website/src/org/torproject/metrics/web/AboutServlet.java
+++ b/website/src/org/torproject/metrics/web/AboutServlet.java
@@ -1,5 +1,6 @@
-/* Copyright 2014 The Tor Project
+/* Copyright 2014--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web;
import java.io.IOException;
@@ -13,6 +14,7 @@ public class AboutServlet extends HttpServlet {
private static final long serialVersionUID = 97168997894664L;
+ @Override
public void doGet(HttpServletRequest request,
HttpServletResponse response) throws IOException, ServletException {
diff --git a/website/src/org/torproject/metrics/web/DataServlet.java b/website/src/org/torproject/metrics/web/DataServlet.java
index bbc60c5..ac7cb2a 100644
--- a/website/src/org/torproject/metrics/web/DataServlet.java
+++ b/website/src/org/torproject/metrics/web/DataServlet.java
@@ -1,5 +1,6 @@
/* Copyright 2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web;
import java.io.IOException;
@@ -11,18 +12,19 @@ import javax.servlet.http.HttpServletResponse;
@SuppressWarnings("serial")
public class DataServlet extends MetricServlet {
+ @Override
public void doGet(HttpServletRequest request,
HttpServletResponse response) throws IOException, ServletException {
- String requestURI = request.getRequestURI();
- if (requestURI == null || !requestURI.endsWith(".html")) {
+ String requestUri = request.getRequestURI();
+ if (requestUri == null || !requestUri.endsWith(".html")) {
response.sendError(HttpServletResponse.SC_BAD_REQUEST);
return;
}
- String requestedId = requestURI.substring(
- requestURI.contains("/") ? requestURI.lastIndexOf("/") + 1 : 0,
- requestURI.length() - 5);
- if (!this.idsByType.containsKey("Data") ||
- !this.idsByType.get("Data").contains(requestedId)) {
+ String requestedId = requestUri.substring(
+ requestUri.contains("/") ? requestUri.lastIndexOf("/") + 1 : 0,
+ requestUri.length() - 5);
+ if (!this.idsByType.containsKey("Data")
+ || !this.idsByType.get("Data").contains(requestedId)) {
response.sendError(HttpServletResponse.SC_BAD_REQUEST);
return;
}
diff --git a/website/src/org/torproject/metrics/web/GraphServlet.java b/website/src/org/torproject/metrics/web/GraphServlet.java
index 05139ed..189406c 100644
--- a/website/src/org/torproject/metrics/web/GraphServlet.java
+++ b/website/src/org/torproject/metrics/web/GraphServlet.java
@@ -1,7 +1,11 @@
/* Copyright 2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web;
+import org.torproject.metrics.web.graphs.Countries;
+import org.torproject.metrics.web.graphs.GraphParameterChecker;
+
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Arrays;
@@ -17,15 +21,13 @@ import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
-import org.torproject.metrics.web.graphs.Countries;
-import org.torproject.metrics.web.graphs.GraphParameterChecker;
-
@SuppressWarnings("serial")
public class GraphServlet extends MetricServlet {
private Map<String, String[][]> defaultParameters =
new HashMap<String, String[][]>();
+ @Override
public void init() throws ServletException {
super.init();
this.defaultParameters.put("p", new String[][] {
@@ -79,10 +81,10 @@ public class GraphServlet extends MetricServlet {
List<String[]> knownCountries =
Countries.getInstance().getCountryList();
String[][] countries = new String[knownCountries.size() + 1][];
- int i = 0;
- countries[i++] = new String[] { "all", " selected", "All users" };
+ int index = 0;
+ countries[index++] = new String[] { "all", " selected", "All users" };
for (String[] country : knownCountries) {
- countries[i++] = new String[] { country[0], "", country[1] };
+ countries[index++] = new String[] { country[0], "", country[1] };
}
this.defaultParameters.put("country", countries);
this.defaultParameters.put("events", new String[][] {
@@ -115,18 +117,19 @@ public class GraphServlet extends MetricServlet {
{ "5mb", "", "5 MiB" } });
}
+ @Override
protected void doGet(HttpServletRequest request,
HttpServletResponse response) throws ServletException, IOException {
- String requestURI = request.getRequestURI();
- if (requestURI == null || !requestURI.endsWith(".html")) {
+ String requestUri = request.getRequestURI();
+ if (requestUri == null || !requestUri.endsWith(".html")) {
response.sendError(HttpServletResponse.SC_BAD_REQUEST);
return;
}
- String requestedId = requestURI.substring(
- requestURI.contains("/") ? requestURI.lastIndexOf("/") + 1 : 0,
- requestURI.length() - 5);
- if (!this.idsByType.containsKey("Graph") ||
- !this.idsByType.get("Graph").contains(requestedId)) {
+ String requestedId = requestUri.substring(
+ requestUri.contains("/") ? requestUri.lastIndexOf("/") + 1 : 0,
+ requestUri.length() - 5);
+ if (!this.idsByType.containsKey("Graph")
+ || !this.idsByType.get("Graph").contains(requestedId)) {
response.sendError(HttpServletResponse.SC_BAD_REQUEST);
return;
}
@@ -142,15 +145,15 @@ public class GraphServlet extends MetricServlet {
Date defaultStartDate = new Date(defaultEndDate.getTime()
- 90L * 24L * 60L * 60L * 1000L);
if (this.parameters.containsKey(requestedId)) {
- Map<String, String[]> checkedParameters = GraphParameterChecker.
- getInstance().checkParameters(requestedId,
+ Map<String, String[]> checkedParameters = GraphParameterChecker
+ .getInstance().checkParameters(requestedId,
request.getParameterMap());
StringBuilder urlBuilder = new StringBuilder();
for (String parameter : this.parameters.get(requestedId)) {
if (parameter.equals("start") || parameter.equals("end")) {
String[] requestParameter;
- if (checkedParameters != null &&
- checkedParameters.containsKey(parameter)) {
+ if (checkedParameters != null
+ && checkedParameters.containsKey(parameter)) {
requestParameter = checkedParameters.get(parameter);
} else {
requestParameter = new String[] {
@@ -160,27 +163,27 @@ public class GraphServlet extends MetricServlet {
urlBuilder.append(String.format("&%s=%s", parameter,
requestParameter[0]));
request.setAttribute(parameter, requestParameter);
- } else if (parameter.equals("p") ||
- parameter.equals("n") ||
- parameter.equals("flag") ||
- parameter.equals("country") ||
- parameter.equals("events") ||
- parameter.equals("transport") ||
- parameter.equals("version") ||
- parameter.equals("source") ||
- parameter.equals("filesize")) {
+ } else if (parameter.equals("p")
+ || parameter.equals("n")
+ || parameter.equals("flag")
+ || parameter.equals("country")
+ || parameter.equals("events")
+ || parameter.equals("transport")
+ || parameter.equals("version")
+ || parameter.equals("source")
+ || parameter.equals("filesize")) {
String[][] defaultParameters =
this.defaultParameters.get(parameter);
String[][] requestParameters =
new String[defaultParameters.length][];
Set<String> checked = null;
- if (checkedParameters != null &&
- checkedParameters.containsKey(parameter)) {
+ if (checkedParameters != null
+ && checkedParameters.containsKey(parameter)) {
checked = new HashSet<String>(Arrays.asList(
checkedParameters.get(parameter)));
}
- String checkedOrSelected = parameter.equals("country") ||
- parameter.equals("events") || parameter.equals("version")
+ String checkedOrSelected = parameter.equals("country")
+ || parameter.equals("events") || parameter.equals("version")
? " selected" : " checked";
for (int i = 0; i < defaultParameters.length; i++) {
requestParameters[i] =
diff --git a/website/src/org/torproject/metrics/web/IndexServlet.java b/website/src/org/torproject/metrics/web/IndexServlet.java
index d3c0b35..576bac2 100644
--- a/website/src/org/torproject/metrics/web/IndexServlet.java
+++ b/website/src/org/torproject/metrics/web/IndexServlet.java
@@ -1,5 +1,6 @@
/* Copyright 2011--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web;
import java.io.IOException;
@@ -60,16 +61,18 @@ public class IndexServlet extends HttpServlet {
private List<Metric> availableMetrics;
+ @Override
public void init() throws ServletException {
this.availableMetrics = new ArrayList<Metric>();
- for (org.torproject.metrics.web.Metric metric :
- MetricsProvider.getInstance().getMetricsList()) {
+ for (org.torproject.metrics.web.Metric metric
+ : MetricsProvider.getInstance().getMetricsList()) {
this.availableMetrics.add(new Metric(metric.getId() + ".html",
metric.getTitle(), metric.getTags(), metric.getType(),
metric.getLevel()));
}
}
+ @Override
public void doGet(HttpServletRequest request,
HttpServletResponse response) throws IOException, ServletException {
@SuppressWarnings("rawtypes")
@@ -102,8 +105,8 @@ public class IndexServlet extends HttpServlet {
private BitSet parseParameter(String[] unparsedValues,
String[][] knownValues, String[] defaultValues) {
BitSet result = new BitSet();
- if (unparsedValues == null || unparsedValues.length == 0 ||
- unparsedValues.length > knownValues.length) {
+ if (unparsedValues == null || unparsedValues.length == 0
+ || unparsedValues.length > knownValues.length) {
unparsedValues = defaultValues;
}
Set<String> requestedValues =
@@ -130,11 +133,17 @@ public class IndexServlet extends HttpServlet {
}
private static class Metric {
+
private String url;
+
private String name;
+
private BitSet tags;
+
private BitSet type;
+
private BitSet level;
+
private Metric(String url, String name, String[] tagStrings,
String typeString, String levelString) {
this.url = url;
@@ -143,6 +152,7 @@ public class IndexServlet extends HttpServlet {
this.type = this.convertStringToBitSet(knownTypes, typeString);
this.level = this.convertStringToBitSet(knownLevels, levelString);
}
+
private BitSet convertStringsToBitSet(String[][] knownKeysAndValues,
String[] givenKeyStrings) {
BitSet result = new BitSet(knownKeysAndValues.length);
@@ -158,23 +168,26 @@ public class IndexServlet extends HttpServlet {
}
return result;
}
+
private BitSet convertStringToBitSet(String[][] knownKeysAndValues,
String givenKeyString) {
return this.convertStringsToBitSet(knownKeysAndValues,
new String[] { givenKeyString });
}
+
private String[] toStrings() {
return new String[] { this.url, this.name,
this.convertBitSetToString(knownTags, this.tags),
this.convertBitSetToString(knownTypes, this.type),
this.convertBitSetToString(knownLevels, this.level) };
}
+
private String convertBitSetToString(String[][] knownKeysAndValues,
BitSet bitSet) {
StringBuilder sb = new StringBuilder();
- int i = -1;
- while ((i = bitSet.nextSetBit(i + 1)) >= 0) {
- sb.append(", " + knownKeysAndValues[i][1]);
+ int index = -1;
+ while ((index = bitSet.nextSetBit(index + 1)) >= 0) {
+ sb.append(", " + knownKeysAndValues[index][1]);
}
return sb.substring(Math.min(sb.length(), 2));
}
@@ -184,9 +197,9 @@ public class IndexServlet extends HttpServlet {
BitSet requestedTypes, BitSet requestedLevels) {
List<Metric> filteredMetrics = new ArrayList<Metric>();
for (Metric metric : availableMetrics) {
- if (requestedTags.intersects(metric.tags) &&
- requestedTypes.intersects(metric.type) &&
- requestedLevels.intersects(metric.level)) {
+ if (requestedTags.intersects(metric.tags)
+ && requestedTypes.intersects(metric.type)
+ && requestedLevels.intersects(metric.level)) {
filteredMetrics.add(metric);
}
}
@@ -196,47 +209,47 @@ public class IndexServlet extends HttpServlet {
private void orderMetrics(List<Metric> resultMetrics,
BitSet requestedOrder) {
switch (requestedOrder.nextSetBit(0)) {
- case 0:
- Collections.sort(resultMetrics, new Comparator<Metric>() {
- public int compare(Metric a, Metric b) {
- return a.name.compareTo(b.name);
- }
- });
- break;
- case 1:
- Collections.sort(resultMetrics, new Comparator<Metric>() {
- public int compare(Metric a, Metric b) {
- return compareTwoBitSets(a.tags, b.tags);
- }
- });
- break;
- case 2:
- Collections.sort(resultMetrics, new Comparator<Metric>() {
- public int compare(Metric a, Metric b) {
- return compareTwoBitSets(a.type, b.type);
- }
- });
- break;
- case 3:
- Collections.sort(resultMetrics, new Comparator<Metric>() {
- public int compare(Metric a, Metric b) {
- return compareTwoBitSets(a.level, b.level);
- }
- });
- break;
- default:
- Collections.shuffle(resultMetrics);
- break;
+ case 0:
+ Collections.sort(resultMetrics, new Comparator<Metric>() {
+ public int compare(Metric first, Metric second) {
+ return first.name.compareTo(second.name);
+ }
+ });
+ break;
+ case 1:
+ Collections.sort(resultMetrics, new Comparator<Metric>() {
+ public int compare(Metric first, Metric second) {
+ return compareTwoBitSets(first.tags, second.tags);
+ }
+ });
+ break;
+ case 2:
+ Collections.sort(resultMetrics, new Comparator<Metric>() {
+ public int compare(Metric first, Metric second) {
+ return compareTwoBitSets(first.type, second.type);
+ }
+ });
+ break;
+ case 3:
+ Collections.sort(resultMetrics, new Comparator<Metric>() {
+ public int compare(Metric first, Metric second) {
+ return compareTwoBitSets(first.level, second.level);
+ }
+ });
+ break;
+ default:
+ Collections.shuffle(resultMetrics);
+ break;
}
}
- private int compareTwoBitSets(BitSet a, BitSet b) {
- if (a.equals(b)) {
+ private int compareTwoBitSets(BitSet first, BitSet second) {
+ if (first.equals(second)) {
return 0;
}
- BitSet xor = (BitSet) a.clone();
- xor.xor(b);
- return xor.length() == b.length() ? -1 : 1;
+ BitSet xor = (BitSet) first.clone();
+ xor.xor(second);
+ return xor.length() == second.length() ? -1 : 1;
}
private String[][] formatMetrics(
diff --git a/website/src/org/torproject/metrics/web/LinkServlet.java b/website/src/org/torproject/metrics/web/LinkServlet.java
index 4066909..fc413f5 100644
--- a/website/src/org/torproject/metrics/web/LinkServlet.java
+++ b/website/src/org/torproject/metrics/web/LinkServlet.java
@@ -1,5 +1,6 @@
/* Copyright 2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web;
import java.io.IOException;
@@ -11,18 +12,19 @@ import javax.servlet.http.HttpServletResponse;
@SuppressWarnings("serial")
public class LinkServlet extends MetricServlet {
+ @Override
public void doGet(HttpServletRequest request,
HttpServletResponse response) throws IOException, ServletException {
- String requestURI = request.getRequestURI();
- if (requestURI == null || !requestURI.endsWith(".html")) {
+ String requestUri = request.getRequestURI();
+ if (requestUri == null || !requestUri.endsWith(".html")) {
response.sendError(HttpServletResponse.SC_BAD_REQUEST);
return;
}
- String requestedId = requestURI.substring(
- requestURI.contains("/") ? requestURI.lastIndexOf("/") + 1 : 0,
- requestURI.length() - 5);
- if (!this.idsByType.containsKey("Link") ||
- !this.idsByType.get("Link").contains(requestedId)) {
+ String requestedId = requestUri.substring(
+ requestUri.contains("/") ? requestUri.lastIndexOf("/") + 1 : 0,
+ requestUri.length() - 5);
+ if (!this.idsByType.containsKey("Link")
+ || !this.idsByType.get("Link").contains(requestedId)) {
response.sendError(HttpServletResponse.SC_BAD_REQUEST);
return;
}
diff --git a/website/src/org/torproject/metrics/web/Metric.java b/website/src/org/torproject/metrics/web/Metric.java
index 03ff4af..31dcbd7 100644
--- a/website/src/org/torproject/metrics/web/Metric.java
+++ b/website/src/org/torproject/metrics/web/Metric.java
@@ -1,61 +1,91 @@
/* Copyright 2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web;
+@SuppressWarnings("checkstyle:membername")
public class Metric {
+
private String id;
+
private String title;
+
private String[] tags;
+
private String type;
+
private String level;
+
private String description;
+
private String function;
+
private String[] parameters;
+
private String[] data;
+
private String[] related;
+
private String[] table_headers;
+
private String[] table_cell_formats;
+
private String data_file;
+
private String[] data_column_spec;
+
public String getId() {
return this.id;
}
+
public String getTitle() {
return this.title;
}
+
public String[] getTags() {
return this.tags;
}
+
public String getType() {
return this.type;
}
+
public String getLevel() {
return this.level;
}
+
public String getDescription() {
return this.description;
}
+
public String getFunction() {
return this.function;
}
+
public String[] getParameters() {
return this.parameters;
}
+
public String[] getTableHeaders() {
return this.table_headers;
}
+
public String[] getTableCellFormats() {
return this.table_cell_formats;
}
+
public String getDataFile() {
return this.data_file;
}
+
public String[] getDataColumnSpec() {
return this.data_column_spec;
}
+
public String[] getData() {
return this.data;
}
+
public String[] getRelated() {
return this.related;
}
diff --git a/website/src/org/torproject/metrics/web/MetricServlet.java b/website/src/org/torproject/metrics/web/MetricServlet.java
index deb7fe3..086f9e7 100644
--- a/website/src/org/torproject/metrics/web/MetricServlet.java
+++ b/website/src/org/torproject/metrics/web/MetricServlet.java
@@ -1,5 +1,6 @@
/* Copyright 2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web;
import java.util.ArrayList;
@@ -46,6 +47,7 @@ public abstract class MetricServlet extends HttpServlet {
protected Map<String, List<String[]>> related =
new HashMap<String, List<String[]>>();
+ @Override
public void init() throws ServletException {
this.metrics = MetricsProvider.getInstance().getMetricsList();
Map<String, String> allTypesAndTitles = new HashMap<String, String>();
diff --git a/website/src/org/torproject/metrics/web/MetricsProvider.java b/website/src/org/torproject/metrics/web/MetricsProvider.java
index 9b66d7e..606e7db 100644
--- a/website/src/org/torproject/metrics/web/MetricsProvider.java
+++ b/website/src/org/torproject/metrics/web/MetricsProvider.java
@@ -1,16 +1,17 @@
/* Copyright 2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web;
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
-import com.google.gson.Gson;
-import com.google.gson.GsonBuilder;
-
public class MetricsProvider {
private static MetricsProvider instance = new MetricsProvider();
diff --git a/website/src/org/torproject/metrics/web/RedirectServlet.java b/website/src/org/torproject/metrics/web/RedirectServlet.java
index 7c627d7..c0a29cc 100644
--- a/website/src/org/torproject/metrics/web/RedirectServlet.java
+++ b/website/src/org/torproject/metrics/web/RedirectServlet.java
@@ -1,5 +1,6 @@
/* Copyright 2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web;
import java.io.IOException;
@@ -17,7 +18,8 @@ public class RedirectServlet extends HttpServlet {
/* Available permanent internal and external redirects. */
private Map<String, String> redirects = new HashMap<String, String>();
- public RedirectServlet() {
+ @Override
+ public void init() throws ServletException {
/* Internal redirects: */
this.redirects.put("/metrics/graphs.html",
@@ -50,6 +52,7 @@ public class RedirectServlet extends HttpServlet {
"https://collector.torproject.org/#related-work");
}
+ @Override
public void doGet(HttpServletRequest request,
HttpServletResponse response) throws IOException, ServletException {
String redirect = this.redirects.get(request.getRequestURI());
diff --git a/website/src/org/torproject/metrics/web/TableServlet.java b/website/src/org/torproject/metrics/web/TableServlet.java
index beedfde..ad2b10a 100644
--- a/website/src/org/torproject/metrics/web/TableServlet.java
+++ b/website/src/org/torproject/metrics/web/TableServlet.java
@@ -1,7 +1,13 @@
/* Copyright 2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web;
+import org.torproject.metrics.web.graphs.RObjectGenerator;
+import org.torproject.metrics.web.graphs.TableParameterChecker;
+
+import org.apache.commons.lang.text.StrSubstitutor;
+
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
@@ -14,33 +20,31 @@ import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
-import org.apache.commons.lang.text.StrSubstitutor;
-import org.torproject.metrics.web.graphs.RObjectGenerator;
-import org.torproject.metrics.web.graphs.TableParameterChecker;
-
@SuppressWarnings("serial")
public class TableServlet extends MetricServlet {
- private RObjectGenerator rObjectGenerator;
+ private RObjectGenerator objectGenerator;
+ @Override
public void init() throws ServletException {
super.init();
- this.rObjectGenerator = (RObjectGenerator) getServletContext().
- getAttribute("RObjectGenerator");
+ this.objectGenerator = (RObjectGenerator) getServletContext()
+ .getAttribute("RObjectGenerator");
}
+ @Override
protected void doGet(HttpServletRequest request,
HttpServletResponse response) throws ServletException, IOException {
- String requestURI = request.getRequestURI();
- if (requestURI == null || !requestURI.endsWith(".html")) {
+ String requestUri = request.getRequestURI();
+ if (requestUri == null || !requestUri.endsWith(".html")) {
response.sendError(HttpServletResponse.SC_BAD_REQUEST);
return;
}
- String requestedId = requestURI.substring(
- requestURI.contains("/") ? requestURI.lastIndexOf("/") + 1 : 0,
- requestURI.length() - 5);
- if (!this.idsByType.containsKey("Table") ||
- !this.idsByType.get("Table").contains(requestedId)) {
+ String requestedId = requestUri.substring(
+ requestUri.contains("/") ? requestUri.lastIndexOf("/") + 1 : 0,
+ requestUri.length() - 5);
+ if (!this.idsByType.containsKey("Table")
+ || !this.idsByType.get("Table").contains(requestedId)) {
response.sendError(HttpServletResponse.SC_BAD_REQUEST);
return;
}
@@ -58,14 +62,14 @@ public class TableServlet extends MetricServlet {
Date defaultStartDate = new Date(defaultEndDate.getTime()
- 90L * 24L * 60L * 60L * 1000L);
if (this.parameters.containsKey(requestedId)) {
- Map<String, String[]> checkedParameters = TableParameterChecker.
- getInstance().checkParameters(requestedId,
+ Map<String, String[]> checkedParameters = TableParameterChecker
+ .getInstance().checkParameters(requestedId,
request.getParameterMap());
for (String parameter : this.parameters.get(requestedId)) {
if (parameter.equals("start") || parameter.equals("end")) {
String[] requestParameter;
- if (checkedParameters != null &&
- checkedParameters.containsKey(parameter)) {
+ if (checkedParameters != null
+ && checkedParameters.containsKey(parameter)) {
requestParameter = checkedParameters.get(parameter);
} else {
requestParameter = new String[] {
@@ -76,8 +80,8 @@ public class TableServlet extends MetricServlet {
}
}
}
- List<Map<String, String>> tableData = rObjectGenerator.
- generateTable(requestedId, request.getParameterMap(), true);
+ List<Map<String, String>> tableData = objectGenerator
+ .generateTable(requestedId, request.getParameterMap(), true);
List<List<String>> formattedTableData =
new ArrayList<List<String>>();
String[] contents = this.tableCellFormats.get(requestedId);
diff --git a/website/src/org/torproject/metrics/web/graphs/BubblesServlet.java b/website/src/org/torproject/metrics/web/graphs/BubblesServlet.java
index f273194..c990eac 100644
--- a/website/src/org/torproject/metrics/web/graphs/BubblesServlet.java
+++ b/website/src/org/torproject/metrics/web/graphs/BubblesServlet.java
@@ -1,5 +1,6 @@
-/* Copyright 2013 The Tor Project
+/* Copyright 2013--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web.graphs;
import java.io.IOException;
@@ -13,6 +14,7 @@ public class BubblesServlet extends HttpServlet {
private static final long serialVersionUID = -6011833075497881033L;
+ @Override
public void doGet(HttpServletRequest request,
HttpServletResponse response) throws IOException, ServletException {
diff --git a/website/src/org/torproject/metrics/web/graphs/Countries.java b/website/src/org/torproject/metrics/web/graphs/Countries.java
index 574fd0c..b0e2c88 100644
--- a/website/src/org/torproject/metrics/web/graphs/Countries.java
+++ b/website/src/org/torproject/metrics/web/graphs/Countries.java
@@ -1,5 +1,6 @@
-/* Copyright 2011, 2012 The Tor Project
+/* Copyright 2011--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web.graphs;
import java.util.ArrayList;
@@ -46,14 +47,14 @@ public class Countries {
this.knownCountries.add("bm;Bermuda".split(";"));
this.knownCountries.add("bt;Bhutan".split(";"));
this.knownCountries.add("bo;Bolivia".split(";"));
- this.knownCountries.add("bq;Bonaire, Sint Eustatius and Saba".
- split(";"));
+ this.knownCountries.add("bq;Bonaire, Sint Eustatius and Saba"
+ .split(";"));
this.knownCountries.add("ba;Bosnia and Herzegovina".split(";"));
this.knownCountries.add("bw;Botswana".split(";"));
this.knownCountries.add("bv;Bouvet Island".split(";"));
this.knownCountries.add("br;Brazil".split(";"));
- this.knownCountries.add("io;British Indian Ocean Territory".
- split(";"));
+ this.knownCountries.add("io;British Indian Ocean Territory"
+ .split(";"));
this.knownCountries.add("bn;Brunei".split(";"));
this.knownCountries.add("bg;Bulgaria".split(";"));
this.knownCountries.add("bf;Burkina Faso".split(";"));
@@ -72,8 +73,8 @@ public class Countries {
this.knownCountries.add("cc;Cocos (Keeling) Islands".split(";"));
this.knownCountries.add("co;Colombia".split(";"));
this.knownCountries.add("km;Comoros".split(";"));
- this.knownCountries.add("cd;Congo, The Democratic Republic of the".
- split(";"));
+ this.knownCountries.add("cd;Congo, The Democratic Republic of the"
+ .split(";"));
this.knownCountries.add("cg;Congo".split(";"));
this.knownCountries.add("ck;Cook Islands".split(";"));
this.knownCountries.add("cr;Costa Rica".split(";"));
@@ -119,8 +120,8 @@ public class Countries {
this.knownCountries.add("gw;Guinea-Bissau".split(";"));
this.knownCountries.add("gy;Guyana".split(";"));
this.knownCountries.add("ht;Haiti".split(";"));
- this.knownCountries.add("hm;Heard Island and McDonald Islands".
- split(";"));
+ this.knownCountries.add("hm;Heard Island and McDonald Islands"
+ .split(";"));
this.knownCountries.add("va;Vatican City".split(";"));
this.knownCountries.add("hn;Honduras".split(";"));
this.knownCountries.add("hk;Hong Kong".split(";"));
@@ -169,8 +170,8 @@ public class Countries {
this.knownCountries.add("mu;Mauritius".split(";"));
this.knownCountries.add("yt;Mayotte".split(";"));
this.knownCountries.add("mx;Mexico".split(";"));
- this.knownCountries.add("fm;Micronesia, Federated States of".
- split(";"));
+ this.knownCountries.add("fm;Micronesia, Federated States of"
+ .split(";"));
this.knownCountries.add("md;Moldova, Republic of".split(";"));
this.knownCountries.add("mc;Monaco".split(";"));
this.knownCountries.add("mn;Mongolia".split(";"));
@@ -217,12 +218,12 @@ public class Countries {
this.knownCountries.add("lc;Saint Lucia".split(";"));
this.knownCountries.add("mf;Saint Martin".split(";"));
this.knownCountries.add("pm;Saint Pierre and Miquelon".split(";"));
- this.knownCountries.add("vc;Saint Vincent and the Grenadines".
- split(";"));
+ this.knownCountries.add("vc;Saint Vincent and the Grenadines"
+ .split(";"));
this.knownCountries.add("ws;Samoa".split(";"));
this.knownCountries.add("sm;San Marino".split(";"));
- this.knownCountries.add("st:São Tomé and Príncipe".
- split(":"));
+ this.knownCountries.add("st:São Tomé and Príncipe"
+ .split(":"));
this.knownCountries.add("sa;Saudi Arabia".split(";"));
this.knownCountries.add("sn;Senegal".split(";"));
this.knownCountries.add("rs;Serbia".split(";"));
@@ -265,8 +266,8 @@ public class Countries {
this.knownCountries.add("ua;Ukraine".split(";"));
this.knownCountries.add("ae;United Arab Emirates".split(";"));
this.knownCountries.add("gb;United Kingdom".split(";"));
- this.knownCountries.add("um;United States Minor Outlying Islands".
- split(";"));
+ this.knownCountries.add("um;United States Minor Outlying Islands"
+ .split(";"));
this.knownCountries.add("us;United States".split(";"));
this.knownCountries.add("uy;Uruguay".split(";"));
this.knownCountries.add("uz;Uzbekistan".split(";"));
diff --git a/website/src/org/torproject/metrics/web/graphs/GraphImageServlet.java b/website/src/org/torproject/metrics/web/graphs/GraphImageServlet.java
index 08f256a..f39ab00 100644
--- a/website/src/org/torproject/metrics/web/graphs/GraphImageServlet.java
+++ b/website/src/org/torproject/metrics/web/graphs/GraphImageServlet.java
@@ -1,5 +1,6 @@
-/* Copyright 2011, 2012 The Tor Project
+/* Copyright 2011--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web.graphs;
import java.io.BufferedOutputStream;
@@ -19,16 +20,18 @@ public class GraphImageServlet extends HttpServlet {
private static final long serialVersionUID = -7356818641689744288L;
- private RObjectGenerator rObjectGenerator;
+ private RObjectGenerator objectGenerator;
+ @Override
public void init() {
/* Get a reference to the R object generator that we need to generate
* graph images. */
- this.rObjectGenerator = (RObjectGenerator) getServletContext().
- getAttribute("RObjectGenerator");
+ this.objectGenerator = (RObjectGenerator) getServletContext()
+ .getAttribute("RObjectGenerator");
}
+ @Override
public void doGet(HttpServletRequest request,
HttpServletResponse response) throws IOException,
ServletException {
@@ -37,21 +40,21 @@ public class GraphImageServlet extends HttpServlet {
* graph type and file type. */
String requestedGraph = request.getRequestURI();
String fileType = null;
- if (requestedGraph.endsWith(".png") ||
- requestedGraph.endsWith(".pdf") ||
- requestedGraph.endsWith(".svg")) {
+ if (requestedGraph.endsWith(".png")
+ || requestedGraph.endsWith(".pdf")
+ || requestedGraph.endsWith(".svg")) {
fileType = requestedGraph.substring(requestedGraph.length() - 3);
requestedGraph = requestedGraph.substring(0, requestedGraph.length()
- 4);
}
if (requestedGraph.contains("/")) {
- requestedGraph = requestedGraph.substring(requestedGraph.
- lastIndexOf("/") + 1);
+ requestedGraph = requestedGraph.substring(requestedGraph
+ .lastIndexOf("/") + 1);
}
/* Request graph from R object generator, which either returns it from
* its cache or asks Rserve to generate it. */
- RObject graph = rObjectGenerator.generateGraph(requestedGraph,
+ RObject graph = objectGenerator.generateGraph(requestedGraph,
fileType, request.getParameterMap(), true);
/* Make sure that we have a graph to return. */
@@ -61,13 +64,13 @@ public class GraphImageServlet extends HttpServlet {
}
/* Write graph bytes to response. */
- BufferedOutputStream output = null;
response.setContentType("image/" + fileType);
response.setHeader("Content-Length",
String.valueOf(graph.getBytes().length));
response.setHeader("Content-Disposition",
"inline; filename=\"" + graph.getFileName() + "\"");
- output = new BufferedOutputStream(response.getOutputStream(), 1024);
+ BufferedOutputStream output = new BufferedOutputStream(
+ response.getOutputStream(), 1024);
output.write(graph.getBytes(), 0, graph.getBytes().length);
output.flush();
output.close();
diff --git a/website/src/org/torproject/metrics/web/graphs/GraphParameterChecker.java b/website/src/org/torproject/metrics/web/graphs/GraphParameterChecker.java
index 5067789..b40885c 100644
--- a/website/src/org/torproject/metrics/web/graphs/GraphParameterChecker.java
+++ b/website/src/org/torproject/metrics/web/graphs/GraphParameterChecker.java
@@ -1,7 +1,11 @@
-/* Copyright 2011, 2012 The Tor Project
+/* Copyright 2011--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web.graphs;
+import org.torproject.metrics.web.Metric;
+import org.torproject.metrics.web.MetricsProvider;
+
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Arrays;
@@ -12,9 +16,6 @@ import java.util.Map;
import java.util.Set;
import java.util.TimeZone;
-import org.torproject.metrics.web.Metric;
-import org.torproject.metrics.web.MetricsProvider;
-
/**
* Checks request parameters passed to graph-generating servlets.
*/
@@ -79,19 +80,20 @@ public class GraphParameterChecker {
* of recognized parameters, or null if the graph type doesn't exist or
* the parameters are invalid.
*/
+ @SuppressWarnings("checkstyle:localvariablename")
public Map<String, String[]> checkParameters(String graphType,
Map requestParameters) {
/* Check if the graph type exists. */
- if (graphType == null ||
- !this.availableGraphs.containsKey(graphType)) {
+ if (graphType == null
+ || !this.availableGraphs.containsKey(graphType)) {
return null;
}
/* Find out which other parameters are supported by this graph type
* and parse them if they are given. */
- Set<String> supportedGraphParameters = new HashSet<String>(Arrays.
- asList(this.availableGraphs.get(graphType)));
+ Set<String> supportedGraphParameters = new HashSet<String>(
+ Arrays.asList(this.availableGraphs.get(graphType)));
Map<String, String[]> recognizedGraphParameters =
new HashMap<String, String[]>();
@@ -99,13 +101,13 @@ public class GraphParameterChecker {
* date is provided, set it to today. If no start date is provided,
* set it to 90 days before the end date. Make sure that start date
* precedes end date. */
- if (supportedGraphParameters.contains("start") ||
- supportedGraphParameters.contains("end")) {
+ if (supportedGraphParameters.contains("start")
+ || supportedGraphParameters.contains("end")) {
String[] startParameter = (String[]) requestParameters.get("start");
String[] endParameter = (String[]) requestParameters.get("end");
long endTimestamp = System.currentTimeMillis();
- if (endParameter != null && endParameter.length > 0 &&
- endParameter[0].length() > 0) {
+ if (endParameter != null && endParameter.length > 0
+ && endParameter[0].length() > 0) {
try {
endTimestamp = dateFormat.parse(endParameter[0]).getTime();
} catch (ParseException e) {
@@ -117,8 +119,8 @@ public class GraphParameterChecker {
}
endParameter = new String[] { dateFormat.format(endTimestamp) };
long startTimestamp = endTimestamp - 90L * 24L * 60L * 60L * 1000L;
- if (startParameter != null && startParameter.length > 0 &&
- startParameter[0].length() > 0) {
+ if (startParameter != null && startParameter.length > 0
+ && startParameter[0].length() > 0) {
try {
startTimestamp = dateFormat.parse(startParameter[0]).getTime();
} catch (ParseException e) {
@@ -130,7 +132,7 @@ public class GraphParameterChecker {
}
startParameter = new String[] { dateFormat.format(startTimestamp) };
if (startTimestamp > endTimestamp) {
- return null;
+ return null;
}
recognizedGraphParameters.put("start", startParameter);
recognizedGraphParameters.put("end", endParameter);
@@ -145,8 +147,8 @@ public class GraphParameterChecker {
this.knownParameterValues.get("flag").split(","));
if (flagParameters != null) {
for (String flag : flagParameters) {
- if (flag == null || flag.length() == 0 ||
- !knownFlags.contains(flag)) {
+ if (flag == null || flag.length() == 0
+ || !knownFlags.contains(flag)) {
return null;
}
}
@@ -168,8 +170,8 @@ public class GraphParameterChecker {
return null;
}
for (String country : countryParameters) {
- if (country == null || country.length() == 0 ||
- !knownCountries.contains(country)) {
+ if (country == null || country.length() == 0
+ || !knownCountries.contains(country)) {
return null;
}
}
@@ -188,9 +190,9 @@ public class GraphParameterChecker {
List<String> knownRanges = Arrays.asList(
this.knownParameterValues.get("events").split(","));
if (eventsParameter != null) {
- if (eventsParameter.length != 1 ||
- eventsParameter[0].length() == 0 ||
- !knownRanges.contains(eventsParameter[0])) {
+ if (eventsParameter.length != 1
+ || eventsParameter[0].length() == 0
+ || !knownRanges.contains(eventsParameter[0])) {
return null;
}
} else {
@@ -211,8 +213,8 @@ public class GraphParameterChecker {
if (sourceParameter.length != 1) {
return null;
}
- if (sourceParameter[0].length() == 0 ||
- !knownSources.contains(sourceParameter[0])) {
+ if (sourceParameter[0].length() == 0
+ || !knownSources.contains(sourceParameter[0])) {
return null;
}
} else {
@@ -233,8 +235,8 @@ public class GraphParameterChecker {
if (filesizeParameter.length != 1) {
return null;
}
- if (filesizeParameter[0].length() == 0 ||
- !knownFilesizes.contains(filesizeParameter[0])) {
+ if (filesizeParameter[0].length() == 0
+ || !knownFilesizes.contains(filesizeParameter[0])) {
return null;
}
} else {
@@ -252,8 +254,8 @@ public class GraphParameterChecker {
this.knownParameterValues.get("transport").split(","));
if (transportParameters != null) {
for (String transport : transportParameters) {
- if (transport == null || transport.length() == 0 ||
- !knownTransports.contains(transport)) {
+ if (transport == null || transport.length() == 0
+ || !knownTransports.contains(transport)) {
return null;
}
}
@@ -275,8 +277,8 @@ public class GraphParameterChecker {
return null;
}
for (String version : versionParameters) {
- if (version == null || version.length() == 0 ||
- !knownVersions.contains(version)) {
+ if (version == null || version.length() == 0
+ || !knownVersions.contains(version)) {
return null;
}
}
diff --git a/website/src/org/torproject/metrics/web/graphs/RObject.java b/website/src/org/torproject/metrics/web/graphs/RObject.java
index db8f362..a5562df 100644
--- a/website/src/org/torproject/metrics/web/graphs/RObject.java
+++ b/website/src/org/torproject/metrics/web/graphs/RObject.java
@@ -1,22 +1,31 @@
-/* Copyright 2011, 2012 The Tor Project
+/* Copyright 2011--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web.graphs;
public class RObject {
+
private byte[] bytes;
+
private String fileName;
+
private long lastModified;
+
+ /** Initializes an R object. */
public RObject(byte[] bytes, String fileName, long lastModified) {
this.bytes = bytes;
this.fileName = fileName;
this.lastModified = lastModified;
}
+
public String getFileName() {
return this.fileName;
}
+
public byte[] getBytes() {
return this.bytes;
}
+
public long getLastModified() {
return this.lastModified;
}
diff --git a/website/src/org/torproject/metrics/web/graphs/RObjectGenerator.java b/website/src/org/torproject/metrics/web/graphs/RObjectGenerator.java
index fb7d7b0..526e3d3 100644
--- a/website/src/org/torproject/metrics/web/graphs/RObjectGenerator.java
+++ b/website/src/org/torproject/metrics/web/graphs/RObjectGenerator.java
@@ -1,7 +1,14 @@
-/* Copyright 2011, 2012 The Tor Project
+/* Copyright 2011--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web.graphs;
+import org.torproject.metrics.web.Metric;
+import org.torproject.metrics.web.MetricsProvider;
+
+import org.rosuda.REngine.Rserve.RConnection;
+import org.rosuda.REngine.Rserve.RserveException;
+
import java.io.BufferedInputStream;
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
@@ -21,23 +28,23 @@ import javax.servlet.ServletContext;
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
-import org.rosuda.REngine.Rserve.RConnection;
-import org.rosuda.REngine.Rserve.RserveException;
-import org.torproject.metrics.web.Metric;
-import org.torproject.metrics.web.MetricsProvider;
-
public class RObjectGenerator implements ServletContextListener {
/* Host and port where Rserve is listening. */
private String rserveHost;
+
private int rservePort;
/* Some parameters for our cache of graph images. */
private String cachedGraphsDirectory;
+
private long maxCacheAge;
- private Map<String, Metric> availableGraphs, availableTables;
+ private Map<String, Metric> availableGraphs;
+ private Map<String, Metric> availableTables;
+
+ @Override
public void contextInitialized(ServletContextEvent event) {
/* Initialize using context parameters. */
@@ -53,7 +60,8 @@ public class RObjectGenerator implements ServletContextListener {
this.availableGraphs = new LinkedHashMap<String, Metric>();
this.availableTables = new LinkedHashMap<String, Metric>();
for (Metric metric : MetricsProvider.getInstance().getMetricsList()) {
- String type = metric.getType(), id = metric.getId();
+ String type = metric.getType();
+ String id = metric.getId();
if ("Graph".equals(type)) {
this.availableGraphs.put(id, metric);
} else if ("Table".equals(type)) {
@@ -66,14 +74,17 @@ public class RObjectGenerator implements ServletContextListener {
/* Periodically generate R objects with default parameters. */
new Thread() {
+ @Override
public void run() {
- long lastUpdated = 0L, sleep;
+ long lastUpdated = 0L;
+ long sleep;
while (true) {
while ((sleep = maxCacheAge * 1000L / 2L + lastUpdated
- System.currentTimeMillis()) > 0L) {
try {
Thread.sleep(sleep);
} catch (InterruptedException e) {
+ /* Nothing we can handle. */
}
}
for (String tableId : availableTables.keySet()) {
@@ -84,111 +95,122 @@ public class RObjectGenerator implements ServletContextListener {
}
lastUpdated = System.currentTimeMillis();
}
- };
+ }
}.start();
}
+ @Override
public void contextDestroyed(ServletContextEvent event) {
/* Nothing to do. */
}
+ /** Generates a graph of the given type, given image file type, and with
+ * the given parameters, possibly after checking whether the cache
+ * already contains that graph. */
public RObject generateGraph(String requestedGraph, String fileType,
Map parameterMap, boolean checkCache) {
- if (!this.availableGraphs.containsKey(requestedGraph) ||
- this.availableGraphs.get(requestedGraph).getFunction() == null) {
+ if (!this.availableGraphs.containsKey(requestedGraph)
+ || this.availableGraphs.get(requestedGraph).getFunction()
+ == null) {
return null;
}
- String function = this.availableGraphs.get(requestedGraph).
- getFunction();
- Map<String, String[]> checkedParameters = GraphParameterChecker.
- getInstance().checkParameters(requestedGraph, parameterMap);
+ String function = this.availableGraphs.get(requestedGraph)
+ .getFunction();
+ Map<String, String[]> checkedParameters = GraphParameterChecker
+ .getInstance().checkParameters(requestedGraph, parameterMap);
if (checkedParameters == null) {
return null;
}
- StringBuilder
- rQueryBuilder = new StringBuilder().append(function).append("("),
- imageFilenameBuilder = new StringBuilder(requestedGraph);
- for (Map.Entry<String, String[]> parameter :
- checkedParameters.entrySet()) {
+ StringBuilder queryBuilder =
+ new StringBuilder().append(function).append("(");
+ StringBuilder imageFilenameBuilder =
+ new StringBuilder(requestedGraph);
+ for (Map.Entry<String, String[]> parameter
+ : checkedParameters.entrySet()) {
String parameterName = parameter.getKey();
String[] parameterValues = parameter.getValue();
for (String param : parameterValues) {
imageFilenameBuilder.append("-" + param);
}
if (parameterValues.length < 2) {
- rQueryBuilder.append(parameterName + " = '" + parameterValues[0]
+ queryBuilder.append(parameterName + " = '" + parameterValues[0]
+ "', ");
} else {
- rQueryBuilder.append(parameterName + " = c(");
+ queryBuilder.append(parameterName + " = c(");
for (int i = 0; i < parameterValues.length - 1; i++) {
- rQueryBuilder.append("'" + parameterValues[i] + "', ");
+ queryBuilder.append("'" + parameterValues[i] + "', ");
}
- rQueryBuilder.append("'" + parameterValues[
+ queryBuilder.append("'" + parameterValues[
parameterValues.length - 1] + "'), ");
}
}
imageFilenameBuilder.append("." + fileType);
String imageFilename = imageFilenameBuilder.toString();
- rQueryBuilder.append("path = '%s')");
- String rQuery = rQueryBuilder.toString();
+ queryBuilder.append("path = '%s')");
+ String query = queryBuilder.toString();
File imageFile = new File(this.cachedGraphsDirectory + "/"
+ imageFilename);
- return this.generateRObject(rQuery, imageFile, imageFilename,
+ return this.generateObject(query, imageFile, imageFilename,
checkCache);
}
+ /** Generates a table of the given type and with the given parameters,
+ * possibly after checking whether the cache already contains that
+ * table. */
public List<Map<String, String>> generateTable(String requestedTable,
Map parameterMap, boolean checkCache) {
- if (!this.availableTables.containsKey(requestedTable) ||
- this.availableTables.get(requestedTable).getFunction() == null) {
+ if (!this.availableTables.containsKey(requestedTable)
+ || this.availableTables.get(requestedTable).getFunction()
+ == null) {
return null;
}
- String function = this.availableTables.get(requestedTable).
- getFunction();
- Map<String, String[]> checkedParameters = TableParameterChecker.
- getInstance().checkParameters(requestedTable, parameterMap);
+ String function = this.availableTables.get(requestedTable)
+ .getFunction();
+ Map<String, String[]> checkedParameters = TableParameterChecker
+ .getInstance().checkParameters(requestedTable, parameterMap);
if (checkedParameters == null) {
return null;
}
- StringBuilder
- rQueryBuilder = new StringBuilder().append(function).append("("),
- tableFilenameBuilder = new StringBuilder(requestedTable);
- for (Map.Entry<String, String[]> parameter :
- checkedParameters.entrySet()) {
+ StringBuilder queryBuilder = new StringBuilder().append(function)
+ .append("(");
+ StringBuilder tableFilenameBuilder = new StringBuilder(
+ requestedTable);
+ for (Map.Entry<String, String[]> parameter
+ : checkedParameters.entrySet()) {
String parameterName = parameter.getKey();
String[] parameterValues = parameter.getValue();
for (String param : parameterValues) {
tableFilenameBuilder.append("-" + param);
}
if (parameterValues.length < 2) {
- rQueryBuilder.append(parameterName + " = '"
+ queryBuilder.append(parameterName + " = '"
+ parameterValues[0] + "', ");
} else {
- rQueryBuilder.append(parameterName + " = c(");
+ queryBuilder.append(parameterName + " = c(");
for (int i = 0; i < parameterValues.length - 1; i++) {
- rQueryBuilder.append("'" + parameterValues[i] + "', ");
+ queryBuilder.append("'" + parameterValues[i] + "', ");
}
- rQueryBuilder.append("'" + parameterValues[
+ queryBuilder.append("'" + parameterValues[
parameterValues.length - 1] + "'), ");
}
}
tableFilenameBuilder.append(".tbl");
String tableFilename = tableFilenameBuilder.toString();
- rQueryBuilder.append("path = '%s')");
- String rQuery = rQueryBuilder.toString();
- return this.generateTable(rQuery, tableFilename, checkCache);
+ queryBuilder.append("path = '%s')");
+ String query = queryBuilder.toString();
+ return this.generateTable(query, tableFilename, checkCache);
}
/* Generate table data using the given R query and filename or read
* previously generated table data from disk if it's not too old and
* return table data. */
- private List<Map<String, String>> generateTable(String rQuery,
+ private List<Map<String, String>> generateTable(String query,
String tableFilename, boolean checkCache) {
/* See if we need to generate this table. */
File tableFile = new File(this.cachedGraphsDirectory + "/"
+ tableFilename);
- byte[] tableBytes = this.generateRObject(rQuery, tableFile,
+ byte[] tableBytes = this.generateObject(query, tableFile,
tableFilename, checkCache).getBytes();
/* Write the table content to a map. */
@@ -223,47 +245,48 @@ public class RObjectGenerator implements ServletContextListener {
/* Generate an R object in a separate worker thread, or wait for an
* already running worker thread to finish and get its result. */
- private RObject generateRObject(String rQuery, File rObjectFile,
+ private RObject generateObject(String query, File objectFile,
String fileName, boolean checkCache) {
RObjectGeneratorWorker worker = null;
- synchronized (this.rObjectGeneratorThreads) {
- if (this.rObjectGeneratorThreads.containsKey(rQuery)) {
- worker = this.rObjectGeneratorThreads.get(rQuery);
+ synchronized (this.objectGeneratorThreads) {
+ if (this.objectGeneratorThreads.containsKey(query)) {
+ worker = this.objectGeneratorThreads.get(query);
} else {
- worker = new RObjectGeneratorWorker(rQuery, rObjectFile,
- fileName, checkCache);
- this.rObjectGeneratorThreads.put(rQuery, worker);
+ worker = new RObjectGeneratorWorker(query, objectFile, fileName,
+ checkCache);
+ this.objectGeneratorThreads.put(query, worker);
worker.start();
}
}
try {
worker.join();
} catch (InterruptedException e) {
+ /* Nothing we can handle here. */
}
- synchronized (this.rObjectGeneratorThreads) {
- if (this.rObjectGeneratorThreads.containsKey(rQuery) &&
- this.rObjectGeneratorThreads.get(rQuery) == worker) {
- this.rObjectGeneratorThreads.remove(rQuery);
+ synchronized (this.objectGeneratorThreads) {
+ if (this.objectGeneratorThreads.containsKey(query)
+ && this.objectGeneratorThreads.get(query) == worker) {
+ this.objectGeneratorThreads.remove(query);
}
}
return worker.getRObject();
}
- private Map<String, RObjectGeneratorWorker> rObjectGeneratorThreads =
+ private Map<String, RObjectGeneratorWorker> objectGeneratorThreads =
new HashMap<String, RObjectGeneratorWorker>();
private class RObjectGeneratorWorker extends Thread {
- private String rQuery;
- private File rObjectFile;
+ private String query;
+ private File objectFile;
private String fileName;
private boolean checkCache;
private RObject result = null;
- public RObjectGeneratorWorker(String rQuery, File rObjectFile,
+ public RObjectGeneratorWorker(String query, File objectFile,
String fileName, boolean checkCache) {
- this.rQuery = rQuery;
- this.rObjectFile = rObjectFile;
+ this.query = query;
+ this.objectFile = objectFile;
this.fileName = fileName;
this.checkCache = checkCache;
}
@@ -272,35 +295,36 @@ public class RObjectGenerator implements ServletContextListener {
/* See if we need to generate this R object. */
long now = System.currentTimeMillis();
- if (!this.checkCache || !this.rObjectFile.exists() ||
- this.rObjectFile.lastModified() < now - maxCacheAge * 1000L) {
+ if (!this.checkCache || !this.objectFile.exists()
+ || this.objectFile.lastModified()
+ < now - maxCacheAge * 1000L) {
/* We do. Update the R query to contain the absolute path to the
* file to be generated, create a connection to Rserve, run the R
* query, and close the connection. The generated object will be
* on disk. */
- this.rQuery = String.format(this.rQuery,
- this.rObjectFile.getAbsolutePath());
+ this.query = String.format(this.query,
+ this.objectFile.getAbsolutePath());
try {
RConnection rc = new RConnection(rserveHost, rservePort);
- rc.eval(this.rQuery);
+ rc.eval(this.query);
rc.close();
} catch (RserveException e) {
return;
}
/* Check that we really just generated the R object. */
- if (!this.rObjectFile.exists() || this.rObjectFile.lastModified()
+ if (!this.objectFile.exists() || this.objectFile.lastModified()
< now - maxCacheAge * 1000L) {
return;
}
}
/* Read the R object from disk and write it to a byte array. */
- long lastModified = this.rObjectFile.lastModified();
+ long lastModified = this.objectFile.lastModified();
try {
BufferedInputStream bis = new BufferedInputStream(
- new FileInputStream(this.rObjectFile), 1024);
+ new FileInputStream(this.objectFile), 1024);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte[] buffer = new byte[1024];
int length;
diff --git a/website/src/org/torproject/metrics/web/graphs/TableParameterChecker.java b/website/src/org/torproject/metrics/web/graphs/TableParameterChecker.java
index c92393b..b441ab6 100644
--- a/website/src/org/torproject/metrics/web/graphs/TableParameterChecker.java
+++ b/website/src/org/torproject/metrics/web/graphs/TableParameterChecker.java
@@ -1,7 +1,11 @@
-/* Copyright 2011, 2012 The Tor Project
+/* Copyright 2011--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web.graphs;
+import org.torproject.metrics.web.Metric;
+import org.torproject.metrics.web.MetricsProvider;
+
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Arrays;
@@ -11,9 +15,6 @@ import java.util.Map;
import java.util.Set;
import java.util.TimeZone;
-import org.torproject.metrics.web.Metric;
-import org.torproject.metrics.web.MetricsProvider;
-
/**
* Checks request parameters passed to generate tables.
*/
@@ -62,15 +63,15 @@ public class TableParameterChecker {
Map requestParameters) {
/* Check if the table type exists. */
- if (tableType == null ||
- !this.availableTables.containsKey(tableType)) {
+ if (tableType == null
+ || !this.availableTables.containsKey(tableType)) {
return null;
}
/* Find out which other parameters are supported by this table type
* and parse them if they are given. */
- Set<String> supportedTableParameters = new HashSet<String>(Arrays.
- asList(this.availableTables.get(tableType)));
+ Set<String> supportedTableParameters = new HashSet<String>(
+ Arrays.asList(this.availableTables.get(tableType)));
Map<String, String[]> recognizedTableParameters =
new HashMap<String, String[]>();
@@ -78,8 +79,8 @@ public class TableParameterChecker {
* date is provided, set it to today. If no start date is provided,
* set it to 90 days before the end date. Make sure that start date
* precedes end date. */
- if (supportedTableParameters.contains("start") ||
- supportedTableParameters.contains("end")) {
+ if (supportedTableParameters.contains("start")
+ || supportedTableParameters.contains("end")) {
String[] startParameter = null;
String[] endParameter = null;
if (requestParameters != null) {
@@ -87,8 +88,8 @@ public class TableParameterChecker {
endParameter = (String[]) requestParameters.get("end");
}
long endTimestamp = System.currentTimeMillis();
- if (endParameter != null && endParameter.length > 0 &&
- endParameter[0].length() > 0) {
+ if (endParameter != null && endParameter.length > 0
+ && endParameter[0].length() > 0) {
try {
endTimestamp = dateFormat.parse(endParameter[0]).getTime();
} catch (ParseException e) {
@@ -100,8 +101,8 @@ public class TableParameterChecker {
}
endParameter = new String[] { dateFormat.format(endTimestamp) };
long startTimestamp = endTimestamp - 90L * 24L * 60L * 60L * 1000L;
- if (startParameter != null && startParameter.length > 0 &&
- startParameter[0].length() > 0) {
+ if (startParameter != null && startParameter.length > 0
+ && startParameter[0].length() > 0) {
try {
startTimestamp = dateFormat.parse(startParameter[0]).getTime();
} catch (ParseException e) {
@@ -113,7 +114,7 @@ public class TableParameterChecker {
}
startParameter = new String[] { dateFormat.format(startTimestamp) };
if (startTimestamp > endTimestamp) {
- return null;
+ return null;
}
recognizedTableParameters.put("start", startParameter);
recognizedTableParameters.put("end", endParameter);
diff --git a/website/src/org/torproject/metrics/web/research/ResearchStatsServlet.java b/website/src/org/torproject/metrics/web/research/ResearchStatsServlet.java
index f1d0ad4..7ddeff1 100644
--- a/website/src/org/torproject/metrics/web/research/ResearchStatsServlet.java
+++ b/website/src/org/torproject/metrics/web/research/ResearchStatsServlet.java
@@ -1,5 +1,6 @@
-/* Copyright 2013 The Tor Project
+/* Copyright 2013--2016 The Tor Project
* See LICENSE for licensing information */
+
package org.torproject.metrics.web.research;
import java.io.BufferedInputStream;
@@ -24,6 +25,7 @@ public class ResearchStatsServlet extends HttpServlet {
private SortedSet<String> availableStatisticsFiles;
+ @Override
public void init(ServletConfig config) throws ServletException {
super.init(config);
this.statsDir = new File(config.getInitParameter("statsDir"));
@@ -40,6 +42,7 @@ public class ResearchStatsServlet extends HttpServlet {
this.availableStatisticsFiles.add("disagreement");
}
+ @Override
public long getLastModified(HttpServletRequest request) {
File statsFile = this.determineStatsFile(request);
if (statsFile == null || !statsFile.exists()) {
@@ -49,10 +52,11 @@ public class ResearchStatsServlet extends HttpServlet {
}
}
+ @Override
public void doGet(HttpServletRequest request,
HttpServletResponse response) throws IOException, ServletException {
- String requestURI = request.getRequestURI();
- if (requestURI.equals("/metrics/stats/")) {
+ String requestUri = request.getRequestURI();
+ if (requestUri.equals("/metrics/stats/")) {
this.writeDirectoryListing(request, response);
} else {
File statsFile = this.determineStatsFile(request);
commit edc089e5af06eebf6a5b1d84d278082746a22c48
Author: David Fifield <david(a)bamsoftware.com>
Date: Thu Jul 21 16:04:54 2016 -0700
Link to #18904.
---
meek-client-torbrowser/mac.go | 1 +
1 file changed, 1 insertion(+)
diff --git a/meek-client-torbrowser/mac.go b/meek-client-torbrowser/mac.go
index de946be..31f57f8 100644
--- a/meek-client-torbrowser/mac.go
+++ b/meek-client-torbrowser/mac.go
@@ -9,6 +9,7 @@ const (
// During startup of meek-client-torbrowser, the browser profile is
// created and maintained under firefoxProfilePath by making a
// recursive copy of everything under profileTemplatePath.
+ // https://bugs.torproject.org/18904
firefoxPath = "../firefox"
firefoxProfilePath = "../../../../TorBrowser-Data/Tor/PluggableTransports/profile.meek-http-helper"
profileTemplatePath = "../../Resources/TorBrowser/Tor/PluggableTransports/template-profile.meek-http-helper"

[meek/master] Bug 18371: symlinks incompatible with Gatekeeper signing
by dcf@torproject.org 21 Jul '16
commit f256d56828e7246326fac6cb9b74d8c8c8ef0775
Author: Kathy Brade <brade(a)pearlcrescent.com>
Date: Mon Mar 7 12:36:28 2016 -0500
Bug 18371: symlinks incompatible with Gatekeeper signing
Use the regular Tor Browser instead of a symlinked copy and pass a
--invisible option to firefox. Tor Browser will be patched to
recognize that flag and hide the Mac OS dock icon as soon as possible.
Also, fix meek-client-torbrowser's embedded paths to match Tor Browser's
new Mac OS directory structure and create the meek-http-helper browser
profile on-the-fly by copying files from a template.
---
meek-client-torbrowser/linux.go | 5 +-
meek-client-torbrowser/mac.go | 11 +--
meek-client-torbrowser/meek-client-torbrowser.go | 105 ++++++++++++++++++++++-
meek-client-torbrowser/windows.go | 5 +-
4 files changed, 115 insertions(+), 11 deletions(-)
diff --git a/meek-client-torbrowser/linux.go b/meek-client-torbrowser/linux.go
index 7a85d82..c95f264 100644
--- a/meek-client-torbrowser/linux.go
+++ b/meek-client-torbrowser/linux.go
@@ -6,6 +6,7 @@
package main
const (
- firefoxPath = "./firefox"
- firefoxProfilePath = "TorBrowser/Data/Browser/profile.meek-http-helper"
+ firefoxPath = "./firefox"
+ firefoxProfilePath = "TorBrowser/Data/Browser/profile.meek-http-helper"
+ profileTemplatePath = ""
)
diff --git a/meek-client-torbrowser/mac.go b/meek-client-torbrowser/mac.go
index 7eee72d..a2be44c 100644
--- a/meek-client-torbrowser/mac.go
+++ b/meek-client-torbrowser/mac.go
@@ -6,9 +6,10 @@
package main
const (
- // The TorBrowser.app.meek-http-helper directory is a special case for
- // the mac bundle. It is a copy of TorBrowser.app that has a modified
- // Info.plist file so that it doesn't show a dock icon.
- firefoxPath = "PluggableTransports/TorBrowser.app.meek-http-helper/Contents/MacOS/firefox"
- firefoxProfilePath = "../Data/Browser/profile.meek-http-helper"
+ // During startup of meek-client-torbrowser, the browser profile is
+ // created under firefoxProfilePath if it does not exist by making a
+ // recursive copy of everything under profileTemplatePath.
+ firefoxPath = "../firefox"
+ firefoxProfilePath = "../../../../TorBrowser-Data/Tor/PluggableTransports/profile.meek-http-helper"
+ profileTemplatePath = "../../Resources/TorBrowser/Tor/PluggableTransports/template-profile.meek-http-helper"
)
diff --git a/meek-client-torbrowser/meek-client-torbrowser.go b/meek-client-torbrowser/meek-client-torbrowser.go
index 605bc85..8a3809d 100644
--- a/meek-client-torbrowser/meek-client-torbrowser.go
+++ b/meek-client-torbrowser/meek-client-torbrowser.go
@@ -32,6 +32,7 @@ import (
"os/signal"
"path/filepath"
"regexp"
+ "strings"
"syscall"
)
@@ -63,15 +64,115 @@ func logSignal(p *os.Process, sig os.Signal) error {
return err
}
+func copyFile(srcPath string, mode os.FileMode, destPath string) error {
+ inFile, err := os.Open(srcPath)
+ if err != nil {
+ return err
+ }
+
+ defer inFile.Close()
+ outFile, err := os.OpenFile(destPath, os.O_CREATE|os.O_WRONLY|os.O_TRUNC, mode)
+ if err != nil {
+ return err
+ }
+
+ // Always close the destination file before returning.
+ defer func() {
+ closeErr := outFile.Close()
+ if err == nil {
+ err = closeErr
+ }
+ }()
+
+ if _, err = io.Copy(outFile, inFile); err != nil {
+ return err
+ }
+ err = outFile.Sync()
+ return err
+}
+
+// Make sure that the browser profile exists. If it does not exist and if
+// profileTemplatePath is not empty, create it by making a recursive copy of
+// all the files and directories under profileTemplatePath. A safe copy is
+// done by first copying the profile files into a temporary directory and
+// then doing an atomic rename of the temporary directory as the last step.
+func ensureProfileExists(profilePath string) error {
+ _, err := os.Stat(profilePath)
+ if err == nil || os.IsExist(err) {
+ return nil // The profile has already been created.
+ }
+
+ // If profileTemplatePath is not set, we are running on a platform that
+ // expects the profile to already exist.
+ if (profileTemplatePath == "") {
+ return err;
+ }
+
+ log.Printf("creating profile by copying files from %s to %s\n", profileTemplatePath, profilePath)
+ tmpPath, err := ioutil.TempDir(filepath.Dir(profilePath), "tmpMeekProfile")
+ if err != nil {
+ return err
+ }
+ err = os.MkdirAll(tmpPath, os.ModePerm)
+ if err != nil {
+ return err
+ }
+
+ // Remove the temporary directory before returning.
+ defer func() {
+ os.RemoveAll(tmpPath);
+ }()
+
+ templatePath, err := filepath.Abs(profileTemplatePath)
+ if err != nil {
+ return err
+ }
+
+ visit := func(path string, info os.FileInfo, err error) error {
+ relativePath := strings.TrimPrefix(path, templatePath)
+ if (relativePath == "") {
+ return nil // skip the root directory
+ }
+
+ // If relativePath is a directory, create it; if it is a file, copy it.
+ destPath := filepath.Join(tmpPath, relativePath);
+ if (info.IsDir()) {
+ err = os.MkdirAll(destPath, info.Mode())
+ } else {
+ err = copyFile(path, info.Mode(), destPath)
+ }
+
+ return err
+ }
+
+ err = filepath.Walk(templatePath, visit)
+ if err != nil {
+ return err
+ }
+
+ return os.Rename(tmpPath, profilePath);
+}
+
+
// Run firefox and return its exec.Cmd and stdout pipe.
func runFirefox() (cmd *exec.Cmd, stdout io.Reader, err error) {
+ // Mac OS X needs absolute paths for firefox and for the profile.
+ var absFirefoxPath string
+ absFirefoxPath, err = filepath.Abs(firefoxPath)
+ if err != nil {
+ return
+ }
var profilePath string
- // Mac OS X needs an absolute profile path.
profilePath, err = filepath.Abs(firefoxProfilePath)
if err != nil {
return
}
- cmd = exec.Command(firefoxPath, "-no-remote", "-profile", profilePath)
+ err = ensureProfileExists(profilePath)
+ if err != nil {
+ return
+ }
+
+ cmd = exec.Command(absFirefoxPath, "--invisible", "-no-remote", "-profile", profilePath)
cmd.Stderr = os.Stderr
stdout, err = cmd.StdoutPipe()
if err != nil {
diff --git a/meek-client-torbrowser/windows.go b/meek-client-torbrowser/windows.go
index cc69bec..5d87973 100644
--- a/meek-client-torbrowser/windows.go
+++ b/meek-client-torbrowser/windows.go
@@ -6,6 +6,7 @@
package main
const (
- firefoxPath = "./firefox.exe"
- firefoxProfilePath = "TorBrowser/Data/Browser/profile.meek-http-helper"
+ firefoxPath = "./firefox.exe"
+ firefoxProfilePath = "TorBrowser/Data/Browser/profile.meek-http-helper"
+ profileTemplatePath = ""
)
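
The Bug 18371 commit above builds the meek-http-helper profile by copying the template into a temporary directory and then renaming that directory into place, so a partially copied profile is never visible. Purely as an illustration, here is a compact, self-contained sketch of the same copy-then-atomic-rename idiom; the paths and the helper name installProfile are my own, not taken from the patch, and the real code additionally syncs each copied file to disk.

package main

import (
	"io"
	"io/ioutil"
	"log"
	"os"
	"path/filepath"
	"strings"
)

// installProfile copies templateDir into profileDir by way of a temporary
// directory next to profileDir; the final os.Rename publishes either the
// whole copy or nothing.
func installProfile(templateDir, profileDir string) error {
	tmpPath, err := ioutil.TempDir(filepath.Dir(profileDir), "tmpProfile")
	if err != nil {
		return err
	}
	// Clean up the staging directory; after a successful rename this is a no-op.
	defer os.RemoveAll(tmpPath)

	err = filepath.Walk(templateDir, func(path string, info os.FileInfo, err error) error {
		if err != nil {
			return err
		}
		rel := strings.TrimPrefix(path, templateDir)
		if rel == "" {
			return nil // skip the root directory itself
		}
		destPath := filepath.Join(tmpPath, rel)
		if info.IsDir() {
			return os.MkdirAll(destPath, info.Mode())
		}
		src, err := os.Open(path)
		if err != nil {
			return err
		}
		defer src.Close()
		dest, err := os.OpenFile(destPath, os.O_CREATE|os.O_WRONLY|os.O_TRUNC, info.Mode())
		if err != nil {
			return err
		}
		defer dest.Close()
		_, err = io.Copy(dest, src)
		return err
	})
	if err != nil {
		return err
	}
	return os.Rename(tmpPath, profileDir)
}

func main() {
	if err := installProfile("template-profile.meek-http-helper",
		"profile.meek-http-helper"); err != nil {
		log.Fatal(err)
	}
}

The rename is only atomic on a single filesystem, which is why both the patch and this sketch create the temporary directory next to the final profile rather than in the system temp directory.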

[meek/master] Bug 18904: Mac OS meek-http-helper profile not updated
by dcf@torproject.org 21 Jul '16
commit cca1ed4a2b7401abe2fb42e22f8f22a3bfe091fd
Author: Kathy Brade <brade(a)pearlcrescent.com>
Date: Thu May 12 11:31:40 2016 -0400
Bug 18904: Mac OS meek-http-helper profile not updated
To ensure that the meek-http-helper profile is up-to-date with respect
to the template (e.g., after Tor Browser has been updated), the
contents of the file meek-template-sha256sum.txt within the profile
are compared with the corresponding template file; if they differ, the
profile is deleted and recreated.
---
meek-client-torbrowser/mac.go | 2 +-
meek-client-torbrowser/meek-client-torbrowser.go | 50 ++++++++++++++++++++----
2 files changed, 44 insertions(+), 8 deletions(-)
diff --git a/meek-client-torbrowser/mac.go b/meek-client-torbrowser/mac.go
index a2be44c..de946be 100644
--- a/meek-client-torbrowser/mac.go
+++ b/meek-client-torbrowser/mac.go
@@ -7,7 +7,7 @@ package main
const (
// During startup of meek-client-torbrowser, the browser profile is
- // created under firefoxProfilePath if it does not exist by making a
+ // created and maintained under firefoxProfilePath by making a
// recursive copy of everything under profileTemplatePath.
firefoxPath = "../firefox"
firefoxProfilePath = "../../../../TorBrowser-Data/Tor/PluggableTransports/profile.meek-http-helper"
diff --git a/meek-client-torbrowser/meek-client-torbrowser.go b/meek-client-torbrowser/meek-client-torbrowser.go
index 8647d5d..a98fcb5 100644
--- a/meek-client-torbrowser/meek-client-torbrowser.go
+++ b/meek-client-torbrowser/meek-client-torbrowser.go
@@ -22,6 +22,7 @@ package main
import (
"bufio"
+ "bytes"
"flag"
"fmt"
"io"
@@ -91,23 +92,42 @@ func copyFile(srcPath string, mode os.FileMode, destPath string) error {
return err
}
-// Make sure that the browser profile exists. If it does not exist and if
-// profileTemplatePath is not empty, create it by making a recursive copy of
+// Make sure that the browser profile exists. If profileTemplatePath is not
+// empty, the profile is created and maintained by making a recursive copy of
// all the files and directories under profileTemplatePath. A safe copy is
// done by first copying the profile files into a temporary directory and
// then doing an atomic rename of the temporary directory as the last step.
-func ensureProfileExists(profilePath string) error {
+// To ensure that the profile is up-to-date with respect to the template
+// (e.g., after Tor Browser has been updated), the contents of the file
+// meek-template-sha256sum.txt within the profile are compared with the
+// corresponding template file; if they differ, the profile is deleted and
+// recreated.
+func prepareBrowserProfile(profilePath string) error {
_, err := os.Stat(profilePath)
- if err == nil || os.IsExist(err) {
- return nil // The profile has already been created.
- }
+ profileExists := err == nil || os.IsExist(err)
// If profileTemplatePath is not set, we are running on a platform that
// expects the profile to already exist.
if profileTemplatePath == "" {
+ if profileExists {
+ return nil
+ }
return err
}
+ if profileExists {
+ if isBrowserProfileUpToDate(profileTemplatePath, profilePath) {
+ return nil
+ }
+
+ // Remove outdated meek helper profile.
+ log.Printf("removing outdated profile at %s\n", profilePath)
+ err = os.RemoveAll(profilePath)
+ if err != nil {
+ return err
+ }
+ }
+
log.Printf("creating profile by copying files from %s to %s\n", profileTemplatePath, profilePath)
profileParentPath := filepath.Dir(profilePath)
err = os.MkdirAll(profileParentPath, os.ModePerm)
@@ -160,6 +180,22 @@ func ensureProfileExists(profilePath string) error {
return os.Rename(tmpPath, profilePath)
}
+// Return true if the profile is up-to-date with the template.
+func isBrowserProfileUpToDate(templatePath string, profilePath string) bool {
+ checksumFileName := "meek-template-sha256sum.txt"
+ templateChecksumPath := filepath.Join(templatePath, checksumFileName)
+ templateData, err := ioutil.ReadFile(templateChecksumPath)
+ if (err != nil) {
+ return false
+ }
+ profileChecksumPath := filepath.Join(profilePath, checksumFileName)
+ profileData, err := ioutil.ReadFile(profileChecksumPath)
+ if (err != nil) {
+ return false
+ }
+
+ return bytes.Equal(templateData, profileData)
+}
// Run firefox and return its exec.Cmd and stdout pipe.
func runFirefox() (cmd *exec.Cmd, stdout io.Reader, err error) {
@@ -174,7 +210,7 @@ func runFirefox() (cmd *exec.Cmd, stdout io.Reader, err error) {
if err != nil {
return
}
- err = ensureProfileExists(profilePath)
+ err = prepareBrowserProfile(profilePath)
if err != nil {
return
}
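
The patch above only compares meek-template-sha256sum.txt between the template and the profile; producing that file is outside this diff. Purely as an illustration, and assuming the file simply lists the SHA-256 digest of every template file, a build-time generator could look like the sketch below (apart from the file name, nothing here is taken from the Tor Browser build).

package main

import (
	"crypto/sha256"
	"fmt"
	"io"
	"log"
	"os"
	"path/filepath"
)

const checksumFileName = "meek-template-sha256sum.txt"

func main() {
	templateDir := "template-profile.meek-http-helper" // illustrative path
	out, err := os.Create(filepath.Join(templateDir, checksumFileName))
	if err != nil {
		log.Fatal(err)
	}
	defer out.Close()

	err = filepath.Walk(templateDir, func(path string, info os.FileInfo, err error) error {
		// Skip directories and the checksum file itself.
		if err != nil || info.IsDir() || info.Name() == checksumFileName {
			return err
		}
		f, err := os.Open(path)
		if err != nil {
			return err
		}
		defer f.Close()
		h := sha256.New()
		if _, err := io.Copy(h, f); err != nil {
			return err
		}
		rel, err := filepath.Rel(templateDir, path)
		if err != nil {
			return err
		}
		_, err = fmt.Fprintf(out, "%x  %s\n", h.Sum(nil), rel)
		return err
	})
	if err != nil {
		log.Fatal(err)
	}
}

Because prepareBrowserProfile copies the checksum file into the profile along with everything else, a browser update that regenerates the template's checksums makes isBrowserProfileUpToDate return false on the next start, and the stale profile is deleted and rebuilt.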

[meek/master] fixup! Bug 18371: symlinks incompatible with Gatekeeper signing
by dcf@torproject.org 21 Jul '16
commit 675eba207124e1bc183880909d3422c2f3c21f63
Author: Kathy Brade <brade(a)pearlcrescent.com>
Date: Wed Mar 9 09:53:08 2016 -0500
fixup! Bug 18371: symlinks incompatible with Gatekeeper signing
Fix a problem where copying the profile from the template failed
if TorBrowser-Data/Tor/PluggableTransports/ did not already exist
(before calling ioutil.TempDir(), the parent directory must exist).
Remove trailing semicolons and unneeded parens.
---
meek-client-torbrowser/meek-client-torbrowser.go | 23 +++++++++++++++--------
1 file changed, 15 insertions(+), 8 deletions(-)
diff --git a/meek-client-torbrowser/meek-client-torbrowser.go b/meek-client-torbrowser/meek-client-torbrowser.go
index 8a3809d..8647d5d 100644
--- a/meek-client-torbrowser/meek-client-torbrowser.go
+++ b/meek-client-torbrowser/meek-client-torbrowser.go
@@ -104,15 +104,22 @@ func ensureProfileExists(profilePath string) error {
// If profileTemplatePath is not set, we are running on a platform that
// expects the profile to already exist.
- if (profileTemplatePath == "") {
- return err;
+ if profileTemplatePath == "" {
+ return err
}
log.Printf("creating profile by copying files from %s to %s\n", profileTemplatePath, profilePath)
- tmpPath, err := ioutil.TempDir(filepath.Dir(profilePath), "tmpMeekProfile")
+ profileParentPath := filepath.Dir(profilePath)
+ err = os.MkdirAll(profileParentPath, os.ModePerm)
+ if err != nil {
+ return err
+ }
+
+ tmpPath, err := ioutil.TempDir(profileParentPath, "tmpMeekProfile")
if err != nil {
return err
}
+
err = os.MkdirAll(tmpPath, os.ModePerm)
if err != nil {
return err
@@ -120,7 +127,7 @@ func ensureProfileExists(profilePath string) error {
// Remove the temporary directory before returning.
defer func() {
- os.RemoveAll(tmpPath);
+ os.RemoveAll(tmpPath)
}()
templatePath, err := filepath.Abs(profileTemplatePath)
@@ -130,13 +137,13 @@ func ensureProfileExists(profilePath string) error {
visit := func(path string, info os.FileInfo, err error) error {
relativePath := strings.TrimPrefix(path, templatePath)
- if (relativePath == "") {
+ if relativePath == "" {
return nil // skip the root directory
}
// If relativePath is a directory, create it; if it is a file, copy it.
- destPath := filepath.Join(tmpPath, relativePath);
- if (info.IsDir()) {
+ destPath := filepath.Join(tmpPath, relativePath)
+ if info.IsDir() {
err = os.MkdirAll(destPath, info.Mode())
} else {
err = copyFile(path, info.Mode(), destPath)
@@ -150,7 +157,7 @@ func ensureProfileExists(profilePath string) error {
return err
}
- return os.Rename(tmpPath, profilePath);
+ return os.Rename(tmpPath, profilePath)
}
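
The parent-directory part of this fixup comes down to one property of ioutil.TempDir: it creates only the final temporary directory, never any missing parents, so os.MkdirAll has to run first. A minimal illustration, with the path used only as an example:

package main

import (
	"io/ioutil"
	"log"
	"os"
)

func main() {
	parent := "TorBrowser-Data/Tor/PluggableTransports"

	// Without this, ioutil.TempDir fails with "no such file or directory"
	// whenever the parent directory has not been created yet.
	if err := os.MkdirAll(parent, os.ModePerm); err != nil {
		log.Fatal(err)
	}

	tmpPath, err := ioutil.TempDir(parent, "tmpMeekProfile")
	if err != nil {
		log.Fatal(err)
	}
	defer os.RemoveAll(tmpPath)
	log.Printf("created %s", tmpPath)
}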
commit 00b78dee347837977dedd66afea6df3a86d20002
Merge: b186d19 cca1ed4
Author: David Fifield <david(a)bamsoftware.com>
Date: Thu Jul 21 15:40:54 2016 -0700
Merge branch 'bug18371'
meek-client-torbrowser/linux.go | 5 +-
meek-client-torbrowser/mac.go | 12 +-
meek-client-torbrowser/meek-client-torbrowser.go | 148 ++++++++++++++++++++++-
meek-client-torbrowser/windows.go | 5 +-
4 files changed, 158 insertions(+), 12 deletions(-)

[translation/tails-persistence-setup] Update translations for tails-persistence-setup
by translation@torproject.org 21 Jul '16
commit ca201d440bf8c714d1c09336c3da4fb676fe83d9
Author: Translation commit bot <translation(a)torproject.org>
Date: Thu Jul 21 16:45:38 2016 +0000
Update translations for tails-persistence-setup
---
br/br.po | 7 ++++---
1 file changed, 4 insertions(+), 3 deletions(-)
diff --git a/br/br.po b/br/br.po
index 0d0052a..f9ffa01 100644
--- a/br/br.po
+++ b/br/br.po
@@ -3,13 +3,14 @@
# This file is distributed under the same license as the PACKAGE package.
#
# Translators:
+# Pierre Morvan <allannkorh(a)yahoo.fr>, 2016
msgid ""
msgstr ""
"Project-Id-Version: The Tor Project\n"
"Report-Msgid-Bugs-To: Tails developers <tails(a)boum.org>\n"
"POT-Creation-Date: 2016-05-25 02:27+0200\n"
-"PO-Revision-Date: 2016-06-06 08:15+0000\n"
-"Last-Translator: carolyn <carolyn(a)anhalt.org>\n"
+"PO-Revision-Date: 2016-07-21 16:30+0000\n"
+"Last-Translator: Pierre Morvan <allannkorh(a)yahoo.fr>\n"
"Language-Team: Breton (http://www.transifex.com/otf/torproject/language/br/)\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
@@ -19,7 +20,7 @@ msgstr ""
#: ../lib/Tails/Persistence/Configuration/Presets.pm:48
msgid "Personal Data"
-msgstr ""
+msgstr "Roadennoù personel"
#: ../lib/Tails/Persistence/Configuration/Presets.pm:50
msgid "Keep files stored in the `Persistent' directory"
commit 53f9f719850065d0f0a300e97c386b9cf54d6c23
Author: Nick Mathewson <nickm(a)torproject.org>
Date: Thu Jul 21 15:29:56 2016 +0200
ug no, the RIGHT fix.
---
src/common/compat_time.c | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/src/common/compat_time.c b/src/common/compat_time.c
index 77fdfc8..a50760e 100644
--- a/src/common/compat_time.c
+++ b/src/common/compat_time.c
@@ -536,8 +536,8 @@ void
monotime_init(void)
{
if (!monotime_initialized) {
- monotime_initialized = 1;
monotime_init_internal();
+ monotime_initialized = 1;
monotime_get(&initialized_at);
#ifdef MONOTIME_COARSE_FN_IS_DIFFERENT
monotime_coarse_get(&initialized_at_coarse);