tor-commits
March 2020
- 21 participants
- 1780 discussions

[translation/bridgedb] https://gitweb.torproject.org/translation.git/commit/?h=bridgedb
by translation@torproject.org 31 Mar '20
commit d4c33a246a3b7f422c5b2be477392274b17e85df
Author: Translation commit bot <translation(a)torproject.org>
Date: Tue Mar 31 11:15:16 2020 +0000
https://gitweb.torproject.org/translation.git/commit/?h=bridgedb
---
ka/LC_MESSAGES/bridgedb.po | 14 +++++++-------
1 file changed, 7 insertions(+), 7 deletions(-)
diff --git a/ka/LC_MESSAGES/bridgedb.po b/ka/LC_MESSAGES/bridgedb.po
index 4bbc2c6c26..1d2daccbea 100644
--- a/ka/LC_MESSAGES/bridgedb.po
+++ b/ka/LC_MESSAGES/bridgedb.po
@@ -12,7 +12,7 @@ msgstr ""
"Project-Id-Version: Tor Project\n"
"Report-Msgid-Bugs-To: 'https://trac.torproject.org/projects/tor/newticket?component=BridgeDB&keywo…'\n"
"POT-Creation-Date: 2020-03-24 10:22-0700\n"
-"PO-Revision-Date: 2020-03-31 10:40+0000\n"
+"PO-Revision-Date: 2020-03-31 11:13+0000\n"
"Last-Translator: Georgianization\n"
"Language-Team: Georgian (http://www.transifex.com/otf/torproject/language/ka/)\n"
"MIME-Version: 1.0\n"
@@ -328,7 +328,7 @@ msgid ""
"the pseudo-mechanism \"None\". The following list briefly explains how these\n"
"mechanisms work and our %sBridgeDB metrics%s visualize how popular each of the\n"
"mechanisms is."
-msgstr ""
+msgstr "BridgeDB ნერგავს ოთხ საშუალებას ხიდების გასავრცელებლად: „HTTPS“, „Moat“,\n„Email“, და „Reserved“. ხიდები, რომლებიც არ ვრცელდება BridgeDB-ით, იყენებს\nცრუ-საშუალებას „None“. მოცემული ჩამონათვალი მარტივად ხსნის თუ როგორ\nმუშაობს ეს საშუალებები, ჩვენი %sBridgeDB-გაზომვები%s კი ასახავს რა სიხშირით გამოიყენება\nთითოეული მათგანი."
#: bridgedb/strings.py:138
#, python-format
@@ -336,7 +336,7 @@ msgid ""
"The \"HTTPS\" distribution mechanism hands out bridges over this website. To get\n"
"bridges, go to %sbridges.torproject.org%s, select your preferred options, and\n"
"solve the subsequent CAPTCHA."
-msgstr ""
+msgstr "„HTTPS“ გავრცელების საშუალება გადასცემს ხიდებს ვებსაიტებით. ხიდების\nმისაღებად, გახსენით %sbridges.torproject.org%s, აირჩიეთ სასურველი პარამეტრები და \nშემდგომ ამოხსენით CAPTCHA."
#: bridgedb/strings.py:142
#, python-format
@@ -346,7 +346,7 @@ msgid ""
"your Tor Browser's %sTor settings%s, click on \"request a new bridge\", solve the\n"
"subsequent CAPTCHA, and Tor Browser will automatically add your new\n"
"bridges."
-msgstr ""
+msgstr "„Moat“ გავრცელების საშუალება, ნაწილია Tor-ბრაუზერის, მისით მომხმარებლებს \nშეუძლიათ ხიდების მოთხოვნა Tor-ბრაუზერიდანვე. ხიდების მისაღებად, გახსენით\nთქვენი Tor-ბრაუზერის %sTor-პარამეტრები%s, დაწკაპეთ „ახალი ხიდის მოთხოვნა“, ამოხსენით\nმოცემული CAPTCHA და Tor-ბრაუზერი თავად დაამატებს თქვენს ახალ\nხიდებს."
#: bridgedb/strings.py:148
#, python-format
@@ -354,11 +354,11 @@ msgid ""
"Users can request bridges from the \"Email\" distribution mechanism by sending an\n"
"email to %sbridges(a)torproject.org%s and writing \"get transport obfs4\" in the\n"
"email body."
-msgstr ""
+msgstr "მომხმარებლებს ხიდების მოთხოვნა „ელფოსტის“ გამოგზავნითაც შეუძლიათ\nმისამართზე %sbridges(a)torproject.org%s და „get transport obfs4“ ტექსტის დართვით\nწერილის შიგთავსში."
#: bridgedb/strings.py:152
msgid "Reserved"
-msgstr ""
+msgstr "სათადარიგო"
#: bridgedb/strings.py:153
#, python-format
@@ -369,7 +369,7 @@ msgid ""
"bridges. Bridges that are distributed over the \"Reserved\" mechanism may not\n"
"see users for a long time. Note that the \"Reserved\" distribution mechanism is\n"
"called \"Unallocated\" in %sbridge pool assignment%s files."
-msgstr ""
+msgstr "BridgeDB უზრუნველყოფს მცირე ოდენობის ხიდებს, რომლებიც არ ვრცელდება\nავტომატურად. ისინი ინახება სათადარიგოდ ხელით გავრცელებისთვის\nდა გადასაცემად NGO-ებისა და სხვა დაწესებულებების ან პირებისთვის, რომლებიც\nხიდებს საჭიროებენ. ხიდები რომლებიც ვრცელდება „სათადარიგოდ“, შესაძლოა არ\nჩანდეს მომხმარებლებისთვის. შენიშვნა: „სათადარიგო“ გავრცელების საშუალებას \nეწოდება „Unallocated“ %sგასაცემი ხიდების მარაგის%s ფაილებში."
#: bridgedb/strings.py:160
msgid "None"
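The "Email" distribution mechanism described in the strings above only requires a plain mail to bridges@torproject.org with "get transport obfs4" in the body. As a minimal, hypothetical sketch (the JDK has no mail library, so this just assembles the message fields as text; the sender address is a placeholder):

```java
// Sketch of a BridgeDB "Email" request, assembled as raw RFC 5322-style text.
// The recipient address and body line come from the bridgedb strings above;
// everything else (sender, subject) is illustrative.
public class BridgeMailRequest {

    static String build(String from) {
        return String.join("\r\n",
            "From: " + from,
            "To: bridges@torproject.org",
            "Subject: bridge request",
            "",                       // blank line separates headers from body
            "get transport obfs4");   // the request BridgeDB looks for
    }

    public static void main(String[] args) {
        System.out.println(build("user@example.com"));
    }
}
```

Actually sending the message would require an SMTP client; the mechanism itself only cares about the recipient and the body line.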

[translation/support-portal] https://gitweb.torproject.org/translation.git/commit/?h=support-portal
by translation@torproject.org 31 Mar '20
commit 31660766f90e9c457868a088dcd8e0c2816986b5
Author: Translation commit bot <translation(a)torproject.org>
Date: Tue Mar 31 10:54:21 2020 +0000
https://gitweb.torproject.org/translation.git/commit/?h=support-portal
---
contents+es.po | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/contents+es.po b/contents+es.po
index 2d478bb4be..53bc357f58 100644
--- a/contents+es.po
+++ b/contents+es.po
@@ -607,7 +607,7 @@ msgstr ""
#: https//support.torproject.org/glossary/exit/
#: (content/glossary/exit/contents+en.lrword.term)
msgid "exit"
-msgstr "salir"
+msgstr "salida"
#: https//support.torproject.org/glossary/exit/
#: (content/glossary/exit/contents+en.lrword.definition)

[translation/bridgedb] https://gitweb.torproject.org/translation.git/commit/?h=bridgedb
by translation@torproject.org 31 Mar '20
commit 9d4f5a86ff94fae7e24da40142fea18534ee64ec
Author: Translation commit bot <translation(a)torproject.org>
Date: Tue Mar 31 10:45:14 2020 +0000
https://gitweb.torproject.org/translation.git/commit/?h=bridgedb
---
ka/LC_MESSAGES/bridgedb.po | 6 +++---
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/ka/LC_MESSAGES/bridgedb.po b/ka/LC_MESSAGES/bridgedb.po
index 2bf59d8b18..4bbc2c6c26 100644
--- a/ka/LC_MESSAGES/bridgedb.po
+++ b/ka/LC_MESSAGES/bridgedb.po
@@ -12,8 +12,8 @@ msgstr ""
"Project-Id-Version: Tor Project\n"
"Report-Msgid-Bugs-To: 'https://trac.torproject.org/projects/tor/newticket?component=BridgeDB&keywo…'\n"
"POT-Creation-Date: 2020-03-24 10:22-0700\n"
-"PO-Revision-Date: 2020-03-26 19:37+0000\n"
-"Last-Translator: Transifex Bot <>\n"
+"PO-Revision-Date: 2020-03-31 10:40+0000\n"
+"Last-Translator: Georgianization\n"
"Language-Team: Georgian (http://www.transifex.com/otf/torproject/language/ka/)\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
@@ -317,7 +317,7 @@ msgstr "მიიღეთ გადამცემი ხიდები!"
#: bridgedb/strings.py:130
msgid "Bridge distribution mechanisms"
-msgstr ""
+msgstr "ხიდის გავრცელების საშუალებები"
#. TRANSLATORS: Please DO NOT translate "BridgeDB", "HTTPS", and "Moat".
#: bridgedb/strings.py:132
commit 7b0e856574067f52d012f744286aaf9903f68cb8
Author: Karsten Loesing <karsten.loesing(a)gmx.net>
Date: Tue Mar 31 12:04:49 2020 +0200
Simplify logging configuration.
Implements #33549.
---
CHANGELOG.md | 3 +
src/build | 2 +-
.../org/torproject/metrics/onionoo/cron/Main.java | 42 +++++------
.../metrics/onionoo/docs/BandwidthStatus.java | 6 +-
.../metrics/onionoo/docs/ClientsHistory.java | 19 ++---
.../metrics/onionoo/docs/ClientsStatus.java | 4 +-
.../metrics/onionoo/docs/DateTimeHelper.java | 6 +-
.../metrics/onionoo/docs/DocumentStore.java | 68 +++++++++---------
.../metrics/onionoo/docs/NodeStatus.java | 20 +++---
.../metrics/onionoo/docs/UpdateStatus.java | 5 +-
.../metrics/onionoo/docs/UptimeHistory.java | 14 ++--
.../metrics/onionoo/docs/UptimeStatus.java | 4 +-
.../metrics/onionoo/docs/WeightsStatus.java | 16 ++---
.../metrics/onionoo/server/NodeIndexer.java | 6 +-
.../metrics/onionoo/server/PerformanceMetrics.java | 29 ++++----
.../metrics/onionoo/server/ServerMain.java | 6 +-
.../metrics/onionoo/updater/DescriptorQueue.java | 19 ++---
.../metrics/onionoo/updater/DescriptorSource.java | 32 +++++----
.../metrics/onionoo/updater/LookupService.java | 34 ++++-----
.../onionoo/updater/NodeDetailsStatusUpdater.java | 21 +++---
.../onionoo/updater/StatusUpdateRunner.java | 15 ++--
.../metrics/onionoo/util/FormattingUtils.java | 4 +-
.../onionoo/writer/BandwidthDocumentWriter.java | 4 +-
.../onionoo/writer/ClientsDocumentWriter.java | 4 +-
.../onionoo/writer/DetailsDocumentWriter.java | 4 +-
.../onionoo/writer/DocumentWriterRunner.java | 6 +-
.../onionoo/writer/SummaryDocumentWriter.java | 4 +-
.../onionoo/writer/UptimeDocumentWriter.java | 4 +-
.../onionoo/writer/WeightsDocumentWriter.java | 4 +-
src/main/resources/logback.xml | 82 ----------------------
30 files changed, 209 insertions(+), 278 deletions(-)
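The pattern this commit applies across all 30 files is the same: replace a per-instance `private Logger log` field with a `private static final Logger logger` constant. A minimal sketch of the idea, using java.util.logging since the SLF4J library the commit actually uses is not part of the JDK:

```java
import java.util.logging.Logger;

// Before (old onionoo code):  private Logger log = LoggerFactory.getLogger(Main.class);
// After (this commit):        private static final Logger logger = LoggerFactory.getLogger(Main.class);
public class LoggerPatternDemo {

    // One shared, immutable logger per class instead of one field per instance.
    private static final Logger logger =
        Logger.getLogger(LoggerPatternDemo.class.getName());

    static boolean sameInstance() {
        // The logging framework caches loggers by name, so the static field
        // and a fresh lookup refer to the same object; making the field
        // static final just states that invariant in the code.
        return logger == Logger.getLogger(LoggerPatternDemo.class.getName());
    }

    public static void main(String[] args) {
        logger.info("static final logger ready");
    }
}
```

Besides dropping the `this.log` prefix at every call site, the static field avoids creating a reference per instance and lets the class log from static contexts.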
diff --git a/CHANGELOG.md b/CHANGELOG.md
index d5877d3..f8666fd 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,5 +1,8 @@
# Changes in version 8.0-1.2?.? - 2020-0?-??
+ * Minor changes
+ - Simplify logging configuration.
+
# Changes in version 8.0-1.25.0 - 2020-02-20
diff --git a/src/build b/src/build
index 264e498..fd85646 160000
--- a/src/build
+++ b/src/build
@@ -1 +1 @@
-Subproject commit 264e498f54a20f7d299daaf2533d043f880e6a8b
+Subproject commit fd856466bcb260f53ef69a24c102d0e49d171cc3
diff --git a/src/main/java/org/torproject/metrics/onionoo/cron/Main.java b/src/main/java/org/torproject/metrics/onionoo/cron/Main.java
index e79edf5..9f3c3c6 100644
--- a/src/main/java/org/torproject/metrics/onionoo/cron/Main.java
+++ b/src/main/java/org/torproject/metrics/onionoo/cron/Main.java
@@ -24,7 +24,7 @@ public class Main implements Runnable {
private Main() {}
- private Logger log = LoggerFactory.getLogger(Main.class);
+ private static final Logger logger = LoggerFactory.getLogger(Main.class);
/** Executes a single update run or partial update run, or initiates
* hourly executions, depending on the given command-line arguments. */
@@ -98,7 +98,7 @@ public class Main implements Runnable {
private void runOrScheduleExecutions() {
if (!this.defaultMode) {
- this.log.info("Going to run one-time updater ... ");
+ logger.info("Going to run one-time updater ... ");
this.run();
} else {
this.scheduleExecutions();
@@ -109,13 +109,13 @@ public class Main implements Runnable {
Executors.newScheduledThreadPool(1);
private void scheduleExecutions() {
- this.log.info("Periodic updater started.");
+ logger.info("Periodic updater started.");
final Runnable mainRunnable = this;
int currentMinute = Calendar.getInstance().get(Calendar.MINUTE);
int initialDelay = (75 - currentMinute + currentMinute % 5) % 60;
/* Run after initialDelay delay and then every hour. */
- this.log.info("Periodic updater will start every hour at minute {}.",
+ logger.info("Periodic updater will start every hour at minute {}.",
(currentMinute + initialDelay) % 60);
this.scheduler.scheduleAtFixedRate(mainRunnable, initialDelay, 60,
TimeUnit.MINUTES);
@@ -143,23 +143,23 @@ public class Main implements Runnable {
private DocumentWriterRunner dwr;
private void initialize() {
- this.log.debug("Started update ...");
+ logger.debug("Started update ...");
if (!this.writeOnly) {
this.dso = DescriptorSourceFactory.getDescriptorSource();
- this.log.info("Initialized descriptor source");
+ logger.info("Initialized descriptor source");
}
if (!this.downloadOnly) {
this.ds = DocumentStoreFactory.getDocumentStore();
- this.log.info("Initialized document store");
+ logger.info("Initialized document store");
}
if (!this.downloadOnly && !this.writeOnly) {
this.sur = new StatusUpdateRunner();
- this.log.info("Initialized status update runner");
+ logger.info("Initialized status update runner");
}
if (!this.downloadOnly && !this.updateOnly) {
this.ds.setOutDir(outDir);
this.dwr = new DocumentWriterRunner();
- this.log.info("Initialized document writer runner");
+ logger.info("Initialized document writer runner");
}
}
@@ -167,7 +167,7 @@ public class Main implements Runnable {
if (this.updateOnly || this.writeOnly) {
return;
}
- this.log.info("Downloading descriptors.");
+ logger.info("Downloading descriptors.");
this.dso.downloadDescriptors();
}
@@ -175,9 +175,9 @@ public class Main implements Runnable {
if (this.downloadOnly || this.writeOnly) {
return;
}
- this.log.info("Reading descriptors.");
+ logger.info("Reading descriptors.");
this.dso.readDescriptors();
- this.log.info("Updating internal status files.");
+ logger.info("Updating internal status files.");
this.sur.updateStatuses();
}
@@ -185,24 +185,24 @@ public class Main implements Runnable {
if (this.downloadOnly || this.updateOnly) {
return;
}
- log.info("Updating document files.");
+ logger.info("Updating document files.");
this.dwr.writeDocuments();
}
private void shutDown() {
- log.info("Shutting down.");
+ logger.info("Shutting down.");
if (this.dso != null) {
this.dso.writeHistoryFiles();
- log.info("Wrote parse histories");
+ logger.info("Wrote parse histories");
}
if (this.ds != null) {
this.ds.flushDocumentCache();
- this.log.info("Flushed document cache");
+ logger.info("Flushed document cache");
}
}
private void gatherStatistics() {
- this.log.info("Gathering statistics.");
+ logger.info("Gathering statistics.");
if (this.sur != null) {
this.sur.logStatistics();
}
@@ -210,23 +210,23 @@ public class Main implements Runnable {
this.dwr.logStatistics();
}
if (this.dso != null) {
- this.log.info("Descriptor source\n{}", this.dso.getStatsString());
+ logger.info("Descriptor source\n{}", this.dso.getStatsString());
}
if (this.ds != null) {
- this.log.info("Document store\n{}", this.ds.getStatsString());
+ logger.info("Document store\n{}", this.ds.getStatsString());
}
}
private void cleanUp() {
/* Clean up to prevent out-of-memory exception, and to ensure that the
* next execution starts with a fresh descriptor source. */
- this.log.info("Cleaning up.");
+ logger.info("Cleaning up.");
if (this.ds != null) {
this.ds.invalidateDocumentCache();
}
DocumentStoreFactory.setDocumentStore(null);
DescriptorSourceFactory.setDescriptorSource(null);
- this.log.info("Done.");
+ logger.info("Done.");
}
}
diff --git a/src/main/java/org/torproject/metrics/onionoo/docs/BandwidthStatus.java b/src/main/java/org/torproject/metrics/onionoo/docs/BandwidthStatus.java
index a3ceb69..2a68de6 100644
--- a/src/main/java/org/torproject/metrics/onionoo/docs/BandwidthStatus.java
+++ b/src/main/java/org/torproject/metrics/onionoo/docs/BandwidthStatus.java
@@ -15,7 +15,7 @@ import java.util.TreeMap;
public class BandwidthStatus extends Document {
- private static Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
BandwidthStatus.class);
private transient boolean isDirty = false;
@@ -55,7 +55,7 @@ public class BandwidthStatus extends Document {
String line = s.nextLine();
String[] parts = line.split(" ");
if (parts.length != 6) {
- log.error("Illegal line '{}' in bandwidth history. Skipping this "
+ logger.error("Illegal line '{}' in bandwidth history. Skipping this "
+ "line.", line);
continue;
}
@@ -64,7 +64,7 @@ public class BandwidthStatus extends Document {
long startMillis = DateTimeHelper.parse(parts[1] + " " + parts[2]);
long endMillis = DateTimeHelper.parse(parts[3] + " " + parts[4]);
if (startMillis < 0L || endMillis < 0L) {
- log.error("Could not parse timestamp while reading "
+ logger.error("Could not parse timestamp while reading "
+ "bandwidth history. Skipping.");
break;
}
diff --git a/src/main/java/org/torproject/metrics/onionoo/docs/ClientsHistory.java b/src/main/java/org/torproject/metrics/onionoo/docs/ClientsHistory.java
index bab618e..89cc135 100644
--- a/src/main/java/org/torproject/metrics/onionoo/docs/ClientsHistory.java
+++ b/src/main/java/org/torproject/metrics/onionoo/docs/ClientsHistory.java
@@ -12,7 +12,7 @@ import java.util.TreeMap;
public class ClientsHistory implements Comparable<ClientsHistory> {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
ClientsHistory.class);
private long startMillis;
@@ -73,27 +73,27 @@ public class ClientsHistory implements Comparable<ClientsHistory> {
String responseHistoryString) {
String[] parts = responseHistoryString.split(" ", 8);
if (parts.length != 8) {
- log.warn("Invalid number of space-separated strings in clients history: "
- + "'{}'. Skipping", responseHistoryString);
+ logger.warn("Invalid number of space-separated strings in clients "
+ + "history: '{}'. Skipping", responseHistoryString);
return null;
}
long startMillis = DateTimeHelper.parse(parts[0] + " " + parts[1]);
long endMillis = DateTimeHelper.parse(parts[2] + " " + parts[3]);
if (startMillis < 0L || endMillis < 0L) {
- log.warn("Invalid start or end timestamp in clients history: '{}'. "
+ logger.warn("Invalid start or end timestamp in clients history: '{}'. "
+ "Skipping.", responseHistoryString);
return null;
}
if (startMillis >= endMillis) {
- log.warn("Start timestamp must be smaller than end timestamp in clients "
- + "history: '{}'. Skipping.", responseHistoryString);
+ logger.warn("Start timestamp must be smaller than end timestamp in "
+ + "clients history: '{}'. Skipping.", responseHistoryString);
return null;
}
double totalResponses;
try {
totalResponses = Double.parseDouble(parts[4]);
} catch (NumberFormatException e) {
- log.warn("Invalid response number format in clients history: '{}'. "
+ logger.warn("Invalid response number format in clients history: '{}'. "
+ "Skipping.", responseHistoryString);
return null;
}
@@ -105,8 +105,9 @@ public class ClientsHistory implements Comparable<ClientsHistory> {
parseResponses(parts[7]);
if (responsesByCountry == null || responsesByTransport == null
|| responsesByVersion == null) {
- log.warn("Invalid format of responses by country, transport, or version "
- + "in clients history: '{}'. Skipping.", responseHistoryString);
+ logger.warn("Invalid format of responses by country, transport, or "
+ + "version in clients history: '{}'. Skipping.",
+ responseHistoryString);
return null;
}
return new ClientsHistory(startMillis, endMillis, totalResponses,
diff --git a/src/main/java/org/torproject/metrics/onionoo/docs/ClientsStatus.java b/src/main/java/org/torproject/metrics/onionoo/docs/ClientsStatus.java
index 09899b6..19d2e7f 100644
--- a/src/main/java/org/torproject/metrics/onionoo/docs/ClientsStatus.java
+++ b/src/main/java/org/torproject/metrics/onionoo/docs/ClientsStatus.java
@@ -12,7 +12,7 @@ import java.util.TreeSet;
public class ClientsStatus extends Document {
- private static Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
ClientsStatus.class);
private transient boolean isDirty = false;
@@ -44,7 +44,7 @@ public class ClientsStatus extends Document {
if (parsedLine != null) {
this.history.add(parsedLine);
} else {
- log.error("Could not parse clients history line '{}'. Skipping.",
+ logger.error("Could not parse clients history line '{}'. Skipping.",
line);
}
}
diff --git a/src/main/java/org/torproject/metrics/onionoo/docs/DateTimeHelper.java b/src/main/java/org/torproject/metrics/onionoo/docs/DateTimeHelper.java
index e49b48a..b0bda36 100644
--- a/src/main/java/org/torproject/metrics/onionoo/docs/DateTimeHelper.java
+++ b/src/main/java/org/torproject/metrics/onionoo/docs/DateTimeHelper.java
@@ -17,7 +17,7 @@ public class DateTimeHelper {
public static final long NO_TIME_AVAILABLE = -1L;
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
DateTimeHelper.class);
private DateTimeHelper() {
@@ -99,13 +99,13 @@ public class DateTimeHelper {
* string cannot be parsed. */
public static long parse(String string, String format) {
if (null == string) {
- log.warn("Date String was null.");
+ logger.warn("Date String was null.");
return NO_TIME_AVAILABLE;
}
try {
return getDateFormat(format).parse(string).getTime();
} catch (ParseException e) {
- log.warn(e.getMessage(), e);
+ logger.warn(e.getMessage(), e);
return NO_TIME_AVAILABLE;
}
}
diff --git a/src/main/java/org/torproject/metrics/onionoo/docs/DocumentStore.java b/src/main/java/org/torproject/metrics/onionoo/docs/DocumentStore.java
index 4ad6709..e74094a 100644
--- a/src/main/java/org/torproject/metrics/onionoo/docs/DocumentStore.java
+++ b/src/main/java/org/torproject/metrics/onionoo/docs/DocumentStore.java
@@ -41,7 +41,7 @@ import java.util.TreeSet;
// TODO Also look into simple key-value stores instead of real databases.
public class DocumentStore {
- private static Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
DocumentStore.class);
private static ObjectMapper objectMapper = new ObjectMapper();
@@ -143,8 +143,8 @@ public class DocumentStore {
this.listedFiles += parsedNodeStatuses.size();
this.listOperations++;
} catch (IOException e) {
- log.error("Could not read file '{}'.", summaryFile.getAbsolutePath(),
- e);
+ logger.error("Could not read file '{}'.",
+ summaryFile.getAbsolutePath(), e);
}
}
}
@@ -186,7 +186,7 @@ public class DocumentStore {
this.listedFiles += parsedSummaryDocuments.size();
this.listOperations++;
} catch (IOException e) {
- log.error("Could not parse summary document '{}' from file '{}'.",
+ logger.error("Could not parse summary document '{}' from file '{}'.",
line, summaryFile.getAbsolutePath(), e);
}
}
@@ -311,7 +311,7 @@ public class DocumentStore {
try {
documentString = objectMapper.writeValueAsString(document);
} catch (JsonProcessingException e) {
- log.error("Serializing failed for type {}.",
+ logger.error("Serializing failed for type {}.",
document.getClass().getName(), e);
return false;
}
@@ -328,7 +328,7 @@ public class DocumentStore {
documentString = FormattingUtils.replaceValidUtf(
objectMapper.writeValueAsString(document));
} catch (JsonProcessingException e) {
- log.error("Serializing failed for type {}.",
+ logger.error("Serializing failed for type {}.",
document.getClass().getName(), e);
return false;
}
@@ -347,13 +347,13 @@ public class DocumentStore {
|| document instanceof UpdateStatus) {
documentString = document.toDocumentString();
} else {
- log.error("Serializing is not supported for type {}.",
+ logger.error("Serializing is not supported for type {}.",
document.getClass().getName());
return false;
}
try {
if (documentString.length() > ONE_MIBIBYTE) {
- log.warn("Attempting to store very large document file: path='{}', "
+ logger.warn("Attempting to store very large document file: path='{}', "
+ "bytes={}", documentFile.getAbsolutePath(),
documentString.length());
}
@@ -377,7 +377,7 @@ public class DocumentStore {
this.storedFiles++;
this.storedBytes += documentString.length();
} catch (IOException e) {
- log.error("Could not write file '{}'.", documentFile.getAbsolutePath(),
+ logger.error("Could not write file '{}'.", documentFile.getAbsolutePath(),
e);
return false;
}
@@ -438,10 +438,10 @@ public class DocumentStore {
String contact = null;
for (String orAddressAndPort : detailsDocument.getOrAddresses()) {
if (!orAddressAndPort.contains(":")) {
- log.warn("Attempt to create summary document from details document for "
- + "fingerprint {} failed because of invalid OR address/port: '{}'. "
- + "Not returning a summary document in this case.", fingerprint,
- orAddressAndPort);
+ logger.warn("Attempt to create summary document from details document "
+ + "for fingerprint {} failed because of invalid OR address/port: "
+ + "'{}'. Not returning a summary document in this case.",
+ fingerprint, orAddressAndPort);
return null;
}
String orAddress = orAddressAndPort.substring(0,
@@ -482,7 +482,7 @@ public class DocumentStore {
/* Document file does not exist. That's okay. */
return null;
} else if (documentFile.isDirectory()) {
- log.error("Could not read file '{}', because it is a directory.",
+ logger.error("Could not read file '{}', because it is a directory.",
documentFile.getAbsolutePath());
return null;
}
@@ -504,11 +504,12 @@ public class DocumentStore {
this.retrievedFiles++;
this.retrievedBytes += documentString.length();
} catch (IOException e) {
- log.error("Could not read file '{}'.", documentFile.getAbsolutePath(), e);
+ logger.error("Could not read file '{}'.", documentFile.getAbsolutePath(),
+ e);
return null;
}
if (documentString.length() > ONE_MIBIBYTE) {
- log.warn("Retrieved very large document file: path='{}', bytes={}",
+ logger.warn("Retrieved very large document file: path='{}', bytes={}",
documentFile.getAbsolutePath(), documentString.length());
}
T result = null;
@@ -532,7 +533,7 @@ public class DocumentStore {
return this.retrieveParsedDocumentFile(documentType, "{"
+ documentString + "}");
} else {
- log.error("Parsing is not supported for type {}.",
+ logger.error("Parsing is not supported for type {}.",
documentType.getName());
}
return result;
@@ -546,10 +547,10 @@ public class DocumentStore {
result.setFromDocumentString(documentString);
} catch (ReflectiveOperationException e) {
/* Handle below. */
- log.error(e.getMessage(), e);
+ logger.error(e.getMessage(), e);
}
if (result == null) {
- log.error("Could not initialize parsed status file of type {}.",
+ logger.error("Could not initialize parsed status file of type {}.",
documentType.getName());
}
return result;
@@ -562,11 +563,11 @@ public class DocumentStore {
result = objectMapper.readValue(documentString, documentType);
} catch (Throwable e) {
/* Handle below. */
- log.error(documentString);
- log.error(e.getMessage(), e);
+ logger.error(documentString);
+ logger.error(e.getMessage(), e);
}
if (result == null) {
- log.error("Could not initialize parsed document of type {}.",
+ logger.error("Could not initialize parsed document of type {}.",
documentType.getName());
}
return result;
@@ -580,10 +581,10 @@ public class DocumentStore {
result.setDocumentString(documentString);
} catch (ReflectiveOperationException e) {
/* Handle below. */
- log.error(e.getMessage(), e);
+ logger.error(e.getMessage(), e);
}
if (result == null) {
- log.error("Could not initialize unparsed document of type {}.",
+ logger.error("Could not initialize unparsed document of type {}.",
documentType.getName());
}
return result;
@@ -626,7 +627,8 @@ public class DocumentStore {
Class<T> documentType, String fingerprint) {
File documentFile = this.getDocumentFile(documentType, fingerprint);
if (documentFile == null || !documentFile.delete()) {
- log.error("Could not delete file '{}'.", documentFile.getAbsolutePath());
+ logger.error("Could not delete file '{}'.",
+ documentFile.getAbsolutePath());
return false;
}
this.removedFiles++;
@@ -638,7 +640,7 @@ public class DocumentStore {
File documentFile = null;
if (fingerprint == null && !documentType.equals(UpdateStatus.class)
&& !documentType.equals(UptimeStatus.class)) {
- log.warn("Attempted to locate a document file of type {} without "
+ logger.warn("Attempted to locate a document file of type {} without "
+ "providing a fingerprint. Such a file does not exist.",
documentType.getName());
return null;
@@ -732,7 +734,7 @@ public class DocumentStore {
private void writeNodeStatuses() {
File directory = this.statusDir;
if (directory == null) {
- log.error("Unable to write node statuses without knowing the "
+ logger.error("Unable to write node statuses without knowing the "
+ "'status' directory to write to!");
return;
}
@@ -753,7 +755,7 @@ public class DocumentStore {
if (line != null) {
sb.append(line).append("\n");
} else {
- log.error("Could not serialize relay node status '{}'",
+ logger.error("Could not serialize relay node status '{}'",
relay.getFingerprint());
}
}
@@ -762,7 +764,7 @@ public class DocumentStore {
if (line != null) {
sb.append(line).append("\n");
} else {
- log.error("Could not serialize bridge node status '{}'",
+ logger.error("Could not serialize bridge node status '{}'",
bridge.getFingerprint());
}
}
@@ -775,7 +777,8 @@ public class DocumentStore {
this.storedFiles++;
this.storedBytes += documentString.length();
} catch (IOException e) {
- log.error("Could not write file '{}'.", summaryFile.getAbsolutePath(), e);
+ logger.error("Could not write file '{}'.", summaryFile.getAbsolutePath(),
+ e);
}
}
@@ -804,7 +807,7 @@ public class DocumentStore {
if (line != null) {
sb.append(line).append("\n");
} else {
- log.error("Could not serialize relay summary document '{}'",
+ logger.error("Could not serialize relay summary document '{}'",
summaryDocument.getFingerprint());
}
}
@@ -818,7 +821,8 @@ public class DocumentStore {
this.storedFiles++;
this.storedBytes += documentString.length();
} catch (IOException e) {
- log.error("Could not write file '{}'.", summaryFile.getAbsolutePath(), e);
+ logger.error("Could not write file '{}'.", summaryFile.getAbsolutePath(),
+ e);
}
}
diff --git a/src/main/java/org/torproject/metrics/onionoo/docs/NodeStatus.java b/src/main/java/org/torproject/metrics/onionoo/docs/NodeStatus.java
index 53cd9ec..e343045 100644
--- a/src/main/java/org/torproject/metrics/onionoo/docs/NodeStatus.java
+++ b/src/main/java/org/torproject/metrics/onionoo/docs/NodeStatus.java
@@ -77,7 +77,7 @@ import java.util.stream.Collectors;
*/
public class NodeStatus extends Document {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
NodeStatus.class);
/* From most recently published server descriptor: */
@@ -550,7 +550,7 @@ public class NodeStatus extends Document {
try {
String[] parts = documentString.trim().split("\t");
if (parts.length < 23) {
- log.error("Too few space-separated values in line '{}'. Skipping.",
+ logger.error("Too few space-separated values in line '{}'. Skipping.",
documentString.trim());
return null;
}
@@ -565,7 +565,7 @@ public class NodeStatus extends Document {
if (addresses.contains(";")) {
String[] addressParts = addresses.split(";", -1);
if (addressParts.length != 3) {
- log.error("Invalid addresses entry in line '{}'. Skipping.",
+ logger.error("Invalid addresses entry in line '{}'. Skipping.",
documentString.trim());
return null;
}
@@ -587,11 +587,11 @@ public class NodeStatus extends Document {
long lastSeenMillis = DateTimeHelper.parse(parts[4] + " "
+ parts[5]);
if (lastSeenMillis < 0L) {
- log.error("Parse exception while parsing node status line '{}'. "
+ logger.error("Parse exception while parsing node status line '{}'. "
+ "Skipping.", documentString);
return null;
} else if (lastSeenMillis == 0L) {
- log.debug("Skipping node status with fingerprint {} that has so far "
+ logger.debug("Skipping node status with fingerprint {} that has so far "
+ "never been seen in a network status.", fingerprint);
return null;
}
@@ -614,7 +614,7 @@ public class NodeStatus extends Document {
}
long firstSeenMillis = DateTimeHelper.parse(parts[15] + " " + parts[16]);
if (firstSeenMillis < 0L) {
- log.error("Parse exception while parsing node status line '{}'. "
+ logger.error("Parse exception while parsing node status line '{}'. "
+ "Skipping.", documentString);
return null;
}
@@ -624,7 +624,7 @@ public class NodeStatus extends Document {
lastChangedAddresses = DateTimeHelper.parse(parts[17] + " "
+ parts[18]);
if (lastChangedAddresses < 0L) {
- log.error("Parse exception while parsing node status line '{}'. "
+ logger.error("Parse exception while parsing node status line '{}'. "
+ "Skipping.", documentString);
return null;
}
@@ -690,13 +690,13 @@ public class NodeStatus extends Document {
}
return nodeStatus;
} catch (NumberFormatException e) {
- log.error("Number format exception while parsing node status line '{}'. "
- + "Skipping.", documentString, e);
+ logger.error("Number format exception while parsing node status line "
+ + "'{}'. Skipping.", documentString, e);
return null;
} catch (Exception e) {
/* This catch block is only here to handle yet unknown errors. It
* should go away once we're sure what kind of errors can occur. */
- log.error("Unknown exception while parsing node status line '{}'. "
+ logger.error("Unknown exception while parsing node status line '{}'. "
+ "Skipping.", documentString, e);
return null;
}
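The NodeStatus hunks above all follow the same defensive pattern: split the serialized line, validate the field count, log with a `{}` placeholder, and return `null` so one malformed line cannot abort loading the whole status file. A minimal stdlib-only sketch of that guard (class and method names are illustrative, not from Onionoo):

```java
import java.util.Arrays;
import java.util.List;

public class NodeStatusGuard {

    /**
     * Mimics the fromString() guard: split the serialized line on tabs and
     * reject it (returning null) when fewer than 23 fields are present,
     * rather than hitting ArrayIndexOutOfBoundsException further down.
     */
    public static List<String> parseFields(String documentString) {
        String[] parts = documentString.trim().split("\t");
        if (parts.length < 23) {
            // Real code logs "Too few ... values in line '{}'. Skipping."
            return null;
        }
        return Arrays.asList(parts);
    }

    public static void main(String[] args) {
        System.out.println(parseFields("a\tb\tc")); // too few fields -> null
        StringBuilder sb = new StringBuilder("f0");
        for (int i = 1; i < 23; i++) {
            sb.append('\t').append('f').append(i);
        }
        System.out.println(parseFields(sb.toString()).size()); // 23
    }
}
```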
diff --git a/src/main/java/org/torproject/metrics/onionoo/docs/UpdateStatus.java b/src/main/java/org/torproject/metrics/onionoo/docs/UpdateStatus.java
index 10b6123..a840585 100644
--- a/src/main/java/org/torproject/metrics/onionoo/docs/UpdateStatus.java
+++ b/src/main/java/org/torproject/metrics/onionoo/docs/UpdateStatus.java
@@ -8,7 +8,8 @@ import org.slf4j.LoggerFactory;
public class UpdateStatus extends Document {
- private static Logger log = LoggerFactory.getLogger(UpdateStatus.class);
+ private static final Logger logger = LoggerFactory.getLogger(
+ UpdateStatus.class);
private long updatedMillis;
@@ -25,7 +26,7 @@ public class UpdateStatus extends Document {
try {
this.updatedMillis = Long.parseLong(documentString.trim());
} catch (NumberFormatException e) {
- log.error("Could not parse timestamp '{}'. Setting to 1970-01-01 "
+ logger.error("Could not parse timestamp '{}'. Setting to 1970-01-01 "
+ "00:00:00.", documentString);
this.updatedMillis = 0L;
}
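The UpdateStatus hunk keeps the existing fallback behavior: on a malformed timestamp the code logs an error and resets to the epoch instead of propagating the exception. A self-contained sketch of that fallback (the helper name is illustrative):

```java
public class UpdateTimestamp {

    /**
     * Mirrors the UpdateStatus parsing shown above: parse the epoch-millis
     * timestamp, falling back to 0L (1970-01-01 00:00:00 UTC) when the
     * document string is not a number.
     */
    public static long parseUpdatedMillis(String documentString) {
        try {
            return Long.parseLong(documentString.trim());
        } catch (NumberFormatException e) {
            // Real code logs "Could not parse timestamp '{}'. Setting to ..."
            return 0L;
        }
    }

    public static void main(String[] args) {
        System.out.println(parseUpdatedMillis(" 1583020800000 ")); // 1583020800000
        System.out.println(parseUpdatedMillis("not-a-number"));    // 0
    }
}
```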
diff --git a/src/main/java/org/torproject/metrics/onionoo/docs/UptimeHistory.java b/src/main/java/org/torproject/metrics/onionoo/docs/UptimeHistory.java
index 07145e4..595a165 100644
--- a/src/main/java/org/torproject/metrics/onionoo/docs/UptimeHistory.java
+++ b/src/main/java/org/torproject/metrics/onionoo/docs/UptimeHistory.java
@@ -12,7 +12,7 @@ import java.util.TreeSet;
public class UptimeHistory implements Comparable<UptimeHistory> {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
UptimeHistory.class);
private boolean relay;
@@ -58,22 +58,22 @@ public class UptimeHistory implements Comparable<UptimeHistory> {
public static UptimeHistory fromString(String uptimeHistoryString) {
String[] parts = uptimeHistoryString.split(" ", -1);
if (parts.length < 3) {
- log.warn("Invalid number of space-separated strings in uptime history: "
- + "'{}'. Skipping", uptimeHistoryString);
+ logger.warn("Invalid number of space-separated strings in uptime "
+ + "history: '{}'. Skipping", uptimeHistoryString);
return null;
}
boolean relay = false;
if (parts[0].equalsIgnoreCase("r")) {
relay = true;
} else if (!parts[0].equals("b")) {
- log.warn("Invalid node type in uptime history: '{}'. Supported types are "
- + "'r', 'R', and 'b'. Skipping.", uptimeHistoryString);
+ logger.warn("Invalid node type in uptime history: '{}'. Supported types "
+ + "are 'r', 'R', and 'b'. Skipping.", uptimeHistoryString);
return null;
}
long startMillis = DateTimeHelper.parse(parts[1],
DateTimeHelper.DATEHOUR_NOSPACE_FORMAT);
if (DateTimeHelper.NO_TIME_AVAILABLE == startMillis) {
- log.warn("Invalid start timestamp in uptime history: '{}'. Skipping.",
+ logger.warn("Invalid start timestamp in uptime history: '{}'. Skipping.",
uptimeHistoryString);
return null;
}
@@ -81,7 +81,7 @@ public class UptimeHistory implements Comparable<UptimeHistory> {
try {
uptimeHours = Integer.parseInt(parts[2]);
} catch (NumberFormatException e) {
- log.warn("Invalid number format in uptime history: '{}'. Skipping.",
+ logger.warn("Invalid number format in uptime history: '{}'. Skipping.",
uptimeHistoryString);
return null;
}
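UptimeHistory.fromString() chains several validations, each logging a warning and returning `null` so a single bad history line is skipped rather than fatal. A compact sketch of that chain, assuming the same `"<type> <datehour> <hours>"` layout the warnings describe (names are illustrative):

```java
public class UptimeLine {

    /**
     * Validation chain in the style of UptimeHistory.fromString():
     * field count, node type ('r'/'R' for relay, 'b' for bridge), then a
     * numeric hours field. Returns { relayFlag, uptimeHours } or null.
     */
    public static int[] parse(String line) {
        String[] parts = line.split(" ", -1);
        if (parts.length < 3) {
            return null; // "Invalid number of space-separated strings ..."
        }
        boolean relay;
        if (parts[0].equalsIgnoreCase("r")) {
            relay = true;
        } else if (parts[0].equals("b")) {
            relay = false;
        } else {
            return null; // "Invalid node type ... 'r', 'R', and 'b'."
        }
        int uptimeHours;
        try {
            uptimeHours = Integer.parseInt(parts[2]);
        } catch (NumberFormatException e) {
            return null; // "Invalid number format ..."
        }
        return new int[] { relay ? 1 : 0, uptimeHours };
    }

    public static void main(String[] args) {
        System.out.println(parse("r 2020-03-01-00 24")[1]); // 24
        System.out.println(parse("x 2020-03-01-00 24"));    // null
    }
}
```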
diff --git a/src/main/java/org/torproject/metrics/onionoo/docs/UptimeStatus.java b/src/main/java/org/torproject/metrics/onionoo/docs/UptimeStatus.java
index 912dd66..b65cc8e 100644
--- a/src/main/java/org/torproject/metrics/onionoo/docs/UptimeStatus.java
+++ b/src/main/java/org/torproject/metrics/onionoo/docs/UptimeStatus.java
@@ -13,7 +13,7 @@ import java.util.TreeSet;
public class UptimeStatus extends Document {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
UptimeStatus.class);
private transient boolean isDirty = false;
@@ -51,7 +51,7 @@ public class UptimeStatus extends Document {
this.bridgeHistory.add(parsedLine);
}
} else {
- log.error("Could not parse uptime history line '{}'. Skipping.",
+ logger.error("Could not parse uptime history line '{}'. Skipping.",
line);
}
}
diff --git a/src/main/java/org/torproject/metrics/onionoo/docs/WeightsStatus.java b/src/main/java/org/torproject/metrics/onionoo/docs/WeightsStatus.java
index d3783fc..b9a8265 100644
--- a/src/main/java/org/torproject/metrics/onionoo/docs/WeightsStatus.java
+++ b/src/main/java/org/torproject/metrics/onionoo/docs/WeightsStatus.java
@@ -15,7 +15,7 @@ import java.util.TreeMap;
public class WeightsStatus extends Document {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
WeightsStatus.class);
private transient boolean isDirty = false;
@@ -59,8 +59,8 @@ public class WeightsStatus extends Document {
continue;
}
if (parts.length != 9 && parts.length != 11) {
- log.error("Illegal line '{}' in weights status file. Skipping this "
- + "line.", line);
+ logger.error("Illegal line '{}' in weights status file. Skipping "
+ + "this line.", line);
continue;
}
if (parts[4].equals("NaN")) {
@@ -71,13 +71,13 @@ public class WeightsStatus extends Document {
long validAfterMillis = DateTimeHelper.parse(parts[0] + " " + parts[1]);
long freshUntilMillis = DateTimeHelper.parse(parts[2] + " " + parts[3]);
if (validAfterMillis < 0L || freshUntilMillis < 0L) {
- log.error("Could not parse timestamp while reading "
+ logger.error("Could not parse timestamp while reading "
+ "weights status file. Skipping.");
break;
}
if (validAfterMillis > freshUntilMillis) {
- log.error("Illegal dates in '{}' of weights status file. Skipping.",
- line);
+ logger.error("Illegal dates in '{}' of weights status file. "
+ + "Skipping.", line);
break;
}
long[] interval = new long[] { validAfterMillis, freshUntilMillis };
@@ -92,8 +92,8 @@ public class WeightsStatus extends Document {
weights[6] = parseWeightDouble(parts[10]);
}
} catch (NumberFormatException e) {
- log.error("Could not parse weights values in line '{}' while reading "
- + "weights status file. Skipping.", line);
+ logger.error("Could not parse weights values in line '{}' while "
+ + "reading weights status file. Skipping.", line);
break;
}
this.history.put(interval, weights);
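The WeightsStatus hunks enforce two invariants per history line before storing the interval: both timestamps must have parsed (non-negative) and valid-after must not be later than fresh-until. A minimal sketch of that check, detached from the file-reading loop:

```java
public class WeightsInterval {

    /**
     * Returns the { validAfter, freshUntil } interval, or null when either
     * timestamp failed to parse (negative, per DateTimeHelper.parse) or the
     * interval is reversed -- the two error cases logged above.
     */
    public static long[] interval(long validAfterMillis, long freshUntilMillis) {
        if (validAfterMillis < 0L || freshUntilMillis < 0L) {
            return null; // "Could not parse timestamp ..."
        }
        if (validAfterMillis > freshUntilMillis) {
            return null; // "Illegal dates ..."
        }
        return new long[] { validAfterMillis, freshUntilMillis };
    }

    public static void main(String[] args) {
        System.out.println(interval(100L, 200L) != null); // true
        System.out.println(interval(200L, 100L) == null); // true
    }
}
```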
diff --git a/src/main/java/org/torproject/metrics/onionoo/server/NodeIndexer.java b/src/main/java/org/torproject/metrics/onionoo/server/NodeIndexer.java
index b32b1bc..9ba941a 100644
--- a/src/main/java/org/torproject/metrics/onionoo/server/NodeIndexer.java
+++ b/src/main/java/org/torproject/metrics/onionoo/server/NodeIndexer.java
@@ -30,7 +30,7 @@ import javax.servlet.ServletContextListener;
public class NodeIndexer implements ServletContextListener, Runnable {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
NodeIndexer.class);
@Override
@@ -38,7 +38,7 @@ public class NodeIndexer implements ServletContextListener, Runnable {
File outDir = new File(System.getProperty("onionoo.basedir",
"/srv/onionoo.torproject.org/onionoo"), "out");
if (!outDir.exists() || !outDir.isDirectory()) {
- log.error("\n\n\tOut-dir not found! Expected directory: {}"
+ logger.error("\n\n\tOut-dir not found! Expected directory: {}"
+ "\n\tSet system property 'onionoo.basedir'.", outDir);
System.exit(1);
}
@@ -115,7 +115,7 @@ public class NodeIndexer implements ServletContextListener, Runnable {
}
}
} catch (Throwable th) { // catch all and log
- log.error("Indexing failed: {}", th.getMessage(), th);
+ logger.error("Indexing failed: {}", th.getMessage(), th);
}
}
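The `catch (Throwable th)` in NodeIndexer.run() is deliberate: an uncaught Throwable would kill the indexing thread, and a periodically scheduled task would then silently never run again. A small stdlib sketch of that shape (the wrapper class is illustrative):

```java
public class SafeIndexer implements Runnable {

    private int runs = 0;
    private final Runnable body;

    public SafeIndexer(Runnable body) {
        this.body = body;
    }

    /**
     * Same defensive shape as NodeIndexer.run(): catch Throwable so a
     * single failed pass is logged instead of terminating the thread.
     */
    @Override
    public void run() {
        try {
            body.run();
        } catch (Throwable th) { // catch all and log
            // real code: logger.error("Indexing failed: {}", th.getMessage(), th);
        }
        runs++;
    }

    public int getRuns() {
        return runs;
    }

    public static void main(String[] args) {
        SafeIndexer indexer = new SafeIndexer(() -> {
            throw new RuntimeException("boom");
        });
        indexer.run();
        indexer.run();
        System.out.println(indexer.getRuns()); // 2: both passes completed
    }
}
```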
diff --git a/src/main/java/org/torproject/metrics/onionoo/server/PerformanceMetrics.java b/src/main/java/org/torproject/metrics/onionoo/server/PerformanceMetrics.java
index 2ffd460..22a5573 100644
--- a/src/main/java/org/torproject/metrics/onionoo/server/PerformanceMetrics.java
+++ b/src/main/java/org/torproject/metrics/onionoo/server/PerformanceMetrics.java
@@ -14,7 +14,7 @@ import java.util.TimeZone;
public class PerformanceMetrics {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
PerformanceMetrics.class);
private static final Object lock = new Object();
@@ -65,19 +65,24 @@ public class PerformanceMetrics {
SimpleDateFormat dateTimeFormat = new SimpleDateFormat(
"yyyy-MM-dd HH:mm:ss");
dateTimeFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
- log.info("Request statistics ({}, {} s):",
+ logger.info("Request statistics ({}, {} s):",
dateTimeFormat.format(lastLoggedMillis + LOG_INTERVAL_MILLIS),
LOG_INTERVAL_SECONDS);
- log.info(" Total processed requests: {}", totalProcessedRequests);
- log.info(" Most frequently requested resource: {}",
+ logger.info(" Total processed requests: {}", totalProcessedRequests);
+ logger.info(" Most frequently requested resource: {}",
requestsByResourceType);
- log.info(" Most frequently requested parameter combinations: {}",
+ logger.info(" Most frequently requested parameter combinations: {}",
requestsByParameters);
- log.info(" Matching relays per request: {}", matchingRelayDocuments);
- log.info(" Matching bridges per request: {}", matchingBridgeDocuments);
- log.info(" Written characters per response: {}", writtenChars);
- log.info(" Milliseconds to handle request: {}", handleRequestMillis);
- log.info(" Milliseconds to build response: {}", buildResponseMillis);
+ logger.info(" Matching relays per request: {}",
+ matchingRelayDocuments);
+ logger.info(" Matching bridges per request: {}",
+ matchingBridgeDocuments);
+ logger.info(" Written characters per response: {}",
+ writtenChars);
+ logger.info(" Milliseconds to handle request: {}",
+ handleRequestMillis);
+ logger.info(" Milliseconds to build response: {}",
+ buildResponseMillis);
totalProcessedRequests.clear();
requestsByResourceType.clear();
requestsByParameters.clear();
@@ -94,7 +99,7 @@ public class PerformanceMetrics {
totalProcessedRequests.increment();
long handlingTime = parsedRequestMillis - receivedRequestMillis;
if (handlingTime > DateTimeHelper.ONE_SECOND) {
- log.warn("longer request handling: {} ms for {} params: {} and {} "
+ logger.warn("longer request handling: {} ms for {} params: {} and {} "
+ "chars.", handlingTime, resourceType, parameterKeys,
charsWritten);
}
@@ -106,7 +111,7 @@ public class PerformanceMetrics {
writtenChars.addLong(charsWritten);
long responseTime = writtenResponseMillis - parsedRequestMillis;
if (responseTime > DateTimeHelper.ONE_SECOND) {
- log.warn("longer response building: {} ms for {} params: {} and {} "
+ logger.warn("longer response building: {} ms for {} params: {} and {} "
+ "chars.", responseTime, resourceType, parameterKeys,
charsWritten);
}
diff --git a/src/main/java/org/torproject/metrics/onionoo/server/ServerMain.java b/src/main/java/org/torproject/metrics/onionoo/server/ServerMain.java
index 8bc2fa4..0cab37a 100644
--- a/src/main/java/org/torproject/metrics/onionoo/server/ServerMain.java
+++ b/src/main/java/org/torproject/metrics/onionoo/server/ServerMain.java
@@ -11,21 +11,21 @@ import org.slf4j.LoggerFactory;
public class ServerMain {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
ServerMain.class);
/** Starts the web server listening for incoming client connections. */
public static void main(String[] args) {
try {
Resource onionooXml = Resource.newSystemResource("jetty.xml");
- log.info("Reading configuration from '{}'.", onionooXml);
+ logger.info("Reading configuration from '{}'.", onionooXml);
XmlConfiguration configuration = new XmlConfiguration(
onionooXml.getInputStream());
Server server = (Server) configuration.configure();
server.start();
server.join();
} catch (Exception ex) {
- log.error("Exiting, because of: {}", ex.getMessage(), ex);
+ logger.error("Exiting, because of: {}", ex.getMessage(), ex);
System.exit(1);
}
}
diff --git a/src/main/java/org/torproject/metrics/onionoo/updater/DescriptorQueue.java b/src/main/java/org/torproject/metrics/onionoo/updater/DescriptorQueue.java
index 8ebae37..972bde8 100644
--- a/src/main/java/org/torproject/metrics/onionoo/updater/DescriptorQueue.java
+++ b/src/main/java/org/torproject/metrics/onionoo/updater/DescriptorQueue.java
@@ -24,7 +24,7 @@ import java.util.TreeMap;
class DescriptorQueue {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
DescriptorQueue.class);
private File statusDir;
@@ -89,12 +89,12 @@ class DescriptorQueue {
String[] parts = line.split(" ", 2);
excludedFiles.put(parts[1], Long.parseLong(parts[0]));
} catch (NumberFormatException e) {
- log.error("Illegal line '{}' in parse history. Skipping line.",
+ logger.error("Illegal line '{}' in parse history. Skipping line.",
line);
}
}
} catch (IOException e) {
- log.error("Could not read history file '{}'. Not excluding "
+ logger.error("Could not read history file '{}'. Not excluding "
+ "descriptors in this execution.",
this.historyFile.getAbsolutePath(), e);
return;
@@ -109,8 +109,8 @@ class DescriptorQueue {
return;
}
if (null == this.descriptors) {
- log.debug("Not writing history file {}, because we did not read a single "
- + "descriptor from {}.", this.historyFile, this.directory);
+ logger.debug("Not writing history file {}, because we did not read a "
+ + "single descriptor from {}.", this.historyFile, this.directory);
return;
}
SortedMap<String, Long> excludedAndParsedFiles = new TreeMap<>();
@@ -127,8 +127,9 @@ class DescriptorQueue {
bw.write(lastModifiedMillis + " " + absolutePath + "\n");
}
} catch (IOException e) {
- log.error("Could not write history file '{}'. Not excluding descriptors "
- + "in next execution.", this.historyFile.getAbsolutePath());
+ logger.error("Could not write history file '{}'. Not excluding "
+ + "descriptors in next execution.",
+ this.historyFile.getAbsolutePath());
}
}
@@ -142,8 +143,8 @@ class DescriptorQueue {
this.descriptors = this.descriptorReader.readDescriptors(
this.directory).iterator();
} else {
- log.error("Directory {} either does not exist or is not a directory. "
- + "Not adding to descriptor reader.",
+ logger.error("Directory {} either does not exist or is not a "
+ + "directory. Not adding to descriptor reader.",
this.directory.getAbsolutePath());
return null;
}
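The DescriptorQueue history file visible in these hunks stores one descriptor file per line as `"<last-modified-millis> <absolute-path>"`. Splitting with a limit of 2 keeps paths containing spaces intact, and malformed lines are skipped with an error, as above. A stdlib sketch of that reader (helper names are illustrative):

```java
import java.util.SortedMap;
import java.util.TreeMap;

public class ParseHistory {

    /**
     * Reads "<millis> <path>" lines into a path-to-timestamp map,
     * skipping lines whose first token is not a number -- the same
     * tolerance the DescriptorQueue hunks show.
     */
    public static SortedMap<String, Long> read(String fileContents) {
        SortedMap<String, Long> excludedFiles = new TreeMap<>();
        for (String line : fileContents.split("\n")) {
            try {
                String[] parts = line.split(" ", 2);
                if (parts.length == 2) {
                    excludedFiles.put(parts[1], Long.parseLong(parts[0]));
                }
            } catch (NumberFormatException e) {
                // real code: logger.error("Illegal line '{}' in parse history. ...")
            }
        }
        return excludedFiles;
    }

    public static void main(String[] args) {
        SortedMap<String, Long> h =
            read("1583020800000 /in/recent/relay descriptors/file-1\nbad line\n");
        System.out.println(h.size()); // 1
        System.out.println(h.get("/in/recent/relay descriptors/file-1")); // 1583020800000
    }
}
```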
diff --git a/src/main/java/org/torproject/metrics/onionoo/updater/DescriptorSource.java b/src/main/java/org/torproject/metrics/onionoo/updater/DescriptorSource.java
index 27be94d..22f9127 100644
--- a/src/main/java/org/torproject/metrics/onionoo/updater/DescriptorSource.java
+++ b/src/main/java/org/torproject/metrics/onionoo/updater/DescriptorSource.java
@@ -20,7 +20,7 @@ import java.util.Set;
public class DescriptorSource {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
DescriptorSource.class);
private final File inDir = new File("in");
@@ -111,29 +111,31 @@ public class DescriptorSource {
* any registered listeners. */
public void readDescriptors() {
this.readArchivedDescriptors();
- log.debug("Reading recent {} ...", DescriptorType.RELAY_SERVER_DESCRIPTORS);
+ logger.debug("Reading recent {} ...",
+ DescriptorType.RELAY_SERVER_DESCRIPTORS);
this.readDescriptors(DescriptorType.RELAY_SERVER_DESCRIPTORS,
DescriptorHistory.RELAY_SERVER_HISTORY, true);
- log.debug("Reading recent {} ...", DescriptorType.RELAY_EXTRA_INFOS);
+ logger.debug("Reading recent {} ...", DescriptorType.RELAY_EXTRA_INFOS);
this.readDescriptors(DescriptorType.RELAY_EXTRA_INFOS,
DescriptorHistory.RELAY_EXTRAINFO_HISTORY, true);
- log.debug("Reading recent {} ...", DescriptorType.EXIT_LISTS);
+ logger.debug("Reading recent {} ...", DescriptorType.EXIT_LISTS);
this.readDescriptors(DescriptorType.EXIT_LISTS,
DescriptorHistory.EXIT_LIST_HISTORY, true);
- log.debug("Reading recent {} ...", DescriptorType.RELAY_CONSENSUSES);
+ logger.debug("Reading recent {} ...", DescriptorType.RELAY_CONSENSUSES);
this.readDescriptors(DescriptorType.RELAY_CONSENSUSES,
DescriptorHistory.RELAY_CONSENSUS_HISTORY, true);
- log.debug("Reading recent {} ...",
+ logger.debug("Reading recent {} ...",
DescriptorType.BRIDGE_SERVER_DESCRIPTORS);
this.readDescriptors(DescriptorType.BRIDGE_SERVER_DESCRIPTORS,
DescriptorHistory.BRIDGE_SERVER_HISTORY, false);
- log.debug("Reading recent {} ...", DescriptorType.BRIDGE_EXTRA_INFOS);
+ logger.debug("Reading recent {} ...", DescriptorType.BRIDGE_EXTRA_INFOS);
this.readDescriptors(DescriptorType.BRIDGE_EXTRA_INFOS,
DescriptorHistory.BRIDGE_EXTRAINFO_HISTORY, false);
- log.debug("Reading recent {} ...", DescriptorType.BRIDGE_STATUSES);
+ logger.debug("Reading recent {} ...", DescriptorType.BRIDGE_STATUSES);
this.readDescriptors(DescriptorType.BRIDGE_STATUSES,
DescriptorHistory.BRIDGE_STATUS_HISTORY, false);
- log.debug("Reading recent {} ...", DescriptorType.BRIDGE_POOL_ASSIGNMENTS);
+ logger.debug("Reading recent {} ...",
+ DescriptorType.BRIDGE_POOL_ASSIGNMENTS);
this.readDescriptors(DescriptorType.BRIDGE_POOL_ASSIGNMENTS,
DescriptorHistory.BRIDGE_POOL_ASSIGNMENTS_HISTORY, false);
}
@@ -154,7 +156,7 @@ public class DescriptorSource {
}
}
}
- log.info("Read recent/{}.", descriptorType.getDir());
+ logger.info("Read recent/{}.", descriptorType.getDir());
}
/** Reads archived descriptors from disk and feeds them into any
@@ -163,7 +165,7 @@ public class DescriptorSource {
if (!this.inArchiveDir.exists()) {
return;
}
- log.info("Reading archived descriptors...");
+ logger.info("Reading archived descriptors...");
this.archiveDescriptorQueue = new DescriptorQueue(this.inArchiveDir,
null, this.statusDir);
this.archiveDescriptorQueue.readHistoryFile(
@@ -204,8 +206,8 @@ public class DescriptorSource {
}
}
if (descriptorType == null) {
- log.warn("Unrecognized descriptor in {} with annotations {}. Skipping "
- + "descriptor.", this.inArchiveDir.getAbsolutePath(),
+ logger.warn("Unrecognized descriptor in {} with annotations {}. "
+ + "Skipping descriptor.", this.inArchiveDir.getAbsolutePath(),
descriptor.getAnnotations());
continue;
}
@@ -215,12 +217,12 @@ public class DescriptorSource {
}
}
this.archiveDescriptorQueue.writeHistoryFile();
- log.info("Read archived descriptors");
+ logger.info("Read archived descriptors");
}
/** Writes parse histories for recent descriptors to disk. */
public void writeHistoryFiles() {
- log.debug("Writing parse histories for recent descriptors...");
+ logger.debug("Writing parse histories for recent descriptors...");
for (DescriptorQueue descriptorQueue : this.recentDescriptorQueues) {
descriptorQueue.writeHistoryFile();
}
diff --git a/src/main/java/org/torproject/metrics/onionoo/updater/LookupService.java b/src/main/java/org/torproject/metrics/onionoo/updater/LookupService.java
index 9a9dad5..32cc112 100644
--- a/src/main/java/org/torproject/metrics/onionoo/updater/LookupService.java
+++ b/src/main/java/org/torproject/metrics/onionoo/updater/LookupService.java
@@ -29,7 +29,7 @@ import java.util.regex.Pattern;
public class LookupService {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
LookupService.class);
private File geoipDir;
@@ -52,20 +52,20 @@ public class LookupService {
this.geoLite2CityBlocksIPv4CsvFile = new File(this.geoipDir,
"GeoLite2-City-Blocks-IPv4.csv");
if (!this.geoLite2CityBlocksIPv4CsvFile.exists()) {
- log.error("No GeoLite2-City-Blocks-IPv4.csv file in geoip/.");
+ logger.error("No GeoLite2-City-Blocks-IPv4.csv file in geoip/.");
return;
}
this.geoLite2CityLocationsEnCsvFile = new File(this.geoipDir,
"GeoLite2-City-Locations-en.csv");
if (!this.geoLite2CityLocationsEnCsvFile.exists()) {
- log.error("No GeoLite2-City-Locations-en.csv file in "
+ logger.error("No GeoLite2-City-Locations-en.csv file in "
+ "geoip/.");
return;
}
this.geoLite2AsnBlocksIpv4CsvFile = new File(this.geoipDir,
"GeoLite2-ASN-Blocks-IPv4.csv");
if (!this.geoLite2AsnBlocksIpv4CsvFile.exists()) {
- log.error("No GeoLite2-ASN-Blocks-IPv4.csv file in geoip/.");
+ logger.error("No GeoLite2-ASN-Blocks-IPv4.csv file in geoip/.");
return;
}
this.hasAllFiles = true;
@@ -135,7 +135,7 @@ public class LookupService {
while ((line = br.readLine()) != null) {
String[] parts = line.split(",", -1);
if (parts.length < 9) {
- log.error("Illegal line '{}' in {}.", line,
+ logger.error("Illegal line '{}' in {}.", line,
this.geoLite2CityBlocksIPv4CsvFile.getAbsolutePath());
return lookupResults;
}
@@ -144,14 +144,14 @@ public class LookupService {
String startAddressString = networkAddressAndMask[0];
long startIpNum = this.parseAddressString(startAddressString);
if (startIpNum < 0L) {
- log.error("Illegal IP address in '{}' in {}.", line,
+ logger.error("Illegal IP address in '{}' in {}.", line,
this.geoLite2CityBlocksIPv4CsvFile.getAbsolutePath());
return lookupResults;
}
int networkMaskLength = networkAddressAndMask.length < 2 ? 0
: Integer.parseInt(networkAddressAndMask[1]);
if (networkMaskLength < 8 || networkMaskLength > 32) {
- log.error("Missing or illegal network mask in '{}' in {}.", line,
+ logger.error("Missing or illegal network mask in '{}' in {}.", line,
this.geoLite2CityBlocksIPv4CsvFile.getAbsolutePath());
return lookupResults;
}
@@ -173,13 +173,13 @@ public class LookupService {
}
}
} catch (NumberFormatException e) {
- log.error("Number format exception while parsing line '{}' in {}.",
+ logger.error("Number format exception while parsing line '{}' in {}.",
line, this.geoLite2CityBlocksIPv4CsvFile.getAbsolutePath(), e);
return lookupResults;
}
}
} catch (IOException e) {
- log.error("I/O exception while reading {}: {}",
+ logger.error("I/O exception while reading {}: {}",
this.geoLite2CityBlocksIPv4CsvFile.getAbsolutePath(), e);
return lookupResults;
}
@@ -194,7 +194,7 @@ public class LookupService {
while ((line = br.readLine()) != null) {
String[] parts = line.replaceAll("\"", "").split(",", 13);
if (parts.length != 13) {
- log.error("Illegal line '{}' in {}.", line,
+ logger.error("Illegal line '{}' in {}.", line,
this.geoLite2CityLocationsEnCsvFile.getAbsolutePath());
return lookupResults;
}
@@ -205,13 +205,13 @@ public class LookupService {
blockLocations.put(locId, line);
}
} catch (NumberFormatException e) {
- log.error("Number format exception while parsing line '{}' in {}.",
+ logger.error("Number format exception while parsing line '{}' in {}.",
line, this.geoLite2CityLocationsEnCsvFile.getAbsolutePath());
return lookupResults;
}
}
} catch (IOException e) {
- log.error("I/O exception while reading {}: {}",
+ logger.error("I/O exception while reading {}: {}",
this.geoLite2CityLocationsEnCsvFile.getAbsolutePath(), e);
return lookupResults;
}
@@ -228,7 +228,7 @@ public class LookupService {
while ((line = br.readLine()) != null) {
String[] parts = line.replaceAll("\"", "").split(",", 3);
if (parts.length != 3) {
- log.error("Illegal line '{}' in {}.", line,
+ logger.error("Illegal line '{}' in {}.", line,
this.geoLite2AsnBlocksIpv4CsvFile.getAbsolutePath());
return lookupResults;
}
@@ -237,14 +237,14 @@ public class LookupService {
String startAddressString = networkAddressAndMask[0];
long startIpNum = this.parseAddressString(startAddressString);
if (startIpNum < 0L) {
- log.error("Illegal IP address in '{}' in {}.", line,
+ logger.error("Illegal IP address in '{}' in {}.", line,
this.geoLite2AsnBlocksIpv4CsvFile.getAbsolutePath());
return lookupResults;
}
int networkMaskLength = networkAddressAndMask.length < 2 ? 0
: Integer.parseInt(networkAddressAndMask[1]);
if (networkMaskLength < 8 || networkMaskLength > 32) {
- log.error("Missing or illegal network mask in '{}' in {}.", line,
+ logger.error("Missing or illegal network mask in '{}' in {}.", line,
this.geoLite2AsnBlocksIpv4CsvFile.getAbsolutePath());
return lookupResults;
}
@@ -275,13 +275,13 @@ public class LookupService {
break;
}
} catch (NumberFormatException e) {
- log.error("Number format exception while parsing line '{}' in {}.",
+ logger.error("Number format exception while parsing line '{}' in {}.",
line, this.geoLite2AsnBlocksIpv4CsvFile.getAbsolutePath());
return lookupResults;
}
}
} catch (IOException e) {
- log.error("I/O exception while reading {}: {}",
+ logger.error("I/O exception while reading {}: {}",
this.geoLite2AsnBlocksIpv4CsvFile.getAbsolutePath(), e);
return lookupResults;
}
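The LookupService hunks repeatedly split a GeoLite2 network field of the form `"a.b.c.d/len"`, convert the address to a number, and reject mask lengths outside 8..32 before building the range table. A self-contained sketch of that parsing, assuming the same CSV layout (the class is illustrative, not Onionoo's):

```java
public class CidrParser {

    /**
     * Parses "a.b.c.d/len" into the numeric start address, returning -1
     * for a malformed address or a missing/illegal mask length -- the two
     * error cases logged in the hunks above.
     */
    public static long parseStartAddress(String network) {
        String[] addressAndMask = network.split("/");
        String[] octets = addressAndMask[0].split("\\.");
        if (octets.length != 4) {
            return -1L;
        }
        long ipNum = 0L;
        for (String octet : octets) {
            int value = Integer.parseInt(octet);
            if (value < 0 || value > 255) {
                return -1L;
            }
            ipNum = (ipNum << 8) | value;
        }
        int maskLength = addressAndMask.length < 2 ? 0
            : Integer.parseInt(addressAndMask[1]);
        if (maskLength < 8 || maskLength > 32) {
            return -1L; // "Missing or illegal network mask ..."
        }
        return ipNum;
    }

    public static void main(String[] args) {
        System.out.println(parseStartAddress("1.0.0.0/24")); // 16777216
        System.out.println(parseStartAddress("1.0.0.0"));    // -1 (no mask)
    }
}
```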
diff --git a/src/main/java/org/torproject/metrics/onionoo/updater/NodeDetailsStatusUpdater.java b/src/main/java/org/torproject/metrics/onionoo/updater/NodeDetailsStatusUpdater.java
index ce809aa..d59c533 100644
--- a/src/main/java/org/torproject/metrics/onionoo/updater/NodeDetailsStatusUpdater.java
+++ b/src/main/java/org/torproject/metrics/onionoo/updater/NodeDetailsStatusUpdater.java
@@ -67,7 +67,7 @@ import java.util.TreeSet;
public class NodeDetailsStatusUpdater implements DescriptorListener,
StatusUpdater {
- private Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
NodeDetailsStatusUpdater.class);
private DescriptorSource descriptorSource;
@@ -403,19 +403,19 @@ public class NodeDetailsStatusUpdater implements DescriptorListener,
@Override
public void updateStatuses() {
this.readNodeStatuses();
- log.info("Read node statuses");
+ logger.info("Read node statuses");
this.startReverseDomainNameLookups();
- log.info("Started reverse domain name lookups");
+ logger.info("Started reverse domain name lookups");
this.lookUpCitiesAndASes();
- log.info("Looked up cities and ASes");
+ logger.info("Looked up cities and ASes");
this.calculatePathSelectionProbabilities();
- log.info("Calculated path selection probabilities");
+ logger.info("Calculated path selection probabilities");
this.computeEffectiveAndExtendedFamilies();
- log.info("Computed effective and extended families");
+ logger.info("Computed effective and extended families");
this.finishReverseDomainNameLookups();
- log.info("Finished reverse domain name lookups");
+ logger.info("Finished reverse domain name lookups");
this.updateNodeDetailsStatuses();
- log.info("Updated node and details statuses");
+ logger.info("Updated node and details statuses");
}
/* Step 2: read node statuses from disk. */
@@ -571,8 +571,7 @@ public class NodeDetailsStatusUpdater implements DescriptorListener,
addressStrings.add(nodeStatus.getAddress());
}
if (addressStrings.isEmpty()) {
- log.error("No relay IP addresses to resolve to cities or "
- + "ASN.");
+ logger.error("No relay IP addresses to resolve to cities or ASN.");
return;
}
SortedMap<String, LookupResult> lookupResults =
@@ -621,7 +620,7 @@ public class NodeDetailsStatusUpdater implements DescriptorListener,
wed = ((double) this.lastBandwidthWeights.get("Wed")) / 10000.0;
}
} else {
- log.debug("Not calculating new path selection probabilities, "
+ logger.debug("Not calculating new path selection probabilities, "
+ "because we could not determine most recent Wxx parameter "
+ "values, probably because we didn't parse a consensus in "
+ "this execution.");
diff --git a/src/main/java/org/torproject/metrics/onionoo/updater/StatusUpdateRunner.java b/src/main/java/org/torproject/metrics/onionoo/updater/StatusUpdateRunner.java
index 65ff859..efbd0d4 100644
--- a/src/main/java/org/torproject/metrics/onionoo/updater/StatusUpdateRunner.java
+++ b/src/main/java/org/torproject/metrics/onionoo/updater/StatusUpdateRunner.java
@@ -10,7 +10,7 @@ import java.io.File;
public class StatusUpdateRunner {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
StatusUpdateRunner.class);
private LookupService ls;
@@ -37,9 +37,9 @@ public class StatusUpdateRunner {
/** Lets each configured status updater update its status files. */
public void updateStatuses() {
for (StatusUpdater su : this.statusUpdaters) {
- log.debug("Begin update of {}", su.getClass().getSimpleName());
+ logger.debug("Begin update of {}", su.getClass().getSimpleName());
su.updateStatuses();
- log.info("{} updated status files", su.getClass().getSimpleName());
+ logger.info("{} updated status files", su.getClass().getSimpleName());
}
}
@@ -48,14 +48,11 @@ public class StatusUpdateRunner {
for (StatusUpdater su : this.statusUpdaters) {
String statsString = su.getStatsString();
if (statsString != null) {
- LoggerFactory.getLogger("statistics").info("{}\n{}",
- su.getClass().getSimpleName(), statsString);
+ logger.info("{}\n{}", su.getClass().getSimpleName(), statsString);
}
}
- LoggerFactory.getLogger("statistics")
- .info("GeoIP lookup service\n{}", this.ls.getStatsString());
- LoggerFactory.getLogger("statistics")
- .info("Reverse domain name resolver\n{}", this.rdnr.getStatsString());
+ logger.info("GeoIP lookup service\n{}", this.ls.getStatsString());
+ logger.info("Reverse domain name resolver\n{}", this.rdnr.getStatsString());
}
}
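Worth noting: the StatusUpdateRunner hunk does more than rename. It replaces the dedicated `LoggerFactory.getLogger("statistics")` with the class-based logger, so any backend configuration keyed on the `statistics` category (separate appender, level) would presumably stop receiving the stats output. Logger categories are just names; a quick illustration with `java.util.logging` (stdlib), which names loggers the same way SLF4J backends do:

```java
import java.util.logging.Logger;

public class LoggerCategories {

    public static void main(String[] args) {
        // A named logger and a class-based logger are distinct categories,
        // even inside the same class; routing rules bound to one do not
        // apply to the other.
        Logger stats = Logger.getLogger("statistics");
        Logger clazz = Logger.getLogger(LoggerCategories.class.getName());
        System.out.println(stats.getName()); // statistics
        System.out.println(clazz.getName()); // LoggerCategories
        System.out.println(stats == clazz);  // false
    }
}
```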
diff --git a/src/main/java/org/torproject/metrics/onionoo/util/FormattingUtils.java b/src/main/java/org/torproject/metrics/onionoo/util/FormattingUtils.java
index b1bae46..9f713af 100644
--- a/src/main/java/org/torproject/metrics/onionoo/util/FormattingUtils.java
+++ b/src/main/java/org/torproject/metrics/onionoo/util/FormattingUtils.java
@@ -14,7 +14,7 @@ import java.util.regex.Pattern;
/** Static helper methods for string processing etc. */
public class FormattingUtils {
- private static Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
FormattingUtils.class);
private FormattingUtils() {
@@ -66,7 +66,7 @@ public class FormattingUtils {
mat.appendTail(sb);
return sb.toString();
} catch (Throwable ex) {
- log.debug("Couldn't process input '{}'.", text, ex);
+ logger.debug("Couldn't process input '{}'.", text, ex);
return text;
}
}
diff --git a/src/main/java/org/torproject/metrics/onionoo/writer/BandwidthDocumentWriter.java b/src/main/java/org/torproject/metrics/onionoo/writer/BandwidthDocumentWriter.java
index 18317d9..2715682 100644
--- a/src/main/java/org/torproject/metrics/onionoo/writer/BandwidthDocumentWriter.java
+++ b/src/main/java/org/torproject/metrics/onionoo/writer/BandwidthDocumentWriter.java
@@ -21,7 +21,7 @@ import java.util.SortedSet;
public class BandwidthDocumentWriter implements DocumentWriter {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
BandwidthDocumentWriter.class);
private DocumentStore documentStore;
@@ -48,7 +48,7 @@ public class BandwidthDocumentWriter implements DocumentWriter {
fingerprint, mostRecentStatusMillis, bandwidthStatus);
this.documentStore.store(bandwidthDocument, fingerprint);
}
- log.info("Wrote bandwidth document files");
+ logger.info("Wrote bandwidth document files");
}
diff --git a/src/main/java/org/torproject/metrics/onionoo/writer/ClientsDocumentWriter.java b/src/main/java/org/torproject/metrics/onionoo/writer/ClientsDocumentWriter.java
index 33b8a99..dcb935c 100644
--- a/src/main/java/org/torproject/metrics/onionoo/writer/ClientsDocumentWriter.java
+++ b/src/main/java/org/torproject/metrics/onionoo/writer/ClientsDocumentWriter.java
@@ -43,7 +43,7 @@ import java.util.SortedSet;
*/
public class ClientsDocumentWriter implements DocumentWriter {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
ClientsDocumentWriter.class);
private DocumentStore documentStore;
@@ -74,7 +74,7 @@ public class ClientsDocumentWriter implements DocumentWriter {
this.documentStore.store(clientsDocument, hashedFingerprint);
this.writtenDocuments++;
}
- log.info("Wrote clients document files");
+ logger.info("Wrote clients document files");
}
private String[] graphNames = new String[] {
diff --git a/src/main/java/org/torproject/metrics/onionoo/writer/DetailsDocumentWriter.java b/src/main/java/org/torproject/metrics/onionoo/writer/DetailsDocumentWriter.java
index 29d9244..0b9a36e 100644
--- a/src/main/java/org/torproject/metrics/onionoo/writer/DetailsDocumentWriter.java
+++ b/src/main/java/org/torproject/metrics/onionoo/writer/DetailsDocumentWriter.java
@@ -22,7 +22,7 @@ import java.util.TreeSet;
public class DetailsDocumentWriter implements DocumentWriter {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
DetailsDocumentWriter.class);
private DocumentStore documentStore;
@@ -48,7 +48,7 @@ public class DetailsDocumentWriter implements DocumentWriter {
this.updateBridgeDetailsFile(fingerprint, detailsStatus);
}
}
- log.info("Wrote details document files");
+ logger.info("Wrote details document files");
}
private void updateRelayDetailsFile(String fingerprint,
diff --git a/src/main/java/org/torproject/metrics/onionoo/writer/DocumentWriterRunner.java b/src/main/java/org/torproject/metrics/onionoo/writer/DocumentWriterRunner.java
index 99b627e..963b648 100644
--- a/src/main/java/org/torproject/metrics/onionoo/writer/DocumentWriterRunner.java
+++ b/src/main/java/org/torproject/metrics/onionoo/writer/DocumentWriterRunner.java
@@ -12,7 +12,7 @@ import org.slf4j.LoggerFactory;
public class DocumentWriterRunner {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
DocumentWriterRunner.class);
private DocumentWriter[] documentWriters;
@@ -34,7 +34,7 @@ public class DocumentWriterRunner {
public void writeDocuments() {
long mostRecentStatusMillis = retrieveMostRecentStatusMillis();
for (DocumentWriter dw : this.documentWriters) {
- log.debug("Writing {}", dw.getClass().getSimpleName());
+ logger.debug("Writing {}", dw.getClass().getSimpleName());
dw.writeDocuments(mostRecentStatusMillis);
}
}
@@ -56,7 +56,7 @@ public class DocumentWriterRunner {
for (DocumentWriter dw : this.documentWriters) {
String statsString = dw.getStatsString();
if (statsString != null) {
- log.info("{}\n{}", dw.getClass().getSimpleName(), statsString);
+ logger.info("{}\n{}", dw.getClass().getSimpleName(), statsString);
}
}
}
diff --git a/src/main/java/org/torproject/metrics/onionoo/writer/SummaryDocumentWriter.java b/src/main/java/org/torproject/metrics/onionoo/writer/SummaryDocumentWriter.java
index bcdb370..5975c6c 100644
--- a/src/main/java/org/torproject/metrics/onionoo/writer/SummaryDocumentWriter.java
+++ b/src/main/java/org/torproject/metrics/onionoo/writer/SummaryDocumentWriter.java
@@ -19,7 +19,7 @@ import java.util.SortedSet;
public class SummaryDocumentWriter implements DocumentWriter {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
SummaryDocumentWriter.class);
private DocumentStore documentStore;
@@ -108,7 +108,7 @@ public class SummaryDocumentWriter implements DocumentWriter {
this.writtenDocuments++;
}
}
- log.info("Wrote summary document files");
+ logger.info("Wrote summary document files");
}
@Override
diff --git a/src/main/java/org/torproject/metrics/onionoo/writer/UptimeDocumentWriter.java b/src/main/java/org/torproject/metrics/onionoo/writer/UptimeDocumentWriter.java
index f03b730..28ed9fd 100644
--- a/src/main/java/org/torproject/metrics/onionoo/writer/UptimeDocumentWriter.java
+++ b/src/main/java/org/torproject/metrics/onionoo/writer/UptimeDocumentWriter.java
@@ -26,7 +26,7 @@ import java.util.TreeSet;
public class UptimeDocumentWriter implements DocumentWriter {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
UptimeDocumentWriter.class);
private DocumentStore documentStore;
@@ -52,7 +52,7 @@ public class UptimeDocumentWriter implements DocumentWriter {
for (String fingerprint : updatedUptimeStatuses) {
this.updateDocument(fingerprint, mostRecentStatusMillis, uptimeStatus);
}
- log.info("Wrote uptime document files");
+ logger.info("Wrote uptime document files");
}
private int writtenDocuments = 0;
diff --git a/src/main/java/org/torproject/metrics/onionoo/writer/WeightsDocumentWriter.java b/src/main/java/org/torproject/metrics/onionoo/writer/WeightsDocumentWriter.java
index ceda9ef..cfd1123 100644
--- a/src/main/java/org/torproject/metrics/onionoo/writer/WeightsDocumentWriter.java
+++ b/src/main/java/org/torproject/metrics/onionoo/writer/WeightsDocumentWriter.java
@@ -21,7 +21,7 @@ import java.util.SortedSet;
public class WeightsDocumentWriter implements DocumentWriter {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
WeightsDocumentWriter.class);
private DocumentStore documentStore;
@@ -49,7 +49,7 @@ public class WeightsDocumentWriter implements DocumentWriter {
fingerprint, history, mostRecentStatusMillis);
this.documentStore.store(weightsDocument, fingerprint);
}
- log.info("Wrote weights document files");
+ logger.info("Wrote weights document files");
}
private String[] graphNames = new String[] {
diff --git a/src/main/resources/logback.xml b/src/main/resources/logback.xml
deleted file mode 100644
index d61be28..0000000
--- a/src/main/resources/logback.xml
+++ /dev/null
@@ -1,82 +0,0 @@
-<configuration debug="false">
-
- <!-- a path and a prefix -->
- <property name="logfile-base" value="${LOGBASE}/onionoo-" />
-
- <!-- log file names -->
- <property name="fileall-logname" value="${logfile-base}all" />
- <property name="fileerr-logname" value="${logfile-base}err" />
- <property name="filestatistics-logname" value="${logfile-base}statistics" />
-
- <!-- date pattern -->
- <property name="utc-date-pattern" value="%date{ISO8601, UTC}" />
-
- <!-- appender section -->
- <appender name="FILEALL" class="ch.qos.logback.core.rolling.RollingFileAppender">
- <file>${fileall-logname}.log</file>
- <encoder>
- <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
- </encoder>
- <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
- <!-- rollover daily -->
- <FileNamePattern>${fileall-logname}.%d{yyyy-MM-dd}.%i.log</FileNamePattern>
- <maxHistory>10</maxHistory>
- <timeBasedFileNamingAndTriggeringPolicy
- class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
- <!-- or whenever the file size reaches 1MB -->
- <maxFileSize>1MB</maxFileSize>
- </timeBasedFileNamingAndTriggeringPolicy>
- </rollingPolicy>
- </appender>
-
- <appender name="FILEERR" class="ch.qos.logback.core.FileAppender">
- <file>${fileerr-logname}.log</file>
- <encoder>
- <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
- </encoder>
-
- <!-- ERROR or worse -->
- <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
- <level>ERROR</level>
- </filter>
- </appender>
-
- <appender name="FILESTATISTICS" class="ch.qos.logback.core.FileAppender">
- <file>${filestatistics-logname}.log</file>
- <encoder>
- <pattern>${utc-date-pattern} %msg%n</pattern>
- </encoder>
-
- <!-- only INFO level -->
- <filter class="ch.qos.logback.classic.filter.LevelFilter">
- <level>INFO</level>
- <onMatch>ACCEPT</onMatch>
- <onMismatch>DENY</onMismatch>
- </filter>
- </appender>
-
- <!-- logger section -->
- <logger name="org.torproject" >
- <appender-ref ref="FILEERR" />
- </logger>
-
- <logger name="org.eclipse" level="INFO" />
-
- <logger name="org.torproject.metrics.onionoo.cron.Main" >
- <appender-ref ref="FILESTATISTICS" />
- </logger>
-
- <logger name="org.torproject.metrics.onionoo.server.PerformanceMetrics" >
- <appender-ref ref="FILESTATISTICS" />
- </logger>
-
- <logger name="statistics" >
- <appender-ref ref="FILESTATISTICS" />
- </logger>
-
- <root level="ALL">
- <appender-ref ref="FILEALL" />
- </root>
-
-</configuration>
-
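The hunks above all apply one rename: a non-final `log` field becomes `private static final Logger logger`, while the call sites keep SLF4J's `{}` placeholder style (e.g. `logger.info("Wrote {} document files", n)`). As a minimal illustration of why that style needs no string concatenation, here is a pure-stdlib sketch that mimics SLF4J's placeholder substitution; the `PlaceholderDemo` class and its `format` helper are hypothetical, real code calls `org.slf4j.Logger` directly.

```java
// Hypothetical mimic of SLF4J-style "{}" substitution, for illustration only.
final class PlaceholderDemo {

  // Substitute each "{}" with the next argument; leftover "{}" stay literal.
  static String format(String pattern, Object... args) {
    StringBuilder sb = new StringBuilder();
    int argIndex = 0;
    int i = 0;
    while (i < pattern.length()) {
      int brace = pattern.indexOf("{}", i);
      if (brace < 0 || argIndex >= args.length) {
        sb.append(pattern.substring(i));
        break;
      }
      sb.append(pattern, i, brace).append(args[argIndex++]);
      i = brace + 2;
    }
    return sb.toString();
  }

  public static void main(String[] args) {
    System.out.println(format("Wrote {} document files", 42));
  }
}
```

The substitution only happens when the message is actually emitted, which is why the parameterized form is preferred over eager `"..." + n` concatenation at disabled log levels.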
[translation/donatepages-messagespot] https://gitweb.torproject.org/translation.git/commit/?h=donatepages-messagespot
by translation@torproject.org 31 Mar '20
commit a2e50a49547be209fe01c0136f8a819238f47db7
Author: Translation commit bot <translation@torproject.org>
Date: Tue Mar 31 09:45:39 2020 +0000
https://gitweb.torproject.org/translation.git/commit/?h=donatepages-message…
---
locale/zh_CN/LC_MESSAGES/messages.po | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/locale/zh_CN/LC_MESSAGES/messages.po b/locale/zh_CN/LC_MESSAGES/messages.po
index a5c8a7d544..8568707c9a 100644
--- a/locale/zh_CN/LC_MESSAGES/messages.po
+++ b/locale/zh_CN/LC_MESSAGES/messages.po
@@ -1596,7 +1596,7 @@ msgstr ""
#: tmp/cache_locale/7d/7d56367a61f987367eeb2a89d0c6db83fd0801cce86278bf7e99ed39b5b46254.php:349
msgid "The account information is as follows:"
-msgstr ""
+msgstr "账户信息如下:"
#: tmp/cache_locale/7d/7d56367a61f987367eeb2a89d0c6db83fd0801cce86278bf7e99ed39b5b46254.php:353
msgid ""
@@ -1662,7 +1662,7 @@ msgstr ""
#: tmp/cache_locale/7d/7d56367a61f987367eeb2a89d0c6db83fd0801cce86278bf7e99ed39b5b46254.php:391
msgid "They also like donations of bandwidth from ISPs."
-msgstr ""
+msgstr "他们同样喜欢来自ISP的带宽捐献。"
#: tmp/cache_locale/7d/7d56367a61f987367eeb2a89d0c6db83fd0801cce86278bf7e99ed39b5b46254.php:395
msgid ""
commit 334550d6d5c50e064c49f6ee713cff561e137fc6
Author: Karsten Loesing <karsten.loesing@gmx.net>
Date: Tue Mar 31 10:30:46 2020 +0200
Simplify logging configuration.
Implements #33549.
---
CHANGELOG.md | 1 +
src/build | 2 +-
.../torproject/metrics/stats/bridgedb/Main.java | 16 +++---
.../org/torproject/metrics/stats/bwhist/Main.java | 14 +++---
.../bwhist/RelayDescriptorDatabaseImporter.java | 43 ++++++++--------
.../org/torproject/metrics/stats/clients/Main.java | 24 ++++-----
.../metrics/stats/connbidirect/Main.java | 44 ++++++++--------
.../metrics/stats/hidserv/Aggregator.java | 7 +--
.../stats/hidserv/ComputedNetworkFractions.java | 12 ++---
.../metrics/stats/hidserv/DocumentStore.java | 17 ++++---
.../stats/hidserv/ExtrapolatedHidServStats.java | 6 +--
.../metrics/stats/hidserv/Extrapolator.java | 5 +-
.../org/torproject/metrics/stats/hidserv/Main.java | 16 +++---
.../torproject/metrics/stats/hidserv/Parser.java | 19 +++----
.../stats/hidserv/ReportedHidServStats.java | 4 +-
.../torproject/metrics/stats/hidserv/Simulate.java | 12 ++---
.../org/torproject/metrics/stats/main/Main.java | 30 +++++------
.../torproject/metrics/stats/onionperf/Main.java | 22 ++++----
.../org/torproject/metrics/stats/servers/Main.java | 24 ++++-----
.../org/torproject/metrics/stats/totalcw/Main.java | 24 ++++-----
.../torproject/metrics/stats/webstats/Main.java | 32 ++++++------
.../org/torproject/metrics/web/ServerMain.java | 7 +--
.../org/torproject/metrics/web/UpdateNews.java | 8 +--
src/main/resources/logback.xml | 58 ----------------------
src/submods/metrics-lib | 2 +-
25 files changed, 201 insertions(+), 248 deletions(-)
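Several hunks below pass an exception as the last argument after the placeholder values, e.g. `logger.debug("Couldn't process input '{}'.", text, ex)`. SLF4J treats a trailing `Throwable` that is not consumed by a placeholder as the exception to print with a stack trace, not as message data. A pure-stdlib sketch of that extraction rule follows; the `ThrowableArgDemo` class is hypothetical, real code relies on `org.slf4j.Logger` implementing this convention internally.

```java
// Hypothetical sketch of SLF4J's trailing-Throwable convention.
final class ThrowableArgDemo {

  // Return the trailing Throwable if no "{}" placeholder is left to consume it,
  // otherwise null (it would then be formatted into the message instead).
  static Throwable trailingThrowable(String pattern, Object... args) {
    if (args.length == 0 || !(args[args.length - 1] instanceof Throwable)) {
      return null;
    }
    int placeholders = pattern.split("\\{\\}", -1).length - 1;
    return placeholders < args.length ? (Throwable) args[args.length - 1] : null;
  }

  public static void main(String[] args) {
    Throwable t = trailingThrowable("Couldn't process input '{}'.",
        "text", new RuntimeException("boom"));
    System.out.println(t != null ? "logged with stack trace" : "plain message");
  }
}
```

This is why the rewritten call sites can append the exception after the message arguments without adding a placeholder for it.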
diff --git a/CHANGELOG.md b/CHANGELOG.md
index b1571c6..323d0e7 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -14,6 +14,7 @@
links into https:// links.
- Set default locale `US` at the beginning of the execution.
- Set default time zone `UTC` at the beginning of the execution.
+ - Simplify logging configuration.
# Changes in version 1.3.0 - 2019-11-09
diff --git a/src/build b/src/build
index 264e498..fd85646 160000
--- a/src/build
+++ b/src/build
@@ -1 +1 @@
-Subproject commit 264e498f54a20f7d299daaf2533d043f880e6a8b
+Subproject commit fd856466bcb260f53ef69a24c102d0e49d171cc3
diff --git a/src/main/java/org/torproject/metrics/stats/bridgedb/Main.java b/src/main/java/org/torproject/metrics/stats/bridgedb/Main.java
index 989c695..f7e9fb4 100644
--- a/src/main/java/org/torproject/metrics/stats/bridgedb/Main.java
+++ b/src/main/java/org/torproject/metrics/stats/bridgedb/Main.java
@@ -24,7 +24,7 @@ import java.util.TreeMap;
public class Main {
- private static Logger log = LoggerFactory.getLogger(Main.class);
+ private static final Logger logger = LoggerFactory.getLogger(Main.class);
private static final Path bridgedbStatsCsvFile
= org.torproject.metrics.stats.main.Main.modulesDir.toPath()
@@ -55,7 +55,7 @@ public class Main {
}
String[] lineParts = line.split(",");
if (lineParts.length != 4) {
- log.warn("Skipping unrecognized line '{}' in {}.", line,
+ logger.warn("Skipping unrecognized line '{}' in {}.", line,
bridgedbStatsCsvFile.toAbsolutePath());
continue;
}
@@ -64,8 +64,8 @@ public class Main {
long value = Long.parseLong(lineParts[3]);
readStatistics.put(key, value);
}
- log.debug("Read {} containing {} non-header lines.", bridgedbStatsCsvFile,
- readStatistics.size());
+ logger.debug("Read {} containing {} non-header lines.",
+ bridgedbStatsCsvFile, readStatistics.size());
}
return readStatistics;
}
@@ -82,7 +82,7 @@ public class Main {
}
BridgedbMetrics bridgedbMetrics = (BridgedbMetrics) descriptor;
if (!"1".equals(bridgedbMetrics.bridgedbMetricsVersion())) {
- log.warn("Unable to process BridgeDB metrics version {} != 1.",
+ logger.warn("Unable to process BridgeDB metrics version {} != 1.",
bridgedbMetrics.bridgedbMetricsVersion());
continue;
}
@@ -100,7 +100,7 @@ public class Main {
continue;
}
if (bridgedbMetricCount.getValue() < 10) {
- log.warn("Skipping too small BridgeDB metric count {} < 10 in {}.",
+ logger.warn("Skipping too small BridgeDB metric count {} < 10 in {}.",
bridgedbMetricCount.getValue(),
descriptor.getDescriptorFile().getAbsolutePath());
continue;
@@ -141,8 +141,8 @@ public class Main {
statistic.getValue()));
}
Files.write(bridgedbStatsCsvFile, lines, StandardOpenOption.CREATE);
- log.debug("Wrote {} containing {} non-header lines.", bridgedbStatsCsvFile,
- lines.size() - 1);
+ logger.debug("Wrote {} containing {} non-header lines.",
+ bridgedbStatsCsvFile, lines.size() - 1);
}
}
diff --git a/src/main/java/org/torproject/metrics/stats/bwhist/Main.java b/src/main/java/org/torproject/metrics/stats/bwhist/Main.java
index 4e92f1c..e3befc0 100644
--- a/src/main/java/org/torproject/metrics/stats/bwhist/Main.java
+++ b/src/main/java/org/torproject/metrics/stats/bwhist/Main.java
@@ -15,7 +15,7 @@ import java.util.Arrays;
*/
public class Main {
- private static Logger log = LoggerFactory.getLogger(Main.class);
+ private static final Logger logger = LoggerFactory.getLogger(Main.class);
private static String[] paths = {
"recent/relay-descriptors/consensuses",
@@ -34,9 +34,9 @@ public class Main {
/** Executes this data-processing module. */
public static void main(String[] args) throws Exception {
- log.info("Starting bwhist module.");
+ logger.info("Starting bwhist module.");
- log.info("Reading descriptors and inserting relevant parts into the "
+ logger.info("Reading descriptors and inserting relevant parts into the "
+ "database.");
File[] descriptorDirectories = Arrays.stream(paths).map((String path)
-> new File(org.torproject.metrics.stats.main.Main.descriptorsDir,
@@ -47,17 +47,17 @@ public class Main {
historyFile, jdbcString);
database.importRelayDescriptors();
- log.info("Aggregating database entries.");
+ logger.info("Aggregating database entries.");
database.aggregate();
- log.info("Querying aggregated statistics from the database.");
+ logger.info("Querying aggregated statistics from the database.");
new Writer().write(new File(baseDir, "stats/bandwidth.csv").toPath(),
database.queryBandwidth());
- log.info("Closing database connection.");
+ logger.info("Closing database connection.");
database.closeConnection();
- log.info("Terminating bwhist module.");
+ logger.info("Terminating bwhist module.");
}
}
diff --git a/src/main/java/org/torproject/metrics/stats/bwhist/RelayDescriptorDatabaseImporter.java b/src/main/java/org/torproject/metrics/stats/bwhist/RelayDescriptorDatabaseImporter.java
index 7b08f77..2afbecf 100644
--- a/src/main/java/org/torproject/metrics/stats/bwhist/RelayDescriptorDatabaseImporter.java
+++ b/src/main/java/org/torproject/metrics/stats/bwhist/RelayDescriptorDatabaseImporter.java
@@ -85,7 +85,7 @@ public final class RelayDescriptorDatabaseImporter {
*/
private CallableStatement csH;
- private static Logger log
+ private static final Logger logger
= LoggerFactory.getLogger(RelayDescriptorDatabaseImporter.class);
/**
@@ -141,7 +141,7 @@ public final class RelayDescriptorDatabaseImporter {
+ "(date) VALUES (?)");
this.scheduledUpdates = new HashSet<>();
} catch (SQLException e) {
- log.warn("Could not connect to database or prepare statements.", e);
+ logger.warn("Could not connect to database or prepare statements.", e);
}
}
@@ -160,7 +160,7 @@ public final class RelayDescriptorDatabaseImporter {
this.dateTimeFormat.format(timestamp).substring(0, 10)
+ " 00:00:00").getTime();
} catch (ParseException e) {
- log.warn("Internal parsing error.", e);
+ logger.warn("Internal parsing error.", e);
return;
}
if (!this.scheduledUpdates.contains(dateMillis)) {
@@ -206,8 +206,8 @@ public final class RelayDescriptorDatabaseImporter {
insertedStatusEntries.add(fingerprint);
}
} catch (SQLException e) {
- log.warn("Could not add network status consensus entry. We won't make "
- + "any further SQL requests in this execution.", e);
+ logger.warn("Could not add network status consensus entry. We won't "
+ + "make any further SQL requests in this execution.", e);
this.importIntoDatabase = false;
}
}
@@ -304,7 +304,7 @@ public final class RelayDescriptorDatabaseImporter {
for (String bandwidthHistoryString : bandwidthHistoryStrings) {
String[] parts = bandwidthHistoryString.split(" ");
if (parts.length != 6) {
- log.debug("Bandwidth history line does not have expected "
+ logger.debug("Bandwidth history line does not have expected "
+ "number of elements. Ignoring this line.");
continue;
}
@@ -312,13 +312,13 @@ public final class RelayDescriptorDatabaseImporter {
try {
intervalLength = Long.parseLong(parts[3].substring(1));
} catch (NumberFormatException e) {
- log.debug("Bandwidth history line does not have valid interval length "
- + "'{} {}'. Ignoring this line.", parts[3], parts[4]);
+ logger.debug("Bandwidth history line does not have valid interval "
+ + "length '{} {}'. Ignoring this line.", parts[3], parts[4]);
continue;
}
String[] values = parts[5].split(",");
if (intervalLength % 900L != 0L) {
- log.debug("Bandwidth history line does not contain "
+ logger.debug("Bandwidth history line does not contain "
+ "multiples of 15-minute intervals. Ignoring this line.");
continue;
} else if (intervalLength != 900L) {
@@ -336,7 +336,7 @@ public final class RelayDescriptorDatabaseImporter {
values = newValues;
intervalLength = 900L;
} catch (NumberFormatException e) {
- log.debug("Number format exception while parsing "
+ logger.debug("Number format exception while parsing "
+ "bandwidth history line. Ignoring this line.");
continue;
}
@@ -350,15 +350,16 @@ public final class RelayDescriptorDatabaseImporter {
dateStart = dateTimeFormat.parse(parts[1] + " 00:00:00")
.getTime();
} catch (ParseException e) {
- log.debug("Parse exception while parsing timestamp in "
+ logger.debug("Parse exception while parsing timestamp in "
+ "bandwidth history line. Ignoring this line.");
continue;
}
if (Math.abs(published - intervalEnd)
> 7L * 24L * 60L * 60L * 1000L) {
- log.debug("Extra-info descriptor publication time {} and last interval "
- + "time {} in {} line differ by more than 7 days! Not adding this "
- + "line!", dateTimeFormat.format(published), intervalEndTime, type);
+ logger.debug("Extra-info descriptor publication time {} and last "
+ + "interval time {} in {} line differ by more than 7 days! Not "
+ + "adding this line!", dateTimeFormat.format(published),
+ intervalEndTime, type);
continue;
}
long currentIntervalEnd = intervalEnd;
@@ -384,7 +385,7 @@ public final class RelayDescriptorDatabaseImporter {
currentIntervalEnd -= intervalLength * 1000L;
}
} catch (NumberFormatException e) {
- log.debug("Number format exception while parsing "
+ logger.debug("Number format exception while parsing "
+ "bandwidth history line. Ignoring this line.");
continue;
}
@@ -432,7 +433,7 @@ public final class RelayDescriptorDatabaseImporter {
this.csH.executeBatch();
}
} catch (SQLException | ParseException e) {
- log.warn("Could not insert bandwidth "
+ logger.warn("Could not insert bandwidth "
+ "history line into database. We won't make any "
+ "further SQL requests in this execution.", e);
this.importIntoDatabase = false;
@@ -539,8 +540,8 @@ public final class RelayDescriptorDatabaseImporter {
public void commit() {
/* Log stats about imported descriptors. */
- log.info("Finished importing relay descriptors: {} network status entries "
- + "and {} bandwidth history elements", rrsCount, rhsCount);
+ logger.info("Finished importing relay descriptors: {} network status "
+ + "entries and {} bandwidth history elements", rrsCount, rhsCount);
/* Insert scheduled updates a second time, just in case the refresh
* run has started since inserting them the first time in which case
@@ -555,7 +556,7 @@ public final class RelayDescriptorDatabaseImporter {
this.psU.execute();
}
} catch (SQLException e) {
- log.warn("Could not add scheduled dates "
+ logger.warn("Could not add scheduled dates "
+ "for the next refresh run.", e);
}
}
@@ -567,7 +568,7 @@ public final class RelayDescriptorDatabaseImporter {
this.conn.commit();
} catch (SQLException e) {
- log.warn("Could not commit final records to database", e);
+ logger.warn("Could not commit final records to database", e);
}
}
}
@@ -637,7 +638,7 @@ public final class RelayDescriptorDatabaseImporter {
try {
this.conn.close();
} catch (SQLException e) {
- log.warn("Could not close database connection.", e);
+ logger.warn("Could not close database connection.", e);
}
}
}
diff --git a/src/main/java/org/torproject/metrics/stats/clients/Main.java b/src/main/java/org/torproject/metrics/stats/clients/Main.java
index d89a82a..bfa9214 100644
--- a/src/main/java/org/torproject/metrics/stats/clients/Main.java
+++ b/src/main/java/org/torproject/metrics/stats/clients/Main.java
@@ -23,7 +23,7 @@ import java.util.TreeMap;
public class Main {
- private static Logger log = LoggerFactory.getLogger(Main.class);
+ private static final Logger logger = LoggerFactory.getLogger(Main.class);
private static Database database;
@@ -33,36 +33,36 @@ public class Main {
/** Executes this data-processing module. */
public static void main(String[] args) throws Exception {
- log.info("Starting clients module.");
+ logger.info("Starting clients module.");
- log.info("Connecting to database.");
+ logger.info("Connecting to database.");
database = new Database();
- log.info("Reading relay descriptors and importing relevant parts into the "
- + "database.");
+ logger.info("Reading relay descriptors and importing relevant parts into "
+ + "the database.");
parseRelayDescriptors();
- log.info("Reading bridge descriptors and importing relevant parts into the "
- + "database.");
+ logger.info("Reading bridge descriptors and importing relevant parts into "
+ + "the database.");
parseBridgeDescriptors();
- log.info("Processing newly imported data.");
+ logger.info("Processing newly imported data.");
database.processImported();
database.commit();
- log.info("Querying aggregated statistics from the database.");
+ logger.info("Querying aggregated statistics from the database.");
new Writer().write(new File(baseDir, "stats/userstats.csv").toPath(),
database.queryEstimated());
new Writer().write(new File(baseDir, "stats/userstats-combined.csv")
.toPath(), database.queryCombined());
- log.info("Disconnecting from database.");
+ logger.info("Disconnecting from database.");
database.close();
- log.info("Running detector.");
+ logger.info("Running detector.");
new Detector().detect();
- log.info("Terminating clients module.");
+ logger.info("Terminating clients module.");
}
private static final long ONE_HOUR_MILLIS = 60L * 60L * 1000L;
diff --git a/src/main/java/org/torproject/metrics/stats/connbidirect/Main.java b/src/main/java/org/torproject/metrics/stats/connbidirect/Main.java
index 5e71534..2abf202 100644
--- a/src/main/java/org/torproject/metrics/stats/connbidirect/Main.java
+++ b/src/main/java/org/torproject/metrics/stats/connbidirect/Main.java
@@ -34,7 +34,7 @@ import java.util.TreeSet;
public class Main {
- private static Logger log = LoggerFactory.getLogger(Main.class);
+ private static final Logger logger = LoggerFactory.getLogger(Main.class);
static class RawStat implements Comparable<RawStat> {
@@ -79,12 +79,12 @@ public class Main {
return new RawStat(dateDays, fingerprint, fractionRead,
fractionWrite, fractionBoth);
} else {
- log.warn("Could not deserialize raw statistic from string '{}'.",
+ logger.warn("Could not deserialize raw statistic from string '{}'.",
string);
return null;
}
} catch (NumberFormatException e) {
- log.warn("Could not deserialize raw statistic from string '{}'.",
+ logger.warn("Could not deserialize raw statistic from string '{}'.",
string, e);
return null;
}
@@ -144,13 +144,13 @@ public class Main {
SortedMap<String, Long> parseHistory = parseParseHistory(
readStringFromFile(parseHistoryFile));
if (parseHistory == null) {
- log.warn("Could not parse {}. Proceeding without parse history.",
+ logger.warn("Could not parse {}. Proceeding without parse history.",
parseHistoryFile.getAbsolutePath());
}
SortedMap<String, Short> aggregateStats = parseAggregateStats(
readStringFromFile(aggregateStatsFile));
if (aggregateStats == null) {
- log.warn("Could not parse previously aggregated "
+ logger.warn("Could not parse previously aggregated "
+ "statistics. Not proceeding, because we would otherwise "
+ "lose previously aggregated values for which we don't have "
+ "raw statistics anymore.");
@@ -160,7 +160,7 @@ public class Main {
parseHistory = addRawStatsFromDescriptors(newRawStats,
descriptorsDirectories, parseHistory);
if (parseHistory == null) {
- log.warn("Could not parse raw statistics from "
+ logger.warn("Could not parse raw statistics from "
+ "descriptors. Not proceeding, because we would otherwise "
+ "leave out those descriptors in future runs.");
return;
@@ -169,7 +169,7 @@ public class Main {
SortedSet<RawStat> rawStats = parseRawStats(
readStringFromFile(rawStatsFile));
if (rawStats == null) {
- log.warn("Could not parse previously parsed raw "
+ logger.warn("Could not parse previously parsed raw "
+ "statistics. Not proceeding, because we might otherwise "
+ "leave out previously parsed statistics in the aggregates.");
return;
@@ -189,7 +189,7 @@ public class Main {
sb.append("\n ")
.append(dateFormat.format(conflictingDate * ONE_DAY_IN_MILLIS));
}
- log.warn(sb.toString());
+ logger.warn(sb.toString());
return;
}
updateAggregateStats(aggregateStats, rawStats);
@@ -248,19 +248,19 @@ public class Main {
while ((line = lnr.readLine()) != null) {
String[] parts = line.split(",");
if (parts.length < 2) {
- log.warn("Invalid line {} in parse history: '{}'.",
+ logger.warn("Invalid line {} in parse history: '{}'.",
lnr.getLineNumber(), line);
return null;
}
parsedParseHistory.put(parts[0], Long.parseLong(parts[1]));
}
} catch (IOException e) {
- log.warn("Unexpected I/O exception while reading line {} from parse "
+ logger.warn("Unexpected I/O exception while reading line {} from parse "
+ "history.", lnr.getLineNumber(), e);
return null;
} catch (NumberFormatException e) {
- log.warn("Invalid line {} in parse history: '{}'.", lnr.getLineNumber(),
- line, e);
+ logger.warn("Invalid line {} in parse history: '{}'.",
+ lnr.getLineNumber(), line, e);
return null;
}
return parsedParseHistory;
@@ -295,14 +295,14 @@ public class Main {
String line = "";
try {
if (!AGGREGATE_STATS_HEADER.equals(lnr.readLine())) {
- log.warn("First line of aggregate statistics does not "
+ logger.warn("First line of aggregate statistics does not "
+ "contain the header line. Is this the correct file?");
return null;
}
while ((line = lnr.readLine()) != null) {
String[] parts = line.split(",");
if (parts.length != 4) {
- log.warn("Invalid line {} in aggregate statistics: '{}'.",
+ logger.warn("Invalid line {} in aggregate statistics: '{}'.",
lnr.getLineNumber(), line);
return null;
}
@@ -310,11 +310,11 @@ public class Main {
+ parts[2], Short.parseShort(parts[3]));
}
} catch (IOException e) {
- log.warn("Unexpected I/O exception while reading line {} from aggregate "
- + "statistics.", lnr.getLineNumber(), e);
+ logger.warn("Unexpected I/O exception while reading line {} from "
+ + "aggregate statistics.", lnr.getLineNumber(), e);
return null;
} catch (NumberFormatException e) {
- log.warn("Invalid line {} in aggregate statistics: '{}'.",
+ logger.warn("Invalid line {} in aggregate statistics: '{}'.",
lnr.getLineNumber(), line, e);
return null;
}
@@ -341,19 +341,19 @@ public class Main {
while ((line = lnr.readLine()) != null) {
RawStat rawStat = RawStat.fromString(line);
if (rawStat == null) {
- log.warn("Invalid line {} in raw statistics: '{}'.",
+ logger.warn("Invalid line {} in raw statistics: '{}'.",
lnr.getLineNumber(), line);
return null;
}
parsedRawStats.add(rawStat);
}
} catch (IOException e) {
- log.warn("Unexpected I/O exception while reading line {} from raw "
+ logger.warn("Unexpected I/O exception while reading line {} from raw "
+ "statistics.", lnr.getLineNumber(), e);
return null;
} catch (NumberFormatException e) {
- log.warn("Invalid line {} in raw statistics: '{}'.", lnr.getLineNumber(),
- line, e);
+ logger.warn("Invalid line {} in raw statistics: '{}'.",
+ lnr.getLineNumber(), line, e);
return null;
}
return parsedRawStats;
@@ -392,7 +392,7 @@ public class Main {
int write = extraInfo.getConnBiDirectWrite();
int both = extraInfo.getConnBiDirectBoth();
if (below < 0 || read < 0 || write < 0 || both < 0) {
- log.debug("Could not parse incomplete conn-bi-direct statistics. "
+ logger.debug("Could not parse incomplete conn-bi-direct statistics. "
+ "Skipping descriptor.");
return null;
}
diff --git a/src/main/java/org/torproject/metrics/stats/hidserv/Aggregator.java b/src/main/java/org/torproject/metrics/stats/hidserv/Aggregator.java
index cb52598..8ca00a0 100644
--- a/src/main/java/org/torproject/metrics/stats/hidserv/Aggregator.java
+++ b/src/main/java/org/torproject/metrics/stats/hidserv/Aggregator.java
@@ -24,7 +24,8 @@ import java.util.TreeMap;
* statistics and the total network fraction of reporting relays. */
public class Aggregator {
- private static Logger log = LoggerFactory.getLogger(Aggregator.class);
+ private static final Logger logger
+ = LoggerFactory.getLogger(Aggregator.class);
/** Document file containing extrapolated hidden-service statistics. */
private File extrapolatedHidServStatsFile;
@@ -62,7 +63,7 @@ public class Aggregator {
this.extrapolatedHidServStatsStore.retrieve(
this.extrapolatedHidServStatsFile);
if (extrapolatedStats == null) {
- log.warn("Unable to retrieve extrapolated hidden-service "
+ logger.warn("Unable to retrieve extrapolated hidden-service "
+ "statistics from file {}. Skipping aggregation step.",
this.extrapolatedHidServStatsFile.getAbsolutePath());
return;
@@ -188,7 +189,7 @@ public class Aggregator {
this.hidservStatsCsvFile))) {
bw.write(sb.toString());
} catch (IOException e) {
- log.warn("Unable to write results to {}. Ignoring.",
+ logger.warn("Unable to write results to {}. Ignoring.",
this.extrapolatedHidServStatsFile.getAbsolutePath());
}
}
diff --git a/src/main/java/org/torproject/metrics/stats/hidserv/ComputedNetworkFractions.java b/src/main/java/org/torproject/metrics/stats/hidserv/ComputedNetworkFractions.java
index 3f3f12d..d110cbd 100644
--- a/src/main/java/org/torproject/metrics/stats/hidserv/ComputedNetworkFractions.java
+++ b/src/main/java/org/torproject/metrics/stats/hidserv/ComputedNetworkFractions.java
@@ -15,7 +15,7 @@ import java.util.Map;
* status entries and bandwidth weights in a network status consensus. */
public class ComputedNetworkFractions implements Document {
- private static Logger log
+ private static final Logger logger
= LoggerFactory.getLogger(ComputedNetworkFractions.class);
/** Relay fingerprint consisting of 40 upper-case hex characters. */
@@ -137,18 +137,18 @@ public class ComputedNetworkFractions implements Document {
@Override
public boolean parse(String[] formattedStrings) {
if (formattedStrings.length != 2) {
- log.warn("Invalid number of formatted strings. Skipping.");
+ logger.warn("Invalid number of formatted strings. Skipping.");
return false;
}
String[] firstParts = formattedStrings[0].split(",", 2);
if (firstParts.length != 2) {
- log.warn("Invalid number of comma-separated values. Skipping.");
+ logger.warn("Invalid number of comma-separated values. Skipping.");
return false;
}
String fingerprint = firstParts[0];
String[] secondParts = formattedStrings[1].split(",", 3);
if (secondParts.length != 3) {
- log.warn("Invalid number of comma-separated values. Skipping.");
+ logger.warn("Invalid number of comma-separated values. Skipping.");
return false;
}
String validAfterDate = firstParts[1];
@@ -166,7 +166,7 @@ public class ComputedNetworkFractions implements Document {
if (validAfterDateMillis == DateTimeHelper.NO_TIME_AVAILABLE
|| validAfterTimeMillis < 0L
|| validAfterTimeMillis >= DateTimeHelper.ONE_DAY) {
- log.warn("Invalid date/hour format. Skipping.");
+ logger.warn("Invalid date/hour format. Skipping.");
return false;
}
long validAfterMillis = validAfterDateMillis + validAfterTimeMillis;
@@ -179,7 +179,7 @@ public class ComputedNetworkFractions implements Document {
? 0.0 : Double.parseDouble(secondParts[2]);
return true;
} catch (NumberFormatException e) {
- log.warn("Invalid number format. Skipping.");
+ logger.warn("Invalid number format. Skipping.");
return false;
}
}
diff --git a/src/main/java/org/torproject/metrics/stats/hidserv/DocumentStore.java b/src/main/java/org/torproject/metrics/stats/hidserv/DocumentStore.java
index 1cfcf08..fe223c7 100644
--- a/src/main/java/org/torproject/metrics/stats/hidserv/DocumentStore.java
+++ b/src/main/java/org/torproject/metrics/stats/hidserv/DocumentStore.java
@@ -26,7 +26,8 @@ import java.util.TreeSet;
* interface to a file and later to retrieve them. */
public class DocumentStore<T extends Document> {
- private static Logger log = LoggerFactory.getLogger(DocumentStore.class);
+ private static final Logger logger
+ = LoggerFactory.getLogger(DocumentStore.class);
/** Document class, needed to create new instances when retrieving
* documents. */
@@ -47,7 +48,7 @@ public class DocumentStore<T extends Document> {
/* Retrieve existing documents. */
Set<T> retrievedDocuments = this.retrieve(documentFile);
if (retrievedDocuments == null) {
- log.warn("Unable to read and update {}. Not storing documents.",
+ logger.warn("Unable to read and update {}. Not storing documents.",
documentFile.getAbsoluteFile());
return false;
}
@@ -68,7 +69,7 @@ public class DocumentStore<T extends Document> {
File documentTempFile = new File(documentFile.getAbsoluteFile()
+ ".tmp");
if (documentTempFile.exists()) {
- log.warn("Temporary document file {} still exists, "
+ logger.warn("Temporary document file {} still exists, "
+ "indicating that a previous execution did not terminate "
+ "cleanly. Not storing documents.",
documentTempFile.getAbsoluteFile());
@@ -90,7 +91,7 @@ public class DocumentStore<T extends Document> {
documentFile.delete();
documentTempFile.renameTo(documentFile);
} catch (IOException e) {
- log.warn("Unable to write {}. Not storing documents.",
+ logger.warn("Unable to write {}. Not storing documents.",
documentFile.getAbsolutePath(), e);
return false;
}
@@ -125,7 +126,7 @@ public class DocumentStore<T extends Document> {
if (!line.startsWith(" ")) {
formattedString0 = line;
} else if (formattedString0 == null) {
- log.warn("First line in {} must not start with a space. Not "
+ logger.warn("First line in {} must not start with a space. Not "
+ "retrieving any previously stored documents.",
documentFile.getAbsolutePath());
return null;
@@ -140,7 +141,7 @@ public class DocumentStore<T extends Document> {
T document = this.clazz.getDeclaredConstructor().newInstance();
if (!document.parse(new String[] { formattedString0,
line.substring(1) })) {
- log.warn("Unable to read line {} from {}. Not retrieving any "
+ logger.warn("Unable to read line {} from {}. Not retrieving any "
+ "previously stored documents.", lnr.getLineNumber(),
documentFile.getAbsolutePath());
return null;
@@ -149,12 +150,12 @@ public class DocumentStore<T extends Document> {
}
}
} catch (IOException e) {
- log.warn("Unable to read {}. Not retrieving any previously stored "
+ logger.warn("Unable to read {}. Not retrieving any previously stored "
+ "documents.", documentFile.getAbsolutePath(), e);
return null;
} catch (InstantiationException | IllegalAccessException
| NoSuchMethodException | InvocationTargetException e) {
- log.warn("Unable to read {}. Cannot instantiate document object.",
+ logger.warn("Unable to read {}. Cannot instantiate document object.",
documentFile.getAbsolutePath(), e);
return null;
}
diff --git a/src/main/java/org/torproject/metrics/stats/hidserv/ExtrapolatedHidServStats.java b/src/main/java/org/torproject/metrics/stats/hidserv/ExtrapolatedHidServStats.java
index 71048f3..c46ee2f 100644
--- a/src/main/java/org/torproject/metrics/stats/hidserv/ExtrapolatedHidServStats.java
+++ b/src/main/java/org/torproject/metrics/stats/hidserv/ExtrapolatedHidServStats.java
@@ -11,7 +11,7 @@ import org.slf4j.LoggerFactory;
* computed network fractions in the statistics interval. */
public class ExtrapolatedHidServStats implements Document {
- private static Logger log
+ private static final Logger logger
= LoggerFactory.getLogger(ExtrapolatedHidServStats.class);
/** Date of statistics interval end in milliseconds. */
@@ -136,7 +136,7 @@ public class ExtrapolatedHidServStats implements Document {
@Override
public boolean parse(String[] formattedStrings) {
if (formattedStrings.length != 2) {
- log.warn("Invalid number of formatted strings: {}. Skipping.",
+ logger.warn("Invalid number of formatted strings: {}. Skipping.",
formattedStrings.length);
return false;
}
@@ -144,7 +144,7 @@ public class ExtrapolatedHidServStats implements Document {
DateTimeHelper.ISO_DATE_FORMAT);
String[] secondParts = formattedStrings[1].split(",", 5);
if (secondParts.length != 5) {
- log.warn("Invalid number of comma-separated values: {}. Skipping.",
+ logger.warn("Invalid number of comma-separated values: {}. Skipping.",
secondParts.length);
return false;
}
diff --git a/src/main/java/org/torproject/metrics/stats/hidserv/Extrapolator.java b/src/main/java/org/torproject/metrics/stats/hidserv/Extrapolator.java
index 6bb47b8..a1f5028 100644
--- a/src/main/java/org/torproject/metrics/stats/hidserv/Extrapolator.java
+++ b/src/main/java/org/torproject/metrics/stats/hidserv/Extrapolator.java
@@ -20,7 +20,8 @@ import java.util.TreeSet;
* observed by the relay. */
public class Extrapolator {
- private static Logger log = LoggerFactory.getLogger(Extrapolator.class);
+ private static final Logger logger
+ = LoggerFactory.getLogger(Extrapolator.class);
/** Document file containing previously parsed reported hidden-service
* statistics. */
@@ -89,7 +90,7 @@ public class Extrapolator {
/* Make sure that all documents could be retrieved correctly. */
if (extrapolatedStats == null || reportedStats == null) {
- log.warn("Could not read previously parsed or extrapolated "
+ logger.warn("Could not read previously parsed or extrapolated "
+ "hidserv-stats. Skipping.");
return false;
}
diff --git a/src/main/java/org/torproject/metrics/stats/hidserv/Main.java b/src/main/java/org/torproject/metrics/stats/hidserv/Main.java
index 1711dbb..ba65f8e 100644
--- a/src/main/java/org/torproject/metrics/stats/hidserv/Main.java
+++ b/src/main/java/org/torproject/metrics/stats/hidserv/Main.java
@@ -14,7 +14,7 @@ import java.io.File;
* do not overlap. */
public class Main {
- private static Logger log = LoggerFactory.getLogger(Main.class);
+ private static final Logger logger = LoggerFactory.getLogger(Main.class);
private static final File baseDir = new File(
org.torproject.metrics.stats.main.Main.modulesDir, "hidserv");
@@ -34,7 +34,7 @@ public class Main {
/* Initialize parser and read parse history to avoid parsing
* descriptor files that haven't changed since the last execution. */
- log.info("Initializing parser and reading parse history...");
+ logger.info("Initializing parser and reading parse history...");
DocumentStore<ReportedHidServStats> reportedHidServStatsStore =
new DocumentStore<>(ReportedHidServStats.class);
DocumentStore<ComputedNetworkFractions>
@@ -46,28 +46,28 @@ public class Main {
/* Parse new descriptors and store their contents using the document
* stores. */
- log.info("Parsing descriptors...");
+ logger.info("Parsing descriptors...");
parser.parseDescriptors();
/* Write the parse history to avoid parsing descriptor files again
* next time. It's okay to do this now and not at the end of the
* execution, because even if something breaks apart below, it's safe
* not to parse descriptor files again. */
- log.info("Writing parse history...");
+ logger.info("Writing parse history...");
parser.writeParseHistory();
/* Extrapolate reported statistics using computed network fractions
* and write the result to disk using a document store. The result is
* a single file with extrapolated network totals based on reports by
* single relays. */
- log.info("Extrapolating statistics...");
+ logger.info("Extrapolating statistics...");
DocumentStore<ExtrapolatedHidServStats> extrapolatedHidServStatsStore
= new DocumentStore<>(ExtrapolatedHidServStats.class);
Extrapolator extrapolator = new Extrapolator(statusDirectory,
reportedHidServStatsStore, computedNetworkFractionsStore,
extrapolatedHidServStatsStore);
if (!extrapolator.extrapolateHidServStats()) {
- log.warn("Could not extrapolate statistics. Terminating.");
+ logger.warn("Could not extrapolate statistics. Terminating.");
return;
}
@@ -75,7 +75,7 @@ public class Main {
* This includes calculating daily weighted interquartile means, among
* other statistics. Write the result to a .csv file that can be
* processed by other tools. */
- log.info("Aggregating statistics...");
+ logger.info("Aggregating statistics...");
File hidservStatsExtrapolatedCsvFile = new File(baseDir,
"stats/hidserv.csv");
Aggregator aggregator = new Aggregator(statusDirectory,
@@ -83,7 +83,7 @@ public class Main {
aggregator.aggregateHidServStats();
/* End this execution. */
- log.info("Terminating.");
+ logger.info("Terminating.");
}
}
diff --git a/src/main/java/org/torproject/metrics/stats/hidserv/Parser.java b/src/main/java/org/torproject/metrics/stats/hidserv/Parser.java
index 46a6607..d1d2328 100644
--- a/src/main/java/org/torproject/metrics/stats/hidserv/Parser.java
+++ b/src/main/java/org/torproject/metrics/stats/hidserv/Parser.java
@@ -35,7 +35,7 @@ import java.util.TreeSet;
* document files for later use. */
public class Parser {
- private static Logger log = LoggerFactory.getLogger(Parser.class);
+ private static final Logger logger = LoggerFactory.getLogger(Parser.class);
/** File containing tuples of last-modified times and file names of
* descriptor files parsed in the previous execution. */
@@ -111,12 +111,12 @@ public class Parser {
String[] parts = line.split(" ", 2);
excludedFiles.put(parts[1], Long.parseLong(parts[0]));
} catch (NumberFormatException e) {
- log.warn("Illegal line '{}' in parse history. Skipping line.", line,
- e);
+ logger.warn("Illegal line '{}' in parse history. Skipping line.",
+ line, e);
}
}
} catch (IOException e) {
- log.warn("Could not read history file '{}'. Not "
+ logger.warn("Could not read history file '{}'. Not "
+ "excluding descriptors in this execution.",
this.parseHistoryFile.getAbsolutePath(), e);
}
@@ -151,8 +151,9 @@ public class Parser {
+ "\n");
}
} catch (IOException e) {
- log.warn("Could not write history file '{}'. Not excluding descriptors "
- + "in next execution.", this.parseHistoryFile.getAbsolutePath(), e);
+ logger.warn("Could not write history file '{}'. Not excluding "
+ + "descriptors in next execution.",
+ this.parseHistoryFile.getAbsolutePath(), e);
}
}
@@ -231,7 +232,7 @@ public class Parser {
} else if (extraInfoDescriptor.getHidservStatsEndMillis() >= 0L
|| extraInfoDescriptor.getHidservRendRelayedCells() != null
|| extraInfoDescriptor.getHidservDirOnionsSeen() != null) {
- log.warn("Relay {} published incomplete hidserv-stats. Ignoring.",
+ logger.warn("Relay {} published incomplete hidserv-stats. Ignoring.",
fingerprint);
}
}
@@ -252,7 +253,7 @@ public class Parser {
SortedMap<String, Integer> bandwidthWeights =
consensus.getBandwidthWeights();
if (bandwidthWeights == null) {
- log.warn("Consensus with valid-after time {} doesn't contain any Wxx "
+ logger.warn("Consensus with valid-after time {} doesn't contain any Wxx "
+ "weights. Skipping.",
DateTimeHelper.format(consensus.getValidAfterMillis()));
return;
@@ -264,7 +265,7 @@ public class Parser {
new TreeSet<>(Arrays.asList("Wmg,Wmm,Wme,Wmd".split(",")));
expectedWeightKeys.removeAll(bandwidthWeights.keySet());
if (!expectedWeightKeys.isEmpty()) {
- log.warn("Consensus with valid-after time {} doesn't contain expected "
+ logger.warn("Consensus with valid-after time {} doesn't contain expected "
+ "Wmx weights. Skipping.",
DateTimeHelper.format(consensus.getValidAfterMillis()));
return;
diff --git a/src/main/java/org/torproject/metrics/stats/hidserv/ReportedHidServStats.java b/src/main/java/org/torproject/metrics/stats/hidserv/ReportedHidServStats.java
index 5b79a65..95942af 100644
--- a/src/main/java/org/torproject/metrics/stats/hidserv/ReportedHidServStats.java
+++ b/src/main/java/org/torproject/metrics/stats/hidserv/ReportedHidServStats.java
@@ -11,7 +11,7 @@ import org.slf4j.LoggerFactory;
* by the relay in the "hidserv-" lines of its extra-info descriptor. */
public class ReportedHidServStats implements Document {
- private static Logger log
+ private static final Logger logger
= LoggerFactory.getLogger(ReportedHidServStats.class);
/* Relay fingerprint consisting of 40 upper-case hex characters. */
@@ -115,7 +115,7 @@ public class ReportedHidServStats implements Document {
@Override
public boolean parse(String[] formattedStrings) {
if (formattedStrings.length != 2) {
- log.warn("Invalid number of formatted strings: {} Skipping.",
+ logger.warn("Invalid number of formatted strings: {} Skipping.",
formattedStrings.length);
return false;
}
diff --git a/src/main/java/org/torproject/metrics/stats/hidserv/Simulate.java b/src/main/java/org/torproject/metrics/stats/hidserv/Simulate.java
index 696fc1d..21f20a9 100644
--- a/src/main/java/org/torproject/metrics/stats/hidserv/Simulate.java
+++ b/src/main/java/org/torproject/metrics/stats/hidserv/Simulate.java
@@ -24,7 +24,7 @@ import java.util.TreeSet;
* contains its own main method.) */
public class Simulate {
- private static Logger log = LoggerFactory.getLogger(Simulate.class);
+ private static final Logger logger = LoggerFactory.getLogger(Simulate.class);
private static File simCellsCsvFile =
new File("out/csv/sim-cells.csv");
@@ -34,11 +34,11 @@ public class Simulate {
/** Runs two simulations to evaluate this data-processing module. */
public static void main(String[] args) throws Exception {
- log.info("Simulating extrapolation of rendezvous cells");
+ logger.info("Simulating extrapolation of rendezvous cells");
simulateManyCells();
- log.info("Simulating extrapolation of .onions");
+ logger.info("Simulating extrapolation of .onions");
simulateManyOnions();
- log.info("Terminating.");
+ logger.info("Terminating.");
}
private static Random rnd = new Random();
@@ -51,7 +51,7 @@ public class Simulate {
final int numberOfExtrapolations = 1000;
for (int i = 0; i < numberOfExtrapolations; i++) {
bw.write(simulateCells(i));
- log.info(".");
+ logger.info(".");
}
bw.close();
}
@@ -64,7 +64,7 @@ public class Simulate {
final int numberOfExtrapolations = 1000;
for (int i = 0; i < numberOfExtrapolations; i++) {
bw.write(simulateOnions(i));
- log.info(".");
+ logger.info(".");
}
bw.close();
}
diff --git a/src/main/java/org/torproject/metrics/stats/main/Main.java b/src/main/java/org/torproject/metrics/stats/main/Main.java
index 6badd96..41cba3a 100644
--- a/src/main/java/org/torproject/metrics/stats/main/Main.java
+++ b/src/main/java/org/torproject/metrics/stats/main/Main.java
@@ -15,7 +15,7 @@ import java.util.TimeZone;
public class Main {
- private static final Logger log = LoggerFactory.getLogger(Main.class);
+ private static final Logger logger = LoggerFactory.getLogger(Main.class);
private static final String baseDir = System.getProperty("metrics.basedir",
"/srv/metrics.torproject.org/metrics");
@@ -29,7 +29,7 @@ public class Main {
/** Start the metrics update run. */
public static void main(String[] args) {
- log.info("Starting metrics update run.");
+ logger.info("Starting metrics update run.");
Locale.setDefault(Locale.US);
TimeZone.setDefault(TimeZone.getTimeZone("UTC"));
@@ -40,11 +40,11 @@ public class Main {
continue;
}
if (outputDir.mkdirs()) {
- log.info("Successfully created module base directory {} and any "
+ logger.info("Successfully created module base directory {} and any "
+ "nonexistent parent directories.",
outputDir.getAbsolutePath());
} else {
- log.error("Unable to create module base directory {} and any "
+ logger.error("Unable to create module base directory {} and any "
+ "nonexistent parent directories. Exiting.",
outputDir.getAbsolutePath());
return;
@@ -67,19 +67,19 @@ public class Main {
for (Class<?> module : modules) {
try {
- log.info("Starting {} module.", module.getName());
+ logger.info("Starting {} module.", module.getName());
module.getDeclaredMethod("main", String[].class)
.invoke(null, (Object) args);
- log.info("Completed {} module.", module.getName());
+ logger.info("Completed {} module.", module.getName());
} catch (NoSuchMethodException | IllegalAccessException
| InvocationTargetException e) {
- log.warn("Caught an exception when invoking the main method of the {} "
- + "module. Moving on to the next module, if available.",
+ logger.warn("Caught an exception when invoking the main method of the "
+ + "{} module. Moving on to the next module, if available.",
module.getName(), e);
}
}
- log.info("Making module data available.");
+ logger.info("Making module data available.");
File[] moduleStatsDirs = new File[] {
new File(modulesDir, "connbidirect/stats"),
new File(modulesDir, "onionperf/stats"),
@@ -96,13 +96,15 @@ public class Main {
List<String> copiedFiles = new ArrayList<>();
for (File moduleStatsDir : moduleStatsDirs) {
if (!moduleStatsDir.exists()) {
- log.warn("Skipping nonexistent module stats dir {}.", moduleStatsDir);
+ logger.warn("Skipping nonexistent module stats dir {}.",
+ moduleStatsDir);
continue;
}
File[] moduleStatsFiles = moduleStatsDir.isDirectory()
? moduleStatsDir.listFiles() : new File[] { moduleStatsDir };
if (null == moduleStatsFiles) {
- log.warn("Skipping nonexistent module stats dir {}.", moduleStatsDir);
+ logger.warn("Skipping nonexistent module stats dir {}.",
+ moduleStatsDir);
continue;
}
for (File statsFile : moduleStatsFiles) {
@@ -115,16 +117,16 @@ public class Main {
StandardCopyOption.REPLACE_EXISTING);
copiedFiles.add(statsFile.getName());
} catch (IOException e) {
- log.warn("Unable to copy module stats file {} to stats output "
+ logger.warn("Unable to copy module stats file {} to stats output "
+ "directory {}. Skipping.", statsFile, statsDir, e);
}
}
}
if (!copiedFiles.isEmpty()) {
- log.info("Successfully copied {} files to stats output directory: {}",
+ logger.info("Successfully copied {} files to stats output directory: {}",
copiedFiles.size(), copiedFiles);
}
- log.info("Completed metrics update run.");
+ logger.info("Completed metrics update run.");
}
}
diff --git a/src/main/java/org/torproject/metrics/stats/onionperf/Main.java b/src/main/java/org/torproject/metrics/stats/onionperf/Main.java
index 4325c01..e1b063c 100644
--- a/src/main/java/org/torproject/metrics/stats/onionperf/Main.java
+++ b/src/main/java/org/torproject/metrics/stats/onionperf/Main.java
@@ -31,7 +31,7 @@ import java.util.Set;
public class Main {
/** Logger for this class. */
- private static Logger log = LoggerFactory.getLogger(Main.class);
+ private static final Logger logger = LoggerFactory.getLogger(Main.class);
private static final String jdbcString = String.format(
"jdbc:postgresql://localhost/onionperf?user=%s&password=%s",
@@ -43,7 +43,7 @@ public class Main {
/** Executes this data-processing module. */
public static void main(String[] args) throws Exception {
- log.info("Starting onionperf module.");
+ logger.info("Starting onionperf module.");
Connection connection = connectToDatabase();
importOnionPerfFiles(connection);
writeStatistics(new File(baseDir, "stats/torperf-1.1.csv").toPath(),
@@ -56,15 +56,15 @@ public class Main {
new File(baseDir, "stats/onionperf-throughput.csv").toPath(),
queryThroughput(connection));
disconnectFromDatabase(connection);
- log.info("Terminated onionperf module.");
+ logger.info("Terminated onionperf module.");
}
private static Connection connectToDatabase()
throws SQLException {
- log.info("Connecting to database.");
+ logger.info("Connecting to database.");
Connection connection = DriverManager.getConnection(jdbcString);
connection.setAutoCommit(false);
- log.info("Successfully connected to database.");
+ logger.info("Successfully connected to database.");
return connection;
}
@@ -240,7 +240,7 @@ public class Main {
static List<String> queryOnionPerf(Connection connection)
throws SQLException {
- log.info("Querying statistics from database.");
+ logger.info("Querying statistics from database.");
List<String> statistics = new ArrayList<>();
statistics
.add("date,filesize,source,server,q1,md,q3,timeouts,failures,requests");
@@ -268,7 +268,7 @@ public class Main {
static List<String> queryBuildTimes(Connection connection)
throws SQLException {
- log.info("Querying buildtime statistics from database.");
+ logger.info("Querying buildtime statistics from database.");
List<String> statistics = new ArrayList<>();
statistics.add("date,source,position,q1,md,q3");
Statement st = connection.createStatement();
@@ -291,7 +291,7 @@ public class Main {
static List<String> queryLatencies(Connection connection)
throws SQLException {
- log.info("Querying latency statistics from database.");
+ logger.info("Querying latency statistics from database.");
List<String> statistics = new ArrayList<>();
statistics.add("date,source,server,low,q1,md,q3,high");
Statement st = connection.createStatement();
@@ -316,7 +316,7 @@ public class Main {
static List<String> queryThroughput(Connection connection)
throws SQLException {
- log.info("Querying throughput statistics from database.");
+ logger.info("Querying throughput statistics from database.");
List<String> statistics = new ArrayList<>();
statistics.add("date,source,server,low,q1,md,q3,high");
Statement st = connection.createStatement();
@@ -361,14 +361,14 @@ public class Main {
static void writeStatistics(Path webstatsPath, List<String> statistics)
throws IOException {
webstatsPath.toFile().getParentFile().mkdirs();
- log.info("Writing {} lines to {}.", statistics.size(),
+ logger.info("Writing {} lines to {}.", statistics.size(),
webstatsPath.toFile().getAbsolutePath());
Files.write(webstatsPath, statistics, StandardCharsets.UTF_8);
}
private static void disconnectFromDatabase(Connection connection)
throws SQLException {
- log.info("Disconnecting from database.");
+ logger.info("Disconnecting from database.");
connection.close();
}
}
diff --git a/src/main/java/org/torproject/metrics/stats/servers/Main.java b/src/main/java/org/torproject/metrics/stats/servers/Main.java
index 1fc853f..3258189 100644
--- a/src/main/java/org/torproject/metrics/stats/servers/Main.java
+++ b/src/main/java/org/torproject/metrics/stats/servers/Main.java
@@ -22,7 +22,7 @@ import java.util.Arrays;
* statistics to CSV files. */
public class Main {
- private static Logger log = LoggerFactory.getLogger(Main.class);
+ private static final Logger logger = LoggerFactory.getLogger(Main.class);
private static final File baseDir = new File(
org.torproject.metrics.stats.main.Main.modulesDir, "servers");
@@ -40,9 +40,9 @@ public class Main {
/** Run the module. */
public static void main(String[] args) throws Exception {
- log.info("Starting servers module.");
+ logger.info("Starting servers module.");
- log.info("Reading descriptors and inserting relevant parts into the "
+ logger.info("Reading descriptors and inserting relevant parts into the "
+ "database.");
DescriptorReader reader = DescriptorSourceFactory.createDescriptorReader();
File historyFile = new File(baseDir, "status/read-descriptors");
@@ -64,30 +64,30 @@ public class Main {
database.insertStatus(parser.parseBridgeNetworkStatus(
(BridgeNetworkStatus) descriptor));
} else if (null != descriptor.getRawDescriptorBytes()) {
- log.debug("Skipping unknown descriptor of type {} starting with "
+ logger.debug("Skipping unknown descriptor of type {} starting with "
+ "'{}'.", descriptor.getClass(),
new String(descriptor.getRawDescriptorBytes(), 0,
Math.min(descriptor.getRawDescriptorLength(), 100)));
} else {
- log.debug("Skipping unknown, empty descriptor of type {}.",
+ logger.debug("Skipping unknown, empty descriptor of type {}.",
descriptor.getClass());
}
}
- log.info("Aggregating database entries.");
+ logger.info("Aggregating database entries.");
database.aggregate();
- log.info("Committing all updated parts in the database.");
+ logger.info("Committing all updated parts in the database.");
database.commit();
} catch (SQLException sqle) {
- log.error("Cannot recover from SQL exception while inserting or "
+ logger.error("Cannot recover from SQL exception while inserting or "
+ "aggregating data. Rolling back and exiting.", sqle);
database.rollback();
return;
}
reader.saveHistoryFile(historyFile);
- log.info("Querying aggregated statistics from the database.");
+ logger.info("Querying aggregated statistics from the database.");
File outputDir = new File(baseDir, "stats");
new Writer().write(new File(outputDir, "ipv6servers.csv").toPath(),
database.queryServersIpv6());
@@ -102,10 +102,10 @@ public class Main {
new Writer().write(new File(outputDir, "platforms.csv").toPath(),
database.queryPlatforms());
- log.info("Terminating servers module.");
+ logger.info("Terminating servers module.");
} catch (SQLException sqle) {
- log.error("Cannot recover from SQL exception while querying. Not writing "
- + "output file.", sqle);
+ logger.error("Cannot recover from SQL exception while querying. Not "
+ + "writing output file.", sqle);
}
}
}
diff --git a/src/main/java/org/torproject/metrics/stats/totalcw/Main.java b/src/main/java/org/torproject/metrics/stats/totalcw/Main.java
index 3be41f9..c19defd 100644
--- a/src/main/java/org/torproject/metrics/stats/totalcw/Main.java
+++ b/src/main/java/org/torproject/metrics/stats/totalcw/Main.java
@@ -21,7 +21,7 @@ import java.util.Arrays;
* CSV file. */
public class Main {
- private static Logger log = LoggerFactory.getLogger(Main.class);
+ private static final Logger logger = LoggerFactory.getLogger(Main.class);
private static final File baseDir = new File(
org.torproject.metrics.stats.main.Main.modulesDir, "totalcw");
@@ -35,10 +35,10 @@ public class Main {
/** Run the module. */
public static void main(String[] args) throws Exception {
- log.info("Starting totalcw module.");
+ logger.info("Starting totalcw module.");
- log.info("Reading consensuses and votes and inserting relevant parts into "
- + "the database.");
+ logger.info("Reading consensuses and votes and inserting relevant parts "
+ + "into the database.");
DescriptorReader reader = DescriptorSourceFactory.createDescriptorReader();
File historyFile = new File(baseDir, "status/read-descriptors");
reader.setHistoryFile(historyFile);
@@ -56,33 +56,33 @@ public class Main {
database.insertVote(parser.parseRelayNetworkStatusVote(
(RelayNetworkStatusVote) descriptor));
} else {
- log.debug("Skipping unknown descriptor of type {}.",
+ logger.debug("Skipping unknown descriptor of type {}.",
descriptor.getClass());
}
}
- log.info("Committing all updated parts in the database.");
+ logger.info("Committing all updated parts in the database.");
database.commit();
} catch (SQLException sqle) {
- log.error("Cannot recover from SQL exception while inserting data. "
+ logger.error("Cannot recover from SQL exception while inserting data. "
+ "Rolling back and exiting.", sqle);
database.rollback();
return;
}
reader.saveHistoryFile(historyFile);
- log.info("Querying aggregated statistics from the database.");
+ logger.info("Querying aggregated statistics from the database.");
Iterable<OutputLine> output = database.queryTotalcw();
File outputFile = new File(baseDir, "stats/totalcw.csv");
- log.info("Writing aggregated statistics to {}.", outputFile);
+ logger.info("Writing aggregated statistics to {}.", outputFile);
if (null != output) {
new Writer().write(outputFile.toPath(), output);
}
- log.info("Terminating totalcw module.");
+ logger.info("Terminating totalcw module.");
} catch (SQLException sqle) {
- log.error("Cannot recover from SQL exception while querying. Not writing "
- + "output file.", sqle);
+ logger.error("Cannot recover from SQL exception while querying. Not "
+ + "writing output file.", sqle);
}
}
}
diff --git a/src/main/java/org/torproject/metrics/stats/webstats/Main.java b/src/main/java/org/torproject/metrics/stats/webstats/Main.java
index bca86c5..7ce099e 100644
--- a/src/main/java/org/torproject/metrics/stats/webstats/Main.java
+++ b/src/main/java/org/torproject/metrics/stats/webstats/Main.java
@@ -42,7 +42,7 @@ import java.util.TreeSet;
public class Main {
/** Logger for this class. */
- private static Logger log = LoggerFactory.getLogger(Main.class);
+ private static final Logger logger = LoggerFactory.getLogger(Main.class);
private static final String jdbcString = String.format(
"jdbc:postgresql://localhost/webstats?user=%s&password=%s",
@@ -72,7 +72,7 @@ public class Main {
/** Executes this data-processing module. */
public static void main(String[] args) throws Exception {
- log.info("Starting webstats module.");
+ logger.info("Starting webstats module.");
Connection connection = connectToDatabase();
SortedSet<String> skipFiles = queryImportedFileNames(connection);
importLogFiles(connection, skipFiles,
@@ -84,21 +84,21 @@ public class Main {
writeStatistics(new File(baseDir, "stats/webstats.csv").toPath(),
statistics);
disconnectFromDatabase(connection);
- log.info("Terminated webstats module.");
+ logger.info("Terminated webstats module.");
}
private static Connection connectToDatabase()
throws SQLException {
- log.info("Connecting to database.");
+ logger.info("Connecting to database.");
Connection connection = DriverManager.getConnection(jdbcString);
connection.setAutoCommit(false);
- log.info("Successfully connected to database.");
+ logger.info("Successfully connected to database.");
return connection;
}
static SortedSet<String> queryImportedFileNames(Connection connection)
throws SQLException {
- log.info("Querying previously imported log files.");
+ logger.info("Querying previously imported log files.");
SortedSet<String> importedLogFileUrls = new TreeSet<>();
Statement st = connection.createStatement();
String queryString = "SELECT server, site, log_date FROM files";
@@ -110,7 +110,7 @@ public class Main {
rs.getDate(3).toLocalDate().format(dateFormat)));
}
}
- log.info("Found {} previously imported log files.",
+ logger.info("Found {} previously imported log files.",
importedLogFileUrls.size());
return importedLogFileUrls;
}
@@ -142,11 +142,11 @@ public class Main {
logFile.getPhysicalHost(), logFile.getVirtualHost(),
logFile.getLogDate(), parsedLogLines);
} catch (DescriptorParseException exc) {
- log.warn("Cannot parse log file with file name {}. Retrying in the "
+ logger.warn("Cannot parse log file with file name {}. Retrying in the "
+ "next run.", logFile.getDescriptorFile().getName(), exc);
} catch (SQLException exc) {
- log.warn("Cannot import log file with file name {} into the database. "
- + "Rolling back and retrying in the next run.",
+ logger.warn("Cannot import log file with file name {} into the "
+ + "database. Rolling back and retrying in the next run.",
logFile.getDescriptorFile().getName(), exc);
try {
connection.rollback();
@@ -173,7 +173,7 @@ public class Main {
+ COUNT + ") VALUES (?, CAST(? AS method), ?, ?, ?)");
int fileId = insertFile(psFiles, urlString, server, site, logDate);
if (fileId < 0) {
- log.debug("Skipping previously imported log file {}.", urlString);
+ logger.debug("Skipping previously imported log file {}.", urlString);
return;
}
for (Map.Entry<String, Long> requests : parsedLogLines.entrySet()) {
@@ -185,7 +185,7 @@ public class Main {
int resourceId = insertResource(psResourcesSelect, psResourcesInsert,
resource);
if (resourceId < 0) {
- log.error("Could not retrieve auto-generated key for new resources "
+ logger.error("Could not retrieve auto-generated key for new resources "
+ "entry.");
connection.rollback();
return;
@@ -194,7 +194,7 @@ public class Main {
count);
}
connection.commit();
- log.debug("Finished importing log file with file name {} into database.",
+ logger.debug("Finished importing log file with file name {} into database.",
urlString);
}
@@ -265,7 +265,7 @@ public class Main {
static SortedSet<String> queryWebstats(Connection connection)
throws SQLException {
- log.info("Querying statistics from database.");
+ logger.info("Querying statistics from database.");
SortedSet<String> statistics = new TreeSet<>();
Statement st = connection.createStatement();
String queryString = "SELECT " + ALL_COLUMNS + " FROM webstats";
@@ -295,14 +295,14 @@ public class Main {
List<String> lines = new ArrayList<>();
lines.add(ALL_COLUMNS);
lines.addAll(statistics);
- log.info("Writing {} lines to {}.", lines.size(),
+ logger.info("Writing {} lines to {}.", lines.size(),
webstatsPath.toFile().getAbsolutePath());
Files.write(webstatsPath, lines, StandardCharsets.UTF_8);
}
private static void disconnectFromDatabase(Connection connection)
throws SQLException {
- log.info("Disconnecting from database.");
+ logger.info("Disconnecting from database.");
connection.close();
}
}
diff --git a/src/main/java/org/torproject/metrics/web/ServerMain.java b/src/main/java/org/torproject/metrics/web/ServerMain.java
index bb03086..21f8529 100644
--- a/src/main/java/org/torproject/metrics/web/ServerMain.java
+++ b/src/main/java/org/torproject/metrics/web/ServerMain.java
@@ -13,21 +13,22 @@ import java.util.Locale;
public class ServerMain {
- private static final Logger log = LoggerFactory.getLogger(ServerMain.class);
+ private static final Logger logger
+ = LoggerFactory.getLogger(ServerMain.class);
/** Starts the web server listening for incoming client connections. */
public static void main(String[] args) {
Locale.setDefault(Locale.US);
try {
Resource jettyXml = Resource.newSystemResource("jetty.xml");
- log.info("Reading configuration from '{}'.", jettyXml);
+ logger.info("Reading configuration from '{}'.", jettyXml);
XmlConfiguration configuration
= new XmlConfiguration(jettyXml.getInputStream());
Server server = (Server) configuration.configure();
server.start();
server.join();
} catch (Exception ex) {
- log.error("Exiting, because of: {}.", ex.getMessage(), ex);
+ logger.error("Exiting, because of: {}.", ex.getMessage(), ex);
System.exit(1);
}
}
diff --git a/src/main/java/org/torproject/metrics/web/UpdateNews.java b/src/main/java/org/torproject/metrics/web/UpdateNews.java
index 8f4440e..07b1d75 100644
--- a/src/main/java/org/torproject/metrics/web/UpdateNews.java
+++ b/src/main/java/org/torproject/metrics/web/UpdateNews.java
@@ -18,7 +18,8 @@ import java.util.Locale;
public class UpdateNews {
- private static Logger log = LoggerFactory.getLogger(UpdateNews.class);
+ private static final Logger logger
+ = LoggerFactory.getLogger(UpdateNews.class);
/** Update news. */
public static void main(String[] args) throws Exception {
@@ -79,7 +80,7 @@ public class UpdateNews {
int space = desc.indexOf(" ", open);
int close = desc.indexOf("]", open);
if (open < 0 || space < 0 || close < 0) {
- log.warn("Cannot convert link in line {}. Exiting.", line);
+ logger.warn("Cannot convert link in line {}. Exiting.", line);
System.exit(1);
}
desc = desc.substring(0, open) + "<a href=\""
@@ -91,7 +92,8 @@ public class UpdateNews {
int open = desc.indexOf("`");
int close = desc.indexOf("`", open + 1);
if (open < 0 || close < 0) {
- log.warn("Cannot convert code fragment in line {}. Exiting.", line);
+ logger.warn("Cannot convert code fragment in line {}. Exiting.",
+ line);
System.exit(1);
}
desc = desc.substring(0, open) + "<code>"
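The backtick-to-`<code>` rewrite in this hunk is compact enough to restate as a standalone sketch. `CodeSpan` and `convertFirstCodeSpan` are hypothetical names introduced for illustration; the original exits the process on unbalanced backticks, whereas this sketch returns null:

```java
public class CodeSpan {

    /**
     * Replaces the first `fragment` in a line with
     * <code>fragment</code>, or returns null when the backticks are
     * unbalanced (UpdateNews logs a warning and exits in that case).
     */
    static String convertFirstCodeSpan(String desc) {
        int open = desc.indexOf('`');
        int close = desc.indexOf('`', open + 1);
        if (open < 0 || close < 0) {
            return null;
        }
        return desc.substring(0, open) + "<code>"
            + desc.substring(open + 1, close) + "</code>"
            + desc.substring(close + 1);
    }
}
```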
diff --git a/src/main/resources/logback.xml b/src/main/resources/logback.xml
deleted file mode 100644
index 7789feb..0000000
--- a/src/main/resources/logback.xml
+++ /dev/null
@@ -1,58 +0,0 @@
-<configuration debug="false">
- <statusListener class="ch.qos.logback.core.status.NopStatusListener" />
-
- <!-- a path and a prefix -->
- <property name="logfile-base" value="${LOGBASE}/metrics-web-" />
-
- <!-- log file names -->
- <property name="fileall-logname" value="${logfile-base}all" />
- <property name="fileerr-logname" value="${logfile-base}err" />
- <property name="filestatistics-logname" value="${logfile-base}statistics" />
-
- <!-- date pattern -->
- <property name="utc-date-pattern" value="%date{ISO8601, UTC}" />
-
- <!-- appender section -->
- <appender name="FILEALL" class="ch.qos.logback.core.rolling.RollingFileAppender">
- <file>${fileall-logname}.log</file>
- <encoder>
- <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
- </encoder>
- <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
- <!-- rollover daily -->
- <FileNamePattern>${fileall-logname}.%d{yyyy-MM-dd}.%i.log</FileNamePattern>
- <maxHistory>10</maxHistory>
- <timeBasedFileNamingAndTriggeringPolicy
- class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
- <!-- or whenever the file size reaches 1MB -->
- <maxFileSize>1MB</maxFileSize>
- </timeBasedFileNamingAndTriggeringPolicy>
- </rollingPolicy>
- </appender>
-
- <appender name="FILEERR" class="ch.qos.logback.core.FileAppender">
- <file>${fileerr-logname}.log</file>
- <encoder>
- <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
- </encoder>
-
- <!-- ERROR or worse -->
- <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
- <level>ERROR</level>
- </filter>
- </appender>
-
- <!-- logger section -->
- <logger name="org.torproject" >
- <appender-ref ref="FILEERR" />
- </logger>
-
- <logger name="org.eclipse" level="INFO" />
- <logger name="org.apache" level="INFO" />
-
- <root level="ALL">
- <appender-ref ref="FILEALL" />
- </root>
-
-</configuration>
-
diff --git a/src/submods/metrics-lib b/src/submods/metrics-lib
index 81570c4..d7d5303 160000
--- a/src/submods/metrics-lib
+++ b/src/submods/metrics-lib
@@ -1 +1 @@
-Subproject commit 81570c4dbc097089f367c104c7ef5a77bee29763
+Subproject commit d7d5303e76a69f5fd0fe2b3b9f6be9f25f1fd824
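The recurring edit across these files is declaring each class's SLF4J logger as `private static final Logger logger`, named after the class. A minimal stdlib-only sketch of that pattern, with `java.util.logging` standing in for SLF4J's `LoggerFactory.getLogger(Class)` so it compiles without extra jars, and `WebstatsExample` as a hypothetical class name:

```java
import java.util.logging.Logger;

public class WebstatsExample {

    // Pattern the diffs converge on: one per-class logger, named
    // "logger", declared private static final. The commits use SLF4J's
    // LoggerFactory.getLogger(WebstatsExample.class); java.util.logging
    // stands in here so the sketch runs with the JDK alone.
    private static final Logger logger =
        Logger.getLogger(WebstatsExample.class.getName());

    /** Exposes the logger's name, which by convention is the class name. */
    public static String loggerName() {
        return logger.getName();
    }

    public static void main(String[] args) {
        logger.info("Starting module.");
        logger.info("Terminated module.");
    }
}
```

Declaring the field `final` guarantees a single, safely published logger instance per class, which is why the commits add the keyword wherever it was missing.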
31 Mar '20
commit 335b02441f5c942b4bfbbe73e95a52d60322e2c9
Author: Karsten Loesing <karsten.loesing(a)gmx.net>
Date: Tue Mar 31 09:46:02 2020 +0200
Add change log entry for #33090.
---
CHANGELOG.md | 1 +
1 file changed, 1 insertion(+)
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 068c0b9..4a5a49e 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -8,6 +8,7 @@
* Minor changes
- Avoid invoking overridable methods from constructors.
+ - Make all descriptor instances serializable.
# Changes in version 2.10.0 - 2020-01-15
31 Mar '20
commit d7d5303e76a69f5fd0fe2b3b9f6be9f25f1fd824
Author: Karsten Loesing <karsten.loesing(a)gmx.net>
Date: Tue Mar 31 09:44:06 2020 +0200
Simplify logging configuration.
Implements #33549.
---
CHANGELOG.md | 1 +
src/build | 2 +-
.../descriptor/DescriptorSourceFactory.java | 4 +--
.../descriptor/impl/DescriptorParserImpl.java | 4 +--
.../descriptor/impl/DescriptorReaderImpl.java | 20 ++++++-------
.../descriptor/index/DescriptorIndexCollector.java | 34 ++++++++++++----------
.../org/torproject/descriptor/index/FileNode.java | 8 ++---
.../org/torproject/descriptor/index/IndexNode.java | 5 ----
.../descriptor/log/LogDescriptorImpl.java | 6 ----
.../descriptor/log/WebServerAccessLogImpl.java | 6 ----
.../descriptor/log/WebServerAccessLogLine.java | 4 +--
11 files changed, 39 insertions(+), 55 deletions(-)
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 4a5a49e..f0c9a63 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -9,6 +9,7 @@
* Minor changes
- Avoid invoking overridable methods from constructors.
- Make all descriptor instances serializable.
+ - Simplify logging configuration.
# Changes in version 2.10.0 - 2020-01-15
diff --git a/src/build b/src/build
index 264e498..fd85646 160000
--- a/src/build
+++ b/src/build
@@ -1 +1 @@
-Subproject commit 264e498f54a20f7d299daaf2533d043f880e6a8b
+Subproject commit fd856466bcb260f53ef69a24c102d0e49d171cc3
diff --git a/src/main/java/org/torproject/descriptor/DescriptorSourceFactory.java b/src/main/java/org/torproject/descriptor/DescriptorSourceFactory.java
index 97f93cc..3dc8439 100644
--- a/src/main/java/org/torproject/descriptor/DescriptorSourceFactory.java
+++ b/src/main/java/org/torproject/descriptor/DescriptorSourceFactory.java
@@ -40,7 +40,7 @@ import org.slf4j.LoggerFactory;
*/
public final class DescriptorSourceFactory {
- private static Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
DescriptorSourceFactory.class);
/**
@@ -147,7 +147,7 @@ public final class DescriptorSourceFactory {
}
object = ClassLoader.getSystemClassLoader().loadClass(clazzName)
.getDeclaredConstructor().newInstance();
- log.info("Serving implementation {} for {}.", clazzName, type);
+ logger.debug("Serving implementation {} for {}.", clazzName, type);
} catch (ReflectiveOperationException ex) {
throw new RuntimeException("Cannot load class "
+ clazzName + "for type " + type, ex);
diff --git a/src/main/java/org/torproject/descriptor/impl/DescriptorParserImpl.java b/src/main/java/org/torproject/descriptor/impl/DescriptorParserImpl.java
index 0a2444b..160baac 100644
--- a/src/main/java/org/torproject/descriptor/impl/DescriptorParserImpl.java
+++ b/src/main/java/org/torproject/descriptor/impl/DescriptorParserImpl.java
@@ -23,7 +23,7 @@ import java.util.List;
public class DescriptorParserImpl implements DescriptorParser {
- private static final Logger log
+ private static final Logger logger
= LoggerFactory.getLogger(DescriptorParserImpl.class);
@Override
@@ -33,7 +33,7 @@ public class DescriptorParserImpl implements DescriptorParser {
return this.detectTypeAndParseDescriptors(rawDescriptorBytes,
sourceFile, fileName);
} catch (DescriptorParseException e) {
- log.debug("Cannot parse descriptor file '{}'.", sourceFile, e);
+ logger.debug("Cannot parse descriptor file '{}'.", sourceFile, e);
List<Descriptor> parsedDescriptors = new ArrayList<>();
parsedDescriptors.add(new UnparseableDescriptorImpl(rawDescriptorBytes,
new int[] { 0, rawDescriptorBytes.length }, sourceFile, e));
diff --git a/src/main/java/org/torproject/descriptor/impl/DescriptorReaderImpl.java b/src/main/java/org/torproject/descriptor/impl/DescriptorReaderImpl.java
index 08c82ec..8ec04a5 100644
--- a/src/main/java/org/torproject/descriptor/impl/DescriptorReaderImpl.java
+++ b/src/main/java/org/torproject/descriptor/impl/DescriptorReaderImpl.java
@@ -36,11 +36,9 @@ import java.util.TreeMap;
public class DescriptorReaderImpl implements DescriptorReader {
- private static Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
DescriptorReaderImpl.class);
- private static Logger statisticsLog = LoggerFactory.getLogger("statistics");
-
private boolean hasStartedReading = false;
private File manualSaveHistoryFile;
@@ -161,8 +159,8 @@ public class DescriptorReaderImpl implements DescriptorReader {
this.readTarballs();
this.hasFinishedReading = true;
} catch (Throwable t) {
- log.error("Bug: uncaught exception or error while reading descriptors.",
- t);
+ logger.error("Bug: uncaught exception or error while reading "
+ + "descriptors.", t);
} finally {
if (null != this.descriptorQueue) {
this.descriptorQueue.setOutOfDescriptors();
@@ -180,7 +178,7 @@ public class DescriptorReaderImpl implements DescriptorReader {
StandardCharsets.UTF_8);
for (String line : lines) {
if (!line.contains(" ")) {
- log.warn("Unexpected line structure in old history: {}", line);
+ logger.warn("Unexpected line structure in old history: {}", line);
continue;
}
long lastModifiedMillis = Long.parseLong(line.substring(0,
@@ -189,7 +187,7 @@ public class DescriptorReaderImpl implements DescriptorReader {
this.excludedFilesBefore.put(absolutePath, lastModifiedMillis);
}
} catch (IOException | NumberFormatException e) {
- log.warn("Trouble reading given history file {}.", historyFile, e);
+ logger.warn("Trouble reading given history file {}.", historyFile, e);
}
}
@@ -212,7 +210,7 @@ public class DescriptorReaderImpl implements DescriptorReader {
bw.newLine();
}
} catch (IOException e) {
- log.warn("Trouble writing new history file '{}'.",
+ logger.warn("Trouble writing new history file '{}'.",
historyFile, e);
}
}
@@ -250,7 +248,7 @@ public class DescriptorReaderImpl implements DescriptorReader {
}
this.parsedFilesAfter.put(absolutePath, lastModifiedMillis);
} catch (IOException e) {
- log.warn("Unable to read descriptor file {}.", file, e);
+ logger.warn("Unable to read descriptor file {}.", file, e);
}
}
}
@@ -271,13 +269,13 @@ public class DescriptorReaderImpl implements DescriptorReader {
this.parsedFilesAfter.put(tarball.getAbsolutePath(),
tarball.lastModified());
} catch (IOException e) {
- log.warn("Unable to read tarball {}.", tarball, e);
+ logger.warn("Unable to read tarball {}.", tarball, e);
}
long previousPercentDone = 100L * progress / total;
progress += tarball.length();
long percentDone = 100L * progress / total;
if (percentDone > previousPercentDone) {
- statisticsLog.info("Finished reading {}% of tarball bytes.",
+ logger.info("Finished reading {}% of tarball bytes.",
percentDone);
}
}
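The hunk above also shows the throttled progress reporting that previously went to the separate `statistics` logger: a message is emitted only when the integer percentage advances, so even a very long run logs at most 100 lines. That guard can be factored into a pure helper for illustration (`ProgressReporter` and `percentMilestones` are hypothetical names):

```java
import java.util.ArrayList;
import java.util.List;

public class ProgressReporter {

    /**
     * Mirrors the tarball-reading loop: records a milestone only when
     * the integer percentage advances. In the diff each milestone
     * becomes a logger.info("Finished reading {}% ...") call.
     */
    static List<Long> percentMilestones(long[] chunkSizes) {
        long total = 0;
        for (long chunk : chunkSizes) {
            total += chunk;
        }
        List<Long> milestones = new ArrayList<>();
        long progress = 0;
        for (long chunk : chunkSizes) {
            long previousPercentDone = 100L * progress / total;
            progress += chunk;
            long percentDone = 100L * progress / total;
            if (percentDone > previousPercentDone) {
                milestones.add(percentDone);
            }
        }
        return milestones;
    }
}
```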
diff --git a/src/main/java/org/torproject/descriptor/index/DescriptorIndexCollector.java b/src/main/java/org/torproject/descriptor/index/DescriptorIndexCollector.java
index b4aae59..b2f4e97 100644
--- a/src/main/java/org/torproject/descriptor/index/DescriptorIndexCollector.java
+++ b/src/main/java/org/torproject/descriptor/index/DescriptorIndexCollector.java
@@ -30,7 +30,7 @@ import java.util.TreeMap;
*/
public class DescriptorIndexCollector implements DescriptorCollector {
- private static Logger log = LoggerFactory
+ private static final Logger logger = LoggerFactory
.getLogger(DescriptorIndexCollector.class);
/**
@@ -44,7 +44,7 @@ public class DescriptorIndexCollector implements DescriptorCollector {
public void collectDescriptors(String collecTorIndexUrlString,
String[] remoteDirectories, long minLastModified,
File localDirectory, boolean deleteExtraneousLocalFiles) {
- log.info("Starting descriptor collection.");
+ logger.info("Starting descriptor collection.");
if (minLastModified < 0) {
throw new IllegalArgumentException("A negative minimum "
+ "last-modified time is not permitted.");
@@ -60,7 +60,8 @@ public class DescriptorIndexCollector implements DescriptorCollector {
+ "fetched files. Move this file away or delete it. Aborting "
+ "descriptor collection.");
}
- log.info("Indexing local directory {}.", localDirectory.getAbsolutePath());
+ logger.info("Indexing local directory {}.",
+ localDirectory.getAbsolutePath());
SortedMap<String, Long> localFiles = statLocalDirectory(localDirectory);
SortedMap<String, FileNode> remoteFiles;
IndexNode index;
@@ -71,27 +72,27 @@ public class DescriptorIndexCollector implements DescriptorCollector {
if (indexUrl.getPath().isEmpty()) {
indexUrlString += "/index/index.json";
}
- log.info("Fetching remote index file {}.", indexUrlString);
+ logger.info("Fetching remote index file {}.", indexUrlString);
index = IndexNode.fetchIndex(indexUrlString);
remoteFiles = index.retrieveFilesIn(remoteDirectories);
} catch (Exception ex) {
- log.warn("Cannot fetch index file {} and hence cannot determine which "
+ logger.warn("Cannot fetch index file {} and hence cannot determine which "
+ "remote files to fetch. Aborting descriptor collection.",
indexUrlString, ex);
return;
}
- log.info("Fetching remote files from {}.", index.path);
+ logger.info("Fetching remote files from {}.", index.path);
if (!this.fetchRemoteFiles(index.path, remoteFiles, minLastModified,
localDirectory, localFiles)) {
return;
}
if (deleteExtraneousLocalFiles) {
- log.info("Deleting extraneous files from local directory {}.",
+ logger.info("Deleting extraneous files from local directory {}.",
localDirectory);
deleteExtraneousLocalFiles(remoteDirectories, remoteFiles, localDirectory,
localFiles);
}
- log.info("Finished descriptor collection.");
+ logger.info("Finished descriptor collection.");
}
boolean fetchRemoteFiles(String baseUrl, SortedMap<String, FileNode> remotes,
@@ -108,14 +109,15 @@ public class DescriptorIndexCollector implements DescriptorCollector {
continue;
}
if (!filepath.exists() && !filepath.mkdirs()) {
- log.warn("Cannot create local directory {} to store remote file {}. "
+ logger.warn("Cannot create local directory {} to store remote file {}. "
+ "Aborting descriptor collection.", filepath, filename);
return false;
}
File destinationFile = new File(filepath, filename);
File tempDestinationFile = new File(filepath, "." + filename);
- log.debug("Fetching remote file {} with expected size of {} bytes from "
- + "{}, storing locally to temporary file {}, then renaming to {}.",
+ logger.debug("Fetching remote file {} with expected size of {} bytes "
+ + "from {}, storing locally to temporary file {}, then renaming to "
+ + "{}.",
filepathname, entry.getValue().size, baseUrl,
tempDestinationFile.getAbsolutePath(),
destinationFile.getAbsolutePath());
@@ -127,14 +129,14 @@ public class DescriptorIndexCollector implements DescriptorCollector {
tempDestinationFile.renameTo(destinationFile);
destinationFile.setLastModified(lastModifiedMillis);
} else {
- log.warn("Fetched remote file {} from {} has a size of {} bytes "
+ logger.warn("Fetched remote file {} from {} has a size of {} bytes "
+ "which is different from the expected {} bytes. Not storing "
+ "this file.",
filename, baseUrl, tempDestinationFile.length(),
entry.getValue().size);
}
} catch (IOException e) {
- log.warn("Cannot fetch remote file {} from {}. Skipping that file.",
+ logger.warn("Cannot fetch remote file {} from {}. Skipping that file.",
filename, baseUrl, e);
}
}
@@ -151,7 +153,7 @@ public class DescriptorIndexCollector implements DescriptorCollector {
if (localPath.startsWith(remDir)) {
if (!remoteFiles.containsKey(localPath)) {
File extraneousLocalFile = new File(localDir, localPath);
- log.debug("Deleting extraneous local file {}.",
+ logger.debug("Deleting extraneous local file {}.",
extraneousLocalFile.getAbsolutePath());
extraneousLocalFile.delete();
}
@@ -179,8 +181,8 @@ public class DescriptorIndexCollector implements DescriptorCollector {
}
});
} catch (IOException ioe) {
- log.warn("Cannot index local directory {} to skip any remote files that "
- + "already exist locally. Continuing with an either empty or "
+ logger.warn("Cannot index local directory {} to skip any remote files "
+ + "that already exist locally. Continuing with an either empty or "
+ "incomplete index of local files.", localDir, ioe);
}
return locals;
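The fetch logic in this file downloads each remote file to a dot-prefixed temporary file, keeps it only if its size matches the index entry, and then renames it into place so readers never observe a partially written file. A stand-alone sketch of that store step, under stated assumptions: `SafeFetch` and `storeIfComplete` are hypothetical names, and the original uses `File.renameTo` and logs a warning instead of returning a flag:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class SafeFetch {

    /**
     * Writes data to a dot-prefixed temporary file, keeps the result
     * only if its size matches the expected size from the index, then
     * renames it into place. Returns false (discarding the temp file)
     * on a size mismatch or I/O error, so the caller can retry later.
     */
    static boolean storeIfComplete(Path directory, String filename,
            byte[] data, long expectedSize) {
        Path temp = directory.resolve("." + filename);
        try {
            Files.write(temp, data);
            if (Files.size(temp) != expectedSize) {
                Files.delete(temp); // incomplete download: discard
                return false;
            }
            Files.move(temp, directory.resolve(filename),
                StandardCopyOption.REPLACE_EXISTING);
            return true;
        } catch (IOException e) {
            return false; // in the diff: logger.warn("Cannot fetch ...", e)
        }
    }
}
```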
diff --git a/src/main/java/org/torproject/descriptor/index/FileNode.java b/src/main/java/org/torproject/descriptor/index/FileNode.java
index 6c35146..a433d59 100644
--- a/src/main/java/org/torproject/descriptor/index/FileNode.java
+++ b/src/main/java/org/torproject/descriptor/index/FileNode.java
@@ -21,7 +21,7 @@ import java.util.TimeZone;
*/
public class FileNode implements Comparable<FileNode> {
- private static Logger log = LoggerFactory.getLogger(FileNode.class);
+ private static final Logger logger = LoggerFactory.getLogger(FileNode.class);
/** Path (i.e. file name) is exposed in JSON. */
public final String path;
@@ -70,9 +70,9 @@ public class FileNode implements Comparable<FileNode> {
try {
lastModifiedMillis = dateTimeFormat.parse(this.lastModified).getTime();
} catch (ParseException ex) {
- log.warn("Cannot parse last-modified time {} of remote file entry {}. "
- + "Fetching remote file regardless of configured last-modified "
- + "time. The following error message provides more details.",
+ logger.warn("Cannot parse last-modified time {} of remote file entry "
+ + "{}. Fetching remote file regardless of configured last-modified "
+ + "time. The following error message provides more details.",
this.lastModified, this.path, ex);
this.lastModifiedMillis = -1L;
}
diff --git a/src/main/java/org/torproject/descriptor/index/IndexNode.java b/src/main/java/org/torproject/descriptor/index/IndexNode.java
index ce3faa4..d5f62ad 100644
--- a/src/main/java/org/torproject/descriptor/index/IndexNode.java
+++ b/src/main/java/org/torproject/descriptor/index/IndexNode.java
@@ -14,9 +14,6 @@ import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.PropertyNamingStrategy;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
@@ -39,8 +36,6 @@ import java.util.TreeSet;
@JsonPropertyOrder({ "created", "revision", "path", "directories", "files" })
public class IndexNode {
- private static Logger log = LoggerFactory.getLogger(IndexNode.class);
-
private static final int READ_TIMEOUT = Integer.parseInt(System
.getProperty("sun.net.client.defaultReadTimeout", "60000"));
diff --git a/src/main/java/org/torproject/descriptor/log/LogDescriptorImpl.java b/src/main/java/org/torproject/descriptor/log/LogDescriptorImpl.java
index a253c50..fdebc90 100644
--- a/src/main/java/org/torproject/descriptor/log/LogDescriptorImpl.java
+++ b/src/main/java/org/torproject/descriptor/log/LogDescriptorImpl.java
@@ -8,9 +8,6 @@ import org.torproject.descriptor.DescriptorParseException;
import org.torproject.descriptor.LogDescriptor;
import org.torproject.descriptor.internal.FileType;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.File;
@@ -37,9 +34,6 @@ public abstract class LogDescriptorImpl
private static final int unrecognizedLinesLimit = 3;
- private static final Logger log
- = LoggerFactory.getLogger(LogDescriptorImpl.class);
-
private static Pattern filenamePattern = Pattern.compile(
"(?:\\S*)" + MARKER + SEP + "(?:[0-9a-zA-Z]*)(?:\\.?)([a-zA-Z2]*)");
diff --git a/src/main/java/org/torproject/descriptor/log/WebServerAccessLogImpl.java b/src/main/java/org/torproject/descriptor/log/WebServerAccessLogImpl.java
index eb05413..986eafc 100644
--- a/src/main/java/org/torproject/descriptor/log/WebServerAccessLogImpl.java
+++ b/src/main/java/org/torproject/descriptor/log/WebServerAccessLogImpl.java
@@ -7,9 +7,6 @@ import org.torproject.descriptor.DescriptorParseException;
import org.torproject.descriptor.WebServerAccessLog;
import org.torproject.descriptor.internal.FileType;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
import java.io.BufferedReader;
import java.io.File;
import java.io.InputStreamReader;
@@ -35,9 +32,6 @@ public class WebServerAccessLogImpl extends LogDescriptorImpl
private static final long serialVersionUID = 7528914359452568309L;
- private static final Logger log
- = LoggerFactory.getLogger(WebServerAccessLogImpl.class);
-
/** The log's name should include this string. */
public static final String MARKER = InternalWebServerAccessLog.MARKER;
diff --git a/src/main/java/org/torproject/descriptor/log/WebServerAccessLogLine.java b/src/main/java/org/torproject/descriptor/log/WebServerAccessLogLine.java
index 445df9d..b39c633 100644
--- a/src/main/java/org/torproject/descriptor/log/WebServerAccessLogLine.java
+++ b/src/main/java/org/torproject/descriptor/log/WebServerAccessLogLine.java
@@ -25,7 +25,7 @@ public class WebServerAccessLogLine implements WebServerAccessLog.Line {
private static final long serialVersionUID = 6160416810587561460L;
- private static final Logger log = LoggerFactory
+ private static final Logger logger = LoggerFactory
.getLogger(WebServerAccessLogLine.class);
private static final String DATE_PATTERN = "dd/MMM/yyyy";
@@ -153,7 +153,7 @@ public class WebServerAccessLogLine implements WebServerAccessLog.Line {
res.valid = true;
}
} catch (Throwable th) {
- log.debug("Unmatchable line: '{}'.", line, th);
+ logger.debug("Unmatchable line: '{}'.", line, th);
return new WebServerAccessLogLine();
}
return res;
commit 77d9429797594113d2876ef5c3600d8fa37caf46
Author: Karsten Loesing <karsten.loesing(a)gmx.net>
Date: Tue Mar 31 09:18:17 2020 +0200
Simplify logging configuration.
Implements #33549.
---
CHANGELOG.md | 3 +
src/build | 2 +-
.../org/torproject/metrics/collector/Main.java | 4 +-
.../metrics/collector/cron/ShutdownHook.java | 7 +-
.../persist/BandwidthFilePersistence.java | 7 +-
.../collector/persist/DescriptorPersistence.java | 6 -
.../collector/persist/PersistenceUtils.java | 8 +-
.../metrics/collector/persist/VotePersistence.java | 7 +-
.../metrics/collector/sync/SyncManager.java | 21 +--
.../metrics/collector/sync/SyncPersistence.java | 11 +-
.../metrics/collector/webstats/LogFileMap.java | 7 +-
.../metrics/collector/webstats/LogMetadata.java | 6 +-
.../collector/webstats/SanitizeWeblogs.java | 33 ++--
.../collector/webstats/WebServerAccessLogLine.java | 4 +-
src/main/resources/logback.xml | 167 ---------------------
15 files changed, 70 insertions(+), 223 deletions(-)
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 5606180..c284d47 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,5 +1,8 @@
# Changes in version 1.1?.? - 2020-0?-??
+ * Minor changes
+ - Simplify logging configuration.
+
# Changes in version 1.14.1 - 2020-01-16
diff --git a/src/build b/src/build
index 264e498..fd85646 160000
--- a/src/build
+++ b/src/build
@@ -1 +1 @@
-Subproject commit 264e498f54a20f7d299daaf2533d043f880e6a8b
+Subproject commit fd856466bcb260f53ef69a24c102d0e49d171cc3
diff --git a/src/main/java/org/torproject/metrics/collector/Main.java b/src/main/java/org/torproject/metrics/collector/Main.java
index 3822353..3e8ec33 100644
--- a/src/main/java/org/torproject/metrics/collector/Main.java
+++ b/src/main/java/org/torproject/metrics/collector/Main.java
@@ -39,7 +39,7 @@ import java.util.Map;
*/
public class Main {
- private static final Logger log = LoggerFactory.getLogger(Main.class);
+ private static final Logger logger = LoggerFactory.getLogger(Main.class);
public static final String CONF_FILE = "collector.properties";
@@ -116,7 +116,7 @@ public class Main {
+ ") and provide at least one data source and one data sink. "
+ "Refer to the manual for more information.");
} catch (IOException e) {
- log.error("Cannot write default configuration.", e);
+ logger.error("Cannot write default configuration.", e);
throw new RuntimeException(e);
}
}
diff --git a/src/main/java/org/torproject/metrics/collector/cron/ShutdownHook.java b/src/main/java/org/torproject/metrics/collector/cron/ShutdownHook.java
index ec34a19..7e0d0be 100644
--- a/src/main/java/org/torproject/metrics/collector/cron/ShutdownHook.java
+++ b/src/main/java/org/torproject/metrics/collector/cron/ShutdownHook.java
@@ -11,7 +11,8 @@ import org.slf4j.LoggerFactory;
*/
public final class ShutdownHook extends Thread {
- private static final Logger log = LoggerFactory.getLogger(ShutdownHook.class);
+ private static final Logger logger
+ = LoggerFactory.getLogger(ShutdownHook.class);
private boolean stayAlive = true;
@@ -37,13 +38,13 @@ public final class ShutdownHook extends Thread {
@Override
public void run() {
- log.info("Shutdown in progress ... ");
+ logger.info("Shutdown in progress ... ");
Scheduler.getInstance().shutdownScheduler();
synchronized (this) {
this.stayAlive = false;
this.notify();
}
- log.info("Shutdown finished. Exiting.");
+ logger.info("Shutdown finished. Exiting.");
}
}
diff --git a/src/main/java/org/torproject/metrics/collector/persist/BandwidthFilePersistence.java b/src/main/java/org/torproject/metrics/collector/persist/BandwidthFilePersistence.java
index bbbfca5..8664ae8 100644
--- a/src/main/java/org/torproject/metrics/collector/persist/BandwidthFilePersistence.java
+++ b/src/main/java/org/torproject/metrics/collector/persist/BandwidthFilePersistence.java
@@ -7,6 +7,8 @@ import org.torproject.descriptor.BandwidthFile;
import org.torproject.metrics.collector.conf.Annotation;
import org.apache.commons.codec.digest.DigestUtils;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
import java.nio.file.Paths;
import java.time.LocalDateTime;
@@ -16,6 +18,9 @@ import java.time.format.DateTimeFormatter;
public class BandwidthFilePersistence
extends DescriptorPersistence<BandwidthFile> {
+ private static final Logger logger
+ = LoggerFactory.getLogger(BandwidthFilePersistence.class);
+
private static final String BANDWIDTH = "bandwidth";
private static final String BANDWIDTHS = "bandwidths";
@@ -57,7 +62,7 @@ public class BandwidthFilePersistence
System.arraycopy(bytes, start, forDigest, 0, forDigest.length);
digest = DigestUtils.sha256Hex(forDigest).toUpperCase();
} else {
- log.error("No digest calculation possible. Returning empty string.");
+ logger.error("No digest calculation possible. Returning empty string.");
}
return digest;
}
diff --git a/src/main/java/org/torproject/metrics/collector/persist/DescriptorPersistence.java b/src/main/java/org/torproject/metrics/collector/persist/DescriptorPersistence.java
index 7c648ef..a2c9bc4 100644
--- a/src/main/java/org/torproject/metrics/collector/persist/DescriptorPersistence.java
+++ b/src/main/java/org/torproject/metrics/collector/persist/DescriptorPersistence.java
@@ -5,18 +5,12 @@ package org.torproject.metrics.collector.persist;
import org.torproject.descriptor.Descriptor;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.util.List;
public abstract class DescriptorPersistence<T extends Descriptor> {
- protected static final Logger log = LoggerFactory.getLogger(
- DescriptorPersistence.class);
-
protected static final String BRIDGEDESCS = "bridge-descriptors";
protected static final String BRIDGEPOOLASSIGNMENTS
= "bridge-pool-assignments";
diff --git a/src/main/java/org/torproject/metrics/collector/persist/PersistenceUtils.java b/src/main/java/org/torproject/metrics/collector/persist/PersistenceUtils.java
index 72ad73a..da1403c 100644
--- a/src/main/java/org/torproject/metrics/collector/persist/PersistenceUtils.java
+++ b/src/main/java/org/torproject/metrics/collector/persist/PersistenceUtils.java
@@ -23,7 +23,7 @@ import java.util.TimeZone;
public class PersistenceUtils {
- private static final Logger log = LoggerFactory.getLogger(
+ private static final Logger logger = LoggerFactory.getLogger(
PersistenceUtils.class);
public static final String TEMPFIX = ".tmp";
@@ -55,14 +55,14 @@ public class PersistenceUtils {
}
return createOrAppend(typeAnnotation, data, tmpPath, option);
} catch (FileAlreadyExistsException faee) {
- log.debug("Already have descriptor(s) of type '{}': {}. Skipping.",
+ logger.debug("Already have descriptor(s) of type '{}': {}. Skipping.",
new String(typeAnnotation), outputPath);
} catch (IOException | SecurityException
| UnsupportedOperationException e) {
- log.warn("Could not store descriptor(s) {} of type '{}'",
+ logger.warn("Could not store descriptor(s) {} of type '{}'",
outputPath, new String(typeAnnotation), e);
} catch (Throwable th) { // anything else
- log.warn("Problem storing descriptor(s) {} of type '{}'",
+ logger.warn("Problem storing descriptor(s) {} of type '{}'",
outputPath, new String(typeAnnotation), th);
}
return false;
diff --git a/src/main/java/org/torproject/metrics/collector/persist/VotePersistence.java b/src/main/java/org/torproject/metrics/collector/persist/VotePersistence.java
index 461ca40..5973795 100644
--- a/src/main/java/org/torproject/metrics/collector/persist/VotePersistence.java
+++ b/src/main/java/org/torproject/metrics/collector/persist/VotePersistence.java
@@ -7,6 +7,8 @@ import org.torproject.descriptor.RelayNetworkStatusVote;
import org.torproject.metrics.collector.conf.Annotation;
import org.apache.commons.codec.digest.DigestUtils;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
import java.nio.charset.StandardCharsets;
import java.nio.file.Paths;
@@ -14,6 +16,9 @@ import java.nio.file.Paths;
public class VotePersistence
extends DescriptorPersistence<RelayNetworkStatusVote> {
+ private static final Logger logger
+ = LoggerFactory.getLogger(VotePersistence.class);
+
private static final String VOTE = "vote";
private static final String VOTES = "votes";
@@ -56,7 +61,7 @@ public class VotePersistence
System.arraycopy(bytes, start, forDigest, 0, sig - start);
digest = DigestUtils.sha1Hex(forDigest).toUpperCase();
} else {
- log.error("No digest calculation possible. Returning empty string.");
+ logger.error("No digest calculation possible. Returning empty string.");
}
return digest;
}
diff --git a/src/main/java/org/torproject/metrics/collector/sync/SyncManager.java b/src/main/java/org/torproject/metrics/collector/sync/SyncManager.java
index e42ae61..1fa1347 100644
--- a/src/main/java/org/torproject/metrics/collector/sync/SyncManager.java
+++ b/src/main/java/org/torproject/metrics/collector/sync/SyncManager.java
@@ -25,7 +25,8 @@ import java.util.Set;
public class SyncManager {
- private static final Logger log = LoggerFactory.getLogger(SyncManager.class);
+ private static final Logger logger
+ = LoggerFactory.getLogger(SyncManager.class);
public static final String SYNCORIGINS = "SyncOrigins";
private Date collectionDate;
@@ -53,12 +54,12 @@ public class SyncManager {
File storage = new File(basePath.toFile(),
marker + "-" + source.getHost());
storage.mkdirs();
- log.info("Collecting {} from {} ...", marker, source.getHost());
+ logger.info("Collecting {} from {} ...", marker, source.getHost());
descriptorCollector.collectDescriptors(source.toString(),
dirs.toArray(new String[dirs.size()]), 0L, storage, true);
- log.info("Done collecting {} from {}.", marker, source.getHost());
+ logger.info("Done collecting {} from {}.", marker, source.getHost());
} catch (Throwable th) { // catch all
- log.warn("Cannot download {} from {}.", dirs, source, th);
+ logger.warn("Cannot download {} from {}.", dirs, source, th);
}
}
}
@@ -72,7 +73,7 @@ public class SyncManager {
= new ProcessCriterium(UnparseableDescriptor.class);
for (URL source : sources) {
File base = new File(basePath.toFile(), marker + "-" + source.getHost());
- log.info("Merging {} from {} into storage ...", marker,
+ logger.info("Merging {} from {} into storage ...", marker,
source.getHost());
for (Map.Entry<String, Class<? extends Descriptor>> entry
: mapPathDesc.entrySet()) {
@@ -86,21 +87,21 @@ public class SyncManager {
"sync-history-" + source.getHost() + "-" + marker + "-"
+ histFileEnding);
descriptorReader.setHistoryFile(historyFile);
- log.info("Reading {} of type {} ... ", marker, histFileEnding);
+ logger.info("Reading {} of type {} ... ", marker, histFileEnding);
Iterator<Descriptor> descriptors
= descriptorReader.readDescriptors(descFile).iterator();
- log.info("Done reading {} of type {}.", marker, histFileEnding);
+ logger.info("Done reading {} of type {}.", marker, histFileEnding);
Criterium<Descriptor> crit = new ProcessCriterium(entry.getValue());
while (descriptors.hasNext()) {
Descriptor desc = descriptors.next();
if (unparseable.applies(desc)) {
Exception ex
= ((UnparseableDescriptor)desc).getDescriptorParseException();
- log.warn("Parsing of {} caused Exception(s). Processing anyway.",
+ logger.warn("Parsing of {} caused Exception(s). Processing anyway.",
desc.getDescriptorFile(), ex);
}
if (!crit.applies(desc)) {
- log.warn("Not processing {} in {}.", desc.getClass().getName(),
+ logger.warn("Not processing {} in {}.", desc.getClass().getName(),
desc.getDescriptorFile());
continue;
}
@@ -110,7 +111,7 @@ public class SyncManager {
persist.cleanDirectory();
descriptorReader.saveHistoryFile(historyFile);
}
- log.info("Done merging {} from {}.", marker, source.getHost());
+ logger.info("Done merging {} from {}.", marker, source.getHost());
}
}
diff --git a/src/main/java/org/torproject/metrics/collector/sync/SyncPersistence.java b/src/main/java/org/torproject/metrics/collector/sync/SyncPersistence.java
index f81e164..adffb93 100644
--- a/src/main/java/org/torproject/metrics/collector/sync/SyncPersistence.java
+++ b/src/main/java/org/torproject/metrics/collector/sync/SyncPersistence.java
@@ -48,7 +48,7 @@ import java.nio.file.Path;
/** Provides persistence for descriptors based on the descriptor type. */
public class SyncPersistence {
- private static final Logger log
+ private static final Logger logger
= LoggerFactory.getLogger(SyncPersistence.class);
private final Path recentPath;
@@ -72,7 +72,7 @@ public class SyncPersistence {
try {
PersistenceUtils.cleanDirectory(recentPath);
} catch (IOException ioe) {
- log.error("Cleaning of {} failed.", recentPath.toString(), ioe);
+ logger.error("Cleaning of {} failed.", recentPath.toString(), ioe);
}
}
@@ -126,7 +126,8 @@ public class SyncPersistence {
case "BridgeNetworkStatus": // need to infer authId from filename
String[] filenameParts = filename.split(DASH);
if (filenameParts.length < 3) {
- log.error("Invalid BridgeNetworkStatus; skipping: {}.", filename);
+ logger.error("Invalid BridgeNetworkStatus; skipping: {}.",
+ filename);
break;
}
descPersist = new StatusPersistence(
@@ -160,7 +161,7 @@ public class SyncPersistence {
descPersist = new BridgedbMetricsPersistence((BridgedbMetrics) desc);
break;
default:
- log.trace("Invalid descriptor type {} for sync-merge.",
+ logger.trace("Invalid descriptor type {} for sync-merge.",
clazz.getName());
continue;
}
@@ -171,7 +172,7 @@ public class SyncPersistence {
break;
}
if (!recognizedAndWritten) {
- log.error("Unknown descriptor type {} implementing {}.",
+ logger.error("Unknown descriptor type {} implementing {}.",
desc.getClass().getSimpleName(), desc.getClass().getInterfaces());
}
}
diff --git a/src/main/java/org/torproject/metrics/collector/webstats/LogFileMap.java b/src/main/java/org/torproject/metrics/collector/webstats/LogFileMap.java
index 5be6b50..fb39202 100644
--- a/src/main/java/org/torproject/metrics/collector/webstats/LogFileMap.java
+++ b/src/main/java/org/torproject/metrics/collector/webstats/LogFileMap.java
@@ -22,7 +22,8 @@ import java.util.TreeMap;
public class LogFileMap
extends TreeMap<String, TreeMap<String, TreeMap<LocalDate, LogMetadata>>> {
- private static final Logger log = LoggerFactory.getLogger(LogFileMap.class);
+ private static final Logger logger
+ = LoggerFactory.getLogger(LogFileMap.class);
/**
* The map to keep track of the logfiles by virtual host,
@@ -54,13 +55,13 @@ public class LogFileMap
private FileVisitResult logIfError(Path path, IOException ex) {
if (null != ex) {
- log.warn("Cannot process '{}'.", path, ex);
+ logger.warn("Cannot process '{}'.", path, ex);
}
return FileVisitResult.CONTINUE;
}
});
} catch (IOException ex) {
- log.error("Cannot read directory '{}'.", startDir, ex);
+ logger.error("Cannot read directory '{}'.", startDir, ex);
}
}
diff --git a/src/main/java/org/torproject/metrics/collector/webstats/LogMetadata.java b/src/main/java/org/torproject/metrics/collector/webstats/LogMetadata.java
index d3bf8fb..2cac619 100644
--- a/src/main/java/org/torproject/metrics/collector/webstats/LogMetadata.java
+++ b/src/main/java/org/torproject/metrics/collector/webstats/LogMetadata.java
@@ -17,7 +17,7 @@ import java.util.regex.Pattern;
public class LogMetadata {
- private static final Logger log
+ private static final Logger logger
= LoggerFactory.getLogger(LogMetadata.class);
/** The mandatory web server log descriptor file name pattern. */
@@ -67,7 +67,7 @@ public class LogMetadata {
= LocalDate.parse(mat.group(2), DateTimeFormatter.BASIC_ISO_DATE);
if (null == virtualHost || null == physicalHost || null == logDate
|| virtualHost.isEmpty() || physicalHost.isEmpty()) {
- log.debug("Non-matching file encountered: '{}/{}'.",
+ logger.debug("Non-matching file encountered: '{}/{}'.",
parentPath, file);
} else {
metadata = new LogMetadata(logPath, physicalHost, virtualHost,
@@ -77,7 +77,7 @@ public class LogMetadata {
}
} catch (Throwable ex) {
metadata = null;
- log.debug("Problem parsing path '{}'.", logPath, ex);
+ logger.debug("Problem parsing path '{}'.", logPath, ex);
}
return Optional.ofNullable(metadata);
}
diff --git a/src/main/java/org/torproject/metrics/collector/webstats/SanitizeWeblogs.java b/src/main/java/org/torproject/metrics/collector/webstats/SanitizeWeblogs.java
index 6c8a495..670f686 100644
--- a/src/main/java/org/torproject/metrics/collector/webstats/SanitizeWeblogs.java
+++ b/src/main/java/org/torproject/metrics/collector/webstats/SanitizeWeblogs.java
@@ -55,7 +55,7 @@ import java.util.stream.Stream;
*/
public class SanitizeWeblogs extends CollecTorMain {
- private static final Logger log =
+ private static final Logger logger =
LoggerFactory.getLogger(SanitizeWeblogs.class);
private static final int LIMIT = 2;
@@ -99,7 +99,7 @@ public class SanitizeWeblogs extends CollecTorMain {
Set<SourceType> sources = this.config.getSourceTypeSet(
Key.WebstatsSources);
if (sources.contains(SourceType.Local)) {
- log.info("Processing logs using batch value {}.", BATCH);
+ logger.info("Processing logs using batch value {}.", BATCH);
Map<LogMetadata, Set<LocalDate>> previouslyProcessedWebstats
= this.readProcessedWebstats();
Map<LogMetadata, Set<LocalDate>> newlyProcessedWebstats
@@ -112,7 +112,7 @@ public class SanitizeWeblogs extends CollecTorMain {
cutOffMillis);
}
} catch (Exception e) {
- log.error("Cannot sanitize web-logs: {}", e.getMessage(), e);
+ logger.error("Cannot sanitize web-logs: {}", e.getMessage(), e);
throw new RuntimeException(e);
}
}
@@ -132,9 +132,10 @@ public class SanitizeWeblogs extends CollecTorMain {
}
}
} catch (IOException e) {
- log.error("Cannot read state file {}.", this.processedWebstatsFile, e);
+ logger.error("Cannot read state file {}.", this.processedWebstatsFile,
+ e);
}
- log.debug("Read state file containing {} log files.",
+ logger.debug("Read state file containing {} log files.",
processedWebstats.size());
}
return processedWebstats;
@@ -144,14 +145,14 @@ public class SanitizeWeblogs extends CollecTorMain {
Map<LogMetadata, Set<LocalDate>> previouslyProcessedWebstats) {
Map<LogMetadata, Set<LocalDate>> newlyProcessedWebstats = new HashMap<>();
LogFileMap fileMapIn = new LogFileMap(dir);
- log.info("Found log files for {} virtual hosts.", fileMapIn.size());
+ logger.info("Found log files for {} virtual hosts.", fileMapIn.size());
for (Map.Entry<String,TreeMap<String,TreeMap<LocalDate,LogMetadata>>>
virtualEntry : fileMapIn.entrySet()) {
String virtualHost = virtualEntry.getKey();
for (Map.Entry<String, TreeMap<LocalDate, LogMetadata>> physicalEntry
: virtualEntry.getValue().entrySet()) {
String physicalHost = physicalEntry.getKey();
- log.info("Processing logs for {} on {}.", virtualHost, physicalHost);
+ logger.info("Processing logs for {} on {}.", virtualHost, physicalHost);
/* Go through current input log files for given virtual and physical
* host, and either look up contained log dates from the last execution,
* or parse files to memory now. */
@@ -231,7 +232,7 @@ public class SanitizeWeblogs extends CollecTorMain {
.add(WebServerAccessLogImpl.MARKER)
.add(date.format(DateTimeFormatter.BASIC_ISO_DATE))
.toString() + "." + FileType.XZ.name().toLowerCase();
- log.debug("Storing {}.", name);
+ logger.debug("Storing {}.", name);
Map<String, Long> retainedLines = new TreeMap<>(lineCounts);
lineCounts.clear(); // not needed anymore
try {
@@ -239,13 +240,14 @@ public class SanitizeWeblogs extends CollecTorMain {
= new WebServerAccessLogPersistence(
new WebServerAccessLogImpl(toCompressedBytes(retainedLines),
new File(name), name));
- log.debug("Storing {}.", name);
+ logger.debug("Storing {}.", name);
walp.storeOut(this.outputDirectory.toString());
walp.storeRecent(this.recentDirectory.toString());
} catch (DescriptorParseException dpe) {
- log.error("Cannot store log desriptor {}.", name, dpe);
+ logger.error("Cannot store log desriptor {}.", name, dpe);
} catch (Throwable th) { // catch all else
- log.error("Serious problem. Cannot store log desriptor {}.", name, th);
+ logger.error("Serious problem. Cannot store log desriptor {}.", name,
+ th);
}
}
@@ -327,7 +329,7 @@ public class SanitizeWeblogs extends CollecTorMain {
private Map<LocalDate, Map<String, Long>>
sanitzedLineStream(LogMetadata metadata) {
- log.debug("Processing file {}.", metadata.path);
+ logger.debug("Processing file {}.", metadata.path);
try (BufferedReader br
= new BufferedReader(new InputStreamReader(
metadata.fileType.decompress(Files.newInputStream(metadata.path))))) {
@@ -365,7 +367,7 @@ public class SanitizeWeblogs extends CollecTorMain {
.collect(groupingByConcurrent(Map.Entry::getKey,
summingLong(Map.Entry::getValue))))));
} catch (Exception ex) {
- log.debug("Skipping log-file {}.", metadata.path, ex);
+ logger.debug("Skipping log-file {}.", metadata.path, ex);
}
return Collections.emptyMap();
}
@@ -385,9 +387,10 @@ public class SanitizeWeblogs extends CollecTorMain {
}
Files.write(this.processedWebstatsFile, lines);
} catch (IOException e) {
- log.error("Cannot write state file {}.", this.processedWebstatsFile, e);
+ logger.error("Cannot write state file {}.", this.processedWebstatsFile,
+ e);
}
- log.debug("Wrote state file containing {} log files.",
+ logger.debug("Wrote state file containing {} log files.",
newlyProcessedWebstats.size());
}
}
diff --git a/src/main/java/org/torproject/metrics/collector/webstats/WebServerAccessLogLine.java b/src/main/java/org/torproject/metrics/collector/webstats/WebServerAccessLogLine.java
index 816064a..d187cf2 100644
--- a/src/main/java/org/torproject/metrics/collector/webstats/WebServerAccessLogLine.java
+++ b/src/main/java/org/torproject/metrics/collector/webstats/WebServerAccessLogLine.java
@@ -23,7 +23,7 @@ import java.util.regex.Pattern;
public class WebServerAccessLogLine implements WebServerAccessLog.Line {
- private static final Logger log = LoggerFactory
+ private static final Logger logger = LoggerFactory
.getLogger(WebServerAccessLogLine.class);
private static final String DATE_PATTERN = "dd/MMM/yyyy";
@@ -151,7 +151,7 @@ public class WebServerAccessLogLine implements WebServerAccessLog.Line {
res.valid = true;
}
} catch (Throwable th) {
- log.debug("Unmatchable line: '{}'.", line, th);
+ logger.debug("Unmatchable line: '{}'.", line, th);
return new WebServerAccessLogLine();
}
return res;
diff --git a/src/main/resources/logback.xml b/src/main/resources/logback.xml
deleted file mode 100644
index 6cb5831..0000000
--- a/src/main/resources/logback.xml
+++ /dev/null
@@ -1,167 +0,0 @@
-<configuration debug="false">
-
- <!-- a path and a prefix -->
- <property name="logfile-base" value="${LOGBASE}/collector-" />
-
- <!-- log file names -->
- <property name="fileall-logname" value="${logfile-base}all" />
- <property name="file-bridgedescs-logname" value="${logfile-base}bridgedescs" />
- <property name="file-exitlists-logname" value="${logfile-base}exitlists" />
- <property name="file-relaydescs-logname" value="${logfile-base}relaydescs" />
- <property name="file-torperf-logname" value="${logfile-base}torperf" />
- <property name="file-updateindex-logname" value="${logfile-base}updateindex" />
-
- <!-- date pattern -->
- <property name="utc-date-pattern" value="%date{ISO8601, UTC}" />
-
- <!-- appender section -->
- <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
- <encoder>
- <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
- </encoder>
-
- <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
- <level>WARN</level>
- </filter>
- </appender>
-
- <appender name="SHUTDOWN" class="ch.qos.logback.core.ConsoleAppender">
- <encoder>
- <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
- </encoder>
- </appender>
-
- <appender name="FILEALL" class="ch.qos.logback.core.rolling.RollingFileAppender">
- <file>${fileall-logname}.log</file>
- <encoder>
- <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
- </encoder>
- <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
- <!-- rollover daily -->
- <FileNamePattern>${fileall-logname}.%d{yyyy-MM-dd}.%i.log</FileNamePattern>
- <maxHistory>10</maxHistory>
- <timeBasedFileNamingAndTriggeringPolicy
- class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
- <!-- or whenever the file size reaches 1MB -->
- <maxFileSize>1MB</maxFileSize>
- </timeBasedFileNamingAndTriggeringPolicy>
- </rollingPolicy>
- <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
- <level>INFO</level>
- </filter>
- </appender>
-
- <appender name="FILEBRIDGEDESCS" class="ch.qos.logback.core.FileAppender">
- <file>${file-bridgedescs-logname}.log</file>
- <encoder>
- <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
- </encoder>
-
- <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
- <level>INFO</level>
- </filter>
- </appender>
-
- <appender name="FILEEXITLISTS" class="ch.qos.logback.core.FileAppender">
- <file>${file-exitlists-logname}.log</file>
- <encoder>
- <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
- </encoder>
-
- <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
- <level>INFO</level>
- </filter>
- </appender>
-
- <appender name="FILERELAYDESCS" class="ch.qos.logback.core.FileAppender">
- <file>${file-relaydescs-logname}.log</file>
- <encoder>
- <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
- </encoder>
-
- <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
- <level>INFO</level>
- </filter>
- </appender>
-
- <appender name="FILETORPERF" class="ch.qos.logback.core.FileAppender">
- <file>${file-torperf-logname}.log</file>
- <encoder>
- <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
- </encoder>
-
- <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
- <level>INFO</level>
- </filter>
- </appender>
-
- <appender name="FILEUPDATEINDEX" class="ch.qos.logback.core.FileAppender">
- <file>${file-updateindex-logname}.log</file>
- <encoder>
- <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
- </encoder>
-
- <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
- <level>INFO</level>
- </filter>
- </appender>
-
- <!-- logger section -->
- <logger name="org.torproject.collector.bridgedescs" >
- <appender-ref ref="FILEBRIDGEDESCS" />
- </logger>
-
- <logger name="org.torproject.collector.exitlists" >
- <appender-ref ref="FILEEXITLISTS" />
- </logger>
-
- <logger name="org.torproject.collector.relaydescs" >
- <appender-ref ref="FILERELAYDESCS" />
- </logger>
-
- <logger name="org.torproject.collector.torperf" >
- <appender-ref ref="FILETORPERF" />
- </logger>
-
- <logger name="org.torproject.collector.index" level="INFO" >
- <appender-ref ref="FILEUPDATEINDEX" />
- </logger>
-
- <logger name="org.torproject.collector.Main" >
- <appender-ref ref="FILEBRIDGEDESCS" />
- <appender-ref ref="FILEEXITLISTS" />
- <appender-ref ref="FILERELAYDESCS" />
- <appender-ref ref="FILETORPERF" />
- <appender-ref ref="FILEUPDATEINDEX" />
- </logger>
-
- <logger name="org.torproject.collector.conf" >
- <appender-ref ref="FILEBRIDGEDESCS" />
- <appender-ref ref="FILEEXITLISTS" />
- <appender-ref ref="FILERELAYDESCS" />
- <appender-ref ref="FILETORPERF" />
- <appender-ref ref="FILEUPDATEINDEX" />
- </logger>
-
- <logger name="org.torproject.collector.cron" >
- <appender-ref ref="FILEBRIDGEDESCS" />
- <appender-ref ref="FILEEXITLISTS" />
- <appender-ref ref="FILERELAYDESCS" />
- <appender-ref ref="FILETORPERF" />
- <appender-ref ref="FILEUPDATEINDEX" />
- </logger>
-
- <logger name="org.torproject" >
- <appender-ref ref="CONSOLE" />
- </logger>
-
- <logger name="org.torproject.collector.cron.ShutdownHook" >
- <appender-ref ref="SHUTDOWN" />
- </logger>
-
- <root level="ALL">
- <appender-ref ref="FILEALL" />
- </root>
-
-</configuration>
-
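The rename applied throughout this commit follows the standard SLF4J idiom: one `logger` field per class, named after that class, with parameterized messages and a `Throwable` passed as the trailing argument so its stack trace is logged. A minimal sketch of that idiom (assuming `slf4j-api` and a binding such as `logback-classic` on the classpath; `FooService` and its method are hypothetical, not from the commit):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class FooService {

  // One logger per class, named after the class, as in the diffs above.
  private static final Logger logger
      = LoggerFactory.getLogger(FooService.class);

  public void store(String path) {
    try {
      // ... write descriptor(s) to path ...
      logger.debug("Storing {}.", path);
    } catch (RuntimeException e) {
      // A trailing Throwable is not consumed by a '{}' placeholder;
      // SLF4J appends its stack trace after the formatted message.
      logger.warn("Could not store descriptor(s) {}.", path, e);
    }
  }
}
```

Note that the `{}` placeholders defer string construction until the level is enabled, which is why the commit keeps them rather than concatenating.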
[metrics-base/master] Simplify logging configuration across code bases.
by karsten@torproject.org 31 Mar '20
commit fd856466bcb260f53ef69a24c102d0e49d171cc3
Author: Karsten Loesing <karsten.loesing(a)gmx.net>
Date: Sun Mar 8 08:58:37 2020 +0100
Simplify logging configuration across code bases.
---
java/base.xml | 2 ++
java/logback.xml | 59 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++
2 files changed, 61 insertions(+)
diff --git a/java/base.xml b/java/base.xml
index 6eb3f63..f28062a 100644
--- a/java/base.xml
+++ b/java/base.xml
@@ -185,6 +185,7 @@
haltonfailure="true"
printsummary="on">
<jvmarg value="-DLOGBASE=${generated}/test-logs"/>
+ <jvmarg value="-Dlogback.configurationFile=${buildresources}/logback.xml" />
<classpath refid="test.classpath"/>
<formatter type="plain" usefile="false"/>
<batchtest>
@@ -295,6 +296,7 @@
basedir="${usebase}"
manifest="${manifestfile}" >
<fileset dir="${resources}" includes="${resourceincludes}" />
+ <fileset dir="${buildresources}" includes="logback.xml" />
<restrict>
<not>
<and>
diff --git a/java/logback.xml b/java/logback.xml
new file mode 100644
index 0000000..b788c0a
--- /dev/null
+++ b/java/logback.xml
@@ -0,0 +1,59 @@
+<!-- Default Logback configuration for production and unit tests.
+
+ By default, WARN and ERROR messages are printed out on the console and INFO
+ messages or higher are appended to `logs/metrics.log` or
+ `generated/test-logs/metrics.log`, respectively.
+
+ Configurable options are:
+ - `-DLOGBASE=path/for/logs/` for using a different log directory,
+ - `-DLOGLEVEL=DEBUG` for using a different log level for the log file, and
+ - `-Dlogback.configurationFile=./logback.xml` for using the custom logging
+ configuration file in the current working directory rather than this
+ file.
+-->
+<configuration>
+
+ <!-- Log WARN and ERROR messages to the console. -->
+ <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
+ <encoder>
+ <pattern>%date{ISO8601, UTC} %level %logger{20}:%line %msg%n</pattern>
+ </encoder>
+ <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
+ <level>WARN</level>
+ </filter>
+ </appender>
+
+ <!-- Log to local log file `logs/metrics.log`, with `logs/` being configurable
+ via `-DLOGBASE=path/for/logs/`. Roll over daily, retaining no more than
+ 10 days or 1 GiB of total log files. -->
+ <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
+ <file>${LOGBASE:-logs}/metrics.log</file>
+ <encoder>
+ <pattern>%date{ISO8601, UTC} %level %logger{20}:%line %msg%n</pattern>
+ </encoder>
+ <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
+ <fileNamePattern>${LOGBASE:-logs}/metrics.%d.log</fileNamePattern>
+ <maxHistory>10</maxHistory>
+ <totalSizeCap>1GB</totalSizeCap>
+ </rollingPolicy>
+ </appender>
+
+ <!-- Only print out our own log messages to the console, not those coming from
+ third-party libraries. -->
+ <logger name="org.torproject" >
+ <appender-ref ref="STDOUT" />
+ </logger>
+
+ <!-- Never log anything lower than INFO from known third-party libraries. -->
+ <logger name="org.apache" level="INFO" />
+ <logger name="org.eclipse" level="INFO" />
+
+ <!-- Only log messages on INFO level or higher. This level can be configured
+ via `-DLOGLEVEL=DEBUG`, if DEBUG logs are required. This only affects the
+ log file, not the console which only logs warnings and errors. -->
+ <root level="${LOGLEVEL:-INFO}">
+ <appender-ref ref="FILE" />
+ </root>
+
+</configuration>
+
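Per the header comment in the new `logback.xml`, the defaults can be overridden with JVM system properties. A hypothetical invocation (the jar name is illustrative, not from the commit):

```shell
# Defaults: WARN and above on the console, INFO and above in logs/metrics.log.
java -jar collector.jar

# Write log files elsewhere and enable DEBUG in the log file only
# (the console ThresholdFilter still limits console output to WARN):
java -DLOGBASE=/srv/collector/logs -DLOGLEVEL=DEBUG -jar collector.jar

# Use a custom configuration file instead of the bundled one:
java -Dlogback.configurationFile=./logback.xml -jar collector.jar
```

The `${LOGBASE:-logs}` and `${LOGLEVEL:-INFO}` references in the configuration use Logback's `:-` default-value syntax, so both properties are optional.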