tor-commits
June 2016
- 20 participants
- 1204 discussions

[translation/liveusb-creator] Update translations for liveusb-creator
by translation@torproject.org 06 Jun '16
commit 2f7b5f9a51bdc58ce5714cadf801a207375a8c73
Author: Translation commit bot <translation(a)torproject.org>
Date: Mon Jun 6 20:45:19 2016 +0000
Update translations for liveusb-creator
---
lb/lb.po | 24 ++++++++++++------------
1 file changed, 12 insertions(+), 12 deletions(-)
diff --git a/lb/lb.po b/lb/lb.po
index 5c986c3..6b5871a 100644
--- a/lb/lb.po
+++ b/lb/lb.po
@@ -9,7 +9,7 @@ msgstr ""
"Project-Id-Version: The Tor Project\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2015-11-02 21:23+0100\n"
-"PO-Revision-Date: 2016-06-06 20:15+0000\n"
+"PO-Revision-Date: 2016-06-06 20:37+0000\n"
"Last-Translator: Tyler Durden <virii(a)enn.lu>\n"
"Language-Team: Luxembourgish (http://www.transifex.com/otf/torproject/language/lb/)\n"
"MIME-Version: 1.0\n"
@@ -547,39 +547,39 @@ msgstr "ISO MD5 Checkzomm verifizéieren"
#: ../liveusb/creator.py:373
msgid "Verifying SHA1 checksum of LiveCD image..."
-msgstr ""
+msgstr "SHA1 Checkzomm vun der LiveCD verifizéieren..."
#: ../liveusb/creator.py:377
msgid "Verifying SHA256 checksum of LiveCD image..."
-msgstr ""
+msgstr "SHA256 Checkzomm vun der LiveCD verifizéieren..."
#: ../liveusb/creator.py:961 ../liveusb/creator.py:1280
msgid "Verifying filesystem..."
-msgstr ""
+msgstr "Dateisystem verifizéieren..."
#: ../liveusb/gui.py:725
msgid ""
"Warning: Creating a new persistent overlay will delete your existing one."
-msgstr ""
+msgstr "Warnung: D'Erstelle vun engem neie persistente Späicher wäert äre momentane läschen."
#: ../liveusb/gui.py:377
msgid ""
"Warning: This tool needs to be run as an Administrator. To do this, right "
"click on the icon and open the Properties. Under the Compatibility tab, "
"check the \"Run this program as an administrator\" box."
-msgstr ""
+msgstr "Warnung: Dëst Tool muss als Administrateur ausgefouert ginn. Fir dat ze maachen, musst der op der Maus op den Icon riets klicken an dann d'Astellungen auswielen. Ënnert dem Reider \"Kompatibilitéit\", wielt der dann \"Dëst Programm als Administrateur ausféieren\" aus."
#: ../liveusb/creator.py:162
#, python-format
msgid "Wrote to device at %(speed)d MB/sec"
-msgstr ""
+msgstr "Mat %(speed)d MB/sek op d'Medium geschriwwen"
#: ../liveusb/gui.py:699
#, python-format
msgid ""
"You are going to install Tails on the %(size)s %(vendor)s %(model)s device "
"(%(device)s). All data on the selected device will be lost. Continue?"
-msgstr ""
+msgstr "Dir sidd dobäi Tails op %(size)s %(vendor)s %(model)s d'Medium (%(device)s) ze installéieren. All Daten op dem ausgewielte Medium gi verluer. Weiderfueren?"
#: ../liveusb/gui.py:715
#, python-format
@@ -587,21 +587,21 @@ msgid ""
"You are going to upgrade Tails on the %(parent_size)s %(vendor)s %(model)s "
"device (%(device)s). Any persistent volume on this device will remain "
"unchanged. Continue?"
-msgstr ""
+msgstr "Dir sidd dobäi Tails op %(parent_size)s %(vendor)s %(model)s dem Medium (%(device)s) ze upgraden. Jiddwer persistente Späicher op dësem Medium bleift onverännert. Weiderfueren?"
#: ../liveusb/creator.py:622
msgid ""
"You are using an old version of syslinux-extlinux that does not support the "
"ext4 filesystem"
-msgstr ""
+msgstr "Dir benotzt eng al Versioun vu syslinux-extlinux, déi den ext4 Dateisystem net ënnerstëtzt"
#: ../liveusb/gui.py:783
msgid "You can try again to resume your download"
-msgstr ""
+msgstr "Download probéieren ze resuméieren"
#: ../liveusb/creator.py:95
msgid "You must run this application as root"
-msgstr ""
+msgstr "Dës Applikatioun muss als root ausgefouert ginn"
#: ../liveusb/dialog.py:162
msgid "or"
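The diff above updates gettext `.po` entries, where each English `msgid` is paired with a translated `msgstr`. As a rough illustration of that pairing (not part of liveusb-creator — `PoPairDemo` is hypothetical and ignores multi-line strings, plural forms, and comments), a minimal extractor might look like:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PoPairDemo {

    // Collects msgid -> msgstr pairs from a tiny gettext-style snippet.
    // Assumes each string fits on one quoted line (a simplification).
    public static Map<String, String> parse(String po) {
        Map<String, String> out = new LinkedHashMap<>();
        String id = null;
        for (String line : po.split("\n")) {
            line = line.trim();
            if (line.startsWith("msgid ")) {
                id = unquote(line.substring(6));
            } else if (line.startsWith("msgstr ") && id != null) {
                out.put(id, unquote(line.substring(7)));
                id = null;
            }
        }
        return out;
    }

    private static String unquote(String s) {
        return s.substring(1, s.length() - 1);
    }

    public static void main(String[] args) {
        String po = "msgid \"Verifying filesystem...\"\n"
            + "msgstr \"Dateisystem verifiz\u00e9ieren...\"\n";
        System.out.println(parse(po));
    }
}
```

An untranslated entry in a `.po` file is simply one whose `msgstr` is the empty string, which is exactly what the `-msgstr ""` lines in the diff are replacing.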

06 Jun '16
commit 050a88ffcf2b205a63741d4848951ce91c0bd02f
Author: Karsten Loesing <karsten.loesing(a)gmx.net>
Date: Mon Jun 6 22:41:56 2016 +0200
Make another trivial tweak to INSTALL.md.
---
INSTALL.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/INSTALL.md b/INSTALL.md
index 005a29a..0ae4f9a 100644
--- a/INSTALL.md
+++ b/INSTALL.md
@@ -57,5 +57,5 @@ $ java -DLOGBASE=/path/to/logs -jar collector-<version>.jar relaydescs
Watch out for INFO-level logs in the log directory you configured. In
particular, the lines following "Statistics on the completeness of written
-relay descriptors:" is quite important.
+relay descriptors:" are quite important.

06 Jun '16
commit f7811f4ed6ae1e57333f4ac831b834bb5122f0ba
Author: Karsten Loesing <karsten.loesing(a)gmx.net>
Date: Mon Jun 6 17:13:43 2016 +0200
Make some trivial tweaks to INSTALL.md.
---
INSTALL.md | 8 ++++----
1 file changed, 4 insertions(+), 4 deletions(-)
diff --git a/INSTALL.md b/INSTALL.md
index d9588f8..005a29a 100644
--- a/INSTALL.md
+++ b/INSTALL.md
@@ -53,9 +53,9 @@ DownloadRelayDescriptors = true
Run the relay descriptor downloader
-----------------------------------
-$ java -DLOGBASE=/path/to/logs -jar collector-<version>.jar releaydescs
+$ java -DLOGBASE=/path/to/logs -jar collector-<version>.jar relaydescs
-Watch out for INFO-level logs in the log directory you configured. In particular, the
-lines following "Statistics on the completeness of written relay
-descriptors:" is quite important.
+Watch out for INFO-level logs in the log directory you configured. In
+particular, the lines following "Statistics on the completeness of written
+relay descriptors:" is quite important.
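The corrected command passes the log directory to the JVM as a system property (`-DLOGBASE=/path/to/logs`). A minimal sketch of how a program reads such a property — the `"log"` fallback default here is an assumption for illustration, not taken from CollecTor:

```java
public class LogBaseDemo {
    public static void main(String[] args) {
        // Equivalent to launching with: java -DLOGBASE=/path/to/logs -jar ...
        System.setProperty("LOGBASE", "/path/to/logs");
        // Fall back to a relative "log" directory if the property is unset
        // (assumed default, for illustration only).
        String logBase = System.getProperty("LOGBASE", "log");
        System.out.println(logBase);
    }
}
```

Because `-D` properties are read at any point after JVM startup, a logging framework can resolve `${LOGBASE}` in its configuration file without any code changes.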

[collector/master] Implements task-19015. Removed obsolete scripts and changed INSTALL.md
by karsten@torproject.org 06 Jun '16
commit 295adfb87725614ddbe4548b101cccdbe8fb1233
Author: iwakeh <iwakeh(a)torproject.org>
Date: Fri Jun 3 15:32:44 2016 +0200
Implements task-19015. Removed obsolete scripts and changed INSTALL.md
---
INSTALL.md | 41 +++++++++++------------------------------
bin/logging.properties | 20 --------------------
bin/run-all | 16 ----------------
bin/run-bridgedescs | 3 ---
bin/run-exitlists | 3 ---
bin/run-relaydescs | 3 ---
bin/run-torperf | 3 ---
bin/update-index | 3 ---
8 files changed, 11 insertions(+), 81 deletions(-)
diff --git a/INSTALL.md b/INSTALL.md
index 05db7ba..d9588f8 100644
--- a/INSTALL.md
+++ b/INSTALL.md
@@ -24,26 +24,12 @@ that you're using `/srv/collector.torproject.org/` as working directory,
but feel free to use another directory that better suits your needs.
$ sudo mkdir -p /srv/collector.torproject.org/
-$ sudo chown vagrant:vagrant /srv/collector.torproject.org/
Install a few packages:
-$ sudo apt-get install openjdk-6-jdk ant libcommons-codec-java \
- libcommons-compress-java
-
-
-Clone the metrics-db repository
--------------------------------
-
-$ cd /srv/collector.torproject.org/
-$ git clone https://git.torproject.org/metrics-db
-
-
-Clone required submodule metrics-lib
-------------------------------------
-
-$ git submodule init
-$ git submodule update
+$ sudo apt-get ant junit4 libasm4-java libcommons-codec-java \
+ libcommons-compress-java libcommons-lang3-java libgoogle-gson-java \
+ liblogback-java liboro-java libslf4j-java libxz-java openjdk-7-jdk
Compile CollecTor
@@ -55,26 +41,21 @@ $ ant compile
Configure the relay descriptor downloader
-----------------------------------------
-Edit the config file and uncomment and edit at least the following line:
+Run
+$ java -DLOGBASE=/path/to/logs -jar collector-<version>.jar releaydescs
+once in order to obtain a configuration properties file.
-DownloadRelayDescriptors 1
+Edit collector.properties and set at least the following value to true:
+
+DownloadRelayDescriptors = true
Run the relay descriptor downloader
-----------------------------------
-$ bin/run-relaydescs
-
-
-Set up an hourly cronjob for the relay descriptor downloader
-------------------------------------------------------------
-
-Ideally, run the relay descriptor downloader once per hour by adding a
-crontab entry like the following:
-
-6 * * * * cd /srv/collector.torproject.org/db/ && bin/run-relaydescs
+$ java -DLOGBASE=/path/to/logs -jar collector-<version>.jar releaydescs
-Watch out for INFO-level logs in the `log/` directory. In particular, the
+Watch out for INFO-level logs in the log directory you configured. In particular, the
lines following "Statistics on the completeness of written relay
descriptors:" is quite important.
diff --git a/bin/logging.properties b/bin/logging.properties
deleted file mode 100644
index 5aae3cb..0000000
--- a/bin/logging.properties
+++ /dev/null
@@ -1,20 +0,0 @@
-handlers= java.util.logging.FileHandler, java.util.logging.ConsoleHandler
-
-.level = FINER
-
-# if the following dir pattern is changed, LOGGINGPATH='log' in run-all
-# needs to be adapted.
-java.util.logging.FileHandler.pattern = ./log/collector-%g.log
-java.util.logging.FileHandler.limit = 5000000
-java.util.logging.FileHandler.count = 9
-java.util.logging.FileHandler.append = true
-java.util.logging.FileHandler.level = FINER
-java.util.logging.FileHandler.formatter = java.util.logging.SimpleFormatter
-
-java.util.logging.ConsoleHandler.level = WARNING
-java.util.logging.ConsoleHandler.formatter = java.util.logging.SimpleFormatter
-
-java.util.logging.SimpleFormatter.format = %1$tF %1$tT %4$s %2$s %5$s%n
-
-sun.net.www.level = SEVERE
-sun.level = SEVERE
diff --git a/bin/run-all b/bin/run-all
deleted file mode 100755
index 469aced..0000000
--- a/bin/run-all
+++ /dev/null
@@ -1,16 +0,0 @@
-#!/bin/sh
-
-TOKEN=`expr substr $1 5 20`
-
-# if the path is changed in properties, this needs to be adapted.
-LOGGINGPATH='log'
-
-if ! test -d $LOGGINGPATH
-then mkdir $LOGGINGPATH
-fi
-
-# the following uses one logging configuration for all modules
-java -Xmx2g -Djava.util.logging.config.file=./bin/logging.properties -jar collector-0.9.0-dev.jar $TOKEN
-
-# this would use a special log config for each module
-# java -Xmx2g -Djava.util.logging.config.file=./bin/logging.$TOKEN.properties -jar collector-0.9.0-dev.jar $TOKEN
diff --git a/bin/run-bridgedescs b/bin/run-bridgedescs
deleted file mode 100755
index 62d71f7..0000000
--- a/bin/run-bridgedescs
+++ /dev/null
@@ -1,3 +0,0 @@
-#!/bin/sh
-`dirname $0`/run-all `basename $0`
-
diff --git a/bin/run-exitlists b/bin/run-exitlists
deleted file mode 100755
index 62d71f7..0000000
--- a/bin/run-exitlists
+++ /dev/null
@@ -1,3 +0,0 @@
-#!/bin/sh
-`dirname $0`/run-all `basename $0`
-
diff --git a/bin/run-relaydescs b/bin/run-relaydescs
deleted file mode 100755
index 62d71f7..0000000
--- a/bin/run-relaydescs
+++ /dev/null
@@ -1,3 +0,0 @@
-#!/bin/sh
-`dirname $0`/run-all `basename $0`
-
diff --git a/bin/run-torperf b/bin/run-torperf
deleted file mode 100755
index 62d71f7..0000000
--- a/bin/run-torperf
+++ /dev/null
@@ -1,3 +0,0 @@
-#!/bin/sh
-`dirname $0`/run-all `basename $0`
-
diff --git a/bin/update-index b/bin/update-index
deleted file mode 100755
index 6e6c488..0000000
--- a/bin/update-index
+++ /dev/null
@@ -1,3 +0,0 @@
-#!/bin/sh
-`dirname $0`/run-all run-updateindex
-
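The commit above replaces the commented `config.template` with a `collector.properties` file: a first run writes out the defaults, the operator edits the file (setting `DownloadRelayDescriptors = true`), and subsequent runs load it. A small sketch of that load-and-check pattern using `java.util.Properties` — the key mirrors the one in the diff, everything else is illustrative:

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public class ConfigBootstrapDemo {
    public static void main(String[] args) throws Exception {
        Path conf = Files.createTempFile("collector", ".properties");
        // First run: the tool would write a default config for the operator.
        Files.write(conf, "DownloadRelayDescriptors = false\n".getBytes("UTF-8"));
        // Operator edits the file, enabling the downloader.
        Files.write(conf, "DownloadRelayDescriptors = true\n".getBytes("UTF-8"));
        // Later run: load the properties and check the setting.
        Properties props = new Properties();
        try (InputStream in = Files.newInputStream(conf)) {
            props.load(in);
        }
        System.out.println(
            Boolean.parseBoolean(props.getProperty("DownloadRelayDescriptors")));
        Files.deleteIfExists(conf);
    }
}
```

Note that `Properties.load` strips the whitespace around `=`, so the `Key = value` style used in the diff parses cleanly.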

[collector/master] task-19015 minor tweaks. added console appender, some test permissions
by karsten@torproject.org 06 Jun '16
commit b65b4ade06fb7592ffc8f0743a09bac8cc3c01cb
Author: iwakeh <iwakeh(a)torproject.org>
Date: Fri Jun 3 21:24:25 2016 +0200
task-19015 minor tweaks. added console appender, some test permissions
for logback, and test error info.
---
src/main/resources/logback.xml | 15 ++++++++++++++-
src/test/java/org/torproject/collector/MainTest.java | 2 +-
src/test/resources/junittest.policy | 3 +++
3 files changed, 18 insertions(+), 2 deletions(-)
diff --git a/src/main/resources/logback.xml b/src/main/resources/logback.xml
index 1b78d58..9c9426d 100644
--- a/src/main/resources/logback.xml
+++ b/src/main/resources/logback.xml
@@ -15,6 +15,16 @@
<property name="utc-date-pattern" value="%date{ISO8601, UTC}" />
<!-- appender section -->
+ <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
+ <encoder>
+ <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
+ </encoder>
+
+ <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
+ <level>WARN</level>
+ </filter>
+ </appender>
+
<appender name="FILEALL" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>${fileall-logname}.log</file>
<encoder>
@@ -116,7 +126,10 @@
<appender-ref ref="FILEUPDATEINDEX" />
</logger>
- <logger name="sun" level="ERROR" />
+ <logger name="org.torproject" >
+ <appender-ref ref="CONSOLE" />
+ </logger>
+
<root level="ALL">
<appender-ref ref="FILEALL" />
diff --git a/src/test/java/org/torproject/collector/MainTest.java b/src/test/java/org/torproject/collector/MainTest.java
index 9a19285..f83cab5 100644
--- a/src/test/java/org/torproject/collector/MainTest.java
+++ b/src/test/java/org/torproject/collector/MainTest.java
@@ -38,7 +38,7 @@ public class MainTest {
@Test()
public void testSmoke() throws Exception {
- System.out.println("\n!!!! Three SEVERE log messages are expected."
+ System.out.println("\n!!!! Three ERROR log messages are expected."
+ "\nOne each from: ExitListDownloader, "
+ "TorperfDownloader, and CreateIndexJson.\n");
File conf = tmpf.newFile("test.conf");
diff --git a/src/test/resources/junittest.policy b/src/test/resources/junittest.policy
index e6eb2ef..208a172 100644
--- a/src/test/resources/junittest.policy
+++ b/src/test/resources/junittest.policy
@@ -7,4 +7,7 @@ grant {
permission java.lang.RuntimePermission "accessDeclaredMembers";
permission java.lang.reflect.ReflectPermission "suppressAccessChecks";
permission java.lang.RuntimePermission "shutdownHooks";
+ permission java.lang.RuntimePermission "accessClassInPackage.sun.reflect";
+ permission java.lang.RuntimePermission "accessClassInPackage.sun.net.www.protocol.http";
+ permission java.lang.RuntimePermission "accessClassInPackage.sun.net.www.http";
};
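The new `CONSOLE` appender in the logback diff only lets events at `WARN` or above through its `ThresholdFilter`, while the file appenders still receive everything. The same idea can be sketched with `java.util.logging` instead of logback (all names here are illustrative, not from CollecTor):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

public class ThresholdDemo {
    public static void main(String[] args) {
        List<String> consoleLines = new ArrayList<>();
        Logger log = Logger.getLogger("demo");
        log.setUseParentHandlers(false);
        log.setLevel(Level.ALL);
        // Stand-in for a console appender; collects what it would print.
        Handler console = new Handler() {
            @Override public void publish(LogRecord r) {
                if (isLoggable(r)) {
                    consoleLines.add(r.getLevel() + " " + r.getMessage());
                }
            }
            @Override public void flush() {}
            @Override public void close() {}
        };
        // Threshold: only WARNING and above reach the "console".
        console.setLevel(Level.WARNING);
        log.addHandler(console);
        log.info("goes to file appenders only");
        log.warning("also reaches the console");
        System.out.println(consoleLines);
    }
}
```

This matches the intent of the diff: operators watching the terminal see only warnings and errors, and the MainTest change from "SEVERE" to "ERROR" simply tracks the level names used by the new logging backend.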

06 Jun '16
commit 8767c73d0826dfa9aa21e70a2d857c8a2d77e524
Author: iwakeh <iwakeh(a)torproject.org>
Date: Mon May 30 15:14:49 2016 +0200
Implements task-19021 and task-19005.
Adds the very first tests to CollecTor.
Increases testability and prepares task-19018.
Avoid using literal path separators, use Paths.get instead.
---
.gitignore | 1 +
build.xml | 30 +-
config.template | 107 -------
src/main/java/org/torproject/collector/Main.java | 73 ++++-
.../bridgedescs/SanitizedBridgesWriter.java | 53 ++--
.../torproject/collector/conf/Configuration.java | 123 ++++++++
.../collector/conf/ConfigurationException.java | 18 ++
.../java/org/torproject/collector/conf/Key.java | 55 ++++
.../collector/exitlists/ExitListDownloader.java | 32 ++-
.../collector/index/CreateIndexJson.java | 18 +-
.../torproject/collector/main/Configuration.java | 318 ---------------------
.../org/torproject/collector/main/LockFile.java | 20 +-
.../collector/relaydescs/ArchiveWriter.java | 188 ++++++------
.../relaydescs/CachedRelayDescriptorReader.java | 4 +-
.../relaydescs/RelayDescriptorDownloader.java | 9 +-
.../collector/torperf/TorperfDownloader.java | 59 ++--
src/main/resources/collector.properties | 115 ++++++++
.../java/org/torproject/collector/MainTest.java | 72 +++++
.../collector/conf/ConfigurationTest.java | 143 +++++++++
src/test/resources/junittest.policy | 10 +
20 files changed, 822 insertions(+), 626 deletions(-)
diff --git a/.gitignore b/.gitignore
index afab74f..0ca0b1c 100644
--- a/.gitignore
+++ b/.gitignore
@@ -12,4 +12,5 @@
/generated
/lib
cobertura.ser
+*~
diff --git a/build.xml b/build.xml
index 10c5edd..8e46584 100644
--- a/build.xml
+++ b/build.xml
@@ -53,10 +53,22 @@
<include name="logback-classic-1.1.2.jar" />
</fileset>
</path>
+ <path id="cobertura.test.classpath">
+ <path location="${instrument}" />
+ <path refid="test.classpath" />
+ <path refid="cobertura.classpath" />
+ </path>
<path id="test.classpath">
+ <pathelement path="${classes}"/>
<pathelement path="${testclasses}"/>
+ <pathelement path="${resources}"/>
+ <pathelement path="${testresources}"/>
+ <fileset dir="${libs}">
+ <patternset refid="runtime" />
+ </fileset>
<fileset dir="${libs}">
<include name="junit4-4.11.jar"/>
+ <include name="hamcrest-all-1.3.jar"/>
</fileset>
</path>
<target name="init">
@@ -65,7 +77,6 @@
<mkdir dir="${docs}"/>
<mkdir dir="${testresult}"/>
<mkdir dir="${instrument}"/>
- <copy file="config.template" tofile="config"/>
</target>
<target name="clean">
<delete includeEmptyDirs="true" quiet="true">
@@ -123,6 +134,7 @@
<jar destfile="${jarfile}"
basedir="${classes}">
<fileset dir="${classes}"/>
+ <fileset dir="${resources}" includes="collector.properties"/>
<zipgroupfileset dir="${libs}" >
<patternset refid="runtime" />
</zipgroupfileset>
@@ -185,10 +197,10 @@
<junit fork="true" haltonfailure="false" printsummary="on">
<sysproperty key="net.sourceforge.cobertura.datafile"
file="${cobertura.ser.file}" />
- <classpath location="${instrument}" />
- <classpath refid="classpath" />
- <classpath refid="test.classpath" />
- <classpath refid="cobertura.classpath" />
+ <!-- The following jvmargs prevent test access to the network. -->
+ <jvmarg value="-Djava.security.policy=${testresources}/junittest.policy"/>
+ <jvmarg value="-Djava.security.manager"/>
+ <classpath refid="cobertura.test.classpath" />
<formatter type="xml" />
<batchtest toDir="${testresult}" >
<fileset dir="${testclasses}" />
@@ -199,11 +211,15 @@
<include name="**/*.java" />
</fileset>
</cobertura-report>
- <cobertura-check branchrate="0" totallinerate="0" />
+ <cobertura-check branchrate="0" totallinerate="15" totalbranchrate="5" >
+ <regex pattern="org.torproject.collector.conf.*" branchrate="100" linerate="100"/>
+ </cobertura-check>
</target>
<target name="test" depends="compile,compile-tests">
<junit fork="true" haltonfailure="true" printsummary="off">
- <classpath refid="classpath"/>
+ <!-- The following jvmargs prevent test access to the network. -->
+ <jvmarg value="-Djava.security.policy=${testresources}/junittest.policy"/>
+ <jvmarg value="-Djava.security.manager"/>
<classpath refid="test.classpath"/>
<formatter type="plain" usefile="false"/>
<batchtest>
diff --git a/config.template b/config.template
deleted file mode 100644
index 88407a2..0000000
--- a/config.template
+++ /dev/null
@@ -1,107 +0,0 @@
-######## Relay descriptors ########
-#
-## Read cached-* files from a local Tor data directory
-#ImportCachedRelayDescriptors 0
-#
-## Relative path to Tor data directory to read cached-* files from (can be
-## specified multiple times)
-#CachedRelayDescriptorsDirectory in/relay-descriptors/cacheddesc/
-#
-## Import directory archives from disk, if available
-#ImportDirectoryArchives 0
-#
-## Relative path to directory to import directory archives from
-#DirectoryArchivesDirectory in/relay-descriptors/archives/
-#
-## Keep a history of imported directory archive files to know which files
-## have been imported before. This history can be useful when importing
-## from a changing source to avoid importing descriptors over and over
-## again, but it can be confusing to users who don't know about it.
-#KeepDirectoryArchiveImportHistory 0
-#
-## Download relay descriptors from directory authorities, if required
-#DownloadRelayDescriptors 0
-#
-## Comma separated list of directory authority addresses (IP[:port]) to
-## download missing relay descriptors from
-#DownloadFromDirectoryAuthorities 86.59.21.38,76.73.17.194:9030,171.25.193.9:443,193.23.244.244,208.83.223.34:443,128.31.0.34:9131,194.109.206.212,212.112.245.170,154.35.32.5
-#
-## Comma separated list of directory authority fingerprints to download
-## votes
-#DownloadVotesByFingerprint 14C131DFC5C6F93646BE72FA1401C02A8DF2E8B4,27B6B5996C426270A5C95488AA5BCEB6BCC86956,49015F787433103580E3B66A1707A00E60F2D15B,585769C78764D58426B8B52B6651A5A71137189A,80550987E1D626E3EBA5E5E75A458DE0626D088C,D586D18309DED4CD6D57C18FDB97EFA96D330566,E8A9C45EDE6D711294FADF8E7951F4DE6CA56B58,ED03BB616EB2F60BEC80151114BB25CEF515B226,EFCBE720AB3A82B99F9E953CD5BF50F7EEFC7B97
-#
-## Download the current consensus (only if DownloadRelayDescriptors is 1)
-#DownloadCurrentConsensus 1
-#
-## Download the current microdesc consensus (only if
-## DownloadRelayDescriptors is 1)
-#DownloadCurrentMicrodescConsensus 1
-#
-## Download current votes (only if DownloadRelayDescriptors is 1)
-#DownloadCurrentVotes 1
-#
-## Download missing server descriptors (only if DownloadRelayDescriptors
-## is 1)
-#DownloadMissingServerDescriptors 1
-#
-## Download missing extra-info descriptors (only if
-## DownloadRelayDescriptors is 1)
-#DownloadMissingExtraInfoDescriptors 1
-#
-## Download missing microdescriptors (only if
-## DownloadRelayDescriptors is 1)
-#DownloadMissingMicrodescriptors 1
-#
-## Download all server descriptors from the directory authorities at most
-## once a day (only if DownloadRelayDescriptors is 1)
-#DownloadAllServerDescriptors 0
-#
-## Download all extra-info descriptors from the directory authorities at
-## most once a day (only if DownloadRelayDescriptors is 1)
-#DownloadAllExtraInfoDescriptors 0
-#
-## Compress relay descriptors downloads by adding .z to the URLs
-#CompressRelayDescriptorDownloads 0
-#
-## Relative path to directory to write directory archives to
-#DirectoryArchivesOutputDirectory out/relay-descriptors/
-#
-#
-######## Bridge descriptors ########
-#
-## Relative path to directory to import bridge descriptor snapshots from
-#BridgeSnapshotsDirectory in/bridge-descriptors/
-#
-## Replace IP addresses in sanitized bridge descriptors with 10.x.y.z
-## where x.y.z = H(IP address | bridge identity | secret)[:3], so that we
-## can learn about IP address changes.
-#ReplaceIPAddressesWithHashes 0
-#
-## Limit internal bridge descriptor mapping state to the following number
-## of days, or -1 for unlimited.
-#LimitBridgeDescriptorMappings -1
-#
-## Relative path to directory to write sanitized bridges to
-#SanitizedBridgesWriteDirectory out/bridge-descriptors/
-#
-#
-######## Exit lists ########
-#
-## (No options available)
-#
-#
-######## Torperf downloader ########
-#
-## Relative path to the directory to store Torperf files in
-#TorperfOutputDirectory out/torperf/
-#
-## Torperf source names and base URLs (option can be contained multiple
-## times)
-#TorperfSource torperf http://torperf.torproject.org/
-#
-## Torperf measurement file size in bytes, .data file, and .extradata file
-## available on a given source (option can be contained multiple times)
-#TorperfFiles torperf 51200 50kb.data 50kb.extradata
-#TorperfFiles torperf 1048576 1mb.data 1mb.extradata
-#TorperfFiles torperf 5242880 5mb.data 5mb.extradata
-
diff --git a/src/main/java/org/torproject/collector/Main.java b/src/main/java/org/torproject/collector/Main.java
index 9c64696..d21cfb6 100644
--- a/src/main/java/org/torproject/collector/Main.java
+++ b/src/main/java/org/torproject/collector/Main.java
@@ -4,12 +4,19 @@
package org.torproject.collector;
import org.torproject.collector.bridgedescs.SanitizedBridgesWriter;
+import org.torproject.collector.conf.Configuration;
import org.torproject.collector.exitlists.ExitListDownloader;
import org.torproject.collector.index.CreateIndexJson;
import org.torproject.collector.relaydescs.ArchiveWriter;
import org.torproject.collector.torperf.TorperfDownloader;
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.IOException;
import java.lang.reflect.InvocationTargetException;
+import java.nio.file.Files;
+import java.nio.file.Paths;
+import java.nio.file.StandardCopyOption;
import java.util.HashMap;
import java.util.Map;
import java.util.logging.Logger;
@@ -24,11 +31,12 @@ import java.util.logging.Logger;
public class Main {
private static Logger log = Logger.getLogger(Main.class.getName());
+ public static final String CONF_FILE = "collector.properties";
/** All possible main classes.
* If a new CollecTorMain class is available, just add it to this map.
*/
- private static final Map<String, Class> collecTorMains = new HashMap<>();
+ static final Map<String, Class> collecTorMains = new HashMap<>();
static { // add a new main class here
collecTorMains.put("bridgedescs", SanitizedBridgesWriter.class);
@@ -41,38 +49,73 @@ public class Main {
private static final String modules = collecTorMains.keySet().toString()
.replace("[", "").replace("]", "").replaceAll(", ", "|");
+ private static Configuration conf = new Configuration();
+
/**
* One argument is necessary.
* See class description {@link Main}.
*/
- public static void main(String[] args) {
- if (null == args || args.length != 1) {
- printUsageAndExit("CollecTor needs exactly one argument.");
+ public static void main(String[] args) throws Exception {
+ File confFile = null;
+ if (null == args || args.length < 1 || args.length > 2) {
+ printUsage("CollecTor needs one or two arguments.");
+ return;
+ } else if (args.length == 1) {
+ confFile = new File(CONF_FILE);
+ } else if (args.length == 2) {
+ confFile = new File(args[1]);
+ }
+ if (!confFile.exists() || confFile.length() < 1L) {
+ writeDefaultConfig(confFile);
+ return;
} else {
- invokeGivenMainAndExit(args[0]);
+ readConfigurationFrom(confFile);
}
+ invokeGivenMain(args[0]);
}
- private static void printUsageAndExit(String msg) {
+ private static void printUsage(String msg) {
final String usage = "Usage:\njava -jar collector.jar "
- + "<" + modules + ">";
+ + "<" + modules + "> [path/to/configFile]";
System.out.println(msg + "\n" + usage);
- System.exit(0);
}
- private static void invokeGivenMainAndExit(String mainId) {
+ private static void writeDefaultConfig(File confFile) {
+ try {
+ Files.copy(Main.class.getClassLoader().getResource(CONF_FILE).openStream(),
+ confFile.toPath(), StandardCopyOption.REPLACE_EXISTING);
+ printUsage("Could not find config file. In the default "
+ + "configuration, we are not configured to read data from any "
+ + "data source or write data to any data sink. You need to "
+ + "change the configuration (" + CONF_FILE
+ + ") and provide at least one data source and one data sink. "
+ + "Refer to the manual for more information.");
+ } catch (IOException e) {
+ log.severe("Cannot write default configuration. Reason: " + e);
+ }
+ }
+
+ private static void readConfigurationFrom(File confFile) throws Exception {
+ try (FileInputStream fis = new FileInputStream(confFile)) {
+ conf.load(fis);
+ } catch (Exception e) { // catch all possible problems
+ log.severe("Cannot read configuration. Reason: " + e);
+ throw e;
+ }
+ }
+
+ private static void invokeGivenMain(String mainId) {
Class clazz = collecTorMains.get(mainId);
if (null == clazz) {
- printUsageAndExit("Unknown argument: " + mainId);
+ printUsage("Unknown argument: " + mainId);
}
- invokeMainOnClassAndExit(clazz);
+ invokeMainOnClass(clazz);
}
- private static void invokeMainOnClassAndExit(Class clazz) {
+ private static void invokeMainOnClass(Class clazz) {
try {
- clazz.getMethod("main", new Class[] { String[].class })
- .invoke(null, (Object) new String[]{});
- System.exit(0);
+ clazz.getMethod("main", new Class[] { Configuration.class })
+ .invoke(null, (Object) conf);
} catch (NoSuchMethodException | IllegalAccessException
| InvocationTargetException e) {
log.severe("Cannot invoke 'main' method on "
diff --git a/src/main/java/org/torproject/collector/bridgedescs/SanitizedBridgesWriter.java b/src/main/java/org/torproject/collector/bridgedescs/SanitizedBridgesWriter.java
index 3214715..fa24a3d 100644
--- a/src/main/java/org/torproject/collector/bridgedescs/SanitizedBridgesWriter.java
+++ b/src/main/java/org/torproject/collector/bridgedescs/SanitizedBridgesWriter.java
@@ -3,7 +3,9 @@
package org.torproject.collector.bridgedescs;
-import org.torproject.collector.main.Configuration;
+import org.torproject.collector.conf.Configuration;
+import org.torproject.collector.conf.ConfigurationException;
+import org.torproject.collector.conf.Key;
import org.torproject.collector.main.LockFile;
import org.apache.commons.codec.DecoderException;
@@ -35,36 +37,30 @@ import java.util.logging.Level;
import java.util.logging.Logger;
/**
- * Sanitizes bridge descriptors, i.e., removes all possibly sensitive
+ * <p>Sanitizes bridge descriptors, i.e., removes all possibly sensitive
* information from them, and writes them to a local directory structure.
* During the sanitizing process, all information about the bridge
* identity or IP address are removed or replaced. The goal is to keep the
* sanitized bridge descriptors useful for statistical analysis while not
- * making it easier for an adversary to enumerate bridges.
+ * making it easier for an adversary to enumerate bridges.</p>
*
- * There are three types of bridge descriptors: bridge network statuses
+ * <p>There are three types of bridge descriptors: bridge network statuses
* (lists of all bridges at a given time), server descriptors (published
* by the bridge to advertise their capabilities), and extra-info
- * descriptors (published by the bridge, mainly for statistical analysis).
+ * descriptors (published by the bridge, mainly for statistical analysis).</p>
*/
public class SanitizedBridgesWriter extends Thread {
- public static void main(String[] args) {
+ private static Logger logger;
- Logger logger = Logger.getLogger(
- SanitizedBridgesWriter.class.getName());
- logger.info("Starting bridge-descriptors module of CollecTor.");
+ public static void main(Configuration config) throws ConfigurationException {
- // Initialize configuration
- Configuration config = new Configuration();
+ logger = Logger.getLogger(SanitizedBridgesWriter.class.getName());
+ logger.info("Starting bridge-descriptors module of CollecTor.");
// Use lock file to avoid overlapping runs
- LockFile lf = new LockFile("bridge-descriptors");
- if (!lf.acquireLock()) {
- logger.severe("Warning: CollecTor is already running or has not exited "
- + "cleanly! Exiting!");
- System.exit(1);
- }
+ LockFile lf = new LockFile(config.getPath(Key.LockFilePath).toString(), "bridge-descriptors");
+ lf.acquireLock();
// Sanitize bridge descriptors
new SanitizedBridgesWriter(config).run();
@@ -84,11 +80,6 @@ public class SanitizedBridgesWriter extends Thread {
this.config = config;
}
- /**
- * Logger for this class.
- */
- private Logger logger;
-
private String rsyncCatString;
private File bridgeDirectoriesDirectory;
@@ -112,16 +103,26 @@ public class SanitizedBridgesWriter extends Thread {
private SecureRandom secureRandom;
+ @Override
public void run() {
+ try {
+ startProcessing();
+ } catch (ConfigurationException ce) {
+ logger.severe("Configuration failed: " + ce);
+ throw new RuntimeException(ce);
+ }
+ }
+
+ private void startProcessing() throws ConfigurationException {
File bridgeDirectoriesDirectory =
- new File(config.getBridgeSnapshotsDirectory());
+ config.getPath(Key.BridgeSnapshotsDirectory).toFile();
File sanitizedBridgesDirectory =
- new File(config.getSanitizedBridgesWriteDirectory());
+ config.getPath(Key.SanitizedBridgesWriteDirectory).toFile();
boolean replaceIPAddressesWithHashes =
- config.getReplaceIPAddressesWithHashes();
+ config.getBool(Key.ReplaceIPAddressesWithHashes);
long limitBridgeSanitizingInterval =
- config.getLimitBridgeDescriptorMappings();
+ config.getInt(Key.BridgeDescriptorMappingsLimit);
File statsDirectory = new File("stats");
if (bridgeDirectoriesDirectory == null
diff --git a/src/main/java/org/torproject/collector/conf/Configuration.java b/src/main/java/org/torproject/collector/conf/Configuration.java
new file mode 100644
index 0000000..8b8cc12
--- /dev/null
+++ b/src/main/java/org/torproject/collector/conf/Configuration.java
@@ -0,0 +1,123 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
+package org.torproject.collector.conf;
+
+import java.io.IOException;
+import java.net.MalformedURLException;
+import java.net.URL;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.Arrays;
+import java.util.List;
+import java.util.Properties;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+
+/**
+ * Initializes the configuration with defaults from collector.properties,
+ * unless a configuration properties file is available.
+ */
+public class Configuration extends Properties {
+
+ public static final String FIELDSEP = ",";
+ public static final String ARRAYSEP = ";";
+
+ /**
+ * Returns {@code String[][]} from a property. Commas separate array elements
+ * and semicolons separate arrays, e.g.,
+ * {@code propertyname = a1, a2, a3; b1, b2, b3}
+ */
+ public String[][] getStringArrayArray(Key key) throws ConfigurationException {
+ try {
+ checkClass(key, String[][].class);
+ String[] interim = getProperty(key.name()).split(ARRAYSEP);
+ String[][] res = new String[interim.length][];
+ for (int i = 0; i < interim.length; i++) {
+ res[i] = interim[i].trim().split(FIELDSEP);
+ for (int j = 0; j < res[i].length; j++) {
+ res[i][j] = res[i][j].trim();
+ }
+ }
+ return res;
+ } catch (RuntimeException re) {
+ throw new ConfigurationException("Corrupt property: " + key
+ + " reason: " + re.getMessage(), re);
+ }
+ }
+
+ /**
+ * Returns {@code String[]} from a property. Commas separate array elements,
+ * e.g.,
+ * {@code propertyname = a1, a2, a3}
+ */
+ public String[] getStringArray(Key key) throws ConfigurationException {
+ try {
+ checkClass(key, String[].class);
+ String[] res = getProperty(key.name()).split(FIELDSEP);
+ for (int i = 0; i < res.length; i++) {
+ res[i] = res[i].trim();
+ }
+ return res;
+ } catch (RuntimeException re) {
+ throw new ConfigurationException("Corrupt property: " + key
+ + " reason: " + re.getMessage(), re);
+ }
+ }
+
+ private void checkClass(Key key, Class clazz) {
+ if (!key.keyClass().getSimpleName().equals(clazz.getSimpleName())) {
+ throw new RuntimeException("Wrong type wanted! My class is "
+ + key.keyClass().getSimpleName());
+ }
+ }
+
+ /**
+ * Returns a {@code boolean} property (case-insensitive), e.g.
+ * {@code propertyOne = True}.
+ */
+ public boolean getBool(Key key) throws ConfigurationException {
+ try {
+ checkClass(key, Boolean.class);
+ return Boolean.parseBoolean(getProperty(key.name()));
+ } catch (RuntimeException re) {
+ throw new ConfigurationException("Corrupt property: " + key
+ + " reason: " + re.getMessage(), re);
+ }
+ }
+
+ /**
+ * Parses an integer property, translating the String
+ * <code>"inf"</code> into Integer.MAX_VALUE.
+ * Verifies that the given Key is declared for an integer value.
+ */
+ public int getInt(Key key) throws ConfigurationException {
+ try {
+ checkClass(key, Integer.class);
+ String prop = getProperty(key.name());
+ if ("inf".equals(prop)) {
+ return Integer.MAX_VALUE;
+ } else {
+ return Integer.parseInt(prop);
+ }
+ } catch (RuntimeException re) {
+ throw new ConfigurationException("Corrupt property: " + key
+ + " reason: " + re.getMessage(), re);
+ }
+ }
+
+ /**
+ * Returns a {@code Path} property, e.g.
+ * {@code pathProperty = /my/path/file}.
+ */
+ public Path getPath(Key key) throws ConfigurationException {
+ try {
+ checkClass(key, Path.class);
+ return Paths.get(getProperty(key.name()));
+ } catch (RuntimeException re) {
+ throw new ConfigurationException("Corrupt property: " + key
+ + " reason: " + re.getMessage(), re);
+ }
+ }
+
+}
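The two-level array parsing added in Configuration above can be sketched in isolation. This is a hedged, illustrative example (class and method names are not part of the patch) that mirrors the FIELDSEP/ARRAYSEP splitting in getStringArrayArray: semicolons separate arrays, commas separate elements, and every element is trimmed.

```java
import java.util.Arrays;

// Minimal sketch of Configuration.getStringArrayArray's splitting:
// semicolons separate arrays, commas separate elements, all trimmed.
public class SplitSketch {

    static final String FIELDSEP = ",";
    static final String ARRAYSEP = ";";

    static String[][] parse(String value) {
        String[] interim = value.split(ARRAYSEP);
        String[][] res = new String[interim.length][];
        for (int i = 0; i < interim.length; i++) {
            res[i] = interim[i].trim().split(FIELDSEP);
            for (int j = 0; j < res[i].length; j++) {
                res[i][j] = res[i][j].trim();
            }
        }
        return res;
    }

    public static void main(String[] args) {
        // Same shape as the javadoc example: propertyname = a1, a2, a3; b1, b2, b3
        System.out.println(Arrays.deepToString(parse("a1, a2, a3; b1, b2, b3")));
        // prints [[a1, a2, a3], [b1, b2, b3]]
    }
}
```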
diff --git a/src/main/java/org/torproject/collector/conf/ConfigurationException.java b/src/main/java/org/torproject/collector/conf/ConfigurationException.java
new file mode 100644
index 0000000..730b1b3
--- /dev/null
+++ b/src/main/java/org/torproject/collector/conf/ConfigurationException.java
@@ -0,0 +1,18 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+
+package org.torproject.collector.conf;
+
+public class ConfigurationException extends Exception {
+
+ public ConfigurationException() {}
+
+ public ConfigurationException(String msg) {
+ super(msg);
+ }
+
+ public ConfigurationException(String msg, Exception ex) {
+ super(msg, ex);
+ }
+
+}
diff --git a/src/main/java/org/torproject/collector/conf/Key.java b/src/main/java/org/torproject/collector/conf/Key.java
new file mode 100644
index 0000000..67f91c5
--- /dev/null
+++ b/src/main/java/org/torproject/collector/conf/Key.java
@@ -0,0 +1,55 @@
+package org.torproject.collector.conf;
+
+import java.nio.file.Path;
+
+/**
+ * Enum containing all the property keys of the configuration.
+ * Specifies the key type.
+ */
+public enum Key {
+
+ LockFilePath(Path.class),
+ ArchivePath(Path.class),
+ RecentPath(Path.class),
+ IndexPath(Path.class),
+ StatsPath(Path.class),
+ BridgeSnapshotsDirectory(Path.class),
+ CachedRelayDescriptorsDirectories(String[].class),
+ CompressRelayDescriptorDownloads(Boolean.class),
+ DirectoryArchivesDirectory(Path.class),
+ DirectoryArchivesOutputDirectory(Path.class),
+ DownloadRelayDescriptors(Boolean.class),
+ DirectoryAuthoritiesAddresses(String[].class),
+ DirectoryAuthoritiesFingerprintsForVotes(String[].class),
+ DownloadCurrentConsensus(Boolean.class),
+ DownloadCurrentMicrodescConsensus(Boolean.class),
+ DownloadCurrentVotes(Boolean.class),
+ DownloadMissingServerDescriptors(Boolean.class),
+ DownloadMissingExtraInfoDescriptors(Boolean.class),
+ DownloadMissingMicrodescriptors(Boolean.class),
+ DownloadAllServerDescriptors(Boolean.class),
+ DownloadAllExtraInfoDescriptors(Boolean.class),
+ ImportCachedRelayDescriptors(Boolean.class),
+ ImportDirectoryArchives(Boolean.class),
+ KeepDirectoryArchiveImportHistory(Boolean.class),
+ ReplaceIPAddressesWithHashes(Boolean.class),
+ BridgeDescriptorMappingsLimit(Integer.class),
+ SanitizedBridgesWriteDirectory(Path.class),
+ TorperfOutputDirectory(Path.class),
+ TorperfFilesLines(String[].class),
+ TorperfSources(String[][].class);
+
+ private Class clazz;
+
+ /**
+ * @param clazz Class of the key's value.
+ */
+ Key(Class clazz) {
+ this.clazz = clazz;
+ }
+
+ public Class keyClass() {
+ return clazz;
+ }
+
+}
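The Key enum and Configuration accessors together form an enum-keyed, type-checked lookup. This hedged sketch (illustrative names, condensed from the patch, not part of it) shows the pattern: each key carries its expected value class, the accessor verifies it before parsing, and getInt maps the String "inf" to Integer.MAX_VALUE.

```java
import java.util.Properties;

// Sketch of the enum-keyed, type-checked lookup pattern from Key and
// Configuration: keys declare their value class; accessors verify it.
public class TypedLookup {

    enum K {
        BridgeDescriptorMappingsLimit(Integer.class),
        ReplaceIPAddressesWithHashes(Boolean.class);

        private final Class<?> clazz;

        K(Class<?> clazz) { this.clazz = clazz; }

        Class<?> keyClass() { return clazz; }
    }

    static int getInt(Properties props, K key) {
        if (!key.keyClass().equals(Integer.class)) {
            throw new RuntimeException("Wrong type wanted! My class is "
                + key.keyClass().getSimpleName());
        }
        String prop = props.getProperty(key.name());
        // "inf" maps to Integer.MAX_VALUE, mirroring Configuration.getInt.
        return "inf".equals(prop) ? Integer.MAX_VALUE : Integer.parseInt(prop);
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("BridgeDescriptorMappingsLimit", "inf");
        System.out.println(getInt(props, K.BridgeDescriptorMappingsLimit));
        // prints 2147483647
    }
}
```

Requesting an integer through a Boolean-typed key fails fast with the type error rather than failing later on a bad parse, which is the point of storing the class in the enum constant.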
diff --git a/src/main/java/org/torproject/collector/exitlists/ExitListDownloader.java b/src/main/java/org/torproject/collector/exitlists/ExitListDownloader.java
index 54fd50f..53fc300 100644
--- a/src/main/java/org/torproject/collector/exitlists/ExitListDownloader.java
+++ b/src/main/java/org/torproject/collector/exitlists/ExitListDownloader.java
@@ -3,7 +3,9 @@
package org.torproject.collector.exitlists;
-import org.torproject.collector.main.Configuration;
+import org.torproject.collector.conf.Configuration;
+import org.torproject.collector.conf.ConfigurationException;
+import org.torproject.collector.conf.Key;
import org.torproject.collector.main.LockFile;
import org.torproject.descriptor.Descriptor;
import org.torproject.descriptor.DescriptorParseException;
@@ -31,21 +33,15 @@ import java.util.logging.Logger;
public class ExitListDownloader extends Thread {
- public static void main(String[] args) {
+ private static Logger logger =
+ Logger.getLogger(ExitListDownloader.class.getName());
- Logger logger = Logger.getLogger(ExitListDownloader.class.getName());
+ public static void main(Configuration config) throws ConfigurationException {
logger.info("Starting exit-lists module of CollecTor.");
- // Initialize configuration
- Configuration config = new Configuration();
-
// Use lock file to avoid overlapping runs
- LockFile lf = new LockFile("exit-lists");
- if (!lf.acquireLock()) {
- logger.severe("Warning: CollecTor is already running or has not exited "
- + "cleanly! Exiting!");
- System.exit(1);
- }
+ LockFile lf = new LockFile(config.getPath(Key.LockFilePath).toString(), "exit-lists");
+ lf.acquireLock();
// Download exit list and store it to disk
new ExitListDownloader(config).run();
@@ -56,12 +52,18 @@ public class ExitListDownloader extends Thread {
logger.info("Terminating exit-lists module of CollecTor.");
}
- public ExitListDownloader(Configuration config) {
- }
+ public ExitListDownloader(Configuration config) {}
public void run() {
+ try {
+ startProcessing();
+ } catch (ConfigurationException ce) {
+ logger.severe("Configuration failed: " + ce);
+ throw new RuntimeException(ce);
+ }
+ }
- Logger logger = Logger.getLogger(ExitListDownloader.class.getName());
+ private void startProcessing() throws ConfigurationException {
SimpleDateFormat dateTimeFormat =
new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
diff --git a/src/main/java/org/torproject/collector/index/CreateIndexJson.java b/src/main/java/org/torproject/collector/index/CreateIndexJson.java
index ac5adf5..de69488 100644
--- a/src/main/java/org/torproject/collector/index/CreateIndexJson.java
+++ b/src/main/java/org/torproject/collector/index/CreateIndexJson.java
@@ -3,6 +3,10 @@
package org.torproject.collector.index;
+import org.torproject.collector.conf.Configuration;
+import org.torproject.collector.conf.ConfigurationException;
+import org.torproject.collector.conf.Key;
+
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
@@ -33,12 +37,11 @@ import java.util.zip.GZIPOutputStream;
* we'll likely have to do that. */
public class CreateIndexJson {
- static final File indexJsonFile = new File("index.json");
+ private static File indexJsonFile;
- static final String basePath = "https://collector.torproject.org";
+ private static String basePath = "https://collector.torproject.org";
- static final File[] indexedDirectories = new File[] {
- new File("archive"), new File("recent") };
+ private static File[] indexedDirectories;
static final String dateTimePattern = "yyyy-MM-dd HH:mm";
@@ -46,7 +49,12 @@ public class CreateIndexJson {
static final TimeZone dateTimezone = TimeZone.getTimeZone("UTC");
- public static void main(String[] args) throws IOException {
+ public static void main(Configuration config)
+ throws ConfigurationException, IOException {
+ indexJsonFile = new File(config.getPath(Key.IndexPath).toFile(), "index.json");
+ indexedDirectories = new File[] {
+ new File(config.getPath(Key.ArchivePath).toFile(), "archive"),
+ new File(config.getPath(Key.RecentPath).toFile(), "recent") };
writeIndex(indexDirectories());
}
diff --git a/src/main/java/org/torproject/collector/main/Configuration.java b/src/main/java/org/torproject/collector/main/Configuration.java
deleted file mode 100644
index aee1d02..0000000
--- a/src/main/java/org/torproject/collector/main/Configuration.java
+++ /dev/null
@@ -1,318 +0,0 @@
-/* Copyright 2010--2016 The Tor Project
- * See LICENSE for licensing information */
-
-package org.torproject.collector.main;
-
-import java.io.BufferedReader;
-import java.io.File;
-import java.io.FileReader;
-import java.io.IOException;
-import java.net.MalformedURLException;
-import java.net.URL;
-import java.util.ArrayList;
-import java.util.Arrays;
-import java.util.List;
-import java.util.SortedMap;
-import java.util.TreeMap;
-import java.util.logging.Level;
-import java.util.logging.Logger;
-
-/**
- * Initialize configuration with hard-coded defaults, overwrite with
- * configuration in config file, if exists, and answer Main.java about our
- * configuration.
- */
-public class Configuration {
- private String directoryArchivesOutputDirectory =
- "out/relay-descriptors/";
- private boolean importCachedRelayDescriptors = false;
- private List<String> cachedRelayDescriptorsDirectory =
- new ArrayList<String>(Arrays.asList(
- "in/relay-descriptors/cacheddesc/".split(",")));
- private boolean importDirectoryArchives = false;
- private String directoryArchivesDirectory =
- "in/relay-descriptors/archives/";
- private boolean keepDirectoryArchiveImportHistory = false;
- private boolean replaceIPAddressesWithHashes = false;
- private long limitBridgeDescriptorMappings = -1L;
- private String sanitizedBridgesWriteDirectory =
- "out/bridge-descriptors/";
- private String bridgeSnapshotsDirectory = "in/bridge-descriptors/";
- private boolean downloadRelayDescriptors = false;
- private List<String> downloadFromDirectoryAuthorities = Arrays.asList((
- "86.59.21.38,76.73.17.194:9030,171.25.193.9:443,"
- + "193.23.244.244,208.83.223.34:443,128.31.0.34:9131,"
- + "194.109.206.212,212.112.245.170,154.35.32.5").split(","));
- private List<String> downloadVotesByFingerprint = Arrays.asList((
- "14C131DFC5C6F93646BE72FA1401C02A8DF2E8B4,"
- + "27B6B5996C426270A5C95488AA5BCEB6BCC86956,"
- + "49015F787433103580E3B66A1707A00E60F2D15B,"
- + "585769C78764D58426B8B52B6651A5A71137189A,"
- + "80550987E1D626E3EBA5E5E75A458DE0626D088C,"
- + "D586D18309DED4CD6D57C18FDB97EFA96D330566,"
- + "E8A9C45EDE6D711294FADF8E7951F4DE6CA56B58,"
- + "ED03BB616EB2F60BEC80151114BB25CEF515B226,"
- + "EFCBE720AB3A82B99F9E953CD5BF50F7EEFC7B97").split(","));
- private boolean downloadCurrentConsensus = true;
- private boolean downloadCurrentMicrodescConsensus = true;
- private boolean downloadCurrentVotes = true;
- private boolean downloadMissingServerDescriptors = true;
- private boolean downloadMissingExtraInfoDescriptors = true;
- private boolean downloadMissingMicrodescriptors = true;
- private boolean downloadAllServerDescriptors = false;
- private boolean downloadAllExtraInfoDescriptors = false;
- private boolean compressRelayDescriptorDownloads;
- private String torperfOutputDirectory = "out/torperf/";
- private SortedMap<String, String> torperfSources = null;
- private List<String> torperfFiles = null;
-
- public Configuration() {
-
- /* Initialize logger. */
- Logger logger = Logger.getLogger(Configuration.class.getName());
-
- /* Read config file, if present. */
- File configFile = new File("config");
- if (!configFile.exists()) {
- logger.warning("Could not find config file. In the default "
- + "configuration, we are not configured to read data from any "
- + "data source or write data to any data sink. You need to "
- + "create a config file (" + configFile.getAbsolutePath()
- + ") and provide at least one data source and one data sink. "
- + "Refer to the manual for more information.");
- return;
- }
- String line = null;
- boolean containsCachedRelayDescriptorsDirectory = false;
- try {
- BufferedReader br = new BufferedReader(new FileReader(configFile));
- while ((line = br.readLine()) != null) {
- if (line.startsWith("#") || line.length() < 1) {
- continue;
- } else if (line.startsWith("DirectoryArchivesOutputDirectory")) {
- this.directoryArchivesOutputDirectory = line.split(" ")[1];
- } else if (line.startsWith("ImportCachedRelayDescriptors")) {
- this.importCachedRelayDescriptors = Integer.parseInt(
- line.split(" ")[1]) != 0;
- } else if (line.startsWith("CachedRelayDescriptorsDirectory")) {
- if (!containsCachedRelayDescriptorsDirectory) {
- this.cachedRelayDescriptorsDirectory.clear();
- containsCachedRelayDescriptorsDirectory = true;
- }
- this.cachedRelayDescriptorsDirectory.add(line.split(" ")[1]);
- } else if (line.startsWith("ImportDirectoryArchives")) {
- this.importDirectoryArchives = Integer.parseInt(
- line.split(" ")[1]) != 0;
- } else if (line.startsWith("DirectoryArchivesDirectory")) {
- this.directoryArchivesDirectory = line.split(" ")[1];
- } else if (line.startsWith("KeepDirectoryArchiveImportHistory")) {
- this.keepDirectoryArchiveImportHistory = Integer.parseInt(
- line.split(" ")[1]) != 0;
- } else if (line.startsWith("ReplaceIPAddressesWithHashes")) {
- this.replaceIPAddressesWithHashes = Integer.parseInt(
- line.split(" ")[1]) != 0;
- } else if (line.startsWith("LimitBridgeDescriptorMappings")) {
- this.limitBridgeDescriptorMappings = Long.parseLong(
- line.split(" ")[1]);
- } else if (line.startsWith("SanitizedBridgesWriteDirectory")) {
- this.sanitizedBridgesWriteDirectory = line.split(" ")[1];
- } else if (line.startsWith("BridgeSnapshotsDirectory")) {
- this.bridgeSnapshotsDirectory = line.split(" ")[1];
- } else if (line.startsWith("DownloadRelayDescriptors")) {
- this.downloadRelayDescriptors = Integer.parseInt(
- line.split(" ")[1]) != 0;
- } else if (line.startsWith("DownloadFromDirectoryAuthorities")) {
- this.downloadFromDirectoryAuthorities = new ArrayList<String>();
- for (String dir : line.split(" ")[1].split(",")) {
- // test if IP:port pair has correct format
- if (dir.length() < 1) {
- logger.severe("Configuration file contains directory "
- + "authority IP:port of length 0 in line '" + line
- + "'! Exiting!");
- System.exit(1);
- }
- new URL("http://" + dir + "/");
- this.downloadFromDirectoryAuthorities.add(dir);
- }
- } else if (line.startsWith("DownloadVotesByFingerprint")) {
- this.downloadVotesByFingerprint = new ArrayList<String>();
- for (String fingerprint : line.split(" ")[1].split(",")) {
- this.downloadVotesByFingerprint.add(fingerprint);
- }
- } else if (line.startsWith("DownloadCurrentConsensus")) {
- this.downloadCurrentConsensus = Integer.parseInt(
- line.split(" ")[1]) != 0;
- } else if (line.startsWith("DownloadCurrentMicrodescConsensus")) {
- this.downloadCurrentMicrodescConsensus = Integer.parseInt(
- line.split(" ")[1]) != 0;
- } else if (line.startsWith("DownloadCurrentVotes")) {
- this.downloadCurrentVotes = Integer.parseInt(
- line.split(" ")[1]) != 0;
- } else if (line.startsWith("DownloadMissingServerDescriptors")) {
- this.downloadMissingServerDescriptors = Integer.parseInt(
- line.split(" ")[1]) != 0;
- } else if (line.startsWith(
- "DownloadMissingExtraInfoDescriptors")) {
- this.downloadMissingExtraInfoDescriptors = Integer.parseInt(
- line.split(" ")[1]) != 0;
- } else if (line.startsWith("DownloadMissingMicrodescriptors")) {
- this.downloadMissingMicrodescriptors = Integer.parseInt(
- line.split(" ")[1]) != 0;
- } else if (line.startsWith("DownloadAllServerDescriptors")) {
- this.downloadAllServerDescriptors = Integer.parseInt(
- line.split(" ")[1]) != 0;
- } else if (line.startsWith("DownloadAllExtraInfoDescriptors")) {
- this.downloadAllExtraInfoDescriptors = Integer.parseInt(
- line.split(" ")[1]) != 0;
- } else if (line.startsWith("CompressRelayDescriptorDownloads")) {
- this.compressRelayDescriptorDownloads = Integer.parseInt(
- line.split(" ")[1]) != 0;
- } else if (line.startsWith("TorperfOutputDirectory")) {
- this.torperfOutputDirectory = line.split(" ")[1];
- } else if (line.startsWith("TorperfSource")) {
- if (this.torperfSources == null) {
- this.torperfSources = new TreeMap<String, String>();
- }
- String[] parts = line.split(" ");
- String sourceName = parts[1];
- String baseUrl = parts[2];
- this.torperfSources.put(sourceName, baseUrl);
- } else if (line.startsWith("TorperfFiles")) {
- if (this.torperfFiles == null) {
- this.torperfFiles = new ArrayList<String>();
- }
- String[] parts = line.split(" ");
- if (parts.length != 5) {
- logger.severe("Configuration file contains TorperfFiles "
- + "option with wrong number of values in line '" + line
- + "'! Exiting!");
- System.exit(1);
- }
- this.torperfFiles.add(line);
- } else {
- logger.severe("Configuration file contains unrecognized "
- + "configuration key in line '" + line + "'! Exiting!");
- System.exit(1);
- }
- }
- br.close();
- } catch (ArrayIndexOutOfBoundsException e) {
- logger.severe("Configuration file contains configuration key "
- + "without value in line '" + line + "'. Exiting!");
- System.exit(1);
- } catch (MalformedURLException e) {
- logger.severe("Configuration file contains illegal URL or IP:port "
- + "pair in line '" + line + "'. Exiting!");
- System.exit(1);
- } catch (NumberFormatException e) {
- logger.severe("Configuration file contains illegal value in line '"
- + line + "' with legal values being 0 or 1. Exiting!");
- System.exit(1);
- } catch (IOException e) {
- logger.log(Level.SEVERE, "Unknown problem while reading config "
- + "file! Exiting!", e);
- System.exit(1);
- }
- }
-
- public String getDirectoryArchivesOutputDirectory() {
- return this.directoryArchivesOutputDirectory;
- }
-
- public boolean getImportCachedRelayDescriptors() {
- return this.importCachedRelayDescriptors;
- }
-
- public List<String> getCachedRelayDescriptorDirectory() {
- return this.cachedRelayDescriptorsDirectory;
- }
-
- public boolean getImportDirectoryArchives() {
- return this.importDirectoryArchives;
- }
-
- public String getDirectoryArchivesDirectory() {
- return this.directoryArchivesDirectory;
- }
-
- public boolean getKeepDirectoryArchiveImportHistory() {
- return this.keepDirectoryArchiveImportHistory;
- }
-
- public boolean getReplaceIPAddressesWithHashes() {
- return this.replaceIPAddressesWithHashes;
- }
-
- public long getLimitBridgeDescriptorMappings() {
- return this.limitBridgeDescriptorMappings;
- }
-
- public String getSanitizedBridgesWriteDirectory() {
- return this.sanitizedBridgesWriteDirectory;
- }
-
- public String getBridgeSnapshotsDirectory() {
- return this.bridgeSnapshotsDirectory;
- }
-
- public boolean getDownloadRelayDescriptors() {
- return this.downloadRelayDescriptors;
- }
-
- public List<String> getDownloadFromDirectoryAuthorities() {
- return this.downloadFromDirectoryAuthorities;
- }
-
- public List<String> getDownloadVotesByFingerprint() {
- return this.downloadVotesByFingerprint;
- }
-
- public boolean getDownloadCurrentConsensus() {
- return this.downloadCurrentConsensus;
- }
-
- public boolean getDownloadCurrentMicrodescConsensus() {
- return this.downloadCurrentMicrodescConsensus;
- }
-
- public boolean getDownloadCurrentVotes() {
- return this.downloadCurrentVotes;
- }
-
- public boolean getDownloadMissingServerDescriptors() {
- return this.downloadMissingServerDescriptors;
- }
-
- public boolean getDownloadMissingExtraInfoDescriptors() {
- return this.downloadMissingExtraInfoDescriptors;
- }
-
- public boolean getDownloadMissingMicrodescriptors() {
- return this.downloadMissingMicrodescriptors;
- }
-
- public boolean getDownloadAllServerDescriptors() {
- return this.downloadAllServerDescriptors;
- }
-
- public boolean getDownloadAllExtraInfoDescriptors() {
- return this.downloadAllExtraInfoDescriptors;
- }
-
- public boolean getCompressRelayDescriptorDownloads() {
- return this.compressRelayDescriptorDownloads;
- }
-
- public String getTorperfOutputDirectory() {
- return this.torperfOutputDirectory;
- }
-
- public SortedMap<String, String> getTorperfSources() {
- return this.torperfSources;
- }
-
- public List<String> getTorperfFiles() {
- return this.torperfFiles;
- }
-}
-
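The removed Configuration parsed boolean options as 0/1 integers, while the new properties-based Configuration uses Boolean.parseBoolean. This hedged sketch (illustrative names, not part of the patch) contrasts the two behaviors; note that existing "Option 1" style values do not parse as true under the new scheme.

```java
// Contrast of the removed 0/1 boolean parsing with the new
// Boolean.parseBoolean-based getBool in the properties configuration.
public class BoolParsingSketch {

    static boolean oldStyle(String value) {
        return Integer.parseInt(value) != 0;  // "1" -> true, "0" -> false
    }

    static boolean newStyle(String value) {
        return Boolean.parseBoolean(value);   // case-insensitive "true" -> true
    }

    public static void main(String[] args) {
        System.out.println(oldStyle("1"));    // true
        System.out.println(newStyle("True")); // true
        System.out.println(newStyle("1"));    // false: 0/1 values need migrating
    }
}
```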
diff --git a/src/main/java/org/torproject/collector/main/LockFile.java b/src/main/java/org/torproject/collector/main/LockFile.java
index b07d4b1..f168bc3 100644
--- a/src/main/java/org/torproject/collector/main/LockFile.java
+++ b/src/main/java/org/torproject/collector/main/LockFile.java
@@ -13,12 +13,17 @@ import java.util.logging.Logger;
public class LockFile {
- private File lockFile;
- private Logger logger;
+ private final File lockFile;
+ private final String moduleName;
+ private final Logger logger = Logger.getLogger(LockFile.class.getName());
public LockFile(String moduleName) {
- this.lockFile = new File("lock/" + moduleName);
- this.logger = Logger.getLogger(LockFile.class.getName());
+ this("lock", moduleName);
+ }
+
+ public LockFile(String lockFilePath, String moduleName) {
+ this.lockFile = new File(lockFilePath, moduleName);
+ this.moduleName = moduleName;
}
public boolean acquireLock() {
@@ -30,7 +35,7 @@ public class LockFile {
long runStarted = Long.parseLong(br.readLine());
br.close();
if (System.currentTimeMillis() - runStarted < 55L * 60L * 1000L) {
- return false;
+ throw new RuntimeException("Cannot acquire lock for " + moduleName);
}
}
this.lockFile.getParentFile().mkdirs();
@@ -41,9 +46,8 @@ public class LockFile {
this.logger.fine("Acquired lock.");
return true;
} catch (IOException e) {
- this.logger.warning("Caught exception while trying to acquire "
- + "lock!");
- return false;
+ throw new RuntimeException("Caught exception while trying to acquire "
+ + "lock for " + moduleName);
}
}
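The LockFile change above turns a soft failure (returning false) into a RuntimeException, but keeps the same staleness rule. This hedged sketch (illustrative class and method names) isolates that rule: an existing lock younger than 55 minutes blocks the run, while an older one is treated as stale and may be overwritten.

```java
// Sketch of the stale-lock rule in LockFile.acquireLock: a lock file
// younger than 55 minutes blocks a new run; an older one is stale.
public class StaleLockSketch {

    static final long MAX_AGE_MILLIS = 55L * 60L * 1000L;

    // True if a lock created at runStarted still blocks a run at 'now'.
    static boolean blocksRun(long runStarted, long now) {
        return now - runStarted < MAX_AGE_MILLIS;
    }

    public static void main(String[] args) {
        long now = System.currentTimeMillis();
        System.out.println(blocksRun(now - 10L * 60L * 1000L, now)); // true
        System.out.println(blocksRun(now - 60L * 60L * 1000L, now)); // false
    }
}
```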
diff --git a/src/main/java/org/torproject/collector/relaydescs/ArchiveWriter.java b/src/main/java/org/torproject/collector/relaydescs/ArchiveWriter.java
index cf603d1..43c7975 100644
--- a/src/main/java/org/torproject/collector/relaydescs/ArchiveWriter.java
+++ b/src/main/java/org/torproject/collector/relaydescs/ArchiveWriter.java
@@ -3,7 +3,9 @@
package org.torproject.collector.relaydescs;
-import org.torproject.collector.main.Configuration;
+import org.torproject.collector.conf.Configuration;
+import org.torproject.collector.conf.ConfigurationException;
+import org.torproject.collector.conf.Key;
import org.torproject.collector.main.LockFile;
import org.torproject.descriptor.DescriptorParseException;
import org.torproject.descriptor.DescriptorParser;
@@ -17,6 +19,8 @@ import java.io.FileOutputStream;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Arrays;
@@ -36,11 +40,12 @@ import java.util.logging.Logger;
public class ArchiveWriter extends Thread {
+ private static Logger logger = Logger.getLogger(ArchiveWriter.class.getName());
+
private Configuration config;
private long now = System.currentTimeMillis();
- private Logger logger;
- private File outputDirectory;
+ private String outputDirectory;
private String rsyncCatString;
private DescriptorParser descriptorParser;
private int storedConsensusesCounter = 0;
@@ -67,12 +72,9 @@ public class ArchiveWriter extends Thread {
private SortedMap<Long, Set<String>> storedMicrodescriptors =
new TreeMap<Long, Set<String>>();
- private File storedServerDescriptorsFile = new File(
- "stats/stored-server-descriptors");
- private File storedExtraInfoDescriptorsFile = new File(
- "stats/stored-extra-info-descriptors");
- private File storedMicrodescriptorsFile = new File(
- "stats/stored-microdescriptors");
+ private File storedServerDescriptorsFile;
+ private File storedExtraInfoDescriptorsFile;
+ private File storedMicrodescriptorsFile;
private static final byte[] CONSENSUS_ANNOTATION =
"@type network-status-consensus-3 1.0\n".getBytes();
@@ -97,28 +99,31 @@ public class ArchiveWriter extends Thread {
private StringBuilder intermediateStats = new StringBuilder();
- public static void main(String[] args) {
+ private static Path recentPath;
+ private static String recentPathName;
+ private static final String RELAY_DESCRIPTORS = "relay-descriptors";
+ private static final String MICRO = "micro";
+ private static final String CONSENSUS_MICRODESC = "consensus-microdesc";
+ private static final String MICRODESC = "microdesc";
+ private static final String MICRODESCS = "microdescs";
+ public static void main(Configuration config) throws ConfigurationException {
- Logger logger = Logger.getLogger(ArchiveWriter.class.getName());
logger.info("Starting relay-descriptors module of CollecTor.");
- // Initialize configuration
- Configuration config = new Configuration();
-
// Use lock file to avoid overlapping runs
- LockFile lf = new LockFile("relay-descriptors");
- if (!lf.acquireLock()) {
- logger.severe("Warning: CollecTor is already running or has not exited "
- + "cleanly! Exiting!");
- System.exit(1);
- }
+ LockFile lf = new LockFile(config.getPath(Key.LockFilePath).toString(), RELAY_DESCRIPTORS);
+ lf.acquireLock();
+
+ recentPath = config.getPath(Key.RecentPath);
+ recentPathName = recentPath.toString();
// Import/download relay descriptors from the various sources
new ArchiveWriter(config).run();
- new ReferenceChecker(new File("recent/relay-descriptors"),
- new File("stats/references"),
- new File("stats/references-history")).check();
+ new ReferenceChecker(
+ recentPath.toFile(),
+ new File(config.getPath(Key.StatsPath).toFile(), "references"),
+ new File(config.getPath(Key.StatsPath).toFile(), "references-history")).check();
// Remove lock file
lf.releaseLock();
@@ -126,18 +131,29 @@ public class ArchiveWriter extends Thread {
logger.info("Terminating relay-descriptors module of CollecTor.");
}
- public ArchiveWriter(Configuration config) {
+ public ArchiveWriter(Configuration config) throws ConfigurationException {
this.config = config;
+ storedServerDescriptorsFile =
+ new File(config.getPath(Key.StatsPath).toFile(), "stored-server-descriptors");
+ storedExtraInfoDescriptorsFile =
+ new File(config.getPath(Key.StatsPath).toFile(), "stored-extra-info-descriptors");
+ storedMicrodescriptorsFile =
+ new File(config.getPath(Key.StatsPath).toFile(), "stored-microdescriptors");
}
public void run() {
+ try {
+ startProcessing();
+ } catch (ConfigurationException ce) {
+ logger.severe("Configuration failed: " + ce);
+ throw new RuntimeException(ce);
+ }
+ }
- File outputDirectory =
- new File(config.getDirectoryArchivesOutputDirectory());
- File statsDirectory = new File("stats");
+ private void startProcessing() throws ConfigurationException {
- this.logger = Logger.getLogger(ArchiveWriter.class.getName());
- this.outputDirectory = outputDirectory;
+ File statsDirectory = new File("stats");
+ this.outputDirectory = config.getPath(Key.DirectoryArchivesOutputDirectory).toString();
SimpleDateFormat rsyncCatFormat = new SimpleDateFormat(
"yyyy-MM-dd-HH-mm-ss");
rsyncCatFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
@@ -152,33 +168,33 @@ public class ArchiveWriter extends Thread {
RelayDescriptorParser rdp = new RelayDescriptorParser(this);
RelayDescriptorDownloader rdd = null;
- if (config.getDownloadRelayDescriptors()) {
- List<String> dirSources =
- config.getDownloadFromDirectoryAuthorities();
+ if (config.getBool(Key.DownloadRelayDescriptors)) {
+ String[] dirSources =
+ config.getStringArray(Key.DirectoryAuthoritiesAddresses);
rdd = new RelayDescriptorDownloader(rdp, dirSources,
- config.getDownloadVotesByFingerprint(),
- config.getDownloadCurrentConsensus(),
- config.getDownloadCurrentMicrodescConsensus(),
- config.getDownloadCurrentVotes(),
- config.getDownloadMissingServerDescriptors(),
- config.getDownloadMissingExtraInfoDescriptors(),
- config.getDownloadMissingMicrodescriptors(),
- config.getDownloadAllServerDescriptors(),
- config.getDownloadAllExtraInfoDescriptors(),
- config.getCompressRelayDescriptorDownloads());
+ config.getStringArray(Key.DirectoryAuthoritiesFingerprintsForVotes),
+ config.getBool(Key.DownloadCurrentConsensus),
+ config.getBool(Key.DownloadCurrentMicrodescConsensus),
+ config.getBool(Key.DownloadCurrentVotes),
+ config.getBool(Key.DownloadMissingServerDescriptors),
+ config.getBool(Key.DownloadMissingExtraInfoDescriptors),
+ config.getBool(Key.DownloadMissingMicrodescriptors),
+ config.getBool(Key.DownloadAllServerDescriptors),
+ config.getBool(Key.DownloadAllExtraInfoDescriptors),
+ config.getBool(Key.CompressRelayDescriptorDownloads));
rdp.setRelayDescriptorDownloader(rdd);
}
- if (config.getImportCachedRelayDescriptors()) {
+ if (config.getBool(Key.ImportCachedRelayDescriptors)) {
new CachedRelayDescriptorReader(rdp,
- config.getCachedRelayDescriptorDirectory(), statsDirectory);
+ config.getStringArray(Key.CachedRelayDescriptorsDirectories), statsDirectory);
this.intermediateStats("importing relay descriptors from local "
+ "Tor data directories");
}
- if (config.getImportDirectoryArchives()) {
+ if (config.getBool(Key.ImportDirectoryArchives)) {
new ArchiveReader(rdp,
- new File(config.getDirectoryArchivesDirectory()),
+ config.getPath(Key.DirectoryArchivesDirectory).toFile(),
statsDirectory,
- config.getKeepDirectoryArchiveImportHistory());
+ config.getBool(Key.KeepDirectoryArchiveImportHistory));
this.intermediateStats("importing relay descriptors from local "
+ "directory");
}
@@ -557,7 +573,7 @@ public class ArchiveWriter extends Thread {
- 3L * 24L * 60L * 60L * 1000L;
long cutOffMicroMillis = cutOffMillis - 27L * 24L * 60L * 60L * 1000L;
Stack<File> allFiles = new Stack<File>();
- allFiles.add(new File("recent/relay-descriptors"));
+ allFiles.add(new File(recentPathName, RELAY_DESCRIPTORS));
while (!allFiles.isEmpty()) {
File file = allFiles.pop();
if (file.isDirectory()) {
@@ -633,11 +649,11 @@ public class ArchiveWriter extends Thread {
SimpleDateFormat printFormat = new SimpleDateFormat(
"yyyy/MM/dd/yyyy-MM-dd-HH-mm-ss");
printFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
- File tarballFile = new File(this.outputDirectory + "/consensus/"
- + printFormat.format(new Date(validAfter)) + "-consensus");
+ File tarballFile = Paths.get(this.outputDirectory, "consensus",
+ printFormat.format(new Date(validAfter)) + "-consensus").toFile();
boolean tarballFileExistedBefore = tarballFile.exists();
- File rsyncFile = new File("recent/relay-descriptors/consensuses/"
- + tarballFile.getName());
+ File rsyncFile = Paths.get(recentPathName, RELAY_DESCRIPTORS,
+ "consensuses", tarballFile.getName()).toFile();
File[] outputFiles = new File[] { tarballFile, rsyncFile };
if (this.store(CONSENSUS_ANNOTATION, data, outputFiles, null)) {
this.storedConsensusesCounter++;
@@ -657,14 +673,12 @@ public class ArchiveWriter extends Thread {
SimpleDateFormat dayDirectoryFileFormat = new SimpleDateFormat(
"dd/yyyy-MM-dd-HH-mm-ss");
dayDirectoryFileFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
- File tarballFile = new File(this.outputDirectory
- + "/microdesc/" + yearMonthDirectoryFormat.format(validAfter)
- + "/consensus-microdesc/"
- + dayDirectoryFileFormat.format(validAfter)
- + "-consensus-microdesc");
+ File tarballFile = Paths.get(this.outputDirectory, MICRODESC,
+ yearMonthDirectoryFormat.format(validAfter), CONSENSUS_MICRODESC,
+ dayDirectoryFileFormat.format(validAfter) + "-consensus-microdesc").toFile();
boolean tarballFileExistedBefore = tarballFile.exists();
- File rsyncFile = new File("recent/relay-descriptors/microdescs/"
- + "consensus-microdesc/" + tarballFile.getName());
+ File rsyncFile = Paths.get(recentPathName, RELAY_DESCRIPTORS, MICRODESCS,
+ CONSENSUS_MICRODESC, tarballFile.getName()).toFile();
File[] outputFiles = new File[] { tarballFile, rsyncFile };
if (this.store(MICRODESCCONSENSUS_ANNOTATION, data, outputFiles,
null)) {
@@ -683,12 +697,12 @@ public class ArchiveWriter extends Thread {
SimpleDateFormat printFormat = new SimpleDateFormat(
"yyyy/MM/dd/yyyy-MM-dd-HH-mm-ss");
printFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
- File tarballFile = new File(this.outputDirectory + "/vote/"
- + printFormat.format(new Date(validAfter)) + "-vote-"
- + fingerprint + "-" + digest);
+ File tarballFile = Paths.get(this.outputDirectory, "vote",
+ printFormat.format(new Date(validAfter)) + "-vote-"
+ + fingerprint + "-" + digest).toFile();
boolean tarballFileExistedBefore = tarballFile.exists();
- File rsyncFile = new File("recent/relay-descriptors/votes/"
- + tarballFile.getName());
+ File rsyncFile = Paths.get(recentPathName, RELAY_DESCRIPTORS, "votes",
+ tarballFile.getName()).toFile();
File[] outputFiles = new File[] { tarballFile, rsyncFile };
if (this.store(VOTE_ANNOTATION, data, outputFiles, null)) {
this.storedVotesCounter++;
@@ -709,8 +723,8 @@ public class ArchiveWriter extends Thread {
SimpleDateFormat printFormat = new SimpleDateFormat(
"yyyy-MM-dd-HH-mm-ss");
printFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
- File tarballFile = new File(this.outputDirectory + "/certs/"
- + fingerprint + "-" + printFormat.format(new Date(published)));
+ File tarballFile = Paths.get(this.outputDirectory, "certs",
+ fingerprint + "-" + printFormat.format(new Date(published))).toFile();
File[] outputFiles = new File[] { tarballFile };
if (this.store(CERTIFICATE_ANNOTATION, data, outputFiles, null)) {
this.storedCertsCounter++;
@@ -721,14 +735,13 @@ public class ArchiveWriter extends Thread {
long published, String extraInfoDigest) {
SimpleDateFormat printFormat = new SimpleDateFormat("yyyy/MM/");
printFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
- File tarballFile = new File(this.outputDirectory
- + "/server-descriptor/" + printFormat.format(new Date(published))
- + digest.substring(0, 1) + "/" + digest.substring(1, 2) + "/"
- + digest);
+ File tarballFile = Paths.get(this.outputDirectory,
+ "server-descriptor", printFormat.format(new Date(published)),
+ digest.substring(0, 1), digest.substring(1, 2), digest).toFile();
boolean tarballFileExistedBefore = tarballFile.exists();
- File rsyncCatFile = new File("recent/relay-descriptors/"
- + "server-descriptors/" + this.rsyncCatString
- + "-server-descriptors.tmp");
+ File rsyncCatFile = Paths.get(recentPathName, RELAY_DESCRIPTORS,
+ "server-descriptors",
+ this.rsyncCatString + "-server-descriptors.tmp").toFile();
File[] outputFiles = new File[] { tarballFile, rsyncCatFile };
boolean[] append = new boolean[] { false, true };
if (this.store(SERVER_DESCRIPTOR_ANNOTATION, data, outputFiles,
@@ -750,14 +763,14 @@ public class ArchiveWriter extends Thread {
String extraInfoDigest, long published) {
SimpleDateFormat descriptorFormat = new SimpleDateFormat("yyyy/MM/");
descriptorFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
- File tarballFile = new File(this.outputDirectory + "/extra-info/"
- + descriptorFormat.format(new Date(published))
- + extraInfoDigest.substring(0, 1) + "/"
- + extraInfoDigest.substring(1, 2) + "/"
- + extraInfoDigest);
+ File tarballFile = Paths.get(this.outputDirectory, "extra-info",
+ descriptorFormat.format(new Date(published)),
+ extraInfoDigest.substring(0, 1),
+ extraInfoDigest.substring(1, 2),
+ extraInfoDigest).toFile();
boolean tarballFileExistedBefore = tarballFile.exists();
- File rsyncCatFile = new File("recent/relay-descriptors/"
- + "extra-infos/" + this.rsyncCatString + "-extra-infos.tmp");
+ File rsyncCatFile = Paths.get(recentPathName, RELAY_DESCRIPTORS,
+ "extra-infos", this.rsyncCatString + "-extra-infos.tmp").toFile();
File[] outputFiles = new File[] { tarballFile, rsyncCatFile };
boolean[] append = new boolean[] { false, true };
if (this.store(EXTRA_INFO_ANNOTATION, data, outputFiles, append)) {
@@ -784,15 +797,14 @@ public class ArchiveWriter extends Thread {
* valid-after months. */
SimpleDateFormat descriptorFormat = new SimpleDateFormat("yyyy/MM/");
descriptorFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
- File tarballFile = new File(this.outputDirectory + "/microdesc/"
- + descriptorFormat.format(validAfter) + "micro/"
- + microdescriptorDigest.substring(0, 1) + "/"
- + microdescriptorDigest.substring(1, 2) + "/"
- + microdescriptorDigest);
+ File tarballFile = Paths.get(this.outputDirectory, MICRODESC,
+ descriptorFormat.format(validAfter), MICRO,
+ microdescriptorDigest.substring(0, 1),
+ microdescriptorDigest.substring(1, 2),
+ microdescriptorDigest).toFile();
boolean tarballFileExistedBefore = tarballFile.exists();
- File rsyncCatFile = new File("recent/relay-descriptors/"
- + "microdescs/micro/" + this.rsyncCatString
- + "-micro.tmp");
+ File rsyncCatFile = Paths.get(recentPathName, RELAY_DESCRIPTORS,
+ MICRODESCS, MICRO, this.rsyncCatString + "-micro.tmp").toFile();
File[] outputFiles = new File[] { tarballFile, rsyncCatFile };
boolean[] append = new boolean[] { false, true };
if (this.store(MICRODESCRIPTOR_ANNOTATION, data, outputFiles,
diff --git a/src/main/java/org/torproject/collector/relaydescs/CachedRelayDescriptorReader.java b/src/main/java/org/torproject/collector/relaydescs/CachedRelayDescriptorReader.java
index b9001dd..00eeab1 100644
--- a/src/main/java/org/torproject/collector/relaydescs/CachedRelayDescriptorReader.java
+++ b/src/main/java/org/torproject/collector/relaydescs/CachedRelayDescriptorReader.java
@@ -35,10 +35,10 @@ import java.util.logging.Logger;
*/
public class CachedRelayDescriptorReader {
public CachedRelayDescriptorReader(RelayDescriptorParser rdp,
- List<String> inputDirectories, File statsDirectory) {
+ String[] inputDirectories, File statsDirectory) {
if (rdp == null || inputDirectories == null
- || inputDirectories.isEmpty() || statsDirectory == null) {
+ || inputDirectories.length == 0 || statsDirectory == null) {
throw new IllegalArgumentException();
}
diff --git a/src/main/java/org/torproject/collector/relaydescs/RelayDescriptorDownloader.java b/src/main/java/org/torproject/collector/relaydescs/RelayDescriptorDownloader.java
index 458332a..bd0a482 100644
--- a/src/main/java/org/torproject/collector/relaydescs/RelayDescriptorDownloader.java
+++ b/src/main/java/org/torproject/collector/relaydescs/RelayDescriptorDownloader.java
@@ -19,7 +19,7 @@ import java.net.HttpURLConnection;
import java.net.URL;
import java.text.ParseException;
import java.text.SimpleDateFormat;
-import java.util.ArrayList;
+import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
@@ -288,7 +288,7 @@ public class RelayDescriptorDownloader {
* <code>stats/last-downloaded-all-descriptors</code>.
*/
public RelayDescriptorDownloader(RelayDescriptorParser rdp,
- List<String> authorities, List<String> authorityFingerprints,
+ String[] authorities, String[] authorityFingerprints,
boolean downloadCurrentConsensus,
boolean downloadCurrentMicrodescConsensus,
boolean downloadCurrentVotes,
@@ -300,9 +300,8 @@ public class RelayDescriptorDownloader {
/* Memorize argument values. */
this.rdp = rdp;
- this.authorities = new ArrayList<String>(authorities);
- this.authorityFingerprints = new ArrayList<String>(
- authorityFingerprints);
+ this.authorities = Arrays.asList(authorities);
+ this.authorityFingerprints = Arrays.asList(authorityFingerprints);
this.downloadCurrentConsensus = downloadCurrentConsensus;
this.downloadCurrentMicrodescConsensus =
downloadCurrentMicrodescConsensus;
diff --git a/src/main/java/org/torproject/collector/torperf/TorperfDownloader.java b/src/main/java/org/torproject/collector/torperf/TorperfDownloader.java
index 7bcfbf3..c80f99e 100644
--- a/src/main/java/org/torproject/collector/torperf/TorperfDownloader.java
+++ b/src/main/java/org/torproject/collector/torperf/TorperfDownloader.java
@@ -3,7 +3,9 @@
package org.torproject.collector.torperf;
-import org.torproject.collector.main.Configuration;
+import org.torproject.collector.conf.Configuration;
+import org.torproject.collector.conf.ConfigurationException;
+import org.torproject.collector.conf.Key;
import org.torproject.collector.main.LockFile;
import java.io.BufferedReader;
@@ -17,6 +19,7 @@ import java.net.HttpURLConnection;
import java.net.URL;
import java.text.SimpleDateFormat;
import java.util.Arrays;
+import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.SortedMap;
@@ -30,22 +33,14 @@ import java.util.logging.Logger;
* configured sources, append them to the files we already have, and merge
* the two files into the .tpf format. */
public class TorperfDownloader extends Thread {
+ private static Logger logger = Logger.getLogger(TorperfDownloader.class.getName());
- public static void main(String[] args) {
-
- Logger logger = Logger.getLogger(TorperfDownloader.class.getName());
+ public static void main(Configuration config) throws ConfigurationException {
logger.info("Starting torperf module of CollecTor.");
- // Initialize configuration
- Configuration config = new Configuration();
-
// Use lock file to avoid overlapping runs
- LockFile lf = new LockFile("torperf");
- if (!lf.acquireLock()) {
- logger.severe("Warning: CollecTor is already running or has not exited "
- + "cleanly! Exiting!");
- System.exit(1);
- }
+ LockFile lf = new LockFile(config.getPath(Key.LockFilePath).toString(), "torperf");
+ lf.acquireLock();
// Process Torperf files
new TorperfDownloader(config).run();
@@ -63,30 +58,34 @@ public class TorperfDownloader extends Thread {
}
private File torperfOutputDirectory = null;
- private SortedMap<String, String> torperfSources = null;
- private List<String> torperfFilesLines = null;
- private Logger logger = null;
+ private Map<String, String> torperfSources = new HashMap<>();
+ private String[] torperfFilesLines = null;
private SimpleDateFormat dateFormat;
public void run() {
+ try {
+ startProcessing();
+ } catch (ConfigurationException ce) {
+ logger.severe("Configuration failed: " + ce);
+ throw new RuntimeException(ce);
+ }
+ }
- File torperfOutputDirectory =
- new File(config.getTorperfOutputDirectory());
- SortedMap<String, String> torperfSources = config.getTorperfSources();
- List<String> torperfFilesLines = config.getTorperfFiles();
-
- this.torperfOutputDirectory = torperfOutputDirectory;
- this.torperfSources = torperfSources;
- this.torperfFilesLines = torperfFilesLines;
+ private void startProcessing() throws ConfigurationException {
+ this.torperfFilesLines = config.getStringArray(Key.TorperfFilesLines);
+ this.torperfOutputDirectory = config.getPath(Key.TorperfOutputDirectory)
+ .toFile();
if (!this.torperfOutputDirectory.exists()) {
this.torperfOutputDirectory.mkdirs();
}
- this.logger = Logger.getLogger(TorperfDownloader.class.getName());
this.dateFormat = new SimpleDateFormat("yyyy-MM-dd");
this.dateFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
this.readLastMergedTimestamps();
+ for (String[] source : config.getStringArrayArray(Key.TorperfSources)) {
+ torperfSources.put(source[0], source[1]);
+ }
for (String torperfFilesLine : this.torperfFilesLines) {
- this.downloadAndMergeFiles(torperfFilesLine);
+ this.downloadAndMergeFiles(torperfFilesLine);
}
this.writeLastMergedTimestamps();
@@ -161,10 +160,10 @@ public class TorperfDownloader extends Thread {
private void downloadAndMergeFiles(String torperfFilesLine) {
String[] parts = torperfFilesLine.split(" ");
- String sourceName = parts[1];
+ String sourceName = parts[0];
int fileSize = -1;
try {
- fileSize = Integer.parseInt(parts[2]);
+ fileSize = Integer.parseInt(parts[1]);
} catch (NumberFormatException e) {
this.logger.log(Level.WARNING, "Could not parse file size in "
+ "TorperfFiles configuration line '" + torperfFilesLine
@@ -173,7 +172,7 @@ public class TorperfDownloader extends Thread {
}
/* Download and append the .data file. */
- String dataFileName = parts[3];
+ String dataFileName = parts[2];
String sourceBaseUrl = torperfSources.get(sourceName);
String dataUrl = sourceBaseUrl + dataFileName;
String dataOutputFileName = sourceName + "-" + dataFileName;
@@ -183,7 +182,7 @@ public class TorperfDownloader extends Thread {
dataOutputFile, true);
/* Download and append the .extradata file. */
- String extradataFileName = parts[4];
+ String extradataFileName = parts[3];
String extradataUrl = sourceBaseUrl + extradataFileName;
String extradataOutputFileName = sourceName + "-" + extradataFileName;
File extradataOutputFile = new File(torperfOutputDirectory,
diff --git a/src/main/resources/collector.properties b/src/main/resources/collector.properties
new file mode 100644
index 0000000..2645d01
--- /dev/null
+++ b/src/main/resources/collector.properties
@@ -0,0 +1,115 @@
+######## Collector Properties
+#
+######## General Properties ########
+LockFilePath = lock
+IndexPath = out/index
+ArchivePath = out/archive
+RecentPath = out/recent
+StatsPath = out/stats
+
+######## Relay descriptors ########
+#
+## Read cached-* files from a local Tor data directory
+ImportCachedRelayDescriptors = false
+#
+## Relative path(s) to the Tor data directories to read cached-* files
+## from. Multiple paths can be listed, separated by comma.
+CachedRelayDescriptorsDirectories = in/relay-descriptors/cacheddesc/
+#
+## Import directory archives from disk, if available
+ImportDirectoryArchives = false
+#
+## Relative path to directory to import directory archives from
+DirectoryArchivesDirectory = in/relay-descriptors/archives/
+#
+## Keep a history of imported directory archive files to know which files
+## have been imported before. This history can be useful when importing
+## from a changing source to avoid importing descriptors over and over
+## again, but it can be confusing to users who don't know about it.
+KeepDirectoryArchiveImportHistory = false
+#
+## Download relay descriptors from directory authorities, if required
+DownloadRelayDescriptors = false
+#
+## Comma separated list of directory authority addresses (IP[:port]) to
+## download missing relay descriptors from
+DirectoryAuthoritiesAddresses = 86.59.21.38,76.73.17.194:9030,171.25.193.9:443,193.23.244.244,208.83.223.34:443,128.31.0.34:9131,194.109.206.212,212.112.245.170,154.35.32.5
+#
+## Comma separated list of directory authority fingerprints to download
+## votes
+DirectoryAuthoritiesFingerprintsForVotes = 14C131DFC5C6F93646BE72FA1401C02A8DF2E8B4,27B6B5996C426270A5C95488AA5BCEB6BCC86956,49015F787433103580E3B66A1707A00E60F2D15B,585769C78764D58426B8B52B6651A5A71137189A,80550987E1D626E3EBA5E5E75A458DE0626D088C,D586D18309DED4CD6D57C18FDB97EFA96D330566,E8A9C45EDE6D711294FADF8E7951F4DE6CA56B58,ED03BB616EB2F60BEC80151114BB25CEF515B226,EFCBE720AB3A82B99F9E953CD5BF50F7EEFC7B97
+#
+## Download the current consensus (only if DownloadRelayDescriptors
+## is true)
+DownloadCurrentConsensus = true
+#
+## Download the current microdesc consensus (only if
+## DownloadRelayDescriptors is true)
+DownloadCurrentMicrodescConsensus = true
+#
+## Download current votes (only if DownloadRelayDescriptors is true)
+DownloadCurrentVotes = true
+#
+## Download missing server descriptors (only if DownloadRelayDescriptors
+## is true)
+DownloadMissingServerDescriptors = true
+#
+## Download missing extra-info descriptors (only if
+## DownloadRelayDescriptors is true)
+DownloadMissingExtraInfoDescriptors = true
+#
+## Download missing microdescriptors (only if
+## DownloadRelayDescriptors is true)
+DownloadMissingMicrodescriptors = true
+#
+## Download all server descriptors from the directory authorities at most
+## once a day (only if DownloadRelayDescriptors is true)
+DownloadAllServerDescriptors = false
+#
+## Download all extra-info descriptors from the directory authorities at
+## most once a day (only if DownloadRelayDescriptors is true)
+DownloadAllExtraInfoDescriptors = false
+#
+## Compress relay descriptors downloads by adding .z to the URLs
+CompressRelayDescriptorDownloads = false
+#
+## Relative path to directory to write directory archives to
+DirectoryArchivesOutputDirectory = out/relay-descriptors/
+#
+#
+######## Bridge descriptors ########
+#
+## Relative path to directory to import bridge descriptor snapshots from
+BridgeSnapshotsDirectory = in/bridge-descriptors/
+#
+## Replace IP addresses in sanitized bridge descriptors with 10.x.y.z
+## where x.y.z = H(IP address | bridge identity | secret)[:3], so that we
+## can learn about IP address changes.
+ReplaceIPAddressesWithHashes = false
+#
+## Limit internal bridge descriptor mapping state to the following number
+## of days, or inf for unlimited.
+BridgeDescriptorMappingsLimit = inf
+#
+## Relative path to directory to write sanitized bridges to
+SanitizedBridgesWriteDirectory = out/bridge-descriptors/
+
+######## Exit lists ########
+#
+## (No options available)
+#
+#
+######## Torperf downloader ########
+#
+## Path to the directory to store Torperf files in.
+## A relative path starts with ./
+TorperfOutputDirectory = out/torperf/
+
+## Torperf source names and base URLs
+## Multiple pairs can be specified, separated by semi-colon, e.g.
+## TorperfSources = torperf_A, http://some.torproject.org/; another, http://another.torproject.org/
+TorperfSources = torperf, http://torperf.torproject.org/
+
+## Torperf measurement file size in bytes, .data file, and .extradata
+## file available on a given source (multiple lists can be given,
+## separated by comma), e.g.
+## TorperfFilesLines = torperf 51200 50kb.data 50kb.extradata, torperf 1048576 1mb.data 1mb.extradata
+TorperfFilesLines = torperf 51200 50kb.data 50kb.extradata, torperf 1048576 1mb.data 1mb.extradata, torperf 5242880 5mb.data 5mb.extradata
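The space-separated TorperfFilesLines format above matches the corrected zero-based indices in downloadAndMergeFiles() earlier in this patch (parts[0] through parts[3]). A minimal standalone sketch of that split (the class name here is hypothetical, not part of CollecTor):

```java
// Hedged sketch of how one space-separated TorperfFilesLines entry is
// split: "<sourceName> <fileSize> <dataFile> <extradataFile>"
public class TorperfLineSketch {

  public static String[] parse(String torperfFilesLine) {
    String[] parts = torperfFilesLine.split(" ");
    String sourceName = parts[0];               // e.g. "torperf"
    int fileSize = Integer.parseInt(parts[1]);  // e.g. 51200 bytes
    String dataFileName = parts[2];             // e.g. "50kb.data"
    String extradataFileName = parts[3];        // e.g. "50kb.extradata"
    return new String[] {
        sourceName, String.valueOf(fileSize), dataFileName,
        extradataFileName };
  }

  public static void main(String[] args) {
    System.out.println(String.join(",",
        parse("torperf 51200 50kb.data 50kb.extradata")));
  }
}
```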
diff --git a/src/test/java/org/torproject/collector/MainTest.java b/src/test/java/org/torproject/collector/MainTest.java
new file mode 100644
index 0000000..9a19285
--- /dev/null
+++ b/src/test/java/org/torproject/collector/MainTest.java
@@ -0,0 +1,72 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+package org.torproject.collector;
+
+import static org.junit.Assert.assertArrayEquals;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotNull;
+import static org.junit.Assert.assertTrue;
+import static org.junit.Assert.assertFalse;
+
+import org.torproject.collector.conf.Key;
+import org.torproject.collector.conf.ConfigurationException;
+
+import java.io.ByteArrayInputStream;
+import java.io.File;
+import java.net.URL;
+import java.io.BufferedWriter;
+import java.io.File;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.security.AccessControlException;
+import java.security.Policy;
+import java.util.Arrays;
+import java.util.List;
+import java.util.Random;
+
+import org.junit.rules.TemporaryFolder;
+import org.junit.Rule;
+import org.junit.Test;
+
+public class MainTest {
+
+ private Random randomSource = new Random();
+
+ @Rule
+ public TemporaryFolder tmpf = new TemporaryFolder();
+
+ @Test()
+ public void testSmoke() throws Exception {
+ System.out.println("\n!!!! Three SEVERE log messages are expected."
+ + "\nOne each from: ExitListDownloader, "
+ + "TorperfDownloader, and CreateIndexJson.\n");
+ File conf = tmpf.newFile("test.conf");
+ File lockPath = tmpf.newFolder("test.lock");
+ assertEquals(0L, conf.length());
+ Main.main(new String[]{"relaydescs", conf.toString()});
+ assertTrue(4_000L <= conf.length());
+ changeLockFilePath(conf, lockPath);
+ for ( String key : Main.collecTorMains.keySet()) {
+ Main.main(new String[]{key, conf.toString()});
+ }
+ }
+
+ private void changeLockFilePath(File f, File l) throws Exception {
+ List<String> lines = Files.readAllLines(f.toPath());
+ BufferedWriter bw = Files.newBufferedWriter(f.toPath());
+ File out = tmpf.newFolder();
+ for(String line : lines) {
+ if (line.contains(Key.LockFilePath.name())) {
+ line = Key.LockFilePath.name() + " = " + l.toString();
+ } else if (line.contains("out")) {
+ line = line.replace("out", out.toString() + "out");
+ }
+ bw.write(line);
+ bw.newLine();
+ }
+ bw.flush();
+ bw.close();
+ }
+
+}
diff --git a/src/test/java/org/torproject/collector/conf/ConfigurationTest.java b/src/test/java/org/torproject/collector/conf/ConfigurationTest.java
new file mode 100644
index 0000000..aa98031
--- /dev/null
+++ b/src/test/java/org/torproject/collector/conf/ConfigurationTest.java
@@ -0,0 +1,143 @@
+/* Copyright 2016 The Tor Project
+ * See LICENSE for licensing information */
+package org.torproject.collector.conf;
+
+import static org.junit.Assert.assertArrayEquals;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotNull;
+import static org.junit.Assert.assertTrue;
+import static org.junit.Assert.assertFalse;
+
+import java.io.ByteArrayInputStream;
+import java.io.File;
+import java.net.URL;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.Arrays;
+import java.util.List;
+import java.util.Random;
+
+import org.junit.Test;
+
+public class ConfigurationTest {
+
+ private Random randomSource = new Random();
+
+ @Test()
+ public void testKeyCount() throws Exception {
+ assertEquals("The number of properties keys in enum Key changed."
+ + "\n This test class should be adapted.",
+ 30, Key.values().length);
+ }
+
+ @Test()
+ public void testConfiguration() throws Exception {
+ Configuration conf = new Configuration();
+ conf.load(new ByteArrayInputStream("TorperfOutputDirectory = xyz".getBytes()));
+ assertEquals(1, conf.size());
+ assertEquals("xyz", conf.getProperty("TorperfOutputDirectory"));
+ }
+
+ @Test()
+ public void testArrayValues() throws Exception {
+ String[] array = new String[randomSource.nextInt(30) + 1];
+ for (int i = 0; i < array.length; i++){
+ array[i] = Integer.toBinaryString(randomSource.nextInt(100));
+ }
+ String[] arrays = new String[] {
+ Arrays.toString(array).replace("[", "").replace("]", ""),
+ Arrays.toString(array).replace("[", "").replace("]", "").replaceAll(" ", "")
+ };
+ Configuration conf = new Configuration();
+ for(String input : arrays) {
+ conf.clear();
+ conf.load(new ByteArrayInputStream(("CachedRelayDescriptorsDirectories = " + input).getBytes()));
+ assertArrayEquals("expected " + Arrays.toString(array) + "\nreceived: "
+ + Arrays.toString(conf.getStringArray(Key.CachedRelayDescriptorsDirectories)),
+ array, conf.getStringArray(Key.CachedRelayDescriptorsDirectories));
+ }
+ }
+
+ @Test()
+ public void testBoolValues() throws Exception {
+ Configuration conf = new Configuration();
+ conf.load(new ByteArrayInputStream(("CompressRelayDescriptorDownloads=false"
+ + "\nImportDirectoryArchives = trUe"
+ + "\nReplaceIPAddressesWithHashes= false").getBytes()));
+ assertFalse(conf.getBool(Key.CompressRelayDescriptorDownloads));
+ assertTrue(conf.getBool(Key.ImportDirectoryArchives));
+ assertFalse(conf.getBool(Key.ReplaceIPAddressesWithHashes));
+ }
+
+ @Test()
+ public void testIntValues() throws Exception {
+ Configuration conf = new Configuration();
+ conf.load(new ByteArrayInputStream("BridgeDescriptorMappingsLimit = inf".getBytes()));
+ assertEquals(Integer.MAX_VALUE,
+ conf.getInt(Key.BridgeDescriptorMappingsLimit));
+ int r = randomSource.nextInt(Integer.MAX_VALUE);
+ conf.clear();
+ conf.load(new ByteArrayInputStream(("BridgeDescriptorMappingsLimit =" + r).getBytes()));
+ assertEquals(r,
+ conf.getInt(Key.BridgeDescriptorMappingsLimit));
+ }
+
+ @Test()
+ public void testFileValues() throws Exception {
+ String[] files = new String[] { "/the/path/file.txt", "another/path"};
+ Configuration conf = new Configuration();
+ for(String file : files) {
+ conf.clear();
+ conf.load(new ByteArrayInputStream(("DirectoryArchivesOutputDirectory = " + file).getBytes()));
+ assertEquals(new File(file), conf.getPath(Key.DirectoryArchivesOutputDirectory).toFile());
+ }
+ }
+
+ @Test()
+ public void testArrayArrayValues() throws Exception {
+ String[][] sourceStrings = new String[][] {
+ new String[]{"localsource", "http://127.0.0.1:12345"},
+ new String[]{"somesource", "https://some.host.org:12345"}};
+ Configuration conf = new Configuration();
+ conf.load(new ByteArrayInputStream(("TorperfSources = "
+ + Arrays.deepToString(sourceStrings)).replace("[[", "").replace("]]", "")
+ .replace("], [", Configuration.ARRAYSEP).getBytes()));
+ assertArrayEquals(sourceStrings, conf.getStringArrayArray(Key.TorperfSources));
+ }
+
+ @Test( expected = ConfigurationException.class)
+ public void testArrayArrayValueException() throws Exception {
+ Configuration conf = new Configuration();
+ conf.load(new ByteArrayInputStream("CachedRelayDescriptorsDirectories".getBytes()));
+ conf.getStringArrayArray(Key.TorperfOutputDirectory);
+ }
+
+ @Test( expected = ConfigurationException.class)
+ public void testArrayValueException() throws Exception {
+ Configuration conf = new Configuration();
+ conf.load(new ByteArrayInputStream("CachedRelayDescriptorsDirectories".getBytes()));
+ conf.getStringArray(Key.TorperfSources);
+ }
+
+ @Test( expected = ConfigurationException.class)
+ public void testBoolValueException() throws Exception {
+ Configuration conf = new Configuration();
+ conf.load(new ByteArrayInputStream("TorperfSource = http://x.y.z".getBytes()));
+ conf.getBool(Key.CachedRelayDescriptorsDirectories);
+ }
+
+ @Test( expected = ConfigurationException.class)
+ public void testPathValueException() throws Exception {
+ Configuration conf = new Configuration();
+ conf.load(new ByteArrayInputStream("DirectoryArchivesDirectory = \\u0000:".getBytes()));
+ conf.getPath(Key.DirectoryArchivesDirectory);
+ }
+
+ @Test( expected = ConfigurationException.class)
+ public void testIntValueException() throws Exception {
+ Configuration conf = new Configuration();
+ conf.load(new ByteArrayInputStream("BridgeDescriptorMappingsLimit = y7".getBytes()));
+ conf.getInt(Key.BridgeDescriptorMappingsLimit);
+ }
+
+}
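The typed accessors exercised by ConfigurationTest above (getBool with mixed-case values, comma-separated getStringArray) can be sketched as a small java.util.Properties wrapper. This is a simplified illustration with hypothetical names, not CollecTor's actual Configuration class:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.Properties;

// Non-public helper enum; real code would use the project's Key enum.
enum SketchKey { ImportDirectoryArchives, CachedRelayDescriptorsDirectories }

public class ConfigSketch extends Properties {

  boolean getBool(SketchKey key) {
    // Boolean.parseBoolean is case-insensitive, so "trUe" parses as true.
    return Boolean.parseBoolean(getProperty(key.name()));
  }

  String[] getStringArray(SketchKey key) {
    // Accept comma-separated lists with or without surrounding spaces.
    return getProperty(key.name()).trim().split("\\s*,\\s*");
  }

  public static void main(String[] args) throws IOException {
    ConfigSketch conf = new ConfigSketch();
    conf.load(new ByteArrayInputStream(
        ("ImportDirectoryArchives = trUe\n"
            + "CachedRelayDescriptorsDirectories = a, b,c").getBytes()));
    System.out.println(conf.getBool(SketchKey.ImportDirectoryArchives));
    System.out.println(String.join("|",
        conf.getStringArray(SketchKey.CachedRelayDescriptorsDirectories)));
  }
}
```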
diff --git a/src/test/resources/junittest.policy b/src/test/resources/junittest.policy
new file mode 100644
index 0000000..e6eb2ef
--- /dev/null
+++ b/src/test/resources/junittest.policy
@@ -0,0 +1,10 @@
+/* Prevent tests from bothering production servers. */
+
+grant {
+ permission java.io.FilePermission "<<ALL FILES>>", "read, write, delete, execute";
+ permission java.util.PropertyPermission "*", "read, write";
+ permission java.lang.RuntimePermission "setIO";
+ permission java.lang.RuntimePermission "accessDeclaredMembers";
+ permission java.lang.reflect.ReflectPermission "suppressAccessChecks";
+ permission java.lang.RuntimePermission "shutdownHooks";
+};
[collector/master] Implements task-19015, switch from jul to slf4j and logback.
by karsten@torproject.org 06 Jun '16
commit e89f50ec0e0bbcddc295456ee9e83b4ec7c30db0
Author: iwakeh <iwakeh(a)torproject.org>
Date: Fri Jun 3 15:10:38 2016 +0200
Implements task-19015, switch from jul to slf4j and logback.
---
build.xml | 10 +-
src/main/java/org/torproject/collector/Main.java | 14 ++-
.../bridgedescs/BridgeDescriptorParser.java | 10 +-
.../bridgedescs/BridgeSnapshotReader.java | 26 ++---
.../bridgedescs/SanitizedBridgesWriter.java | 124 ++++++++++----------
.../torproject/collector/conf/Configuration.java | 2 -
.../collector/exitlists/ExitListDownloader.java | 28 ++---
.../collector/index/CreateIndexJson.java | 3 +
.../org/torproject/collector/main/LockFile.java | 14 ++-
.../collector/relaydescs/ArchiveReader.java | 27 ++---
.../collector/relaydescs/ArchiveWriter.java | 45 ++++----
.../relaydescs/CachedRelayDescriptorReader.java | 24 ++--
.../collector/relaydescs/ReferenceChecker.java | 15 +--
.../relaydescs/RelayDescriptorDownloader.java | 43 ++++---
.../relaydescs/RelayDescriptorParser.java | 17 +--
.../collector/torperf/TorperfDownloader.java | 65 +++++------
src/main/resources/logback.xml | 126 +++++++++++++++++++++
17 files changed, 370 insertions(+), 223 deletions(-)
diff --git a/build.xml b/build.xml
index 8e46584..ffb1fca 100644
--- a/build.xml
+++ b/build.xml
@@ -25,6 +25,9 @@
<include name="gson-2.2.4.jar"/>
<include name="xz-1.5.jar"/>
<include name="descriptor-${descriptorversion}.jar"/>
+ <include name="logback-core-1.1.2.jar" />
+ <include name="logback-classic-1.1.2.jar" />
+ <include name="slf4j-api-1.7.7.jar" />
</patternset>
<path id="classpath">
<pathelement path="${classes}"/>
@@ -134,7 +137,10 @@
<jar destfile="${jarfile}"
basedir="${classes}">
<fileset dir="${classes}"/>
- <fileset dir="${resources}" includes="collector.properties"/>
+ <fileset dir="${resources}" >
+ <include name="collector.properties"/>
+ <include name="logback.xml"/>
+ </fileset>
<zipgroupfileset dir="${libs}" >
<patternset refid="runtime" />
</zipgroupfileset>
@@ -200,6 +206,7 @@
<!-- The following jvmargs prevent test access to the network. -->
<jvmarg value="-Djava.security.policy=${testresources}/junittest.policy"/>
<jvmarg value="-Djava.security.manager"/>
+ <jvmarg value="-DLOGBASE=${generated}/testcoverage-logs"/>
<classpath refid="cobertura.test.classpath" />
<formatter type="xml" />
<batchtest toDir="${testresult}" >
@@ -220,6 +227,7 @@
<!-- The following jvmargs prevent test access to the network. -->
<jvmarg value="-Djava.security.policy=${testresources}/junittest.policy"/>
<jvmarg value="-Djava.security.manager"/>
+ <jvmarg value="-DLOGBASE=${generated}/test-logs"/>
<classpath refid="test.classpath"/>
<formatter type="plain" usefile="false"/>
<batchtest>
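The build changes above bundle `logback.xml` into the jar and pass `-DLOGBASE=...` to the test JVMs. The actual `src/main/resources/logback.xml` added by this commit is not shown in this excerpt; a minimal configuration consistent with that jvmarg might look like the following (file name and pattern are assumptions, not taken from the patch):

```xml
<configuration>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <!-- LOGBASE is supplied via -DLOGBASE=... in the test targets above;
         the :- syntax gives Logback a fallback when it is unset. -->
    <file>${LOGBASE:-generated/logs}/collector.log</file>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="FILE" />
  </root>
</configuration>
```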
diff --git a/src/main/java/org/torproject/collector/Main.java b/src/main/java/org/torproject/collector/Main.java
index d21cfb6..34bb95b 100644
--- a/src/main/java/org/torproject/collector/Main.java
+++ b/src/main/java/org/torproject/collector/Main.java
@@ -10,6 +10,9 @@ import org.torproject.collector.index.CreateIndexJson;
import org.torproject.collector.relaydescs.ArchiveWriter;
import org.torproject.collector.torperf.TorperfDownloader;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
@@ -19,7 +22,6 @@ import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.util.HashMap;
import java.util.Map;
-import java.util.logging.Logger;
/**
* Main class for starting a CollecTor instance.
@@ -30,7 +32,7 @@ import java.util.logging.Logger;
*/
public class Main {
- private static Logger log = Logger.getLogger(Main.class.getName());
+ private static Logger log = LoggerFactory.getLogger(Main.class);
public static final String CONF_FILE = "collector.properties";
/** All possible main classes.
@@ -91,7 +93,7 @@ public class Main {
+ ") and provide at least one data source and one data sink. "
+ "Refer to the manual for more information.");
} catch (IOException e) {
- log.severe("Cannot write default configuration. Reason: " + e);
+ log.error("Cannot write default configuration. Reason: " + e, e);
}
}
@@ -99,7 +101,7 @@ public class Main {
try (FileInputStream fis = new FileInputStream(confFile)) {
conf.load(fis);
} catch (Exception e) { // catch all possible problems
- log.severe("Cannot read configuration. Reason: " + e);
+ log.error("Cannot read configuration. Reason: " + e, e);
throw e;
}
}
@@ -118,8 +120,8 @@ public class Main {
.invoke(null, (Object) conf);
} catch (NoSuchMethodException | IllegalAccessException
| InvocationTargetException e) {
- log.severe("Cannot invoke 'main' method on "
- + clazz.getName() + ". " + e);
+ log.error("Cannot invoke 'main' method on "
+ + clazz.getName() + ". " + e, e);
}
}
}
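The Main.java hunks above do more than rename levels: the old `log.severe("... Reason: " + e)` calls concatenated the exception into the message, which records only `e.toString()`, while the new `log.error(msg, e)` two-argument form hands the Throwable to the backend so the full stack trace is written. A stdlib-only demonstration of what the old style actually captured:

```java
public class ThrowableLoggingDemo {
    /** What the pre-patch log.severe("..." + e) calls recorded. */
    public static String concatenate(String message, Throwable t) {
        // String concatenation invokes t.toString(): class name and message,
        // but no stack trace. SLF4J's log.error(message, t) form, used
        // throughout this patch, lets the backend append the full trace.
        return message + " Reason: " + t;
    }

    public static void main(String[] args) {
        System.out.println(concatenate("Cannot write default configuration.",
            new java.io.IOException("disk full")));
    }
}
```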
diff --git a/src/main/java/org/torproject/collector/bridgedescs/BridgeDescriptorParser.java b/src/main/java/org/torproject/collector/bridgedescs/BridgeDescriptorParser.java
index f683ea0..94d554f 100644
--- a/src/main/java/org/torproject/collector/bridgedescs/BridgeDescriptorParser.java
+++ b/src/main/java/org/torproject/collector/bridgedescs/BridgeDescriptorParser.java
@@ -3,11 +3,12 @@
package org.torproject.collector.bridgedescs;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
-import java.util.logging.Level;
-import java.util.logging.Logger;
public class BridgeDescriptorParser {
@@ -18,7 +19,7 @@ public class BridgeDescriptorParser {
public BridgeDescriptorParser(SanitizedBridgesWriter sbw) {
this.sbw = sbw;
this.logger =
- Logger.getLogger(BridgeDescriptorParser.class.getName());
+ LoggerFactory.getLogger(BridgeDescriptorParser.class);
}
public void parse(byte[] allData, String dateTime) {
@@ -42,8 +43,7 @@ public class BridgeDescriptorParser {
}
}
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Could not parse bridge descriptor.",
- e);
+ this.logger.warn("Could not parse bridge descriptor.", e);
return;
}
}
diff --git a/src/main/java/org/torproject/collector/bridgedescs/BridgeSnapshotReader.java b/src/main/java/org/torproject/collector/bridgedescs/BridgeSnapshotReader.java
index 2d41d18..b1aacec 100644
--- a/src/main/java/org/torproject/collector/bridgedescs/BridgeSnapshotReader.java
+++ b/src/main/java/org/torproject/collector/bridgedescs/BridgeSnapshotReader.java
@@ -8,6 +8,9 @@ import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.compress.archivers.tar.TarArchiveInputStream;
import org.apache.commons.compress.compressors.gzip.GzipCompressorInputStream;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
import java.io.BufferedInputStream;
import java.io.BufferedReader;
import java.io.BufferedWriter;
@@ -23,8 +26,6 @@ import java.util.Set;
import java.util.SortedSet;
import java.util.Stack;
import java.util.TreeSet;
-import java.util.logging.Level;
-import java.util.logging.Logger;
/**
* Reads the half-hourly snapshots of bridge descriptors from Tonga.
@@ -38,15 +39,14 @@ public class BridgeSnapshotReader {
throw new IllegalArgumentException();
}
- Logger logger =
- Logger.getLogger(BridgeSnapshotReader.class.getName());
+ Logger logger = LoggerFactory.getLogger(BridgeSnapshotReader.class);
SortedSet<String> parsed = new TreeSet<String>();
File bdDir = bridgeDirectoriesDir;
File pbdFile = new File(statsDirectory, "parsed-bridge-directories");
boolean modified = false;
if (bdDir.exists()) {
if (pbdFile.exists()) {
- logger.fine("Reading file " + pbdFile.getAbsolutePath() + "...");
+ logger.debug("Reading file " + pbdFile.getAbsolutePath() + "...");
try {
BufferedReader br = new BufferedReader(new FileReader(pbdFile));
String line = null;
@@ -54,15 +54,15 @@ public class BridgeSnapshotReader {
parsed.add(line);
}
br.close();
- logger.fine("Finished reading file "
+ logger.debug("Finished reading file "
+ pbdFile.getAbsolutePath() + ".");
} catch (IOException e) {
- logger.log(Level.WARNING, "Failed reading file "
+ logger.warn("Failed reading file "
+ pbdFile.getAbsolutePath() + "!", e);
return;
}
}
- logger.fine("Importing files in directory " + bridgeDirectoriesDir
+ logger.debug("Importing files in directory " + bridgeDirectoriesDir
+ "/...");
Set<String> descriptorImportHistory = new HashSet<String>();
int parsedFiles = 0;
@@ -192,13 +192,13 @@ public class BridgeSnapshotReader {
parsed.add(pop.getName());
modified = true;
} catch (IOException e) {
- logger.log(Level.WARNING, "Could not parse bridge snapshot "
+ logger.warn("Could not parse bridge snapshot "
+ pop.getName() + "!", e);
continue;
}
}
}
- logger.fine("Finished importing files in directory "
+ logger.debug("Finished importing files in directory "
+ bridgeDirectoriesDir + "/. In total, we parsed "
+ parsedFiles + " files (skipped " + skippedFiles
+ ") containing " + parsedStatuses + " statuses, "
@@ -207,7 +207,7 @@ public class BridgeSnapshotReader {
+ parsedExtraInfoDescriptors + " extra-info descriptors "
+ "(skipped " + skippedExtraInfoDescriptors + ").");
if (!parsed.isEmpty() && modified) {
- logger.fine("Writing file " + pbdFile.getAbsolutePath() + "...");
+ logger.debug("Writing file " + pbdFile.getAbsolutePath() + "...");
try {
pbdFile.getParentFile().mkdirs();
BufferedWriter bw = new BufferedWriter(new FileWriter(pbdFile));
@@ -215,10 +215,10 @@ public class BridgeSnapshotReader {
bw.append(f + "\n");
}
bw.close();
- logger.fine("Finished writing file " + pbdFile.getAbsolutePath()
+ logger.debug("Finished writing file " + pbdFile.getAbsolutePath()
+ ".");
} catch (IOException e) {
- logger.log(Level.WARNING, "Failed writing file "
+ logger.warn("Failed writing file "
+ pbdFile.getAbsolutePath() + "!", e);
}
}
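The migrated call sites above still build messages by string concatenation, e.g. `logger.debug("Reading file " + pbdFile.getAbsolutePath() + "...")`. SLF4J also offers `{}` placeholders, which defer message construction until the level is known to be enabled. A simplified, stdlib-only stand-in for the substitution SLF4J performs (the real implementation is `org.slf4j.helpers.MessageFormatter`; this sketch is illustrative only):

```java
public class PlaceholderDemo {
    /** Simplified stand-in for SLF4J's {} substitution; illustrative only. */
    public static String format(String pattern, Object... args) {
        StringBuilder sb = new StringBuilder();
        int argIndex = 0;
        int from = 0;
        int at;
        // Replace each {} in order with the next argument's toString().
        while ((at = pattern.indexOf("{}", from)) >= 0 && argIndex < args.length) {
            sb.append(pattern, from, at).append(args[argIndex++]);
            from = at + 2;
        }
        return sb.append(pattern.substring(from)).toString();
    }

    public static void main(String[] args) {
        // Equivalent to: logger.debug("Reading file {}...", path);
        System.out.println(format("Reading file {}...", "parsed-bridge-directories"));
    }
}
```

Switching the concatenating call sites to this style would be a natural follow-up, but is not part of this commit.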
diff --git a/src/main/java/org/torproject/collector/bridgedescs/SanitizedBridgesWriter.java b/src/main/java/org/torproject/collector/bridgedescs/SanitizedBridgesWriter.java
index fa24a3d..e483353 100644
--- a/src/main/java/org/torproject/collector/bridgedescs/SanitizedBridgesWriter.java
+++ b/src/main/java/org/torproject/collector/bridgedescs/SanitizedBridgesWriter.java
@@ -13,6 +13,9 @@ import org.apache.commons.codec.binary.Base64;
import org.apache.commons.codec.binary.Hex;
import org.apache.commons.codec.digest.DigestUtils;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
@@ -33,8 +36,6 @@ import java.util.SortedMap;
import java.util.Stack;
import java.util.TimeZone;
import java.util.TreeMap;
-import java.util.logging.Level;
-import java.util.logging.Logger;
/**
* <p>Sanitizes bridge descriptors, i.e., removes all possibly sensitive
@@ -51,11 +52,10 @@ import java.util.logging.Logger;
*/
public class SanitizedBridgesWriter extends Thread {
- private static Logger logger;
+ private static Logger logger = LoggerFactory.getLogger(SanitizedBridgesWriter.class);
public static void main(Configuration config) throws ConfigurationException {
- logger = Logger.getLogger(SanitizedBridgesWriter.class.getName());
logger.info("Starting bridge-descriptors module of CollecTor.");
// Use lock file to avoid overlapping runs
@@ -108,7 +108,7 @@ public class SanitizedBridgesWriter extends Thread {
try {
startProcessing();
} catch (ConfigurationException ce) {
- logger.severe("Configuration failed: " + ce);
+ logger.error("Configuration failed: " + ce, ce);
throw new RuntimeException(ce);
}
}
@@ -135,10 +135,6 @@ public class SanitizedBridgesWriter extends Thread {
this.sanitizedBridgesDirectory = sanitizedBridgesDirectory;
this.replaceIPAddressesWithHashes = replaceIPAddressesWithHashes;
- /* Initialize logger. */
- this.logger = Logger.getLogger(
- SanitizedBridgesWriter.class.getName());
-
SimpleDateFormat rsyncCatFormat = new SimpleDateFormat(
"yyyy-MM-dd-HH-mm-ss");
rsyncCatFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
@@ -150,7 +146,7 @@ public class SanitizedBridgesWriter extends Thread {
try {
this.secureRandom = SecureRandom.getInstance("SHA1PRNG", "SUN");
} catch (GeneralSecurityException e) {
- this.logger.log(Level.WARNING, "Could not initialize secure "
+ this.logger.warn("Could not initialize secure "
+ "random number generator! Not calculating any IP address "
+ "hashes in this execution!", e);
this.persistenceProblemWithSecrets = true;
@@ -172,7 +168,7 @@ public class SanitizedBridgesWriter extends Thread {
if ((line.length() != ("yyyy-MM,".length() + 31 * 2)
&& line.length() != ("yyyy-MM,".length() + 50 * 2))
|| parts.length != 2) {
- this.logger.warning("Invalid line in bridge-ip-secrets file "
+ this.logger.warn("Invalid line in bridge-ip-secrets file "
+ "starting with '" + line.substring(0, 7) + "'! "
+ "Not calculating any IP address hashes in this "
+ "execution!");
@@ -185,17 +181,17 @@ public class SanitizedBridgesWriter extends Thread {
}
br.close();
if (!this.persistenceProblemWithSecrets) {
- this.logger.fine("Read "
+ this.logger.debug("Read "
+ this.secretsForHashingIPAddresses.size() + " secrets for "
+ "hashing bridge IP addresses.");
}
} catch (DecoderException e) {
- this.logger.log(Level.WARNING, "Failed to decode hex string in "
+ this.logger.warn("Failed to decode hex string in "
+ this.bridgeIpSecretsFile + "! Not calculating any IP "
+ "address hashes in this execution!", e);
this.persistenceProblemWithSecrets = true;
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Failed to read "
+ this.logger.warn("Failed to read "
+ this.bridgeIpSecretsFile + "! Not calculating any IP "
+ "address hashes in this execution!", e);
this.persistenceProblemWithSecrets = true;
@@ -374,7 +370,7 @@ public class SanitizedBridgesWriter extends Thread {
}
if (month.compareTo(
this.bridgeSanitizingCutOffTimestamp) < 0) {
- this.logger.warning("Generated a secret that we won't make "
+ this.logger.warn("Generated a secret that we won't make "
+ "persistent, because it's outside our bridge descriptor "
+ "sanitizing interval.");
} else {
@@ -390,7 +386,7 @@ public class SanitizedBridgesWriter extends Thread {
bw.write(month + "," + Hex.encodeHexString(secret) + "\n");
bw.close();
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Could not store new secret "
+ this.logger.warn("Could not store new secret "
+ "to disk! Not calculating any IP address hashes in "
+ "this execution!", e);
this.persistenceProblemWithSecrets = true;
@@ -422,11 +418,15 @@ public class SanitizedBridgesWriter extends Thread {
if (this.bridgeSanitizingCutOffTimestamp
.compareTo(publicationTime) > 0) {
- this.logger.log(!this.haveWarnedAboutInterval ? Level.WARNING
- : Level.FINE, "Sanitizing and storing network status with "
+ String text = "Sanitizing and storing network status with "
+ "publication time outside our descriptor sanitizing "
- + "interval.");
- this.haveWarnedAboutInterval = true;
+ + "interval.";
+ if (this.haveWarnedAboutInterval) {
+ this.logger.debug(text);
+ } else {
+ this.logger.warn(text);
+ this.haveWarnedAboutInterval = true;
+ }
}
/* Parse the given network status line by line. */
@@ -510,7 +510,7 @@ public class SanitizedBridgesWriter extends Thread {
if (scrubbedOrAddress != null) {
scrubbed.append("a " + scrubbedOrAddress + "\n");
} else {
- this.logger.warning("Invalid address in line '" + line
+ this.logger.warn("Invalid address in line '" + line
+ "' in bridge network status. Skipping line!");
}
@@ -524,7 +524,7 @@ public class SanitizedBridgesWriter extends Thread {
* network status. If there is, we should probably learn before
* writing anything to the sanitized descriptors. */
} else {
- this.logger.fine("Unknown line '" + line + "' in bridge "
+ this.logger.debug("Unknown line '" + line + "' in bridge "
+ "network status. Not writing to disk!");
return;
}
@@ -544,18 +544,18 @@ public class SanitizedBridgesWriter extends Thread {
if (formatter.parse(publicationTime).getTime()
- formatter.parse(mostRecentDescPublished).getTime()
> 60L * 60L * 1000L) {
- this.logger.warning("The most recent descriptor in the bridge "
+ this.logger.warn("The most recent descriptor in the bridge "
+ "network status published at " + publicationTime + " was "
+ "published at " + mostRecentDescPublished + " which is "
+ "more than 1 hour before the status. This is a sign for "
+ "the status being stale. Please check!");
}
} catch (ParseException e) {
- this.logger.log(Level.WARNING, "Could not parse timestamp in "
+ this.logger.warn("Could not parse timestamp in "
+ "bridge network status.", e);
return;
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Could not parse bridge network "
+ this.logger.warn("Could not parse bridge network "
+ "status.", e);
return;
}
@@ -589,7 +589,7 @@ public class SanitizedBridgesWriter extends Thread {
bw.close();
}
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Could not write sanitized bridge "
+ this.logger.warn("Could not write sanitized bridge "
+ "network status to disk.", e);
return;
}
@@ -656,11 +656,15 @@ public class SanitizedBridgesWriter extends Thread {
}
if (this.bridgeSanitizingCutOffTimestamp
.compareTo(published) > 0) {
- this.logger.log(!this.haveWarnedAboutInterval
- ? Level.WARNING : Level.FINE, "Sanitizing and storing "
+ String text = "Sanitizing and storing "
+ "server descriptor with publication time outside our "
- + "descriptor sanitizing interval.");
- this.haveWarnedAboutInterval = true;
+ + "descriptor sanitizing interval.";
+ if (this.haveWarnedAboutInterval) {
+ this.logger.debug(text);
+ } else {
+ this.logger.warn(text);
+ this.haveWarnedAboutInterval = true;
+ }
}
scrubbed.append(line + "\n");
@@ -686,7 +690,7 @@ public class SanitizedBridgesWriter extends Thread {
if (scrubbedOrAddress != null) {
scrubbedOrAddresses.add(scrubbedOrAddress);
} else {
- this.logger.warning("Invalid address in line "
+ this.logger.warn("Invalid address in line "
+ "'or-address " + orAddress + "' in bridge server "
+ "descriptor. Skipping line!");
}
@@ -776,7 +780,7 @@ public class SanitizedBridgesWriter extends Thread {
+ "\n");
if (masterKeyEd25519 != null && !masterKeyEd25519.equals(
masterKeyEd25519FromIdentityEd25519)) {
- this.logger.warning("Mismatch between identity-ed25519 and "
+ this.logger.warn("Mismatch between identity-ed25519 and "
+ "master-key-ed25519. Skipping.");
return;
}
@@ -787,7 +791,7 @@ public class SanitizedBridgesWriter extends Thread {
if (masterKeyEd25519FromIdentityEd25519 != null
&& !masterKeyEd25519FromIdentityEd25519.equals(
masterKeyEd25519)) {
- this.logger.warning("Mismatch between identity-ed25519 and "
+ this.logger.warn("Mismatch between identity-ed25519 and "
+ "master-key-ed25519. Skipping.");
return;
}
@@ -854,14 +858,14 @@ public class SanitizedBridgesWriter extends Thread {
* that we need to remove or replace for the sanitized descriptor
* version. */
} else {
- this.logger.warning("Unrecognized line '" + line
+ this.logger.warn("Unrecognized line '" + line
+ "'. Skipping.");
return;
}
}
br.close();
} catch (Exception e) {
- this.logger.log(Level.WARNING, "Could not parse server "
+ this.logger.warn("Could not parse server "
+ "descriptor.", e);
return;
}
@@ -883,7 +887,7 @@ public class SanitizedBridgesWriter extends Thread {
/* Handle below. */
}
if (descriptorDigest == null) {
- this.logger.log(Level.WARNING, "Could not calculate server "
+ this.logger.warn("Could not calculate server "
+ "descriptor digest.");
return;
}
@@ -906,7 +910,7 @@ public class SanitizedBridgesWriter extends Thread {
/* Handle below. */
}
if (descriptorDigestSha256Base64 == null) {
- this.logger.log(Level.WARNING, "Could not calculate server "
+ this.logger.warn("Could not calculate server "
+ "descriptor SHA256 digest.");
return;
}
@@ -947,7 +951,7 @@ public class SanitizedBridgesWriter extends Thread {
bw.close();
}
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Could not write sanitized server "
+ this.logger.warn("Could not write sanitized server "
+ "descriptor to disk.", e);
return;
}
@@ -957,26 +961,26 @@ public class SanitizedBridgesWriter extends Thread {
String identityEd25519Base64) {
byte[] identityEd25519 = Base64.decodeBase64(identityEd25519Base64);
if (identityEd25519.length < 40) {
- this.logger.warning("Invalid length of identity-ed25519 (in "
+ this.logger.warn("Invalid length of identity-ed25519 (in "
+ "bytes): " + identityEd25519.length);
} else if (identityEd25519[0] != 0x01) {
- this.logger.warning("Unknown version in identity-ed25519: "
+ this.logger.warn("Unknown version in identity-ed25519: "
+ identityEd25519[0]);
} else if (identityEd25519[1] != 0x04) {
- this.logger.warning("Unknown cert type in identity-ed25519: "
+ this.logger.warn("Unknown cert type in identity-ed25519: "
+ identityEd25519[1]);
} else if (identityEd25519[6] != 0x01) {
- this.logger.warning("Unknown certified key type in "
+ this.logger.warn("Unknown certified key type in "
+ "identity-ed25519: " + identityEd25519[1]);
} else if (identityEd25519[39] == 0x00) {
- this.logger.warning("No extensions in identity-ed25519 (which "
+ this.logger.warn("No extensions in identity-ed25519 (which "
+ "would contain the encoded master-key-ed25519): "
+ identityEd25519[39]);
} else {
int extensionStart = 40;
for (int i = 0; i < (int) identityEd25519[39]; i++) {
if (identityEd25519.length < extensionStart + 4) {
- this.logger.warning("Invalid extension with id " + i
+ this.logger.warn("Invalid extension with id " + i
+ " in identity-ed25519.");
break;
}
@@ -986,7 +990,7 @@ public class SanitizedBridgesWriter extends Thread {
int extensionType = identityEd25519[extensionStart + 2];
if (extensionLength == 32 && extensionType == 4) {
if (identityEd25519.length < extensionStart + 4 + 32) {
- this.logger.warning("Invalid extension with id " + i
+ this.logger.warn("Invalid extension with id " + i
+ " in identity-ed25519.");
break;
}
@@ -1002,7 +1006,7 @@ public class SanitizedBridgesWriter extends Thread {
extensionStart += 4 + extensionLength;
}
}
- this.logger.warning("Unable to locate master-key-ed25519 in "
+ this.logger.warn("Unable to locate master-key-ed25519 in "
+ "identity-ed25519.");
return null;
}
@@ -1050,7 +1054,7 @@ public class SanitizedBridgesWriter extends Thread {
* name. */
} else if (line.startsWith("transport ")) {
if (parts.length < 3) {
- this.logger.fine("Illegal line in extra-info descriptor: '"
+ this.logger.debug("Illegal line in extra-info descriptor: '"
+ line + "'. Skipping descriptor.");
return;
}
@@ -1080,7 +1084,7 @@ public class SanitizedBridgesWriter extends Thread {
+ "\n");
if (masterKeyEd25519 != null && !masterKeyEd25519.equals(
masterKeyEd25519FromIdentityEd25519)) {
- this.logger.warning("Mismatch between identity-ed25519 and "
+ this.logger.warn("Mismatch between identity-ed25519 and "
+ "master-key-ed25519. Skipping.");
return;
}
@@ -1091,7 +1095,7 @@ public class SanitizedBridgesWriter extends Thread {
if (masterKeyEd25519FromIdentityEd25519 != null
&& !masterKeyEd25519FromIdentityEd25519.equals(
masterKeyEd25519)) {
- this.logger.warning("Mismatch between identity-ed25519 and "
+ this.logger.warn("Mismatch between identity-ed25519 and "
+ "master-key-ed25519. Skipping.");
return;
}
@@ -1128,18 +1132,18 @@ public class SanitizedBridgesWriter extends Thread {
* that we need to remove or replace for the sanitized descriptor
* version. */
} else {
- this.logger.warning("Unrecognized line '" + line
+ this.logger.warn("Unrecognized line '" + line
+ "'. Skipping.");
return;
}
}
br.close();
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Could not parse extra-info "
+ this.logger.warn("Could not parse extra-info "
+ "descriptor.", e);
return;
} catch (DecoderException e) {
- this.logger.log(Level.WARNING, "Could not parse extra-info "
+ this.logger.warn("Could not parse extra-info "
+ "descriptor.", e);
return;
}
@@ -1161,7 +1165,7 @@ public class SanitizedBridgesWriter extends Thread {
/* Handle below. */
}
if (descriptorDigest == null) {
- this.logger.log(Level.WARNING, "Could not calculate extra-info "
+ this.logger.warn("Could not calculate extra-info "
+ "descriptor digest.");
return;
}
@@ -1184,7 +1188,7 @@ public class SanitizedBridgesWriter extends Thread {
/* Handle below. */
}
if (descriptorDigestSha256Base64 == null) {
- this.logger.log(Level.WARNING, "Could not calculate extra-info "
+ this.logger.warn("Could not calculate extra-info "
+ "descriptor SHA256 digest.");
return;
}
@@ -1224,7 +1228,7 @@ public class SanitizedBridgesWriter extends Thread {
bw.close();
}
} catch (Exception e) {
- this.logger.log(Level.WARNING, "Could not write sanitized "
+ this.logger.warn("Could not write sanitized "
+ "extra-info descriptor to disk.", e);
}
}
@@ -1261,7 +1265,7 @@ public class SanitizedBridgesWriter extends Thread {
this.logger.info("Deleted " + deleted + " secrets that we don't "
+ "need anymore and kept " + kept + ".");
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Could not store reduced set of "
+ this.logger.warn("Could not store reduced set of "
+ "secrets to disk! This is a bad sign, better check what's "
+ "going on!", e);
}
@@ -1278,7 +1282,7 @@ public class SanitizedBridgesWriter extends Thread {
dateTimeFormat.parse(maxNetworkStatusPublishedTime).getTime();
if (maxNetworkStatusPublishedMillis > 0L
&& maxNetworkStatusPublishedMillis < tooOldMillis) {
- this.logger.warning("The last known bridge network status was "
+ this.logger.warn("The last known bridge network status was "
+ "published " + maxNetworkStatusPublishedTime + ", which is "
+ "more than 5:30 hours in the past.");
}
@@ -1287,7 +1291,7 @@ public class SanitizedBridgesWriter extends Thread {
.getTime();
if (maxServerDescriptorPublishedMillis > 0L
&& maxServerDescriptorPublishedMillis < tooOldMillis) {
- this.logger.warning("The last known bridge server descriptor was "
+ this.logger.warn("The last known bridge server descriptor was "
+ "published " + maxServerDescriptorPublishedTime + ", which "
+ "is more than 5:30 hours in the past.");
}
@@ -1296,12 +1300,12 @@ public class SanitizedBridgesWriter extends Thread {
.getTime();
if (maxExtraInfoDescriptorPublishedMillis > 0L
&& maxExtraInfoDescriptorPublishedMillis < tooOldMillis) {
- this.logger.warning("The last known bridge extra-info descriptor "
+ this.logger.warn("The last known bridge extra-info descriptor "
+ "was published " + maxExtraInfoDescriptorPublishedTime
+ ", which is more than 5:30 hours in the past.");
}
} catch (ParseException e) {
- this.logger.log(Level.WARNING, "Unable to parse timestamp for "
+ this.logger.warn("Unable to parse timestamp for "
+ "stale check.", e);
}
}
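Two hunks above replace the old `logger.log(!haveWarnedAboutInterval ? Level.WARNING : Level.FINE, ...)` ternary with an explicit if/else, because SLF4J has no level-as-argument overload. The pattern they implement inline could be captured in a small helper like this (hypothetical; not part of the patch):

```java
import java.util.concurrent.atomic.AtomicBoolean;

/** Hypothetical helper capturing the warn-once pattern that the
 *  haveWarnedAboutInterval rewrites in this patch implement inline. */
public class WarnOnce {
    private final AtomicBoolean warned = new AtomicBoolean(false);

    /** Returns true exactly once, on the first call. */
    public boolean first() {
        return warned.compareAndSet(false, true);
    }
}
```

Usage would mirror the rewritten sites: `if (interval.first()) { logger.warn(text); } else { logger.debug(text); }`.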
diff --git a/src/main/java/org/torproject/collector/conf/Configuration.java b/src/main/java/org/torproject/collector/conf/Configuration.java
index 8b8cc12..2166402 100644
--- a/src/main/java/org/torproject/collector/conf/Configuration.java
+++ b/src/main/java/org/torproject/collector/conf/Configuration.java
@@ -11,8 +11,6 @@ import java.nio.file.Paths;
import java.util.Arrays;
import java.util.List;
import java.util.Properties;
-import java.util.logging.Level;
-import java.util.logging.Logger;
/**
* Initialize configuration with defaults from collector.properties,
diff --git a/src/main/java/org/torproject/collector/exitlists/ExitListDownloader.java b/src/main/java/org/torproject/collector/exitlists/ExitListDownloader.java
index 53fc300..65d7b87 100644
--- a/src/main/java/org/torproject/collector/exitlists/ExitListDownloader.java
+++ b/src/main/java/org/torproject/collector/exitlists/ExitListDownloader.java
@@ -13,6 +13,9 @@ import org.torproject.descriptor.DescriptorParser;
import org.torproject.descriptor.DescriptorSourceFactory;
import org.torproject.descriptor.ExitList;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
import java.io.BufferedInputStream;
import java.io.BufferedWriter;
import java.io.File;
@@ -28,13 +31,10 @@ import java.util.SortedSet;
import java.util.Stack;
import java.util.TimeZone;
import java.util.TreeSet;
-import java.util.logging.Level;
-import java.util.logging.Logger;
public class ExitListDownloader extends Thread {
- private static Logger logger =
- Logger.getLogger(ExitListDownloader.class.getName());
+ private static Logger logger = LoggerFactory.getLogger(ExitListDownloader.class);
public static void main(Configuration config) throws ConfigurationException {
logger.info("Starting exit-lists module of CollecTor.");
@@ -58,7 +58,7 @@ public class ExitListDownloader extends Thread {
try {
startProcessing();
} catch (ConfigurationException ce) {
- logger.severe("Configuration failed: " + ce);
+ logger.error("Configuration failed: " + ce, ce);
throw new RuntimeException(ce);
}
}
@@ -72,7 +72,7 @@ public class ExitListDownloader extends Thread {
Date downloadedDate = new Date();
String downloadedExitList = null;
try {
- logger.fine("Downloading exit list...");
+ logger.debug("Downloading exit list...");
StringBuilder sb = new StringBuilder();
sb.append("@type tordnsel 1.0\n");
sb.append("Downloaded " + dateTimeFormat.format(downloadedDate)
@@ -85,7 +85,7 @@ public class ExitListDownloader extends Thread {
huc.connect();
int response = huc.getResponseCode();
if (response != 200) {
- logger.warning("Could not download exit list. Response code "
+ logger.warn("Could not download exit list. Response code "
+ response);
return;
}
@@ -98,13 +98,13 @@ public class ExitListDownloader extends Thread {
}
in.close();
downloadedExitList = sb.toString();
- logger.fine("Finished downloading exit list.");
+ logger.debug("Finished downloading exit list.");
} catch (IOException e) {
- logger.log(Level.WARNING, "Failed downloading exit list", e);
+ logger.warn("Failed downloading exit list", e);
return;
}
if (downloadedExitList == null) {
- logger.warning("Failed downloading exit list");
+ logger.warn("Failed downloading exit list");
return;
}
@@ -123,7 +123,7 @@ public class ExitListDownloader extends Thread {
tarballFile.getName());
if (parsedDescriptors.size() != 1
|| !(parsedDescriptors.get(0) instanceof ExitList)) {
- logger.warning("Could not parse downloaded exit list");
+ logger.warn("Could not parse downloaded exit list");
return;
}
ExitList parsedExitList = (ExitList) parsedDescriptors.get(0);
@@ -133,12 +133,12 @@ public class ExitListDownloader extends Thread {
}
}
} catch (DescriptorParseException e) {
- logger.log(Level.WARNING, "Could not parse downloaded exit list",
+ logger.warn("Could not parse downloaded exit list",
e);
}
if (maxScanMillis > 0L
&& maxScanMillis + 330L * 60L * 1000L < System.currentTimeMillis()) {
- logger.warning("The last reported scan in the downloaded exit list "
+ logger.warn("The last reported scan in the downloaded exit list "
+ "took place at " + dateTimeFormat.format(maxScanMillis)
+ ", which is more than 5:30 hours in the past.");
}
@@ -155,7 +155,7 @@ public class ExitListDownloader extends Thread {
bw.write(downloadedExitList);
bw.close();
} catch (IOException e) {
- logger.log(Level.WARNING, "Could not write downloaded exit list "
+ logger.warn("Could not write downloaded exit list "
+ "to " + outputFile.getAbsolutePath(), e);
}
}
diff --git a/src/main/java/org/torproject/collector/index/CreateIndexJson.java b/src/main/java/org/torproject/collector/index/CreateIndexJson.java
index de69488..639a4be 100644
--- a/src/main/java/org/torproject/collector/index/CreateIndexJson.java
+++ b/src/main/java/org/torproject/collector/index/CreateIndexJson.java
@@ -13,6 +13,9 @@ import com.google.gson.GsonBuilder;
import org.apache.commons.compress.compressors.bzip2.BZip2CompressorOutputStream;
import org.apache.commons.compress.compressors.xz.XZCompressorOutputStream;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileOutputStream;
diff --git a/src/main/java/org/torproject/collector/main/LockFile.java b/src/main/java/org/torproject/collector/main/LockFile.java
index f168bc3..0931d1f 100644
--- a/src/main/java/org/torproject/collector/main/LockFile.java
+++ b/src/main/java/org/torproject/collector/main/LockFile.java
@@ -3,19 +3,21 @@
package org.torproject.collector.main;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
-import java.util.logging.Logger;
public class LockFile {
private final File lockFile;
private final String moduleName;
- private final Logger logger = Logger.getLogger(LockFile.class.getName());
+ private final Logger logger = LoggerFactory.getLogger(LockFile.class);
public LockFile(String moduleName) {
this("lock", moduleName);
@@ -27,7 +29,7 @@ public class LockFile {
}
public boolean acquireLock() {
- this.logger.fine("Trying to acquire lock...");
+ this.logger.debug("Trying to acquire lock...");
try {
if (this.lockFile.exists()) {
BufferedReader br = new BufferedReader(new FileReader(
@@ -43,7 +45,7 @@ public class LockFile {
this.lockFile));
bw.append("" + System.currentTimeMillis() + "\n");
bw.close();
- this.logger.fine("Acquired lock.");
+ this.logger.debug("Acquired lock.");
return true;
} catch (IOException e) {
throw new RuntimeException("Caught exception while trying to acquire "
@@ -52,9 +54,9 @@ public class LockFile {
}
public void releaseLock() {
- this.logger.fine("Releasing lock...");
+ this.logger.debug("Releasing lock...");
this.lockFile.delete();
- this.logger.fine("Released lock.");
+ this.logger.debug("Released lock.");
}
}
diff --git a/src/main/java/org/torproject/collector/relaydescs/ArchiveReader.java b/src/main/java/org/torproject/collector/relaydescs/ArchiveReader.java
index 72f8231..c1981cc 100644
--- a/src/main/java/org/torproject/collector/relaydescs/ArchiveReader.java
+++ b/src/main/java/org/torproject/collector/relaydescs/ArchiveReader.java
@@ -7,6 +7,9 @@ import org.apache.commons.codec.binary.Base64;
import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.compress.compressors.bzip2.BZip2CompressorInputStream;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
import java.io.BufferedInputStream;
import java.io.BufferedReader;
import java.io.BufferedWriter;
@@ -30,8 +33,6 @@ import java.util.SortedSet;
import java.util.Stack;
import java.util.TimeZone;
import java.util.TreeSet;
-import java.util.logging.Level;
-import java.util.logging.Logger;
/**
* Read in all files in a given directory and pass buffered readers of
@@ -53,7 +54,7 @@ public class ArchiveReader {
rdp.setArchiveReader(this);
int parsedFiles = 0;
int ignoredFiles = 0;
- Logger logger = Logger.getLogger(ArchiveReader.class.getName());
+ Logger logger = LoggerFactory.getLogger(ArchiveReader.class);
SortedSet<String> archivesImportHistory = new TreeSet<String>();
File archivesImportHistoryFile = new File(statsDirectory,
"archives-import-history");
@@ -67,12 +68,12 @@ public class ArchiveReader {
}
br.close();
} catch (IOException e) {
- logger.log(Level.WARNING, "Could not read in archives import "
- + "history file. Skipping.");
+ logger.warn("Could not read in archives import "
+ + "history file. Skipping.", e);
}
}
if (archivesDirectory.exists()) {
- logger.fine("Importing files in directory " + archivesDirectory
+ logger.debug("Importing files in directory " + archivesDirectory
+ "/...");
Stack<File> filesInInputDir = new Stack<File>();
filesInInputDir.add(archivesDirectory);
@@ -93,7 +94,7 @@ public class ArchiveReader {
ignoredFiles++;
continue;
} else if (pop.getName().endsWith(".tar.bz2")) {
- logger.warning("Cannot parse compressed tarball "
+ logger.warn("Cannot parse compressed tarball "
+ pop.getAbsolutePath() + ". Skipping.");
continue;
} else if (pop.getName().endsWith(".bz2")) {
@@ -165,12 +166,12 @@ public class ArchiveReader {
} while (line != null && line.startsWith("@"));
br.close();
if (line == null) {
- logger.fine("We were given an empty descriptor for "
+ logger.debug("We were given an empty descriptor for "
+ "parsing. Ignoring.");
continue;
}
if (!line.equals("onion-key")) {
- logger.fine("Skipping non-recognized descriptor.");
+ logger.debug("Skipping non-recognized descriptor.");
continue;
}
SimpleDateFormat parseFormat =
@@ -204,7 +205,7 @@ public class ArchiveReader {
String digest256Hex = DigestUtils.sha256Hex(descBytes);
if (!this.microdescriptorValidAfterTimes.containsKey(
digest256Hex)) {
- logger.fine("Could not store microdescriptor '"
+ logger.debug("Could not store microdescriptor '"
+ digest256Hex + "', which was not contained in a "
+ "microdesc consensus.");
continue;
@@ -217,7 +218,7 @@ public class ArchiveReader {
rdp.storeMicrodescriptor(descBytes, digest256Hex,
digest256Base64, validAfter);
} catch (ParseException e) {
- logger.log(Level.WARNING, "Could not parse "
+ logger.warn("Could not parse "
+ "valid-after time '" + validAfterTime + "'. Not "
+ "storing microdescriptor.", e);
}
@@ -236,7 +237,7 @@ public class ArchiveReader {
}
}
if (problems.isEmpty()) {
- logger.fine("Finished importing files in directory "
+ logger.debug("Finished importing files in directory "
+ archivesDirectory + "/.");
} else {
StringBuilder sb = new StringBuilder("Failed importing files in "
@@ -261,7 +262,7 @@ public class ArchiveReader {
}
bw.close();
} catch (IOException e) {
- logger.log(Level.WARNING, "Could not write archives import "
+ logger.warn("Could not write archives import "
+ "history file.");
}
}
diff --git a/src/main/java/org/torproject/collector/relaydescs/ArchiveWriter.java b/src/main/java/org/torproject/collector/relaydescs/ArchiveWriter.java
index 43c7975..6495df6 100644
--- a/src/main/java/org/torproject/collector/relaydescs/ArchiveWriter.java
+++ b/src/main/java/org/torproject/collector/relaydescs/ArchiveWriter.java
@@ -11,6 +11,9 @@ import org.torproject.descriptor.DescriptorParseException;
import org.torproject.descriptor.DescriptorParser;
import org.torproject.descriptor.DescriptorSourceFactory;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
import java.io.BufferedOutputStream;
import java.io.BufferedReader;
import java.io.BufferedWriter;
@@ -35,12 +38,10 @@ import java.util.SortedSet;
import java.util.Stack;
import java.util.TimeZone;
import java.util.TreeMap;
-import java.util.logging.Level;
-import java.util.logging.Logger;
public class ArchiveWriter extends Thread {
- private static Logger logger = Logger.getLogger(ArchiveWriter.class.getName());
+ private static Logger logger = LoggerFactory.getLogger(ArchiveWriter.class);
private Configuration config;
@@ -145,7 +146,7 @@ public class ArchiveWriter extends Thread {
try {
startProcessing();
} catch (ConfigurationException ce) {
- logger.severe("Configuration failed: " + ce);
+ logger.error("Configuration failed: " + ce, ce);
throw new RuntimeException(ce);
}
}
@@ -227,7 +228,7 @@ public class ArchiveWriter extends Thread {
while ((line = br.readLine()) != null) {
String[] parts = line.split(",");
if (parts.length != 3) {
- this.logger.warning("Could not load server descriptor "
+ this.logger.warn("Could not load server descriptor "
+ "digests because of illegal line '" + line + "'. We "
+ "might not be able to correctly check descriptors for "
+ "completeness.");
@@ -256,7 +257,7 @@ public class ArchiveWriter extends Thread {
while ((line = br.readLine()) != null) {
String[] parts = line.split(",");
if (parts.length != 2) {
- this.logger.warning("Could not load extra-info descriptor "
+ this.logger.warn("Could not load extra-info descriptor "
+ "digests because of illegal line '" + line + "'. We "
+ "might not be able to correctly check descriptors for "
+ "completeness.");
@@ -283,7 +284,7 @@ public class ArchiveWriter extends Thread {
while ((line = br.readLine()) != null) {
String[] parts = line.split(",");
if (parts.length != 2) {
- this.logger.warning("Could not load microdescriptor digests "
+ this.logger.warn("Could not load microdescriptor digests "
+ "because of illegal line '" + line + "'. We might not "
+ "be able to correctly check descriptors for "
+ "completeness.");
@@ -304,11 +305,11 @@ public class ArchiveWriter extends Thread {
br.close();
}
} catch (ParseException e) {
- this.logger.log(Level.WARNING, "Could not load descriptor "
+ this.logger.warn("Could not load descriptor "
+ "digests. We might not be able to correctly check "
+ "descriptors for completeness.", e);
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Could not load descriptor "
+ this.logger.warn("Could not load descriptor "
+ "digests. We might not be able to correctly check "
+ "descriptors for completeness.", e);
}
@@ -494,7 +495,7 @@ public class ArchiveWriter extends Thread {
}
this.logger.info(sb.toString());
if (missingDescriptors) {
- this.logger.fine("We are missing at least 0.5% of server or "
+ this.logger.debug("We are missing at least 0.5% of server or "
+ "extra-info descriptors referenced from a consensus or "
+ "vote or at least 0.5% of microdescriptors referenced from a "
+ "microdesc consensus.");
@@ -502,13 +503,13 @@ public class ArchiveWriter extends Thread {
if (missingVotes) {
/* TODO Shouldn't warn if we're not trying to archive votes at
* all. */
- this.logger.fine("We are missing at least one vote that was "
+ this.logger.debug("We are missing at least one vote that was "
+ "referenced from a consensus.");
}
if (missingMicrodescConsensus) {
/* TODO Shouldn't warn if we're not trying to archive microdesc
* consensuses at all. */
- this.logger.fine("We are missing at least one microdesc "
+ this.logger.debug("We are missing at least one microdesc "
+ "consensus that was published together with a known "
+ "consensus.");
}
@@ -521,14 +522,14 @@ public class ArchiveWriter extends Thread {
long tooOldMillis = this.now - 330L * 60L * 1000L;
if (!this.storedConsensuses.isEmpty()
&& this.storedConsensuses.lastKey() < tooOldMillis) {
- this.logger.warning("The last known relay network status "
+ this.logger.warn("The last known relay network status "
+ "consensus was valid after "
+ dateTimeFormat.format(this.storedConsensuses.lastKey())
+ ", which is more than 5:30 hours in the past.");
}
if (!this.storedMicrodescConsensuses.isEmpty()
&& this.storedMicrodescConsensuses.lastKey() < tooOldMillis) {
- this.logger.warning("The last known relay network status "
+ this.logger.warn("The last known relay network status "
+ "microdesc consensus was valid after "
+ dateTimeFormat.format(
this.storedMicrodescConsensuses.lastKey())
@@ -536,28 +537,28 @@ public class ArchiveWriter extends Thread {
}
if (!this.storedVotes.isEmpty()
&& this.storedVotes.lastKey() < tooOldMillis) {
- this.logger.warning("The last known relay network status vote "
+ this.logger.warn("The last known relay network status vote "
+ "was valid after " + dateTimeFormat.format(
this.storedVotes.lastKey()) + ", which is more than 5:30 hours "
+ "in the past.");
}
if (!this.storedServerDescriptors.isEmpty()
&& this.storedServerDescriptors.lastKey() < tooOldMillis) {
- this.logger.warning("The last known relay server descriptor was "
+ this.logger.warn("The last known relay server descriptor was "
+ "published at "
+ dateTimeFormat.format(this.storedServerDescriptors.lastKey())
+ ", which is more than 5:30 hours in the past.");
}
if (!this.storedExtraInfoDescriptors.isEmpty()
&& this.storedExtraInfoDescriptors.lastKey() < tooOldMillis) {
- this.logger.warning("The last known relay extra-info descriptor "
+ this.logger.warn("The last known relay extra-info descriptor "
+ "was published at " + dateTimeFormat.format(
this.storedExtraInfoDescriptors.lastKey())
+ ", which is more than 5:30 hours in the past.");
}
if (!this.storedMicrodescriptors.isEmpty()
&& this.storedMicrodescriptors.lastKey() < tooOldMillis) {
- this.logger.warning("The last known relay microdescriptor was "
+ this.logger.warn("The last known relay microdescriptor was "
+ "contained in a microdesc consensus that was valid after "
+ dateTimeFormat.format(this.storedMicrodescriptors.lastKey())
+ ", which is more than 5:30 hours in the past.");
@@ -637,7 +638,7 @@ public class ArchiveWriter extends Thread {
}
bw.close();
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Could not save descriptor "
+ this.logger.warn("Could not save descriptor "
+ "digests. We might not be able to correctly check "
+ "descriptors for completeness in the next run.", e);
}
@@ -825,7 +826,7 @@ public class ArchiveWriter extends Thread {
private boolean store(byte[] typeAnnotation, byte[] data,
File[] outputFiles, boolean[] append) {
try {
- this.logger.finer("Storing " + outputFiles[0]);
+ this.logger.trace("Storing " + outputFiles[0]);
if (this.descriptorParser.parseDescriptors(data,
outputFiles[0].getName()).size() != 1) {
this.logger.info("Relay descriptor file " + outputFiles[0]
@@ -846,10 +847,10 @@ public class ArchiveWriter extends Thread {
}
return true;
} catch (DescriptorParseException e) {
- this.logger.log(Level.WARNING, "Could not parse relay descriptor "
+ this.logger.warn("Could not parse relay descriptor "
+ outputFiles[0] + " before storing it to disk. Skipping.", e);
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Could not store relay descriptor "
+ this.logger.warn("Could not store relay descriptor "
+ outputFiles[0], e);
}
return false;
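One thing the conversion above deliberately leaves alone: messages are still built by string concatenation (e.g. `"Could not store relay descriptor " + outputFiles[0]`), which is evaluated even when the level is disabled. SLF4J's `{}` placeholders defer that cost. The substitution SLF4J performs can be sketched with the standard library alone (simplified; the real `MessageFormatter` also handles escaped braces and arrays):

```java
/** Simplified sketch of SLF4J-style "{}" placeholder substitution,
 *  an alternative to the string concatenation kept in this commit. */
public class PlaceholderFormat {
    static String format(String pattern, Object... args) {
        StringBuilder sb = new StringBuilder();
        int from = 0;
        for (Object arg : args) {
            int at = pattern.indexOf("{}", from);
            if (at < 0) {
                break;  // more args than placeholders: ignore extras
            }
            sb.append(pattern, from, at).append(arg);
            from = at + 2;  // skip past "{}"
        }
        sb.append(pattern.substring(from));
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(format("Could not store relay descriptor {}", "file-0"));
    }
}
```

Switching call sites to placeholders would be a follow-up change; this commit keeps the existing message-building style to stay a pure logging-API swap.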
diff --git a/src/main/java/org/torproject/collector/relaydescs/CachedRelayDescriptorReader.java b/src/main/java/org/torproject/collector/relaydescs/CachedRelayDescriptorReader.java
index 00eeab1..6bee6d6 100644
--- a/src/main/java/org/torproject/collector/relaydescs/CachedRelayDescriptorReader.java
+++ b/src/main/java/org/torproject/collector/relaydescs/CachedRelayDescriptorReader.java
@@ -6,6 +6,9 @@ package org.torproject.collector.relaydescs;
import org.apache.commons.codec.binary.Hex;
import org.apache.commons.codec.digest.DigestUtils;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
import java.io.BufferedInputStream;
import java.io.BufferedReader;
import java.io.BufferedWriter;
@@ -26,8 +29,6 @@ import java.util.SortedSet;
import java.util.Stack;
import java.util.TimeZone;
import java.util.TreeSet;
-import java.util.logging.Level;
-import java.util.logging.Logger;
/**
* Parses all descriptors in local directory cacheddesc/ and sorts them
@@ -44,8 +45,7 @@ public class CachedRelayDescriptorReader {
StringBuilder dumpStats = new StringBuilder("Finished importing "
+ "relay descriptors from local Tor data directories:");
- Logger logger = Logger.getLogger(
- CachedRelayDescriptorReader.class.getName());
+ Logger logger = LoggerFactory.getLogger(CachedRelayDescriptorReader.class);
/* Read import history containing SHA-1 digests of previously parsed
* statuses and descriptors, so that we can skip them in this run. */
@@ -63,7 +63,7 @@ public class CachedRelayDescriptorReader {
}
br.close();
} catch (IOException e) {
- logger.log(Level.WARNING, "Could not read import history from "
+ logger.warn("Could not read import history from "
+ importHistoryFile.getAbsolutePath() + ".", e);
}
}
@@ -72,11 +72,11 @@ public class CachedRelayDescriptorReader {
for (String inputDirectory : inputDirectories) {
File cachedDescDir = new File(inputDirectory);
if (!cachedDescDir.exists()) {
- logger.warning("Directory " + cachedDescDir.getAbsolutePath()
+ logger.warn("Directory " + cachedDescDir.getAbsolutePath()
+ " does not exist. Skipping.");
continue;
}
- logger.fine("Reading " + cachedDescDir.getAbsolutePath()
+ logger.debug("Reading " + cachedDescDir.getAbsolutePath()
+ " directory.");
SortedSet<File> cachedDescFiles = new TreeSet<File>();
Stack<File> files = new Stack<File>();
@@ -118,7 +118,7 @@ public class CachedRelayDescriptorReader {
if (dateTimeFormat.parse(line.substring("valid-after "
.length())).getTime() < System.currentTimeMillis()
- 6L * 60L * 60L * 1000L) {
- logger.warning("Cached descriptor files in "
+ logger.warn("Cached descriptor files in "
+ cachedDescDir.getAbsolutePath() + " are stale. "
+ "The valid-after line in cached-consensus is '"
+ line + "'.");
@@ -224,14 +224,14 @@ public class CachedRelayDescriptorReader {
? "server" : "extra-info") + " descriptors");
}
} catch (IOException e) {
- logger.log(Level.WARNING, "Failed reading "
+ logger.warn("Failed reading "
+ cachedDescDir.getAbsolutePath() + " directory.", e);
} catch (ParseException e) {
- logger.log(Level.WARNING, "Failed reading "
+ logger.warn("Failed reading "
+ cachedDescDir.getAbsolutePath() + " directory.", e);
}
}
- logger.fine("Finished reading "
+ logger.debug("Finished reading "
+ cachedDescDir.getAbsolutePath() + " directory.");
}
@@ -245,7 +245,7 @@ public class CachedRelayDescriptorReader {
}
bw.close();
} catch (IOException e) {
- logger.log(Level.WARNING, "Could not write import history to "
+ logger.warn("Could not write import history to "
+ importHistoryFile.getAbsolutePath() + ".", e);
}
diff --git a/src/main/java/org/torproject/collector/relaydescs/ReferenceChecker.java b/src/main/java/org/torproject/collector/relaydescs/ReferenceChecker.java
index 9f0f183..0255163 100644
--- a/src/main/java/org/torproject/collector/relaydescs/ReferenceChecker.java
+++ b/src/main/java/org/torproject/collector/relaydescs/ReferenceChecker.java
@@ -17,6 +17,9 @@ import org.torproject.descriptor.ServerDescriptor;
import com.google.gson.Gson;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
@@ -31,12 +34,10 @@ import java.util.Set;
import java.util.SortedSet;
import java.util.TimeZone;
import java.util.TreeSet;
-import java.util.logging.Level;
-import java.util.logging.Logger;
public class ReferenceChecker {
- private Logger log = Logger.getLogger(ReferenceChecker.class.getName());
+ private Logger log = LoggerFactory.getLogger(ReferenceChecker.class);
private File descriptorsDir;
@@ -141,7 +142,7 @@ public class ReferenceChecker {
Reference[].class)));
fr.close();
} catch (IOException e) {
- this.log.log(Level.WARNING, "Cannot read existing references file "
+ this.log.warn("Cannot read existing references file "
+ "from previous run.", e);
}
}
@@ -297,9 +298,9 @@ public class ReferenceChecker {
totalMissingDescriptorsWeight));
}
}
- this.log.log(Level.INFO, sb.toString());
+ this.log.info(sb.toString());
if (totalMissingDescriptorsWeight > 0.999) {
- this.log.log(Level.WARNING, "Missing too many referenced "
+ this.log.warn("Missing too many referenced "
+ "descriptors (" + totalMissingDescriptorsWeight + ").");
}
}
@@ -311,7 +312,7 @@ public class ReferenceChecker {
gson.toJson(this.references, fw);
fw.close();
} catch (IOException e) {
- this.log.log(Level.WARNING, "Cannot write references file for next "
+ this.log.warn("Cannot write references file for next "
+ "run.", e);
}
}
diff --git a/src/main/java/org/torproject/collector/relaydescs/RelayDescriptorDownloader.java b/src/main/java/org/torproject/collector/relaydescs/RelayDescriptorDownloader.java
index bd0a482..fe3d504 100644
--- a/src/main/java/org/torproject/collector/relaydescs/RelayDescriptorDownloader.java
+++ b/src/main/java/org/torproject/collector/relaydescs/RelayDescriptorDownloader.java
@@ -6,6 +6,9 @@ package org.torproject.collector.relaydescs;
import org.apache.commons.codec.binary.Base64;
import org.apache.commons.codec.digest.DigestUtils;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
import java.io.BufferedInputStream;
import java.io.BufferedReader;
import java.io.BufferedWriter;
@@ -31,8 +34,6 @@ import java.util.SortedSet;
import java.util.TimeZone;
import java.util.TreeMap;
import java.util.TreeSet;
-import java.util.logging.Level;
-import java.util.logging.Logger;
import java.util.zip.InflaterInputStream;
/**
@@ -319,8 +320,7 @@ public class RelayDescriptorDownloader {
Collections.shuffle(this.authorities);
/* Initialize logger. */
- this.logger = Logger.getLogger(
- RelayDescriptorDownloader.class.getName());
+ this.logger = LoggerFactory.getLogger(RelayDescriptorDownloader.class);
/* Prepare cut-off times and timestamp for the missing descriptors
* list and the list of authorities to download all server and
@@ -345,7 +345,7 @@ public class RelayDescriptorDownloader {
"stats/missing-relay-descriptors");
if (this.missingDescriptorsFile.exists()) {
try {
- this.logger.fine("Reading file "
+ this.logger.debug("Reading file "
+ this.missingDescriptorsFile.getAbsolutePath() + "...");
BufferedReader br = new BufferedReader(new FileReader(
this.missingDescriptorsFile));
@@ -396,16 +396,16 @@ public class RelayDescriptorDownloader {
}
}
} else {
- this.logger.fine("Invalid line '" + line + "' in "
+ this.logger.debug("Invalid line '" + line + "' in "
+ this.missingDescriptorsFile.getAbsolutePath()
+ ". Ignoring.");
}
}
br.close();
- this.logger.fine("Finished reading file "
+ this.logger.debug("Finished reading file "
+ this.missingDescriptorsFile.getAbsolutePath() + ".");
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Failed to read file "
+ this.logger.warn("Failed to read file "
+ this.missingDescriptorsFile.getAbsolutePath()
+ "! This means that we might forget to download relay "
+ "descriptors we are missing.", e);
@@ -419,7 +419,7 @@ public class RelayDescriptorDownloader {
"stats/last-downloaded-all-descriptors");
if (this.lastDownloadedAllDescriptorsFile.exists()) {
try {
- this.logger.fine("Reading file "
+ this.logger.debug("Reading file "
+ this.lastDownloadedAllDescriptorsFile.getAbsolutePath()
+ "...");
BufferedReader br = new BufferedReader(new FileReader(
@@ -427,7 +427,7 @@ public class RelayDescriptorDownloader {
String line;
while ((line = br.readLine()) != null) {
if (line.split(",").length != 2) {
- this.logger.fine("Invalid line '" + line + "' in "
+ this.logger.debug("Invalid line '" + line + "' in "
+ this.lastDownloadedAllDescriptorsFile.getAbsolutePath()
+ ". Ignoring.");
} else {
@@ -439,11 +439,11 @@ public class RelayDescriptorDownloader {
}
}
br.close();
- this.logger.fine("Finished reading file "
+ this.logger.debug("Finished reading file "
+ this.lastDownloadedAllDescriptorsFile.getAbsolutePath()
+ ".");
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Failed to read file "
+ this.logger.warn("Failed to read file "
+ this.lastDownloadedAllDescriptorsFile.getAbsolutePath()
+ "! This means that we might download all server and "
+ "extra-info descriptors more often than we should.", e);
@@ -842,8 +842,7 @@ public class RelayDescriptorDownloader {
/* If a download failed, stop requesting descriptors from this
* authority and move on to the next. */
} catch (IOException e) {
- logger.log(Level.FINE, "Failed downloading from " + authority
- + "!", e);
+ logger.debug("Failed downloading from " + authority + "!", e);
}
}
}
@@ -886,7 +885,7 @@ public class RelayDescriptorDownloader {
in.close();
allData = baos.toByteArray();
}
- logger.fine("Downloaded " + fullUrl + " -> " + response + " ("
+ logger.debug("Downloaded " + fullUrl + " -> " + response + " ("
+ (allData == null ? 0 : allData.length) + " bytes)");
int receivedDescriptors = 0;
if (allData != null) {
@@ -980,7 +979,7 @@ public class RelayDescriptorDownloader {
this.rdp.storeMicrodescriptor(descBytes, digest256Hex,
digest256Base64, validAfter);
} catch (ParseException e) {
- this.logger.log(Level.WARNING, "Could not parse "
+ this.logger.warn("Could not parse "
+ "valid-after time '" + validAfterTime + "' in "
+ "microdescriptor key. Not storing microdescriptor.",
e);
@@ -1006,7 +1005,7 @@ public class RelayDescriptorDownloader {
int missingServerDescriptors = 0;
int missingExtraInfoDescriptors = 0;
try {
- this.logger.fine("Writing file "
+ this.logger.debug("Writing file "
+ this.missingDescriptorsFile.getAbsolutePath() + "...");
this.missingDescriptorsFile.getParentFile().mkdirs();
BufferedWriter bw = new BufferedWriter(new FileWriter(
@@ -1033,10 +1032,10 @@ public class RelayDescriptorDownloader {
bw.write(key + "," + value + "\n");
}
bw.close();
- this.logger.fine("Finished writing file "
+ this.logger.debug("Finished writing file "
+ this.missingDescriptorsFile.getAbsolutePath() + ".");
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Failed writing "
+ this.logger.warn("Failed writing "
+ this.missingDescriptorsFile.getAbsolutePath() + "!", e);
}
int missingMicrodescriptors = this.missingMicrodescriptors.size();
@@ -1045,7 +1044,7 @@ public class RelayDescriptorDownloader {
* last downloaded all server and extra-info descriptors from them to
* disk. */
try {
- this.logger.fine("Writing file "
+ this.logger.debug("Writing file "
+ this.lastDownloadedAllDescriptorsFile.getAbsolutePath()
+ "...");
this.lastDownloadedAllDescriptorsFile.getParentFile().mkdirs();
@@ -1058,11 +1057,11 @@ public class RelayDescriptorDownloader {
bw.write(authority + "," + lastDownloaded + "\n");
}
bw.close();
- this.logger.fine("Finished writing file "
+ this.logger.debug("Finished writing file "
+ this.lastDownloadedAllDescriptorsFile.getAbsolutePath()
+ ".");
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Failed writing "
+ this.logger.warn("Failed writing "
+ this.lastDownloadedAllDescriptorsFile.getAbsolutePath() + "!",
e);
}
diff --git a/src/main/java/org/torproject/collector/relaydescs/RelayDescriptorParser.java b/src/main/java/org/torproject/collector/relaydescs/RelayDescriptorParser.java
index 3f9b912..125b32a 100644
--- a/src/main/java/org/torproject/collector/relaydescs/RelayDescriptorParser.java
+++ b/src/main/java/org/torproject/collector/relaydescs/RelayDescriptorParser.java
@@ -7,6 +7,9 @@ import org.apache.commons.codec.binary.Base64;
import org.apache.commons.codec.binary.Hex;
import org.apache.commons.codec.digest.DigestUtils;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
@@ -15,8 +18,6 @@ import java.text.SimpleDateFormat;
import java.util.SortedSet;
import java.util.TimeZone;
import java.util.TreeSet;
-import java.util.logging.Level;
-import java.util.logging.Logger;
/**
* Parses relay descriptors including network status consensuses and
@@ -54,7 +55,7 @@ public class RelayDescriptorParser {
this.aw = aw;
/* Initialize logger. */
- this.logger = Logger.getLogger(RelayDescriptorParser.class.getName());
+ this.logger = LoggerFactory.getLogger(RelayDescriptorParser.class);
this.dateTimeFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
this.dateTimeFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
@@ -82,7 +83,7 @@ public class RelayDescriptorParser {
line = br.readLine();
} while (line != null && line.startsWith("@"));
if (line == null) {
- this.logger.fine("We were given an empty descriptor for "
+ this.logger.debug("We were given an empty descriptor for "
+ "parsing. Ignoring.");
return false;
}
@@ -150,7 +151,7 @@ public class RelayDescriptorParser {
+ lastRelayIdentity + "," + serverDesc);
serverDescriptorDigests.add(serverDesc);
} else {
- this.logger.log(Level.WARNING, "Could not parse r line '"
+ this.logger.warn("Could not parse r line '"
+ line + "' in descriptor. Skipping.");
break;
}
@@ -167,7 +168,7 @@ public class RelayDescriptorParser {
} else if (parts.length != 3
|| !parts[2].startsWith("sha256=")
|| parts[2].length() != 50) {
- this.logger.log(Level.WARNING, "Could not parse m line '"
+ this.logger.warn("Could not parse m line '"
+ line + "' in descriptor. Skipping.");
break;
}
@@ -315,10 +316,10 @@ public class RelayDescriptorParser {
}
br.close();
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Could not parse descriptor. "
+ this.logger.warn("Could not parse descriptor. "
+ "Skipping.", e);
} catch (ParseException e) {
- this.logger.log(Level.WARNING, "Could not parse descriptor. "
+ this.logger.warn("Could not parse descriptor. "
+ "Skipping.", e);
}
return stored;
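A side note on logic that this commit touches only at the logging level: the "more than 5:30 hours in the past" warnings in ArchiveWriter above and TorperfDownloader below all derive from the same cutoff, `330L * 60L * 1000L` milliseconds. A minimal sketch of that staleness check (hypothetical class name, for illustration only):

```java
/** The staleness checks in ArchiveWriter and TorperfDownloader both
 *  use a cutoff of 330 minutes (5:30 hours), expressed in milliseconds. */
public class StalenessCheck {
    static final long TOO_OLD_MILLIS = 330L * 60L * 1000L;

    /** True when lastSeenMillis is more than 5:30 hours before nowMillis. */
    static boolean isStale(long lastSeenMillis, long nowMillis) {
        return lastSeenMillis < nowMillis - TOO_OLD_MILLIS;
    }

    public static void main(String[] args) {
        long now = System.currentTimeMillis();
        // A descriptor last seen six hours ago is past the 5:30 cutoff.
        System.out.println(isStale(now - 6L * 60L * 60L * 1000L, now));
    }
}
```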
diff --git a/src/main/java/org/torproject/collector/torperf/TorperfDownloader.java b/src/main/java/org/torproject/collector/torperf/TorperfDownloader.java
index c80f99e..53b1523 100644
--- a/src/main/java/org/torproject/collector/torperf/TorperfDownloader.java
+++ b/src/main/java/org/torproject/collector/torperf/TorperfDownloader.java
@@ -8,6 +8,9 @@ import org.torproject.collector.conf.ConfigurationException;
import org.torproject.collector.conf.Key;
import org.torproject.collector.main.LockFile;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
@@ -26,14 +29,12 @@ import java.util.SortedMap;
import java.util.Stack;
import java.util.TimeZone;
import java.util.TreeMap;
-import java.util.logging.Level;
-import java.util.logging.Logger;
/* Download possibly truncated Torperf .data and .extradata files from
* configured sources, append them to the files we already have, and merge
* the two files into the .tpf format. */
public class TorperfDownloader extends Thread {
- private static Logger logger = Logger.getLogger(TorperfDownloader.class.getName());
+ private static Logger logger = LoggerFactory.getLogger(TorperfDownloader.class);
public static void main(Configuration config) throws ConfigurationException {
logger.info("Starting torperf module of CollecTor.");
@@ -66,7 +67,7 @@ public class TorperfDownloader extends Thread {
try {
startProcessing();
} catch (ConfigurationException ce) {
- logger.severe("Configuration failed: " + ce);
+ logger.error("Configuration failed: " + ce, ce);
throw new RuntimeException(ce);
}
}
@@ -120,7 +121,7 @@ public class TorperfDownloader extends Thread {
}
}
if (fileName == null || timestamp == null) {
- this.logger.log(Level.WARNING, "Invalid line '" + line + "' in "
+ this.logger.warn("Invalid line '" + line + "' in "
+ this.torperfLastMergedFile.getAbsolutePath() + ". "
+ "Ignoring past history of merging .data and .extradata "
+ "files.");
@@ -131,7 +132,7 @@ public class TorperfDownloader extends Thread {
}
br.close();
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Error while reading '"
+ this.logger.warn("Error while reading '"
+ this.torperfLastMergedFile.getAbsolutePath() + ". Ignoring "
+ "past history of merging .data and .extradata files.");
this.lastMergedTimestamps.clear();
@@ -151,7 +152,7 @@ public class TorperfDownloader extends Thread {
}
bw.close();
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Error while writing '"
+ this.logger.warn("Error while writing '"
+ this.torperfLastMergedFile.getAbsolutePath() + ". This may "
+ "result in ignoring history of merging .data and .extradata "
+ "files in the next execution.", e);
@@ -165,9 +166,9 @@ public class TorperfDownloader extends Thread {
try {
fileSize = Integer.parseInt(parts[1]);
} catch (NumberFormatException e) {
- this.logger.log(Level.WARNING, "Could not parse file size in "
+ this.logger.warn("Could not parse file size in "
+ "TorperfFiles configuration line '" + torperfFilesLine
- + "'.");
+ + "'.", e);
return;
}
@@ -202,7 +203,7 @@ public class TorperfDownloader extends Thread {
skipUntil = this.mergeFiles(dataOutputFile, extradataOutputFile,
sourceName, fileSize, skipUntil);
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Failed merging " + dataOutputFile
+ this.logger.warn("Failed merging " + dataOutputFile
+ " and " + extradataOutputFile + ".", e);
}
if (skipUntil != null) {
@@ -232,14 +233,14 @@ public class TorperfDownloader extends Thread {
}
br.close();
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Failed reading '"
+ this.logger.warn("Failed reading '"
+ outputFile.getAbsolutePath() + "' to determine the first "
+ "line to append to it.", e);
return false;
}
}
try {
- this.logger.fine("Downloading " + (isDataFile ? ".data" :
+ this.logger.debug("Downloading " + (isDataFile ? ".data" :
".extradata") + " file from '" + url + "' and merging it into "
+ "'" + outputFile.getAbsolutePath() + "'.");
URL u = new URL(url);
@@ -267,19 +268,19 @@ public class TorperfDownloader extends Thread {
bw.close();
br.close();
if (!copyLines) {
- this.logger.warning("The last timestamp line in '"
+ this.logger.warn("The last timestamp line in '"
+ outputFile.getAbsolutePath() + "' is not contained in the "
+ "new file downloaded from '" + url + "'. Cannot append "
+ "new lines without possibly leaving a gap. Skipping.");
return false;
}
} catch (IOException e) {
- this.logger.log(Level.WARNING, "Failed downloading and/or merging '"
+ this.logger.warn("Failed downloading and/or merging '"
+ url + "'.", e);
return false;
}
if (lastTimestampLine == null) {
- this.logger.warning("'" + outputFile.getAbsolutePath()
+ this.logger.warn("'" + outputFile.getAbsolutePath()
+ "' doesn't contain any timestamp lines. Unable to check "
+ "whether that file is stale or not.");
} else {
@@ -295,7 +296,7 @@ public class TorperfDownloader extends Thread {
}
if (lastTimestampMillis < System.currentTimeMillis()
- 330L * 60L * 1000L) {
- this.logger.warning("The last timestamp in '"
+ this.logger.warn("The last timestamp in '"
+ outputFile.getAbsolutePath() + "' is more than 5:30 hours "
+ "old: " + lastTimestampMillis);
}
@@ -309,11 +310,11 @@ public class TorperfDownloader extends Thread {
config.put("SOURCE", source);
config.put("FILESIZE", String.valueOf(fileSize));
if (!dataFile.exists() || !extradataFile.exists()) {
- this.logger.warning("File " + dataFile.getAbsolutePath() + " or "
+ this.logger.warn("File " + dataFile.getAbsolutePath() + " or "
+ extradataFile.getAbsolutePath() + " is missing.");
return null;
}
- this.logger.fine("Merging " + dataFile.getAbsolutePath() + " and "
+ this.logger.debug("Merging " + dataFile.getAbsolutePath() + " and "
+ extradataFile.getAbsolutePath() + " into .tpf format.");
BufferedReader brD = new BufferedReader(new FileReader(dataFile));
BufferedReader brE = new BufferedReader(new FileReader(extradataFile));
@@ -329,14 +330,14 @@ public class TorperfDownloader extends Thread {
* format, either with additional information from the .extradata
* file or without it. */
if (lineD.isEmpty()) {
- this.logger.finer("Skipping empty line " + dataFile.getName()
+ this.logger.trace("Skipping empty line " + dataFile.getName()
+ ":" + d++ + ".");
lineD = brD.readLine();
continue;
}
SortedMap<String, String> data = this.parseDataLine(lineD);
if (data == null) {
- this.logger.finer("Skipping illegal line " + dataFile.getName()
+ this.logger.trace("Skipping illegal line " + dataFile.getName()
+ ":" + d++ + " '" + lineD + "'.");
lineD = brD.readLine();
continue;
@@ -344,7 +345,7 @@ public class TorperfDownloader extends Thread {
String dataComplete = data.get("DATACOMPLETE");
double dataCompleteSeconds = Double.parseDouble(dataComplete);
if (skipUntil != null && dataComplete.compareTo(skipUntil) < 0) {
- this.logger.finer("Skipping " + dataFile.getName() + ":"
+ this.logger.trace("Skipping " + dataFile.getName() + ":"
+ d++ + " which we already processed before.");
lineD = brD.readLine();
continue;
@@ -356,33 +357,33 @@ public class TorperfDownloader extends Thread {
SortedMap<String, String> extradata = null;
while (lineE != null) {
if (lineE.isEmpty()) {
- this.logger.finer("Skipping " + extradataFile.getName() + ":"
+ this.logger.trace("Skipping " + extradataFile.getName() + ":"
+ e++ + " which is empty.");
lineE = brE.readLine();
continue;
}
if (lineE.startsWith("BUILDTIMEOUT_SET ")) {
- this.logger.finer("Skipping " + extradataFile.getName() + ":"
+ this.logger.trace("Skipping " + extradataFile.getName() + ":"
+ e++ + " which is a BUILDTIMEOUT_SET line.");
lineE = brE.readLine();
continue;
} else if (lineE.startsWith("ok ")
|| lineE.startsWith("error ")) {
- this.logger.finer("Skipping " + extradataFile.getName() + ":"
+ this.logger.trace("Skipping " + extradataFile.getName() + ":"
+ e++ + " which is in the old format.");
lineE = brE.readLine();
continue;
}
extradata = this.parseExtradataLine(lineE);
if (extradata == null) {
- this.logger.finer("Skipping Illegal line "
+ this.logger.trace("Skipping Illegal line "
+ extradataFile.getName() + ":" + e++ + " '" + lineE
+ "'.");
lineE = brE.readLine();
continue;
}
if (!extradata.containsKey("USED_AT")) {
- this.logger.finer("Skipping " + extradataFile.getName() + ":"
+ this.logger.trace("Skipping " + extradataFile.getName() + ":"
+ e++ + " which doesn't contain a USED_AT element.");
lineE = brE.readLine();
continue;
@@ -390,24 +391,24 @@ public class TorperfDownloader extends Thread {
String usedAt = extradata.get("USED_AT");
double usedAtSeconds = Double.parseDouble(usedAt);
if (skipUntil != null && usedAt.compareTo(skipUntil) < 0) {
- this.logger.finer("Skipping " + extradataFile.getName() + ":"
+ this.logger.trace("Skipping " + extradataFile.getName() + ":"
+ e++ + " which we already processed before.");
lineE = brE.readLine();
continue;
}
maxUsedAt = usedAt;
if (Math.abs(usedAtSeconds - dataCompleteSeconds) <= 1.0) {
- this.logger.fine("Merging " + extradataFile.getName() + ":"
+ this.logger.debug("Merging " + extradataFile.getName() + ":"
+ e++ + " into the current .data line.");
lineE = brE.readLine();
break;
} else if (usedAtSeconds > dataCompleteSeconds) {
- this.logger.finer("Comparing " + extradataFile.getName()
+ this.logger.trace("Comparing " + extradataFile.getName()
+ " to the next .data line.");
extradata = null;
break;
} else {
- this.logger.finer("Skipping " + extradataFile.getName() + ":"
+ this.logger.trace("Skipping " + extradataFile.getName() + ":"
+ e++ + " which is too old to be merged with "
+ dataFile.getName() + ":" + d + ".");
lineE = brE.readLine();
@@ -423,12 +424,12 @@ public class TorperfDownloader extends Thread {
}
keysAndValues.putAll(data);
keysAndValues.putAll(config);
- this.logger.fine("Writing " + dataFile.getName() + ":" + d++ + ".");
+ this.logger.debug("Writing " + dataFile.getName() + ":" + d++ + ".");
lineD = brD.readLine();
try {
this.writeTpfLine(source, fileSize, keysAndValues);
} catch (IOException ex) {
- this.logger.log(Level.WARNING, "Error writing output line. "
+ this.logger.warn("Error writing output line. "
+ "Aborting to merge " + dataFile.getName() + " and "
+ extradataFile.getName() + ".", ex);
break;
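The commit above systematically swaps java.util.logging calls for slf4j ones (warning → warn, fine → debug, finer → trace). That level correspondence can be sketched as follows — `mapJulToSlf4j` is our illustrative helper, not part of either library:

```java
import java.util.Map;
import java.util.logging.Level;

public class LevelMapping {
    // Sketch of the java.util.logging -> slf4j level correspondence
    // applied in this commit. CONFIG has no slf4j counterpart and is
    // conventionally folded into info; FINEST folds into trace.
    public static String mapJulToSlf4j(Level julLevel) {
        Map<Level, String> mapping = Map.of(
            Level.SEVERE, "error",
            Level.WARNING, "warn",
            Level.INFO, "info",
            Level.CONFIG, "info",
            Level.FINE, "debug",
            Level.FINER, "trace",
            Level.FINEST, "trace");
        return mapping.get(julLevel);
    }
}
```

This matches each hunk above: `logger.warning(...)` becomes `logger.warn(...)`, `logger.fine(...)` becomes `logger.debug(...)`, and `logger.finer(...)` becomes `logger.trace(...)`.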
diff --git a/src/main/resources/logback.xml b/src/main/resources/logback.xml
new file mode 100644
index 0000000..1b78d58
--- /dev/null
+++ b/src/main/resources/logback.xml
@@ -0,0 +1,126 @@
+<configuration debug="false">
+
+ <!-- a path and a prefix -->
+ <property name="logfile-base" value="${LOGBASE}/collector-" />
+
+ <!-- log file names -->
+ <property name="fileall-logname" value="${logfile-base}all" />
+ <property name="file-bridgedescs-logname" value="${logfile-base}bridgedescs" />
+ <property name="file-exitlists-logname" value="${logfile-base}exitlists" />
+ <property name="file-relaydescs-logname" value="${logfile-base}relaydescs" />
+ <property name="file-torperf-logname" value="${logfile-base}torperf" />
+ <property name="file-updateindex-logname" value="${logfile-base}updateindex" />
+
+ <!-- date pattern -->
+ <property name="utc-date-pattern" value="%date{ISO8601, UTC}" />
+
+ <!-- appender section -->
+ <appender name="FILEALL" class="ch.qos.logback.core.rolling.RollingFileAppender">
+ <file>${fileall-logname}.log</file>
+ <encoder>
+ <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
+ </encoder>
+ <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
+ <!-- rollover daily -->
+ <FileNamePattern>${fileall-logname}.%d{yyyy-MM-dd}.%i.log</FileNamePattern>
+ <maxHistory>10</maxHistory>
+ <timeBasedFileNamingAndTriggeringPolicy
+ class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
+ <!-- or whenever the file size reaches 1MB -->
+ <maxFileSize>1MB</maxFileSize>
+ </timeBasedFileNamingAndTriggeringPolicy>
+ </rollingPolicy>
+ </appender>
+
+ <appender name="FILEBRIDGEDESCS" class="ch.qos.logback.core.FileAppender">
+ <file>${file-bridgedescs-logname}.log</file>
+ <encoder>
+ <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
+ </encoder>
+
+ <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
+ <level>TRACE</level>
+ </filter>
+ </appender>
+
+ <appender name="FILEEXITLISTS" class="ch.qos.logback.core.FileAppender">
+ <file>${file-exitlists-logname}.log</file>
+ <encoder>
+ <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
+ </encoder>
+
+ <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
+ <level>TRACE</level>
+ </filter>
+ </appender>
+
+ <appender name="FILERELAYDESCS" class="ch.qos.logback.core.FileAppender">
+ <file>${file-relaydescs-logname}.log</file>
+ <encoder>
+ <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
+ </encoder>
+
+ <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
+ <level>TRACE</level>
+ </filter>
+ </appender>
+
+ <appender name="FILETORPERF" class="ch.qos.logback.core.FileAppender">
+ <file>${file-torperf-logname}.log</file>
+ <encoder>
+ <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
+ </encoder>
+
+ <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
+ <level>TRACE</level>
+ </filter>
+ </appender>
+
+ <appender name="FILEUPDATEINDEX" class="ch.qos.logback.core.FileAppender">
+ <file>${file-updateindex-logname}.log</file>
+ <encoder>
+ <pattern>${utc-date-pattern} %level %logger{20}:%line %msg%n</pattern>
+ </encoder>
+
+ <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
+ <level>TRACE</level>
+ </filter>
+ </appender>
+
+ <!-- logger section -->
+ <logger name="org.torproject.collector.bridgedescs" >
+ <appender-ref ref="FILEBRIDGEDESCS" />
+ </logger>
+
+ <logger name="org.torproject.collector.exitlists" >
+ <appender-ref ref="FILEEXITLISTS" />
+ </logger>
+
+ <logger name="org.torproject.collector.relaydescs" >
+ <appender-ref ref="FILERELAYDESCS" />
+ </logger>
+
+ <logger name="org.torproject.collector.torperf" >
+ <appender-ref ref="FILETORPERF" />
+ </logger>
+
+ <logger name="org.torproject.collector.index" >
+ <appender-ref ref="FILEUPDATEINDEX" />
+ </logger>
+
+ <logger name="org.torproject.collector.Main" >
+ <appender-ref ref="FILEBRIDGEDESCS" />
+ <appender-ref ref="FILEEXITLISTS" />
+ <appender-ref ref="FILERELAYDESCS" />
+ <appender-ref ref="FILETORPERF" />
+ <appender-ref ref="FILEUPDATEINDEX" />
+ </logger>
+
+ <logger name="sun" level="ERROR" />
+
+ <root level="ALL">
+ <appender-ref ref="FILEALL" />
+ </root>
+
+</configuration>
+
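The `<property>` elements in this config build log-file names by `${...}` substitution: `${LOGBASE}/collector-` becomes `logfile-base`, which in turn prefixes each per-module name such as `${logfile-base}torperf`. A minimal sketch of that expansion step, assuming single-level substitution in definition order (`expand` is our illustrative helper, not a logback API):

```java
import java.util.Map;

public class PropertyExpansion {
    // Sketch of the ${name} substitution logback performs for the
    // properties defined in the config above; each occurrence of
    // "${key}" in the value is replaced by the key's resolved value.
    public static String expand(String value, Map<String, String> props) {
        String result = value;
        for (Map.Entry<String, String> entry : props.entrySet()) {
            result = result.replace("${" + entry.getKey() + "}", entry.getValue());
        }
        return result;
    }
}
```

With `LOGBASE=/srv/collector/logs`, this yields `file-torperf-logname` = `/srv/collector/logs/collector-torperf`, to which the `<file>` element appends `.log`.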

[translation/liveusb-creator] Update translations for liveusb-creator
by translation@torproject.org 06 Jun '16
commit 5c513ca488ef2565176af1c3f82bac88545844c5
Author: Translation commit bot <translation(a)torproject.org>
Date: Mon Jun 6 20:15:19 2016 +0000
Update translations for liveusb-creator
---
lb/lb.po | 36 ++++++++++++++++++------------------
1 file changed, 18 insertions(+), 18 deletions(-)
diff --git a/lb/lb.po b/lb/lb.po
index 045c4c0..5c986c3 100644
--- a/lb/lb.po
+++ b/lb/lb.po
@@ -9,7 +9,7 @@ msgstr ""
"Project-Id-Version: The Tor Project\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2015-11-02 21:23+0100\n"
-"PO-Revision-Date: 2016-06-06 19:44+0000\n"
+"PO-Revision-Date: 2016-06-06 20:15+0000\n"
"Last-Translator: Tyler Durden <virii(a)enn.lu>\n"
"Language-Team: Luxembourgish (http://www.transifex.com/otf/torproject/language/lb/)\n"
"MIME-Version: 1.0\n"
@@ -462,88 +462,88 @@ msgstr "Datei vum leschte LiveOS konnt net geläscht ginn: %(message)s"
#: ../liveusb/creator.py:1189
msgid ""
"Unable to reset MBR. You may not have the `syslinux` package installed."
-msgstr ""
+msgstr "MBR konnt net zeréck gesat ginn. `syslinux` Pak ass vläicht net installéiert."
#: ../liveusb/gui.py:798
msgid ""
"Unable to use the selected file. You may have better luck if you move your "
"ISO to the root of your drive (ie: C:\\)"
-msgstr ""
+msgstr "Net méiglech déi ausgewielten Datei ze benotzen. Dir hutt wuel méi Gléck wann dir är ISO direkt an engem Lafwierk ofleet (zB: C:\\)"
#: ../liveusb/creator.py:723
#, python-format
msgid "Unable to write on %(device)s, skipping."
-msgstr ""
+msgstr "Net méiglech op %(device)s ze schreiwen, iwwersprangen."
#: ../liveusb/creator.py:399
msgid "Unknown ISO, skipping checksum verification"
-msgstr ""
+msgstr "Onbekannten ISO, Checkzomm Verifikatioun gëtt iwwersprongen"
#: ../liveusb/creator.py:810
#, python-format
msgid "Unknown dbus exception while trying to mount device: %(message)s"
-msgstr ""
+msgstr "Onbekannten dbus Exceptioun während dem unhänke vum Medium: %(message)s"
#: ../liveusb/creator.py:791 ../liveusb/creator.py:964
msgid "Unknown filesystem. Your device may need to be reformatted."
-msgstr ""
+msgstr "Onbekanntent Dateisystem. D'Medium muss néi formatéiert ginn."
#: ../liveusb/gui.py:85
#, python-format
msgid "Unknown release: %s"
-msgstr ""
+msgstr "Onbekannt release: %s"
#: ../liveusb/creator.py:851
#, python-format
msgid "Unmounting '%(udi)s' on '%(device)s'"
-msgstr ""
+msgstr "Aushänke vu '%(udi)s' op '%(device)s'"
#: ../liveusb/creator.py:847
#, python-format
msgid "Unmounting mounted filesystems on '%(device)s'"
-msgstr ""
+msgstr "Aushänke vum agehaangenem Dateisystem op '%(device)s'"
#: ../liveusb/creator.py:949
#, python-format
msgid "Unsupported device '%(device)s', please report a bug."
-msgstr ""
+msgstr "Medium net ënnerstëtzt '%(device)s'. Mellt wann ech gelift de Feeler."
#: ../liveusb/creator.py:794 ../liveusb/creator.py:967
#, python-format
msgid "Unsupported filesystem: %s"
-msgstr ""
+msgstr "Dateisystem net ënnerstëtzt: %s"
#: ../liveusb/creator.py:1287
#, python-format
msgid ""
"Unsupported filesystem: %s\n"
"Please backup and format your USB key with the FAT filesystem."
-msgstr ""
+msgstr "Dateisystem net ënnerstëtzt: %s\nMacht ee Backup a formatéiert den USB Stick mat dem FAT Dateisystem."
#: ../liveusb/creator.py:892
#, python-format
msgid "Updating properties of system partition %(system_partition)s"
-msgstr ""
+msgstr "Astellunge vun der System Partitioun %(system_partition)s gi geupdatet"
#: ../liveusb/launcher_ui.py:156
msgid ""
"Upgrade\n"
"by cloning"
-msgstr ""
+msgstr "Upgraden\nmat klonen"
#: ../liveusb/launcher_ui.py:158
msgid ""
"Upgrade\n"
"from ISO"
-msgstr ""
+msgstr "Upgrade\nvun ISO"
#: ../liveusb/dialog.py:159
msgid "Use existing Live system ISO"
-msgstr ""
+msgstr "Existéierend Live System ISO benotzen"
#: ../liveusb/creator.py:143
msgid "Verifying ISO MD5 checksum"
-msgstr ""
+msgstr "ISO MD5 Checkzomm verifizéieren"
#: ../liveusb/creator.py:373
msgid "Verifying SHA1 checksum of LiveCD image..."

[translation/liveusb-creator] Update translations for liveusb-creator
by translation@torproject.org 06 Jun '16
commit 99e4ecf5242d5d4d38e919eb42c6ba861692624f
Author: Translation commit bot <translation(a)torproject.org>
Date: Mon Jun 6 19:45:20 2016 +0000
Update translations for liveusb-creator
---
lb/lb.po | 24 ++++++++++++------------
1 file changed, 12 insertions(+), 12 deletions(-)
diff --git a/lb/lb.po b/lb/lb.po
index 2b01bdb..045c4c0 100644
--- a/lb/lb.po
+++ b/lb/lb.po
@@ -9,7 +9,7 @@ msgstr ""
"Project-Id-Version: The Tor Project\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2015-11-02 21:23+0100\n"
-"PO-Revision-Date: 2016-06-06 18:31+0000\n"
+"PO-Revision-Date: 2016-06-06 19:44+0000\n"
"Last-Translator: Tyler Durden <virii(a)enn.lu>\n"
"Language-Team: Luxembourgish (http://www.transifex.com/otf/torproject/language/lb/)\n"
"MIME-Version: 1.0\n"
@@ -411,53 +411,53 @@ msgstr "USB Medium fonnt"
#: ../liveusb/creator.py:985
#, python-format
msgid "Unable to change volume label: %(message)s"
-msgstr ""
+msgstr "Ännerung vun der Volumebeschreiwung net méiglech: %(message)s"
#: ../liveusb/creator.py:501 ../liveusb/creator.py:512
#, python-format
msgid "Unable to chmod %(file)s: %(message)s"
-msgstr ""
+msgstr "chmod vun %(file)s net méiglech: %(message)s"
#: ../liveusb/creator.py:478
#, python-format
msgid "Unable to copy %(infile)s to %(outfile)s: %(message)s"
-msgstr ""
+msgstr "Kopéiere vun %(infile)s no %(outfile)s ass schif gaangen: %(message)s"
#: ../liveusb/gui.py:403
msgid "Unable to find any USB drive"
-msgstr ""
+msgstr "Keen USB Späichermedium fonnt"
#: ../liveusb/creator.py:1274
msgid "Unable to find any supported device"
-msgstr ""
+msgstr "Et konnt kee Medium fonnt gi wat ënnerstëtzt gëtt"
#: ../liveusb/creator.py:1117
msgid "Unable to find partition"
-msgstr ""
+msgstr "Et gouf keng Partitioun fonnt"
#: ../liveusb/creator.py:1354
msgid ""
"Unable to get Win32_LogicalDisk; win32com query did not return any results"
-msgstr ""
+msgstr "Win32_LogicalDisk konnt net ermëttelt ginn; de win32com Opruff huet keng Resultater bruecht."
#: ../liveusb/gui.py:691
msgid "Unable to mount device"
-msgstr ""
+msgstr "Medium konnt net agehaange ginn"
#: ../liveusb/creator.py:814
#, python-format
msgid "Unable to mount device: %(message)s"
-msgstr ""
+msgstr "Medium konnt net agehaangen ginn: %(message)s"
#: ../liveusb/creator.py:517
#, python-format
msgid "Unable to remove directory from previous LiveOS: %(message)s"
-msgstr ""
+msgstr "Verzeechnes vum leschte LiveOS konnt net geläscht ginn: %(message)s"
#: ../liveusb/creator.py:505
#, python-format
msgid "Unable to remove file from previous LiveOS: %(message)s"
-msgstr ""
+msgstr "Datei vum leschte LiveOS konnt net geläscht ginn: %(message)s"
#: ../liveusb/creator.py:1189
msgid ""

[translation/liveusb-creator] Update translations for liveusb-creator
by translation@torproject.org 06 Jun '16
commit 35b300fe75fa914ce2f37e144a7b89af0e4e9978
Author: Translation commit bot <translation(a)torproject.org>
Date: Mon Jun 6 18:45:19 2016 +0000
Update translations for liveusb-creator
---
lb/lb.po | 14 +++++++-------
1 file changed, 7 insertions(+), 7 deletions(-)
diff --git a/lb/lb.po b/lb/lb.po
index 386943e..2b01bdb 100644
--- a/lb/lb.po
+++ b/lb/lb.po
@@ -9,7 +9,7 @@ msgstr ""
"Project-Id-Version: The Tor Project\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2015-11-02 21:23+0100\n"
-"PO-Revision-Date: 2016-06-06 18:14+0000\n"
+"PO-Revision-Date: 2016-06-06 18:31+0000\n"
"Last-Translator: Tyler Durden <virii(a)enn.lu>\n"
"Language-Team: Luxembourgish (http://www.transifex.com/otf/torproject/language/lb/)\n"
"MIME-Version: 1.0\n"
@@ -382,31 +382,31 @@ msgid ""
"optionally downloading a release (if an existing one wasn't selected), "
"extracting the ISO to the USB device, creating the persistent overlay, and "
"installing the bootloader."
-msgstr ""
+msgstr "Dëse Knäppche start de LiveUSB Kreatiounsprozess. Dat beinhalt en optionalt erofluede vun engem Release (wa keent ausgewielt gouf), d'entpake vun der ISO op en USB Stick, Kreatioun vun engem persistente Späicher an d'Installatioun vum Bootloader."
#: ../liveusb/dialog.py:165
msgid ""
"This is the USB stick that you want to install your Live system on. This "
"device must be formatted with the FAT filesystem."
-msgstr ""
+msgstr "Dat ass den USB Stick op deem dir de Live system wëllt installéieren. D'Medium muss FAT formatéiert sinn."
#: ../liveusb/dialog.py:170
msgid ""
"This is the progress bar that will indicate how far along in the LiveUSB "
"creation process you are"
-msgstr ""
+msgstr "Dat ass d'Fortschrëttsbar déi uweist wei wäit de LiveUSB Prozess am Moment ass"
#: ../liveusb/dialog.py:169
msgid "This is the status console, where all messages get written to."
-msgstr ""
+msgstr "Dëst ass d'Statuskonsol wou all d'Noriichten hi geschriwwe ginn."
#: ../liveusb/creator.py:952
msgid "Trying to continue anyway."
-msgstr ""
+msgstr "Versichen trotzdem weider ze maachen."
#: ../liveusb/gui.py:464
msgid "USB drive found"
-msgstr ""
+msgstr "USB Medium fonnt"
#: ../liveusb/creator.py:985
#, python-format