tor-commits
December 2012
- 17 participants
- 1600 discussions

[ooni-probe/master] Cleanup all the source tree from dead code and cruft
by art@torproject.org 06 Dec '12
commit 5fc1204fd56275e4dde7116fb0db2564a00d55f4
Author: Arturo Filastò <art@fuffa.org>
Date: Thu Dec 6 23:16:26 2012 +0100
Cleanup all the source tree from dead code and cruft
* Create directory for storing test decks
---
before_i_commit.sh | 29 --
before_i_commit.testdeck | 33 --
bin/INSTRUCTIONS | 15 -
bin/Makefile | 54 ---
bin/canary | 27 --
bin/old_ooniprobe | 80 ----
bin/oonib | 4 -
decks/before_i_commit.testdeck | 33 ++
ooni/__init__.py | 3 +-
ooni/config.py | 5 -
ooni/kit/domclass.py | 53 ++--
ooni/lib/__init__.py | 5 -
ooni/lib/secdev.org.pem | 20 -
ooni/nettest.py | 13 +-
ooni/nodes.py | 174 --------
ooni/oonicli.py | 11 +-
ooni/otime.py | 3 -
ooni/reporter.py | 9 -
ooni/runner.py | 9 -
ooni/templates/httpt.py | 4 -
ooni/templates/scapyt.py | 5 -
ooni/utils/__init__.py | 4 -
ooni/utils/geodata.py | 30 --
ooni/utils/hacks.py | 7 -
ooni/utils/log.py | 5 -
ooni/utils/net.py | 13 -
ooni/utils/txscapy.py | 8 -
scripts/before_i_commit.sh | 29 ++
scripts/submit-patch | 100 +++++
submit-patch | 100 -----
to-be-ported/TODO | 418 --------------------
to-be-ported/spec/proxooni-spec.txt | 65 ---
to-be-ported/very-old/TODO.plgoons | 79 ----
to-be-ported/very-old/TO_BE_PORTED | 14 -
to-be-ported/very-old/ooni-probe.diff | 358 -----------------
to-be-ported/very-old/ooni/#namecheck.py# | 39 --
to-be-ported/very-old/ooni/.DS_Store | Bin 15364 -> 0 bytes
to-be-ported/very-old/ooni/__init__.py | 12 -
to-be-ported/very-old/ooni/command.py | 250 ------------
to-be-ported/very-old/ooni/dns_poisoning.py | 43 --
to-be-ported/very-old/ooni/dnsooni.py | 356 -----------------
to-be-ported/very-old/ooni/helpers.py | 38 --
to-be-ported/very-old/ooni/http.py | 306 --------------
to-be-ported/very-old/ooni/input.py | 33 --
to-be-ported/very-old/ooni/namecheck.py | 39 --
.../very-old/ooni/plugins/dnstest_plgoo.py | 84 ----
to-be-ported/very-old/ooni/plugins/http_plgoo.py | 70 ----
to-be-ported/very-old/ooni/plugins/marco_plgoo.py | 377 ------------------
to-be-ported/very-old/ooni/plugins/proxy_plgoo.py | 69 ----
.../very-old/ooni/plugins/simple_dns_plgoo.py | 35 --
to-be-ported/very-old/ooni/plugins/tcpcon_plgoo.py | 278 -------------
to-be-ported/very-old/ooni/plugins/tor.py | 80 ----
to-be-ported/very-old/ooni/plugins/torrc | 9 -
to-be-ported/very-old/ooni/plugooni.py | 106 -----
to-be-ported/very-old/ooni/transparenthttp.py | 41 --
var/old_notes.txt | 418 ++++++++++++++++++++
var/proxooni-spec.txt | 65 +++
var/secdev.org.pem | 20 +
58 files changed, 692 insertions(+), 3895 deletions(-)
diff --git a/before_i_commit.sh b/before_i_commit.sh
deleted file mode 100755
index 8b8180f..0000000
--- a/before_i_commit.sh
+++ /dev/null
@@ -1,29 +0,0 @@
-#!/bin/sh
-# This script should be run before you commit to verify that the basic tests
-# are working as they should
-# Once you have run it you can inspect the log file via
-#
-# $ less before_i_commit.log
-# To clean up everything that is left by the running of this tool, do as
-# following:
-#
-# rm *.yamloo; rm before_i_commit.log
-#
-
-rm before_i_commit.log
-
-find . -type f -name "*.py[co]" -delete
-
-./bin/ooniprobe -i before_i_commit.testdeck
-
-echo "Below you should not see anything"
-echo "---------------------------------"
-grep "Error: " before_i_commit.log
-echo "---------------------------------"
-echo "If you do, it means something is wrong."
-echo "Read through the log file and fix it."
-echo "If you are having some problems fixing some things that have to do with"
-echo "the core of OONI, let's first discuss it on IRC, or open a ticket"
-read
-cat *yamloo | less
-rm -f *yamloo
diff --git a/before_i_commit.testdeck b/before_i_commit.testdeck
deleted file mode 100644
index 7877d74..0000000
--- a/before_i_commit.testdeck
+++ /dev/null
@@ -1,33 +0,0 @@
-- options:
- collector: null
- help: 0
- logfile: before_i_commit.log
- pcapfile: null
- reportfile: captive_portal_test.yamloo
- subargs: []
- test: nettests/manipulation/captiveportal.py
-- options:
- collector: null
- help: 0
- logfile: before_i_commit.log
- pcapfile: null
- reportfile: dns_tamper_test.yamloo
- subargs: [-T, example_inputs/dns_tamper_test_resolvers.txt, -f, example_inputs/dns_tamper_file.txt]
- test: nettests/blocking/dnstamper.py
-- options:
- collector: null
- help: 0
- logfile: before_i_commit.log
- pcapfile: null
- reportfile: http_host.yamloo
- subargs: [-b, 'http://ooni.nu/test', -f, example_inputs/http_host_file.txt]
- test: nettests/manipulation/http_host.py
-# XXX this is disabled because it requires oonib to be running
-#- options:
-# collector: null
-# help: 0
-# logfile: null
-# pcapfile: null
-# reportfile: null
-# subargs: [-h, example_inputs/test_header_field_manipulation.txt]
-# test: nettests/core/http_header_field_manipulation.py
diff --git a/bin/INSTRUCTIONS b/bin/INSTRUCTIONS
deleted file mode 100644
index a624ae7..0000000
--- a/bin/INSTRUCTIONS
+++ /dev/null
@@ -1,15 +0,0 @@
-That Makefile is for testing that the tests work. However, not
-all of the tests have a line in the Makefile yet. If you would
-like to add things to it, feel free to. Or even to write OONI
-unittests for realsies.
-
-The 'old_ooniprobe' script used to call the old script for
-starting up tests, ooni/ooniprobe.py, but I have just removed
-ooniprobe.py and kept the remaining useful pieces in
-ooni/utils/legacy.py where the rest of the new API for running
-old code lives. I think they are happier if we keep them together.
-
-tl;dr: the thing you actually want to use is the script called
- 'ooniprobe'.
-
- --isis
diff --git a/bin/Makefile b/bin/Makefile
deleted file mode 100644
index ee8976f..0000000
--- a/bin/Makefile
+++ /dev/null
@@ -1,54 +0,0 @@
-#########################################################
-# This janky Makefile is Isis' halfassed solution for #
-# a "one-click" zomg-are-we-there-yet oracle. #
-# #
-# Obviously, you just do '$ make all'. <(A)3 #
-#########################################################
-
-#
-# you'll also need a file of things to test things on.
-# for all of the things.
-#
-
-## all tests, without debugging
-## ----------------------------
-all: echot simplet captivet dnst httphostt squidt
-
-## all tests, with debugging
-## -------------------------
-## note: you will be doing "n-Ret-n-Ret-n-Ret-s-Ret-n-Ret..."
-## for a *REALLY* long time
-all_debug: echod simpled captived dnsd httphostd squidd
-
-simplet:
- ../bin/ooniprobe ../nettests/simpletest.py -a ../lists/short_hostname_list.txt
-
-simpled:
- python -m pdb ../bin/ooniprobe ../nettests/simpletest.py -a ../lists/short_hostname_list.txt
-
-echot:
- ../bin/ooniprobe ../nettests/core/echo.py -f ../lists/short_hostname_list.txt
-
-echod:
- python -m pdb ../bin/ooniprobe -B ../nettests/core/echo.py -f ../lists/short_hostname_list.txt
-
-captivet:
- ../bin/ooniprobe ../nettests/core/captiveportal.py -f ../lists/short_hostname_list.txt
-
-captived:
- python -m pdb ../bin/ooniprobe --spew ../nettests/core/captiveportal.py -f ../lists/short_hostname_list.txt
-
-dnst:
- ../bin/ooniprobe ../nettests/core/dnstamper.py -f ../lists/short_hostname_list.txt
-
-dnsd:
- python -m pdb ../bin/ooniprobe --spew ../nettests/core/dnstamper.py -f ../lists/short_hostname_list.txt
-
-squidt:
- ../bin/ooniprobe ../nettests/core/squid.py -f ../lists/short_hostname_list.txt
-
-squidd:
- python -m pdb ../bin/ooniprobe --spew ../nettests/core/squid.py -f ../lists/short_hostname_list.txt
-
-#mvreports:
-# for $report in `find ../ -name "report*"`; do mv $report test-results #; done
diff --git a/bin/canary b/bin/canary
deleted file mode 100755
index 1473ae4..0000000
--- a/bin/canary
+++ /dev/null
@@ -1,27 +0,0 @@
-#!/usr/bin/env python
-# -*- encoding: utf-8 -*-
-###############################################################################
-#
-# canary
-# -----------------
-# Test Tor bridge reachability.
-#
-# :authors: Isis Lovecruft
-# :copyright: 2012 Isis Lovecruft, The Tor Project
-# :licence: see included LICENSE file
-# :version: 0.2.0-beta
-###############################################################################
-
-import os, sys
-import copy_reg
-
-# Hack to set the proper sys.path. Overcomes the export PYTHONPATH pain.
-sys.path[:] = map(os.path.abspath, sys.path)
-sys.path.insert(0, os.path.abspath(os.getcwd()))
-
-# This is a hack to overcome a bug in python
-from ooni.utils.hacks import patched_reduce_ex
-copy_reg._reduce_ex = patched_reduce_ex
-
-from ooni.bridget import spelunker
-spelunker.descend()
diff --git a/bin/old_ooniprobe b/bin/old_ooniprobe
deleted file mode 100755
index e234587..0000000
--- a/bin/old_ooniprobe
+++ /dev/null
@@ -1,80 +0,0 @@
-#!/bin/bash
-##############################################################################
-#
-# ooniprobe
-# -------------------
-# Setup environment variables and launch /ooni/ooniprobe.py without
-# installing.
-#
-#-----------------------------------------------------------------------------
-# :authors: Isis Lovecruft, Arturo Filasto
-# :license: see included LICENSE file
-# :version: 0.0.1-pre-alpha
-#
-##############################################################################
-
-OONI_EXEC="ooniprobe.py"
-#OONI_EXEC="oonicli.py"
-OONI_PROCESS_NAME=$(echo $OONI_EXEC | sed s/\.py//)
-
-OONI_SCRIPT_IS_HERE=$(dirname ${BASH_SOURCE[0]})
-OONI_BIN="$(cd $OONI_SCRIPT_IS_HERE && pwd)"
-OONI_REPO="$(cd $OONI_BIN"/.." && pwd)"
-OONI_DIR="$OONI_REPO/ooni"
-
-OONI_PATH_ALREADY_SET=false
-
-function usage() {
- echo "$0 - A simple wrapper around ooniprobe and oonicli to set"
- echo "up environment variables, so that it can be run without installation."
- echo;
- echo "Usage: $0 [oonitest || file || script] [options]"
- echo "All options and parameters are passed directly to ooniprobe, do"
- echo "ooniprobe.py --help to see more."
- echo;
-}
-
-function check_pythonpath_for_ooni() {
- pythonpaths="$(echo $PYTHONPATH | cut -d ':' -f '1-' --output-delimiter=' ')"
- for dir in $pythonpaths; do
- if [[ "x$dir" == "x$OONI_REPO" ]]; then
- export OONI_PATH_ALREADY_SET=true
- else
- continue
- fi
- done
-}
-
-function add_ooni_to_pythonpath() {
- if test ! $OONI_PATH_ALREADY_SET ; then
- echo "Appending $OONI_REPO to PYTHONPATH..."
- export PYTHONPATH=$PYTHONPATH:$OONI_REPO
- fi
-}
-
-function add_exec_dir_to_stack() {
- cwd_ending=$(echo $(pwd) | awk -F/ '{print $NF}')
- if [[ "x$cwd_ending" == "xooni" ]]; then
- pushd $(pwd) 2&>/dev/null ## $(dirs -l -p -1)
- else
- pushd $OONI_DIR 2&>/dev/null
- fi
- export OONI_RUN_PATH="$(popd)/$OONI_EXEC"
-}
-
-function run_ooni_in_background() {
- ## :param $1:
- ## The full path to the script to run, i.e. $OONI_RUN_PATH.
- coproc $1
-}
-
-if [[ "x$#" == "x0" ]]; then
- usage
-else
- check_pythonpath_for_ooni
- add_ooni_to_pythonpath
- add_exec_dir_to_stack
- OONI_START_CMD="python "$OONI_DIR"/"$OONI_EXEC" $@"
- #run_ooni_in_background $OONI_START_CMD
- $($OONI_START_CMD)
-fi
diff --git a/bin/oonib b/bin/oonib
index f09f790..79885ba 100755
--- a/bin/oonib
+++ b/bin/oonib
@@ -1,8 +1,4 @@
#!/usr/bin/env python
-# -*- encoding: utf-8 -*-
-#
-# :authors: Arturo Filastò
-# :licence: see LICENSE
import sys
import os
diff --git a/decks/before_i_commit.testdeck b/decks/before_i_commit.testdeck
new file mode 100644
index 0000000..7877d74
--- /dev/null
+++ b/decks/before_i_commit.testdeck
@@ -0,0 +1,33 @@
+- options:
+ collector: null
+ help: 0
+ logfile: before_i_commit.log
+ pcapfile: null
+ reportfile: captive_portal_test.yamloo
+ subargs: []
+ test: nettests/manipulation/captiveportal.py
+- options:
+ collector: null
+ help: 0
+ logfile: before_i_commit.log
+ pcapfile: null
+ reportfile: dns_tamper_test.yamloo
+ subargs: [-T, example_inputs/dns_tamper_test_resolvers.txt, -f, example_inputs/dns_tamper_file.txt]
+ test: nettests/blocking/dnstamper.py
+- options:
+ collector: null
+ help: 0
+ logfile: before_i_commit.log
+ pcapfile: null
+ reportfile: http_host.yamloo
+ subargs: [-b, 'http://ooni.nu/test', -f, example_inputs/http_host_file.txt]
+ test: nettests/manipulation/http_host.py
+# XXX this is disabled because it requires oonib to be running
+#- options:
+# collector: null
+# help: 0
+# logfile: null
+# pcapfile: null
+# reportfile: null
+# subargs: [-h, example_inputs/test_header_field_manipulation.txt]
+# test: nettests/core/http_header_field_manipulation.py
diff --git a/ooni/__init__.py b/ooni/__init__.py
index 36afc9a..7b5eb5f 100644
--- a/ooni/__init__.py
+++ b/ooni/__init__.py
@@ -3,7 +3,6 @@
from . import config
from . import inputunit
from . import kit
-from . import lib
from . import nettest
from . import oonicli
from . import reporter
@@ -12,7 +11,7 @@ from . import templates
from . import utils
__author__ = "Arturo Filastò"
-__version__ = "0.0.7.1-alpha"
+__version__ = "0.0.8"
__all__ = ['config', 'inputunit', 'kit',
'lib', 'nettest', 'oonicli', 'reporter',
diff --git a/ooni/config.py b/ooni/config.py
index ce3fdcc..ec9865d 100644
--- a/ooni/config.py
+++ b/ooni/config.py
@@ -1,8 +1,3 @@
-# -*- encoding: utf-8 -*-
-#
-# :authors: Arturo Filastò
-# :licence: see LICENSE
-
import os
import yaml
diff --git a/ooni/kit/domclass.py b/ooni/kit/domclass.py
index f287f98..ea68808 100644
--- a/ooni/kit/domclass.py
+++ b/ooni/kit/domclass.py
@@ -1,33 +1,26 @@
-#!/usr/bin/env python
-#-*- encoding: utf-8 -*-
-#
-# domclass
-# ********
-#
-# :copyright: (c) 2012 by Arturo Filastò
-# :license: see LICENSE for more details.
-#
-# how this works
-# --------------
-#
-# This classifier uses the DOM structure of a website to determine how similar
-# the two sites are.
-# The procedure we use is the following:
-# * First we parse all the DOM tree of the web page and we build a list of
-# TAG parent child relationships (ex. <html><a><b></b></a><c></c></html> =>
-# (html, a), (a, b), (html, c)).
-#
-# * We then use this information to build a matrix (M) where m[i][j] = P(of
-# transitioning from tag[i] to tag[j]). If tag[i] does not exists P() = 0.
-# Note: M is a square matrix that is number_of_tags wide.
-#
-# * We then calculate the eigenvectors (v_i) and eigenvalues (e) of M.
-#
-# * The corelation between page A and B is given via this formula:
-# correlation = dot_product(e_A, e_B), where e_A and e_B are
-# resepectively the eigenvalues for the probability matrix A and the
-# probability matrix B.
-#
+"""
+how this works
+--------------
+
+This classifier uses the DOM structure of a website to determine how similar
+the two sites are.
+The procedure we use is the following:
+ * First we parse all the DOM tree of the web page and we build a list of
+ TAG parent child relationships (ex. <html><a><b></b></a><c></c></html> =>
+ (html, a), (a, b), (html, c)).
+
+ * We then use this information to build a matrix (M) where m[i][j] = P(of
+ transitioning from tag[i] to tag[j]). If tag[i] does not exists P() = 0.
+ Note: M is a square matrix that is number_of_tags wide.
+
+ * We then calculate the eigenvectors (v_i) and eigenvalues (e) of M.
+
+ * The corelation between page A and B is given via this formula:
+ correlation = dot_product(e_A, e_B), where e_A and e_B are
+ resepectively the eigenvalues for the probability matrix A and the
+ probability matrix B.
+"""
+
import yaml
import numpy
import time
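The docstring moved into domclass.py above describes the classifier's procedure: build a tag-transition probability matrix from parent/child DOM relationships, take its eigenvalues, and correlate two pages via the dot product of their eigenvalue vectors. A minimal sketch of that idea (hypothetical helper names and numpy, not the actual ooni/kit/domclass.py code) might look like:

```python
import numpy as np

def transition_matrix(edges, tags):
    """Build M where m[i][j] = P(transitioning from tag[i] to tag[j]).

    `edges` is the list of (parent, child) tag pairs extracted from the
    DOM, e.g. <html><a><b></b></a><c></c></html> gives
    [("html", "a"), ("a", "b"), ("html", "c")].
    """
    idx = {t: i for i, t in enumerate(tags)}
    m = np.zeros((len(tags), len(tags)))
    for parent, child in edges:
        m[idx[parent]][idx[child]] += 1
    # Normalize each row into probabilities; leaf tags (all-zero rows)
    # keep P() = 0 everywhere, as the docstring specifies.
    row_sums = m.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1
    return m / row_sums

def page_eigenvalues(edges, tags):
    """Eigenvalues of the page's transition matrix."""
    return np.linalg.eigvals(transition_matrix(edges, tags))

def correlation(e_a, e_b):
    """Similarity of two pages: dot product of their eigenvalue vectors."""
    return np.dot(e_a, e_b)
```

This is only a sketch of the documented procedure; the real module also handles tag-set alignment between the two pages being compared.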
diff --git a/ooni/lib/__init__.py b/ooni/lib/__init__.py
deleted file mode 100644
index 611d50c..0000000
--- a/ooni/lib/__init__.py
+++ /dev/null
@@ -1,5 +0,0 @@
-from sys import path as syspath
-from os import path as ospath
-
-pwd = ospath.dirname(__file__)
-syspath.append(pwd)
diff --git a/ooni/lib/secdev.org.pem b/ooni/lib/secdev.org.pem
deleted file mode 100644
index 6fdbb97..0000000
--- a/ooni/lib/secdev.org.pem
+++ /dev/null
@@ -1,20 +0,0 @@
------BEGIN CERTIFICATE-----
-MIIDVDCCAjwCCQD6iQnFvlSvNjANBgkqhkiG9w0BAQUFADBsMQswCQYDVQQGEwJG
-UjEMMAoGA1UECBMDSWRGMQ4wDAYDVQQHEwVQYXJpczETMBEGA1UEChMKc2VjZGV2
-Lm9yZzETMBEGA1UECxMKc2VjZGV2Lm9yZzEVMBMGA1UEAxQMKi5zZWNkZXYub3Jn
-MB4XDTA4MDUxOTIxMzAxNVoXDTE4MDUyMDIxMzAxNVowbDELMAkGA1UEBhMCRlIx
-DDAKBgNVBAgTA0lkRjEOMAwGA1UEBxMFUGFyaXMxEzARBgNVBAoTCnNlY2Rldi5v
-cmcxEzARBgNVBAsTCnNlY2Rldi5vcmcxFTATBgNVBAMUDCouc2VjZGV2Lm9yZzCC
-ASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAMijlApVIOF86nIsPvIfKjkQ
-qpw8DWtugsUQkspXGaJM5qM3CvoxQ3VQejIZiLIS/B57WtpwfhD63h+dswUZy1wI
-Z4injE/uF4R7ylNammROjS1ycQbFM1fWX/1nzKFrxWpX3lU2YjwB9qIAlE3u/SyH
-U10oq9ZJ5KlyOrjTPe3eb0KBwF5W0AJxcTiGQJhADZAaAivZRT880GYJAo3UaL/G
-JaBYIYSFxvGnqmUVM9kbnGLFQEQahBpgmtCzMRVFXp/AccxCtXKY+LORtSGNKaB6
-ODDidG8jyb3S9GmjtgxwyWHvY/9YRW2BkB3AufRsOAWUN7jWDtRLKy6FCLbxE/sC
-AwEAATANBgkqhkiG9w0BAQUFAAOCAQEAex0loqATvXxZEagphrLASUGpKIlTf3a1
-1adokzrKvbuDXcxNqUKEPxI09TjnT/zySLfVc18t+yy2baSstPFC9RrLPeu8rfzL
-k+NTDmM3OfW60MCeEnyNxPvW0wCIrFLfH3t5XPT3J2DtYLmecg8Lf/sQOEWPyMVc
-uCaFIYsAypGYi0wwG5VDQHEsKxkHC2nBRwGJdx9w70yy14H/JOAZl5yQpLHEc4Db
-RUfNTIV2myXOIET2VbCN2Yc8Gegsclc506XVOQypp5Ndvy4GW2yRRE2ps1c1xH6P
-OHENUp0JPyLeyibmoOCUfrlrq2KoSashFZmPCGYFFJvcKAYI45GcaQ==
------END CERTIFICATE-----
diff --git a/ooni/nettest.py b/ooni/nettest.py
index 8374db1..22a35c9 100644
--- a/ooni/nettest.py
+++ b/ooni/nettest.py
@@ -1,16 +1,7 @@
-# -*- encoding: utf-8 -*-
-#
-# nettest.py
-# ----------
-# In here is the NetTest API definition. This is how people
-# interested in writing ooniprobe tests will be specifying them
-#
-# :license: see included LICENSE file
-
-import sys
-import os
import itertools
import traceback
+import sys
+import os
from twisted.trial import unittest, itrial, util
from twisted.internet import defer, utils
diff --git a/ooni/nodes.py b/ooni/nodes.py
deleted file mode 100644
index 070ffe7..0000000
--- a/ooni/nodes.py
+++ /dev/null
@@ -1,174 +0,0 @@
-#-*- coding: utf-8 -*-
-#
-# nodes.py
-# --------
-# here is code for handling the interaction with remote
-# services that will run ooniprobe tests.
-# XXX most of the code in here is broken or not tested and
-# probably should be trashed
-#
-# :authors: Arturo Filastò, Isis Lovecruft
-# :license: see included LICENSE file
-
-import os
-from binascii import hexlify
-
-try:
- import paramiko
-except:
- print "Error: module paramiko is not installed."
-from pprint import pprint
-import sys
-import socks
-import xmlrpclib
-
-class Node(object):
- def __init__(self, address, port):
- self.address = address
- self.port = port
-
-class LocalNode(object):
- def __init__(self):
- pass
-
-"""
-[]: node = NetworkNode("192.168.0.112", 5555, "SOCKS5")
-[]: node_socket = node.wrap_socket()
-"""
-class NetworkNode(Node):
- def __init__(self, address, port, node_type="SOCKS5", auth_creds=None):
- self.node = Node(address,port)
-
- # XXX support for multiple types
- # node type (SOCKS proxy, HTTP proxy, GRE tunnel, ...)
- self.node_type = node_type
- # type-specific authentication credentials
- self.auth_creds = auth_creds
-
- def _get_socksipy_socket(self, proxy_type, auth_creds):
- import socks
- s = socks.socksocket()
- # auth_creds[0] -> username
- # auth_creds[1] -> password
- s.setproxy(proxy_type, self.node.address, self.node.port,
- self.auth_creds[0], self.auth_creds[1])
- return s
-
- def _get_socket_wrapper(self):
- if (self.node_type.startswith("SOCKS")): # SOCKS proxies
- if (self.node_type != "SOCKS5"):
- proxy_type = socks.PROXY_TYPE_SOCKS5
- elif (self.node_type != "SOCKS4"):
- proxy_type = socks.PROXY_TYPE_SOCKS4
- else:
- print "We don't know this proxy type."
- sys.exit(1)
-
- return self._get_socksipy_socket(proxy_type)
- elif (self.node_type == "HTTP"): # HTTP proxies
- return self._get_socksipy_socket(PROXY_TYPE_HTTP)
- else: # Unknown proxies
- print "We don't know this proxy type."
- sys.exit(1)
-
- def wrap_socket(self):
- return self._get_socket_wrapper()
-
-class CodeExecNode(Node):
- def __init__(self, address, port, node_type, auth_creds):
- self.node = Node(address,port)
-
- # node type (SSH proxy, etc.)
- self.node_type = node_type
- # type-specific authentication credentials
- self.auth_creds = auth_creds
-
- def add_unit(self):
- pass
-
- def get_status(self):
- pass
-
-class PlanetLab(CodeExecNode):
- def __init__(self, address, auth_creds, ooni):
- self.auth_creds = auth_creds
-
- self.config = ooni.utils.config
- self.logger = ooni.logger
- self.name = "PlanetLab"
-
- def _api_auth(self):
- api_server = xmlrpclib.ServerProxy('https://www.planet-lab.org/PLCAPI/')
- auth = {}
- ## should be changed to separate node.conf file
- auth['Username'] = self.config.main.pl_username
- auth['AuthString'] = self.config.main.pl_password
- auth['AuthMethod'] = "password"
- authorized = api_server.AuthCheck(auth)
-
- if authorized:
- print 'We are authorized!'
- return auth
- else:
- print 'Authorization failed. Please check your settings for pl_username and pl_password in the ooni-probe.conf file.'
-
- def _search_for_nodes(self, node_filter=None):
- api_server = xmlrpclib.ServerProxy('https://www.planet-lab.org/PLCAPI/', allow_none=True)
- node_filter = {'hostname': '*.cert.org.cn'}
- return_fields = ['hostname', 'site_id']
- all_nodes = api_server.GetNodes(self.api_auth(), node_filter, boot_state_filter)
- pprint(all_nodes)
- return all_nodes
-
- def _add_nodes_to_slice(self):
- api_server = xmlrpclib.ServerProxy('https://www.planet-lab.org/PLCAPI/', allow_none=True)
- all_nodes = self.search_for_nodes()
- for node in all_nodes:
- api_server.AddNode(self.api_auth(), node['site_id'], all_nodes)
- print 'Adding nodes %s' % node['hostname']
-
- def _auth_login(slicename, machinename):
- """Attempt to authenticate to the given PL node, slicename and
- machinename, using any of the private keys in ~/.ssh/ """
-
- agent = paramiko.Agent()
- agent_keys = agent.get_keys()
- if len(agent_keys) == 0:
- return
-
- for key in agent_keys:
- print 'Trying ssh-agent key %s' % hexlify(key.get_fingerprint()),
- try:
- paramiko.transport.auth_publickey(machinename, slicename)
- print 'Public key authentication to PlanetLab node %s successful.' % machinename,
- return
- except paramiko.SSHException:
- print 'Public key authentication to PlanetLab node %s failed.' % machinename,
-
- def _get_command():
- pass
-
- def ssh_and_run_(slicename, machinename, command):
- """Attempt to make a standard OpenSSH client to PL node, and run
- commands from a .conf file."""
-
- ## needs a way to specify 'ssh -l <slicename> <machinename>'
- ## with public key authentication.
-
- command = PlanetLab.get_command()
-
- client = paramiko.SSHClient()
- client.load_system_host_keys()
- client.connect(machinename)
-
- stdin, stdout, stderr = client.exec_command(command)
-
- def send_files_to_node(directory, files):
- """Attempt to rsync a tree to the PL node."""
- pass
-
- def add_unit():
- pass
-
- def get_status():
- pass
diff --git a/ooni/oonicli.py b/ooni/oonicli.py
index f706ee3..c4da345 100644
--- a/ooni/oonicli.py
+++ b/ooni/oonicli.py
@@ -1,13 +1,4 @@
-# -*- coding: UTF-8
-#
-# oonicli
-# -------
-# In here we take care of running ooniprobe from the command
-# line interface
-#
-# :authors: Arturo Filastò, Isis Lovecruft
-# :license: see included LICENSE file
-
+#-*- coding: utf-8 -*-
import sys
import os
diff --git a/ooni/otime.py b/ooni/otime.py
index 0af4ec8..84758eb 100644
--- a/ooni/otime.py
+++ b/ooni/otime.py
@@ -1,6 +1,3 @@
-"""
-Here is the location for all time and date related utility functions.
-"""
import time
from datetime import datetime
diff --git a/ooni/reporter.py b/ooni/reporter.py
index e33d395..6ab32e8 100644
--- a/ooni/reporter.py
+++ b/ooni/reporter.py
@@ -1,12 +1,3 @@
-#-*- coding: utf-8 -*-
-#
-# reporter.py
-# -----------
-# In here goes the logic for the creation of ooniprobe reports.
-#
-# :authors: Arturo Filastò, Isis Lovecruft
-# :license: see included LICENSE file
-
import traceback
import itertools
import logging
diff --git a/ooni/runner.py b/ooni/runner.py
index fa66f8c..2936a99 100644
--- a/ooni/runner.py
+++ b/ooni/runner.py
@@ -1,12 +1,3 @@
-#-*- coding: utf-8 -*-
-#
-# runner.py
-# ---------
-# Handles running ooni.nettests as well as
-# ooni.plugoo.tests.OONITests.
-#
-# :license: see included LICENSE file
-
import os
import sys
import time
diff --git a/ooni/templates/httpt.py b/ooni/templates/httpt.py
index eb574be..b350ae8 100644
--- a/ooni/templates/httpt.py
+++ b/ooni/templates/httpt.py
@@ -1,7 +1,3 @@
-# -*- encoding: utf-8 -*-
-#
-# :authors: Arturo Filastò
-# :licence: see LICENSE
import copy
import random
import struct
diff --git a/ooni/templates/scapyt.py b/ooni/templates/scapyt.py
index 8e6ce9c..6310000 100644
--- a/ooni/templates/scapyt.py
+++ b/ooni/templates/scapyt.py
@@ -1,8 +1,3 @@
-# -*- encoding: utf-8 -*-
-#
-# :authors: Arturo Filastò
-# :licence: see LICENSE
-
import random
from zope.interface import implements
from twisted.python import usage
diff --git a/ooni/utils/__init__.py b/ooni/utils/__init__.py
index 5947519..abcf370 100644
--- a/ooni/utils/__init__.py
+++ b/ooni/utils/__init__.py
@@ -1,7 +1,3 @@
-"""
-
-"""
-
import imp
import os
import logging
diff --git a/ooni/utils/geodata.py b/ooni/utils/geodata.py
index 6acefd8..5cad748 100644
--- a/ooni/utils/geodata.py
+++ b/ooni/utils/geodata.py
@@ -1,12 +1,3 @@
-# -*- encoding: utf-8 -*-
-#
-# geodata.py
-# **********
-# In here go functions related to the understanding of
-# geographical information of the probe
-#
-# :licence: see LICENSE
-
import re
import os
@@ -21,27 +12,6 @@ try:
except ImportError:
log.err("Unable to import pygeoip. We will not be able to run geo IP related measurements")
-@defer.inlineCallbacks
-def myIP():
- target_site = 'https://check.torproject.org/'
- regexp = "Your IP address appears to be: <b>(.+?)<\/b>"
- myAgent = Agent(reactor)
-
- result = yield myAgent.request('GET', target_site)
-
- finished = defer.Deferred()
- result.deliverBody(net.BodyReceiver(finished))
-
- body = yield finished
-
- match = re.search(regexp, body)
- try:
- myip = match.group(1)
- except:
- myip = "unknown"
-
- defer.returnValue(myip)
-
class GeoIPDataFilesNotFound(Exception):
pass
diff --git a/ooni/utils/hacks.py b/ooni/utils/hacks.py
index 18e9102..64b5a53 100644
--- a/ooni/utils/hacks.py
+++ b/ooni/utils/hacks.py
@@ -1,12 +1,5 @@
-# -*- encoding: utf-8 -*-
-#
-# hacks.py
-# ********
# When some software has issues and we need to fix it in a
# hackish way, we put it in here. This one day will be empty.
-#
-# :licence: see LICENSE
-
import copy_reg
diff --git a/ooni/utils/log.py b/ooni/utils/log.py
index 3e24804..9ab8880 100644
--- a/ooni/utils/log.py
+++ b/ooni/utils/log.py
@@ -1,8 +1,3 @@
-# -*- encoding: utf-8 -*-
-#
-# :authors: Arturo Filastò
-# :licence: see LICENSE
-
import sys
import os
import traceback
diff --git a/ooni/utils/net.py b/ooni/utils/net.py
index 12d1939..824d720 100644
--- a/ooni/utils/net.py
+++ b/ooni/utils/net.py
@@ -1,14 +1,3 @@
-# -*- encoding: utf-8 -*-
-#
-# net.py
-# --------
-# OONI utilities for network infrastructure and hardware.
-#
-# :authors: Isis Lovecruft, Arturo Filasto
-# :version: 0.0.1-pre-alpha
-# :license: (c) 2012 Isis Lovecruft, Arturo Filasto
-# see attached LICENCE file
-
import sys
import socket
from random import randint
@@ -23,7 +12,6 @@ from ooni.utils import log, txscapy
#if sys.platform.system() == 'Windows':
# import _winreg as winreg
-
userAgents = [
("Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.8.1.6) Gecko/20070725 Firefox/2.0.0.6", "Firefox 2.0, Windows XP"),
("Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)", "Internet Explorer 7, Windows Vista"),
@@ -55,7 +43,6 @@ PLATFORMS = {'LINUX': sys.platform.startswith("linux"),
'SOLARIS': sys.platform.startswith("sunos"),
'WINDOWS': sys.platform.startswith("win32")}
-
class StringProducer(object):
implements(IBodyProducer)
diff --git a/ooni/utils/txscapy.py b/ooni/utils/txscapy.py
index 210f21b..c9cc87a 100644
--- a/ooni/utils/txscapy.py
+++ b/ooni/utils/txscapy.py
@@ -1,11 +1,3 @@
-# -*- coding:utf8 -*-
-# txscapy
-# *******
-# Here shall go functions related to using scapy with twisted.
-#
-# This software has been written to be part of OONI, the Open Observatory of
-# Network Interference. More information on that here: http://ooni.nu/
-
import struct
import socket
import os
diff --git a/scripts/before_i_commit.sh b/scripts/before_i_commit.sh
new file mode 100755
index 0000000..8b8180f
--- /dev/null
+++ b/scripts/before_i_commit.sh
@@ -0,0 +1,29 @@
+#!/bin/sh
+# This script should be run before you commit to verify that the basic tests
+# are working as they should
+# Once you have run it you can inspect the log file via
+#
+# $ less before_i_commit.log
+# To clean up everything that is left by the running of this tool, do as
+# following:
+#
+# rm *.yamloo; rm before_i_commit.log
+#
+
+rm before_i_commit.log
+
+find . -type f -name "*.py[co]" -delete
+
+./bin/ooniprobe -i before_i_commit.testdeck
+
+echo "Below you should not see anything"
+echo "---------------------------------"
+grep "Error: " before_i_commit.log
+echo "---------------------------------"
+echo "If you do, it means something is wrong."
+echo "Read through the log file and fix it."
+echo "If you are having some problems fixing some things that have to do with"
+echo "the core of OONI, let's first discuss it on IRC, or open a ticket"
+read
+cat *yamloo | less
+rm -f *yamloo
diff --git a/scripts/submit-patch b/scripts/submit-patch
new file mode 100755
index 0000000..4768deb
--- /dev/null
+++ b/scripts/submit-patch
@@ -0,0 +1,100 @@
+#!/bin/bash
+##############################################################################
+#
+# submit-patch
+# -------------------
+# Submit a patch to the OONI developers!
+#
+# @authors: Isis Lovecruft, 0x2cdb8b35 <isis@torproject.org>
+# @license: see included LICENSE file
+# @version: 0.0.1
+#
+# To apply these patches:
+#
+# $ git fetch <project> master:test-apply
+# $ git checkout test-apply
+# $ git reset --hard
+# $ git am a.patch
+#
+# Note:
+# Dear other OONI developers and contributors,
+# if you would like patches emailed to you as well, then add your name and
+# email to this script, commit the changes, and submit it as a patch. :)
+# <(A)3
+# isis agora lovecruft
+#
+
+DEVELOPERS="<isis@torproject.org>, "
+
+HEADERS="X-Git-Format-Patch: ooni "
+
+function usage ()
+{
+ echo;
+ echo -e "\033[40m\033[0;32m OPEN OBSERVATORY of NETWORK INTERFERENCE \033[0m"
+ echo -e "\033[40m\033[0;32m ---------------------------------------- \033[0m"
+ echo -e ""
+ echo -e "\033[40m\033[0;32m This script will collect all committed changes in your current \033[0m"
+ echo -e "\033[40m\033[0;32m branch which are not in the upstream branch, format a patch or \033[0m"
+ echo -e "\033[40m\033[0;32m a series of patches from them, and, finally, email them to the \033[0m"
+ echo -e "\033[40m\033[0;32m OONI developers. \033[0m"
+ echo -e "\033[40m\033[0;32m Thanks for the patch! \033[0m"
+}
+
+function pullfirst ()
+{
+ echo;
+ read -ep" Should we pull in changes from master before formatting patches? (Y/n) " -s choice
+ case "$choice" in
+ "n"|"N"|"no"|"No"|"NO"|"non"|"nein")
+ PULL=false
+ ;;
+ *)
+ PULL=true
+ ;;
+ esac
+ if $PULL; then
+ if test -n "$UPSTREAMPULL" ; then
+ echo;
+ echo -e "\033[40m\033[0;32m Pulling from upstream... \033[0m"
+ git pull $UPSTREAM
+ fi
+ fi
+}
+
+
+usage
+echo -e ""
+read -ep" Should we CC the generated patch emails to tor-dev@lists.torproject.org? (Y/n) " cctordev
+# An empty answer, or anything other than an explicit "n", means yes: CC the list.
+case "$cctordev" in
+ "n"|"N"|"no"|"No"|"NO") CC="" ;;
+ *) CC="tor-dev@lists.torproject.org, " ;;
+esac
+
+#echo;
+#echo -e
+#read -ep" Which branch/revision/commit should we include up to? (Return for 'HEAD'): " upstream
+#if test -n "$upstream" ; then
+# UPSTREAM=$upstream
+# UPSTREAMPULL=$upstream" master"
+#else
+# UPSTREAM="origin"
+# UPSTREAMPULL="origin master"
+#fi
+#pullfirst
+echo;
+echo -e "\033[40m\033[0;32m THIS SCRIPT DOES NOT SEND THE PATCH FILES. \033[0m"
+echo -e "\033[40m\033[0;32m You'll have to handle that bit on your own. \033[0m"
+echo;
+if test ! -d "patches" ; then
+ echo; echo -e "\033[40m\033[0;32m Creating './patches' directory... \033[0m"
+ mkdir patches
+else
+ echo; echo -e "\033[40m\033[0;32m Using './patches' directory... \033[0m"
+fi
+git format-patch --full-index -o "./patches" --stat -l10 --ignore-submodules \
+ --binary --cover-letter --numbered --ignore-if-in-upstream \
+ --suffix=".patch" --to="$DEVELOPERS" --cc="$CC" master
+#NEW=`git patch-id < patches/new.patch`
+#echo "Patch id: $NEW"
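The `pullfirst` function above parses its Y/n answer with a `case` statement. The same pattern can be exercised non-interactively by driving it from a variable instead of `read` (a sketch for illustration; `choice` is preset rather than prompted for):

```shell
# The case-based Y/n parsing from pullfirst, with the answer preset
# so the logic runs without a terminal. Anything other than an
# explicit "no" falls through to the default (yes).
choice="n"
case "$choice" in
    "n"|"N"|"no"|"No"|"NO"|"non"|"nein")
        PULL=false
        ;;
    *)
        PULL=true
        ;;
esac
echo "PULL=$PULL"
```

With `choice=""` the default branch fires and `PULL` becomes true, matching the capitalized default in the "(Y/n)" prompt.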
diff --git a/submit-patch b/submit-patch
deleted file mode 100755
index 4768deb..0000000
--- a/submit-patch
+++ /dev/null
@@ -1,100 +0,0 @@
-#!/bin/bash
-##############################################################################
-#
-# submit-patch
-# -------------------
-# Submit a patch to the OONI developers!
-#
-# @authors: Isis Lovecruft, 0x2cdb8b35 <isis@torproject.org>
-# @license: see included LICENSE file
-# @version: 0.0.1
-#
-# To apply these patches:
-#
-# $ git fetch <project> master:test-apply
-# $ git checkout test-apply
-# $ git reset --hard
-# $ git am a.patch
-#
-# Note:
-# Dear other OONI developers and contributors,
-# if you would like patches emailed to you as well, then add your name and
-# email to this script, commit the changes, and submit it as a patch. :)
-# <(A)3
-# isis agora lovecruft
-#
-
-DEVELOPERS="<isis@torproject.org>, "
-
-HEADERS="X-Git-Format-Patch: ooni "
-
-function usage ()
-{
- echo;
- echo -e "\033[40m\033[0;32m OPEN OBSERVATORY of NETWORK INTERFERENCE \033[0m"
- echo -e "\033[40m\033[0;32m ---------------------------------------- \033[0m"
- echo -e ""
- echo -e "\033[40m\033[0;32m This script will collect all committed changes in your current \033[0m"
- echo -e "\033[40m\033[0;32m branch which are not in the upstream branch, format a patch or \033[0m"
- echo -e "\033[40m\033[0;32m a series of patches from them, and, finally, email them to the \033[0m"
- echo -e "\033[40m\033[0;32m OONI developers. \033[0m"
- echo -e "\033[40m\033[0;32m Thanks for the patch\! \033[0m"
-}
-
-function pullfirst ()
-{
- echo;
- read -ep" Should we pull in changes from master before formatting patches? (Y/n) " -s choice
- case "$choice" in
- "n"|"N"|"no"|"No"|"NO"|"non"|"nein")
- PULL=false
- ;;
- *)
- PULL=true
- ;;
- esac
- if $PULL; then
- if test -n "$UPSTREAMPULL" ; then
- echo;
- echo -e "\033[40m\033[0;32m Pulling from upstream... \033[0m"
- git pull $UPSTREAM
- fi
- fi
-}
-
-
-usage
-echo -e ""
-read -ep" Should we CC the generated patch emails to tor-dev@lists.torproject.org? (Y/n) " cctordev
-if test -z "$cctordev" ; then
- CC="tor-dev@lists.torproject.org, "
-else
- CC=""
-fi
-
-#echo;
-#echo -e
-#read -ep" Which branch/revision/commit should we include up to? (Return for 'HEAD'): " upstream
-#if test -n "$upstream" ; then
-# UPSTREAM=$upstream
-# UPSTREAMPULL=$upstream" master"
-#else
-# UPSTREAM="origin"
-# UPSTREAMPULL="origin master"
-#fi
-#pullfirst
-echo;
-echo -e "\033[40m\033[0;32m THIS SCRIPT DOES NOT SEND THE PATCH FILES. \033[0m"
-echo -e "\033[40m\033[0;32m You'll have to handle that bit on your own. \033[0m"
-echo;
-if test ! -d "patches" ; then
- echo; echo -e "\033[40m\033[0;32m Creating '/patches' directory... \033[0m"
- mkdir patches
-else
- echo; echo -e "\033[40m\033[0;32m Using '/patches' directory... \033[0m"
-fi
-git format-patch --full-index -o "./patches" --stat -l10 --ignore-submodules \
- --binary --cover-letter --numbered --ignore-if-in-upstream \
- --suffix=".patch" --to="$DEVELOPERS" --cc="$CC" master
-#NEW=`git patch-id < patches/new.patch`
-#echo "Patch id: $NEW"
diff --git a/to-be-ported/TODO b/to-be-ported/TODO
deleted file mode 100644
index 81d834f..0000000
--- a/to-be-ported/TODO
+++ /dev/null
@@ -1,418 +0,0 @@
-This is a list of techniques that should be added as plugins or hooks or yamlooni
-
-Implement Plugooni - our plugin framework
-Implement Yamlooni - our output format
-Implement Proxooni - our proxy spec and program
-
-We should launch our own Tor on a special port (say, 127.0.0.1:9066)
-We should act as a controller with TorCtl to do this, etc
-We should take the Tor consensus file and pass it to plugins such as marco
-
-HTTP Host header comparsion of a vs b
-HTTP Content length header comparision of a vs b
-
-GET request splitting
- "G E T "
- Used in Iran
-
-General Malformed HTTP requests
- Error pages are fingerprintable
-
-traceroute
- icmp/udp/tcp
- each network link is an edge, each hop is a vertex in a network graph
-
-traceroute hop count
- "TTL walking"
-
-Latency measurement
-TCP reset detection
-Forged DNS spoofing detection
-
-DNS oracle query tool
- given DNS server foo - test resolve and look for known block pages
-
-Test HTTP header order - do they get reordered?
-
-Look for these filter fingerprints:
-X-Squid-Error: ERR_SCC_SMARTFILTER_DENIED 0
-X-Squid-Error: ERR_ACCESS_DENIED 0
-X-Cache: MISS from SmartFilter
-
-
-WWW-Authenticate: Basic realm="SmartFilter Control List HTTP Download"
-
-
-Via: 1.1 WEBFILTER.CONSERVESCHOOL.ORG:8080
-
-X-Cache: MISS from webfilter.whiteschneider.com
-X-Cache: MISS from webfilter.whiteschneider.com
-X-Cache: MISS from webfilter.whiteschneider.com
-
-Location: http://192.168.0.244/webfilter/blockpage?nonce=7d2b7e500e99a0fe&tid=3
-
-
-X-Cache: MISS from webfilter.imscs.local
-X-Cache: MISS from webfilter.tjs.at
-
-
-Via: 1.1 webwasher (Webwasher 6.8.7.9396)
-
-Websense:
-HTTP/1.0 301 Moved Permanently -> Location: http://www.websense.com/
-
-Via: HTTP/1.1 localhost.localdomain (Websense-Content_Gateway/7.1.4 [c s f ]), HTTP/1.0 localhost.localdomain (Websense-Content_Gateway/7.1.4 [cMsSf ])
-
-
-BlueCoat:
-
-Via: 1.1 testrating.dc5.es.bluecoat.com
-403 ->
-Set-Cookie: BIGipServerpool_bluecoat=1185677834.20480.0000; expires=Fri, 15-Apr-2011 10:13:21 GMT; path=/
-
-HTTP/1.0 407 Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied. ) -> Via: 1.1 WEBSENSE
-
-HTTP/1.0 302 Found -> Location: http://bluecoat/?cfru=aHR0cDovLzIwMC4yNy4xMjMuMTc4Lw==
-
-HTTP/1.0 403 Forbidden
-Server: squid/3.0.STABLE8
-
-X-Squid-Error: ERR_ACCESS_DENIED 0
-X-Cache: MISS from Bluecoat
-X-Cache-Lookup: NONE from Bluecoat:3128
-Via: 1.0 Bluecoat (squid/3.0.STABLE8)
-
-ISA server:
-HTTP/1.0 403 Forbidden ( ISA Server is configured to block HTTP requests that require authentication. )
-
-
-Unknown:
-X-XSS-Protection: 1; mode=block
-
-Rimon filter:
-
-Rimon: RWC_BLOCK
-HTTP/1.1 Rimon header
-Rimon header is only sent by lighttpd
-http://www.ynetnews.com/articles/0,7340,L-3446129,00.html
-http://btya.org/pdfs/rvienerbrochure.pdf
-
-Korea filtering:
-HTTP/1.0 302 Object Moved -> Location: http://www.willtechnology.co.kr/eng/BlockingMSGew.htm
-Redirects to Korean filter:
-http://www.willtechnology.co.kr/eng/BlockingMSGew.htm
-
-UA filtering:
-HTTP/1.0 307 Temporary Redirect
-https://my.best.net.ua/login/blocked/
-
-netsweeper:
-HTTP/1.0 302 Moved
-Location: http://netsweeper1.gaggle.net:8080/webadmin/deny/index.php?dpid=53&dpruleid…
-
-Set-cookie: RT_SID_netsweeper.com.80=68a6f5c564a9db297e8feb2bff69d73f; path=/
-X-Cache: MISS from netsweeper.irishbroadband.ie
-X-Cache-Lookup: NONE from netsweeper.irishbroadband.ie:80
-Via: 1.0 netsweeper.irishbroadband.ie:80 (squid/2.6.STABLE21)
-
-Nokia:
-Via: 1.1 saec-nokiaq05ca (NetCache NetApp/6.0.7)
-Server: "Nokia"
-
-CensorNet:
-HTTP/1.0 401 Authorization Required
-WWW-Authenticate: Basic realm="CensorNet Administration Area"
-Server: CensorNet/4.0
-
-http://www.itcensor.com/censor
-
-
-Server: ZyWALL Content Filter
-
-Apache/1.3.34 (Unix) filter/1.0
-
-HTTP/1.0 502 infiniteproxyloop
-Via: 1.0 218.102.20.37 (McAfee Web Gateway 7.0.1.5.0.8505)
-
-
-Set-Cookie: McAfee-SCM-URL-Filter-Coach="dD4OzXciEcp8Ihf1dD4ZzHM5FMZ2PSvRTllOnSR4RZkqfkmEIGgb3hZlVJsEaFaXNmNS3mgsdZAxaVOKIGgrrSx4Rb8hekmNKn4g02VZToogf1SbIQcVz3Q8G/U="; Comment="McAfee URL access coaching"; Version=1; Path=/; Max-Age=900; expires=Sat, 18 Dec 2010 06:47:11 GMT;
-
-
-WWW-Authenticate: Basic realm="(Nancy McAfee)"
-
-
-No known fingerprints for:
-NetNanny
-WebChaver
-accountable2you.com
-http://www.shodanhq.com/?q=barracuda
-http://www.shodanhq.com/?q=untangle
-http://www.shodanhq.com/?q=Lightspeed
-
-Server: Smarthouse Lightspeed
-Server: Smarthouse Lightspeed2
-Server: Smarthouse Lightspeed 3
-
-Server: EdgePrism/3.8.1.1
-
-
-X-Cache: MISS from Barracuda-WebFilter.jmpsecurities.com
-Via: 1.0 Barracuda-WebFilter.jmpsecurities.com:8080 (http_scan/4.0.2.6.19)
-
-HTTP/1.0 302 Redirected by M86 Web Filter
-http://www.m86security.com/products/web_security/m86-web-filter.asp
-
-Location: http://10.1.61.37:81/cgi/block.cgi?URL=http://70.182.111.99/&IP=96.9.174.54…
-
-
-Via: 1.1 WEBSENSE
-
-
-Via: 1.1 192.168.1.251 (McAfee Web Gateway 7.1.0.1.0.10541)
-Via: 1.1 McAfeeSA3000.cbcl.lan
-
-
-X-Squid-Error: ERR_CONNECT_FAIL 111
-X-Cache: MISS from CudaWebFilter.poten.com
-
-http://212.50.251.82/ -iran squid
-
-HTTP/1.0 403 Forbidden ( Forefront TMG denied the specified Uniform Resource Locator (URL). )
-Via: 1.1 TMG
-
-
-Server: NetCache appliance (NetApp/6.0.2)
-
-
-Server: EdgePrism/3.8.1.1
-
-
-Server: Mikrotik HttpProxy
-
-
-Via: 1.1 TMG-04, 1.1 TMG-03
-
-
-X-Squid-Error: ERR_INVALID_REQ 0
-X-Cache: MISS from uspa150.trustedproxies.com
-X-Cache-Lookup: NONE from uspa150.trustedproxies.com:80
-
-http://www.shodanhq.com/host/view/93.125.95.177
-
-
-Server: SarfX WEB: Self Automation Redirect & Filter Expernet.Ltd Security Web Server
-http://203.229.245.100/ <- korea block page
-
-
-
-Server: Asroc Intelligent Security Filter 4.1.8
-
-
-
-Server: tinyproxy/1.8.2
-
-http://www.shodanhq.com/host/view/64.104.95.251
-
-
-
-Server: Asroc Intelligent Security Filter 4.1.8
-
-http://www.shodanhq.com/host/view/67.220.92.62
-
-
-Server: SarfX WEB: Self Automation Redirect & Filter Expernet.Ltd Security Web Server
-http://www.shodanhq.com/host/view/203.229.245.100
-Location: http://192.168.3.20/redirect.cgi?Time=05%2FJul%2F2011%3A21%3A29%3A32%20%2B0…
-
-
-http://www.shodanhq.com/?q=%22content+filter%22+-squid+-apache+-ZyWall&page=4
-http://www.shodanhq.com/host/view/72.5.92.51
-http://www.microsoft.com/forefront/threat-management-gateway/en/us/pricing-licensing.aspx
-
-http://meta.wikimedia.org/wiki/Talk:XFF_project
-
-% dig nats.epiccash.com
-
-; <<>> DiG 9.7.3 <<>> nats.epiccash.com
-;; global options: +cmd
-;; Got answer:
-;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 14920
-;; flags: qr rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 2, ADDITIONAL: 0
-
-;; QUESTION SECTION:
-;nats.epiccash.com. IN A
-
-;; ANSWER SECTION:
-nats.epiccash.com. 5 IN A 172.27.0.1
-
-;; AUTHORITY SECTION:
-epiccash.com. 5 IN NS ns0.example.net.
-epiccash.com. 5 IN NS ns1.example.net.
-
-;; Query time: 81 msec
-;; SERVER: 172.16.42.2#53(172.16.42.2)
-;; WHEN: Sat Jul 16 16:14:11 2011
-;; MSG SIZE rcvd: 98
-
-If we think it's squid, we can perhaps confirm it:
-echo -e "GET cache_object://localhost/info HTTP/1.0\r\n" | nc en.wikipedia.com 80
-Harvest urls from:
-http://urlblacklist.com/?sec=download
-
-https://secure.wikimedia.org/wikipedia/simple/wiki/User_talk:62.30.249.131
-
-mention WCCPv2 filters (http://www.cl.cam.ac.uk/~rnc1/talks/090528-uknof13.pdf)
-
-Cite a bunch of Richard's work:
-http://www.cl.cam.ac.uk/~rnc1/ignoring.pdf
-
-http://www.contentkeeper.com/products/web
-
-We should detect HTTP re-directs to rfc-1918 addresses; they're almost always captive portals.
-We should also detect HTTP MITM served from rfc-1918 addresses for the same reason.
-
-We should take a page from sshshuttle and run without touching the disk
-
-VIA Rail MITM's SSL In Ottawa:
-Jul 22 17:47:21.983 [Warning] Problem bootstrapping. Stuck at 85%: Finishing handshake with first hop. (DONE; DONE; count 13; recommendation warn)
-
-http://wireless.colubris.com:81/goform/HtmlLoginRequest?username=al1852&password=al1852
-
-VIA Rail Via header (DONE):
-
-HTTP/1.0 301 Moved Permanently
-Location: http://www.google.com/
-Content-Type: text/html; charset=UTF-8
-Date: Sat, 23 Jul 2011 02:21:30 GMT
-Expires: Mon, 22 Aug 2011 02:21:30 GMT
-Cache-Control: public, max-age=2592000
-Server: gws
-Content-Length: 219
-X-XSS-Protection: 1; mode=block
-X-Cache: MISS from cache_server
-X-Cache-Lookup: MISS from cache_server:3128
-Via: 1.0 cache_server:3128 (squid/2.6.STABLE21)
-Connection: close
-
-<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
-<TITLE>301 Moved</TITLE></HEAD><BODY>
-<H1>301 Moved</H1>
-The document has moved
-<A HREF="http://www.google.com/">here</A>.
-</BODY></HTML>
-
-
-blocked site (DONE):
-
-HTTP/1.0 302 Moved Temporarily
-Server: squid/2.6.STABLE21
-Date: Sat, 23 Jul 2011 02:22:17 GMT
-Content-Length: 0
-Location: http://10.66.66.66/denied.html
-
-invalid request response:
-
-$ nc 8.8.8.8 80 (DONE)
-hjdashjkdsahjkdsa
-HTTP/1.0 400 Bad Request
-Server: squid/2.6.STABLE21
-Date: Sat, 23 Jul 2011 02:22:44 GMT
-Content-Type: text/html
-Content-Length: 1178
-Expires: Sat, 23 Jul 2011 02:22:44 GMT
-X-Squid-Error: ERR_INVALID_REQ 0
-X-Cache: MISS from cache_server
-X-Cache-Lookup: NONE from cache_server:3128
-Via: 1.0 cache_server:3128 (squid/2.6.STABLE21)
-Proxy-Connection: close
-
-<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
-<HTML><HEAD><META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=iso-8859-1">
-<TITLE>ERROR: The requested URL could not be retrieved</TITLE>
-<STYLE type="text/css"><!--BODY{background-color:#ffffff;font-family:verdana,sans-serif}PRE{font-family:sans-serif}--></STYLE>
-</HEAD><BODY>
-<H1>ERROR</H1>
-<H2>The requested URL could not be retrieved</H2>
-<HR noshade size="1px">
-<P>
-While trying to process the request:
-<PRE>
-hjdashjkdsahjkdsa
-
-</PRE>
-<P>
-The following error was encountered:
-<UL>
-<LI>
-<STRONG>
-Invalid Request
-</STRONG>
-</UL>
-
-<P>
-Some aspect of the HTTP Request is invalid. Possible problems:
-<UL>
-<LI>Missing or unknown request method
-<LI>Missing URL
-<LI>Missing HTTP Identifier (HTTP/1.0)
-<LI>Request is too large
-<LI>Content-Length missing for POST or PUT requests
-<LI>Illegal character in hostname; underscores are not allowed
-</UL>
-<P>Your cache administrator is <A HREF="mailto:root">root</A>.
-
-<BR clear="all">
-<HR noshade size="1px">
-<ADDRESS>
-Generated Sat, 23 Jul 2011 02:22:44 GMT by cache_server (squid/2.6.STABLE21)
-</ADDRESS>
-</BODY></HTML>
-
-nc 10.66.66.66 80
-GET cache_object://localhost/info HTTP/1.0
-HTTP/1.0 403 Forbidden
-Server: squid/2.6.STABLE21
-Date: Sat, 23 Jul 2011 02:25:56 GMT
-Content-Type: text/html
-Content-Length: 1061
-Expires: Sat, 23 Jul 2011 02:25:56 GMT
-X-Squid-Error: ERR_ACCESS_DENIED 0
-X-Cache: MISS from cache_server
-X-Cache-Lookup: NONE from cache_server:3128
-Via: 1.0 cache_server:3128 (squid/2.6.STABLE21)
-Proxy-Connection: close
-
-<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
-<HTML><HEAD><META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=iso-8859-1">
-<TITLE>ERROR: The requested URL could not be retrieved</TITLE>
-<STYLE type="text/css"><!--BODY{background-color:#ffffff;font-family:verdana,sans-serif}PRE{font-family:sans-serif}--></STYLE>
-</HEAD><BODY>
-<H1>ERROR</H1>
-<H2>The requested URL could not be retrieved</H2>
-<HR noshade size="1px">
-<P>
-While trying to retrieve the URL:
-<A HREF="cache_object://localhost/info">cache_object://localhost/info</A>
-<P>
-The following error was encountered:
-<UL>
-<LI>
-<STRONG>
-Access Denied.
-</STRONG>
-<P>
-Access control configuration prevents your request from
-being allowed at this time. Please contact your service provider if
-you feel this is incorrect.
-</UL>
-<P>Your cache administrator is <A HREF="mailto:root">root</A>.
-
-
-<BR clear="all">
-<HR noshade size="1px">
-<ADDRESS>
-Generated Sat, 23 Jul 2011 02:25:56 GMT by cache_server (squid/2.6.STABLE21)
-</ADDRESS>
-</BODY></HTML>
-
-
diff --git a/to-be-ported/spec/proxooni-spec.txt b/to-be-ported/spec/proxooni-spec.txt
deleted file mode 100644
index 7cc476f..0000000
--- a/to-be-ported/spec/proxooni-spec.txt
+++ /dev/null
@@ -1,65 +0,0 @@
-
- Proxyooni specification
- version 0.0
- Jacob Appelbaum
-
-0. Preface
-
- This document describes a new proxy that is required to support ooni-probe.
-
-1. Overview
-
- There is no common proxy type that thwarts even the most basic traffic
- monitoring. The Proxyooni specification aims to provide a proxy that is
- encrypted by default, optionally authenticated, and will provide a way to run
- specific ooni-probe tests natively on the system where the proxy is running.
-
-2. Implementation
-
- Proxyooni may be written in any language, the reference implementation will be
- implemented in Python. The program shall be called ooni-proxy and it will handle
- running as a privileged user or an unprivileged user on supported systems. We
- aim to support ooni-proxy on Debian Gnu/Linux as the reference platform.
-
-2.1 Connections
-
- When ooni-proxy runs, it should open a single port and it will allow TLS 1.0
- clients to connect with a cipher suite that provides perfect forward secrecy.
-
-2.2 Certificates
-
- ooni-proxy should use a certificate if supplied or dynamically generate a
- certificate on startup; any connecting client should bootstrap trust with a
- TOFU model, a client may ignore the
-
-2.3 Authentication
-
- ooni-proxy should provide open access by default with no authentication.
- It should support TLS-PSK[0] if authentication is desired. Key distribution is
- explictly an out of scope problem.
-
-3.0 Services offered
-
- Post authentication, a remote client should treat ooni-proxy as a SOCKS4A[1]
- proxy. It should be possible to chain as many Proxyooni proxies as desired.
-
-3.1 Additional services offered
-
- ooni-proxy should allow for the sending of raw socket data - this is currently
- left unspecified. This should be specified in the next revision of the
- specification.
-
-3.2 Advanced meta-services
-
- It may be desired to load code on the ooni-proxy from a client with newer
- tests. This should be specified in the next revision of the specification.
-
-4. Security Concerns
-
- It is probably not a good idea to run ooni-proxy unless you have permission to
- do so. Consider your network context carefully; if it is dangerous to run a test
- ensure that you do not run the test.
-
-[0] http://en.wikipedia.org/wiki/TLS-PSK
-[1] http://en.wikipedia.org/wiki/SOCKS#SOCKS_4a
-
diff --git a/to-be-ported/very-old/TODO.plgoons b/to-be-ported/very-old/TODO.plgoons
deleted file mode 100644
index ace2a10..0000000
--- a/to-be-ported/very-old/TODO.plgoons
+++ /dev/null
@@ -1,79 +0,0 @@
-We should implement the following as plugoons:
-
-dns_plgoo.py - Various DNS checks
-
-As a start - we should perform a known good check against a name or list of
-names. As input, we should take an ip address, a name or a list of names for
-testing; we also take dns servers for experiment or control data. For output we
-emit UDP or TCP packets - we should support proxying these requests when
-possible as is the case with TCP but probably not with UDP for certain DNS
-request types.
-
-http_plgoo.py - Various HTTP checks
-
-We should compare two pages and see if we have identical properties.
-At the very least, we should print the important differences - perhaps
-with a diff like output? We should look for fingerprints in URLS that are
-returned. We should detect 302 re-direction.
-
-As input, we should take an ip address, a name or a list of names for testing;
-we also take a list of headers such as random user agent strings and so on.
-We should emit TCP packets and ensure that we do not leak DNS for connections
-that we expect to proxy to a remote network.
-
-latency_plgoo.py - Measure latency for a host or a list of hosts
-
-As input, we should take an ip address, a name or a list of names for testing;
-We should measure the mean latency from the ooni-probe to the host with various
-traceroute tests. We should also measure the latency between the ooni-probe and
-a given server for any other protocol that is request and response oriented;
-HTTP latency may be calculated by simply tracking the delta between requests
-and responses.
-
-tcptrace_plgoo.py udptrace_plgoo.py icmptrace_plgoo.py - Traceroute suites
-
-tcptrace_plgoo.py should allow for both stray and in-connection traceroute
-modes.
-
-udptrace_plgoo.py should use UDP 53 by default; 0 and 123 are also nice options
-- it may also be nice to simply make a random A record request in a DNS packet
-and use it as the payload for a UDP traceroute.
-
-reversetrace_plgoo.py should give a remote host the client's IP and return the
-output of a traceroute to that IP from the remote host. It will need a remote
-component if run against a web server. It would not need a remote component if
-run against route-views - we can simply telnet over Tor and ask it to trace to
-our detected client IP.
-
-keyword_plgoo.py should take a keyword or a list of keywords for use as a
-payload in a varity of protocols. This should be protocol aware - dns keyword
-filtering requires a sniffer to catch stray packets after the censor wins the
-race. HTTP payloads in open connections may be similar and in practice, we'll
-have to find tune it.
-
-icsi_plgoo.py - The ICSI Netalyzr tests; we should act as a client for their
-servers. They have dozens of tests and to implement this plgoo, we'll need to
-add many things to ooni. More details here:
-http://netalyzr.icsi.berkeley.edu/faq.html
-http://netalyzr.icsi.berkeley.edu/json/id=example-session
-
-HTML output:
-http://n2.netalyzr.icsi.berkeley.edu/summary/id=43ca208a-3466-82f17207-9bc1-433f-9b43
-
-JSON output:
-http://n2.netalyzr.icsi.berkeley.edu/json/id=43ca208a-3466-82f17207-9bc1-433f-9b43
-
-Netalyzer log:
-http://netalyzr.icsi.berkeley.edu/restore/id=43ca208a-3466-82f17207-9bc1-433f-9b43
-http://n2.netalyzr.icsi.berkeley.edu/transcript/id=43ca208a-3466-82f17207-9bc1-433f-9b43/side=client
-http://n2.netalyzr.icsi.berkeley.edu/transcript/id=43ca208a-3466-82f17207-9bc1-433f-9b43/side=server
-
-sniffer_plgoo.py - We need a generic method for capturing packets during a full
-run - this may be better as a core ooni-probe feature but we should implement
-packet capture in a plugin if it is done no where else.
-
-nmap_plgoo.py - We should take a list of hosts and run nmap against each of
-these hosts; many hosts are collected during testing and they should be scanned
-with something reasonable like "-A -O -T4 -sT --top-ports=10000" or something
-more reasonable.
-
diff --git a/to-be-ported/very-old/TO_BE_PORTED b/to-be-ported/very-old/TO_BE_PORTED
deleted file mode 100644
index 49ce5e0..0000000
--- a/to-be-ported/very-old/TO_BE_PORTED
+++ /dev/null
@@ -1,14 +0,0 @@
-
-The tests in this directory are very old, and have neither been ported to
-Twisted, nor to the new twisted.trial API framework. Although, they are not
-old in the sense of the *seriously old* OONI code which was written two years
-ago.
-
-These tests should be updated at least to use Twisted.
-
-If you want to hack on something care free, feel free to mess with these files
-because it would be difficult to not improve on them.
-
-<(A)3
-isis
-0x2cdb8b35
diff --git a/to-be-ported/very-old/ooni-probe.diff b/to-be-ported/very-old/ooni-probe.diff
deleted file mode 100644
index fc61d3f..0000000
--- a/to-be-ported/very-old/ooni-probe.diff
+++ /dev/null
@@ -1,358 +0,0 @@
-diff --git a/TODO b/TODO
-index c2e19af..51fa559 100644
---- a/TODO
-+++ b/TODO
-@@ -293,3 +293,142 @@ VIA Rail MITM's SSL In Ottawa:
- Jul 22 17:47:21.983 [Warning] Problem bootstrapping. Stuck at 85%: Finishing handshake with first hop. (DONE; DONE; count 13; recommendation warn)
-
- http://wireless.colubris.com:81/goform/HtmlLoginRequest?username=al1852&pas…
-+
-+VIA Rail Via header:
-+
-+HTTP/1.0 301 Moved Permanently
-+Location: http://www.google.com/
-+Content-Type: text/html; charset=UTF-8
-+Date: Sat, 23 Jul 2011 02:21:30 GMT
-+Expires: Mon, 22 Aug 2011 02:21:30 GMT
-+Cache-Control: public, max-age=2592000
-+Server: gws
-+Content-Length: 219
-+X-XSS-Protection: 1; mode=block
-+X-Cache: MISS from cache_server
-+X-Cache-Lookup: MISS from cache_server:3128
-+Via: 1.0 cache_server:3128 (squid/2.6.STABLE21)
-+Connection: close
-+
-+<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
-+<TITLE>301 Moved</TITLE></HEAD><BODY>
-+<H1>301 Moved</H1>
-+The document has moved
-+<A HREF="http://www.google.com/">here</A>.
-+</BODY></HTML>
-+
-+
-+blocked site:
-+
-+HTTP/1.0 302 Moved Temporarily
-+Server: squid/2.6.STABLE21
-+Date: Sat, 23 Jul 2011 02:22:17 GMT
-+Content-Length: 0
-+Location: http://10.66.66.66/denied.html
-+
-+invalid request response:
-+
-+$ nc 8.8.8.8 80
-+hjdashjkdsahjkdsa
-+HTTP/1.0 400 Bad Request
-+Server: squid/2.6.STABLE21
-+Date: Sat, 23 Jul 2011 02:22:44 GMT
-+Content-Type: text/html
-+Content-Length: 1178
-+Expires: Sat, 23 Jul 2011 02:22:44 GMT
-+X-Squid-Error: ERR_INVALID_REQ 0
-+X-Cache: MISS from cache_server
-+X-Cache-Lookup: NONE from cache_server:3128
-+Via: 1.0 cache_server:3128 (squid/2.6.STABLE21)
-+Proxy-Connection: close
-+
-+<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
-+<HTML><HEAD><META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=iso-8859-1">
-+<TITLE>ERROR: The requested URL could not be retrieved</TITLE>
-+<STYLE type="text/css"><!--BODY{background-color:#ffffff;font-family:verdana,sans-serif}PRE{font-family:sans-serif}--></STYLE>
-+</HEAD><BODY>
-+<H1>ERROR</H1>
-+<H2>The requested URL could not be retrieved</H2>
-+<HR noshade size="1px">
-+<P>
-+While trying to process the request:
-+<PRE>
-+hjdashjkdsahjkdsa
-+
-+</PRE>
-+<P>
-+The following error was encountered:
-+<UL>
-+<LI>
-+<STRONG>
-+Invalid Request
-+</STRONG>
-+</UL>
-+
-+<P>
-+Some aspect of the HTTP Request is invalid. Possible problems:
-+<UL>
-+<LI>Missing or unknown request method
-+<LI>Missing URL
-+<LI>Missing HTTP Identifier (HTTP/1.0)
-+<LI>Request is too large
-+<LI>Content-Length missing for POST or PUT requests
-+<LI>Illegal character in hostname; underscores are not allowed
-+</UL>
-+<P>Your cache administrator is <A HREF="mailto:root">root</A>.
-+
-+<BR clear="all">
-+<HR noshade size="1px">
-+<ADDRESS>
-+Generated Sat, 23 Jul 2011 02:22:44 GMT by cache_server (squid/2.6.STABLE21)
-+</ADDRESS>
-+</BODY></HTML>
-+
-+nc 10.66.66.66 80
-+GET cache_object://localhost/info HTTP/1.0
-+HTTP/1.0 403 Forbidden
-+Server: squid/2.6.STABLE21
-+Date: Sat, 23 Jul 2011 02:25:56 GMT
-+Content-Type: text/html
-+Content-Length: 1061
-+Expires: Sat, 23 Jul 2011 02:25:56 GMT
-+X-Squid-Error: ERR_ACCESS_DENIED 0
-+X-Cache: MISS from cache_server
-+X-Cache-Lookup: NONE from cache_server:3128
-+Via: 1.0 cache_server:3128 (squid/2.6.STABLE21)
-+Proxy-Connection: close
-+
-+<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
-+<HTML><HEAD><META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=iso-8859-1">
-+<TITLE>ERROR: The requested URL could not be retrieved</TITLE>
-+<STYLE type="text/css"><!--BODY{background-color:#ffffff;font-family:verdana,sans-serif}PRE{font-family:sans-serif}--></STYLE>
-+</HEAD><BODY>
-+<H1>ERROR</H1>
-+<H2>The requested URL could not be retrieved</H2>
-+<HR noshade size="1px">
-+<P>
-+While trying to retrieve the URL:
-+<A HREF="cache_object://localhost/info">cache_object://localhost/info</A>
-+<P>
-+The following error was encountered:
-+<UL>
-+<LI>
-+<STRONG>
-+Access Denied.
-+</STRONG>
-+<P>
-+Access control configuration prevents your request from
-+being allowed at this time. Please contact your service provider if
-+you feel this is incorrect.
-+</UL>
-+<P>Your cache administrator is <A HREF="mailto:root">root</A>.
-+
-+
-+<BR clear="all">
-+<HR noshade size="1px">
-+<ADDRESS>
-+Generated Sat, 23 Jul 2011 02:25:56 GMT by cache_server (squid/2.6.STABLE21)
-+</ADDRESS>
-+</BODY></HTML>
-+
-+
-diff --git a/ooni/command.py b/ooni/command.py
-index 361190f..df1a58c 100644
---- a/ooni/command.py
-+++ b/ooni/command.py
-@@ -13,6 +13,7 @@ import ooni.captive_portal
- import ooni.namecheck
- import ooni.dns_poisoning
- import ooni.dns_cc_check
-+import ooni.transparenthttp
-
- class Command():
- def __init__(self, args):
-@@ -48,6 +49,15 @@ class Command():
- help="run captiveportal tests"
- )
-
-+ # --transhttp
-+ def cb_transhttp(option, opt, value, oparser):
-+ self.action = opt[2:]
-+ optparser.add_option(
-+ "--transhttp",
-+ action="callback", callback=cb_transhttp,
-+ help="run Transparent HTTP tests"
-+ )
-+
- # --dns
- def cb_dnstests(option, opt, value, oparser):
- self.action = opt[2:]
-@@ -122,7 +132,7 @@ class Command():
- if (not self.action):
- raise optparse.OptionError(
- 'is required',
-- '--dns | --dnsbulk | --captiveportal | --help | --version'
-+ '--dns | --dnsbulk | --dnscccheck | [ --cc CC ] | --captiveportal | --transhttp | --help | --version'
- )
-
- except optparse.OptionError, err:
-@@ -138,6 +148,10 @@ class Command():
- captive_portal = ooni.captive_portal.CaptivePortal
- captive_portal(self).main()
-
-+ def transhttp(self):
-+ transparent_http = ooni.transparenthttp.TransparentHTTPProxy
-+ transparent_http(self).main()
-+
- def dns(self):
- dnstests = ooni.namecheck.DNS
- dnstests(self).main()
-diff --git a/ooni/dns.py b/ooni/dns.py
-index 95da6ef..90d50bd 100644
---- a/ooni/dns.py
-+++ b/ooni/dns.py
-@@ -8,7 +8,7 @@ from socket import gethostbyname
- import ooni.common
-
- # apt-get install python-dns
--import DNS
-+import dns
- import random
-
- """ Wrap gethostbyname """
-diff --git a/ooni/http.py b/ooni/http.py
-index 62365bb..bb72001 100644
---- a/ooni/http.py
-+++ b/ooni/http.py
-@@ -7,8 +7,14 @@
- from socket import gethostbyname
- import ooni.common
- import urllib2
-+import httplib
-+from urlparse import urlparse
-+from pprint import pprint
- import pycurl
-+import random
-+import string
- import re
-+from BeautifulSoup import BeautifulSoup
-
- # By default, we'll be Torbutton's UA
- default_ua = { 'User-Agent' :
-@@ -20,20 +26,8 @@ default_proxy_type = PROXYTYPE_SOCKS5
- default_proxy_host = "127.0.0.1"
- default_proxy_port = "9050"
-
--
--
--
--
--
--
--
--
--
--
--
--
--
--
-+#class HTTPResponse(object):
-+# def __init__(self):
-
-
- """A very basic HTTP fetcher that uses Tor by default and returns a curl
-@@ -51,7 +45,7 @@ def http_proxy_fetch(url, headers, proxy_type=5,
- http_code = getinfo(pycurl.HTTP_CODE)
- return response, http_code
-
--"""A very basic HTTP fetcher that returns a urllib3 response object."""
-+"""A very basic HTTP fetcher that returns a urllib2 response object."""
- def http_fetch(url,
- headers= default_ua,
- label="generic HTTP fetch"):
-@@ -136,6 +130,76 @@ def http_header_no_match(experiment_url, control_header, control_result):
- else:
- return True
-
-+def http_request(self, method, url, path=None):
-+ """Takes as argument url that is perfectly formed (http://hostname/REQUEST"""
-+ purl = urlparse(url)
-+ host = purl.netloc
-+ conn = httplib.HTTPConnection(host, 80)
-+ if path is None:
-+ path = purl.path
-+ conn.request(method, purl.path)
-+ response = conn.getresponse()
-+ headers = dict(response.getheaders())
-+ self.headers = headers
-+ self.data = response.read()
-+ return True
-+
-+def search_headers(self, s_headers, url):
-+ if http_request(self, "GET", url):
-+ headers = self.headers
-+ else:
-+ return None
-+ result = {}
-+ for h in s_headers.items():
-+ result[h[0]] = h[0] in headers
-+ return result
-+
-+def http_header_match_dict(experimental_url, dict_header):
-+ result = {}
-+ url_header = http_get_header_dict(experimental_url)
-+
-+# XXX for testing
-+# [('content-length', '9291'), ('via', '1.0 cache_server:3128 (squid/2.6.STABLE21)'), ('x-cache', 'MISS from cache_server'), ('accept-ranges', 'bytes'), ('server', 'Apache/2.2.16 (Debian)'), ('last-modified', 'Fri, 22 Jul 2011 03:00:31 GMT'), ('connection', 'close'), ('etag', '"105801a-244b-4a89fab1e51c0;49e684ba90c80"'), ('date', 'Sat, 23 Jul 2011 03:03:56 GMT'), ('content-type', 'text/html'), ('x-cache-lookup', 'MISS from cache_server:3128')]
-+
-+def search_squid_headers(self):
-+ url = "http://securityfocus.org/blabla"
-+ s_headers = {'via': '1.0 cache_server:3128 (squid/2.6.STABLE21)', 'x-cache': 'MISS from cache_server', 'x-cache-lookup':'MISS from cache_server:3128'}
-+ ret = search_headers(self, s_headers, url)
-+ for i in ret.items():
-+ if i[1] is True:
-+ return False
-+ return True
-+
-+def random_bad_request(self):
-+ url = "http://securityfocus.org/blabla"
-+ r_str = ''.join(random.choice(string.ascii_uppercase + string.digits) for x in range(random.randint(5,20)))
-+ if http_request(self, r_str, url):
-+ return True
-+ else:
-+ return None
-+
-+def squid_search_bad_request(self):
-+ if random_bad_request(self):
-+ s_headers = {'X-Squid-Error' : 'ERR_INVALID_REQ 0'}
-+ for i in s_headers.items():
-+ if i[0] in self.headers:
-+ return False
-+ return True
-+ else:
-+ return None
-+
-+def squid_cacheobject_request(self):
-+ url = "http://securityfocus.org/blabla"
-+ if http_request(self, "GET", url, "cache_object://localhost/info"):
-+ soup = BeautifulSoup(self.data)
-+ if soup.find('strong') and soup.find('strong').string == "Access Denied.":
-+ return False
-+ else:
-+ return True
-+ else:
-+ return None
-+
-+
- def MSHTTP_CP_Tests(self):
- experiment_url = "http://www.msftncsi.com/ncsi.txt"
- expectedResponse = "Microsoft NCSI" # Only this - nothing more
-@@ -186,6 +250,18 @@ def WC3_CP_Tests(self):
-
- # Google ChromeOS fetches this url in guest mode
- # and they expect the user to authenticate
-- def googleChromeOSHTTPTest(self):
-- print "noop"
-- #url = "http://www.google.com/"
-+def googleChromeOSHTTPTest(self):
-+ print "noop"
-+ #url = "http://www.google.com/"
-+
-+def SquidHeader_TransparentHTTP_Tests(self):
-+ return search_squid_headers(self)
-+
-+def SquidBadRequest_TransparentHTTP_Tests(self):
-+ squid_cacheobject_request(self)
-+ return squid_search_bad_request(self)
-+
-+def SquidCacheobject_TransparentHTTP_Tests(self):
-+ return squid_cacheobject_request(self)
-+
-+
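The transparent-HTTP additions above detect a Squid proxy by probing for its telltale response headers (`Via`, `X-Cache`, `X-Cache-Lookup`, `X-Squid-Error`). A minimal sketch of that header-matching logic, detached from the live HTTP request (function names follow the patch, the case-insensitive comparison is an assumption):

```python
def search_headers(signature, response_headers):
    """Map each signature header name to whether it appears in the
    response; header names are compared case-insensitively."""
    present = {k.lower() for k in response_headers}
    return {name: name.lower() in present for name in signature}

def looks_like_squid(response_headers):
    """True if any well-known Squid header shows up in the response."""
    signature = {
        "Via": None,            # e.g. '1.0 cache_server:3128 (squid/2.6.STABLE21)'
        "X-Cache": None,        # e.g. 'MISS from cache_server'
        "X-Cache-Lookup": None, # e.g. 'MISS from cache_server:3128'
        "X-Squid-Error": None,  # e.g. 'ERR_INVALID_REQ 0'
    }
    return any(search_headers(signature, response_headers).values())
```

In the patch the inverse convention is used: a test returns False (network "dirty") when any signature header is found.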
diff --git a/to-be-ported/very-old/ooni/#namecheck.py# b/to-be-ported/very-old/ooni/#namecheck.py#
deleted file mode 100644
index 1a2a3f0..0000000
--- a/to-be-ported/very-old/ooni/#namecheck.py#
+++ /dev/null
@@ -1,39 +0,0 @@
-#!/usr/bin/env python
-#
-# DNS tampering detection module
-# by Jacob Appelbaum <jacob(a)appelbaum.net>
-#
-# This module performs multiple DNS tests.
-
-import sys
-import ooni.dnsooni
-
-class DNS():
- def __init__(self, args):
- self.in_ = sys.stdin
- self.out = sys.stdout
- self.debug = False
- self.randomize = args.randomize
-
- def DNS_Tests(self):
- print "DNS tampering detection:"
- filter_name = "_DNS_Tests"
- tests = [ooni.dnsooni]
- for test in tests:
- for function_ptr in dir(test):
- if function_ptr.endswith(filter_name):
- filter_result = getattr(test, function_ptr)(self)
- if filter_result == True:
- print function_ptr + " thinks the network is clean"
- elif filter_result == None:
- print function_ptr + " failed"
- else:
- print function_ptr + " thinks the network is dirty"
-
- def main(self):
- for function_ptr in dir(self):
- if function_ptr.endswith("_Tests"):
- getattr(self, function_ptr)()
-
-if __name__ == '__main__':
- self.main()
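The deleted `#namecheck.py#` above (an Emacs autosave file, hence the cruft cleanup) dispatches tests by reflection: any attribute whose name ends in `_Tests` is called and its True/None/other result reported. A self-contained sketch of that dispatch pattern, with stand-in test methods:

```python
class SuiteRunner:
    """Reflection-based dispatch as in the deleted namecheck.py: every
    method whose name ends in '_Tests' is treated as a test entry point."""

    def clean_DNS_Tests(self):
        return True   # stands in for "network looks clean"

    def failing_DNS_Tests(self):
        return None   # stands in for "probe failed"

    def run(self):
        results = {}
        for name in dir(self):
            if name.endswith("_Tests"):
                results[name] = getattr(self, name)()
        return results
```

The convention lets new tests be added just by defining a suitably named method, at the cost of making the call graph invisible to static tools.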
diff --git a/to-be-ported/very-old/ooni/.DS_Store b/to-be-ported/very-old/ooni/.DS_Store
deleted file mode 100644
index f5738a5..0000000
Binary files a/to-be-ported/very-old/ooni/.DS_Store and /dev/null differ
diff --git a/to-be-ported/very-old/ooni/__init__.py b/to-be-ported/very-old/ooni/__init__.py
deleted file mode 100644
index 8f1b96e..0000000
--- a/to-be-ported/very-old/ooni/__init__.py
+++ /dev/null
@@ -1,12 +0,0 @@
-"""\
-This is your package, 'ooni'.
-
-It was provided by the package, `package`.
-
-Please change this documentation, and write this module!
-"""
-
-__version__ = '0.0.1'
-
-# If you run 'make test', this is your failing test.
-# raise Exception("\n\n\tNow it's time to write your 'ooni' module!!!\n\n")
diff --git a/to-be-ported/very-old/ooni/command.py b/to-be-ported/very-old/ooni/command.py
deleted file mode 100644
index e5f8f9f..0000000
--- a/to-be-ported/very-old/ooni/command.py
+++ /dev/null
@@ -1,250 +0,0 @@
-# -*- coding: utf-8
-"""\
-Command line UI module for ooni-probe - heavily inspired by Ingy döt Net
-"""
-
-import os
-import sys
-import re
-import optparse
-
-# Only include high level ooni tests at this time
-import ooni.captive_portal
-import ooni.namecheck
-import ooni.dns_poisoning
-import ooni.dns_cc_check
-import ooni.transparenthttp
-import ooni.helpers
-import ooni.plugooni
-import ooni.input
-
-class Command():
- def __init__(self, args):
- sys.argv = sys.argv[0:1]
- sys.argv.extend(args)
- self.startup_options()
-
- def startup_options(self):
- self.action = None
- self.from_ = None
- self.to = None
- self.parser = None
- self.emitter = None
- self.emit_header = None
- self.emit_trailer = None
- self.in_ = sys.stdin
- self.out = sys.stdout
- self.debug = False
- self.randomize = True
- self.cc = None
- self.hostname = None
- self.listfile = None
- self.listplugooni = False
- self.plugin_name = "all"
- self.controlproxy = None # "socks4a://127.0.0.1:9050/"
- self.experimentproxy = None
-
- usage = """
-
- 'ooni' is the Open Observatory of Network Interference
-
- command line usage: ooni-probe [options]"""
-
- optparser = optparse.OptionParser(usage=usage)
-
- # --plugin
- def cb_plugin(option, opt, value, oparser):
- self.action = opt[2:]
- self.plugin_name = str(value)
- optparser.add_option(
- "--plugin", type="string",
- action="callback", callback=cb_plugin,
- help="run the Plugooni plgoo plugin specified"
- )
-
- # --listplugins
- def cb_list_plugins(option, opt, value, oparser):
- self.action = opt[2:]
- optparser.add_option(
- "--listplugins",
- action="callback", callback=cb_list_plugins,
- help="list available Plugooni as plgoos plugin names"
- )
-
- # --captiveportal
- def cb_captiveportal(option, opt, value, oparser):
- self.action = opt[2:]
- optparser.add_option(
- "--captiveportal",
- action="callback", callback=cb_captiveportal,
- help="run vendor emulated captiveportal tests"
- )
-
- # --transhttp
- def cb_transhttp(option, opt, value, oparser):
- self.action = opt[2:]
- optparser.add_option(
- "--transhttp",
- action="callback", callback=cb_transhttp,
- help="run Transparent HTTP tests"
- )
-
- # --dns
- def cb_dnstests(option, opt, value, oparser):
- self.action = opt[2:]
- optparser.add_option(
- "--dns",
- action="callback", callback=cb_dnstests,
- help="run fixed generic dns tests"
- )
-
- # --dnsbulk
- def cb_dnsbulktests(option, opt, value, oparser):
- self.action = opt[2:]
- optparser.add_option(
- "--dnsbulk",
- action="callback", callback=cb_dnsbulktests,
- help="run bulk DNS tests in random.shuffle() order"
- )
-
- # --dns-cc-check
- def cb_dnscccheck(option, opt, value, oparser):
- self.action = opt[2:]
- optparser.add_option(
- "--dnscccheck",
- action="callback", callback=cb_dnscccheck,
- help="run cc specific bulk DNS tests in random.shuffle() order"
- )
-
- # --cc [country code]
- def cb_cc(option, opt, value, optparser):
- # XXX: We should check this against a list of supported county codes
- # and then return the matching value from the list into self.cc
- self.cc = str(value)
- optparser.add_option(
- "--cc", type="string",
- action="callback", callback=cb_cc,
- help="set a specific county code -- default is None",
- )
-
- # --list [url/hostname/ip list in file]
- def cb_list(option, opt, value, optparser):
- self.listfile = os.path.expanduser(value)
- if not os.path.isfile(self.listfile):
- print "Wrong file '" + value + "' in --list."
- sys.exit(1)
- optparser.add_option(
- "--list", type="string",
- action="callback", callback=cb_list,
- help="file to read from -- default is None",
- )
-
- # --url [url/hostname/ip]
- def cb_host(option, opt, value, optparser):
- self.hostname = str(value)
- optparser.add_option(
- "--url", type="string",
- action="callback", callback=cb_host,
- help="set URL/hostname/IP for use in tests -- default is None",
- )
-
- # --controlproxy [scheme://host:port]
- def cb_controlproxy(option, opt, value, optparser):
- self.controlproxy = str(value)
- optparser.add_option(
- "--controlproxy", type="string",
- action="callback", callback=cb_controlproxy,
- help="proxy to be used as a control -- default is None",
- )
-
- # --experimentproxy [scheme://host:port]
- def cb_experimentproxy(option, opt, value, optparser):
- self.experimentproxy = str(value)
- optparser.add_option(
- "--experimentproxy", type="string",
- action="callback", callback=cb_experimentproxy,
- help="proxy to be used for experiments -- default is None",
- )
-
-
-
- # --randomize
- def cb_randomize(option, opt, value, optparser):
- self.randomize = bool(int(value))
- optparser.add_option(
- "--randomize", type="choice",
- choices=['0', '1'], metavar="0|1",
- action="callback", callback=cb_randomize,
- help="randomize host order -- default is on",
- )
-
- # XXX TODO:
- # pause/resume scans for dns_BULK_DNS_Tests()
- # setting of control/experiment resolver
- # setting of control/experiment proxy
- #
-
- def cb_version(option, opt, value, oparser):
- self.action = 'version'
- optparser.add_option(
- "-v", "--version",
- action="callback", callback=cb_version,
- help="print ooni-probe version"
- )
-
- # parse options
- (opts, args) = optparser.parse_args()
-
- # validate options
- try:
- if (args):
- raise optparse.OptionError('extra arguments found', args)
- if (not self.action):
- raise optparse.OptionError(
- 'RTFS', 'required arguments missing'
- )
-
- except optparse.OptionError, err:
- sys.stderr.write(str(err) + '\n\n')
- optparser.print_help()
- sys.exit(1)
-
- def version(self):
- print """
-ooni-probe pre-alpha
-Copyright (c) 2011, Jacob Appelbaum, Arturo Filastò
-See: https://www.torproject.org/ooni/
-
-"""
-
- def run(self):
- getattr(self, self.action)()
-
- def plugin(self):
- plugin_run = ooni.plugooni.Plugooni
- plugin_run(self).run(self)
-
- def listplugins(self):
- plugin_run = ooni.plugooni.Plugooni
- plugin_run(self).list_plugoons()
-
- def captiveportal(self):
- captive_portal = ooni.captive_portal.CaptivePortal
- captive_portal(self).main()
-
- def transhttp(self):
- transparent_http = ooni.transparenthttp.TransparentHTTPProxy
- transparent_http(self).main()
-
- def dns(self):
- dnstests = ooni.namecheck.DNS
- dnstests(self).main()
-
- def dnsbulk(self):
- dnstests = ooni.dns_poisoning.DNSBulk
- dnstests(self).main()
-
- def dnscccheck(self):
- dnstests = ooni.dns_cc_check.DNSBulk
- dnstests(self).main()
-
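The deleted `command.py` above wires optparse callbacks so that each action flag stores its own name into `self.action`, and `run()` later resolves that name to a method with `getattr`. A hedged modern sketch of the same flag-to-method dispatch using argparse (the flag list and method body are illustrative):

```python
import argparse

class Command:
    """Sketch of command.py's dispatch: each action flag records its own
    name as self.action, and run() resolves it with getattr()."""

    ACTIONS = ("dns", "dnsbulk", "captiveportal", "transhttp")

    def __init__(self, argv):
        parser = argparse.ArgumentParser(prog="ooni-probe")
        for flag in self.ACTIONS:
            parser.add_argument("--" + flag, action="store_true",
                                help="run %s tests" % flag)
        opts = parser.parse_args(argv)
        # first selected flag wins, mirroring the callback-sets-action style
        self.action = next((f for f in self.ACTIONS if getattr(opts, f)), None)

    def run(self):
        if self.action is None:
            raise SystemExit("--dns | --dnsbulk | --captiveportal | --transhttp required")
        return getattr(self, self.action)()

    def transhttp(self):
        return "transparent-http suite would run here"
```

The getattr dispatch is what made the `--transhttp` hunk at the top of this patch a two-part change: one callback to set the action, one method of the same name to run it.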
diff --git a/to-be-ported/very-old/ooni/dns_poisoning.py b/to-be-ported/very-old/ooni/dns_poisoning.py
deleted file mode 100644
index 939391e..0000000
--- a/to-be-ported/very-old/ooni/dns_poisoning.py
+++ /dev/null
@@ -1,43 +0,0 @@
-#!/usr/bin/env python
-#
-# DNS tampering detection module
-# by Jacob Appelbaum <jacob(a)appelbaum.net>
-#
-# This module performs DNS queries against a known good resolver and a possible
-# bad resolver. We compare every resolved name against a list of known filters
-# - if we match, we ring a bell; otherwise, we list possible filter IP
-# addresses. There is a high false positive rate for sites that are GeoIP load
-# balanced.
-#
-
-import sys
-import ooni.dnsooni
-
-class DNSBulk():
- def __init__(self, args):
- self.in_ = sys.stdin
- self.out = sys.stdout
- self.randomize = args.randomize
- self.debug = False
-
- def DNS_Tests(self):
- print "DNS tampering detection for list of domains:"
- filter_name = "_DNS_BULK_Tests"
- tests = [ooni.dnsooni]
- for test in tests:
- for function_ptr in dir(test):
- if function_ptr.endswith(filter_name):
- filter_result = getattr(test, function_ptr)(self)
- if filter_result == True:
- print function_ptr + " thinks the network is clean"
- elif filter_result == None:
- print function_ptr + " failed"
- else:
- print function_ptr + " thinks the network is dirty"
- def main(self):
- for function_ptr in dir(self):
- if function_ptr.endswith("_Tests"):
- getattr(self, function_ptr)()
-
-if __name__ == '__main__':
- self.main()
diff --git a/to-be-ported/very-old/ooni/dnsooni.py b/to-be-ported/very-old/ooni/dnsooni.py
deleted file mode 100644
index bfdfe51..0000000
--- a/to-be-ported/very-old/ooni/dnsooni.py
+++ /dev/null
@@ -1,356 +0,0 @@
-#!/usr/bin/env python
-#
-# DNS support for ooni-probe
-# by Jacob Appelbaum <jacob(a)appelbaum.net>
-#
-
-from socket import gethostbyname
-import ooni.common
-
-# requires python-dns
-# (pydns.sourceforge.net)
-try:
- import DNS
-# Mac OS X needs this
-except:
- try:
- import dns as DNS
- except:
- pass # Never mind, let's break later.
-import random
-from pprint import pprint
-
-""" Wrap gethostbyname """
-def dns_resolve(hostname):
- try:
- resolved_host = gethostbyname(hostname)
- return resolved_host
- except:
- return False
-
-"""Perform a resolution on test_hostname and compare it with the expected
- control_resolved ip address. Optionally, a label may be set to customize
- output. If the experiment matches the control, this returns True; otherwise
- it returns False.
-"""
-def dns_resolve_match(experiment_hostname, control_resolved,
- label="generic DNS comparison"):
- experiment_resolved = dns_resolve(experiment_hostname)
- if experiment_resolved == False:
- return None
- if experiment_resolved:
- if str(experiment_resolved) != str(control_resolved):
- print label + " control " + str(control_resolved) + " data does not " \
- "match experiment response: " + str(experiment_resolved)
- return False
- return True
-
-def generic_DNS_resolve(experiment_hostname, experiment_resolver):
- if experiment_resolver == None:
- req = DNS.Request(name=experiment_hostname) # local resolver
- else:
- req = DNS.Request(name=experiment_hostname, server=experiment_resolver) #overide
- resolved_data = req.req().answers
- return resolved_data
-
-""" Return a list of all known censors. """
-def load_list_of_known_censors(known_proxy_file=None):
- proxyfile = "proxy-lists/ips.txt"
- known_proxy_file = open(proxyfile, 'r', 1)
- known_proxy_list = []
- for known_proxy in known_proxy_file.readlines():
- known_proxy_list.append(known_proxy)
- known_proxy_file.close()
- known_proxy_count = len(known_proxy_list)
- print "Loading " + str(known_proxy_count) + " known proxies..."
- return known_proxy_list, known_proxy_count
-
-def load_list_of_test_hosts(hostfile=None):
- if hostfile == None:
- hostfile="censorship-lists/norwegian-dns-blacklist.txt"
- host_list_file = open(hostfile, 'r', 1)
- host_list = []
- for host_name in host_list_file.readlines():
- if host_name.isspace():
- continue
- else:
- host_list.append(host_name)
- host_list_file.close()
- host_count = len(host_list)
- #print "Loading " + str(host_count) + " test host names..."
- return host_list, host_count
-
-""" Return True with a list of censors if we find a known censor from
- known_proxy_list in the experiment_data DNS response. Otherwise return
- False and None. """
-def contains_known_censors(known_proxy_list, experiment_data):
- match = False
- proxy_list = []
- for answer in range(len(experiment_data)):
- for known_proxy in known_proxy_list:
- if answer == known_proxy:
- print "CONFLICT: known proxy discovered: " + str(known_proxy),
- proxy_list.append(known_proxy)
- match = True
- return match, proxy_list
-
-""" Return True and the experiment response that failed to match."""
-def compare_control_with_experiment(known_proxy_list, control_data, experiment_data):
- known_proxy_found, known_proxies = contains_known_censors(known_proxy_list, experiment_data)
- conflict_list = []
- conflict = False
- if known_proxy_found:
- print "known proxy discovered: " + str(known_proxies)
- for answer in range(len(control_data)):
- if control_data[answer]['data'] == experiment_data:
- print "control_data[answer]['data'] = " + str(control_data[answer]['data']) + "and experiment_data = " + str(experiment_data)
- continue
- else:
- conflict = True
- conflict_list.append(experiment_data)
- #print "CONFLICT: control_data: " + str(control_data) + " experiment_data: " + str(experiment_data),
- return conflict, conflict_list
-
-def dns_DNS_BULK_Tests(self, hostfile=None,
- known_good_resolver="8.8.8.8", test_resolver=None):
- tampering = False # By default we'll pretend the internet is nice
- tampering_list = []
- host_list, host_count = load_list_of_test_hosts()
- known_proxies, proxy_count = load_list_of_known_censors()
- check_count = 1
- if test_resolver == None:
- DNS.ParseResolvConf() # Set the local resolver as our default
- if self.randomize:
- random.shuffle(host_list) # This makes our list non-sequential for now
- for host_name in host_list:
- host_name = host_name.strip()
- print "Total progress: " + str(check_count) + " of " + str(host_count) + " hosts to check"
- print "Resolving with control resolver..."
- print "Testing " + host_name + " with control resolver: " + str(known_good_resolver)
- print "Testing " + host_name + " with experiment resolver: " + str(test_resolver)
- # XXX TODO - we need to keep track of the status of these requests and then resume them
- while True:
- try:
- control_data = generic_DNS_resolve(host_name, known_good_resolver)
- break
- except KeyboardInterrupt:
- print "bailing out..."
- exit()
- except DNS.Base.DNSError:
- print "control resolver appears to be failing..."
- continue
- except:
- print "Timeout; looping!"
- continue
-
- print "Resolving with experiment resolver..."
- while True:
- try:
- experiment_data = generic_DNS_resolve(host_name, test_resolver)
- break
- except KeyboardInterrupt:
- print "bailing out..."
- exit()
- except DNS.Base.DNSError:
- print "experiment resolver appears to be failing..."
- continue
- except:
- print "Timeout; looping!"
- continue
-
- print "Comparing control and experiment...",
- tampering, conflicts = compare_control_with_experiment(known_proxies, control_data, experiment_data)
- if tampering:
- tampering_list.append(conflicts)
- print "Conflicts with " + str(host_name) + " : " + str(conflicts)
- check_count = check_count + 1
- host_list.close()
- return tampering
-
-""" Attempt to resolve random_hostname and return True and None if empty. If an
- address is returned we return False and the returned address.
-"""
-def dns_response_empty(random_hostname):
- response = dns_resolve(random_hostname)
- if response == False:
- return True, None
- return False, response
-
-def dns_multi_response_empty(count, size):
- for i in range(count):
- randName = ooni.common._randstring(size)
- response_empty, response_ip = dns_response_empty(randName)
- if response_empty == True and response_ip == None:
- responses_are_empty = True
- else:
- print label + " " + randName + " found with value " + str(response_ip)
- responses_are_empty = False
- return responses_are_empty
-
-""" Attempt to resolve one random host name per tld in tld_list where the
- hostnames are random strings with a length between min_length and
- max_length. Return True if list is empty, otherwise return False."""
-def dns_list_empty(tld_list, min_length, max_length,
- label="generic DNS list test"):
- for tld in tld_list:
- randName = ooni.common._randstring(min_length, max_length) + tld
- response_empty, response_ip = dns_response_empty(randName)
- return response_empty
-
-# Known bad test
-# Test for their DNS breakage and their HTTP MITM
-# "Family Shield" is 208.67.222.123 and 208.67.220.123
-# returns 67.215.65.130 for filtered sites like kink.com
-# block.opendns.com is a block page where users are redirected
-# 208.67.216.135 208.67.217.135 are the block pages currently point
-# 67.215.65.132 is returned for NXDOMAINs and a visit with HTTP to that IP
-# results in redirection to http://guide.opendns.com/main?url=sdagsad.com or
-# whatever the HOST header says
-# Amusingly - their Server header is: "OpenDNS Guide"
-""" Return True if we are not being directed as known OpenDNS block pages."""
-def OpenDNS_DNS_Tests(self):
- return OpenDNS_Censorship_DNS_TESTS(self)
- return OpenDNS_NXDomain_DNS_TESTS(self)
-
-def OpenDNS_Censorship_DNS_TESTS(self):
- known_filter = "67.215.65.130"
- randName = ooni.common._randstring(10)
- redirected = dns_resolve_match(randName, known_filter, label="OpenDNS DNS Censorship comparison")
- if redirected:
- return False
- else:
- return True
-
-def OpenDNS_NXDomain_DNS_TESTS(self):
- known_filter = "67.215.65.132"
- randName = ooni.common._randstring(10)
- redirected = dns_resolve_match(randName, known_filter, label="OpenDNS DNS NXDomain comparison")
- if redirected:
- return False
- else:
- return True
-
-"""Returns True if the experiment_url returns the well known Italian block page."""
-def cc_DNS_Tests_it(self):
- tampering = False # By default we'll pretend the internet is nice
- tampering_list = []
- conflicts = []
- known_good_resolver = "8.8.8.8"
- host_list, host_count = load_list_of_test_hosts("censorship-lists/italy-gamble-blocklist-07-22-11.txt")
- known_http_block_pages, known_block_count = load_list_of_test_hosts("proxy-lists/italy-http-ips.txt")
- known_censoring_resolvers, censoring_resolver_count = load_list_of_test_hosts("proxy-lists/italy-dns-ips.txt")
-
- check_count = 1
- DNS.ParseResolvConf()
- # Set the local resolver as our default
- if self.randomize:
- random.shuffle(host_list) # This makes our list non-sequential for now
- print "We're testing (" + str(host_count) + ") URLs"
- print "We're looking for (" + str(known_block_count) + ") block pages"
- print "We're testing against (" + str(censoring_resolver_count) + ") censoring DNS resolvers"
- for test_resolver in known_censoring_resolvers:
- test_resolver = test_resolver.strip()
- for host_name in host_list:
- host_name = host_name.strip()
- print "Total progress: " + str(check_count) + " of " + str(host_count) + " hosts to check"
- print "Testing " + host_name + " with control resolver: " + known_good_resolver
- print "Testing " + host_name + " with experiment resolver: " + test_resolver
- while True:
- try:
- control_data = generic_DNS_resolve(host_name, known_good_resolver)
- break
- except KeyboardInterrupt:
- print "bailing out..."
- exit()
- except DNS.Base.DNSError:
- print "control resolver appears to be failing..."
- break
- except:
- print "Timeout; looping!"
- continue
-
- while True:
- try:
- experiment_data = generic_DNS_resolve(host_name, test_resolver)
- break
- except KeyboardInterrupt:
- print "bailing out..."
- exit()
- except DNS.Base.DNSError:
- print "experiment resolver appears to be failing..."
- continue
- except:
- print "Timeout; looping!"
- continue
-
- print "Comparing control and experiment...",
- tampering, conflicts = compare_control_with_experiment(known_http_block_pages, control_data, experiment_data)
- if tampering:
- tampering_list.append(conflicts)
- print "Conflicts with " + str(host_name) + " : " + str(conflicts)
- check_count = check_count + 1
-
- host_list.close()
- return tampering
-
-
-## XXX TODO
-## Code up automatic tests for HTTP page checking in Italy - length + known strings, etc
-
-""" Returns True if the experiment_host returns a well known Australian filter
- IP address."""
-def Australian_DNS_Censorship(self, known_filtered_host="badhost.com"):
- # http://www.robtex.com/ip/61.88.88.88.html
- # http://requests.optus.net.au/dns/
- known_block_ip = "208.69.183.228" # http://interpol.contentkeeper.com/
- known_censoring_resolvers = ["61.88.88.88"] # Optus
- for resolver in known_censoring_resolvers:
- blocked = generic_DNS_censorship(known_filtered_host, resolver, known_block_page)
- if blocked:
- return True
-
-"""Returns True if experiment_hostname as resolved by experiment_resolver
- resolves to control_data. Returns False if there is no match or None if the
- attempt fails."""
-def generic_DNS_censorship(self, experiment_hostname, experiment_resolver,
- control_data):
- req = DNS.Request(name=experiment_hostname, server=experiment_resolver)
- resolved_data = s.req().answers
- for answer in range(len(resolved_data)):
- if resolved_data[answer]['data'] == control_data:
- return True
- return False
-
-# See dns_launch_wildcard_checks in tor/src/or/dns.c for Tor implementation
-# details
-""" Return True if Tor would consider the network fine; False if it's hostile
- and has no signs of DNS tampering. """
-def Tor_DNS_Tests(self):
- response_rfc2606_empty = RFC2606_DNS_Tests(self)
- tor_tld_list = ["", ".com", ".org", ".net"]
- response_tor_empty = ooni.dnsooni.dns_list_empty(tor_tld_list, 8, 16, "TorDNSTest")
- return response_tor_empty | response_rfc2606_empty
-
-""" Return True if RFC2606 would consider the network hostile; False if it's all
- clear and has no signs of DNS tampering. """
-def RFC2606_DNS_Tests(self):
- tld_list = [".invalid", ".test"]
- return ooni.dnsooni.dns_list_empty(tld_list, 4, 18, "RFC2606Test")
-
-""" Return True if googleChromeDNSTest would consider the network OK."""
-def googleChrome_CP_Tests(self):
- maxGoogleDNSTests = 3
- GoogleDNSTestSize = 10
- return ooni.dnsooni.dns_multi_response_empty(maxGoogleDNSTests,
- GoogleDNSTestSize)
-def googleChrome_DNS_Tests(self):
- return googleChrome_CP_Tests(self)
-
-""" Return True if MSDNSTest would consider the network OK."""
-def MSDNS_CP_Tests(self):
- experimentHostname = "dns.msftncsi.com"
- expectedResponse = "131.107.255.255"
- return ooni.dnsooni.dns_resolve_match(experimentHostname, expectedResponse, "MS DNS")
-
-def MSDNS_DNS_Tests(self):
- return MSDNS_CP_Tests(self)
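The heart of the deleted `dnsooni.py` is comparing answers from a control resolver against an experiment resolver and flagging conflicts or known censoring proxies. A simplified sketch of that comparison over plain IP lists (the deleted code walks PyDNS answer dicts instead; names and return shape here are illustrative):

```python
def compare_control_with_experiment(known_proxies, control_answers, experiment_answers):
    """Flag tampering when the experiment's answers diverge from the
    control's or contain a known censoring proxy IP."""
    conflicts = [ip for ip in experiment_answers if ip not in control_answers]
    proxies_seen = [ip for ip in experiment_answers if ip in known_proxies]
    tampering = bool(conflicts or proxies_seen)
    return tampering, conflicts, proxies_seen
```

As the old module's comments note, this approach has a high false-positive rate for GeoIP load-balanced sites, since control and experiment can legitimately see different addresses.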
diff --git a/to-be-ported/very-old/ooni/helpers.py b/to-be-ported/very-old/ooni/helpers.py
deleted file mode 100644
index 514e65f..0000000
--- a/to-be-ported/very-old/ooni/helpers.py
+++ /dev/null
@@ -1,38 +0,0 @@
-#!/usr/bin/env python
-#
-# HTTP support for ooni-probe
-# by Jacob Appelbaum <jacob(a)appelbaum.net>
-# Arturo Filasto' <art(a)fuffa.org>
-
-import ooni.common
-import pycurl
-import random
-import zipfile
-import os
-from xml.dom import minidom
-try:
- from BeautifulSoup import BeautifulSoup
-except:
- pass # Never mind, let's break later.
-
-def get_random_url(self):
- filepath = os.getcwd() + "/test-lists/top-1m.csv.zip"
- fp = zipfile.ZipFile(filepath, "r")
- fp.open("top-1m.csv")
- content = fp.read("top-1m.csv")
- return "http://" + random.choice(content.split("\n")).split(",")[1]
-
-"""Pick a random header and use that for the request"""
-def get_random_headers(self):
- filepath = os.getcwd() + "/test-lists/whatheaders.xml"
- headers = []
- content = open(filepath, "r").read()
- soup = BeautifulSoup(content)
- measurements = soup.findAll('measurement')
- i = random.randint(0,len(measurements))
- for vals in measurements[i].findAll('header'):
- name = vals.find('name').string
- value = vals.find('value').string
- if name != "host":
- headers.append((name, value))
- return headers
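The deleted `helpers.py` above picks a random test URL out of a zipped Alexa-style `rank,domain` CSV. A self-contained sketch of that routine, using an in-memory zip as a stand-in for the `test-lists/top-1m.csv.zip` file the old code assumed:

```python
import io
import random
import zipfile

def get_random_url(zip_bytes, member="top-1m.csv"):
    """Pick a random 'rank,domain' row from a zipped list and return it
    as an http:// URL, as the deleted helper did."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        rows = zf.read(member).decode("ascii").splitlines()
    rank, domain = random.choice([r for r in rows if r]).split(",", 1)
    return "http://" + domain

# Tiny in-memory fixture instead of shipping top-1m.csv.zip
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("top-1m.csv", "1,example.com\n2,example.org\n")
```

Taking the zip bytes as a parameter (rather than `os.getcwd()` as the old code did) keeps the sketch testable without the real list on disk.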
diff --git a/to-be-ported/very-old/ooni/http.py b/to-be-ported/very-old/ooni/http.py
deleted file mode 100644
index 59e2abb..0000000
--- a/to-be-ported/very-old/ooni/http.py
+++ /dev/null
@@ -1,306 +0,0 @@
-#!/usr/bin/env python
-#
-# HTTP support for ooni-probe
-# by Jacob Appelbaum <jacob(a)appelbaum.net>
-# Arturo Filasto' <art(a)fuffa.org>
-#
-
-from socket import gethostbyname
-import ooni.common
-import ooni.helpers
-import ooni.report
-import urllib2
-import httplib
-from urlparse import urlparse
-from pprint import pprint
-import pycurl
-import random
-import string
-import re
-from pprint import pprint
-try:
- from BeautifulSoup import BeautifulSoup
-except:
- pass # Never mind, let's break later.
-
-# By default, we'll be Torbutton's UA
-default_ua = { 'User-Agent' :
- 'Mozilla/5.0 (Windows NT 6.1; rv:5.0) Gecko/20100101 Firefox/5.0' }
-
-# Use pycurl to connect over a proxy
-PROXYTYPE_SOCKS5 = 5
-default_proxy_type = PROXYTYPE_SOCKS5
-default_proxy_host = "127.0.0.1"
-default_proxy_port = "9050"
-
-#class HTTPResponse(object):
-# def __init__(self):
-
-
-"""A very basic HTTP fetcher that uses Tor by default and returns a curl
- object."""
-def http_proxy_fetch(url, headers, proxy_type=5,
- proxy_host="127.0.0.1",
- proxy_port=9050):
- request = pycurl.Curl()
- request.setopt(pycurl.PROXY, proxy_host)
- request.setopt(pycurl.PROXYPORT, proxy_port)
- request.setopt(pycurl.PROXYTYPE, proxy_type)
- request.setopt(pycurl.HTTPHEADER, ["User-Agent: Mozilla/5.0 (Windows NT 6.1; rv:5.0) Gecko/20100101 Firefox/5.0"])
- request.setopt(pycurl.URL, url)
- response = request.perform()
- http_code = getinfo(pycurl.HTTP_CODE)
- return response, http_code
-
-"""A very basic HTTP fetcher that returns a urllib2 response object."""
-def http_fetch(url,
- headers= default_ua,
- label="generic HTTP fetch"):
- request = urllib2.Request(url, None, headers)
- response = urllib2.urlopen(request)
- return response
-
-"""Connect to test_hostname on port 80, request url and compare it with the expected
- control_result. Optionally, a label may be set to customize
- output. If the experiment matches the control, this returns True with the http
- status code; otherwise it returns False.
-"""
-def http_content_match(experimental_url, control_result,
- headers= { 'User-Agent' : default_ua },
- label="generic HTTP content comparison"):
- request = urllib2.Request(experimental_url, None, headers)
- response = urllib2.urlopen(request)
- responseContents = response.read()
- responseCode = response.code
- if responseContents != False:
- if str(responseContents) != str(control_result):
- print label + " control " + str(control_result) + " data does not " \
- "match experiment response: " + str(responseContents)
- return False, responseCode
- return True, responseCode
- else:
- print "HTTP connection appears to have failed"
- return False, False
-
-"""Connect to test_hostname on port 80, request url and compare it with the expected
- control_result as a regex. Optionally, a label may be set to customize
- output. If the experiment matches the control, this returns True with the HTTP
- status code; otherwise it returns False.
-"""
-def http_content_fuzzy_match(experimental_url, control_result,
- headers= { 'User-Agent' : default_ua },
- label="generic HTTP content comparison"):
- request = urllib2.Request(experimental_url, None, headers)
- response = urllib2.urlopen(request)
- responseContents = response.read()
- responseCode = response.code
- pattern = re.compile(control_result)
- match = pattern.search(responseContents)
- if responseContents != False:
- if not match:
- print label + " control " + str(control_result) + " data does not " \
- "match experiment response: " + str(responseContents)
- return False, responseCode
- return True, responseCode
- else:
- print "HTTP connection appears to have failed"
- return False, False
-
-"""Compare two HTTP status codes as integers and return True if they match."""
-def http_status_code_match(experiment_code, control_code):
- if int(experiment_code) != int(control_code):
- return False
- return True
-
-"""Compare two HTTP status codes as integers and return True if they don't match."""
-def http_status_code_no_match(experiment_code, control_code):
- if http_status_code_match(experiment_code, control_code):
- return False
- return True
-
-"""Connect to a URL and compare the control_header/control_result with the data
-served by the remote server. Return True if it matches, False if it does not."""
-def http_header_match(experiment_url, control_header, control_result):
- response = http_fetch(experiment_url)
- remote_header = response.get_header(control_header)
- if str(remote_header) == str(control_result):
- return True
- else:
- return False
-
-"""Connect to a URL and compare the control_header/control_result with the data
-served by the remote server. Return True if it does not match, False if it does."""
-def http_header_no_match(experiment_url, control_header, control_result):
- match = http_header_match(experiment_url, control_header, control_result)
- if match:
- return False
- else:
- return True
-
-def send_browser_headers(self, browser, conn):
- headers = ooni.helpers.get_random_headers(self)
- for h in headers:
- conn.putheader(h[0], h[1])
- conn.endheaders()
- return True
-
-def http_request(self, method, url, path=None):
- purl = urlparse(url)
- host = purl.netloc
- conn = httplib.HTTPConnection(host, 80)
- conn.connect()
- if path is None:
- path = purl.path
- conn.putrequest(method, path)
- send_browser_headers(self, None, conn)
- response = conn.getresponse()
- headers = dict(response.getheaders())
- self.headers = headers
- self.data = response.read()
- return True
-
-def search_headers(self, s_headers, url):
- if http_request(self, "GET", url):
- headers = self.headers
- else:
- return None
- result = {}
- for h in s_headers.items():
- result[h[0]] = h[0] in headers
- return result
-
-# XXX for testing
-# [('content-length', '9291'), ('via', '1.0 cache_server:3128 (squid/2.6.STABLE21)'), ('x-cache', 'MISS from cache_server'), ('accept-ranges', 'bytes'), ('server', 'Apache/2.2.16 (Debian)'), ('last-modified', 'Fri, 22 Jul 2011 03:00:31 GMT'), ('connection', 'close'), ('etag', '"105801a-244b-4a89fab1e51c0;49e684ba90c80"'), ('date', 'Sat, 23 Jul 2011 03:03:56 GMT'), ('content-type', 'text/html'), ('x-cache-lookup', 'MISS from cache_server:3128')]
-
-"""Search for squid headers by requesting a random site and checking if the headers have been rewritten (active, not fingerprintable)"""
-def search_squid_headers(self):
- test_name = "squid header"
- self.logger.info("RUNNING %s test" % test_name)
- url = ooni.helpers.get_random_url(self)
- s_headers = {'via': '1.0 cache_server:3128 (squid/2.6.STABLE21)', 'x-cache': 'MISS from cache_server', 'x-cache-lookup':'MISS from cache_server:3128'}
- ret = search_headers(self, s_headers, url)
- for i in ret.items():
- if i[1] is True:
- self.logger.info("the %s test returned False" % test_name)
- return False
- self.logger.info("the %s test returned True" % test_name)
- return True
-
-def random_bad_request(self):
- url = ooni.helpers.get_random_url(self)
- r_str = ''.join(random.choice(string.ascii_uppercase + string.digits) for x in range(random.randint(5,20)))
- if http_request(self, r_str, url):
- return True
- else:
- return None
-
-"""Create a request made up of a random string of 5-20 chars (active technique, possibly fingerprintable)"""
-def squid_search_bad_request(self):
- test_name = "squid bad request"
- self.logger.info("RUNNING %s test" % test_name)
- if random_bad_request(self):
- s_headers = {'X-Squid-Error' : 'ERR_INVALID_REQ 0'}
- for i in s_headers.items():
- if i[0] in self.headers:
- self.logger.info("the %s test returned False" % test_name)
- return False
- self.logger.info("the %s test returned True" % test_name)
- return True
- else:
- self.logger.warning("the %s test failed" % test_name)
- return None
-
-"""Try requesting cache_object and expect as output access denied (very active technique, fingerprintable) """
-def squid_cacheobject_request(self):
- url = ooni.helpers.get_random_url(self)
- test_name = "squid cacheobject"
- self.logger.info("RUNNING %s test" % test_name)
- if http_request(self, "GET", url, "cache_object://localhost/info"):
- soup = BeautifulSoup(self.data)
- if soup.find('strong') and soup.find('strong').string == "Access Denied.":
- self.logger.info("the %s test returned False" % test_name)
- return False
- else:
- self.logger.info("the %s test returned True" % test_name)
- return True
- else:
- self.logger.warning("the %s test failed" % test_name)
- return None
-
-
-def MSHTTP_CP_Tests(self):
- test_name = "MS HTTP Captive Portal"
- self.logger.info("RUNNING %s test" % test_name)
- experiment_url = "http://www.msftncsi.com/ncsi.txt"
- expectedResponse = "Microsoft NCSI" # Only this - nothing more
- expectedResponseCode = "200" # Must be this - nothing else
- label = "MS HTTP"
- headers = { 'User-Agent' : 'Microsoft NCSI' }
- content_match, experiment_code = http_content_match(experiment_url, expectedResponse,
- headers, label)
- status_match = http_status_code_match(expectedResponseCode,
- experiment_code)
- if status_match and content_match:
- self.logger.info("the %s test returned True" % test_name)
- return True
- else:
- print label + " experiment would conclude that the network is filtered."
- self.logger.info("the %s test returned False" % test_name)
- return False
-
-def AppleHTTP_CP_Tests(self):
- test_name = "Apple HTTP Captive Portal"
- self.logger.info("RUNNING %s test" % test_name)
- experiment_url = "http://www.apple.com/library/test/success.html"
- expectedResponse = "Success" # There is HTML that contains this string
- expectedResponseCode = "200"
- label = "Apple HTTP"
- headers = { 'User-Agent' : 'Mozilla/5.0 (iPhone; U; CPU like Mac OS X; en) '
- 'AppleWebKit/420+ (KHTML, like Gecko) Version/3.0'
- ' Mobile/1A543a Safari/419.3' }
- content_match, experiment_code = http_content_fuzzy_match(
- experiment_url, expectedResponse, headers)
- status_match = http_status_code_match(expectedResponseCode,
- experiment_code)
- if status_match and content_match:
- self.logger.info("the %s test returned True" % test_name)
- return True
- else:
- print label + " experiment would conclude that the network is filtered."
- print label + "content match:" + str(content_match) + " status match:" + str(status_match)
- self.logger.info("the %s test returned False" % test_name)
- return False
-
-def WC3_CP_Tests(self):
- test_name = "W3 Captive Portal"
- self.logger.info("RUNNING %s test" % test_name)
- url = "http://tools.ietf.org/html/draft-nottingham-http-portal-02"
- draftResponseCode = "428"
- label = "WC3 draft-nottingham-http-portal"
- response = http_fetch(url, label=label)
- responseCode = response.code
- if http_status_code_no_match(responseCode, draftResponseCode):
- self.logger.info("the %s test returned True" % test_name)
- return True
- else:
- print label + " experiment would conclude that the network is filtered."
- print label + " status code: " + str(responseCode)
- self.logger.info("the %s test returned False" % test_name)
- return False
-
-# Google ChromeOS fetches this url in guest mode
-# and they expect the user to authenticate
-def googleChromeOSHTTPTest(self):
- print "noop"
- #url = "http://www.google.com/"
-
-def SquidHeader_TransparentHTTP_Tests(self):
- return search_squid_headers(self)
-
-def SquidBadRequest_TransparentHTTP_Tests(self):
- return squid_search_bad_request(self)
-
-def SquidCacheobject_TransparentHTTP_Tests(self):
- return squid_cacheobject_request(self)
-
-
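For reference, the removed captive-portal helpers all reduce to the same three comparisons: an exact body match, a regex fuzzy match, and an integer status-code match. A minimal standalone sketch of that logic in modern Python 3 (function names are illustrative, not the original API; urllib2 plumbing omitted):

```python
import re

def content_match(control, experiment):
    # Exact body comparison, as in http_content_match.
    return str(control) == str(experiment)

def content_fuzzy_match(control_pattern, experiment):
    # Regex comparison, as in http_content_fuzzy_match.
    return re.search(control_pattern, experiment) is not None

def status_code_match(experiment_code, control_code):
    # Integer comparison, as in http_status_code_match.
    return int(experiment_code) == int(control_code)
```

A captive portal typically rewrites both the body and the status code, so the original tests require both comparisons to pass before declaring the network unfiltered.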
diff --git a/to-be-ported/very-old/ooni/input.py b/to-be-ported/very-old/ooni/input.py
deleted file mode 100644
index c32ab48..0000000
--- a/to-be-ported/very-old/ooni/input.py
+++ /dev/null
@@ -1,33 +0,0 @@
-#!/usr/bin/python
-
-class file:
- def __init__(self, name=None):
- if name:
- self.name = name
-
- def simple(self, name=None):
- """ Simple file parsing method:
- Read a file line by line and output an array with all its lines, without newlines
- """
- if name:
- self.name = name
- output = []
- try:
- f = open(self.name, "r")
- for line in f.readlines():
- output.append(line.strip())
- return output
- except:
- return output
-
- def csv(self, name=None):
- if name:
- self.name = name
-
- def yaml(self, name):
- if name:
- self.name = name
-
- def consensus(self, name):
- if name:
- self.name = name
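The `simple()` method above reads a file into a list of stripped lines and returns an empty list on any error. A Python 3 equivalent of just that behavior (a sketch; `read_lines` is an illustrative name, and the bare `except` is narrowed to `OSError`):

```python
def read_lines(path):
    # Return the file's lines without trailing newlines;
    # an unreadable file yields an empty list, as in file.simple().
    try:
        with open(path) as f:
            return [line.strip() for line in f]
    except OSError:
        return []
```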
diff --git a/to-be-ported/very-old/ooni/namecheck.py b/to-be-ported/very-old/ooni/namecheck.py
deleted file mode 100644
index 1a2a3f0..0000000
--- a/to-be-ported/very-old/ooni/namecheck.py
+++ /dev/null
@@ -1,39 +0,0 @@
-#!/usr/bin/env python
-#
-# DNS tampering detection module
-# by Jacob Appelbaum <jacob(a)appelbaum.net>
-#
-# This module performs multiple DNS tests.
-
-import sys
-import ooni.dnsooni
-
-class DNS():
- def __init__(self, args):
- self.in_ = sys.stdin
- self.out = sys.stdout
- self.debug = False
- self.randomize = args.randomize
-
- def DNS_Tests(self):
- print "DNS tampering detection:"
- filter_name = "_DNS_Tests"
- tests = [ooni.dnsooni]
- for test in tests:
- for function_ptr in dir(test):
- if function_ptr.endswith(filter_name):
- filter_result = getattr(test, function_ptr)(self)
- if filter_result == True:
- print function_ptr + " thinks the network is clean"
- elif filter_result == None:
- print function_ptr + " failed"
- else:
- print function_ptr + " thinks the network is dirty"
-
- def main(self):
- for function_ptr in dir(self):
- if function_ptr.endswith("_Tests"):
- getattr(self, function_ptr)()
-
-if __name__ == '__main__':
- self.main()
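namecheck.py discovers tests by scanning an object with `dir()` for names ending in a suffix and invoking each match. The pattern in isolation (illustrative class and names, not the original API):

```python
def discover(obj, suffix="_Tests"):
    # Collect callables on obj whose names end with the suffix,
    # as namecheck's DNS_Tests/main loops do with dir().
    return [getattr(obj, name) for name in dir(obj)
            if name.endswith(suffix) and callable(getattr(obj, name))]

class Suite:
    def alpha_Tests(self):
        return "clean"
    def beta_Tests(self):
        return "dirty"
    def helper(self):
        return "skipped"
```

One design note: because `dir()` returns names alphabetically, test order is by name, not by definition order.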
diff --git a/to-be-ported/very-old/ooni/plugins/__init__.py b/to-be-ported/very-old/ooni/plugins/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/to-be-ported/very-old/ooni/plugins/dnstest_plgoo.py b/to-be-ported/very-old/ooni/plugins/dnstest_plgoo.py
deleted file mode 100644
index 0c0cfa7..0000000
--- a/to-be-ported/very-old/ooni/plugins/dnstest_plgoo.py
+++ /dev/null
@@ -1,84 +0,0 @@
-#!/usr/bin/python
-
-import sys
-import re
-from pprint import pprint
-from twisted.internet import reactor, endpoints
-from twisted.names import client
-from ooni.plugooni import Plugoo
-from ooni.socksclient import SOCKSv4ClientProtocol, SOCKSWrapper
-
-class DNSTestPlugin(Plugoo):
- def __init__(self):
- self.name = ""
- self.type = ""
- self.paranoia = ""
- self.modules_to_import = []
- self.output_dir = ""
- self.buf = ""
- self.control_response = []
-
- def response_split(self, response):
- a = []
- b = []
- for i in response:
- a.append(i[0])
- b.append(i[1])
-
- return a,b
-
- def cb(self, type, hostname, dns_server, value):
- if self.control_response is None:
- self.control_response = []
- if type == 'control' and self.control_response != value:
- print "%s %s" % (dns_server, value)
- self.control_response.append((dns_server,value))
- pprint(self.control_response)
- if type == 'experiment':
- pprint(self.control_response)
- _, res = self.response_split(self.control_response)
- if value not in res:
- print "res (%s) : " % value
- pprint(res)
- print "---"
- print "%s appears to be censored on %s (%s != %s)" % (hostname, dns_server, res[0], value)
-
- else:
- print "%s appears to be clean on %s" % (hostname, dns_server)
- self.r2.servers = [('212.245.158.66',53)]
- print "HN: %s %s" % (hostname, value)
-
- def err(self, pck, error):
- pprint(pck)
- error.printTraceback()
- reactor.stop()
- print "error!"
- pass
-
- def ooni_main(self, args):
- self.experimentalproxy = ''
- self.test_hostnames = ['dio.it']
- self.control_dns = [('8.8.8.8',53), ('4.4.4.8',53)]
- self.experiment_dns = [('85.37.17.9',53),('212.245.158.66',53)]
-
- self.control_res = []
- self.control_response = None
-
- self.r1 = client.Resolver(None, [self.control_dns.pop()])
- self.r2 = client.Resolver(None, [self.experiment_dns.pop()])
-
- for hostname in self.test_hostnames:
- for dns_server in self.control_dns:
- self.r1.servers = [dns_server]
- f = self.r1.getHostByName(hostname)
- pck = (hostname, dns_server)
- f.addCallback(lambda x: self.cb('control', hostname, dns_server, x)).addErrback(lambda x: self.err(pck, x))
-
- for dns_server in self.experiment_dns:
- self.r2.servers = [dns_server]
- pck = (hostname, dns_server)
- f = self.r2.getHostByName(hostname)
- f.addCallback(lambda x: self.cb('experiment', hostname, dns_server, x)).addErrback(lambda x: self.err(pck, x))
-
- reactor.run()
-
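The plugin's `cb()` accumulates `(dns_server, answer)` pairs from the control resolvers, then flags an experiment answer that matches none of them. Its decision step, extracted from the Twisted plumbing (a sketch; `classify` is an illustrative name):

```python
def classify(experiment_answer, control_responses):
    # control_responses: list of (dns_server, answer) pairs,
    # as accumulated by the plugin's 'control' callbacks.
    control_answers = [answer for _server, answer in control_responses]
    if experiment_answer in control_answers:
        return "clean"
    return "possibly censored"
```

As the later simple_dns_plgoo.py comments note, this comparison has a high false-positive rate for GeoIP load-balanced sites, since honest resolvers in different regions can legitimately return different addresses.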
diff --git a/to-be-ported/very-old/ooni/plugins/http_plgoo.py b/to-be-ported/very-old/ooni/plugins/http_plgoo.py
deleted file mode 100644
index 021e863..0000000
--- a/to-be-ported/very-old/ooni/plugins/http_plgoo.py
+++ /dev/null
@@ -1,70 +0,0 @@
-#!/usr/bin/python
-
-import sys
-import re
-from twisted.internet import reactor, endpoints
-from twisted.web import client
-from ooni.plugooni import Plugoo
-from ooni.socksclient import SOCKSv4ClientProtocol, SOCKSWrapper
-
-class HttpPlugin(Plugoo):
- def __init__(self):
- self.name = ""
- self.type = ""
- self.paranoia = ""
- self.modules_to_import = []
- self.output_dir = ""
- self.buf = ''
-
- def cb(self, type, content):
- print "got %d bytes from %s" % (len(content), type) # DEBUG
- if not self.buf:
- self.buf = content
- else:
- if self.buf == content:
- print "SUCCESS"
- else:
- print "FAIL"
- reactor.stop()
-
- def endpoint(self, scheme, host, port):
- ep = None
- if scheme == 'http':
- ep = endpoints.TCP4ClientEndpoint(reactor, host, port)
- elif scheme == 'https':
- ep = endpoints.SSL4ClientEndpoint(reactor, host, port, context)
- return ep
-
- def ooni_main(self):
- # We don't have the Command object so cheating for now.
- url = 'http://check.torproject.org/'
- self.controlproxy = 'socks4a://127.0.0.1:9050'
- self.experimentalproxy = ''
-
- if not re.match("[a-zA-Z0-9]+\:\/\/[a-zA-Z0-9]+", url):
- return None
- scheme, host, port, path = client._parse(url)
-
- ctrl_dest = self.endpoint(scheme, host, port)
- if not ctrl_dest:
- raise Exception('unsupported scheme %s in %s' % (scheme, url))
- if self.controlproxy:
- _, proxy_host, proxy_port, _ = client._parse(self.controlproxy)
- control = SOCKSWrapper(reactor, proxy_host, proxy_port, ctrl_dest)
- else:
- control = ctrl_dest
- f = client.HTTPClientFactory(url)
- f.deferred.addCallback(lambda x: self.cb('control', x))
- control.connect(f)
-
- exp_dest = self.endpoint(scheme, host, port)
- if not exp_dest:
- raise Exception('unsupported scheme %s in %s' % (scheme, url))
- # FIXME: use the experiment proxy if there is one
- experiment = exp_dest
- f = client.HTTPClientFactory(url)
- f.deferred.addCallback(lambda x: self.cb('experiment', x))
- experiment.connect(f)
-
- reactor.run()
-
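The `cb()` callback above buffers whichever response arrives first and compares the second against it, printing SUCCESS or FAIL. The same two-response comparison as a small standalone class (illustrative; the original is a Twisted callback wired to two HTTPClientFactory deferreds):

```python
class ResponseComparator:
    # Buffer the first response; compare the second against it,
    # as http_plgoo's cb() does for control vs. experiment fetches.
    def __init__(self):
        self.buf = None

    def feed(self, content):
        if self.buf is None:
            self.buf = content
            return None  # still waiting for the other response
        return "SUCCESS" if self.buf == content else "FAIL"
```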
diff --git a/to-be-ported/very-old/ooni/plugins/marco_plgoo.py b/to-be-ported/very-old/ooni/plugins/marco_plgoo.py
deleted file mode 100644
index cb63df7..0000000
--- a/to-be-ported/very-old/ooni/plugins/marco_plgoo.py
+++ /dev/null
@@ -1,377 +0,0 @@
-#!/usr/bin/python
-# Copyright 2009 The Tor Project, Inc.
-# License at end of file.
-#
-# This tests connections to a list of Tor nodes in a given Tor consensus file
-# while also recording the certificates - it's not a perfect tool but complete
-# or even partial failure should raise alarms.
-#
-# This plugoo uses threads and as a result, it's not friendly to SIGINT signals.
-#
-
-import logging
-import socket
-import time
-import random
-import threading
-import sys
-import os
-try:
- from ooni.plugooni import Plugoo
-except:
- print "Error importing Plugoo"
-
-try:
- from ooni.common import Storage
-except:
- print "Error importing Storage"
-
-try:
- from ooni import output
-except:
- print "Error importing output"
-
-try:
- from ooni import input
-except:
- print "Error importing input"
-
-
-
-ssl = OpenSSL = None
-
-try:
- import ssl
-except ImportError:
- pass
-
-if ssl is None:
- try:
- import OpenSSL.SSL
- import OpenSSL.crypto
- except ImportError:
- pass
-
-if ssl is None and OpenSSL is None:
- if socket.ssl:
- print """Your Python is too old to have the ssl module, and you haven't
-installed pyOpenSSL. I'll try to work with what you've got, but I can't
-record certificates so well."""
- else:
- print """Your Python has no OpenSSL support. Upgrade to 2.6, install
-pyOpenSSL, or both."""
- sys.exit(1)
-
-################################################################
-
-# How many servers should we test in parallel?
-N_THREADS = 16
-
-# How long do we give individual socket operations to succeed or fail?
-# (Seconds)
-TIMEOUT = 10
-
-################################################################
-
-CONNECTING = "noconnect"
-HANDSHAKING = "nohandshake"
-OK = "ok"
-ERROR = "err"
-
-LOCK = threading.RLock()
-socket.setdefaulttimeout(TIMEOUT)
-
-def clean_pem_cert(cert):
- idx = cert.find('-----END')
- if idx > 1 and cert[idx-1] != '\n':
- cert = cert.replace('-----END','\n-----END')
- return cert
-
-def record((addr,port), state, extra=None, cert=None):
- LOCK.acquire()
- try:
- OUT.append({'addr' : addr,
- 'port' : port,
- 'state' : state,
- 'extra' : extra})
- if cert:
- CERT_OUT.append({'addr' : addr,
- 'port' : port,
- 'clean_cert' : clean_pem_cert(cert)})
- finally:
- LOCK.release()
-
-def probe(address,theCtx=None):
- sock = s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
- logging.info("Opening socket to %s",address)
- try:
- s.connect(address)
- except IOError, e:
- logging.info("Error %s from socket connect.",e)
- record(address, CONNECTING, e)
- s.close()
- return
- logging.info("Socket to %s open. Launching SSL handshake.",address)
- if ssl:
- try:
- s = ssl.wrap_socket(s,cert_reqs=ssl.CERT_NONE,ca_certs=None)
- # "MARCO!"
- s.do_handshake()
- except IOError, e:
- logging.info("Error %s from ssl handshake",e)
- record(address, HANDSHAKING, e)
- s.close()
- sock.close()
- return
- cert = s.getpeercert(True)
- if cert != None:
- cert = ssl.DER_cert_to_PEM_cert(cert)
- elif OpenSSL:
- try:
- s = OpenSSL.SSL.Connection(theCtx, s)
- s.set_connect_state()
- s.setblocking(True)
- s.do_handshake()
- cert = s.get_peer_certificate()
- if cert != None:
- cert = OpenSSL.crypto.dump_certificate(
- OpenSSL.crypto.FILETYPE_PEM, cert)
- except IOError, e:
- logging.info("Error %s from OpenSSL handshake",e)
- record(address, HANDSHAKING, e)
- s.close()
- sock.close()
- return
- else:
- try:
- s = socket.ssl(s)
- s.write('a')
- cert = s.server()
- except IOError, e:
- logging.info("Error %s from socket.ssl handshake",e)
- record(address, HANDSHAKING, e)
- sock.close()
- return
-
- logging.info("SSL handshake with %s finished",address)
- # "POLO!"
- record(address,OK, cert=cert)
- if (ssl or OpenSSL):
- s.close()
- sock.close()
-
-def parseNetworkstatus(ns):
- for line in ns:
- if line.startswith('r '):
- r = line.split()
- yield (r[-3],int(r[-2]))
-
-def parseCachedDescs(cd):
- for line in cd:
- if line.startswith('router '):
- r = line.split()
- yield (r[2],int(r[3]))
-
-def worker(addrList, origLength):
- done = False
- logging.info("Launching thread.")
-
- if OpenSSL is not None:
- context = OpenSSL.SSL.Context(OpenSSL.SSL.TLSv1_METHOD)
- else:
- context = None
-
- while True:
- LOCK.acquire()
- try:
- if addrList:
- print "Starting test %d/%d"%(
- 1+origLength-len(addrList),origLength)
- addr = addrList.pop()
- else:
- return
- finally:
- LOCK.release()
-
- try:
- logging.info("Launching probe for %s",addr)
- probe(addr, context)
- except Exception, e:
- logging.info("Unexpected error from %s",addr)
- record(addr, ERROR, e)
-
-def runThreaded(addrList, nThreads):
- ts = []
- origLen = len(addrList)
- for num in xrange(nThreads):
- t = threading.Thread(target=worker, args=(addrList,origLen))
- t.setName("Th#%s"%num)
- ts.append(t)
- t.start()
- for t in ts:
- logging.info("Joining thread %s",t.getName())
- t.join()
-
-def main(self, args):
- # BEGIN
- # This logic should be present in more or less all plugoos
- global OUT
- global CERT_OUT
- global OUT_DATA
- global CERT_OUT_DATA
- OUT_DATA = []
- CERT_OUT_DATA = []
-
- try:
- OUT = output.data(name=args.output.main) #open(args.output.main, 'w')
- except:
- print "No output file given. quitting..."
- return -1
-
- try:
- CERT_OUT = output.data(args.output.certificates) #open(args.output.certificates, 'w')
- except:
- print "No output cert file given. quitting..."
- return -1
-
- logging.basicConfig(format='%(asctime)s [%(levelname)s] [%(threadName)s] %(message)s',
- datefmt="%b %d %H:%M:%S",
- level=logging.INFO,
- filename=args.log)
- logging.info("============== STARTING NEW LOG")
- # END
-
- if ssl is not None:
- methodName = "ssl"
- elif OpenSSL is not None:
- methodName = "OpenSSL"
- else:
- methodName = "socket"
- logging.info("Running marco with method '%s'", methodName)
-
- addresses = []
-
- if args.input.ips:
- for fn in input.file(args.input.ips).simple():
- a, b = fn.split(":")
- addresses.append( (a,int(b)) )
-
- elif args.input.consensus:
- for fn in args:
- print fn
- for a,b in parseNetworkstatus(open(args.input.consensus)):
- addresses.append( (a,b) )
-
- if args.input.randomize:
- # Take a random permutation of the set the knuth way!
- for i in range(0, len(addresses)):
- j = random.randint(0, i)
- addresses[i], addresses[j] = addresses[j], addresses[i]
-
- if len(addresses) == 0:
- logging.error("No input source given, quitting...")
- return -1
-
- addresses = list(addresses)
-
- if not args.input.randomize:
- addresses.sort()
-
- runThreaded(addresses, N_THREADS)
-
-class MarcoPlugin(Plugoo):
- def __init__(self):
- self.name = ""
-
- self.modules = [ "logging", "socket", "time", "random", "threading", "sys",
- "OpenSSL.SSL", "OpenSSL.crypto", "os" ]
-
- self.input = Storage()
- self.input.ip = None
- try:
- c_file = os.path.expanduser("~/.tor/cached-consensus")
- open(c_file)
- self.input.consensus = c_file
- except:
- pass
-
- try:
- c_file = os.path.expanduser("~/tor/bundle/tor-browser_en-US/Data/Tor/cached-consensus")
- open(c_file)
- self.input.consensus = c_file
- except:
- pass
-
- if not self.input.consensus:
- print "Error importing consensus file"
- sys.exit(1)
-
- self.output = Storage()
- self.output.main = 'reports/marco-1.yamlooni'
- self.output.certificates = 'reports/marco_certs-1.out'
-
- # XXX This needs to be moved to a proper function
- # refactor, refactor and ... refactor!
- if os.path.exists(self.output.main):
- basedir = "/".join(self.output.main.split("/")[:-1])
- fn = self.output.main.split("/")[-1].split(".")
- ext = fn[1]
- name = fn[0].split("-")[0]
- i = fn[0].split("-")[1]
- i = int(i) + 1
- self.output.main = os.path.join(basedir, name + "-" + str(i) + "." + ext)
-
- if os.path.exists(self.output.certificates):
- basedir = "/".join(self.output.certificates.split("/")[:-1])
- fn = self.output.certificates.split("/")[-1].split(".")
- ext = fn[1]
- name = fn[0].split("-")[0]
- i = fn[0].split("-")[1]
- i = int(i) + 1
- self.output.certificates= os.path.join(basedir, name + "-" + str(i) + "." + ext)
-
- # We require for Tor to already be running or have recently run
- self.args = Storage()
- self.args.input = self.input
- self.args.output = self.output
- self.args.log = 'reports/marco.log'
-
- def ooni_main(self, cmd):
- self.args.input.randomize = cmd.randomize
- self.args.input.ips = cmd.listfile
- main(self, self.args)
-
-if __name__ == '__main__':
- if len(sys.argv) < 2:
- print >> sys.stderr, ("This script takes one or more networkstatus "
- "files as arguments.")
- self = None
- main(self, sys.argv[1:])
-
-# Redistribution and use in source and binary forms, with or without
-# modification, are permitted provided that the following conditions are
-# met:
-#
-# * Redistributions of source code must retain the above copyright
-# notice, this list of conditions and the following disclaimer.
-#
-# * Redistributions in binary form must reproduce the above
-# copyright notice, this list of conditions and the following disclaimer
-# in the documentation and/or other materials provided with the
-# distribution.
-#
-# * Neither the names of the copyright owners nor the names of its
-# contributors may be used to endorse or promote products derived from
-# this software without specific prior written permission.
-#
-# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
-# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
-# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
-# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
-# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
-# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
-# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
-# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
-# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/to-be-ported/very-old/ooni/plugins/proxy_plgoo.py b/to-be-ported/very-old/ooni/plugins/proxy_plgoo.py
deleted file mode 100644
index d175c1c..0000000
--- a/to-be-ported/very-old/ooni/plugins/proxy_plgoo.py
+++ /dev/null
@@ -1,69 +0,0 @@
-#!/usr/bin/python
-
-import sys
-from twisted.internet import reactor, endpoints
-from twisted.web import client
-from ooni.plugooni import Plugoo
-from ooni.socksclient import SOCKSv4ClientProtocol, SOCKSWrapper
-
-class HttpPlugin(Plugoo):
- def __init__(self):
- self.name = ""
- self.type = ""
- self.paranoia = ""
- self.modules_to_import = []
- self.output_dir = ""
- self.buf = ''
-
- def cb(self, type, content):
- print "got %d bytes from %s" % (len(content), type) # DEBUG
- if not self.buf:
- self.buf = content
- else:
- if self.buf == content:
- print "SUCCESS"
- else:
- print "FAIL"
- reactor.stop()
-
- def endpoint(self, scheme, host, port):
- ep = None
- if scheme == 'http':
- ep = endpoints.TCP4ClientEndpoint(reactor, host, port)
- elif scheme == 'https':
- from twisted.internet import ssl
- ep = endpoints.SSL4ClientEndpoint(reactor, host, port,
- ssl.ClientContextFactory())
- return ep
-
- def ooni_main(self, cmd):
- # We don't have the Command object so cheating for now.
- url = cmd.hostname
-
- # FIXME: validate that url is on the form scheme://host[:port]/path
- scheme, host, port, path = client._parse(url)
-
- ctrl_dest = self.endpoint(scheme, host, port)
- if not ctrl_dest:
- raise Exception('unsupported scheme %s in %s' % (scheme, url))
- if cmd.controlproxy:
- assert scheme != 'https', "no support for proxied https atm, sorry"
- _, proxy_host, proxy_port, _ = client._parse(cmd.controlproxy)
- control = SOCKSWrapper(reactor, proxy_host, proxy_port, ctrl_dest)
- print "proxy: ", proxy_host, proxy_port
- else:
- control = ctrl_dest
- f = client.HTTPClientFactory(url)
- f.deferred.addCallback(lambda x: self.cb('control', x))
- control.connect(f)
-
- exp_dest = self.endpoint(scheme, host, port)
- if not exp_dest:
- raise Exception('unsupported scheme %s in %s' % (scheme, url))
- # FIXME: use the experiment proxy if there is one
- experiment = exp_dest
- f = client.HTTPClientFactory(url)
- f.deferred.addCallback(lambda x: self.cb('experiment', x))
- experiment.connect(f)
-
- reactor.run()
diff --git a/to-be-ported/very-old/ooni/plugins/simple_dns_plgoo.py b/to-be-ported/very-old/ooni/plugins/simple_dns_plgoo.py
deleted file mode 100644
index 87d3684..0000000
--- a/to-be-ported/very-old/ooni/plugins/simple_dns_plgoo.py
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/usr/bin/env python
-#
-# DNS tampering detection module
-# by Jacob Appelbaum <jacob(a)appelbaum.net>
-#
-# This module performs DNS queries against a known good resolver and a possible
-# bad resolver. We compare every resolved name against a list of known filters
-# - if we match, we ring a bell; otherwise, we list possible filter IP
-# addresses. There is a high false positive rate for sites that are GeoIP load
-# balanced.
-#
-
-import sys
-import ooni.dnsooni
-
-from ooni.plugooni import Plugoo
-
-class DNSBulkPlugin(Plugoo):
- def __init__(self):
- self.in_ = sys.stdin
- self.out = sys.stdout
- self.randomize = True # Pass this down properly
- self.debug = False
-
- def DNS_Tests(self):
- print "DNS tampering detection for list of domains:"
- tests = self.get_tests_by_filter(("_DNS_BULK_Tests"), (ooni.dnsooni))
- self.run_tests(tests)
-
- def magic_main(self):
- self.run_plgoo_tests("_Tests")
-
- def ooni_main(self, args):
- self.magic_main()
-
diff --git a/to-be-ported/very-old/ooni/plugins/tcpcon_plgoo.py b/to-be-ported/very-old/ooni/plugins/tcpcon_plgoo.py
deleted file mode 100644
index 01dee81..0000000
--- a/to-be-ported/very-old/ooni/plugins/tcpcon_plgoo.py
+++ /dev/null
@@ -1,278 +0,0 @@
-#!/usr/bin/python
-# Copyright 2011 The Tor Project, Inc.
-# License at end of file.
-#
-# This is a modified version of the marco plugoo. Given a list of
-# IP:port addresses, this plugoo will attempt a TCP connection with each
-# host and write the results to a .yamlooni file.
-#
-# This plugoo uses threads and as a result, it's not friendly to SIGINT signals.
-#
-
-import logging
-import socket
-import time
-import random
-import threading
-import sys
-import os
-try:
- from ooni.plugooni import Plugoo
-except:
- print "Error importing Plugoo"
-
-try:
- from ooni.common import Storage
-except:
- print "Error importing Storage"
-
-try:
- from ooni import output
-except:
- print "Error importing output"
-
-try:
- from ooni import input
-except:
- print "Error importing input"
-
-################################################################
-
-# How many servers should we test in parallel?
-N_THREADS = 16
-
-# How long do we give individual socket operations to succeed or fail?
-# (Seconds)
-TIMEOUT = 10
-
-################################################################
-
-CONNECTING = "noconnect"
-OK = "ok"
-ERROR = "err"
-
-LOCK = threading.RLock()
-socket.setdefaulttimeout(TIMEOUT)
-
-# We will want to log the IP address, the port and the state
-def record((addr,port), state, extra=None):
- LOCK.acquire()
- try:
- OUT.append({'addr' : addr,
- 'port' : port,
- 'state' : state,
- 'extra' : extra})
- finally:
- LOCK.release()
-
-# For each IP address in the list, open a socket, write to the log and
-# then close the socket
-def probe(address,theCtx=None):
- sock = s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
- logging.info("Opening socket to %s",address)
- try:
- s.connect(address)
- except IOError, e:
- logging.info("Error %s from socket connect.",e)
- record(address, CONNECTING, e)
- s.close()
- return
- logging.info("Socket to %s open. Successfully launched TCP handshake.",address)
- record(address, OK)
- s.close()
-
-def parseNetworkstatus(ns):
- for line in ns:
- if line.startswith('r '):
- r = line.split()
- yield (r[-3],int(r[-2]))
-
-def parseCachedDescs(cd):
- for line in cd:
- if line.startswith('router '):
- r = line.split()
- yield (r[2],int(r[3]))
-
-def worker(addrList, origLength):
- done = False
- context = None
-
- while True:
- LOCK.acquire()
- try:
- if addrList:
- print "Starting test %d/%d"%(
- 1+origLength-len(addrList),origLength)
- addr = addrList.pop()
- else:
- return
- finally:
- LOCK.release()
-
- try:
- logging.info("Launching probe for %s",addr)
- probe(addr, context)
- except Exception, e:
- logging.info("Unexpected error from %s",addr)
- record(addr, ERROR, e)
-
-def runThreaded(addrList, nThreads):
- ts = []
- origLen = len(addrList)
- for num in xrange(nThreads):
- t = threading.Thread(target=worker, args=(addrList,origLen))
- t.setName("Th#%s"%num)
- ts.append(t)
- t.start()
- for t in ts:
- t.join()
-
-def main(self, args):
- # BEGIN
- # This logic should be present in more or less all plugoos
- global OUT
- global OUT_DATA
- OUT_DATA = []
-
- try:
- OUT = output.data(name=args.output.main) #open(args.output.main, 'w')
- except:
- print "No output file given. quitting..."
- return -1
-
- logging.basicConfig(format='%(asctime)s [%(levelname)s] [%(threadName)s] %(message)s',
- datefmt="%b %d %H:%M:%S",
- level=logging.INFO,
- filename=args.log)
- logging.info("============== STARTING NEW LOG")
- # END
-
- methodName = "socket"
- logging.info("Running tcpcon with method '%s'", methodName)
-
- addresses = []
-
- if args.input.ips:
- for fn in input.file(args.input.ips).simple():
- a, b = fn.split(":")
- addresses.append( (a,int(b)) )
-
- elif args.input.consensus:
- for fn in args:
- print fn
- for a,b in parseNetworkstatus(open(args.input.consensus)):
- addresses.append( (a,b) )
-
- if args.input.randomize:
- # Take a random permutation of the set the knuth way!
- for i in range(0, len(addresses)):
- j = random.randint(0, i)
- addresses[i], addresses[j] = addresses[j], addresses[i]
-
- if len(addresses) == 0:
- logging.error("No input source given, quiting...")
- return -1
-
- addresses = list(addresses)
-
- if not args.input.randomize:
- addresses.sort()
-
- runThreaded(addresses, N_THREADS)
-
-class MarcoPlugin(Plugoo):
- def __init__(self):
- self.name = ""
-
- self.modules = [ "logging", "socket", "time", "random", "threading", "sys",
- "os" ]
-
- self.input = Storage()
- self.input.ip = None
- try:
- c_file = os.path.expanduser("~/.tor/cached-consensus")
- open(c_file)
- self.input.consensus = c_file
- except:
- pass
-
- try:
- c_file = os.path.expanduser("~/tor/bundle/tor-browser_en-US/Data/Tor/cached-consensus")
- open(c_file)
- self.input.consensus = c_file
- except:
- pass
-
- if not self.input.consensus:
- print "Error importing consensus file"
- sys.exit(1)
-
- self.output = Storage()
- self.output.main = 'reports/tcpcon-1.yamlooni'
- self.output.certificates = 'reports/tcpcon_certs-1.out'
-
- # XXX This needs to be moved to a proper function
- # refactor, refactor and ... refactor!
- if os.path.exists(self.output.main):
- basedir = "/".join(self.output.main.split("/")[:-1])
- fn = self.output.main.split("/")[-1].split(".")
- ext = fn[1]
- name = fn[0].split("-")[0]
- i = fn[0].split("-")[1]
- i = int(i) + 1
- self.output.main = os.path.join(basedir, name + "-" + str(i) + "." + ext)
-
- if os.path.exists(self.output.certificates):
- basedir = "/".join(self.output.certificates.split("/")[:-1])
- fn = self.output.certificates.split("/")[-1].split(".")
- ext = fn[1]
- name = fn[0].split("-")[0]
- i = fn[0].split("-")[1]
- i = int(i) + 1
- self.output.certificates= os.path.join(basedir, name + "-" + str(i) + "." + ext)
-
- # We require for Tor to already be running or have recently run
- self.args = Storage()
- self.args.input = self.input
- self.args.output = self.output
- self.args.log = 'reports/tcpcon.log'
-
- def ooni_main(self, cmd):
- self.args.input.randomize = cmd.randomize
- self.args.input.ips = cmd.listfile
- main(self, self.args)
-
-if __name__ == '__main__':
- if len(sys.argv) < 2:
- print >> sys.stderr, ("This script takes one or more networkstatus "
- "files as an argument.")
- self = None
- main(self, sys.argv[1:])
-
-# Redistribution and use in source and binary forms, with or without
-# modification, are permitted provided that the following conditions are
-# met:
-#
-# * Redistributions of source code must retain the above copyright
-# notice, this list of conditions and the following disclaimer.
-#
-# * Redistributions in binary form must reproduce the above
-# copyright notice, this list of conditions and the following disclaimer
-# in the documentation and/or other materials provided with the
-# distribution.
-#
-# * Neither the names of the copyright owners nor the names of its
-# contributors may be used to endorse or promote products derived from
-# this software without specific prior written permission.
-#
-# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
-# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
-# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
-# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
-# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
-# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
-# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
-# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
-# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
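The deleted tcpcon plugin above drains a shared address list from a pool of worker threads, each opening a plain TCP socket and recording an `ok`/`noconnect` state. The same pattern can be sketched in a few lines of stdlib Python (names such as `run_threaded` are illustrative, not the original plugin API):

```python
import socket
import threading

def probe(addr, timeout=10):
    """Attempt a TCP handshake to (host, port); return 'ok' or 'noconnect'."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(timeout)
    try:
        s.connect(addr)
        return "ok"
    except OSError:
        return "noconnect"
    finally:
        s.close()

def run_threaded(addresses, n_threads=16):
    """Drain a shared address list from several workers, as the plugin's
    worker() loop did, and collect results keyed by address."""
    lock = threading.Lock()
    results = {}

    def worker():
        while True:
            with lock:  # the lock only guards the shared list and results
                if not addresses:
                    return
                addr = addresses.pop()
            results[addr] = probe(addr)

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

As in the original, the lock protects only the shared work list; each socket operation happens outside the critical section so probes run in parallel.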
diff --git a/to-be-ported/very-old/ooni/plugins/tor.py b/to-be-ported/very-old/ooni/plugins/tor.py
deleted file mode 100644
index 0d95d4d..0000000
--- a/to-be-ported/very-old/ooni/plugins/tor.py
+++ /dev/null
@@ -1,80 +0,0 @@
-import re
-import os.path
-import signal
-import subprocess
-import socket
-import threading
-import time
-import logging
-
-from pytorctl import TorCtl
-
-torrc = os.path.join(os.getcwd(),'torrc') #os.path.join(projroot, 'globaleaks', 'tor', 'torrc')
-# hiddenservice = os.path.join(projroot, 'globaleaks', 'tor', 'hiddenservice')
-
-class ThreadProc(threading.Thread):
- def __init__(self, cmd):
- threading.Thread.__init__(self)
- self.cmd = cmd
- self.proc = None
-
- def run(self):
- print "running"
- try:
- self.proc = subprocess.Popen(self.cmd,
- shell = False, stdout = subprocess.PIPE,
- stderr = subprocess.PIPE)
-
- except OSError:
- logging.fatal('cannot execute command')
-
-class Tor:
- def __init__(self):
- self.start()
-
- def check(self):
- conn = TorCtl.connect()
- if conn != None:
- conn.close()
- return True
-
- return False
-
-
- def start(self):
- if not os.path.exists(torrc):
- raise OSError("torrc doesn't exist (%s)" % torrc)
-
- tor_cmd = ["tor", "-f", torrc]
-
- torproc = ThreadProc(tor_cmd)
- torproc.run()
-
- bootstrap_line = re.compile("Bootstrapped 100%: ")
-
- while True:
- if torproc.proc == None:
- time.sleep(1)
- continue
-
- init_line = torproc.proc.stdout.readline().strip()
-
- if not init_line:
- torproc.proc.kill()
- return False
-
- if bootstrap_line.search(init_line):
- break
-
- return True
-
- def stop(self):
- if not self.check():
- return
-
- conn = TorCtl.connect()
- if conn != None:
- conn.send_signal("SHUTDOWN")
- conn.close()
-
-t = Tor()
diff --git a/to-be-ported/very-old/ooni/plugins/torrc b/to-be-ported/very-old/ooni/plugins/torrc
deleted file mode 100644
index b9ffc80..0000000
--- a/to-be-ported/very-old/ooni/plugins/torrc
+++ /dev/null
@@ -1,9 +0,0 @@
-SocksPort 9050
-ControlPort 9051
-VirtualAddrNetwork 10.23.47.0/10
-AutomapHostsOnResolve 1
-TransPort 9040
-TransListenAddress 127.0.0.1
-DNSPort 5353
-DNSListenAddress 127.0.0.1
-
diff --git a/to-be-ported/very-old/ooni/plugooni.py b/to-be-ported/very-old/ooni/plugooni.py
deleted file mode 100644
index 17f17b3..0000000
--- a/to-be-ported/very-old/ooni/plugooni.py
+++ /dev/null
@@ -1,106 +0,0 @@
-#!/usr/bin/env python
-#
-# Plugooni, ooni plugin module for loading plgoo files.
-# by Jacob Appelbaum <jacob(a)appelbaum.net>
-# Arturo Filasto' <art(a)fuffa.org>
-
-import sys
-import os
-
-import imp, pkgutil, inspect
-
-class Plugoo:
- def __init__(self, name, plugin_type, paranoia, author):
- self.name = name
- self.author = author
- self.type = plugin_type
- self.paranoia = paranoia
-
- """
- Expect a tuple of strings in 'filters' and a tuple of ooni 'plugins'.
- Return a list of (plugin, function) tuples that match 'filter' in 'plugins'.
- """
- def get_tests_by_filter(self, filters, plugins):
- ret_functions = []
-
- for plugin in plugins:
- for function_ptr in dir(plugin):
- if function_ptr.endswith(filters):
- ret_functions.append((plugin,function_ptr))
- return ret_functions
-
- """
- Expect a list of (plugin, function) tuples that must be ran, and three strings 'clean'
- 'dirty' and 'failed'.
- Run the tests and print 'clean','dirty' or 'failed' according to the test result.
- """
- def run_tests(self, tests, clean="clean", dirty="dirty", failed="failed"):
- for test in tests:
- filter_result = getattr(test[0], test[1])(self)
- if filter_result == True:
- print test[1] + ": " + clean
- elif filter_result == None:
- print test[1] + ": " + failed
- else:
- print test[1] + ": " + dirty
-
- """
- Find all the tests belonging to plgoo 'self' and run them.
- We know the tests when we see them because they end in 'filter'.
- """
- def run_plgoo_tests(self, filter):
- for function_ptr in dir(self):
- if function_ptr.endswith(filter):
- getattr(self, function_ptr)()
-
-PLUGIN_PATHS = [os.path.join(os.getcwd(), "ooni", "plugins")]
-RESERVED_NAMES = [ "skel_plgoo" ]
-
-class Plugooni():
- def __init__(self, args):
- self.in_ = sys.stdin
- self.out = sys.stdout
- self.debug = False
- self.loadall = True
- self.plugin_name = args.plugin_name
- self.listfile = args.listfile
-
- self.plgoo_found = False
-
- # Print all the plugoons to stdout.
- def list_plugoons(self):
- print "Plugooni list:"
- for loader, name, ispkg in pkgutil.iter_modules(PLUGIN_PATHS):
- if name not in RESERVED_NAMES:
- print "\t%s" %(name.split("_")[0])
-
- # Return name of the plgoo class of a plugin.
- # We know because it always ends with "Plugin".
- def get_plgoo_class(self,plugin):
- for memb_name, memb in inspect.getmembers(plugin, inspect.isclass):
- if memb.__name__.endswith("Plugin"):
- return memb
-
- # This function is responsible for loading and running the plugoons
- # the user wants to run.
- def run(self, command_object):
- print "Plugooni: the ooni plgoo plugin module loader"
-
- # iterate all modules
- for loader, name, ispkg in pkgutil.iter_modules(PLUGIN_PATHS):
- # see if this module should be loaded
- if (self.plugin_name == "all") or (name == self.plugin_name+"_plgoo"):
- self.plgoo_found = True # we found at least one plgoo!
-
- file, pathname, desc = imp.find_module(name, PLUGIN_PATHS)
- # load module
- plugin = imp.load_module(name, file, pathname, desc)
- # instantiate plgoo class and call its ooni_main()
- self.get_plgoo_class(plugin)().ooni_main(command_object)
-
- # if we couldn't find the plgoo; whine to the user
- if self.plgoo_found is False:
- print "Plugooni could not find plugin '%s'!" %(self.plugin_name)
-
-if __name__ == '__main__':
- self.main()
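The Plugooni loader above discovers `*_plgoo` modules with `pkgutil.iter_modules` and loads them via the long-deprecated `imp` module. A rough modern equivalent of the discover-and-load step using `importlib` (directory layout and names hypothetical) looks like:

```python
import importlib.util
import pkgutil

def iter_plugins(plugin_paths, reserved=("skel_plgoo",)):
    """Yield (name, module) for every importable module found under
    plugin_paths, skipping reserved names, as Plugooni.run() did with imp."""
    for finder, name, ispkg in pkgutil.iter_modules(plugin_paths):
        if name in reserved:
            continue
        spec = finder.find_spec(name)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        yield name, module
```

A caller would then look for the class ending in `"Plugin"` inside each loaded module and invoke its `ooni_main()`, exactly as `get_plgoo_class` does above.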
diff --git a/to-be-ported/very-old/ooni/transparenthttp.py b/to-be-ported/very-old/ooni/transparenthttp.py
deleted file mode 100644
index 311fb32..0000000
--- a/to-be-ported/very-old/ooni/transparenthttp.py
+++ /dev/null
@@ -1,41 +0,0 @@
-#!/usr/bin/env python
-#
-# Captive Portal Detection With Multi-Vendor Emulation
-# by Jacob Appelbaum <jacob(a)appelbaum.net>
-#
-# This module performs multiple tests that match specific vendor
-# mitm proxies
-
-import sys
-import ooni.http
-import ooni.report
-
-class TransparentHTTPProxy():
- def __init__(self, args):
- self.in_ = sys.stdin
- self.out = sys.stdout
- self.debug = False
- self.logger = ooni.report.Log().logger
-
- def TransparentHTTPProxy_Tests(self):
- print "Transparent HTTP Proxy:"
- filter_name = "_TransparentHTTP_Tests"
- tests = [ooni.http]
- for test in tests:
- for function_ptr in dir(test):
- if function_ptr.endswith(filter_name):
- filter_result = getattr(test, function_ptr)(self)
- if filter_result == True:
- print function_ptr + " thinks the network is clean"
- elif filter_result == None:
- print function_ptr + " failed"
- else:
- print function_ptr + " thinks the network is dirty"
-
- def main(self):
- for function_ptr in dir(self):
- if function_ptr.endswith("_Tests"):
- getattr(self, function_ptr)()
-
-if __name__ == '__main__':
- self.main()
diff --git a/var/old_notes.txt b/var/old_notes.txt
new file mode 100644
index 0000000..81d834f
--- /dev/null
+++ b/var/old_notes.txt
@@ -0,0 +1,418 @@
+This is a list of techniques that should be added as plugins or hooks or yamlooni
+
+Implement Plugooni - our plugin framework
+Implement Yamlooni - our output format
+Implement Proxooni - our proxy spec and program
+
+We should launch our own Tor on a special port (say, 127.0.0.1:9066)
+We should act as a controller with TorCtl to do this, etc
+We should take the Tor consensus file and pass it to plugins such as marco
+
+HTTP Host header comparison of a vs b
+HTTP Content-Length header comparison of a vs b
+
+GET request splitting
+ "G E T "
+ Used in Iran
+
+General Malformed HTTP requests
+ Error pages are fingerprintable
+
+traceroute
+ icmp/udp/tcp
+ each network link is an edge, each hop is a vertex in a network graph
+
+traceroute hop count
+ "TTL walking"
+
+Latency measurement
+TCP reset detection
+Forged DNS spoofing detection
+
+DNS oracle query tool
+ given DNS server foo - test resolve and look for known block pages
+
+Test HTTP header order - do they get reordered?
+
+Look for these filter fingerprints:
+X-Squid-Error: ERR_SCC_SMARTFILTER_DENIED 0
+X-Squid-Error: ERR_ACCESS_DENIED 0
+X-Cache: MISS from SmartFilter
+
+
+WWW-Authenticate: Basic realm="SmartFilter Control List HTTP Download"
+
+
+Via: 1.1 WEBFILTER.CONSERVESCHOOL.ORG:8080
+
+X-Cache: MISS from webfilter.whiteschneider.com
+X-Cache: MISS from webfilter.whiteschneider.com
+X-Cache: MISS from webfilter.whiteschneider.com
+
+Location: http://192.168.0.244/webfilter/blockpage?nonce=7d2b7e500e99a0fe&tid=3
+
+
+X-Cache: MISS from webfilter.imscs.local
+X-Cache: MISS from webfilter.tjs.at
+
+
+Via: 1.1 webwasher (Webwasher 6.8.7.9396)
+
+Websense:
+HTTP/1.0 301 Moved Permanently -> Location: http://www.websense.com/
+
+Via: HTTP/1.1 localhost.localdomain (Websense-Content_Gateway/7.1.4 [c s f ]), HTTP/1.0 localhost.localdomain (Websense-Content_Gateway/7.1.4 [cMsSf ])
+
+
+BlueCoat:
+
+Via: 1.1 testrating.dc5.es.bluecoat.com
+403 ->
+Set-Cookie: BIGipServerpool_bluecoat=1185677834.20480.0000; expires=Fri, 15-Apr-2011 10:13:21 GMT; path=/
+
+HTTP/1.0 407 Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied. ) -> Via: 1.1 WEBSENSE
+
+HTTP/1.0 302 Found -> Location: http://bluecoat/?cfru=aHR0cDovLzIwMC4yNy4xMjMuMTc4Lw==
+
+HTTP/1.0 403 Forbidden
+Server: squid/3.0.STABLE8
+
+X-Squid-Error: ERR_ACCESS_DENIED 0
+X-Cache: MISS from Bluecoat
+X-Cache-Lookup: NONE from Bluecoat:3128
+Via: 1.0 Bluecoat (squid/3.0.STABLE8)
+
+ISA server:
+HTTP/1.0 403 Forbidden ( ISA Server is configured to block HTTP requests that require authentication. )
+
+
+Unknown:
+X-XSS-Protection: 1; mode=block
+
+Rimon filter:
+
+Rimon: RWC_BLOCK
+HTTP/1.1 Rimon header
+Rimon header is only sent by lighttpd
+http://www.ynetnews.com/articles/0,7340,L-3446129,00.html
+http://btya.org/pdfs/rvienerbrochure.pdf
+
+Korea filtering:
+HTTP/1.0 302 Object Moved -> Location: http://www.willtechnology.co.kr/eng/BlockingMSGew.htm
+Redirects to Korean filter:
+http://www.willtechnology.co.kr/eng/BlockingMSGew.htm
+
+UA filtering:
+HTTP/1.0 307 Temporary Redirect
+https://my.best.net.ua/login/blocked/
+
+netsweeper:
+HTTP/1.0 302 Moved
+Location: http://netsweeper1.gaggle.net:8080/webadmin/deny/index.php?dpid=53&dpruleid…
+
+Set-cookie: RT_SID_netsweeper.com.80=68a6f5c564a9db297e8feb2bff69d73f; path=/
+X-Cache: MISS from netsweeper.irishbroadband.ie
+X-Cache-Lookup: NONE from netsweeper.irishbroadband.ie:80
+Via: 1.0 netsweeper.irishbroadband.ie:80 (squid/2.6.STABLE21)
+
+Nokia:
+Via: 1.1 saec-nokiaq05ca (NetCache NetApp/6.0.7)
+Server: "Nokia"
+
+CensorNet:
+HTTP/1.0 401 Authorization Required
+WWW-Authenticate: Basic realm="CensorNet Administration Area"
+Server: CensorNet/4.0
+
+http://www.itcensor.com/censor
+
+
+Server: ZyWALL Content Filter
+
+Apache/1.3.34 (Unix) filter/1.0
+
+HTTP/1.0 502 infiniteproxyloop
+Via: 1.0 218.102.20.37 (McAfee Web Gateway 7.0.1.5.0.8505)
+
+
+Set-Cookie: McAfee-SCM-URL-Filter-Coach="dD4OzXciEcp8Ihf1dD4ZzHM5FMZ2PSvRTllOnSR4RZkqfkmEIGgb3hZlVJsEaFaXNmNS3mgsdZAxaVOKIGgrrSx4Rb8hekmNKn4g02VZToogf1SbIQcVz3Q8G/U="; Comment="McAfee URL access coaching"; Version=1; Path=/; Max-Age=900; expires=Sat, 18 Dec 2010 06:47:11 GMT;
+
+
+WWW-Authenticate: Basic realm="(Nancy McAfee)"
+
+
+No known fingerprints for:
+NetNanny
+WebChaver
+accountable2you.com
+http://www.shodanhq.com/?q=barracuda
+http://www.shodanhq.com/?q=untangle
+http://www.shodanhq.com/?q=Lightspeed
+
+Server: Smarthouse Lightspeed
+Server: Smarthouse Lightspeed2
+Server: Smarthouse Lightspeed 3
+
+Server: EdgePrism/3.8.1.1
+
+
+X-Cache: MISS from Barracuda-WebFilter.jmpsecurities.com
+Via: 1.0 Barracuda-WebFilter.jmpsecurities.com:8080 (http_scan/4.0.2.6.19)
+
+HTTP/1.0 302 Redirected by M86 Web Filter
+http://www.m86security.com/products/web_security/m86-web-filter.asp
+
+Location: http://10.1.61.37:81/cgi/block.cgi?URL=http://70.182.111.99/&IP=96.9.174.54…
+
+
+Via: 1.1 WEBSENSE
+
+
+Via: 1.1 192.168.1.251 (McAfee Web Gateway 7.1.0.1.0.10541)
+Via: 1.1 McAfeeSA3000.cbcl.lan
+
+
+X-Squid-Error: ERR_CONNECT_FAIL 111
+X-Cache: MISS from CudaWebFilter.poten.com
+
+http://212.50.251.82/ -iran squid
+
+HTTP/1.0 403 Forbidden ( Forefront TMG denied the specified Uniform Resource Locator (URL). )
+Via: 1.1 TMG
+
+
+Server: NetCache appliance (NetApp/6.0.2)
+
+
+Server: EdgePrism/3.8.1.1
+
+
+Server: Mikrotik HttpProxy
+
+
+Via: 1.1 TMG-04, 1.1 TMG-03
+
+
+X-Squid-Error: ERR_INVALID_REQ 0
+X-Cache: MISS from uspa150.trustedproxies.com
+X-Cache-Lookup: NONE from uspa150.trustedproxies.com:80
+
+http://www.shodanhq.com/host/view/93.125.95.177
+
+
+Server: SarfX WEB: Self Automation Redirect & Filter Expernet.Ltd Security Web Server
+http://203.229.245.100/ <- korea block page
+
+
+
+Server: Asroc Intelligent Security Filter 4.1.8
+
+
+
+Server: tinyproxy/1.8.2
+
+http://www.shodanhq.com/host/view/64.104.95.251
+
+
+
+Server: Asroc Intelligent Security Filter 4.1.8
+
+http://www.shodanhq.com/host/view/67.220.92.62
+
+
+Server: SarfX WEB: Self Automation Redirect & Filter Expernet.Ltd Security Web Server
+http://www.shodanhq.com/host/view/203.229.245.100
+Location: http://192.168.3.20/redirect.cgi?Time=05%2FJul%2F2011%3A21%3A29%3A32%20%2B0…
+
+
+http://www.shodanhq.com/?q=%22content+filter%22+-squid+-apache+-ZyWall&page=4
+http://www.shodanhq.com/host/view/72.5.92.51
+http://www.microsoft.com/forefront/threat-management-gateway/en/us/pricing-licensing.aspx
+
+http://meta.wikimedia.org/wiki/Talk:XFF_project
+
+% dig nats.epiccash.com
+
+; <<>> DiG 9.7.3 <<>> nats.epiccash.com
+;; global options: +cmd
+;; Got answer:
+;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 14920
+;; flags: qr rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 2, ADDITIONAL: 0
+
+;; QUESTION SECTION:
+;nats.epiccash.com. IN A
+
+;; ANSWER SECTION:
+nats.epiccash.com. 5 IN A 172.27.0.1
+
+;; AUTHORITY SECTION:
+epiccash.com. 5 IN NS ns0.example.net.
+epiccash.com. 5 IN NS ns1.example.net.
+
+;; Query time: 81 msec
+;; SERVER: 172.16.42.2#53(172.16.42.2)
+;; WHEN: Sat Jul 16 16:14:11 2011
+;; MSG SIZE rcvd: 98
+
+If we think it's squid, we can perhaps confirm it:
+echo -e "GET cache_object://localhost/info HTTP/1.0\r\n" | nc en.wikipedia.com 80
+Harvest urls from:
+http://urlblacklist.com/?sec=download
+
+https://secure.wikimedia.org/wikipedia/simple/wiki/User_talk:62.30.249.131
+
+mention WCCPv2 filters (http://www.cl.cam.ac.uk/~rnc1/talks/090528-uknof13.pdf)
+
+Cite a bunch of Richard's work:
+http://www.cl.cam.ac.uk/~rnc1/ignoring.pdf
+
+http://www.contentkeeper.com/products/web
+
+We should detect HTTP re-directs to rfc-1918 addresses; they're almost always captive portals.
+We should also detect HTTP MITM served from rfc-1918 addresses for the same reason.
+
+We should take a page from sshshuttle and run without touching the disk
+
+VIA Rail MITM's SSL In Ottawa:
+Jul 22 17:47:21.983 [Warning] Problem bootstrapping. Stuck at 85%: Finishing handshake with first hop. (DONE; DONE; count 13; recommendation warn)
+
+http://wireless.colubris.com:81/goform/HtmlLoginRequest?username=al1852&password=al1852
+
+VIA Rail Via header (DONE):
+
+HTTP/1.0 301 Moved Permanently
+Location: http://www.google.com/
+Content-Type: text/html; charset=UTF-8
+Date: Sat, 23 Jul 2011 02:21:30 GMT
+Expires: Mon, 22 Aug 2011 02:21:30 GMT
+Cache-Control: public, max-age=2592000
+Server: gws
+Content-Length: 219
+X-XSS-Protection: 1; mode=block
+X-Cache: MISS from cache_server
+X-Cache-Lookup: MISS from cache_server:3128
+Via: 1.0 cache_server:3128 (squid/2.6.STABLE21)
+Connection: close
+
+<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
+<TITLE>301 Moved</TITLE></HEAD><BODY>
+<H1>301 Moved</H1>
+The document has moved
+<A HREF="http://www.google.com/">here</A>.
+</BODY></HTML>
+
+
+blocked site (DONE):
+
+HTTP/1.0 302 Moved Temporarily
+Server: squid/2.6.STABLE21
+Date: Sat, 23 Jul 2011 02:22:17 GMT
+Content-Length: 0
+Location: http://10.66.66.66/denied.html
+
+invalid request response:
+
+$ nc 8.8.8.8 80 (DONE)
+hjdashjkdsahjkdsa
+HTTP/1.0 400 Bad Request
+Server: squid/2.6.STABLE21
+Date: Sat, 23 Jul 2011 02:22:44 GMT
+Content-Type: text/html
+Content-Length: 1178
+Expires: Sat, 23 Jul 2011 02:22:44 GMT
+X-Squid-Error: ERR_INVALID_REQ 0
+X-Cache: MISS from cache_server
+X-Cache-Lookup: NONE from cache_server:3128
+Via: 1.0 cache_server:3128 (squid/2.6.STABLE21)
+Proxy-Connection: close
+
+<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
+<HTML><HEAD><META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=iso-8859-1">
+<TITLE>ERROR: The requested URL could not be retrieved</TITLE>
+<STYLE type="text/css"><!--BODY{background-color:#ffffff;font-family:verdana,sans-serif}PRE{font-family:sans-serif}--></STYLE>
+</HEAD><BODY>
+<H1>ERROR</H1>
+<H2>The requested URL could not be retrieved</H2>
+<HR noshade size="1px">
+<P>
+While trying to process the request:
+<PRE>
+hjdashjkdsahjkdsa
+
+</PRE>
+<P>
+The following error was encountered:
+<UL>
+<LI>
+<STRONG>
+Invalid Request
+</STRONG>
+</UL>
+
+<P>
+Some aspect of the HTTP Request is invalid. Possible problems:
+<UL>
+<LI>Missing or unknown request method
+<LI>Missing URL
+<LI>Missing HTTP Identifier (HTTP/1.0)
+<LI>Request is too large
+<LI>Content-Length missing for POST or PUT requests
+<LI>Illegal character in hostname; underscores are not allowed
+</UL>
+<P>Your cache administrator is <A HREF="mailto:root">root</A>.
+
+<BR clear="all">
+<HR noshade size="1px">
+<ADDRESS>
+Generated Sat, 23 Jul 2011 02:22:44 GMT by cache_server (squid/2.6.STABLE21)
+</ADDRESS>
+</BODY></HTML>
+
+nc 10.66.66.66 80
+GET cache_object://localhost/info HTTP/1.0
+HTTP/1.0 403 Forbidden
+Server: squid/2.6.STABLE21
+Date: Sat, 23 Jul 2011 02:25:56 GMT
+Content-Type: text/html
+Content-Length: 1061
+Expires: Sat, 23 Jul 2011 02:25:56 GMT
+X-Squid-Error: ERR_ACCESS_DENIED 0
+X-Cache: MISS from cache_server
+X-Cache-Lookup: NONE from cache_server:3128
+Via: 1.0 cache_server:3128 (squid/2.6.STABLE21)
+Proxy-Connection: close
+
+<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
+<HTML><HEAD><META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=iso-8859-1">
+<TITLE>ERROR: The requested URL could not be retrieved</TITLE>
+<STYLE type="text/css"><!--BODY{background-color:#ffffff;font-family:verdana,sans-serif}PRE{font-family:sans-serif}--></STYLE>
+</HEAD><BODY>
+<H1>ERROR</H1>
+<H2>The requested URL could not be retrieved</H2>
+<HR noshade size="1px">
+<P>
+While trying to retrieve the URL:
+<A HREF="cache_object://localhost/info">cache_object://localhost/info</A>
+<P>
+The following error was encountered:
+<UL>
+<LI>
+<STRONG>
+Access Denied.
+</STRONG>
+<P>
+Access control configuration prevents your request from
+being allowed at this time. Please contact your service provider if
+you feel this is incorrect.
+</UL>
+<P>Your cache administrator is <A HREF="mailto:root">root</A>.
+
+
+<BR clear="all">
+<HR noshade size="1px">
+<ADDRESS>
+Generated Sat, 23 Jul 2011 02:25:56 GMT by cache_server (squid/2.6.STABLE21)
+</ADDRESS>
+</BODY></HTML>
+
+
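The header fingerprints collected above (Via, X-Squid-Error, Server, and friends) lend themselves to a simple table-driven matcher. A toy sketch, with the table abridged to a few of the fingerprints quoted in these notes:

```python
# Header fingerprint -> best-guess filter vendor, taken from the examples
# collected in the notes above (abridged; real deployments vary widely).
FINGERPRINTS = {
    ("X-Squid-Error", "ERR_SCC_SMARTFILTER_DENIED"): "SmartFilter",
    ("Via", "WEBSENSE"): "Websense",
    ("Via", "Bluecoat"): "BlueCoat",
    ("Server", "ZyWALL Content Filter"): "ZyWALL",
    ("Server", "Mikrotik HttpProxy"): "Mikrotik",
}

def match_filter(headers):
    """Return the vendor whose fingerprint substring appears in the given
    response headers, or None if nothing matches."""
    for (field, needle), vendor in FINGERPRINTS.items():
        if needle in headers.get(field, ""):
            return vendor
    return None
```

Substring matching on a per-header basis is enough for the fingerprints listed here; redirect-based block pages (Location headers pointing at RFC 1918 addresses) would need a separate check, as noted above.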
diff --git a/var/proxooni-spec.txt b/var/proxooni-spec.txt
new file mode 100644
index 0000000..7cc476f
--- /dev/null
+++ b/var/proxooni-spec.txt
@@ -0,0 +1,65 @@
+
+ Proxyooni specification
+ version 0.0
+ Jacob Appelbaum
+
+0. Preface
+
+ This document describes a new proxy that is required to support ooni-probe.
+
+1. Overview
+
+ There is no common proxy type that thwarts even the most basic traffic
+ monitoring. The Proxyooni specification aims to provide a proxy that is
+ encrypted by default, optionally authenticated, and will provide a way to run
+ specific ooni-probe tests natively on the system where the proxy is running.
+
+2. Implementation
+
+ Proxyooni may be written in any language; the reference implementation will be
+ written in Python. The program shall be called ooni-proxy and it will handle
+ running as a privileged user or an unprivileged user on supported systems. We
+ aim to support ooni-proxy on Debian GNU/Linux as the reference platform.
+
+2.1 Connections
+
+ When ooni-proxy runs, it should open a single port and it will allow TLS 1.0
+ clients to connect with a cipher suite that provides perfect forward secrecy.
+
+2.2 Certificates
+
+ ooni-proxy should use a certificate if supplied or dynamically generate a
+ certificate on startup; any connecting client should bootstrap trust with a
+ TOFU model, a client may ignore the
+
+2.3 Authentication
+
+ ooni-proxy should provide open access by default with no authentication.
+ It should support TLS-PSK[0] if authentication is desired. Key distribution is
+ explicitly an out-of-scope problem.
+
+3.0 Services offered
+
+ Post authentication, a remote client should treat ooni-proxy as a SOCKS4A[1]
+ proxy. It should be possible to chain as many Proxyooni proxies as desired.
+
+3.1 Additional services offered
+
+ ooni-proxy should allow for the sending of raw socket data - this is currently
+ left unspecified. This should be specified in the next revision of the
+ specification.
+
+3.2 Advanced meta-services
+
+ It may be desired to load code on the ooni-proxy from a client with newer
+ tests. This should be specified in the next revision of the specification.
+
+4. Security Concerns
+
+ It is probably not a good idea to run ooni-proxy unless you have permission to
+ do so. Consider your network context carefully; if it is dangerous to run a test
+ ensure that you do not run the test.
+
+[0] http://en.wikipedia.org/wiki/TLS-PSK
+[1] http://en.wikipedia.org/wiki/SOCKS#SOCKS_4a
+
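Section 2.2's trust-on-first-use bootstrap can be sketched as fingerprint pinning: hash the peer certificate on first contact, store the digest, and compare it on later connections. A minimal stdlib sketch (the one-file pin store is an assumption, not part of the spec):

```python
import hashlib
import os

def tofu_check(cert_der, store_path):
    """Trust-on-first-use: pin the SHA-256 fingerprint of the first
    certificate seen at store_path; afterwards accept only a matching
    fingerprint."""
    fingerprint = hashlib.sha256(cert_der).hexdigest()
    if not os.path.exists(store_path):
        # First use: trust the certificate and record its fingerprint.
        with open(store_path, "w") as f:
            f.write(fingerprint)
        return True
    with open(store_path) as f:
        pinned = f.read().strip()
    return fingerprint == pinned
```

A real client would hash the DER bytes obtained from the TLS handshake (e.g. `ssl.SSLSocket.getpeercert(binary_form=True)`) and keep one pin per host:port.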
diff --git a/var/secdev.org.pem b/var/secdev.org.pem
new file mode 100644
index 0000000..6fdbb97
--- /dev/null
+++ b/var/secdev.org.pem
@@ -0,0 +1,20 @@
+-----BEGIN CERTIFICATE-----
+MIIDVDCCAjwCCQD6iQnFvlSvNjANBgkqhkiG9w0BAQUFADBsMQswCQYDVQQGEwJG
+UjEMMAoGA1UECBMDSWRGMQ4wDAYDVQQHEwVQYXJpczETMBEGA1UEChMKc2VjZGV2
+Lm9yZzETMBEGA1UECxMKc2VjZGV2Lm9yZzEVMBMGA1UEAxQMKi5zZWNkZXYub3Jn
+MB4XDTA4MDUxOTIxMzAxNVoXDTE4MDUyMDIxMzAxNVowbDELMAkGA1UEBhMCRlIx
+DDAKBgNVBAgTA0lkRjEOMAwGA1UEBxMFUGFyaXMxEzARBgNVBAoTCnNlY2Rldi5v
+cmcxEzARBgNVBAsTCnNlY2Rldi5vcmcxFTATBgNVBAMUDCouc2VjZGV2Lm9yZzCC
+ASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAMijlApVIOF86nIsPvIfKjkQ
+qpw8DWtugsUQkspXGaJM5qM3CvoxQ3VQejIZiLIS/B57WtpwfhD63h+dswUZy1wI
+Z4injE/uF4R7ylNammROjS1ycQbFM1fWX/1nzKFrxWpX3lU2YjwB9qIAlE3u/SyH
+U10oq9ZJ5KlyOrjTPe3eb0KBwF5W0AJxcTiGQJhADZAaAivZRT880GYJAo3UaL/G
+JaBYIYSFxvGnqmUVM9kbnGLFQEQahBpgmtCzMRVFXp/AccxCtXKY+LORtSGNKaB6
+ODDidG8jyb3S9GmjtgxwyWHvY/9YRW2BkB3AufRsOAWUN7jWDtRLKy6FCLbxE/sC
+AwEAATANBgkqhkiG9w0BAQUFAAOCAQEAex0loqATvXxZEagphrLASUGpKIlTf3a1
+1adokzrKvbuDXcxNqUKEPxI09TjnT/zySLfVc18t+yy2baSstPFC9RrLPeu8rfzL
+k+NTDmM3OfW60MCeEnyNxPvW0wCIrFLfH3t5XPT3J2DtYLmecg8Lf/sQOEWPyMVc
+uCaFIYsAypGYi0wwG5VDQHEsKxkHC2nBRwGJdx9w70yy14H/JOAZl5yQpLHEc4Db
+RUfNTIV2myXOIET2VbCN2Yc8Gegsclc506XVOQypp5Ndvy4GW2yRRE2ps1c1xH6P
+OHENUp0JPyLeyibmoOCUfrlrq2KoSashFZmPCGYFFJvcKAYI45GcaQ==
+-----END CERTIFICATE-----
commit b55355f1ec24a50b3a9cbf3e670a89fc5e0a3528
Author: aagbsn <aagbsn(a)extc.org>
Date: Wed Dec 5 14:08:10 2012 +0000
Use HTTPS where available
---
README.md | 14 +++++++-------
1 files changed, 7 insertions(+), 7 deletions(-)
diff --git a/README.md b/README.md
index be35943..77b5b91 100644
--- a/README.md
+++ b/README.md
@@ -29,13 +29,13 @@ On debian based systems these can be installed with:
The python dependencies required for running ooniprobe are:
- * Tor (>2.2.x): http://torproject.org/
- * Twisted (>12.1.0): http://twistedmatrix.com/trac/
+ * Tor (>2.2.x): https://torproject.org/
+ * Twisted (>12.1.0): https://twistedmatrix.com/trac/
* PyYAML: http://pyyaml.org/
* Scapy: http://www.secdev.org/projects/scapy/
- * pypcap: http://code.google.com/p/pypcap/
- * libdnet: http://code.google.com/p/libdnet/
- * BeautifulSoup: http://www.crummy.com/software/BeautifulSoup/
+ * pypcap: https://code.google.com/p/pypcap/
+ * libdnet: https://code.google.com/p/libdnet/
+ * BeautifulSoup: https://www.crummy.com/software/BeautifulSoup/
* txtorcon: https://github.com/meejah/txtorcon
## Install Tor
@@ -110,7 +110,7 @@ If you don't already have Subversion installed:
For libdnet:
- wget http://libdnet.googlecode.com/files/libdnet-1.12.tgz
+ wget https://libdnet.googlecode.com/files/libdnet-1.12.tgz
tar xzf libdnet-1.12.tgz
cd libdnet-1.12
./configure && make
@@ -120,7 +120,7 @@ For libdnet:
For pypcap:
- svn checkout http://pypcap.googlecode.com/svn/trunk/ pypcap-read-only
+ svn checkout https://pypcap.googlecode.com/svn/trunk/ pypcap-read-only
cd pypcap-read-only/
pip install pyrex
make
[ooni-probe/master] call launch_tor with the tor_binary from config
by art@torproject.org 06 Dec '12
commit 3fd67d201061ca22c5fcdbc7bbaafb9897a888c5
Author: aagbsn <aagbsn(a)extc.org>
Date: Wed Dec 5 16:56:49 2012 +0000
call launch_tor with the tor_binary from config
Otherwise, the default is supplied by txtorcon as /usr/sbin/tor
---
ooni/runner.py | 1 +
1 files changed, 1 insertions(+), 0 deletions(-)
diff --git a/ooni/runner.py b/ooni/runner.py
index bc9b874..2123a7d 100644
--- a/ooni/runner.py
+++ b/ooni/runner.py
@@ -516,6 +516,7 @@ def startTor():
log.debug("Setting SOCKS port as %s" % tor_config.SocksPort)
d = launch_tor(tor_config, reactor,
+ tor_binary=config.advanced.tor_binary,
progress_updates=updates)
d.addCallback(setup_complete)
d.addErrback(setup_failed)
commit 7f29689606b9661c53c4164c8bb660df87567223
Author: aagbsn <aagbsn(a)extc.org>
Date: Thu Dec 6 12:20:14 2012 +0000
Add docstrings to network helper functions
Adds documentation for getDefaultIface and getNetworkFromRoutes.
Also adds #XXX warning for users of OpenVZ environments.
---
ooni/utils/txscapy.py | 5 +++++
1 files changed, 5 insertions(+), 0 deletions(-)
diff --git a/ooni/utils/txscapy.py b/ooni/utils/txscapy.py
index 7902133..210f21b 100644
--- a/ooni/utils/txscapy.py
+++ b/ooni/utils/txscapy.py
@@ -45,6 +45,7 @@ from scapy.all import BasePacketList, conf, PcapReader
from scapy.all import conf, Gen, SetGen, MTU
def getNetworksFromRoutes():
+ """ Return a list of networks from the routing table """
from scapy.all import conf, ltoa, read_routes
from ipaddr import IPNetwork, IPAddress
@@ -65,6 +66,10 @@ class IfaceError(Exception):
pass
def getDefaultIface():
+ """ Return the default interface or raise IfaceError """
+ #XXX: currently broken on OpenVZ environments, because
+ # the routing table does not contain a default route
+ # Workaround: Set the default interface in ooniprobe.conf
networks = getNetworksFromRoutes()
for net in networks:
if net.is_private:
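The `getDefaultIface` helper documented above walks scapy's routing table and returns the interface of the first private network it finds. The selection step itself needs only the stdlib `ipaddress` module; here the route list is passed in as plain `(network, interface)` pairs rather than read from scapy:

```python
import ipaddress

def pick_private_network(routes):
    """Given (network, interface) pairs from a routing table, return the
    first private network and its interface, mirroring getDefaultIface's
    preference for RFC 1918 space. Raises LookupError when no private
    route exists (the OpenVZ failure mode flagged in the diff above)."""
    for net, iface in routes:
        network = ipaddress.ip_network(net)
        if network.is_private:
            return network, iface
    raise LookupError("no private network in routing table")
```

This also makes the OpenVZ caveat concrete: a venet0-only routing table with no private route leaves nothing to pick, so the interface must be configured by hand.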
commit 6dea02f4de4cd32741b3f9a7404750dff7136183
Author: aagbsn <aagbsn(a)extc.org>
Date: Thu Dec 6 11:56:16 2012 +0000
Fix spelling in test documentation
---
docs/source/tests/dnstamper.rst | 6 +++---
docs/source/tests/http_host.rst | 10 +++++-----
docs/source/tests/http_invalid_request_line.rst | 4 ++--
docs/source/tests/http_requests.rst | 4 ++--
docs/source/tests/tcpconnect.rst | 2 +-
docs/source/tests/traceroute.rst | 2 +-
docs/source/writing_tests.rst | 8 ++++----
7 files changed, 18 insertions(+), 18 deletions(-)
diff --git a/docs/source/tests/dnstamper.rst b/docs/source/tests/dnstamper.rst
index c4e6f17..8fdb572 100644
--- a/docs/source/tests/dnstamper.rst
+++ b/docs/source/tests/dnstamper.rst
@@ -22,8 +22,8 @@ lookup on the first A record address of both sets and check if they both
resolve to the same name.
NOTE: This test frequently results in false positives due to GeoIP-based
-load balancing on major global sites such as google, facebook, and
-youtube, etc.
+load balancing on major global sites such as Google, Facebook, and
+Youtube, etc.
How to run the test
===================
@@ -176,5 +176,5 @@ From running:
test_started: 1354187839.512434
...
-Notes: Query is the string repsentation of :class:twisted.names.dns.Query
+Notes: Query is the string representation of :class:twisted.names.dns.Query
diff --git a/docs/source/tests/http_host.rst b/docs/source/tests/http_host.rst
index febdc4a..ebe10e7 100644
--- a/docs/source/tests/http_host.rst
+++ b/docs/source/tests/http_host.rst
@@ -22,7 +22,7 @@ enumerating the sites that are being censored by it.
It places inside of the Host header field the hostname of the site that is to
be tested for censorship and then determines if the probe is behind a
transparent HTTP proxy (because the response from the backend server does not
-match) and if the site is censorsed, by checking if the page that it got back
+match) and if the site is censored, by checking if the page that it got back
matches the input block page.
*Why do content blocking?*
@@ -30,7 +30,7 @@ matches the input block page.
Q: Why should be do content blocking measurements with this test when we have
other tests that also do this?
-A: Why not? Although you are correct that tecnically the two tests are
+A: Why not? Although you are correct that technically the two tests are
equivalent even though the IP layer differs in the two tests.
Note: We may in the future remove the Content Blocking aspect of the HTTP Host
@@ -46,9 +46,9 @@ How to run the test
*backend url* is the url of the backend that will be used for checking if the
site is blocked or not.
-*content* is the content of a blockpage. When a transparent HTTP proxy is
-present we will do comparisons against this to verify if the requested site is
-blocked or not.
+*content* is the content of a page. When a transparent HTTP proxy is present we
+will do comparisons against this to verify if the requested site is blocked or
+not.
Sample report
diff --git a/docs/source/tests/http_invalid_request_line.rst b/docs/source/tests/http_invalid_request_line.rst
index 4ac223e..4a59df0 100644
--- a/docs/source/tests/http_invalid_request_line.rst
+++ b/docs/source/tests/http_invalid_request_line.rst
@@ -24,7 +24,7 @@ on the HTTP request line. We generate a series of requests that are not
valid HTTP requests.
The remote backend runs a TCP echo server. If the response from the backend
-does not match with what we have sent then we say that tampering is occuring.
+does not match with what we have sent then we say that tampering is occurring.
The idea behind this is that certain transparent HTTP proxies may not be
properly parsing the HTTP request line.
@@ -75,7 +75,7 @@ This generates a request that looks like this:
::
GET / HTTP/XxX
-This attemps to trigger bugs in the parsing of the HTTP version number, that
+This attempts to trigger bugs in the parsing of the HTTP version number, that
is usually being split on the `.`.
How to run the test
diff --git a/docs/source/tests/http_requests.rst b/docs/source/tests/http_requests.rst
index cc98882..6eb0d0a 100644
--- a/docs/source/tests/http_requests.rst
+++ b/docs/source/tests/http_requests.rst
@@ -21,8 +21,8 @@ and over Tor. It then compares the two responses to see if the response bodies o
proportion between the expected body length (the one over Tor) and the one over
the control network match.
-If the proportion between the two body lengths is <= a certain tollerance
-factor (by default set to 0.8), then we say that they do not match.
+If the proportion between the two body lengths is <= a certain tolerance factor
+(by default set to 0.8), then we say that they do not match.
The reason for doing so is that a lot of sites serve geolocalized content based
on the location from which the request originated from.
diff --git a/docs/source/tests/tcpconnect.rst b/docs/source/tests/tcpconnect.rst
index 7c7bd16..e92077d 100644
--- a/docs/source/tests/tcpconnect.rst
+++ b/docs/source/tests/tcpconnect.rst
@@ -16,7 +16,7 @@ Details
Description
===========
-This test performs TCP connections to a set of sepecified IP:PORT pairs and
+This test performs TCP connections to a set of specified IP:PORT pairs and
reports the reason for which it failed connecting to the target address.
The reason for failure may be: "timeout", when the connection timed out,
diff --git a/docs/source/tests/traceroute.rst b/docs/source/tests/traceroute.rst
index 898fc30..5b18454 100644
--- a/docs/source/tests/traceroute.rst
+++ b/docs/source/tests/traceroute.rst
@@ -40,7 +40,7 @@ other means that are not the src and destination IP address.
In particular the ICMP TTL expired citations will contain the IP headers.
-We could theorically strip these though even if that were the case there would
+We could theoretically strip these though even if that were the case there would
still be at least a reduction of the anonymity set given by the fact that we
received a TTL expired from a router in a certain network range.
diff --git a/docs/source/writing_tests.rst b/docs/source/writing_tests.rst
index f74f3f0..e4d03e4 100644
--- a/docs/source/writing_tests.rst
+++ b/docs/source/writing_tests.rst
@@ -41,7 +41,7 @@ specifies what command line option may be used to control this value.
By default the ``inputProcessor`` is set to read the file line by line and
strip newline characters. To change this behavior you must set the
-``inputProcessor`` attribute to a function that takes as arugment a file
+``inputProcessor`` attribute to a function that takes as argument a file
descriptor and yield the next item. The default ``inputProcessor`` looks like
this::
@@ -57,7 +57,7 @@ Setup and command line passing
------------------------------
Tests may define the `setUp` method that will be called every time the Test
-Case object is intantiated, in here you may place some common logic to all your
+Case object is instantiated, in here you may place some common logic to all your
Test Methods that should be run before any testing occurs.
Command line arguments can be parsed thanks to the twisted
@@ -181,8 +181,8 @@ To implement a simple ICMP ping based on this function you can do like so
return d
The arguments taken by self.sr() are exactly the same as the scapy send and
-receive function, the only difference is that instead of using the regualar
-scapy super socket it uses our twisted drivven wrapper around it.
+receive function, the only difference is that instead of using the regular
+scapy super socket it uses our twisted driven wrapper around it.
Alternatively this test can also be written using the
`twisted.defer.inlineCallbacks` decorator, that makes it look more similar to
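The body-length comparison corrected in the http_requests docs above can be sketched as a small predicate; the exact computation inside ooniprobe may differ, so treat the proportion formula and the 0.8 default as an assumed reading of the documentation:

```python
def bodies_match(experiment_length, control_length, tolerance=0.8):
    """Compare two response body lengths.

    If the proportion between the shorter and the longer body is <= the
    tolerance factor, the bodies are considered not to match.
    """
    if experiment_length == 0 and control_length == 0:
        return True  # both empty: trivially matching
    proportion = min(experiment_length, control_length) / float(
        max(experiment_length, control_length))
    return proportion > tolerance
```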
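The default ``inputProcessor`` described in the writing_tests hunk above (read the input file line by line and strip newline characters) can be written as a short generator; this is a sketch of the documented behaviour, not the exact ooniprobe code:

```python
import io

def line_input_processor(fd):
    """Yield one stripped line of the input file at a time."""
    for line in fd:
        yield line.strip()

# Works on any file-like object, e.g. an open URL-list file:
items = list(line_input_processor(io.StringIO("host1\nhost2\n")))
```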
commit e4ab3c57422824f68b906cbfaa2de8391b550184
Author: aagbsn <aagbsn(a)extc.org>
Date: Thu Dec 6 12:14:03 2012 +0000
Add dnst to ooni.templates.rst
---
docs/source/api/ooni.templates.rst | 6 ++++++
1 files changed, 6 insertions(+), 0 deletions(-)
diff --git a/docs/source/api/ooni.templates.rst b/docs/source/api/ooni.templates.rst
index 6951e31..c65a2da 100644
--- a/docs/source/api/ooni.templates.rst
+++ b/docs/source/api/ooni.templates.rst
@@ -17,4 +17,10 @@ templates Package
:undoc-members:
:show-inheritance:
+:mod:`dnst` Module
+--------------------
+.. automodule:: ooni.templates.dnst
+ :members:
+ :undoc-members:
+ :show-inheritance:
commit a2468793713723c1811ab01d115b5bdd2cd8c465
Author: aagbsn <aagbsn(a)extc.org>
Date: Thu Dec 6 12:28:23 2012 +0000
Add docstrings to ooni/runner.py
Add docstring to startTor()
Add docstring to startSniffing()
---
ooni/runner.py | 7 +++++++
1 files changed, 7 insertions(+), 0 deletions(-)
diff --git a/ooni/runner.py b/ooni/runner.py
index bc9b874..723b209 100644
--- a/ooni/runner.py
+++ b/ooni/runner.py
@@ -464,6 +464,10 @@ class UnableToStartTor(Exception):
pass
def startTor():
+ """ Starts Tor
+ Launches a Tor with :param: socks_port :param: control_port
+ :param: tor_binary set in ooniprobe.conf
+ """
@defer.inlineCallbacks
def state_complete(state):
config.tor_state = state
@@ -522,6 +526,9 @@ def startTor():
return d
def startSniffing():
+ """ Start sniffing with Scapy. Exits if required privileges (root) are not
+ available.
+ """
from ooni.utils.txscapy import ScapyFactory, ScapySniffer
try:
checkForRoot()

commit 95ff2cf20efd0a293e44ecfa80eb19da9e238043
Author: aagbsn <aagbsn(a)extc.org>
Date: Thu Dec 6 12:24:07 2012 +0000
Remove automodule for ooni/__init__.py
---
docs/source/api/ooni.rst | 8 --------
1 files changed, 0 insertions(+), 8 deletions(-)
diff --git a/docs/source/api/ooni.rst b/docs/source/api/ooni.rst
index 83b4f83..cb79525 100644
--- a/docs/source/api/ooni.rst
+++ b/docs/source/api/ooni.rst
@@ -1,14 +1,6 @@
ooni Package
============
-:mod:`ooni` Package
--------------------
-
-.. automodule:: ooni.__init__
- :members:
- :undoc-members:
- :show-inheritance:
-
:mod:`inputunit` Module
-----------------------
commit 4cfb89c7f000a23cd0547a3ff0c513b22f4b7efa
Author: aagbsn <aagbsn(a)extc.org>
Date: Thu Dec 6 12:34:09 2012 +0000
Clean up index.rst
Fix link to http_header_field_manipulation
Fix some spelling errors
Remove daphne from list of tests
Disable globbing in index TOC
---
docs/source/index.rst | 5 +----
1 files changed, 1 insertions(+), 4 deletions(-)
diff --git a/docs/source/index.rst b/docs/source/index.rst
index 770133d..1abb517 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -62,14 +62,12 @@ Traffic Manipulation Tests
* `DNS Spoof <tests/dnsspoof.html>`_
- * `HTTP Header Field Manipulation <tests/http_header_field_manipulation>`_
+ * `HTTP Header Field Manipulation <tests/http_header_field_manipulation.html>`_
* `Traceroute <tests/traceroute.html>`_
* `HTTP Host <tests/http_host.html>`_
- * `Daphne <tests/daphne.html>`_
-
Other tests
...........
@@ -120,7 +118,6 @@ More developer documentation
.. toctree::
:maxdepth: 2
- :glob:
oonib
writing_tests

[ooni-probe/master] Add documentation and examples templates httpt and dnst
by art@torproject.org 06 Dec '12
commit 256c340080cf8d721ac278c91bb97407b935b5dc
Author: aagbsn <aagbsn(a)extc.org>
Date: Thu Dec 6 13:38:04 2012 +0000
Add documentation and examples templates httpt and dnst
---
docs/source/writing_tests.rst | 180 +++++++++++++++++++++++++++-
nettests/examples/example_http_checksum.py | 27 ++++
2 files changed, 203 insertions(+), 4 deletions(-)
diff --git a/docs/source/writing_tests.rst b/docs/source/writing_tests.rst
index e4d03e4..a8e5409 100644
--- a/docs/source/writing_tests.rst
+++ b/docs/source/writing_tests.rst
@@ -317,14 +317,186 @@ TODO finish this with more details
HTTP based tests
................
-see nettests/examples/example_httpt.py
+HTTP based tests will be a subclass of `ooni.templates.httpt.HTTPTest`.
-TODO
+It provides methods `ooni.templates.httpt.HTTPTest.processResponseBody` and
+`ooni.templates.httpt.HTTPTest.processResponseHeaders` for interacting with the
+response body and headers respectively.
+
+For example, to implement a HTTP test that returns the sha256 hash of the
+response body (based on nettests/examples/example_httpt.py):
+
+::
+
+ from ooni.utils import log
+ from ooni.templates import httpt
+ from hashlib import sha256
+
+ class SHA256HTTPBodyTest(httpt.HTTPTest):
+ name = "ChecksumHTTPBodyTest"
+ author = "Aaron Gibson"
+ version = 0.1
+
+ inputFile = ['url file', 'f', None,
+ 'List of URLS to perform GET requests to']
+ requiredOptions = ['url file']
+
+ def test_http(self):
+ if self.input:
+ url = self.input
+ return self.doRequest(url)
+ else:
+ raise Exception("No input specified")
+
+ def processResponseBody(self, body):
+ body_sha256sum = sha256(body).hexdigest()
+ self.report['checksum'] = body_sha256sum
+
+The report for this test looks like this:
+
+::
+
+ ###########################################
+ # OONI Probe Report for ChecksumHTTPBodyTest test
+ # Thu Dec 6 17:31:57 2012
+ ###########################################
+ ---
+ options:
+ collector: null
+ help: 0
+ logfile: null
+ pcapfile: null
+ reportfile: null
+ resume: 0
+ subargs: [-f, hosts]
+ test: nettests/examples/example_http_checksum.py
+ probe_asn: null
+ probe_cc: null
+ probe_ip: 127.0.0.1
+ software_name: ooniprobe
+ software_version: 0.0.7.1-alpha
+ start_time: 1354786317.0
+ test_name: ChecksumHTTPBodyTest
+ test_version: 0.1
+ ...
+ ---
+ input: http://www.google.com
+ report:
+ agent: agent
+ checksum: d630fa2efd547d3656e349e96ff7af5496889dad959e8e29212af1ff843e7aa1
+ requests:
+ - request:
+ body: null
+ headers:
+ - - User-Agent
+ - - [Opera/9.00 (Windows NT 5.1; U; en), 'Opera 9.0, Windows XP']
+ method: GET
+ url: http://www.google.com
+ response:
+ body: '<!doctype html><html ... snip ... </html>'
+ code: 200
+ headers:
+ - - X-XSS-Protection
+ - [1; mode=block]
+ - - Set-Cookie
+ - ['PREF=ID=fada4216eb3684f9:FF=0:TM=1354800717:LM=1354800717:S=IT-2GCkNAocyXlVa;
+ expires=Sat, 06-Dec-2014 13:31:57 GMT; path=/; domain=.google.com', 'NID=66=KWaLbNQumuGuYf0HrWlGm54u9l-DKJwhFCMQXfhQPZM-qniRhmF6QRGXUKXb_8CIUuCOHnyoC5oAX5jWNrsfk-LLJLW530UiMp6hemTtDMh_e6GSiEB4GR3yOP_E0TCN;
+ expires=Fri, 07-Jun-2013 13:31:57 GMT; path=/; domain=.google.com; HttpOnly']
+ - - Expires
+ - ['-1']
+ - - Server
+ - [gws]
+ - - Connection
+ - [close]
+ - - Cache-Control
+ - ['private, max-age=0']
+ - - Date
+ - ['Thu, 06 Dec 2012 13:31:57 GMT']
+ - - P3P
+ - ['CP="This is not a P3P policy! See http://www.google.com/support/accounts/bin/answer.py?hl=en&answer=151657
+ for more info."']
+ - - Content-Type
+ - [text/html; charset=UTF-8]
+ - - X-Frame-Options
+ - [SAMEORIGIN]
+ socksproxy: null
+ test_name: test_http
+ test_runtime: 0.08298492431640625
+ test_started: 1354800717.478403
+ ...
+
DNS based tests
...............
-see nettests/core/dnstamper.py
+DNS based tests will be a subclass of `ooni.templates.dnst.DNSTest`.
+
+It provides methods `ooni.templates.dnst.DNSTest.performPTRLookup` and
+`ooni.templates.dnst.DNSTest.performALookup`
-TODO
+For example (taken from nettests/examples/example_dnst.py):
+
+::
+
+ from ooni.templates.dnst import DNSTest
+
+ class ExampleDNSTest(DNSTest):
+ def test_a_lookup(self):
+ def gotResult(result):
+ # Result is an array containing all the A record lookup results
+ print result
+
+ d = self.performALookup('torproject.org', ('8.8.8.8', 53))
+ d.addCallback(gotResult)
+ return d
+
+The report looks like this:
+
+::
+
+ ###########################################
+ # OONI Probe Report for Base DNS Test test
+ # Thu Dec 6 17:42:51 2012
+ ###########################################
+ ---
+ options:
+ collector: null
+ help: 0
+ logfile: null
+ pcapfile: null
+ reportfile: null
+ resume: 0
+ subargs: []
+ test: nettests/examples/example_dnst.py
+ probe_asn: null
+ probe_cc: null
+ probe_ip: 127.0.0.1
+ software_name: ooniprobe
+ software_version: 0.0.7.1-alpha
+ start_time: 1354786971.0
+ test_name: Base DNS Test
+ test_version: 0.1
+ ...
+ ---
+ input: null
+ report:
+ queries:
+ - addrs: [82.195.75.101, 86.59.30.40, 38.229.72.14, 38.229.72.16]
+ answers:
+ - [<RR name=torproject.org type=A class=IN ttl=782s auth=False>, <A address=82.195.75.101
+ ttl=782>]
+ - [<RR name=torproject.org type=A class=IN ttl=782s auth=False>, <A address=86.59.30.40
+ ttl=782>]
+ - [<RR name=torproject.org type=A class=IN ttl=782s auth=False>, <A address=38.229.72.14
+ ttl=782>]
+ - [<RR name=torproject.org type=A class=IN ttl=782s auth=False>, <A address=38.229.72.16
+ ttl=782>]
+ query: '[Query(''torproject.org'', 1, 1)]'
+ query_type: A
+ resolver: [8.8.8.8, 53]
+ test_name: test_a_lookup
+ test_runtime: 0.028924942016601562
+ test_started: 1354801371.980114
+ ...
+For a more complex example, see: `DNS Tamper Test <https://gitweb.torproject.org/ooni-probe.git/blob/HEAD:/nettests/blocking/d…>`_
diff --git a/nettests/examples/example_http_checksum.py b/nettests/examples/example_http_checksum.py
new file mode 100644
index 0000000..9226b52
--- /dev/null
+++ b/nettests/examples/example_http_checksum.py
@@ -0,0 +1,27 @@
+# -*- encoding: utf-8 -*-
+#
+# :authors: Aaron Gibson
+# :licence: see LICENSE
+
+from ooni.utils import log
+from ooni.templates import httpt
+from hashlib import sha256
+
+class SHA256HTTPBodyTest(httpt.HTTPTest):
+ name = "ChecksumHTTPBodyTest"
+ author = "Aaron Gibson"
+ version = 0.1
+
+ inputFile = ['file', 'f', None,
+ 'List of URLS to perform GET requests to']
+
+ def test_http(self):
+ if self.input:
+ url = self.input
+ return self.doRequest(url)
+ else:
+ raise Exception("No input specified")
+
+ def processResponseBody(self, body):
+ body_sha256sum = sha256(body).digest()
+ self.report['checksum'] = body_sha256sum
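One detail worth noting from the commit above: the report's `checksum` field is a 64-character hex string, which is what `hexdigest()` produces, whereas the committed `example_http_checksum.py` calls `digest()`, which returns raw bytes. A minimal sketch of the difference (the sample body is made up):

```python
from hashlib import sha256

body = b"<!doctype html><html></html>"

raw = sha256(body).digest()      # 32 raw bytes
hexed = sha256(body).hexdigest() # 64-char hex string, the form seen in the report
```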