Tor-Protected Processing

Wilfred L. Guerin wilfredguerin at gmail.com
Tue Sep 4 18:08:24 UTC 2007


Though I'm sure there has been much discussion of appropriate methods for
implementing distributed data management and processing over Tor, this
particular project has critical reasons for protecting the autonomous
data-processing systems that handle arbitrary data.

I raise this need on behalf of unwitting individuals who take a picture and
never realize that something in the far distance might be a restricted
government site or some other "no pictures" target.

Please contact me directly with information on any project working on
grid-style distributed data containment and processing, especially
VM-emulated and sandboxed models, and more modern implementations of
transparent loading and migration with physical-region weighting:
WilfredGuerin at Gmail.com

The project of concern is intended for K-12 classroom curricula on using
technology and imaging tools for practical applications. The dominant
protocol uses a conventional camera, large spheres or other balls, and any
computational system to generate 3D models of the environment.

At full scale, this project (now simplified to "Project BigBall.us")
implements a simple known-target identification model (the large spheres) to
facilitate extrapolation of imaging data into 4D models.
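
As a rough illustration of the known-target idea (not the project's actual
pipeline), the sketch below assumes a pinhole camera and a sphere of known
physical radius: the sphere's apparent radius in pixels fixes the
camera-to-sphere distance, the anchor from which surrounding geometry can be
extrapolated. All names and numbers are placeholders.

    # Minimal sketch: recovering camera-to-sphere distance from one image,
    # assuming a pinhole camera and a sphere of known physical radius.
    # Everything below is illustrative, not part of Project BigBall.us.

    def sphere_distance(apparent_radius_px, sphere_radius_m, focal_length_px):
        """Approximate distance to a sphere of known size.

        Under the pinhole model, an object of physical radius R at distance Z
        projects to roughly r = f * R / Z pixels, so Z ~= f * R / r.
        """
        return focal_length_px * sphere_radius_m / apparent_radius_px

    if __name__ == "__main__":
        # A 1.0 m sphere appearing 25 px in radius through a lens with a
        # 1200 px focal length sits roughly 48 m from the camera.
        z = sphere_distance(apparent_radius_px=25.0,
                            sphere_radius_m=1.0,
                            focal_length_px=1200.0)
        print(f"estimated distance: {z:.1f} m")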

The data and processing requirements are explicit: maximal imaging sources
feeding maximal storage and processing facilities.

A current model using various p2p and server-centric distribution and
management of both processing and data storage works suitably; however,
there is no consistent protocol or set of system definitions that draws on
current public-use techniques, given the mess of today's distributed
processing models.

The specific model of data and processing management is not the concern
here; the mechanism for protecting an arbitrary processing device and
facilitating scientific research without exposure to rogue vulnerabilities
must be determined before public release.

We all know the irony of banning analogue video tape at American
infrastructure sites when video from a cell phone is far more useful against
the same concerns.

Our concern is very explicit: of any two images of a distant horizon,
neither will independently show a distinct object whose visual profile is
less than a pixel, yet the extrapolated correlational vector between the two
images will yield finite and distinguishable characteristics of the target.
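
To make the correlation point concrete, here is a minimal sketch (pure
numpy, synthetic data, not the project's code) of phase correlation between
two frames of the same scene: neither frame labels the displacement on its
own, yet the normalized cross-power spectrum of the pair recovers it as a
sharp peak. The demo uses a whole-pixel shift for simplicity; interpolating
around the peak extends the same principle to sub-pixel signatures.

    # Phase correlation between two frames of the same synthetic scene.
    import numpy as np

    def phase_correlation(a, b):
        """Return the phase-correlation surface of two equal-sized images."""
        A = np.fft.fft2(a)
        B = np.fft.fft2(b)
        cross = A * np.conj(B)
        cross /= np.abs(cross) + 1e-12      # keep only the phase information
        return np.real(np.fft.ifft2(cross))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        scene = rng.normal(size=(128, 128))
        # Second frame: the same scene shifted by a few pixels, plus noise.
        shifted = np.roll(scene, shift=(3, 5), axis=(0, 1))
        shifted += 0.1 * rng.normal(size=scene.shape)

        surface = phase_correlation(shifted, scene)
        dy, dx = np.unravel_index(np.argmax(surface), surface.shape)
        print(f"recovered shift: ({dy}, {dx})")   # expect (3, 5)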

If the resulting data overlaps ... air in a flight path with nothing flying
(or worse, near-space Earth) ... it could be assumed a liability for any
system involved in the processing.

Moreover, even with known "do not show" sectors in 4D, isolating the
do-not-show region requires segregating its signature from other
environmental characteristics; the system cannot determine that it sees
something until a separation is made between that which it sees and
everything else it sees.

In order to voluntarily strike a target from the imagery, be it someone's
house or a true liability, the system must know what that target is, along
with other characteristics that facilitate its removal. Other options are
simply not possible. (Proven)
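
A minimal sketch of that point, using an assumed axis-aligned bounding box
as the target description: the system can only strike what it can explicitly
describe. The box, the points, and the function name are all illustrative.

    # Drop every point of a 3-D model that falls inside a do-not-show box.
    import numpy as np

    def strike_region(points, box_min, box_max):
        """Remove all 3-D points inside the do-not-show bounding box."""
        points = np.asarray(points, dtype=float)
        lo = np.asarray(box_min, dtype=float)
        hi = np.asarray(box_max, dtype=float)
        inside = np.all((points >= lo) & (points <= hi), axis=1)
        return points[~inside]

    if __name__ == "__main__":
        cloud = np.array([[0.0, 0.0, 0.0],
                          [5.0, 5.0, 5.0],   # inside the restricted box
                          [9.0, 1.0, 2.0]])
        kept = strike_region(cloud, box_min=(4, 4, 4), box_max=(6, 6, 6))
        print(kept)                          # the middle point is removed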

My concern is simple.

Aside from diverse distributed data management and processing, there is an
explicit need for protection mechanisms in this model.

If a student (K-12) wants a good 3D model of the park or beach, using all
available image sources is expected. Is the student liable for the flesh in
an image when the system stripped out the growth characteristics of the tree
in the background for the project and ignored the single-case (spurious)
structures? Moreover, does that student get expelled because an autonomous
machine polled imaging sources to extract relevant information and happened
to need to filter out what was unneeded?

In the expected dealings with the DHS NAO, NGA (formerly NIMA), etc., our
analysis has indicated that a Waco-style religious institution is needed to
protect the region around any specific ball and to force the release of
various media and data collections, but protecting the mass of diverse
machines, their owners, associated entities, etc., from such stupid
liability as a smoke stack on the distant horizon requires further analysis.

I have proposed a derivative project called "Skunk Cabbage" (which always
smells good) that would facilitate VM/emulator-sandboxed scientific
processing environments on distributed machines; fully manage distributed
loading and processing with regional sectoring, using a consolidation of the
current pseudo-standards; and, most importantly, provide COMPREHENSIVE
SYSTEM AND CODE INTEGRITY SUFFICIENT TO SUBSTANTIATE THE DISTINCTION BETWEEN
AUTONOMOUS SYSTEM AND HOST ENVIRONMENT.
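
As a sketch of the integrity requirement only (not Skunk Cabbage itself),
the snippet below has the host refuse to run any work unit whose code does
not hash to a digest it already trusts, then runs the unit as a separate OS
process with a scrubbed environment. A real deployment would add a VM or
container boundary; the file handling here is purely illustrative.

    # Verify a work unit's code hash, then run it as an isolated process.
    import hashlib
    import os
    import subprocess
    import sys
    import tempfile

    def sha256_of(path):
        """Return the hex SHA-256 digest of a file's contents."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def run_work_unit(script_path, expected_digest):
        """Execute the work unit only if its code matches the trusted digest."""
        actual = sha256_of(script_path)
        if actual != expected_digest:
            raise RuntimeError(f"integrity check failed: {actual}")
        # Separate interpreter process with an empty environment: the work
        # unit cannot silently inherit credentials or state from the host.
        return subprocess.run([sys.executable, script_path],
                              env={}, capture_output=True, text=True,
                              check=True)

    if __name__ == "__main__":
        # Self-contained demo: write a trivial work unit, trust its digest,
        # run it, and clean up.
        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as tmp:
            tmp.write('print("work unit running in isolation")\n')
            unit_path = tmp.name
        try:
            result = run_work_unit(unit_path,
                                   expected_digest=sha256_of(unit_path))
            print(result.stdout, end="")
        finally:
            os.unlink(unit_path)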

Maximal efficiency, of course, but with no liability for the computation
performed by the distributed processes.

Please contact me directly with information on projects that are somewhat
related to these concerns.

Moreover, how does one best merge and optimize the various data transport
capabilities of any viable client system or host while still protecting and
isolating the processes involved?
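
One hedged answer, assuming a local Tor client on its default SOCKS port
(9050) and the requests library installed with SOCKS support
(pip install "requests[socks]"): route the work unit's transfers through Tor
so the host's network identity stays out of the exchange, while the unit
itself remains a separate process as sketched above. The URL is only a
connectivity check.

    # Route a work unit's data transfers (and DNS lookups) through Tor.
    import requests

    TOR_SOCKS = "socks5h://127.0.0.1:9050"   # socks5h: Tor also resolves DNS
    PROXIES = {"http": TOR_SOCKS, "https": TOR_SOCKS}

    def fetch_via_tor(url, timeout=60):
        """Fetch a resource with all traffic carried over the Tor proxy."""
        response = requests.get(url, proxies=PROXIES, timeout=timeout)
        response.raise_for_status()
        return response.content

    if __name__ == "__main__":
        # Illustrative check that traffic is leaving through Tor.
        print(fetch_via_tor("https://check.torproject.org/")[:80])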

Such research and protocol creation should be desirable to all parties
requiring an isolated or protected computational and data storage
environment while using the power and resources of existing public systems.

I look forward to any responses and a fast pursuit of a system suitable for
this task.

-Wilfred L. Guerin
WilfredGuerin at Gmail.com