[tor-bugs] #4439 [Metrics Utilities]: Develop a Java/Python API that wraps relay descriptor sources and provides unified access to them

Tor Bug Tracker & Wiki torproject-admin at torproject.org
Mon Nov 14 10:17:44 UTC 2011


#4439: Develop a Java/Python API that wraps relay descriptor sources and provides
unified access to them
-------------------------------+--------------------------------------------
 Reporter:  karsten            |          Owner:  karsten
     Type:  task               |         Status:  new    
 Priority:  normal             |      Milestone:         
Component:  Metrics Utilities  |        Version:         
 Keywords:                     |         Parent:         
   Points:                     |   Actualpoints:         
-------------------------------+--------------------------------------------

Comment(by karsten):

 Replying to [comment:5 atagar]:
 > Since this was spawned by the alarming infrastructure ticket I thought
 > in the initial comment that we were talking about an RPC for services
 > (like the alarms) to request information from the metrics hosts, and
 > that this was an API for that.  So disregard, it's obvious that has
 > nothing to do with this. :)

 Ah okay. :)

 > Hmmm... now I think we've just had another misunderstanding.  I'm
 > saying that it *is* related functionality and I'd be happy to hack on
 > it as part of stem.

 Makes sense.

 > My plan was to have Relay objects which are a composite of three
 > things...
 > - fingerprint (constructor arg, always there)
 > - consensus and descriptor data (lazily loaded, throws an exception or
 >   returns a default value if it can't be loaded)
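
 If I understand that correctly, it could look roughly like this in
 Python (just a sketch with made-up names, not actual stem code):

 {{{
 class Relay(object):
     """Composite of a fingerprint plus lazily loaded directory data."""

     def __init__(self, fingerprint, fetch_descriptor):
         # fetch_descriptor is any callable that returns the raw
         # descriptor for a fingerprint, or raises IOError on failure.
         self.fingerprint = fingerprint
         self._fetch_descriptor = fetch_descriptor
         self._descriptor = None

     def get_descriptor(self, default=None):
         """Return the descriptor, loading it on first access.  Raises
         IOError if it can't be loaded and no default is given."""
         if self._descriptor is None:
             try:
                 self._descriptor = self._fetch_descriptor(self.fingerprint)
             except IOError:
                 if default is not None:
                     return default
                 raise
         return self._descriptor
 }}}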

 I haven't thought much about an API for the parsed directory objects yet.
 My main focus was on the different data sources and how we would access
 them most efficiently.  For example, if we have a local Tor data
 directory, we ''may'' only want to learn about new descriptors since we
 last asked.  Or if we download descriptors from the directory authorities,
 we ''may'' only want to download those that we don't already know.
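
 To give a rough idea of what I mean for the data directory case
 (hypothetical class and method names, nothing decided yet):

 {{{
 import os

 class DataDirectorySource(object):
     """Hands out raw descriptor file contents from a local Tor data
     directory, remembering what it has already returned so that later
     calls only report files that changed since we last asked."""

     def __init__(self, data_dir):
         self.data_dir = data_dir
         self._last_seen = {}  # file name -> last seen modification time

     def new_descriptor_files(self):
         for name in ("cached-descriptors", "cached-descriptors.new"):
             path = os.path.join(self.data_dir, name)
             if not os.path.exists(path):
                 continue
             mtime = os.path.getmtime(path)
             if self._last_seen.get(name) == mtime:
                 continue  # unchanged since the last call
             self._last_seen[name] = mtime
             with open(path) as descriptor_file:
                 yield descriptor_file.read()
 }}}

 A download source could keep a similar "what do we already have" record,
 for example keyed by descriptor digest, and skip those descriptors when
 fetching from the directory authorities.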

 My plan was to start without parsing descriptors at all and simply hand
 out raw descriptor strings.  It does make sense to add the parsing code to
 the API, too, but that was my step two.
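
 Concretely, step one might not be much more than splitting file contents
 or downloaded data into raw descriptor strings.  A simplified sketch
 that just splits on "router" lines and doesn't treat annotation lines
 specially:

 {{{
 def split_raw_descriptors(file_contents):
     """Split the contents of a descriptor file into raw descriptor
     strings without interpreting them any further."""
     current_lines = []
     for line in file_contents.splitlines(True):
         if line.startswith("router ") and current_lines:
             yield "".join(current_lines)
             current_lines = []
         current_lines.append(line)
     if current_lines:
         yield "".join(current_lines)
 }}}

 Parsing would then be a second layer that takes such raw strings and
 turns them into proper descriptor objects.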

 When you say you want to hack on this as part of stem, does that mean that
 stem will be able to handle non-Tor-control-port data sources?  That's
 what I'm most interested in.

 How do we proceed?  Should we start with writing a design document
 describing the scope of the new API?  How about we start this in a
 task-4439 directory in the metrics-tasks Git repository?  We can always
 move it to its own Git repository once it evolves.

-- 
Ticket URL: <https://trac.torproject.org/projects/tor/ticket/4439#comment:6>
Tor Bug Tracker & Wiki <https://trac.torproject.org/>
The Tor Project: anonymity online

