Python RDF repository wanted for web proxy metadata harvester
Okay, this is getting close to outstripping my enthusiasm and invoking my laziness: Does anyone happen to have RDFLib and ZODB working under Mac OS X 10.2.3? I've also tried compiling Redland and its Python and Java APIs, but that hasn't been a complete success. Or can someone recommend another decent RDF repository to play with under Python? I've had fun with Jena under Java, love using RDQL, and dig switching between MySQL and BDB stores.
I want an RDF repository I can integrate into my proxy experiments, currently implemented in Python. I've been very tempted to switch to Java, which I know better and where I have a better sense of the available tools. But I'm still pulling for Python. I suppose I could just go with an in-memory repository at first, but I don't want to stick with that.
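For what it's worth, the "in-memory repository at first" fallback doesn't take much code. Here's a toy sketch of one, just plain Python with no dependencies; the `TripleStore` class and its methods are names I made up for illustration, not any real library's API:

```python
class TripleStore:
    """A toy in-memory RDF-ish store: statements in a plain list."""

    def __init__(self):
        self.triples = []

    def add(self, subject, predicate, obj):
        """Record one (subject, predicate, object) statement."""
        self.triples.append((subject, predicate, obj))

    def match(self, subject=None, predicate=None, obj=None):
        """Return statements matching the pattern; None is a wildcard."""
        return [
            (s, p, o)
            for (s, p, o) in self.triples
            if (subject is None or s == subject)
            and (predicate is None or p == predicate)
            and (obj is None or o == obj)
        ]

store = TripleStore()
store.add("http://example.com/", "dc:title", "Example Page")
store.add("http://example.com/", "dc:date", "2003-01-15")

# Pattern matching stands in for the kind of query RDQL would give you.
titles = store.match(predicate="dc:title")
```

Nothing persistent about it, of course, which is exactly why I don't want to stick with it.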
I'm still finishing up the PersonalWebProxy notes and plan I've been working on, but I've still got an itch to play in code. The next major thing I want to do is extract as much metadata as I can from every HTML page I visit and load the RDF repository up with statements based on what I harvest. Examples would include things like HTML title, visitation date, referring URL, any meta tags, any autodiscovered RSS and FOAF URLs, and anything else I could eventually dig out. Then, I want to amass some data and play with it. I'm thinking this could give me a kind of uber-history with which to work.
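The harvesting half of that is the easy part to sketch. Something like the following, using only the standard-library HTML parser, pulls the title, meta tags, and autodiscovery links out of a page and emits them as simple (subject, predicate, object) statements. `PageHarvester` and the predicate names are made up for illustration; the resulting tuples could be handed off to whatever RDF repository wins out:

```python
from html.parser import HTMLParser

class PageHarvester(HTMLParser):
    """Collects metadata statements about a page while parsing its HTML."""

    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url
        self.statements = []  # (subject, predicate, object) tuples
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and "name" in attrs and "content" in attrs:
            # Every <meta name=... content=...> becomes a statement.
            self.statements.append(
                (self.page_url, "meta:" + attrs["name"], attrs["content"]))
        elif tag == "link" and attrs.get("rel") == "alternate":
            # RSS-style autodiscovery links.
            self.statements.append(
                (self.page_url, "link:" + attrs.get("type", ""),
                 attrs.get("href", "")))

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title and data.strip():
            self.statements.append(
                (self.page_url, "html:title", data.strip()))

sample_html = """<html><head>
<title>Test Page</title>
<meta name="description" content="A sample page">
<link rel="alternate" type="application/rss+xml"
      href="http://example.com/index.rss">
</head><body></body></html>"""

harvester = PageHarvester("http://example.com/")
harvester.feed(sample_html)
```

Wire that into the proxy's response path, stamp each batch of statements with a visitation date and referring URL, and the uber-history starts accumulating.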
Update: Seems like I managed to get Python, RDFLib, and ZODB working, but I started completely from scratch and compiled everything from clean source. I guess Apple's build of Python has more hiccups in it than just the Makefile thing.