I rather enjoyed Barcamp Blackpool this weekend. I've been to a lot of conferences and Barcamps this year, but I did particularly enjoy a Barcamp on my home soil.
I made an effort this weekend to spend less time in talks, and more time talking to people, and I think it paid off. Along with giving my (now rather tired) session on pitching, I had some very rewarding one-to-ones with some very smart people - so if I cornered you and wouldn't stop talking, take it as a compliment!
I did, however, spend time in some talks. @timhastings' talk on TagWalk was very interesting, though I would have enjoyed more detail on the code running the site - but then I always like to skip ahead.
@walterja, whose talks have become the highlight of the last few Barcamp events I've attended, presented a predictably fascinating look at sociometry, which I've had some experience with before, though with a very different intention. I think this talk drew the most diverse audience of the day; it was great to see so many teachers in the room.
The evening was a lot of fun, and talking to people I was pleased by how many were discussing their own projects, startups and ideas for what's next; very invigorating.
Most blog posts end with thanks for the sponsors, which I would like to echo, but I think someone needs to call out the excellent work of @ruby_gem, who seems to have the ability to pull together these events, run them flawlessly and make it look incredibly easy. You are an asset to us all. Thank you very much for your work.
Oh, and I won the PadRacer tournament ;-)
I've a few projects coming up for 84labs which require location awareness. Location awareness works great with any recent phone, but for traditional clients I needed to fall back to obtaining the location from the client's IP address.
There is an excellent free IP location database hosted on datatables.org, which offered the easiest way to get the data I needed. This meant using YQL, which I hadn't used before; YQL is "an expressive SQL-like language that lets you query, filter, and join data across Web services".
So here is the code. I was using Python, Django and the Python YQL module, but the same query should work with any language you choose. I've removed a lot of the exception handling for clarity.
# Get the current user's IP address.
client_ip_address = request.META['REMOTE_ADDR']
# Create a YQL public query object.
y = yql.Public()
# Build the query.
query = ('USE "http://www.datatables.org/iplocation/ip.location.xml" AS ip.location; '
         'select * from geo.places where woeid in '
         '(select place.woeid from flickr.places where (lat,lon) in'
         '(select Latitude,Longitude from ip.location where ip="%s"))' % client_ip_address)
# Execute the query.
result = y.execute(query)
# ... et voila.
ip_place_name = result.rows['locality1']['content']
ip_location = result.rows['centroid']
That's it. The query performs a simple select against the 'iplocation' table to get a latitude and longitude for the IP address, converts those to a WOEID via flickr.places, then looks up the place details in geo.places (flickr.places and geo.places are part of the standard YQL set of tables, which is why they don't need their own USE statements).
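One wrinkle worth knowing about: YQL's JSON output tends to collapse a single matching element into a plain object, while several matches arrive as a list, so indexing the rows directly can behave differently depending on the data. A small helper normalises this before you index in; first_row is my own name for illustration, not part of the yql module:

```python
def first_row(rows):
    """Return the first result row, whether YQL handed back a dict or a list.

    (Illustrative helper, not part of the yql module: YQL's JSON output
    can be a single object for one match or a list for several.)
    """
    if isinstance(rows, list):
        return rows[0] if rows else None
    return rows  # a single result arrives as a plain dict
```

With that in place, the lookup above becomes `place = first_row(result.rows)` followed by `place['locality1']['content']`, which is safer when a query unexpectedly matches more (or fewer) places than you assumed.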
Yesterday, I wrote asbo.org.uk, a site which provides really basic visualisation of the UK's anti-social behaviour order data from 1999-2007. This data was recently released by data.gov.uk.
I wrote the whole site, wrangled the data, and deployed it yesterday afternoon, in about six hours. I've got some contract work coming up using CakePHP, so I wanted to try it out, and I wanted to see what I could do in such a short space of time. I'm quite pleased with the results. There's no analysis of the data, just presentation, but I was trying to see what I could do in the time I had, rather than develop features.
Originally, I wanted to write a 'how safe am I' sort of application, which could offer data about the types of crime most likely to occur in a given area, but that idea was pretty much killed once I saw the actual data available. Maybe I don't know enough about Excel-scraping, but I considerably reduced the scope of this project because the data was such a mess - it's just not worth the time to extract information from arbitrarily formatted spreadsheets. Formatting the single table of data used on asbo.org.uk took about three hours, which feels like a waste.
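For what it's worth, the cleanup itself was mostly mechanical: dropping blank rows and repeated headers, and coercing counts that had been formatted as display strings. A rough sketch of the kind of normalisation involved, assuming a simple CSV export - the column names here are invented for illustration, not the actual data.gov.uk headings:

```python
import csv
import io

def clean_rows(csv_text):
    """Drop blank rows and repeated header rows; coerce counts to ints.

    'area' and 'asbos' are illustrative column names, not the real
    spreadsheet headings from the data.gov.uk release.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    cleaned = []
    for row in reader:
        area = (row.get('area') or '').strip()
        if not area or area.lower() == 'area':
            continue  # blank row, or the header repeated mid-sheet
        try:
            count = int((row.get('asbos') or '').replace(',', ''))
        except ValueError:
            continue  # skip rows where the count isn't numeric
        cleaned.append({'area': area, 'asbos': count})
    return cleaned
```

Nothing clever, but when the source spreadsheets are formatted by hand, even this much filtering has to be rewritten per sheet - which is where the three hours went.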
I was generally quite impressed with CakePHP; it's laid out in a sane way, though coming from Python some of the automatic discovery of models in the controllers feels a little bit magic, and I'm not sure I'm comfortable passing around arrays of data rather than the data objects themselves - but that's not a deal breaker. I do like the layout system; it's something I commonly implement with blocks in Django, and seeing it formalised as part of the recommended approach to application templating is nice.
I put the source on GitHub, if anyone is interested.