HackWaterloo! Meet Little Turk!

Five hours on a train to Toronto, a 6:00am alarm in Toronto, an hour's drive to Waterloo, six furious hours of coding, twelve presentations, impatient waiting through deliberations, and, at last, the judges' decisions. Then, then.. then! I won Hack Waterloo (!!!), with my project Little Turk.

I wasn’t going to go to Hack Waterloo.

It’s Very Far from Montréal. Particularly for a non-driver (true story: I don’t know how to drive). But I ended up going anyway, thanks to a nudge from Leila (:

As for my project: ideas sometimes come from unexpected places. I said this in the developer chat room at work, last Wednesday:

“Maybe I should wear my tutu and mask at the hack thing this weekend, instead of on Dress Like a Grownup day <:"

Instead of wearing a tutu and mask to Hack Waterloo, though, it occurred to me that maybe I could do another video-centered hack using clips of myself in a tutu and mask (much more comfortable!). Something else I wanted to do, after using the Idée APIs pretty heavily in previous hack events, was to use a broader collection of APIs. Creating a query interface seemed like a pretty good way to combine the two. So I did!

It presented a video window with a very small version of myself, sitting in a tutu and Venetian mask, with a laptop, hula hoop, map, and ukulele. There might be other things, if you ask the right thing. Okay, there are. Next to the video was a field for entering commands, and a box for their output.

When you typed a query, some very simple language processing took place, and the video of the Little Turk would react to the query in some fashion and provide a response. Mostly, she ended up typing (: But if you asked her to look for restaurants (using the Yellow Pages API), she would point at her map until the results had been retrieved, and would then start typing. You could also tell her a colour that you like, and she would find pictures for that colour from Piximilar (alas, video exists for Dark Turk, who knows about drawing pictures; but the three-minute demo time was going to be too short for a narrative with two turks). If you asked about a domain, she would ask PostRank about that domain.

Asking about invoices would get you a smartass remark (:

The video component I put together using Quartz Composer. If you have the OS X development tools installed, Quartz Composer is lurking in there, somewhere. Roughly, it would look at an XML file to see which clip it should be playing, and play that clip. It also maintained a queue of the most recent 25 frames of video, which it combined with the current clip (mostly to smooth out transitions between the different clips with visual noise; but it does also make the video look delightfully creepy).

The query interface was HTML/javascript/jQuery, which isn’t exotic at all. The language parsing was more-or-less a tangle of if-statements and imagination. There was also a tiny server-side component that was responsible for updating the XML file when Little Turk needed to start doing something else.
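
To give a flavour of what "a tangle of if-statements" means in practice, here is a small sketch in the same spirit. It is not the actual code; handleQuery, setClip, the lookup functions, and the update_state.php endpoint are all made-up names standing in for the real ones.

```javascript
// Sketch of the query dispatch: naive keyword matching, nothing clever.
function handleQuery(query) {
  var q = query.toLowerCase();

  if (q.indexOf("restaurant") !== -1) {
    setClip("pointing_at_map");   // she points at her map while we wait
    lookupRestaurants(q);         // Yellow Pages API call
  } else if (q.indexOf("colour") !== -1 || q.indexOf("color") !== -1) {
    setClip("typing");
    lookupColour(q);              // Piximilar colour search
  } else if (q.indexOf("invoice") !== -1) {
    setClip("typing");
    showOutput(smartassRemark()); // the promised smartass remark
  } else {
    setClip("typing");            // default: she just types
  }
}

// The tiny server-side hook: the client tells the server which clip name to
// write into the XML file that Quartz Composer is watching.
function setClip(name) {
  $.post("update_state.php", { clip: name });
}
```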

Pictures and video, later (:

HackOTT! 4th Place! My Project! Icarus! It’s Magic!

At HackMTL, I learnt an important lesson: make something which is very obvious and easy to explain. Coalesce was interesting, but so esoteric that I don’t think anyone really followed what it was actually doing. So a photo mosaic seemed like a good choice. Except a photo mosaic isn’t the most compelling thing ever, and so I made a video photo mosaic instead (:

And while I did come in 4th, again (: Leila decided to award me a prize for best use of the Piximilar API. Aww.

I used the iSight camera in my MacBook as a video source for both kinds of mosaic. For the first, simpler kind, you could click anywhere on the video, and the app would show a collection of images that matched the colour under the click. If you clicked a second time, it would show you a collection of images that were half of each colour, and so on (up to 5 colours – at which point the oldest colour was dropped).
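
The click handling was roughly in this spirit (a sketch, not the actual code; sampleColourAt and queryByColours are stand-ins for the pixel read and the Piximilar call):

```javascript
// Sketch of the first mosaic mode: sample the clicked pixel, keep the last
// five colours, and search for images matching the whole set.
var colours = [];

$("#video").click(function (e) {
  var colour = sampleColourAt(e.offsetX, e.offsetY); // colour under the click

  colours.push(colour);
  if (colours.length > 5) {
    colours.shift();                                 // drop the oldest colour
  }

  queryByColours(colours);                           // search across all current colours
});
```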

For the second kind, it showed the video as a constantly updating photo mosaic. Having it show a tile for each individual colour that was present would have involved making an enormous number of requests to the colour search API; so instead, I made it round colours. At the default rendering level, it would round each of the RGB channel values to the nearest 64 – enough resolution to sort of see what’s going on, but a small enough colour set that precaching it isn’t too onerous.
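
The rounding itself is just arithmetic on each channel; roughly this (a sketch):

```javascript
// Round each RGB channel to the nearest multiple of `step`, clamped to 0–255,
// so the mosaic only ever needs a small, precachable set of colours.
function roundChannel(value, step) {
  var rounded = Math.round(value / step) * step;
  return Math.min(255, Math.max(0, rounded));
}

function roundColour(r, g, b, step) {
  step = step || 64; // the default rendering level
  return [roundChannel(r, step), roundChannel(g, step), roundChannel(b, step)];
}
```

Under this sketch, a step of 64 leaves five possible values per channel, so at most 125 distinct colours to cache.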

But I also made it so you could adjust the colour accuracy! With more accurate colours, the photo mosaic is less complete; but as time goes by, its cache of images expands to include all of the colours which are shown. I didn’t quite get to making an asynchronous image loader; so for each frame of video-mosaic rendered, it would load one more colour, so that it wouldn’t stall too long on a single frame.
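
The one-more-colour-per-frame trick looks roughly like this (a sketch; roundColour is the helper from the sketch above, and drawTile and fetchTileFor are stand-ins for the rendering and colour-search calls):

```javascript
// Sketch of the not-quite-asynchronous loader: every rendered frame pulls in
// at most one missing colour, so no single frame stalls on a pile of requests.
var tileCache = {};   // rounded colour key -> image URL
var currentStep = 64; // adjustable colour accuracy

function renderFrame(pixels) {
  var missing = [];

  pixels.forEach(function (pixel) {
    var key = roundColour(pixel.r, pixel.g, pixel.b, currentStep).join(",");
    if (tileCache[key]) {
      drawTile(pixel.x, pixel.y, tileCache[key]);
    } else if (missing.indexOf(key) === -1) {
      missing.push(key);  // note it, but don't fetch everything at once
    }
  });

  if (missing.length > 0) {
    var key = missing[0];
    fetchTileFor(key, function (url) {
      tileCache[key] = url; // available from the next frame onward
    });
  }
}
```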

A demo, let me show you one:

HackOTT – Icarus from _hristine on Vimeo.

(Uhm. It looks like there’s something weird with Vimeo embedding which is making it show the video for Coalesce twice. If you’re looking at this on my homepage, and this seems confusing, click the link under the video..)

HackMTL! 4th Place! My Project! Coalesce! It’s Magic!

I was fortunate enough to be among the participants at HackMTL on Saturday, an event where a large group of people came together (alone, in small teams, and as an ensemble cast) to build things using a collection of APIs.

I worked alone d: But I did work on a somewhat unique project!

When I read about the APIs that were going to be available during the event, the Piximilar colour search was the one that looked most compelling to me. Or rather, having the kind of data that the colour API provides was much more compelling than the other APIs.

The general shape of the thing I wanted to build was something where a collection of computers would talk to each other in such a way that they could collectively show a colour gradient. My initial intention was for a neighbouring computer to modify your own colour to be more like itself; however, I had an argument with colour theory and time, so instead, when a neighbouring computer gives you a colour, that colour becomes the edge of the gradient on that side of your screen (:

Essentially, each node in the network has its own colour. When you do something, like change your name or your colour, everyone in the network is notified, and has the opportunity to add you to their left or to their right. If you have added someone to your left and/or right, then when you choose a new colour, the people on your left and right will have their mosaics updated, with your new colour as the new colour on the left or right.

When you’ve got one person pushing colours to you, you’ll see a colour gradient from your colour to their colour. But if you’ve got people pushing colours on both sides, you’ll see a gradient from the left colour to your colour to the right colour. If you’ve got multiple people pushing colours on one side, the most recent one wins (:
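
The gradient itself is just a blend across the columns, from the left colour through your own colour to the right colour. A sketch, in plain RGB rather than the HSV plugin the client actually uses:

```javascript
// Sketch of the gradient: blend from the left neighbour's colour to my colour
// over the left half of the columns, then from my colour to the right
// neighbour's colour over the right half.
function lerp(a, b, t) {
  return Math.round(a + (b - a) * t);
}

function blend(c1, c2, t) {
  return [lerp(c1[0], c2[0], t), lerp(c1[1], c2[1], t), lerp(c1[2], c2[2], t)];
}

function columnColour(col, totalCols, leftColour, myColour, rightColour) {
  var mid = (totalCols - 1) / 2;
  if (mid === 0) {
    return myColour;
  }
  if (col <= mid) {
    return blend(leftColour || myColour, myColour, col / mid);
  }
  return blend(myColour, rightColour || myColour, (col - mid) / mid);
}
```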

Technically, the client is HTML/javascript/jQuery, plus jQuery plugins for manipulating HSV colours and for growl notifications. The client polls the server each second to see if there are any messages for the current user. Some messages (“Someone is here!”, “Someone is now called Something Else!”) are straight plain text, with the remaining messages being delimited_strings_with_useful_information_for_the_client (telling a client to use a new colour on the right or left, or telling them about a possible new neighbour).
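
In spirit, the polling loop amounts to something like this (a sketch, not the code on GitHub; the endpoint name, the message format shown, and the helper functions are all illustrative):

```javascript
// Sketch of the client's polling loop: ask the server for messages once a
// second, and handle each one.
function poll() {
  $.get("messages.php", { user: currentUser }, function (data) {
    data.split("\n").forEach(function (message) {
      if (message.indexOf("_") === -1) {
        notify(message);                           // plain-text messages become growl notices
      } else {
        var parts = message.split("_");            // e.g. colour_left_120_80_200
        if (parts[0] === "colour") {
          setEdgeColour(parts[1], parts.slice(2)); // new gradient edge on the left or right
        } else if (parts[0] === "neighbour") {
          considerNeighbour(parts[1]);             // a possible new neighbour to adopt
        }
      }
    });
  });

  setTimeout(poll, 1000);                          // once per second
}
```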

The main image-display area is just a collection of div tags, with appropriate classes for accessing any given cell based on its row and column. Changing an image involves changing the background of one of these div elements.
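
Updating a tile is about as simple as it sounds; something like this (a sketch, with made-up class names):

```javascript
// Sketch of a tile update: find the cell by its row/column classes and
// swap in the new background image.
function setTile(row, col, imageUrl) {
  $(".cell.row-" + row + ".col-" + col)
    .css("background-image", "url(" + imageUrl + ")");
}
```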

The images came from Piximilar, which is an image/colour-based search service. I pulled together a simple proxy to route same-domain queries from AJAX requests in the client to the appropriate API calls.
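
From the client's point of view, the proxy just makes Piximilar look like a same-origin endpoint; a call ends up looking something like this (a sketch; the proxy path and parameter names are made up):

```javascript
// Sketch of a colour search going through the same-domain proxy, which
// forwards it to Piximilar.
function searchByColour(r, g, b, callback) {
  $.getJSON("proxy.php", { method: "colour_search", rgb: [r, g, b].join(",") },
    function (results) {
      callback(results); // a list of image URLs matching the colour
    });
}
```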

The server is a sketchy-as-hell, simplest-thing-that-could-possibly-work PHP/MySQL thing. I expect that it will stop working in a useful fashion soon, since it’s currently sending around far more data than it needs to, or should (:

The code lives on at https://github.com/hristine/coalesce if you want to peer at how I spent my time, and how it looks underneath (: The client can be accessed at http://www.neato.co.nz/Coalesce/Client; with the caveat that I’m not sure how long the images will be available (and that I’m not sure if an excess of people will cause it to have a bit of a sad <: )

So that's what I spent five hours doing on Saturday. Do you want to see a movie? Of course you want to see a movie!

Coalesce from _hristine on Vimeo.

On Stories.

“The universe is made of stories, not atoms.”

– Muriel Rukeyser

“The most important part of the story is the part you don’t know yet.”

– Barbara Kingsolver to Kurt Andersen

The first quote, it transpires, is in my pile of things that I keep to remember. It is also how chapter one of the book I just started reading begins (The Age of Spiritual Machines by Ray Kurzweil). The second quote I passed on my way to finding the first quote, and it is here because it seemed apropos.

YUL-YVR-AKL-WLG-AKL-YVR-YUL

Canada has terrible airport codes that, for the most part, bear no resemblance to the name of the city. YUL? Really?

In any case, I am embarking on an epic voyage in February.

I leave Montréal February 14th, get to Wellington on the 16th (thanks international date line!), spend the 18th and 19th at Webstock, then leave Wellington on the 24th, arriving back in Montréal on.. the 24th. Travel gets weird when you go far.

I imagine there will be some kind of eventfulness and insobriety on the 20th, being that I am not often in New Zealand, and exactly no other reason.