Back in November 2013, nursing a broken hand and in a period of “meaningful work” with the LA, I decided to treat myself. This would be no ordinary treat, though: this would be a future treat. I decided to back the PowerUp 3.0 through its Kickstarter campaign. I’d already been very happy with The Vamp, which I’d also funded, so I wasn’t unfamiliar with the Kickstarter process.
Well, it arrived this week, Tuesday to be precise, and as I’ve been busy with other things this week I’ve left it until now to open the box.
Mark and Ashleigh danced three times for us: a Viennese Waltz, a Tango and something Richard Claydermanesque. I only know the first from the movies, the second from the Gotan Project’s “La Revancha Del Tango”, while the third seems inexplicably and inescapably linked to the ’80s melodrama “The Thorn Birds” starring Richard Chamberlain. Go figure, and here you thought I was going all “high brow.”
The Pi “Dance, sensor, camera thingymebob Pi™” was a one-shot deal: the scripts started on boot and ran until I manually stopped them by connecting an ethernet cable, ssh’ing back into the Pi and running sudo /etc/init.d/dance90fps stop and sudo /etc/init.d/xloborgdata stop. Not very elegant at all, but it would do for a first run.
The Viennese Waltz
As seen from Claire’s iPad
As seen from Mark’s Google Glass
As seen from Claire’s iPad
As seen from Ashleigh’s “Dance, sensor, camera thingymebob Pi™”
The Final Dance
The Final Dance as seen from Claire’s iPad
The Final Dance as seen from Mark’s Google Glass
The Final Dance as seen from Ashleigh’s “Dance, sensor, camera thingymebob Pi™”
The thing to remember is that Ashleigh’s headcam was recording at 90fps, and even then it struggled to capture the speed of her motion as they performed. In the next post (i.e. when I work out how to do it) I’ll post the data captured from the XLoBorg and complete the triptych.
Reflections aka Things I wish I’d thought of at the start
My kit bag would have included the following:
It turned out to be rather more time-consuming than I’d expected to find the start and end points of each performance in the 1.2GB 90fps file; a basic note of start/stop times would have made the last few days a lot easier.
It would also have been very useful to have working video-editing software, because Linux. Pitivi did its usual trick of pretending to start, attempting to import the raw footage, then vanishing with a “What’d you expect?” Kdenlive pulled in a monster truck of dependencies and took an absolute age to render. These are both really cool open source projects, but they’re not there yet.
So in the end my editing workflow came down to copying the raw footage and carving each performance out of it from the command line, which is the polar opposite of non-destructive editing, but you live and learn.
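My exact commands aren’t preserved, but the trims were along these lines. This is a hedged sketch assuming ffmpeg; the filenames, timestamps and durations are invented for illustration:

```shell
#!/bin/sh
# Hedged sketch of the destructive trimming described above.
# Filenames and timestamps here are made up for illustration.
IN=dance90fps.mp4       # the big raw capture (hypothetical name)
OUT=waltz.mp4           # one performance carved out of it
TRIM_CMD="ffmpeg -y -ss 00:04:10 -i $IN -t 00:02:30 -c copy $OUT"

# -ss/-t cut a window out of the file; -c copy skips a slow re-encode.
if command -v ffmpeg >/dev/null 2>&1 && [ -f "$IN" ]; then
  $TRIM_CMD
else
  echo "$TRIM_CMD"      # not on the editing box: just show what would run
fi
```

Overwriting as you go like this is exactly why a note of start/stop times matters: there is no undo.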
While you ponder the magnificence of the image above, let me explain how I came to be sitting in my office wearing a Raspberry Pi camera sewn onto one of Claire’s Accessories’ finest headbands. That I’m posting it at all should answer the question that first popped into your head, “Has he no shame?”, to which the answer is a resounding no.
It all started in Starbucks on Street Lane, as all things Claire Garside are wont to do. You can read Claire’s motivations and thinking behind the project on her blog post. I can’t remember exactly how the conversation went, probably because I was still hyperventilating, having, to paraphrase Withnail, “gone cycling by mistake”. Anyway, by the end of it I’d left with one of these
XLoBorg is a motion and direction sensor for the Raspberry Pi. The plan was that we, and I say we with the hyperventilating caveat still fresh in your minds, would learn to dance using some of the ideas from Tim Ferriss’ The 4-Hour Chef, in particular exploring his ideas around Meta-Learning.
So how did I end up with the rather fetching headband, I hear you ask? It all started when Claire mentioned that the Ten Centre had a Google Glass; at that moment the project just expanded to incorporate Glass.
what a dance looked like from the dancer’s point of view.
what a dance felt like from the dancer’s point of view.
The Google Glass would give us one POV and a Raspberry Pi headcam could give us the other, while the same Raspberry Pi with the XLoBorg would give us a record of the motion, direction and G-force exerted on the dancer. Thus from humble beginnings I give you the “Dance, sensor, camera thingymebob Pi™”.
Getting the thingymebob™ working
As I’d done some time-lapse work with the Pi camera before, the initial plan was to capture a series of pictures from the camera on the headband and make a time-lapse video out of the resulting stills. So testing was required, hence the image at the start of this post. Testing pointed out the first real problem with the endeavour: the shutter speed was too slow and the images were blurry, oh, and they were 90° off.
So the first solution involved changing the exposure mode of the camera to sports, which forces a faster shutter speed, and adding rot 90 to rotate the resulting image.
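In raspistill terms that change looks something like the sketch below; the output filename and timings are invented, only the sports exposure mode and the rotation come from the description above:

```shell
#!/bin/sh
# Hedged sketch of the fix: sports exposure mode for a faster shutter,
# plus a 90-degree rotation. Output name and timings are made up.
CAPTURE_CMD="raspistill -ex sports -rot 90 -t 30000 -tl 500 -o frame_%04d.jpg"

# Only run the capture where the Pi camera tools actually exist.
if command -v raspistill >/dev/null 2>&1; then
  $CAPTURE_CMD || true
else
  echo "$CAPTURE_CMD"   # off the Pi, just show what would run
fi
```

Here -t 30000 -tl 500 would grab a still every 500ms for 30 seconds, which is the time-lapse shape the original plan called for.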
Modifications made, I ended up with dance_capture.sh.
dance_capture.sh became the slightly less documented 90frames.sh, with a sleep to give the dancer time to get into position before recording.
As the blog explains, the 90fps mode is limited to 640x480, which is more than enough for our little experiment.
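The script itself isn’t reproduced here, but from the description 90frames.sh would have had roughly this shape; the output name, pause length and recording duration are my assumptions:

```shell
#!/bin/sh
# Hedged sketch of 90frames.sh: a pause so the dancer can get into
# position, then a 640x480 90fps capture. Names and durations assumed.
OUTPUT=dance90fps.h264
RECORD_CMD="raspivid -w 640 -h 480 -fps 90 -t 600000 -o $OUTPUT"

sleep 2   # give the dancer time to get into position (real length unknown)

if command -v raspivid >/dev/null 2>&1; then
  $RECORD_CMD || true
else
  echo "$RECORD_CMD"   # off the Pi, just show what would run
fi
```

With -t 600000 that records for ten minutes, long enough to cover a performance and be stopped by hand afterwards, which matches the one-shot init.d arrangement described earlier.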
Hack is the only word I can use to describe what I did to xloborg.py, which came from the PiBorg examples. A snippet writing the readings out was my only alteration, and proper programmers will be able to spot why it took me an age to find xloborg_results.
xloborg_results did give me a 1.1MB file full of readings: X, Y, Z, mX, mY, mZ and T values, so for the moment I’m happy that it worked. The next challenge is to represent this data in a meaningful way, so I’ll be looking at gnuplot to do that.
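The alteration amounted to a logging loop of roughly this shape. This is a hedged reconstruction using the PiBorg XLoBorg calls (Init, ReadAccelerometer, ReadCompassRaw, ReadTemperature); the sample interval, the line format and the off-Pi fallback are my assumptions, not the original code:

```python
# Hedged sketch of the xloborg.py alteration: log the accelerometer
# (X, Y, Z), magnetometer (mX, mY, mZ) and temperature (T) readings
# to a results file, one comma-separated line per sample.
import time

try:
    import XLoBorg  # the PiBorg library, present on the Pi
    XLoBorg.Init()
    read_accel = XLoBorg.ReadAccelerometer
    read_mag = XLoBorg.ReadCompassRaw
    read_temp = XLoBorg.ReadTemperature
except ImportError:
    # Not on a Pi: stub the sensors with fixed values so the sketch runs.
    read_accel = lambda: (0.0, 0.0, 1.0)
    read_mag = lambda: (100, -50, 200)
    read_temp = lambda: 21

with open("xloborg_results", "w") as log:
    for _ in range(10):  # the real script looped until killed
        x, y, z = read_accel()
        mx, my, mz = read_mag()
        t = read_temp()
        log.write("%f, %f, %f, %d, %d, %d, %d\n" % (x, y, z, mx, my, mz, t))
        time.sleep(0.01)  # sample interval assumed
```

Writing to a relative path like this is also a plausible reason the results file was hard to find: it lands wherever the init script happened to start the process, not next to xloborg.py.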
So I went for a quick bike ride, just to test out Glass. It was either that or annoy my family all evening by enunciating at regular intervals “ok glass. Take a photo” or “ok glass. Record a video”, so I figured I’d do that in public on a bike, as you do.
Wyke Beck Way
I trundled off along the Wyke Beck Way, talking to myself/Glass as I went. My fears of the Glass slipping off and being crushed under my wheels proved unfounded; it fitted rather snugly under my helmet. I had some fun taking a few photos, “ok glass. Take a photo”.
I had some fun sending messages to my wife, “ok glass. Send a message to”. The voice recognition worked pretty well considering my huffing and puffing, but you can judge for yourself.
No write-up about Google Glass would be complete without the obligatory “ok glass. Record a video” action sequence, so without further ado I give you the imaginatively titled.