Well, it has been almost a year since my previous post here, very much living up to the “occasional blogger” title on my Twitter profile.
I didn’t think there was going to be a Hackference 2017, what with it being ‘the last Hackference’ last year, but here we are.
I made it to Euston station at 6:44am on the Friday. The train I booked was scheduled for 6:43 and, annoyingly, left exactly on time, so I had to buy another ticket. £58 poorer than I was hoping to be that day, I got on the next train and still made it to The Studio (the venue for the day) on time.
I’m not going to say much about the talks: they were all really good, inspiring everyone to think differently about programming and design, and to try out some of the new Web technologies.
Here’s Oxford’s very own Ben Foxall demoing a combination of the Web Audio API, WebGL and Nexmo’s voice API. In short, a visualisation of a phone call between Ben and his mum, happening in real time!
The next day was the hackathon. 24 hours of working on whatever you want, with whomever you want, from midday on Saturday to midday on Sunday. There were a few sponsors who ran challenges to help focus the direction of the hacks, including Microsoft. Microsoft were encouraging the use of their Cognitive Services APIs – a collection of machine learning features making it easy for developers to add image recognition, OCR, speech-to-text etc. to their own applications; and they would award a prize to the team with the best use of their APIs.
What follows are the slides from my presentation at the end.
Quite early on I was looking for a team to hack with Microsoft’s Cognitive Services, but I might have put people off when I said some of it would be in PHP.
Anyway I carried on on my own and ended up creating this site, Is It A Bench?
The idea is pretty simple, you can upload a photo to the service and it will tell you if it thinks it’s a photo of a bench, along with the text in any inscriptions.
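The bench check boils down to looking for a “bench” tag in the image analysis that Microsoft’s vision API returns. The hack itself was in PHP; this Python sketch, with a hypothetical `is_bench` helper and a made-up sample response, just illustrates the shape of the idea:

```python
def is_bench(analysis: dict, threshold: float = 0.5) -> bool:
    """Return True if the analysis contains a 'bench' tag above the confidence threshold."""
    return any(
        tag["name"] == "bench" and tag["confidence"] >= threshold
        for tag in analysis.get("tags", [])
    )

# A sample response shaped like the vision API's tag list:
sample = {"tags": [{"name": "outdoor", "confidence": 0.99},
                   {"name": "bench", "confidence": 0.93}]}
print(is_bench(sample))  # True
```

The confidence threshold is a guess; in practice you would tune it against real uploads.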
I imagine there may be a few of you who are wondering:
Well, I’ll tell you.
This is Terence. You may recognise him from the conference on Friday, or from speaking at Hackference last year.
He made this website called Open Benches.
The idea is you can upload geotagged images of benches with inscriptions and they will appear on a map.
As soon as someone uploads these images, the website posts them straight to Twitter.
That poses a potential problem. The Internet is not a nice place, and some people could upload photos that clearly aren’t benches. It would also be easier to enter the text of inscriptions if there was some way of processing the images with OCR before they are published.
I set about making a content filter. I uploaded a photo to my service which Microsoft’s vision API decided was adult content and described it as “a close up of a tattoo”. I’ll leave it to you to imagine what it was.
When my site detects inappropriate content, it responds accordingly.
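The vision API’s analysis includes an `adult` block with flags and scores for inappropriate content, so the filter is a simple check on those fields. A sketch of the kind of gate my site applies (not the actual hack code, and the score cutoff is an assumption):

```python
def allowed(analysis: dict, max_score: float = 0.5) -> bool:
    """Reject an upload the vision API's adult-content check has flagged."""
    adult = analysis.get("adult", {})
    if adult.get("isAdultContent", False):
        return False
    return adult.get("adultScore", 0.0) <= max_score

clean = {"adult": {"isAdultContent": False, "adultScore": 0.01}}
print(allowed(clean))  # True
```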
It was around this time that Joe Nash stuck a message on the Slack group looking for people to play Laser Tag. It had been such a long time since I last played, and I couldn’t resist leaping at the offer.
It was a lot more exercise than I was expecting that weekend. Also I’m terrible at Laser Tag – I was consistently the lowest scorer, but it was such good fun anyway!
When I got back I got started on the text extraction. I spent a good 20 minutes working out why OCR wasn’t working. Turns out I can’t spell OCR.
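Once spelled correctly, the OCR response comes back as text nested in regions, lines, and words, so extracting an inscription is mostly a matter of flattening that structure back into lines. A Python sketch (again, the hack was in PHP) with a made-up sample response:

```python
def extract_text(ocr: dict) -> str:
    """Flatten an OCR response (regions > lines > words) into plain text,
    keeping one line of output per OCR line."""
    lines = []
    for region in ocr.get("regions", []):
        for line in region.get("lines", []):
            lines.append(" ".join(word["text"] for word in line["words"]))
    return "\n".join(lines)

sample = {"regions": [{"lines": [
    {"words": [{"text": "IN"}, {"text": "LOVING"}, {"text": "MEMORY"}]},
]}]}
print(extract_text(sample))  # IN LOVING MEMORY
```

Joining words with spaces and regions with newlines is what keeps the line breaks of the original inscription, which mattered for the demo later.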
I was pretty much finished with the proof of concept early on, so after some interesting chats with some of the attendees, I even managed to get some sleep!
The next morning was spent making a couple of performance optimisations, prepping the demo and appreciating the fantastic food for lunch. As a last-minute idea, I tried adding a JSON response so the service could work with photos from the Open Benches website. I didn’t finish that bit in time.
I presented a live demo of what I had, and introduced the Open Benches website explaining how I could integrate the functionality.
Most of the photos that I’d submitted to Open Benches in the past were correctly identified as benches – even the close-ups of plaques which really surprised me.
Sometimes the text was extracted perfectly, even keeping lines and capitalisation…
But sometimes it wasn’t so good!
After my demo, I tweeted about its existence. Terence seemed to love it!
Once everyone had finished their demos, the sponsors left to decide on the winners. While I didn’t win the prize from Microsoft, I did win a Raspberry Pi Zero from the recruitment company Harvey Nash! During the conference, Harvey Nash had the brilliant idea of giving playing cards to the attendees. Two packs were given out and your goal was to find the other attendee with the matching card to win a prize. It was a great icebreaker and I’d love to see that more in the future.
The hackathon had everyone from students to veteran hackers; all of the genders, ethnicities and operating system preferences; and it was wonderful to see everyone simply being excellent to each other. (Apparently that’s some sort of reference to Bill and Ted; if only I knew what Bill and Ted was…)
Here’s to the next one!