Well, it has been almost a year since my previous post here, very much living up to the “occasional blogger” title on my Twitter profile.
I didn’t think there was going to be a Hackference 2017, what with it being ‘the last Hackference’ last year, but here we are.
I made it to Euston station at 6:44am on the Friday. The train I had booked was scheduled for 6:43 and, annoyingly, left on time, so I had to buy another ticket. £58 poorer than I was hoping to be that day, I got on the next train and still made it to The Studio (the venue for the day) on time.
I’m not going to say much about the talks: they were all really good, inspiring everyone to think differently about programming and design, and to try out some of the newer Web technologies.
Here’s Oxford’s very own Ben Foxall demoing a combination of the Web Audio API, WebGL and Nexmo’s voice API. In short, a visualisation of a phone call between Ben and his mum, happening in real time!
The next day was the hackathon: 24 hours of working on whatever you want, with whomever you want, from midday on Saturday to midday on Sunday. A few sponsors ran challenges to help focus the direction of the hacks, including Microsoft, who were encouraging the use of their Cognitive Services APIs – a collection of machine learning features that make it easy for developers to add image recognition, OCR, speech-to-text and the like to their own applications. They offered a prize to the team with the best use of their APIs.
What follows are the slides from my presentation at the end.