The sudo:immerse Hackathon

November 20, 2017 23:17 +0000

Months ago, I signed up for a local hackathon and then promptly forgot about it. Then, the night before I was due to attend, I realised that it was actually a VR hack, and considered not going. I'm a web guy after all, and didn't fancy the idea of using Windows to play with Unity for two days.

Christina convinced me to go, and I'm so glad she did.

It was a great event. We took over the RealVNC offices for 48 hours, first to learn about the capabilities of emerging VR hardware, and then to attempt to build something useful with it. I played with an HTC Vive and a Microsoft HoloLens, and was introduced to a woman from Give Vision who inspired some of us to develop a project for the visually impaired.

I fell into a team of brilliant, committed people, and early in the process we realised that we all more or less shared the same idea for the hack:

We were going to build a tool to first help diagnose people with macular degeneration, then build another tool to help those suffering from it cope with the disease.

Our team comprised six people, all of whom just showed up for the event looking for something to do. In fact, nearly every other person at the event arrived with a pre-defined team and product in mind. We were effectively the leftovers.

We broke into three sub-teams of two people each:

  • Luke & George worked together to build the test front-end and coping tool respectively
  • Peter & Clare researched the industry, developed use-cases, and prepared a presentation
  • Ullash and I wrote the mapping system for the test as well as the algorithm to zero-in on a user's blind spot at higher and higher resolutions.
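
Our half of the hack is the part most easily sketched in code. I won't pretend the snippet below is what we actually wrote; the cell structure and the `showDotAndWait` helper are illustrative shorthand, but the coarse-to-fine refinement went roughly like this:

```javascript
// Coarse-to-fine blind spot mapping (illustrative sketch, not the hack's code).
// The visual field starts as a coarse grid of cells; any cell where the user
// misses the dot is split into quadrants and re-tested at a finer resolution.
async function refineBlindSpots(showDotAndWait, cells, minSize) {
  const missed = [];
  for (const cell of cells) {
    // showDotAndWait() flashes a dot at the cell's centre and resolves to
    // true if the user tapped in time, false if they never saw it.
    if (!(await showDotAndWait(cell))) missed.push(cell);
  }
  if (missed.length === 0 || missed[0].size <= minSize) {
    return missed; // these remaining cells approximate the blind spot(s)
  }
  // Split every missed cell into four quadrants and recurse.
  const finer = missed.flatMap(c => [
    { x: c.x,              y: c.y,              size: c.size / 2 },
    { x: c.x + c.size / 2, y: c.y,              size: c.size / 2 },
    { x: c.x,              y: c.y + c.size / 2, size: c.size / 2 },
    { x: c.x + c.size / 2, y: c.y + c.size / 2, size: c.size / 2 },
  ]);
  return refineBlindSpots(showDotAndWait, finer, minSize);
}
```

Each pass halves the cell size, so a handful of passes takes you from "somewhere in this quadrant" to a map fine enough to be worth sending to an optometrist.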

By 11pm Saturday night, each sub-team had more or less completed its part, and by 3pm on Sunday we were ready to present.

...and we won.

The finished product is effectively two separate tools, both focused on helping people with macular degeneration:

  • The test is a simple web app (written in JavaScript) that runs on any phone. It's designed so that you can take your phone, plug it into any Google Cardboard device, and like magic you have an eye test that maps your blind spot(s) and will even email that map to your optometrist.
  • The visual aid works the same way, via Google Cardboard, but takes your blind spot map as input. It taps into your phone's camera to give you a real-time view of the world, literally bending the image around your blind spots to help you see (a rough sketch of that idea follows this list).
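
To make "bending the image" slightly more concrete, here's a minimal sketch of one way it can be done, assuming a single circular blind spot centred at (bx, by) with radius r in canvas pixels. The function and the particular remapping are mine for illustration; the real aid works from the full blind spot map produced by the test:

```javascript
// Minimal sketch (assumptions: one circular blind spot, <canvas> already sized).
// Scene content that would land inside the blind spot is squeezed into the
// visible ring just outside it, so nothing is completely hidden.
async function startVisualAid(canvas, bx, by, r) {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  const video = document.createElement('video');
  video.srcObject = stream;
  video.muted = true;
  video.playsInline = true;
  await video.play();

  const ctx = canvas.getContext('2d');
  const { width, height } = canvas;

  function drawFrame() {
    ctx.drawImage(video, 0, 0, width, height);
    const src = ctx.getImageData(0, 0, width, height);
    const dst = ctx.createImageData(width, height);

    for (let y = 0; y < height; y++) {
      for (let x = 0; x < width; x++) {
        const dx = x - bx, dy = y - by;
        const d = Math.sqrt(dx * dx + dy * dy);
        let sx = x, sy = y;
        if (d > r && d < 2 * r) {
          // Compress the whole 0..2r disc into the visible r..2r ring,
          // so what was hidden behind the blind spot reappears beside it.
          const sd = (d - r) * 2;
          sx = Math.round(bx + (dx / d) * sd);
          sy = Math.round(by + (dy / d) * sd);
        }
        const si = (sy * width + sx) * 4;
        const di = (y * width + x) * 4;
        for (let c = 0; c < 4; c++) dst.data[di + c] = src.data[si + c];
      }
    }
    ctx.putImageData(dst, 0, 0);
    requestAnimationFrame(drawFrame);
  }
  requestAnimationFrame(drawFrame);
}
```

Doing this per pixel in JavaScript is slow; in practice you'd push the warp into a WebGL shader, but the readable version makes the trick obvious.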

The finished products have all manner of benefits:

  • Reducing the cost of medical care by cutting down on routine testing visits.
  • Improving the mobility and independence of those suffering from the disease.
  • Increasing the amount of data collected on this disease through historical charting of its progression.
  • Increasing understanding, by allowing others to see what their relatives see using our app.

I'm really proud of the team. We built a product that's not only useful, but accessible. The total cost of this thing is that of a smartphone + $5 for Google Cardboard. This could be deployed around the world to help detect signs of macular degeneration literally years earlier, allowing treatment before it progresses too far. It'll help parents stricken with the disease keep tabs on potential signs in their children (this is a genetic disorder), and all of it is done with a phone and the price of a Starbucks coffee.

Our prize was a Google Home, one for each of us. Honestly though, I don't think any of us much cared about the prize at the end of the day. We were exhausted and thrilled at what we were able to build in such a short amount of time.

The finished product is in a state one might expect from a hackathon: patched & working, but in no way ready for public use. The code is on GitHub and there are a few live samples if you wanna give it a try:

  • Moving car: Tap the screen to see what the world is like for someone with MD. The panel on the left is what they normally see. The one on the right is with our video distortion.
  • Live stream: You have to enable your camera permissions for this one, but it will demonstrate how the app works in real time. Tap the screen to switch through the few modes we set up.
  • The test: The actual test. Dots appear at random locations and at random times; tap the screen to record the fact that you saw one. Hits & misses are logged in the app and mapped internally (a rough sketch of that loop is below). PDF generation works, but it's flaky.
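
For the curious, the dot-and-tap loop amounts to something like the following. The element ID, the timings, and the cell shape here are assumptions for illustration rather than the app's actual code; it slots into the refinement sketch from earlier in the post:

```javascript
// Flash a dot somewhere in the given cell after a random delay, then wait for
// a tap. Resolves true (hit) or false (miss). Illustrative sketch only.
function showDotAndWait(cell, windowMs = 1500) {
  return new Promise(resolve => {
    const dot = document.getElementById('dot');  // assumed <div> styled as a dot
    const delay = 500 + Math.random() * 2000;    // dots appear at random times

    setTimeout(() => {
      dot.style.left = `${cell.x + cell.size / 2}px`;
      dot.style.top  = `${cell.y + cell.size / 2}px`;
      dot.style.display = 'block';

      const finish = seen => {
        clearTimeout(timer);
        document.removeEventListener('touchstart', onTap);
        dot.style.display = 'none';
        resolve(seen);                           // true = hit, false = miss
      };
      const onTap = () => finish(true);
      const timer = setTimeout(() => finish(false), windowMs);
      document.addEventListener('touchstart', onTap);
    }, delay);
  });
}
```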

What happens next is still unclear. The product as-is isn't ready for the public, but no one person on the team is really capable of picking up the whole thing and running with it. Give Vision might take it over, or maybe we'll all get together in a few weeks and polish it up a bit. I don't know.

The organisers said they'd be doing another event like this in March next year though. I hope to attend that one as well.

Update: Peter Fuller, one of my teammates for the hack, has written his own post.

Comments

Luke
21 Nov 2017, 12:03 a.m.

Great write-up and fantastic working with you!

__Luke__
