EvoHaX and the Accessibility MacGuffin

A few weeks ago I participated in EvoHaX, an accessibility hackathon held as part of Philly Tech Week. Ather Sharif of EvoX Labs did a wonderful job organizing it. I had other commitments during the main coding day, so we compromised and made me a judge. I also gave a little speech poking fun at their prize of a Google Chromebook. I enjoyed the experience, and I'm glad they have already said they will do it next year.

http://goodvibes.zapto.org/austin/2015-04-19_Accessibility_Macguffin.mp3

I find it funny that I have helped plan two accessibility hackathons but have not written a single line of code for either. I've had other accessibility-related commitments. Last year I spoke at the annual RubyMotion developers conference, and this year I gave a workshop at the University of the Arts as part of our new business, Philly Touch Tours (more on that soon). I met with Ather and the other planners, and we went over the whole event. Ather had an interesting idea: pair each group with a randomly assigned subject matter expert, in other words a user with a disability. This mirrors the real world – you never know when you will suddenly have to face a challenge.

On Friday, April 17th, the event began. Benjamin's Desk hosted it. We listened to several informative speeches. One professor specialized in rendering infographics for screen readers. A cool topic for sure, but he kept asking, "What do you see in this graph?" I wanted to yell out, "Nothing!" In my head I heard my high school geometry teacher saying, "You're not much of a visual learner, are you?"

On Sunday I rolled in for the judging. I met the other judges and experts. I also saw my friend Faith Haeussler and her very cute kids, who know the word "hackathon." Everyone had finished coding. I got out my MacBook Air and prepared to begin.

Before I continue, I have to explain something that it seems a lot of people don't know: blind people tend not to use Google products. Google has become synonymous with second-rate accessibility. iOS dominates the mobile and tablet space. None of my blind friends use Droid, and I mean that literally. Zero! For the desktop we use Windows or Mac OS and their respective screen readers. I don't know anyone who uses ChromeVox. Personally, I use a Mac with VoiceOver and Safari as my browser. When designing something for the blind, you must remember which platforms blind people actually use.

Because of this, I couldn't get over the prize of a Google Chromebook for each member of the winning team. It really depressed me. For a few days I lay around, lamenting that I would have to participate in an accessibility hackathon that gives away Google Chromebooks as prizes. The world will end! Then I pulled myself together and remembered that the prize doesn't really matter; all the wonderful, inspiring work does. This gave me a great idea for a speech. I composed it in my head as I waited to judge the entries.

First up, West Chester University wrote a Chrome plugin called Jumpkey for easily navigating to common places on a web page, such as the home or contact links. Interesting concept. They brought over a MacBook Pro running Chrome with ChromeVox, which I had never used. It started talking in a goofy Google voice, which made me laugh. I figured out a few keys and the plugin worked. One of the authors told me he could port it to Safari in an hour. I hope he does.
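I haven't seen Jumpkey's source, so this is only a rough sketch of the general idea in TypeScript: a content script that listens for a keyboard shortcut and moves focus to a common link so the screen reader announces it. The key bindings, link keywords, and names below are my own guesses, not theirs.

```typescript
// Hypothetical sketch of the Jumpkey concept (not the actual plugin code):
// press Alt+Shift+<letter> to jump focus to a common link on the page.

// Example keywords; the real plugin may recognize different ones.
const TARGETS: Record<string, string[]> = {
  h: ["home"],
  c: ["contact", "contact us"],
};

function findLink(keywords: string[]): HTMLAnchorElement | null {
  const links = Array.from(document.querySelectorAll<HTMLAnchorElement>("a"));
  return (
    links.find((a) =>
      keywords.some((k) => a.textContent?.trim().toLowerCase() === k)
    ) ?? null
  );
}

document.addEventListener("keydown", (event) => {
  // Alt+Shift+<letter> is an arbitrary binding chosen for this sketch.
  if (!event.altKey || !event.shiftKey) return;
  const keywords = TARGETS[event.key.toLowerCase()];
  if (!keywords) return;
  const link = findLink(keywords);
  if (link) {
    link.focus(); // moving focus lets the screen reader announce the link
    event.preventDefault();
  }
});
```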

Next, La Salle University demonstrated their project, a browser framework called Blind Helper. They admitted they needed to find a better name. Fortunately, this one worked with Safari. They designed a system for crowd-sourcing image descriptions and rendering them as alt text. I liked the idea, and the demonstration worked. However, their logo didn't have an alt tag, and the form fields did not have labels. It struck me as rather ironic: when coding an accessible platform, you should make the platform accessible! They lost a point or three for that. Still, it has potential.
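Again, I don't have their code, but the rendering half of the idea boils down to filling in missing alt text from a shared description service. Here's a rough TypeScript sketch; the endpoint URL, query parameter, and response shape are made up for illustration and stand in for whatever they actually built.

```typescript
// Hypothetical sketch of crowd-sourced alt text (not Blind Helper's code):
// for each image with no alt attribute, ask a description service and
// fill in whatever it returns.

interface DescriptionResponse {
  description?: string;
}

async function fillMissingAltText(endpoint: string): Promise<void> {
  const images = Array.from(document.querySelectorAll<HTMLImageElement>("img"));
  for (const img of images) {
    // Only touch images with no alt attribute at all; alt="" is often
    // intentional for decorative images and should be left alone.
    if (img.hasAttribute("alt")) continue;
    try {
      const res = await fetch(`${endpoint}?src=${encodeURIComponent(img.src)}`);
      if (!res.ok) continue;
      const data: DescriptionResponse = await res.json();
      if (data.description) {
        img.alt = data.description; // screen readers now announce this text
      }
    } catch {
      // Network error: leave the image as-is.
    }
  }
}

// Example usage with a made-up endpoint:
// fillMissingAltText("https://example.com/api/descriptions");
```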

Next, an all-female team of hardware hackers from Drexel stole the show with their speech-reader Bluetooth module. They designed it for those with cognitive difficulties, but it has other uses as well. They built it with an Arduino and some other components. They even tested it with NVDA, the popular free screen reader for Windows. Excellent!

St. Joe's presented a browser plugin for people with dyslexia that places icons next to familiar words. This helps the brain figure out the proper word by giving it some context. They could even make it multi-sensory. I couldn't use it, so I can't really comment, but I like the idea.

Finally, Swarthmore College presented a visual representation of which YouTube videos have captions, or rather the lack thereof. I couldn't see the graph, but they could render it in textual ways. I also grew up in Swarthmore, so I wished them well.

To vote, EvoX Labs wrote a little web app for the judges. And yes, they made it accessible. I filled out my form, and Faith read the results. After congratulating everyone, we made speeches. I called mine The Accessibility MacGuffin. A MacGuffin is an object that drives the plot of a story. The object itself doesn't matter; the story around it does. For example, the briefcase in the movie Pulp Fiction doesn't really matter. We never learn its contents. We only know that some gangsters have to retrieve it and protect it for their boss, using some rather extreme means to do so. This graphic scene demonstrates the power of a MacGuffin. Pay attention to the briefcase!

I didn’t know how people would feel about making fun of the prize, but it went over well. I hope the participants will think about accessibility in all their projects. I also hope they continue developing the projects started at EvoHaX. See you next year! Maybe I’ll actually get to write some code.