Our assistive technology was meant to give our user, Winston, command-and-control use of his PC.
Winston has a neurological disorder that impairs gross and fine motor control, including clear and consistent vocal expression. He is not cognitively impaired at all, but we needed to design a system forgiving and clean enough to accommodate his tremors.
Some approaches we tested when I visited Winston:
- To potentially bypass a computer-vision solution: I made a paper prototype from my notebook and had Winston try to touch scribbled buttons. Verdict: a touchpad wouldn't be manageable due to his tremors.
- To assess his color vision: I went through a rainbow of hues, placing colors next to one another to see how well he could follow visual affordances. Verdict: color wouldn't aid us at the distance we needed.
- To test aural feedback: on a prototype with movable buttons, I spoke to Winston while repeatedly rearranging the buttons as hit points. Verdict: aural feedback could be very helpful, but the risk of fatigue when performing even simple macros was very high.
- For free play: a prototype that turned Winston's hands into a virtual accordion was a big hit! Verdict: this would be a great option for assisting with his physical therapy, and his range of motion with that prototype was greater than ever before.
In the end we didn't build the system for Winston, but we showed our gesture framework at the Maker Faire in San Mateo. This project had a lot of good intentions but lacked a strong design process: we never explored more than a handful of solutions, we didn't paper-prototype with our user, and we didn't seek the advice of subject-matter experts. In many ways our failure was our best learning moment; I left this project with a new respect for the design process.