How it Began
Our team, affectionately known as Variant Vision, is made up of two Yale undergraduates who dedicated a summer to a full-time engineering project. Through this student-driven venture, we gained hands-on, formative experience in design thinking. As summer fellows in the Yale Center for Engineering Innovation and Design (CEID) fellowship, we put a lot of thought into what we should devote ourselves to and what we wanted to get out of the experience. For both of us, new exposure to the world of design and innovation, paired with a strong appetite for learning, meant that we could take our project in any number of directions. The team is composed of myself and Oyindamola Alliyu (Yale '20).
Our main goal when we started working together was to create a product that would ultimately have a positive impact on someone’s life. We did not simply want to invent an impressive gadget that offered no societal value. It was also important to us that the project take us out of our comfort zone and force us to learn new skills and grow as designers, inventors, and thinkers. With these goals in mind, we decided to focus our efforts on expanding the field of assistive technology, and in particular on finding solutions to the problems and obstacles faced by people who are blind or visually impaired.
Ideation and Project Development
The development of this invention was propelled by an underlying question: how can we improve the navigation and mobility experience of a blind person, both in familiar settings and in dynamic environments? During our research, we reached out to a blind couple in the area, Philip and Honorata, who introduced us to the two most common ways a blind person handles navigation. The first is surprisingly low-tech: mobility instructors give the same age-old advice when asked how to avoid smashing one’s face into a stop sign: wear a long-brimmed hat. The hat acts as a preemptive barrier that reduces the risk of injury from inevitable head-on collisions. But this solution is not good enough. Philip, an expert white cane user and an everyday wearer of a long-brimmed hat, recounted that he had broken his nose three separate times from running into things. He told us about others he knew who were repeatedly injured by falling or bumping into low-hanging obstructions. Early prototype 1.2 (red) shown at the right.
The majority of blind people use a traditional white cane to navigate through their environment and avoid obstacles – the obstacles that the cane can detect, that is. We decided to create a device that protects the rest of the body, from the waist up. Our vessel of choice: glasses. Positioned on the head, the device sits at an ideal vantage point for the range we want to cover. An ultrasonic sensor on the front of the frame detects any object within the cone of detection, spanning from the knees to a foot above the head and from shoulder to shoulder. Based on the proximity of the closest obstacle, the glasses relay audio or haptic feedback to the wearer, which increases in frequency and intensity as one gets closer to the obstacle. Prototype 3.4 (white) shown at left, with a custom 3D-printed front cover.
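The proximity-based alert behavior described above can be sketched as a simple mapping from a distance reading to a pulse interval and intensity. This is only an illustrative sketch, not the team's actual firmware: the range thresholds, the linear scaling, and the function name are all assumptions for the sake of the example.

```python
# Illustrative sketch (assumed values, not the actual device firmware):
# map an ultrasonic distance reading to an alert pulse interval and
# intensity, so alerts get faster and stronger as an obstacle gets closer.

MAX_RANGE_CM = 200   # assumed far edge of the detection cone
MIN_RANGE_CM = 30    # assumed "danger zone" distance

def feedback_for_distance(distance_cm):
    """Return (pulse_interval_ms, intensity 0..1), or None if nothing is in range."""
    if distance_cm >= MAX_RANGE_CM:
        return None  # no obstacle close enough to warn about
    # Clamp into the active band and normalize: 0.0 = far edge, 1.0 = danger zone
    d = max(distance_cm, MIN_RANGE_CM)
    closeness = (MAX_RANGE_CM - d) / (MAX_RANGE_CM - MIN_RANGE_CM)
    interval_ms = 1000 - int(900 * closeness)   # slow pulses far away, rapid up close
    intensity = 0.2 + 0.8 * closeness           # gentle far away, strong up close
    return interval_ms, intensity
```

For example, an obstacle at the edge of the danger zone would trigger rapid, full-strength pulses, while one near the far edge of the cone would produce slow, gentle ones.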
The haptic feedback mechanism uses small linear resonant actuator motors to deliver precise vibrations just below the ear. This works well in noisy environments, where the wearer can still feel the device working amid a lot of external stimuli. Also situated just below the ear, the audio feedback uses bone conduction technology, which lets us direct warning beeps straight into the inner ear. Though similar to headphones, this approach deliberately avoids interfering with the user's sense of hearing, which is needed to discern information about cars, people, and other hazards. Prototype 4.0 (blue) shown at right, now with the electronics fully encased in the frame.
By putting our technology right into a pair of glasses, something that most blind people use anyway and an accessory that even sighted people wear, we create a design that takes societal comfort into account. Other sensory devices such as shirt clips or bulky necklaces do exist, but like previously attempted smart canes, they have not been adopted because they become just another gadget a blind person has to carry around. Because the head is very stable when walking, our sensors pick up accurate, consistent readings and relay information about an obstacle well before the wearer needs to worry about hitting it. Our most recent design, modeled fully in SolidWorks, is built so that all of the electronics – microcontroller, battery, motors, sensors, and wires – are housed inside the frames themselves. Final CAD rendering of prototype 5.
Field Testing and Current Prototype
We have tested our prototypes both by ourselves and with Philip and Honorata, who have given us very enthusiastic feedback. To test the functionality of both audio and haptic feedback mechanisms, we had volunteers try out different iterations of code sequences, alert tones, vibration durations, and frequency patterns. By making continuous modifications to our devices based on the feedback we received, we were able to settle on the style of sensory alerts that satisfied most of our sighted volunteers, and more importantly, Philip and Honorata. We continue to brainstorm ways to make the glasses better, more comfortable, and more useful for potential users. To the right, blind volunteer Honorata tests prototype 4.0, moving her head back and forth and using her cane to confirm that the glasses detected a hazard.
The most recent design is fully functional, wearable, and rechargeable. The CAD renderings were sent to the Yale CEID’s 3D printers, and the final result was clasped together with the help of some glue. To help imagine where this could go even further, I took to Photoshop to create a sleek, futuristic, and even athletic rendering of the glasses. This was a huge project that I learned a lot from and had a ton of fun creating – I am always happy to answer any questions about it! A few more images of field tests, renderings, and the final prototype are available below. You can also view my other projects!