Students Develop Low-Cost Wearable Device for the Visually Impaired

It uses computer vision, machine learning, and Google APIs

The EPICS in IEEE team displays posters that explain how its OurVision wearable device works. In the background are representatives from the National Association for the Blind and faculty members from the Ramaiah Institute of Technology in Bangalore, India.

Dhruv Dange

Employing computer vision techniques, students from the Ramaiah Institute of Technology’s IEEE Computational Intelligence Society chapter in Bangalore, India, developed a device to assist people who are visually impaired. OurVision is a low-cost wearable that reads text out loud to users and helps them navigate their surroundings. The goal is to help blind people advance their educational and career opportunities, as well as to help them live independently. The technology used in the device includes optical character recognition, machine learning, and Google application programming interfaces.
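The team has not published its implementation, but the building blocks it names are widely available. Below is a minimal, hypothetical sketch of an OCR-to-speech pipeline of the kind described, using the open-source pytesseract wrapper for the Tesseract OCR engine and the gTTS client for Google's text-to-speech service. The library choices, function names, and file paths are illustrative assumptions, not the team's actual code.

```python
# Hypothetical sketch of a read-aloud pipeline: camera frame -> OCR -> speech.
# Library choices (pytesseract, gTTS) are assumptions, not OurVision's actual stack.
from PIL import Image
import pytesseract          # wrapper for the Tesseract OCR engine
from gtts import gTTS       # client for Google's text-to-speech service


def read_page_aloud(image_path: str, lang_ocr: str = "eng", lang_tts: str = "en") -> None:
    """Recognize printed text in an image and save it as spoken audio."""
    page = Image.open(image_path)
    text = pytesseract.image_to_string(page, lang=lang_ocr)
    if not text.strip():
        text = "No readable text was found. Please reposition the document."
    # Save the spoken result; a wearable would play this through its speaker.
    gTTS(text=text, lang=lang_tts).save("speech.mp3")


if __name__ == "__main__":
    read_page_aloud("page.jpg")
```

Tesseract ships language packs for several of the languages listed below (for example "kan" for Kannada, "tam" for Tamil, and "hin" for Hindi), which is one way a device like this could cover multiple scripts; whether OurVision takes that route is not stated in the article.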

The IEEE CIS chapter received a US $4,400 grant for the project from EPICS in IEEE, made possible through generous donors and a partnership with the IEEE Foundation. The student team turned an idea into a working solution, collaborating hand in hand with a community partner, the National Association for the Blind (NAB) in Karnataka, India.

“With this device, visually impaired individuals can read and move around independently like their non-blind peers,” says the project lead, faculty member Megha Arakeri, an IEEE member.

A demonstration of how the OurVision wearable works. EPICS in IEEE

The ability to read under any condition and navigate surroundings

OurVision can read text out loud from books and periodicals as well as from billboards, posters, and traffic signs. It can translate text in a variety of languages, including English, Kannada, Telugu, Malayalam, Tamil, and Hindi.

The portable device helps the user read in just about any location and under most lighting conditions. If text can't be read because the user isn't holding the book or other document correctly, OurVision verbally notifies the user that the alignment is off. The device also helps the person navigate by describing nearby objects and their distance from the user.
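The article does not say how OurVision decides that a page is misaligned. One common approach, shown in the hypothetical sketch below, is to inspect the OCR engine's per-word confidence scores and prompt the user when the average drops; the confidence threshold and the spoken prompts here are illustrative assumptions, not the team's design.

```python
# Hypothetical alignment check: if average OCR confidence is low,
# ask the user to straighten the document. Threshold and prompts are assumptions.
from PIL import Image
import pytesseract
from pytesseract import Output


def alignment_feedback(image_path: str, min_confidence: float = 60.0) -> str:
    """Return a spoken prompt based on average OCR word confidence."""
    data = pytesseract.image_to_data(Image.open(image_path), output_type=Output.DICT)
    # Tesseract marks non-text boxes with a confidence of -1; keep only real words.
    confidences = [float(c) for c in data["conf"] if float(c) >= 0]
    if not confidences or sum(confidences) / len(confidences) < min_confidence:
        return "The page is hard to read. Please straighten the document and try again."
    return "Reading the page now."


if __name__ == "__main__":
    print(alignment_feedback("page.jpg"))
```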

The assistive device works with or without Wi-Fi.

Working closely with the organization, the team built 11 devices, which are stored in the NAB library and must be signed out by users. Approximately 100 students affiliated with the NAB are using OurVision.

The estimated price of the device is US $206 (17,000 rupees).

“The visually challenged students who are using it for their studies and navigation purposes have provided good feedback on the efficiency and portability of the device compared to others that they have used in the past,” says Latha Kumari, NAB mobility officer. “In order to provide this benefit to a greater number of visually impaired students, we are looking forward to collaborating with the professor and student team to develop more devices with enhanced features based on the feedback from the students. We highly appreciate and commend the efforts of Professor Arakeri and students.”

The device has not been released to the market yet. The team first plans to add features such as currency recognition, as well as the ability to recognize colors.

Learning outcomes from the design and development process

Students from the IEEE CIS student chapter say they learned a great deal during the project's design, development, and deployment. They also trained M.S. Ramaiah High School students in the design process, giving them experience in creating a technical solution for a community in need.

“The project provided me with the opportunity to apply my engineering skills to product development,” one member of the team says. “The development process involved different stakeholders, a nongovernmental organization, and high school students—which allowed for sharing experiences and feedback.”

The EPICS in IEEE mentor assigned to the project was Ruby Annette Jayaseela Dhanaraj, an AI researcher and machine learning engineer at Matilda Cloud, in Richardson, Texas.

“The students have made a great effort, and are thoroughly knowledgeable about the technology being used,” Jayaseela Dhanaraj says, adding that they “are excited to make the product even better.”

“I cherished this mentoring experience,” she says. “I am very glad that I could contribute to the technology that could impact the lives of many blind students.”

Visit the EPICS in IEEE website to learn about other projects and future proposal deadlines. To support future projects, donate here.

This article is an edited excerpt of the “EPICS in IEEE Team Completes and Deploys Assistive Device to Blind Students in Bangalore, India” blog entry published in May.

