Smart glasses open senses
Federal funding is helping to develop smart glasses for the vision impaired.
Entrepreneur Robert Yearsley is steering the future of augmented reality (AR) towards enhancing the lives of individuals who are blind or vision impaired.
Through his venture, ARIA Research, Yearsley is pioneering the adaptation of AR glasses to facilitate navigation via spatial sound, an innovation poised to revolutionise the way people with vision impairments interact with their environment.
ARIA Research (the name stands for Augmented Reality In Audio) is at the forefront of integrating machine vision and artificial intelligence (AI) into a lightweight pair of glasses.
This technology aims to translate spatial information into audio cues, allowing users to navigate spaces and locate objects with unprecedented ease.
“The computer uses machine vision and AI to understand where you are in space and how to manoeuvre around that space,” Yearsley explains.
The project is finding ways to render physical objects through sound, enhancing the user's spatial awareness and mobility.
Yearsley's approach is notably inclusive, employing individuals who are blind or vision impaired as key contributors in the development process.
This collaborative strategy ensures the technology is finely tuned to the needs of its users, from engineering aspects to product design and management.
The input from these subject matter experts has been crucial in refining the design, such as the strategic placement of cameras on the glasses to detect potential hazards.
Having progressed from an initial 12 kg helmet to sleeker, more manageable prototypes, Yearsley says he is building a product that incorporates technology seamlessly into users' everyday lives.
The challenge now lies in miniaturising the system to fit within the frame of the glasses, a task that Yearsley describes as “super exciting” and fundamental to the project's success.
A significant hurdle in the project's development is the creation of a soundscape that effectively communicates the presence and nature of objects to the user.
By teaching ARIA to recognise and prioritise objects, the team aims to translate visual information into a harmonious audio experience.
This involves sophisticated AI algorithms to ensure users can navigate their surroundings without feeling overwhelmed by information.
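The idea of prioritising objects so users aren't overwhelmed can be illustrated with a minimal sketch. The labels, weightings and cue mapping below are hypothetical, invented for illustration; they are not ARIA Research's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    angle_deg: float   # bearing relative to the wearer: -90 (left) to +90 (right)
    distance_m: float  # estimated distance in metres

# Hypothetical priority weights: hazards outrank incidental objects.
PRIORITY = {"step": 3.0, "person": 2.5, "door": 2.0, "chair": 1.0}

def salience(d: Detection) -> float:
    """Closer, higher-priority objects are more urgent to render."""
    return PRIORITY.get(d.label, 1.0) / max(d.distance_m, 0.1)

def audio_cue(d: Detection) -> dict:
    """Map a detection to a simple stereo cue: pan from bearing, volume from distance."""
    pan = max(-1.0, min(1.0, d.angle_deg / 90.0))    # -1 = hard left, +1 = hard right
    volume = min(1.0, 1.0 / max(d.distance_m, 0.5))  # nearer objects sound louder
    return {"label": d.label, "pan": pan, "volume": volume}

def top_cues(detections, limit=3):
    """Render only the most salient objects, so the user is not flooded with sound."""
    ranked = sorted(detections, key=salience, reverse=True)
    return [audio_cue(d) for d in ranked[:limit]]
```

Under this toy scheme, a step one metre ahead would be rendered before a chair four metres away, capturing the article's point that the system must decide what matters most before turning it into sound.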
ARIA Research is currently undertaking a pilot clinical trial to assess the safety and efficacy of the glasses. This trial includes simulations of real-world scenarios to evaluate the technology's impact on the users' ability to navigate and interact socially.
With support from the Medical Research Future Fund (MRFF) and a grant of $2.3 million, Yearsley's team is drawing on Australia's leading research talent and is committed to keeping this innovative technology Australian-grown.