Have you ever been in a situation where you have searched fruitlessly for something? Now imagine you can’t even see the objects around you. How would you like to be able to ask your smartphone to find the object for you, just like Aladdin’s genie?
Our engineers asked themselves this seemingly simple question, and have tried to answer it by designing a smartphone app called LightHouse.
Making technology accessible to all
Who hasn’t misplaced their keys or put something down without thinking about it, and ended up looking for it for a few minutes, or even hours, afterwards?
For a sighted person, this needless waste of time is usually just an annoyance. But for a visually impaired person, it is a whole different challenge.
People with visual disabilities overcome this difficulty by using their memory to find their everyday objects, such as keys or wallets.
However, memory deteriorates with age, making it increasingly hard to remember where you put something.
Our engineers and researchers, who try to minimise the time spent searching for their own mislaid objects, have been looking into this problem. They asked themselves “what if we could design a smart application that would enable anyone to find an object just by asking their smartphone?”
Meeting a real need
For the application to be of use to everyone, our team decided to target a group for whom it would be of real benefit on a daily basis: people with visual impairments.
In France, 1.7 million people live with a vision disorder, 90% of them over the age of 65. And according to the WHO, the number of visually impaired people is expected to double by 2050. So it’s about time something was done!
Easier said than done
The initial goal was to be able to find a lost object simply by asking your smartphone a question.
To achieve this, the team focused on breaking the problem down. Drawing on the members’ respective areas of expertise, several candidate solutions were explored, leading to the first prototype app, named LightHouse.
How does it work?
When launched, the LightHouse app uses a voice synthesiser to ask the user which object they are looking for. The speech recognition system captures the name of the target object and a search of the user’s surroundings begins.
The user is given spoken instructions on how to position their smartphone, so that the camera can be oriented efficiently to scan the area.
The scan ends when the target object is detected and the user is notified by a beep or vibration. They are then guided to the target.
If nothing is detected, the search never times out: the app lets the user continue scanning other parts of the home for as long as they need.
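The scan-until-found loop described above can be sketched in a few lines. This is a hypothetical illustration, not the LightHouse code base: the frame source, the `detect` callback, and all names here are assumptions standing in for a real camera feed and a mobile object-detection model.

```python
from typing import Callable, Iterable, List, Optional, Tuple

# A detection is a (label, bounding box) pair; the box is (x, y, width, height).
Box = Tuple[int, int, int, int]
Detection = Tuple[str, Box]

def scan_for_object(
    frames: Iterable[object],
    target: str,
    detect: Callable[[object], List[Detection]],
) -> Optional[Tuple[int, Box]]:
    """Run the detector on each camera frame until the target object appears.

    Returns (frame_index, bounding_box) on success, or None if the target was
    never seen -- the caller decides whether to keep scanning, mirroring the
    app's no-timeout behaviour.
    """
    for i, frame in enumerate(frames):
        for label, box in detect(frame):
            if label == target:
                return i, box  # found: this is where the app would beep or vibrate
    return None

# Toy stand-in for a real detector (e.g. a mobile-friendly CNN): here each
# "frame" is simply a precomputed list of detections.
def fake_detect(frame):
    return frame

frames = [
    [("chair", (0, 0, 50, 80))],
    [("keys", (120, 40, 30, 20)), ("table", (10, 10, 200, 90))],
]
print(scan_for_object(frames, "keys", fake_detect))  # (1, (120, 40, 30, 20))
```

In a real app the guidance step would then use the bounding box's position within the frame to steer the user toward the object.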
Get straight to the point
For the moment, the LightHouse app runs on a PC to take advantage of its greater computing power; an Android version is in development. The first user tests are in progress and should allow the app to be released in spring 2020.
Visually impaired people will not be the only potential beneficiaries. The device will also be aimed at people with memory or concentration problems.
Seniors and people with Alzheimer’s disease may also find LightHouse helpful. These are some of the potential target groups that the team is working on.
Meanwhile, the team is continuing to look for new ideas. It is already working on other projects, to adapt the device and technology to completely different contexts.
The app and its promising services are currently being studied with a view to being deployed in public administrations.
Who are our engineers?
Great people, for starters, but also highly skilled. This small Agile team is made up of a product owner (Clément, the youngest member), two data scientists/developers (Ala, queen of shopping in her spare time, and Ouassim the gourmet) and a project manager/scrum master (Elsa, also known as the Snow Queen), along with Aymeric, the dynamic business leader, and Paul (also known as Lord or Sir Paul), the technical director. They have all been working together to build the app, based on an idea by Frédéric, the consultant who initiated the project. The team is also supplemented on an ad hoc basis by keen, technology-mad engineers such as Khalid. They all share a taste for research and an appetite for complex subjects.