10-Year-Old Develops AI-Powered Glasses to Help Visually Impaired Children, Earns Sponsorship from Emotiv, a Global Brain-Computer Interface Technology Company


"What I have learned through this process is that you do not need to be an adult, a professional, or an expert to work on a problem that matters. You just have to care enough to keep going."

Texas Insider Report: AUSTIN, Texas (Leander, TX) - While many students use artificial intelligence to create cartoon portraits or edit a paper, Lucas Tudor is using it to change lives.

The 10-year-old fifth-grader at Harmony Science Academy-Leander recently earned first place at his campus science fair and a first-place ribbon at the Greater Austin Regional Science and Engineering Fair (GARSEF) for developing AI-powered computer vision glasses, A Eyes, designed to help children with visual impairments navigate daily activities and improve their quality of life.

Before Lucas ever began creating A Eyes, he knew he needed more than an idea; he wanted perspective. "Before I could build anything, I wanted to understand what it actually feels like to be visually impaired. I decided to experience it myself," he said. Blindfolded, he spun around his home until he lost his sense of direction and attempted to find the Christmas tree. What should have been a simple task took him 45 minutes. He walked into walls, crouched to feel whether he was standing on carpet or hardwood and listened for familiar sounds like the refrigerator or a ceiling fan to guide him. At one point, he grabbed what he thought would be a helpful cane, only to discover later it was a model Santa Claus. More lasting than the disorientation, he said, was the emotion. "I was frightened of hurting myself or breaking something. I felt frustrated, and at several points, I wanted to give up entirely."

His personal experience transformed Lucas's project from a science fair concept into a mission. After learning that more than 7 million Americans are visually impaired, including nearly 6.8% of children under 18 with a diagnosed vision condition, Lucas realized most tools are designed for adults. They are often too heavy, too complex and not built for how children navigate the world. "I realized that what I had felt for 45 minutes in a familiar environment is what millions of visually impaired children experience every single day, and I wanted to do something about it," he said.

Lucas conceptualized, researched, designed, and engineered his A Eyes and then conducted a controlled experiment. His goal was to test whether real-time object detection, combined with audio feedback, could meaningfully improve users' navigation.

He developed the A Eyes using a Raspberry Pi Zero 2 microcomputer with a 128 GB SD card to store the operating system and software. An Arducam 12 MP wide-angle camera, connected with a flex extension cable, visually captured the surrounding environment. A portable battery pack powered the system and was safely housed in a protective case. A Bluetooth earpiece delivered audio feedback to alert wearers to nearby obstacles. Lucas mounted the components on custom 3D-printed frames with nonprescription lenses to resemble glasses.

On the software side, Lucas downloaded Raspberry Pi OS from the Raspberry Pi Foundation website and installed the necessary Python packages. Using PyCharm and guidance from ChatGPT, he developed computer vision and text-to-speech code and stored it in a GitHub repository before transferring it to an SD card. The program detects obstacles in real time and converts that data into spoken alerts.
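The article does not publish Lucas's code, so the following is only an illustrative sketch of the core idea it describes: turning real-time object detections into short spoken alerts. The detection tuples and function names here are stand-ins; a real build would take detections from the camera's vision model and hand the phrases to a text-to-speech engine.

```python
# Illustrative sketch (not Lucas's actual code): convert object detections
# into short spoken alerts. A real system would get detections from a
# camera + vision model and send each phrase to a text-to-speech engine
# playing through the Bluetooth earpiece.

def detection_to_alert(label, center_x, frame_width):
    """Turn one detection into a phrase based on where it sits in the frame."""
    third = frame_width / 3
    if center_x < third:
        direction = "on your left"
    elif center_x > 2 * third:
        direction = "on your right"
    else:
        direction = "ahead"
    return f"{label} {direction}"

def alerts_for_frame(detections, frame_width=640):
    """Build one alert phrase per (label, center_x) detection in a frame."""
    return [detection_to_alert(label, cx, frame_width) for label, cx in detections]

if __name__ == "__main__":
    frame = [("chair", 100), ("table", 320), ("door", 600)]
    for phrase in alerts_for_frame(frame):
        print(phrase)  # a TTS engine would speak these aloud
```

Dividing the frame into left/center/right thirds is one simple way to map a bounding-box position to a direction a child can act on immediately.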

To test his hypothesis, Lucas recruited three student volunteers. Each participant completed a short obstacle course under four conditions: blindfolded with no assistive device, blindfolded with a cane, blindfolded with AI-powered glasses, and fully sighted as a control. After each trial, participants rated their confidence and stress levels on a 1-10 scale. Each scenario was repeated three times per child, and Lucas recorded the data in a spreadsheet to calculate averages and compare results.
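The spreadsheet arithmetic described above, averaging each child's repeated ratings per condition and then comparing conditions as percentages, can be sketched in a few lines of Python. The ratings below are made-up placeholders to show the calculation, not the experiment's real data.

```python
from statistics import mean

# Hypothetical 1-10 confidence ratings (3 trials x 3 children per condition);
# these numbers are illustrative placeholders, not Lucas's actual data.
ratings = {
    "no_device": [3, 4, 3, 2, 4, 3, 3, 4, 2],
    "cane":      [5, 6, 5, 4, 6, 5, 5, 5, 4],
    "glasses":   [7, 8, 7, 6, 8, 7, 7, 8, 6],
}

def percent_change(baseline, treatment):
    """Relative change of treatment over baseline, in percent."""
    return (treatment - baseline) / baseline * 100

# Average each condition, then compare the glasses against the baselines.
averages = {condition: mean(values) for condition, values in ratings.items()}
print(averages)
print(f"glasses vs. no device: "
      f"{percent_change(averages['no_device'], averages['glasses']):.0f}% change")
```

The same `percent_change` comparison, run on the real spreadsheet averages, is the kind of figure reported in the results below.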

Lucas's findings were compelling and in line with his own experience. Participants reported being 62% more confident when using the A Eyes glasses than when using no device and 37% more confident than when using a cane. Stress levels decreased by 62% compared with no device and by 50% compared with using a cane. In the control scenario, when participants were fully sighted, they reported a confidence level of 10 out of 10 and no stress, thereby validating the experiment's design.

"During the experiment, the participants were unable to identify the objects in the obstacle course when blindfolded using no device at all, and also when using a cane," Lucas said. "However, the participants were able to identify the objects in the obstacle course when using my system. This shows that my system helped them understand their surroundings, increasing their confidence level."

Volunteers also suggested improvements, including adding a mute feature to pause audio while speaking with others, thereby demonstrating both the system's effectiveness and opportunities for refinement.

After winning his school science fair, Lucas immediately began looking ahead to GARSEF. He wanted to enhance A Eyes so that a user could control the glasses with their brain. While researching how to do this, he learned about electroencephalography, or EEG. EEG devices measure electrical activity in the brain and allow users to interact with computers using brain signals.
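As a rough illustration of EEG-based control, and not Emotiv's actual SDK or Lucas's design, a headset's noisy signal could be smoothed and thresholded so that a deliberate spike in an attention-like value toggles a feature, such as the mute his volunteers requested. Every name and number here is hypothetical.

```python
# Hypothetical sketch of EEG-driven control; none of these names come from
# Emotiv's real API. A headset would stream a numeric signal; a moving
# average damps the noise, and crossing a threshold fires a mute toggle.

def smooth(values, window=3):
    """Moving average over the last `window` samples to damp noisy readings."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def mute_events(focus_stream, threshold=0.7):
    """Emit one event each time the smoothed signal rises above threshold."""
    events = []
    above = False
    for v in smooth(focus_stream):
        if v >= threshold and not above:
            events.append(True)
            above = True
        elif v < threshold:
            above = False
    return events

if __name__ == "__main__":
    stream = [0.2, 0.3, 0.9, 0.95, 0.4, 0.2, 0.8, 0.9]
    print(mute_events(stream))  # one sustained spike -> one mute toggle
```

Requiring the signal to drop back below the threshold before a new event prevents a single sustained spike from toggling the mute repeatedly.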

The catch? The device's cost was beyond Lucas's $9-a-week allowance, so the student decided to try his luck and wrote to several CEOs of companies that make the devices to ask for assistance. One of those companies was Emotiv, whose MN8 wireless wearable earbuds use built-in EEG sensors to measure brain activity in real time.

"For the next stage of my project, I would like to improve my device by adding EEG-based controls," Lucas wrote in a letter to CEO Tan Le. "The MN8 will allow me to explore EEG input and audio feedback in a much more integrated and effective way."

Citing his weekly allowance, Lucas offered Le three options: Emotiv could sponsor him, lend him a device, or develop a plan under which he could do some work for the company in exchange.
In response, Le personally offered to sponsor an MN8 unit, provide Lucas with lifetime developer access, and assign one of Emotiv's engineers to coach him.

Lucas is now expanding his assistive system to investigate how brain-computer interface technology can further improve A Eyes for its next generation. With an upgraded A Eyes and a professional neurotechnology partner in his corner, the fifth-grader from Leander is proving that age is no barrier to meaningful innovation — or to changing the lives of millions of children who navigate the world without the gift of sight.

"What I have learned through this process is that you do not need to be an adult, a professional, or an expert to work on a problem that matters," said the young innovator. "You just have to care enough to keep going."
02.23.2026