
On Tech & Vision With Dr. Cal Roberts

Technology Podcasts

Dr. Cal Roberts, President and CEO of Lighthouse Guild, the leading provider of exceptional services that inspire people who are visually impaired to attain their goals, interviews inventors, developers and entrepreneurs who have innovative tech ideas and solutions to help improve the lives of people with vision loss.


New York, NY








Developing Big Ideas: Product Testing and Iteration

This podcast is about big ideas on how technology is making life better for people with vision loss. When we buy a product off the shelf, we rarely think about how much work went into getting it there. Between initial conception and going to market, life-changing technology requires a rigorous testing and development process. That is especially true when it comes to accessible technology for people who are blind or visually impaired. For this episode, Dr. Cal spoke to Jay Cormier, the President and CEO of Eyedaptic, a company that specializes in vision-enhancement technology. Their flagship product, the EYE5, provides immense benefits to people with Age-Related Macular Degeneration, Diabetic Retinopathy, and other low-vision diseases. But this product didn’t arrive by magic. It took years of planning, testing, and internal development to bring this technology to market. This episode also features JR Rizzo, who is a professor and researcher of medicine and engineering at NYU — and a medical doctor. JR and his research team are developing a wearable “backpack” navigation system that uses sophisticated camera, computer, and sensor technology. JR discussed both the practical and technological challenges of creating such a sophisticated project, along with the importance of beta testing and feedback.

The Big Takeaways:
- The importance of testing.
- Anticipating needs: When it comes to products like the EYE5, developers need to anticipate that users will have evolving needs as their visual acuity deteriorates. So part of the development process involves anticipating what those needs will be and finding a way to deliver new features as users need them.
- Changing on the fly.
- Future-casting: When Jay Cormier and his team at Eyedaptic first started designing the EYE5 device, they were already considering what the product would look like in the future, and how it would evolve. To that end, they submitted certain patents many years ahead of when they thought they’d need them — and now, they’re finally being put to use.

Contact Us: podcasts@lighthouseguild.org

Pertinent Links: Lighthouse Guild, Eyedaptic, Rizzo Lab


Robotic Guidance Technology

This podcast is about big ideas on how technology is making life better for people with vision loss. The white cane and guide dogs are long-established foundational tools used by people with vision impairment to navigate. Although it would be difficult to replace the 35,000 years of bonding between humans and dogs, researchers are working on robotic technologies that can replicate many of the same functions of a guide dog. One such project, called LYSA, is being developed by Vix Labs in Brazil. LYSA sits on two wheels and is pushed by the user. It’s capable of identifying obstacles and guiding users to saved destinations. And while hurdles such as outdoor navigation remain, LYSA could someday be a promising alternative for people who either don’t have access to guide dogs or aren’t interested in having one. In a similar vein, Dr. Cang Ye and his team at Virginia Commonwealth University are developing a robotic white cane that augments the familiar white cane experience for people with vision loss. Like the LYSA, the robotic white cane has a sophisticated computer learning system that allows it to identify obstacles and help the user navigate around them, using a roller tip at its base. Although it faces obstacles as well, the robotic guide cane is another incredible example of how robotics can help improve the lives of people who are blind or visually impaired. It may be a while until these technologies are widely available, and guide dogs and traditional canes will always be extremely useful for people who are blind or visually impaired. But with how fast innovations in robotics are happening, it may not be long until viable robotic alternatives are available.

The Big Takeaways:
- Reliability of Biological Guide Dogs.
- LYSA the Robotic Guide Dog: LYSA may look more like a rolling suitcase than a dog, but its developers at Brazil’s Vix Systems are working on giving it many of the same functions as its biological counterpart. LYSA can identify obstacles and guide its user around them. And for indoor environments that are fully mapped out, it can bring the user to pre-selected destinations as well.
- The Robotic White Cane: Dr. Cang Ye and his team at Virginia Commonwealth University are developing a robotic white cane that can provide more specific guidance than the traditional version. With a sophisticated camera combined with LiDAR technology, it can help its user navigate the world with increased confidence.
- Challenges of Outdoor Navigation: Both LYSA and the robotic white cane are currently better suited for indoor navigation. A major reason for that is the unpredictability of an outdoor environment, along with more fast-moving objects, such as cars on the road. Researchers are working hard on overcoming this hurdle, but it still poses a major challenge.
- The Speed of Innovation: When Dr. Ye began developing the robotic white cane a decade ago, the camera his team used cost $500,000 and had image issues. Now, their technology can be run on a smartphone – making the technology much more affordable, and hopefully one day, more accessible if it becomes available to the public.

Pertinent Links: Lighthouse Guild, Guiding Eyes for the Blind, LYSA Robot Guide, Robotic White Cane


Smart Cities and Autonomous Driving: How Technology is Providing Greater Freedom of Movement for People with Vision Loss

This podcast is about big ideas on how technology is making life better for people with vision loss. Navigating the world can be difficult for anyone, whether or not they have vision loss. Tasks like driving safely through a city, navigating a busy airport, or finding the right bus stop all provide unique challenges. Thankfully, advances in technology are giving people more freedom of movement than ever before, allowing them to get where they want, when they want, safely. Smart Cities are putting data collection to work in a healthy way by providing information to make busy intersections more secure, sidewalks more accessible, and navigation more accurate. They’re providing assistance for all aspects of travel, from the front door to the so-called “last hundred feet,” while using automated technology to make life easier every step of the way. And although fully autonomous vehicles are still on the horizon, the technology being used to develop them is being applied to improve other aspects of life in incredible ways. These applications are making the world more accessible, safer, and better for everyone, including people who are blind or visually impaired. One example of this is Dan Parker, the “World’s Fastest Blind Man,” who has developed sophisticated guidance systems for his racing vehicles, as well as a semi-autonomous bicycle that could give people with vision loss a new way to navigate the world safely and independently.

The Big Takeaways:
- Smart Cities.
- Autonomous Driving: In a perfect world, self-driving cars will provide ease of transportation for everyone and create safer, less congested roads. That technology isn’t there yet – but it’s being worked on by talented researchers like John Dolan, the Principal Systems Scientist at Carnegie Mellon’s Autonomous Driving Vehicle Research Center. Sophisticated sensors and advanced robot-human interfaces are being developed to make self-driving cars possible.
- Application of Technology.
- The World’s Fastest Blind Man: When professional race car driver Dan Parker lost his vision in an accident, he felt lost. But a moment of inspiration led him and his business partner Patrick Johnson to develop a sophisticated guidance system that let him continue racing without human assistance. Thanks to this revolutionary technology, Dan became the “World’s Fastest Blind Man” when he set a land-speed record of 211.043 miles an hour in his customized Corvette.

Contact Us: Contact us at podcasts@lighthouseguild.org with your innovative new technology ideas for people with vision loss.

Pertinent Links: Lighthouse Guild, MCity, Carnegie Mellon Autonomous Driving Vehicle Research Center, Dan Parker


Leveling Up Accessible Video Game Features: How New Technology is Making Gaming More Immersive and Inclusive for People with Vision Loss

This podcast is about big ideas on how technology is making life better for people with vision loss. For decades, people with vision loss had limited options when it came to accessing video games. Aside from screen magnification and text-to-voice tools, gamers who are blind or visually impaired didn’t have many ways to play their favorite titles. But in recent years, the same cutting-edge technology used to create games has also been used to make them more accessible for people with vision impairment. These advances include more visibility options, the implementation of 3D audio, haptic feedback, and customizable controllers for gamers with vision impairment. Furthermore, 3D audio technologies being developed in live sports may soon make their way to online multiplayer video games. The implementation and improvement of these technologies mean that everyone will be able to play together, regardless of their visual acuity.

The Big Takeaways:
- Leap in Accessible Gaming Options.
- The Last of Us: Part 2.
- Participating in the Process.
- Xbox Accessibility Team.
- Action Audio.
- Spatial Audio in Gaming.

Tweetables: kaizen

Contact Us: podcasts@lighthouseguild.org

Pertinent Links: Lighthouse Guild, RNIB Accessible Video Games Page, Xbox Accessibility Guidelines, BlindGamerChick YouTube Channel, On Tech & Vision: Training the Brain: Sensory Substitution


A Celebration of Sound and Song: Music Tech Shines the Spotlight on Musicians with Vision Loss

This podcast is about big ideas on how technology is making life better for people with vision loss. Marcus Roberts, Stevie Wonder, Ray Charles, and even Louis Braille (who invented the Braille Music Notation system still used today) prove that musicians who are blind or visually impaired have made profound impacts on our musical landscape. However, to get their work to us, musicians who are blind have had to structure complex workarounds: relying on sighted musicians to demonstrate complex scores, memorizing long pieces, or only performing when they can have a Braille score in front of them, which shuts them out of opportunities that fall to those who can sight-read, since Braille scores have often been time-consuming and expensive to produce. However, new technologies in music composition and production are making composition, nuanced scoring, and Braille printing easier than ever, bringing musicians and composers who are blind to center stage to share their sound and song.

The Big Takeaways:
- “Lullay and Lament” by James Risdon.
- Echoes of Arcadia.
- Dancing Dots with Bill McCann.
- Chris Cooke and PlayHymns.com.
- What is Braille music?
- Musical Instrument Digital Interface (MIDI) and MusicXML.
- The question of parity.
- The MIDI-to-brain connection.
- Training the Brain: Sensory Substitution.

Contact Us: podcasts@lighthouseguild.org

Pertinent Links: Lighthouse Guild, Dancing Dots, James Risdon, Chris Cooke and PlayHymns.com


Ambient Computing and Voice Assistants: From Your Home to the Stars

This podcast is about big ideas on how technology is making life better for people with vision loss. Lots of people have voice-controlled smart home assistants like Siri, Google, or Alexa in their homes to listen to the news or to set timers. But they can do so much more! David Frerichs, Principal Engineer, Alexa Experience at Amazon on the aging and accessibility team, shares his design philosophy for making voice assistants more inclusive and the preferred mode of engagement for every user. He also shares that the next stage of smart home assistants will be ambient computing, where your devices will intuit your needs without you speaking them. We talk with Lighthouse Guild client Aaron Vasquez, who has outfitted his home with smart home technology, and with Matthew Cho, a client who traveled to the Johnson Space Center in Houston to speak to the unmanned Orion spacecraft via the Amazon Alexa on board, demonstrating that voice assistant technology can bring inclusivity and accessibility to many jobs and industries and is not just for the home anymore.

The Big Takeaways:
- Alexa Onboard the Orion Spacecraft.
- Accessibility and Preferences.
- Ambient Computing.
- Smart Homes Today, Smart Industries Tomorrow: This episode asks how the smart home’s tools can be integrated into offices and industries to make them more accessible and inclusive.

Contact Us: podcasts@lighthouseguild.org

Pertinent Links: Lighthouse Guild, Alexa in Space, David Frerichs


New Approaches in Access: Smart Tools for Indoor Navigation and Information Transfer

This podcast is about big ideas on how technology is making life better for people with vision loss. Artifacts from Blackbeard’s sunken pirate ship are on display in the North Carolina Maritime Museum in Beaufort, North Carolina. But now they are also accessible to visitors who are blind, thanks to the efforts of Peter Crumley, who spearheads the Beaufort Blind Project. In this episode, we ask: How can new technology help make sites like these as accessible to people who are blind as they are to sighted people? We profile three companies applying new technologies, paired with smartphone capabilities, to make strides in indoor navigation, orientation, and information transfer. Idan Meir is co-founder of RightHear, which uses Apple’s iBeacon technology to make visual signage dynamic and accessible via audio descriptions. We check in with Javier Pita, CEO of the NaviLens QR code technology, which we profiled in our first season, to see what they have been developing in the last two years. Rather than iBeacons or QR codes, GoodMaps uses LiDAR and geocoding to map the interior of a space. We speak with Mike May, Chief Evangelist. Thanks to Peter Crumley, the North Carolina Maritime Museum is fully outfitted with GoodMaps, and will soon have NaviLens as well. As the prices of these tools come down, the key will be getting them into all the buildings, organizations, and sites of information transfer that people who are blind need to access – which is all of them.

The Big Takeaways:
- Beaufort Blind Project.
- RightHear.
- NaviLens technology.
- GoodMaps.
- Technological advancement.
- Distribution.

Tweetables: “Nothing about us without us.”

Contact Us: Contact us at podcasts@lighthouseguild.org with your innovative new technology ideas for people with vision loss.

Pertinent Links: Lighthouse Guild, RightHear, NaviLens, GoodMaps
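To make the beacon idea a little more concrete, here is a minimal sketch of the general approach behind audio wayfinding apps that use Bluetooth beacons, not RightHear's actual implementation: estimate distance from each beacon's signal strength with a standard path-loss model, then speak the description attached to the nearest zone. The beacon names, calibration values, and descriptions below are hypothetical.

    # Illustrative sketch: trigger audio descriptions from beacon signal strength.
    # Assumes a standard log-distance path-loss model; all beacon data is made up.

    def estimate_distance_m(rssi_dbm, measured_power_dbm=-59, path_loss_exp=2.0):
        """Rough distance estimate from a beacon's received signal strength."""
        return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * path_loss_exp))

    # Hypothetical beacons placed around a museum gallery.
    BEACON_DESCRIPTIONS = {
        "entrance": "Main entrance. The information desk is three meters ahead.",
        "anchor-exhibit": "Blackbeard anchor exhibit. Tactile model on the left.",
    }

    def nearest_description(rssi_readings):
        """Pick the closest beacon and return its spoken description."""
        distances = {bid: estimate_distance_m(rssi) for bid, rssi in rssi_readings.items()}
        nearest = min(distances, key=distances.get)
        return BEACON_DESCRIPTIONS[nearest], distances[nearest]

    if __name__ == "__main__":
        text, dist = nearest_description({"entrance": -72, "anchor-exhibit": -61})
        print(f"({dist:.1f} m) {text}")

In practice, a production app would smooth noisy signal readings and hand the text to the phone's screen reader or text-to-speech engine rather than printing it.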


AI Revolutionizes Vision Tech, Ophthalmology, and Medicine as We Know It

This podcast is about big ideas on how technology is making life better for people with vision loss. In 1997, Garry Kasparov lost an epic chess rematch to IBM’s supercomputer Deep Blue, but since then, artificial intelligence has become humanity’s life-saving collaborator. This episode explores how AI will revolutionize vision technology and, beyond that, all of medicine. Karthik Kannan, co-founder of AI vision-tech company Envision, explains the difference between natural intelligence and artificial intelligence by imagining a restaurant recognizer. He describes how he would design the model and train it with positive or negative feedback through multiple “epochs” — the same process he used to build Envision. Envision uses AI to identify the world for a blind or visually impaired user using only smartphones and smart glasses. Beyond vision tech, AI enables faster and more effective ophthalmic diagnosis and treatment. Dr. Ranya Habash, CEO of Lifelong Vision and a world-renowned eye surgeon, and her former colleagues at Bascom Palmer, together with Microsoft, built the Multi-Disease Retinal Algorithm, which uses AI to diagnose glaucoma and diabetic retinopathy from just a photograph. She also acquired for Bascom Palmer a prototype of the new Kernel device, a wearable headset that records brain wave activity. Doctors use the device to apply algorithms to brainwave activity, in order to stage glaucoma, for example, or identify the most effective treatments for pain. Finally, AI revolutionizes drug discovery. Christina Cheddar Berk of CNBC reports that thanks to AI, Pfizer developed its COVID-19 treatment, Paxlovid, in just four months. Precision medicine, targeted to a patient’s genetic information, is one more way AI will make drugs more effective. These AI-reliant innovations will certainly lower drug costs, but the value to patients of having additional, targeted, and effective therapies will be priceless.

The Big Takeaways:
- Natural vs. artificial intelligence, and the “restaurant recognizer.”
- Sensor fusion AI.
- Transhumanism.
- Multi-Disease Retinal Algorithm.
- The Brain-Machine Interface.
- Bias in AI.
- AI for Drug Discovery.

Contact Us: podcasts@lighthouseguild.org

Pertinent Links: Lighthouse Guild, Karthik Kannan, Dr. Ranya Habash, Zephin Livingston, Christina Cheddar Berk
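To make the "epochs" and feedback idea concrete, here is a minimal, self-contained sketch of training a tiny classifier on synthetic feature vectors: each full pass over the data is one epoch, and the positive/negative labels play the role of the corrective feedback Karthik describes. This is only an illustration of the general training loop, not Envision's actual model or data.

    # Minimal illustration of supervised training over multiple "epochs".
    # Synthetic features stand in for image data; not Envision's real pipeline.
    import numpy as np

    rng = np.random.default_rng(0)
    # Two hypothetical features (e.g., "storefront-ness", "tables-and-chairs-ness").
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)   # 1 = "restaurant", 0 = "not a restaurant"

    w, b, lr = np.zeros(2), 0.0, 0.1
    for epoch in range(20):                      # each full pass over the data is one epoch
        p = 1 / (1 + np.exp(-(X @ w + b)))       # current predictions
        grad_w = X.T @ (p - y) / len(y)          # feedback: nudge weights toward the labels
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
        acc = np.mean((p > 0.5) == y)
        print(f"epoch {epoch + 1:2d}  accuracy {acc:.2f}")

Running it shows accuracy climbing epoch by epoch, which is the same feedback-driven improvement described above, just at toy scale.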


Balancing Innovation and Ethics: Who is Protecting the Early Adopters?

This podcast is about big ideas on how technology is making life better for people with vision loss. Innovations in implant technology are advancing at lightning speed, profoundly impacting the lives of people who are blind or visually impaired. On On Tech And Vision, we’ve profiled some amazing new implant technologies that have the potential to restore people’s sight. But in this episode, we pump the brakes — because we need to address a critical part of the innovation process: the ethical frameworks that protect participants in early clinical trials, and the need for an updated framework that ensures patient protections without stifling innovation and development. Discussions between doctors and participants in clinical trials almost always focus on the new technology and very rarely on the manufacturer who sponsors the clinical trial — and almost never on the long-term commitment and financial viability of the company sponsoring the technology. And while clinical trial informed consent specifies whose responsibility it is to remove the implants should they fail during the trial, that responsibility usually ends once the trial is over. At that stage, who will maintain or remove the implants that are still housed in patients’ bodies? In this episode, we talk about innovative implants such as the Argus II, which we featured in the first season of On Tech And Vision. The Argus II is a microchip implanted under the retina that, in combination with a special headset, provided some vision to people who otherwise had none. And while the technology was exciting, the company discontinued the retinal implant three years ago, and the Argus II was eventually sold to another pharmaceutical company. Dr. Joseph Fins, Professor of Medical Ethics and Professor of Medicine at Weill Cornell Medical Center in New York, joins us to share his thoughts on today’s big idea: How do we balance the life-changing potential of electroceutical implant technology with the ethics of caring for early participants — particularly after clinical trials are over?

The Big Takeaways:
- Examples of electroceutical implants.
- Regulatory framework today.
- Ancillary care obligations.
- Moral Entanglements: The Ancillary Care Obligations of Medical Researchers.
- Collective responsibility.
- Some solutions.
- The law.
- “Victims of Our Own Success.”
- Danger to the field.

Contact Us: podcasts@lighthouseguild.org

Pertinent Links: Lighthouse Guild, Dr. Joseph Fins


People's Choice Awards 2022

Listener nominations are open from July 1 through July 31. Starting July 1, visit https://www.podcastawards.com to sign up and vote.


Tools for Success: Tech Convergence and Co-Designed Products Close Gaps for Children Who are Blind

This podcast is about big ideas on how technology is making life better for people with vision loss. People who are blind or visually impaired know all too well the challenges of living in a sighted world. But today, the capabilities of computer vision and other tech are converging with the needs of people who are blind and low-vision, and may help level the playing field for young people with all different sensory abilities. These tools can pave the way for children’s active participation and collaboration in school, in social situations, and eventually, in the workplace, facilitating the important contributions they will make to our world in their adult lives. Access to educational materials is a consistent challenge for students and adults who are blind, but Greg Stilson, the head of Global Innovation at American Printing House for the Blind (APH), is trying to change that. Together with partner organizations Dot Inc. and Humanware, APH is on the verge of being able to deliver the “Holy Braille” of braille readers: a dynamic tactile device that delivers both Braille and tactile graphics in an instant, poised to fill a much-needed gap in the Braille textbook market. Extensive user testing helps ensure the device is as useful as possible for people who are blind. Greg sees a future in which more inclusively designed and accessible video games, augmented reality (AR), and virtual reality (VR) will help children who are blind learn with greater ease and better engage with their sighted peers. Enter Dr. Cecily Morrison, principal researcher at Microsoft Research in Cambridge, UK. Based on extensive research and co-designing with people who are blind, she and her team developed PeopleLens, smart glasses worn on the forehead that can identify the person whom the user is facing, giving the user a spatial map in their mind of where classmates (as one example) are in space. PeopleLens helps children who are blind overcome social inhibitions and engage with classmates and peers, a skill that will be crucial to their development and in their lives as they move into the cooperative workspaces of the future.

The Big Takeaways:
- MY OT Journey Planner.
- My OT Journey Podcast.

Contact Us: Contact us at podcasts@lighthouseguild.org with your innovative new technology ideas for people with vision loss.

Pertinent Links: Lighthouse Guild, Robin Akselrud, Bryce Weiler, Greg Stilson, Dr. Cecily Morrison


Innovations in Intraocular Pressure and Closed-Loop Drug Delivery Systems

This podcast is about big ideas on how technology is making life better for people with vision loss. In 2012, Christine Ha won the third season of MasterChef, after having lost her vision in her twenties. Since her win, she has opened two restaurants in Houston, adapting to the challenges the pandemic still poses to restaurateurs in order to meet the needs of her community. In a similarly innovative way, Max Ostermeier, CEO and Founder of Implandata Ophthalmic Products, based in Hannover, Germany, has reimagined the remote management and care of patients with glaucoma. Max and his team developed the EyeMate system, a microscopic implantable device and microsensor that measures intraocular pressure throughout the day. The EyeMate sends eye pressure data to an external device and uploads it to the patient’s eye doctor’s office for analysis. This game-changing technology allows people with glaucoma to bypass regular trips to the ophthalmologist’s office to measure their eye pressure, key data in maintaining their eye health. We revisit a conversation with Sherrill Jones, who lost her sight due to glaucoma, in which she shares how difficult it was to adhere to compliance protocols. Max believes the EyeMate will evolve to be part of a closed-loop drug delivery system; that is, when the EyeMate registers a high pressure, medications could automatically be released into the patient’s eye, which could improve outcomes significantly. We dig into issues of compliance and closed-loop systems by considering diabetes. We talk to occupational therapist Christina Senechal, who has managed her diabetes for 27 years, and Dr. Carmen Pal, who specializes in internal medicine, endocrinology, diabetes, and metabolism in Lighthouse Guild’s Maxine and John M. Bendheim Center for Diabetes Care.

The Big Takeaways:
- Dr. Max Ostermeier and his team have invented the EyeMate system.

Tweetables: Sherrill Jones; Dr. Carmen Pal, diabetes specialist in Lighthouse Guild’s Maxine and John M. Bendheim Center for Diabetes Care.

Contact Us: podcasts@lighthouseguild.org

Pertinent Links: Lighthouse Guild, Max Ostermeier, Christine Ha, Dr. Carmen Pal
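As a thought experiment, the closed-loop idea Max describes can be sketched as a simple threshold controller: read the implanted sensor, compare readings against a clinician-set target, and release a dose only when pressure stays high and enough time has passed since the last dose. Everything below, including the thresholds, timing rules, and function names, is hypothetical and purely illustrative; it is not the EyeMate system.

    # Conceptual closed-loop sketch: sensor readings -> decision -> dose.
    # All values and interfaces here are hypothetical, for illustration only.
    from dataclasses import dataclass

    @dataclass
    class LoopConfig:
        target_iop_mmhg: float = 18.0     # clinician-set pressure target (example value)
        high_readings_needed: int = 3     # require several high readings before dosing
        min_hours_between_doses: float = 6.0

    def closed_loop_step(readings_mmhg, hours_since_last_dose, cfg=LoopConfig()):
        """Return True if a dose should be released for this window of readings."""
        sustained_high = sum(r > cfg.target_iop_mmhg for r in readings_mmhg)
        return (sustained_high >= cfg.high_readings_needed
                and hours_since_last_dose >= cfg.min_hours_between_doses)

    if __name__ == "__main__":
        print(closed_loop_step([22.5, 23.1, 21.8], hours_since_last_dose=8.0))  # True
        print(closed_loop_step([16.0, 17.2, 19.5], hours_since_last_dose=8.0))  # False

The point of requiring several sustained high readings and a minimum interval between doses is the same compliance problem discussed above: the loop, not the patient, carries the burden of acting on the data, but it has to do so conservatively.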


Restoring Vision: Code Breaking and Optogenetics

This podcast is about big ideas on how technology is making life better for people with vision loss. The Enigma machines that Germany used to encode messages during World War II were notorious for their complexity. Two Enigma experts — Dr. Tom Perera, a retired neuroscientist and founder of EnigmaMuseum.com, and Dr. Mark Baldwin, an expert on the story of Enigma machines — tell us how the Allies were able to crack the code by using input-output mapping. The human brain is similarly complex. Until recently, no one knew the code the retina uses to communicate with the brain to create sight. Our guest, Dr. Sheila Nirenberg, a neuroscientist at Weill Cornell and Principal and Founder of Bionic Sight, has — using input-output mapping — cracked the retina’s neural code, enabling her to recreate the electric signals to the brain that could restore sight in people with retinal degeneration. She has created a set of goggles that convert a camera’s images into the code, via pulses of light. And she relies on optogenetics, a relatively new procedure in neuroscience that helps neurons become responsive to light. In her clinical trial, Dr. Nirenberg injects the optogenetic vector into the eye, and trial participants who are completely blind, like Barry Honig, who we speak with on this program, report being able to see light. In early studies, coupling the effects of the optogenetics with the code-enabled goggles has an even more impressive effect on patients’ vision. Dr. Nirenberg is also using her knowledge of the visual neural code to inform machine learning applications that could be further used to support people who are blind or visually impaired. Clinical trial participants are important partners in the journey of discovery, Dr. Nirenberg says. Barry Honig agrees. He was happy to participate to help ease the burden on future children diagnosed with eye diseases that would otherwise result in blindness, but thanks to these advancements, someday may not.

Contact Us: podcasts@lighthouseguild.org

Pertinent Links: Lighthouse Guild, Dr. Sheila Nirenberg, Dr. Tom Perera, Dr. Mark Baldwin, Barry Honig
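Input-output mapping is easier to picture with a toy example: if you can feed known inputs into a black box and observe its outputs, you can reconstruct the mapping hidden inside. The sketch below recovers a hidden substitution cipher from paired inputs and outputs, a loose analogy for how paired stimulus and response recordings can reveal a code; it is illustrative only and is not Dr. Nirenberg's method or the actual Enigma attack.

    # Toy input-output mapping: recover a hidden substitution cipher
    # from observed (input, output) pairs. Analogy only.
    import random
    import string

    random.seed(1)
    letters = list(string.ascii_lowercase)
    shuffled = letters[:]
    random.shuffle(shuffled)
    SECRET = dict(zip(letters, shuffled))        # the unknown "code" inside the box

    def black_box(text):                         # we can only probe it with inputs
        return "".join(SECRET.get(c, c) for c in text)

    # Probe with known inputs, record the outputs, and invert the mapping.
    recovered = {}
    for c in letters:
        recovered[black_box(c)] = c

    ciphertext = black_box("the retina speaks in code")
    decoded = "".join(recovered.get(c, c) for c in ciphertext)
    print(decoded)                               # -> "the retina speaks in code"

A single-letter substitution is trivial compared with Enigma's rotors or a retina's spiking patterns, but the principle carries over: enough known input-output pairs pin down the transformation in between.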


Seeing with Sound: Using Audio to Activate the Brain’s Visual Cortex

This podcast is about big ideas on how technology is making life better for people with vision loss. Every day, people who are blind or visually impaired use their hearing to compensate for vision loss. But when we lose our vision, can we access our visual cortex via other senses? We call this ability of the brain to change its activity “plasticity,” and brain plasticity is an area of active research. In this episode, we’ll explore how, through sensory substitution, audio feedback can, in some cases, stimulate a user’s visual cortex, allowing a user to — without sight — achieve something close to visual perception. Erik Weihenmayer — world-class mountain climber, kayaker, and founder of No Barriers, who lost his vision as a teenager due to retinoschisis — brings us to the summit of Everest by describing what it sounds like. He explains how his hearing helps him navigate his amazing outdoor adventures safely. We also speak with Peter Meijer, the creator of The vOICe, an experimental technology that converts visual information into sound and has been shown to activate users’ visual cortices, especially as users train on the technology and master how to interpret the audio feedback. We hear an example of what users of The vOICe hear when it translates a visual image of scissors into audio. Erik Weihenmayer shares his experience with BrainPort, a similar sensory substitution technology featured in our episode “Training the Brain: Sensory Substitution.” While research is ongoing in the areas of sensory substitution and brain plasticity, it’s encouraging that some users of The vOICe report that the experience is like seeing. In the spirit of Erik Weihenmayer, one user even uses it to surf.

The Big Takeaways:
- “Training the Brain: Sensory Substitution.”

Tweetables: — Dr. Peter Meijer, Seeing with Sound, The vOICe.

Contact Us: podcasts@lighthouseguild.org

Pertinent Links: Lighthouse Guild, Peter Meijer, Erik Weihenmayer, No Barriers
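A much-simplified sketch of the kind of image-to-sound mapping sensory-substitution systems use: scan an image column by column from left to right, let a pixel's height set its pitch and its brightness set its loudness, and join the columns into a short audio sweep. This is a generic illustration under those assumptions, not The vOICe's exact algorithm or parameters.

    # Simplified image-to-sound sweep: columns -> time, row height -> pitch,
    # brightness -> loudness. Generic sketch, not The vOICe's actual mapping.
    import numpy as np

    def image_to_audio(img, sr=16000, sweep_s=1.0, f_lo=300.0, f_hi=3000.0):
        rows, cols = img.shape
        col_len = int(sr * sweep_s / cols)              # samples per image column
        freqs = np.linspace(f_hi, f_lo, rows)           # top of the image = high pitch
        audio = []
        for c in range(cols):
            t = np.arange(col_len) / sr
            column = sum(img[r, c] * np.sin(2 * np.pi * freqs[r] * t) for r in range(rows))
            audio.append(column)
        out = np.concatenate(audio)
        return out / (np.max(np.abs(out)) + 1e-9)       # normalize to [-1, 1]

    # Tiny test image: a bright diagonal line (brightness values in [0, 1]).
    img = np.eye(8)
    wave = image_to_audio(img)
    print(wave.shape)   # ~16000 samples; write to a WAV file to listen

With the diagonal test image, the sweep descends in pitch over time, which is exactly the sort of regularity trained users learn to interpret as shape.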


Beyond Self Driving Cars: Technologies for Autonomous Human Navigation

This podcast is about big ideas on how technology is making life better for people with vision loss. Today’s big idea is about exciting and emerging technologies that will someday allow people who are blind or visually impaired to navigate fully autonomously. In this episode, you will meet Jason Eichenholz, the Co-Founder and CTO of Luminar, and his manufacturing engineer, Nico Gentry. Luminar’s LiDAR technology is instrumental to the development of self-driving cars, but this same technology could be useful for people who are blind or visually impaired, who also have to navigate autonomously. You’ll hear from Thomas Panek, the President and CEO of Guiding Eyes for the Blind, an avid runner who dreamed of running on his own. He took this unmet need to a Google Hackathon, and Ryan Burke, the Creative Producer at Google Creative Lab, put together a team to develop a solution that turned into Project Guideline. Kevin Yoo, Co-Founder of WearWorks Technology, is using inclusive design to develop Wayband, a navigation wristband that communicates directions with users via haptics.

Tweetables: “So what you're able to do is to have […] camera-like spatial resolution with radar-like range; you're getting the best of both worlds.” — Kevin Yoo

Contact Us: podcasts@lighthouseguild.org

Pertinent Links: Lighthouse Guild, Jason Eichenholz, Thomas Panek, Ryan Burke, Kevin Yoo


Batman Technology: Using Sonar for Human Navigation

This podcast is about big ideas on how technology is making life better for people with vision loss. Today’s big idea is sonar, and a somewhat similar technology called LiDAR! Can we use the latest sonar technology for obstacle detection the way bats and other nocturnal creatures do? There have been many exciting advances in sonar sensors that now make this possible for people who are blind. However, unlike bats, we won’t need to receive feedback signals through our ears. Advances in haptic technologies and languages make communication through touch possible. Dr. Cal Roberts talks with Dr. Matina Kalcounis-Rueppell from the College of Natural and Applied Science at the University of Alberta, Ben Eynon and Diego Roel from Strap Technologies, Marco Trujillo of Sunu, and Sam Seavey of The Blind Life YouTube channel to find out more.

The Big Takeaways:
- How does a bat see what it sees? Dr. Kalcounis-Rueppell studies bats and how they use sound to thrive in their nighttime world. Bats use a series of echoes to see a 3D view of their environment, but their world isn’t always so simple. There’s rain, there are leaves, and there are other creatures flying that bats need to detect with their sonar. Similarly, people with vision impairment have to use their hearing to navigate complex auditory environments.
- Strap Technologies uses sonar and LiDAR sensors that can be strapped across the chest, which helps people who are blind detect obstacles. These kinds of sensors have been used to park spacecraft, but with recent developments, they’re finally small enough that a human can wear them in a compact way. Ben and Diego share how it works (a simple time-of-flight sketch follows these takeaways).
- Unlike sonar, LiDAR technology uses pulsed laser light instead of sound waves.
- Though bats have been honing their echolocation skills for millennia, interpreting information haptically, rather than sonically, is an adaptation that humans, using technologies like Strap, can make. Haptic information can help us navigate without sight through the use of vibrations, which is great news because it means we can leave our ears open to process our active world. More specifically, Ben and Diego suggest that people may no longer need to use a cane to detect obstacles.
- Ben and Diego are excited about the future. With their technology, they hope to create quick-reacting haptic technology so people who are blind can one day ride a bike or run a race. Infrared or radiation sensors could be added in the future to detect other hazards in the environment. The more user feedback they receive, the easier it will be to add these product enhancements.
- Another way we can approximate sight is through echolocation. However, how easy is it for us to hear echoes, really? For Marco at Sunu, it’s actually a natural skill we can learn to develop. Similar to Strap Technologies, the process of learning echolocation could be improved if you're wearing a Sunu Band.
- Sam Seavey was diagnosed at age 11 with Stargardt’s disease. He decided to use his voice and video skills to create a YouTube review channel for those who need to use assistive tech. The positive feedback from the community keeps him going. Sam has personally reviewed the Sunu Band, and you can check out the link to his review in the show notes!
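The core sonar calculation behind these sensors is simple enough to sketch: emit a pulse, time the echo, and convert the round-trip time into a distance using the speed of sound; a wearable can then turn that distance into a vibration strength. The sketch below is a generic illustration of that time-of-flight math with made-up numbers, not Strap's or Sunu's firmware.

    # Time-of-flight distance from an ultrasonic echo (illustrative only).
    SPEED_OF_SOUND_M_S = 343.0   # in air at roughly 20 degrees C

    def echo_to_distance_m(round_trip_s):
        """The pulse travels out and back, so divide the total path by two."""
        return SPEED_OF_SOUND_M_S * round_trip_s / 2

    def vibration_level(distance_m, max_range_m=4.0):
        """Map distance to a 0-1 haptic intensity: closer obstacle, stronger buzz."""
        return max(0.0, min(1.0, 1.0 - distance_m / max_range_m))

    if __name__ == "__main__":
        rt = 0.0116                       # hypothetical 11.6 ms round trip
        d = echo_to_distance_m(rt)        # about 2 meters
        print(f"obstacle at {d:.2f} m, haptic intensity {vibration_level(d):.2f}")

LiDAR units do the same time-of-flight arithmetic with the speed of light instead of the speed of sound, which is why they can be so much more precise over the same ranges.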
Tweetables: “They parked spacecraft with these same sensors, and recent developments have really pushed the miniaturization of the components, such that a human being can now wear them in a very compact form factor.” — Ben Eynon “He said, ‘I’m walking faster than I have in a long, long time,’ because he started to trust that the haptic vibrations were telling him every obstacle in the way.” — Ben Eynon shares the reaction from a user who is visually impaired testing Strap “We're changing our environment around us in ways that also change the acoustic environment.” — Dr. Matina Kalcounis-Rueppell “How is it that we have self-driving cars, we have rockets that land themselves like, we have a better iPhone every year, but we don’t...


The Latest Frontier in Tactile Technologies

This podcast is about big ideas on how technology is making life better for people with vision loss. Close your eyes. Raise your hands. Reach out and touch the nearest surface. What are you touching? A desktop, a leather steering wheel cover, a porcelain cup, a plastic keyboard? Our sense of touch and the way in which we interpret the materials in our environment are fundamental to our experience of the world. This episode’s big idea is new developments in tactile technologies. You’re probably familiar with one of the oldest technologies, Braille, which was invented in 1824 by Louis Braille, a Frenchman who was blind by the age of three. Braille, which has undergone numerous refinements since its invention, has led the way in helping people who are blind read, write, and interact with the world around them. But as useful as Braille is, it has its limits: Braille is used for text; it can’t convey images. Two individuals who are working to develop technologies that will one day help people with vision impairment to experience images and graphics are materials scientist Dr. Julia R. Greer from Caltech and physicist Dr. John Gardner from Oregon State University.

Contact Us: podcasts@lighthouseguild.org

Pertinent Links: Lighthouse Guild, Dr. Julia R. Greer, Dr. John Gardner, Tiger Software Suite, Audrey Schading


Cortical Brain Implants Are Paving the Way for Visual Restorative Medicine

This podcast is about big ideas on how technology is making life better for people with vision loss. Today’s big idea highlights how innovations don’t happen in a vacuum, but rather through a long chain of science, research, and developments that build on each other. Dr. Shelley Fried’s work exemplifies this process. It took him a career’s worth of experiments and adjustments to enable his cortical brain implants to bypass the eye and restore a patient’s ability to perceive light. He had a lot of obstacles to overcome, everything from circumventing the brain’s natural inflammatory response to getting the research published. One thing is clear: breakthroughs take time, and you cannot give up in the process. Your work often becomes an iteration of an iteration. Dr. Fried took inspiration from the artificial retina, which was prototyped from a cochlear implant. Dr. Fried’s revolutionary technology is another step toward a world in which no person is limited by their visual capacity.

The Big Takeaways:
- A cochlear implant is a neuroprosthetic device surgically implanted in the cochlea, the inner part of the ear responsible for transmitting nerve impulses to the auditory cortex of the brain. Originally developed in 1950, the modern form was honed in the 1970s with help from a NASA engineer.
- Dr. Mark Humayun took design cues from the cochlear implant when he was developing the Argus II retinal implant.
- What is a retinal prosthesis and how does it work? The simplest way to explain it is that it’s an array of electrodes that stimulates the retina and helps restore lost vision. Retinal prostheses work for some cases of blindness but not all; for example, this treatment is not recommended for people with advanced glaucoma.
- Dr. Fried took inspiration from retinal prostheses to build on with the cortical brain implant. These implants are revolutionary because they go directly to the source: the brain.
- The cortical brain implant works by gathering information externally and converting that data into stimulation the brain can perceive. But vision science doesn’t end there! It keeps building on itself: in this case, the cortical implant technology was inspired by artificial retinas, which took their inspiration from the cochlear implant.
- How do you target a single neuron? Dr. Fried’s innovative solution was the use of coils, smaller than a human hair, to help specify which neurons need activation. When you go directly to the brain, complications occur: the brain sees the implant as a threat and creates an inflammatory response, which blocks the electrodes from communicating. Using coils bypasses the body’s natural inflammatory response and keeps the lines of communication open.
- This innovation did not happen overnight. It took over a year and a half to get the coil experiments to work, and that doesn’t include all the other methods Dr. Fried experimented with that didn’t succeed. Science is about building on prior research, and it takes time and a lot of experimentation before a solution works.

Tweetables: “Cochlear implants had taught us that if you even put a rudimentary signal in the ear, the brain can start to use it… So we reconfigured a cochlear implant and used it to stimulate the retina.” — Dr. Mark Humayun “In its simplest form, a retina prosthesis is an array of electrodes. The common one is 6x10 electrodes, and each electrode is designed to stimulate a small portion of the retina.” — Dr. Shelley Fried “We run into additional problems when we go into the brain that don’t exist in the retina. One of them is the brain has a huge inflammatory response to the implant.” — Dr. Shelley Fried “Coils are not only more stable over time, but they’re more selective. They’re able to create a smaller region of activation. And so we think we can get much higher acuity with coils than we can with...
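To picture the 6x10 electrode array described in the quote above, here is a hedged sketch of the most basic step such a system performs: reduce a camera frame to a coarse grid and treat each cell's average brightness as a stimulation level for one electrode. This is a conceptual illustration only, not the actual Argus II or cortical-implant signal chain, which involves far more processing.

    # Conceptual sketch: downsample a camera frame to a 6x10 "electrode" grid.
    # Illustration only; real prostheses involve far more signal processing.
    import numpy as np

    def frame_to_electrodes(frame, rows=6, cols=10):
        """Average image brightness into one value per electrode."""
        h, w = frame.shape
        grid = np.zeros((rows, cols))
        for r in range(rows):
            for c in range(cols):
                block = frame[r * h // rows:(r + 1) * h // rows,
                              c * w // cols:(c + 1) * w // cols]
                grid[r, c] = block.mean()
        return grid

    frame = np.random.default_rng(0).random((60, 100))   # stand-in camera frame
    levels = frame_to_electrodes(frame)
    print(levels.shape)        # (6, 10): one stimulation level per electrode

Even this toy version makes the resolution limit obvious: sixty electrodes can only convey a very coarse picture, which is one reason Dr. Fried's higher-selectivity coils are such a significant step.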


Telehealth Is Opening the Doors for Patient Eye Health

This podcast is about big ideas on how technology is making life better for people with vision loss. Today’s big idea is: How will remote diagnostic tests change ophthalmology and vision care? It might be a foreign concept for some, but the specialists in today’s episode, Dr. Peter Pham and Dr. Sean Ianchulev, founders of Keep Your Sight (a nonprofit focused on remote diagnostic vision tests), share how they can conduct more reliable perimetry tests that help detect macular degeneration, glaucoma, and other conditions that lead to vision loss and eventually blindness — remotely, while patients stay home. Developments like these in remote diagnostics are a stepping stone for the ways machine learning will impact the field of ophthalmology in the future. This episode also features Dr. Einar Stefansson and Dr. Arna Gudmundsdottir, developers of the app Retina Risk, which helps with remote risk assessment of diabetic eye disease for people with diabetes, as well as Sherrill Jones, who lost her vision due to glaucoma.

The Big Takeaways:
- Retina Risk was created to help people with diabetes assess, in real time, their individualized risk for sight-threatening diabetic retinopathy. The app was created back in 2009, when the concept of using technology and algorithms to calculate risk was still quite foreign to most people.
- What goes into taking a regular perimetry test today? Patients have to come into the office, wait, register, wait some more, and get taken to a dark room to be positioned correctly, and after 20-30 minutes, they get a result. Now there’s an easier way: patients can take these tests at home.
- Why is telescreening so important? Dr. Pham and Dr. Ianchulev noticed it could take months for patients to be scheduled for routine visual field tests. By that time, the glaucoma may have advanced, in some cases rapidly. There was an unmet need here, and there was a better way to serve people more quickly and efficiently, especially people from rural communities who did not have readily available access to healthcare.
- Medicare did not reimburse doctors for these services unless they were conducted within the physician’s office. This led to a lot of roadblocks in telemedicine, despite the technology having been available for the last 15-plus years. Thankfully, in December of 2020, policies were changed so that doctors would be reimbursed for remote patient monitoring.

Tweetables: “We know that our blind spot is 15 degrees away from fixation and, with simple trigonometry, you can now use that blind spot to help position patients correctly in front of the computer monitor. We can now use online technology to perform visual field tests.” — Dr. Peter Pham “It was our goal to do a hardware-free digital/virtual device. We felt in ophthalmology, we’re kind of lucky. We are looking at a visual function. So perimetry lends itself to a fully virtual software-as-a-service device.” — Dr. Sean Ianchulev “I think technology will help us get to the next level. Technology has been around for this, but it hasn’t been applied for this.” — Dr. Sean Ianchulev

Contact Us: Contact us at podcasts@lighthouseguild.org with your innovative new technology ideas for people with vision loss.

Pertinent Links: Lighthouse Guild, Retina Risk, Keep Your Sight.org

Guest Bios: Dr. Peter Pham and Dr. Sean Ianchulev are the co-founders of Keep Your Sight. Dr. Pham is a board-certified ophthalmologist who has devoted his professional life to restoring sight and helping patients keep their vision. As a surgeon and clinician, Dr. Pham treats conditions such as glaucoma, cataract, and macular degeneration, all of which can cause blindness. As a researcher, he worked on the development of a novel delivery system for introducing large-sized molecular compounds into thousands of living cells simultaneously. Realizing the importance of technology and innovation for screening and prevention, Dr. Pham teamed up with Dr. Ianchulev to develop the KYS...
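Dr. Pham's trigonometry point can be made concrete with a short calculation: if the physiological blind spot sits about 15 degrees from fixation, then a target that disappears x centimeters from the fixation point implies a viewing distance of roughly x divided by the tangent of 15 degrees. The numbers below are a worked illustration of that geometry, not Keep Your Sight's actual calibration code.

    # Worked illustration of the blind-spot calibration idea (not KYS's code).
    import math

    BLIND_SPOT_DEG = 15.0   # approximate angle of the blind spot from fixation

    def viewing_distance_cm(offset_cm, angle_deg=BLIND_SPOT_DEG):
        """If a target vanishes at this on-screen offset, estimate eye-to-screen distance."""
        return offset_cm / math.tan(math.radians(angle_deg))

    def offset_for_distance_cm(distance_cm, angle_deg=BLIND_SPOT_DEG):
        """Where to draw the blind-spot target for a given viewing distance."""
        return distance_cm * math.tan(math.radians(angle_deg))

    print(round(offset_for_distance_cm(50), 1))   # ~13.4 cm from fixation at 50 cm
    print(round(viewing_distance_cm(13.4), 1))    # ~50.0 cm back from that offset

In other words, by finding where the target vanishes into the blind spot, a browser-based test can infer how far the patient is sitting from the screen, and then scale the rest of the visual field test accordingly, with no extra hardware.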


How the Simple QR Code Became an Empowering Navigation Tool

This podcast is about big ideas on how technology is making life better for people with vision loss. This episode’s big idea is navigation, and how to implement a navigation solution that enables people with vision impairment to travel cities broadly — how and when they want to, independently. Dr. Roberts talks with Javier Pita, the creator of such a technology, called NaviLens, which marries location finding with information. Dr. Roberts also talks with representatives of New York City’s Metropolitan Transportation Authority — one of the biggest transportation hubs in the world. They discuss the importance of accessible public transportation for people who are visually impaired and how NaviLens technology can help make independent navigation a reality.

The Big Takeaways:
- The NaviLens system uses improved QR technology with a new type of code made up of four colors, which enables it to store more information than a black-and-white QR code. Using a smartphone, the NaviLens app scans the area. Once it picks up the unique NaviLens code, the app provides the embedded information audibly to the user, along with their distance and directionality from the code. As long as the code appears anywhere in the field of view of the smartphone camera, the code is detected and information is delivered.
- NaviLens is more accurate than GPS technology because it takes into account smaller distances that are crucial to navigation for people who are visually impaired. NaviLens codes can be read up to 12 times farther away than QR or bar codes, as well as at up to a 160-degree angle.
- Future advances to the NaviLens technology include a 360-degree technology that will register and retain the user’s location so the system can still tell where they are and guide them to the destination even if they lose contact with the code. In addition, the NaviLens GO app uses advanced technology to help users navigate indoor spaces such as stores and to locate items in the store.
- This technology is elegant, inexpensive, flexible, easy to use, and fits seamlessly into a user’s life. While already part of public transportation in Barcelona, cities like New York City are testing it and hope to make this technology a more integral part of their public transportation systems.

Tweetables: “Public transportation is the answer to so much inequity across all urban areas, and nonurban areas. If we can work to make the system as safe as possible for any range of abilities, that would be an enormous win, and a huge piece of making public transit truly public transit.” — Mira Philipson, Systemwide Accessibility Analyst, Metropolitan Transportation Authority New York City Transit “I could walk down the hallway and it’s telling me when I’ve arrived at this department and the door is right in front of me — it really gives me that autonomy that I really crave.” — Ed Plumacher, Adaptive Technology Specialist, Lighthouse Guild “We began in public transportation because for us and the users on our team, it is super important to make public transportation more accessible.” — Javier Pita, Founder and CEO, NaviLens “Accessibility needs to be built into products, websites, software, whatever it is, from the ground up, because it will just lead to a better product overall.” — Gian Carlo Pedulla, Supervisor, NYC Department of Education and Member, Advisory Committee for Transit Accessibility, Metropolitan Transportation Authority New York City Transit

Contact Us: Contact us at podcasts@lighthouseguild.org with your innovative new technology ideas for people with vision loss.

Pertinent Links: Lighthouse Guild, NaviLens, NaviLens GO

Guest Bios: Javier Pita Lozano, Founder and CEO, NaviLens; Mira Philipson, Analyst, Systemwide Accessibility, Office of the President, Metropolitan Transportation Authority New York City Transit; Gian Carlo Pedulla, Supervisor, NYC Department of Education and Member, Advisory Committee for Transit Accessibility, Metropolitan Transportation Authority New York City...
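As a rough illustration of how a camera app can report distance and direction to a detected code, as described in the takeaways above, here is a pinhole-camera sketch: the code's known physical size and its apparent size in pixels give an estimated distance, and its horizontal offset from the image center gives a bearing. The numbers and simplifications are hypothetical and this is not NaviLens's actual detection algorithm.

    # Pinhole-camera sketch: distance and bearing to a detected tag.
    # Illustrative only; not NaviLens's real algorithm or calibration.
    import math

    def distance_m(tag_size_m, tag_size_px, focal_length_px):
        """Similar triangles: real size / distance = pixel size / focal length."""
        return tag_size_m * focal_length_px / tag_size_px

    def bearing_deg(tag_center_x_px, image_width_px, focal_length_px):
        """Horizontal angle from the camera's optical axis to the tag."""
        offset = tag_center_x_px - image_width_px / 2
        return math.degrees(math.atan2(offset, focal_length_px))

    # Hypothetical detection: a 20 cm code appearing 80 px wide in a 1280 px frame.
    f_px = 1000.0
    print(round(distance_m(0.20, 80, f_px), 2))        # -> 2.5 m away
    print(round(bearing_deg(900, 1280, f_px), 1))      # -> about 14.6 degrees to the right

Turning that distance and bearing into spoken guidance ("the platform entrance is two and a half meters ahead and to your right") is the kind of small-scale precision that GPS alone cannot provide.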