Vision processing for Bionic Vision technology
In 2004, 50,000 people in Australia were legally blind, a number expected to rise to 87,000 by 2024 as the population ages (“Eye Research Australia Insight: The Economic Impact and Cost of Vision Loss in Australia”, Centre for Eye Research Australia).
CSIRO has developed a bionic vision solution to restore a sense of sight to people with profound vision loss due to Retinitis Pigmentosa or Macular Degeneration. The vision processing behind it draws on large image sets for training and inference.
Our group was funded as a lead investigator by the Australian Research Council under the $50M Strategic Research Initiative on Bionic Vision, collaborating with a number of leading Australian universities and health institutes and playing a significant role in the bid for the funding, which commenced in 2010.
In 2017, the spin-out company Bionic Vision Technologies, in which CSIRO is a shareholder, raised US$18M in capital. Our team has received an additional $600K in funding from this company to date and is finalising a further $1.25M for 2019-2021.
The Solution
We developed a retinal implant and supporting technologies, including vision processing. The implant was placed in three profoundly blind individuals in 2012-2014 and in a further four in 2018.
Our technology selects and displays information to support the tasks required for independent living. We pioneered vision processing technologies that take the image stream from a head-worn camera and convert it into stimulation patterns on the retinal implant, which patients perceive as spots of light.
Advanced methods select and display key information to support orientation and mobility and to find items on a table top.
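As an illustration of what this selection step can look like, the sketch below boosts image regions flagged as important, for example by a face or obstacle detector, before the picture is reduced to a phosphene pattern. It is a minimal sketch in the spirit of our importance-weighted enhancement work (see the ISMAR'14 paper in the publications list), not the deployed pipeline; the mask source, blend weight, and function name are illustrative assumptions.

```python
import numpy as np


def importance_weighted(frame: np.ndarray,
                        importance: np.ndarray,
                        boost: float = 0.6) -> np.ndarray:
    """Blend image intensity towards full brightness where importance is high.

    `frame` is an 8-bit grayscale image; `importance` is a mask in [0, 1]
    marking regions that should survive the reduction to a phosphene display.
    """
    frame = frame.astype(float) / 255.0
    enhanced = (1.0 - boost) * frame + boost * importance
    return (np.clip(enhanced, 0.0, 1.0) * 255).astype(np.uint8)


# Example: a hypothetical detector has flagged a region containing a face.
frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
mask = np.zeros((480, 640))
mask[100:220, 250:350] = 1.0  # assumed face bounding box
highlighted = importance_weighted(frame, mask)
```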
This work led to significant papers, including a ‘first-in-humans’ study showing that the suprachoroidal approach is an effective and safe retinal implant therapy.
Critically, that paper reported end-to-end results in which CSIRO vision processing enabled profoundly blind subjects to pass ultra-low vision tests. In the Journal of Neural Engineering, we reported two important breakthroughs:
- For the first time, our advanced vision processing methods led to significantly improved performance on visual tasks.
- Participants using our advanced methods for scene understanding from colour and depth images performed significantly better in mobility tasks.
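A minimal sketch of the idea behind these depth-based representations, in the spirit of the ‘substituting depth for intensity’ and augmented-depth work listed in the publications below, is shown here: nearer surfaces are rendered brighter so that low-contrast trip hazards remain visible. The clipping range and resolution are illustrative assumptions rather than the parameters used in the trials.

```python
import numpy as np


def depth_to_brightness(depth_m: np.ndarray,
                        near: float = 0.5,
                        far: float = 4.0) -> np.ndarray:
    """Map metric depth to 8-bit brightness, nearest surfaces brightest.

    Depth outside [near, far] metres is clipped, so very close structure
    saturates to full brightness and distant background fades to black.
    """
    clipped = np.clip(depth_m, near, far)
    brightness = (far - clipped) / (far - near)  # 1.0 at `near`, 0.0 at `far`
    return (brightness * 255).astype(np.uint8)


# Example: a synthetic depth map with a close, low-contrast obstacle.
depth = np.full((120, 160), 3.5)   # mostly distant floor/background (metres)
depth[80:, 60:100] = 0.8           # nearby trip hazard
rendered = depth_to_brightness(depth)  # hazard appears as a bright region
```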
The Bionic Vision technology and how it works
The Bionic Eye is an implant that restores a sense of vision for people with vision loss due to Retinitis Pigmentosa or Macular Degeneration.
We are pioneers of Computer Vision for implantable prosthetic vision, in both research and industry. This research is embodied in software that converts the image stream captured by a head-worn camera into a useful visual representation for implantees, delivered via electrical stimulation from a retinal implant.
The Bionic Eye consists of the implantable hardware, an external processor, and image processing software. The implant sits in the eye, behind the retina, and stimulates nerve cells, providing the perception of vision.
The electrical impulses it produces are ‘seen’ as spots of light by the recipient. The implant is connected to an external smartphone-sized device and a wearable camera. The device captures video from the camera and encodes it for the implant.
Our software converts the scene the camera captures into a useful representation to display to the patient. This might include highlighting the ground, obstacles, or faces.
The challenge of vision processing for the Bionic Eye is to present useful information to the patient while limited to the few dozen points of light induced by the electrode array.
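A minimal sketch of that reduction step is shown below: the frame is averaged over an electrode-sized grid and quantised to a handful of brightness levels. The 7 x 14 grid (matching the 98-phosphene layouts used in our simulation studies) and the eight levels are illustrative assumptions; the clinical encoding involves considerably more processing than this.

```python
import numpy as np


def frame_to_stimulation(frame: np.ndarray,
                         grid_shape: tuple = (7, 14),
                         levels: int = 8) -> np.ndarray:
    """Reduce a grayscale frame to per-electrode stimulation levels.

    Each electrode is assigned the mean intensity of the image region it
    covers, quantised to `levels` steps of the implant's dynamic range.
    """
    rows, cols = grid_shape
    h, w = frame.shape
    pattern = np.zeros(grid_shape)
    for r in range(rows):
        for c in range(cols):
            block = frame[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            pattern[r, c] = block.mean()
    return np.round(pattern / 255.0 * (levels - 1)).astype(int)


# Example: encode one synthetic 480 x 640 camera frame.
frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
stimulation = frame_to_stimulation(frame)  # 7 x 14 array of levels 0-7
```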
Media Coverage
- Intrado Globe News Wire (2020): Bionic Vision Technologies Announces Interim Pilot Study Results of the BVT Bionic Eye System Designed to Help the Blind Achieve Greater Mobility and Independence
- Manufacturer’s Monthly (2018): Bionic eye project has sights on human trials
- News.com.au (2018): CSIRO autonomous car breakthrough: bionic eye tech could help future cars ‘see’
- Australian Financial Review (2017): Bionic vision a step closer with the CSIRO’s new computing system
- Business Insider (2017): Australia’s bionic eye project is about to start surgical implants
- Sydney Morning Herald (2014): Bionic eye trial ‘really promising’
Related publications
Journal Articles
- “Mobility and low contrast trip hazard avoidance using augmented depth”, McCarthy C, Walker JG, Lieby P, Scott AS, Barnes N, Journal of Neural Engineering, 12(1), Feb 2015.
- “The feasibility of coin motors for use in a vibrotactile display for the blind.”, Stronks HC, Barnes N, Parker D, Walker JG, Artificial Organs, accepted Sept 2014.
- “First-in-Human Trial of a Novel Suprachoroidal Retinal Prosthesis”, L N Ayton, P J Blamey, R H Guymer, C D Luu, D A X Nayagam, N C Sinclair, M N Shivdasani, J Yeoh, M F McCombe, R J Briggs, N L Opie, J Villalobos, P N Dimitrov, M Varsamidis, M A Petoe, C D McCarthy, J G Walker, N Barnes, A N Burkitt, C E Williams, R K Shepherd, P J Allen, for the Bionic Vision Australia Research Consortium, PLOS ONE, Dec 18, 2014, DOI: 10.1371/journal.pone.0115239.
- “A new theoretical approach to improving face recognition in disorders of central vision: Face caricaturing.”, Irons J, McKone E, Dumbleton R, Barnes N, He X, Provis J, Ivanovici C, Kwa A., Journal of Vision, 14(2), Feb 17, 2014.
- “The Role of Computer Vision in Prosthetic Vision”, N Barnes, Image and Vision Computing, 20, 478-479, 2012.
- “Estimating Relative Camera Motion from the Antipodal-Epipolar Constraint”, J Lim, N Barnes, H Li, IEEE Trans Pattern Analysis and Machine Intelligence, 32(10), Oct 2010, pp 1907-1914.
- “Estimation of the Epipole Using Optical Flow at Antipodal Points”, J Lim and N Barnes, Computer Vision and Image Understanding, 114(2), Special issue on Omnidirectional Vision, Camera Networks and Non-conventional Cameras, pp 245-253, Feb, 2010.
Fully refereed full-length conference papers
- “Importance Weighted Image Enhancement for Prosthetic Vision: An Augmentation Framework”, by C McCarthy, and N Barnes, in Int Symp on Mixed and Augmented Reality, ISMAR’14, Sept, 2014.
- “Large-Scale Semantic Co-Labeling of Image Sets”, by J M Alvarez, M Salzmann, and N Barnes, in Winter Applications of Computer Vision (IEEE-WACV), March, 2014.
- “Exploiting Sparsity for Real Time Video Labelling”, by L Horne, J M Alvarez, and N Barnes, in Computer Vision Technology: from Earth to Mars, Workshop at the International Conference on Computer Vision, Sydney, Australia, Dec, 2013. (Best Paper)
- “Learning Structured Hough Voting for Joint Object Detection and Occlusion Reasoning”, by T Wang, X He and N Barnes, in Proc IEEE-CVPR, Portland Oregon, USA, June, 2013
- “Glass object segmentation by label transfer on joint depth and appearance manifold”, by T Wang, X He and N Barnes, in Proc ICIP, Melbourne Australia, 2013
- “An overview of vision processing approaches in implantable prosthetic vision”, by N Barnes, in Proc ICIP, Melbourne Australia, 2013
- “Augmenting Intensity to enhance scene structure in prosthetic vision”, by C McCarthy, D Feng, N Barnes, in Proc Workshop Multimodal and Alternative Perception for Visually Impaired People, San Jose, USA, July, 2013. (Best Paper)
- “The Role of Vision Processing in Prosthetic Vision”, by N Barnes, X He, C McCarthy, L Horne, J Kim, A F Scott, and P Lieby, in Proc IEEE EMBC, August, 2012
- “Time-To-Contact Maps for Navigation with a Low Resolution Visual Prosthesis”, by C McCarthy and N Barnes, in Proc IEEE EMBC, August, 2012
- “Image Segmentation for Enhancing Symbol Recognition in Prosthetic Vision”, by L Horne, N Barnes, C McCarthy, X He, in Proc IEEE EMBC, August, 2012
- “On Just Noticeable Difference for Bionic Eye”, Yi Li, C McCarthy, and N Barnes, in Proc IEEE EMBC, August, 2012
- “Text Image Processing for Visual Prostheses”, by S Wang, Yi Li and N Barnes, in Proc IEEE EMBC, August, 2012
- “A Face-Based Visual Fixation System for Prosthetic Vision”, by X He, J Kim, and N Barnes, in Proc IEEE EMBC, August, 2012
- “Phosphene Vision of Depth and Boundary from Segmentation-Based Associative MRFs”, by Y Xie, N Liu, and N Barnes, in Proc IEEE EMBC, August, 2012
- “Ground surface segmentation for navigation with a visual prosthesis”, C. McCarthy, N. Barnes and P. Lieby, 2011 IEEE Conference on Engineering in Medicine and Biology (EMBC 2011).
- “Substituting Depth for Intensity and Real-Time Phosphene Rendering: Visual Navigation under Low Vision Conditions”, by P Lieby, N Barnes, C McCarthy, N Liu, H Dennet, J Walker, V Botea, and A Scott, in Proc 33rd Annual International IEEE Engineering in Medicine and Biology Society Conference, (IEEE-EMBS), Boston, USA, Aug 2011
- “Surface extraction from iso-disparity contours”, C. McCarthy and N. Barnes, 2010 Asian Conference on Computer Vision (ACCV 2010).
Conference abstracts
- “Lanczos2 Image Filtering Improves Performance on Low Vision Tests in Implanted Visual Prosthetic Patients”, by N. Barnes, A. F. Scott, A. Stacey, P. Lieby, M. Petoe, L. Ayton, M. Shivdasani, N. Sinclair, J G. Walker, Proceedings of the Association for Research in Vision and Ophthalmology annual meeting (ARVO 2014), Orlando, Florida, USA, May, 2014
- “Caricaturing improves face recognition in simulated age-related macular degeneration”, Elinor McKone, Jessica Irons, Xuming He, Nick Barnes, Jan Provis, Rachael Dumbleton, Callin Ivanovici, Alisa Kwa, VSS, 2013
- “Evaluating Lanczos2 image filtering for visual acuity in simulated prosthetic vision”, by P. Lieby, N. Barnes, J G. Walker, A F. Scott and L. Ayton, Proceedings of the Association for Research in Vision and Ophthalmology annual meeting (ARVO 2013), Seattle, USA, May, 2013
- “Low contrast trip hazard avoidance with simulated prosthetic vision”, by C. McCarthy, P. Lieby, J G. Walker, A F. Scott, V. Botea and N. Barnes, Proceedings of the Association for Research in Vision and Ophthalmology annual meeting (ARVO 2012), Fort Lauderdale, FL USA, 2012. (oral presentation)
- “Evaluating Depth-based Visual Representations For Mobility In Simulated Prosthetic Vision”, by N. Barnes, P. Lieby, J G. Walker, C. McCarthy, V. Botea and A F. Scott, Proceedings of the Association for Research in Vision and Ophthalmology annual meeting (ARVO 2012), Fort Lauderdale, FL USA, 2012
- “Mobility Experiments Using Simulated Prosthetic Vision With 98 Phosphenes Of Limited Dynamic Range”, by P. Lieby, N. Barnes, C. McCarthy, V Botea, A F. Scott and J G. Walker, Proceedings of the Association for Research in Vision and Ophthalmology annual meeting (ARVO 2012), Fort Lauderdale, FL USA, 2012
- “Orientation and mobility considerations in Bionic Eye Research”, by Lauren Ayton, Sharon Haymes, Jill Keeffe, Chi Luu, Nick Barnes, Paulette Lieby, Janine G. Walker, Robyn Guymer, 14th International Mobility Conference, Palmerston North, New Zealand, Feb, 2012
- “Mobility Experiments with simulated vision and sensory substitution of depth”, N Barnes, P Lieby, H Dennet, C McCarthy, N Liu, J G Walker, ARVO, 2011
- “Face detection and tracking in video to facilitate face recognition with a visual prosthesis”, Xuming He, Chunhua Shen, N Barnes, ARVO, 2011
- “Investigating the role of single-viewpoint depth data in visually-guided mobility”, N Barnes, P Lieby, H Dennet, J G Walker, C McCarthy, N Liu, Yi Li, VSS, 2011.
- “Object detection for bionic vision”, by L Horne, N Barnes, X He and C McCarthy, 2nd Int Conf on Medical Bionics: Neural Interfaces for Damaged Nerves, Phillip Island, Vic, Australia, Nov, 2011
- “Automatic Face Zooming and Its Stability Analysis on a Phosphene Display”, by J Kim, X He and C McCarthy, 2nd Int Conf on Medical Bionics: Neural Interfaces for Damaged Nerves, Phillip Island, Vic, Australia, Nov, 2011
- “The impact of environment complexity on mobility performance for prosthetic vision using the visual representation of depth”, by J G Walker, N Barnes, P Lieby, C McCarthy and H Dennet, 43rd Ann Sci Congress of the Royal Australian and New Zealand College of Ophthalmologists – Sharing the Vision, Canberra, ACT, Australia, Nov, 2011
- “Expectations of a visual prosthesis: perspectives from people with impaired vision”, by J E Keeffe, K L Francis, C D Luu, N Barnes, E L Lamoureaux, R H Guymer, ARVO, 2010
Awards
- CSIRO Digital and National Facilities Science Excellence Award, 2017.
- 2 ACT iAwards, National Merit Recipient, 2017.
- 2016 iAwards for Big Data Innovation of the Year (Tas), AIIA.
- 2016 PR Week’s Global Impact Award, received in London.
- 2017 Best Paper Award, Int Conf on Digital Image Computing: Techniques and Applications (DICTA).
- Best Paper, 2013, Proc Workshop Multimodal and Alternative Perception for Visually Impaired People
- Direct email of congratulations on results.
Our highly skilled team of world-class researchers and engineers is open to partnerships and collaborations for research, development, and commercialisation.
Contact us to learn more.