The IMAGE Project

Making internet graphics accessible through rich audio and touch

Have you ever wanted to hear a photograph or pie chart, rather than just a description of what is in it? The IMAGE project (Internet Multimodal Access to Graphical Exploration) adds controls to graphics in your browser, letting you activate them on photographs and some charts and maps on any webpage you visit. You then receive experiences that combine spoken words with other sounds positioned around your head, conveying details such as where things are, how large they are, and other information depending on the graphic. We are also working on touch experiences using add-on hardware, including the Haply 2diy force-feedback device and the Humanware Monarch braille pin-array tablet, but those IMAGE experiences are not yet released. Our goal is to give people who are blind or have low vision a new and useful experience of internet graphics that goes beyond automatically generated alt text.

To learn how IMAGE can practically help you, visit the Using IMAGE page. To install the IMAGE browser extension and try it out, keep reading: the extension is fully supported on Chrome, and also works on Microsoft Edge, Brave, Safari, and Opera for desktop and laptop computers. To use IMAGE, install the extension in your web browser by activating the following button, which opens the IMAGE Project page on the Chrome Web Store.

Install IMAGE extension (any Chrome-based browser)

If you are a developer interested in using IMAGE, start here.

If you are a researcher interested in using IMAGE, find out more about our project here.


A man who is blind or has low vision, wearing a sweater and headphones, sitting in front of a computer in a library, reading a braille book.

Our Approach

We use rich audio (sonification) together with the sense of touch (haptics) to provide a faster and more nuanced experience of graphics on the web. For example, by using spatial audio, where the user experiences sound moving around them through their headphones, information about the spatial relationships between objects in a scene can be conveyed quickly, without reading long descriptions. In addition, rather than offering only the passive experience of listening to audio, an optional haptic device lets the user literally feel aspects such as the regions of a landscape, the objects found in a photo, or the trend of a line on a graph. This permits interpretation of maps, charts, and photographs in which the visual experience is replaced with multimodal sensory feedback, rendered in a manner that helps overcome access barriers for users who are blind, deafblind, or partially sighted.
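To make the sonification idea concrete, here is a minimal sketch of mapping an object's horizontal position in an image to a stereo-panned tone, using only the Python standard library. This is an illustration of the general technique, not the IMAGE implementation (which renders richer spatial audio in the browser); the function names, the constant-power panning law, and the example object positions are all assumptions made for the sketch.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100  # CD-quality audio, samples per second

def pan_tone(x_norm, freq=440.0, duration=0.5):
    """Generate stereo PCM frames for a tone panned by an object's
    normalized horizontal position (0.0 = far left, 1.0 = far right)."""
    # Constant-power panning: keeps perceived loudness roughly steady
    # as the sound moves from one ear to the other.
    angle = x_norm * math.pi / 2
    left_gain, right_gain = math.cos(angle), math.sin(angle)
    frames = []
    for n in range(int(SAMPLE_RATE * duration)):
        s = math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
        frames.append(struct.pack(
            '<hh',
            int(s * left_gain * 32767 * 0.5),   # left channel, 16-bit
            int(s * right_gain * 32767 * 0.5),  # right channel, 16-bit
        ))
    return b''.join(frames)

def render_scene(object_positions, path='scene.wav'):
    """Write one panned tone per detected object, played in sequence,
    so a listener hears each object at its left/right position."""
    with wave.open(path, 'wb') as w:
        w.setnchannels(2)       # stereo
        w.setsampwidth(2)       # 16-bit samples
        w.setframerate(SAMPLE_RATE)
        for x in object_positions:
            w.writeframes(pan_tone(x))

# Hypothetical scene: objects near the left edge, center, and right edge.
render_scene([0.1, 0.5, 0.9])
```

Playing the resulting file on headphones, the three tones are heard left, center, and right, sketching the scene's layout far faster than a sentence-by-sentence description could.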


Engaging the Community

Collaborating with the community is key when creating accessible technology. Our team is partnering with Gateway Navigation CCC Ltd and the Canadian Council of the Blind (CCB), a consumer organization of Canadians who are blind, to ensure that our system is in line with the needs of the community. As part of our co-design approach, we are in regular contact with community members, who help guide the development process, but there is always room for more voices. If you'd like to contribute to the project, we invite you to fill out our community survey.

Participate in our community survey.

The arms of two individuals in a warm handshake


Contact Us

For any information related to the project, you can contact image@cim.mcgill.ca. Follow our lab on Twitter and LinkedIn.