Sounds
This is going to be a short post. This morning I woke up wanting to try something new. While listening to a recent podcast about sound and voice, I came up with the idea of making a map for blind people. I found out that there is some research on routing and direction maps for people who are blind or have limited vision, but what about thematic webmaps?
In less than an hour I learned how to use the SpeechSynthesis API to turn text into speech. I used Spanish from this list of available languages, but the voice still sounds like a robot. The interesting part, as discussed in the podcast, would be making the voice sound as human as possible. These are the key lines of code:
const msg = new SpeechSynthesisUtterance(region); // the text to speak
msg.lang = 'es';                                  // request a Spanish voice
window.speechSynthesis.speak(msg);                // queue it for playback
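For reference, here is a slightly fuller sketch of the same idea that also tries to pick an installed Spanish voice. The speakRegion helper name is mine, not from the original code; the rest uses standard Web Speech API calls (getVoices, SpeechSynthesisUtterance):

// Hypothetical helper: speak a region name, preferring a Spanish voice.
function speakRegion(region) {
  const msg = new SpeechSynthesisUtterance(region);
  msg.lang = 'es';
  // getVoices() may be empty until voices have loaded asynchronously;
  // if no Spanish voice is found, the browser default for `lang` is used.
  const esVoice = window.speechSynthesis.getVoices().find((v) => v.lang.startsWith('es'));
  if (esVoice) msg.voice = esVoice;
  window.speechSynthesis.speak(msg);
}
speakRegion('Andalucía'); // example call with a hypothetical region name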
Then I built a webmap with interactivity using CARTO VL (though any other webmapping library would work), using what I had just learned to speak a property when the user hovers over a feature.
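As a sketch of how that hookup might look with CARTO VL's Interactivity API: the layer setup and the @region viz variable are assumptions for illustration, and featureEnter is used instead of featureHover so each feature is spoken once on entry rather than on every mouse move:

// Assumes a layer whose viz exposes the property as a variable,
// e.g. viz: @region: $region  color: ramp($region, bold)
const interactivity = new carto.Interactivity(layer);
interactivity.on('featureEnter', (event) => {
  if (event.features.length === 0) return;
  window.speechSynthesis.cancel(); // stop any utterance still playing
  speakRegion(event.features[0].variables.region.value);
});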
The only problem is that, since last year, Google Chrome only allows sound after a user interaction. So in order to make it work, we need to click before hovering…
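One possible workaround, my own sketch rather than anything from the post: Chrome ties speechSynthesis.speak() to user activation, so a one-time click handler that speaks an empty utterance should unlock audio for the later hover-triggered calls:

// Unlock speech on the first click anywhere on the page.
document.addEventListener(
  'click',
  () => window.speechSynthesis.speak(new SpeechSynthesisUtterance('')),
  { once: true }
);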
This is just an example with a category map, but the same approach applies to choropleths or proportional symbols. Of course, this map alone is not enough to offer a satisfactory experience for blind or low-vision users; that would require a complete UX designed around it.