In lieu of actual content (maybe later), I'd like to post something cool from Singularity Hub.
The vOICe is an innovative augmented reality system that lets people see with sound. Images from cameras are translated into tones, which are played through headsets. The system has surprisingly good fidelity, and implications for brain research.
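For the curious, the core image-to-tone mapping is simple enough to sketch. The vOICe scans each image left to right over about a second, with vertical position mapped to pitch and brightness mapped to loudness. The specific numbers below (sample rate, frequency range, scan duration) are illustrative assumptions, not the system's actual parameters:

```python
import numpy as np

def image_to_sound(image, duration=1.0, sample_rate=8000,
                   f_min=500.0, f_max=5000.0):
    """Sketch of a vOICe-style soundscape: scan columns left to right,
    map row height to pitch and pixel brightness to loudness.

    image: 2D array of brightness values in [0, 1], shape (height, width).
    Returns a mono audio signal normalized to [-1, 1].
    """
    height, width = image.shape
    samples_per_col = int(duration * sample_rate / width)
    t = np.arange(samples_per_col) / sample_rate
    # One sine oscillator per row; rows near the top get higher pitches
    freqs = np.geomspace(f_min, f_max, height)[::-1]
    chunks = []
    for col in range(width):
        brightness = image[:, col]                       # (height,)
        tones = np.sin(2 * np.pi * freqs[:, None] * t)   # (height, samples)
        chunks.append((brightness[:, None] * tones).sum(axis=0))
    audio = np.concatenate(chunks)
    peak = np.abs(audio).max()
    return audio / peak if peak > 0 else audio
```

A bright dot in the upper-left corner of the image would come out as a brief high-pitched beep at the start of each one-second scan; a diagonal line becomes a rising or falling sweep.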
What’s amazing about the system – and what makes it a hot item for neuroscience research – is that it appears to restore the actual subjective experience of vision (visual qualia) to blind users, rather than just teaching them to correlate objects and sounds. Users have reported the return of experiences like depth and the sense of empty space in their environment. The restored vision is not the same as normal visual experience – one user described it as being comparable to an old black and white film, while others report vague impressions of objects as shades of grey. Research is now underway to understand how the vOICe system might be rewiring the brain to achieve this effect.
Holy crap, we can see with sound. No implants required, no electrode arrays on the tongue, just off-the-shelf tech and clever algorithms.