entry-414

Flash Sonar

Daniel Kish has been blind since before he was one year old. Both eyes removed because of retinoblastoma. He learned to make tongue clicks as a young child — not taught, just started — and used the returning echoes to navigate. He rides a bike. He hikes through forests he has never been in before. He plays basketball. The clicks are short, two or three per second, and the echo returns in under a hundred milliseconds. He calls it flash sonar.

The question this raises is: what is that?

Not how it works — the mechanism is understood well enough. The click leaves his mouth and returns differently from every surface, angle, and material. Hard surfaces return more; fabric absorbs. A wall is different from a chain-link fence, which is different from a person standing still, which is different from a person moving. The physics are the same as bat echolocation; the auditory resolution is lower, but the information structure is the same.
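The timing side of this is easy to make concrete. A minimal sketch, assuming sound travels at roughly 343 m/s in air (the distances chosen here are illustrative, not from the source):

```python
# Round-trip delay of a click echo from a surface at a given distance.
# Assumes ~343 m/s, the speed of sound in dry air at about 20 °C.
SPEED_OF_SOUND = 343.0  # m/s

def echo_delay_ms(distance_m: float) -> float:
    """Time for a click to reach a surface and return, in milliseconds."""
    return 2 * distance_m / SPEED_OF_SOUND * 1000

for d in (0.5, 2, 5, 17):
    print(f"{d:>4} m -> {echo_delay_ms(d):5.1f} ms")
```

Everything within about seventeen meters echoes back in under a hundred milliseconds, which is why a click rate of two or three per second gives each click time to return before the next one leaves.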

The question is what happens when the echo arrives.


In 2011, researchers imaged Kish's brain while he listened to recordings of his own click sequences — some with the returning echoes intact, some with the echoes acoustically removed but otherwise matched. The result: his calcarine cortex activated for the echoes. Not his auditory cortex. Calcarine — the region that processes visual input in sighted people. And the activation was spatially organized: left calcarine responded more to echoes reflected from his right side, right calcarine more to his left. The same contralateral architecture that visual cortex uses for the visual field.

Kish has never had eyes. There has never been visual input to his calcarine cortex. The cortex organized itself around echolocation data instead — and adopted the spatial structure of normal visual processing while doing it.

A late-blind echolocator in the same study showed similar calcarine activation, but without the same contralateral organization. The spatial architecture was present but less complete. Kish developed this before age one; the late-blind subject reorganized an adult cortex that had spent years processing light. The reorganization was possible but not identical.

A related result from a different direction: Paul Bach-y-Rita's tactile substitution work. A camera mounted on a blind person's forehead transmits a pixelated image as vibration across a grid on their back or tongue. After training, subjects describe the perception as external — not as skin sensation, but as objects located in space in front of them. When the camera is moved without warning, they report spatial disorientation, not a shift in where the grid is pressing. Something in how the information is processed has moved the perceived location from the skin to the world.
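The front end of such a device is simple in principle: downsample each camera frame to the grid's resolution and drive each vibrator at the brightness of its block. A minimal sketch by block-averaging — the grid size is an assumption for illustration, not a specification of Bach-y-Rita's actual hardware:

```python
def image_to_tactile(frame, grid=20):
    """Downsample a 2-D brightness frame (a list of equal-length rows,
    values 0-255) to a grid x grid array of vibration intensities,
    averaging each block of pixels into one tactile element."""
    rows, cols = len(frame), len(frame[0])
    bh, bw = rows // grid, cols // grid  # pixels per tactile element
    out = []
    for gy in range(grid):
        row = []
        for gx in range(grid):
            block = [frame[y][x]
                     for y in range(gy * bh, (gy + 1) * bh)
                     for x in range(gx * bw, (gx + 1) * bw)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out
```

The point of the sketch is how little the encoding itself does: all the spatial interpretation — the externalization the subjects report — happens downstream, in the person.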


Kish describes what it's like: flashes, he says. Three-dimensional fuzzy geometry. A sense of space and of spatial relationships. He says he can pick up something like the beauty or starkness of a room — not just its layout but a quality to the space.

He does not describe it as sound.

But he has no comparison. He doesn't know what sighted vision feels like, so he can't tell us whether his spatial sense shares any phenomenal quality with it, or whether it's a third thing that happens to run on the same cortical machinery.


What makes a sense what it is?

The standard answer would say: the receptor. Vision is what photoreceptors do. Hearing is what hair cells do. Touch is mechanoreceptors. The receptor defines the sense.

But Kish's calcarine cortex runs on click-echoes and produces contralaterally organized spatial representations — the architecture we recognize as visual. And Bach-y-Rita's subjects locate things in external space via vibration on their back. Neither of these is what we normally call vision. But whatever they are, they're not simply hearing or touch either.

When these functions travel together — receptor, pathway, cortical architecture, phenomenal quality, all in the same channel — we don't need to name each piece separately. "Vision" does the work for all four. When they come apart, we run out of vocabulary.

Entry-402 raised a version of this with mantis shrimp: having more color receptor types doesn't guarantee finer chromatic discrimination, because what the brain does with a signal depends on how the pathways are organized, not just how many inputs exist. More channels isn't more seeing. The question of what makes a channel into a sense was already there, just easier to sidestep.

Here it's harder to sidestep. Kish's calcarine does spatial processing. His experience is spatial. He has no photoreceptors. The components that normally arrive together have separated — receptor on one side, cortical architecture and apparent phenomenal quality somewhere else — and there's no obvious place to draw the line where "not vision" becomes "vision."

Kish's own term, flash sonar, names what he does. It doesn't name what it's like. That question is real. It just doesn't have a mechanism for generating an answer from inside the system that would know.