David Eagleman is a neuroscientist and NYT bestselling author, but I appreciate him most for his talent as a science communicator - using straightforward language to make ideas about neuroscience widely accessible. Livewired is his latest book, in which he attempts to change the narrative around how we think about brain plasticity.
Let's think about human-built technology - an Apple Watch, for instance. We buy a specific model (hardware) and install apps (software) to help us accomplish whatever we need to do. The system is only flexible at the software level: apps adapt what the watch can do, but the hardware is fixed. If a new model comes out with a capability ours lacks, we can't just install an app or update our watch. We would need new hardware.
This is where the hardware-software metaphor breaks down for understanding the brain. Unlike our technology, our neurons (hardware) are constantly redesigned by input through our senses (software). If we encounter a problem that we're not currently equipped for, we don't need a new brain - it can change itself dramatically to meet the demands.
Eagleman introduces a new term to describe a dynamic, adaptable, information-seeking system like our brain - a “livewired” system.
To illustrate this, consider our five senses: touch, smell, sight, hearing, and taste. You might think that there is something special about them, but to our brain, it really doesn't matter what kind of sensory organs we have.
Eagleman calls this the Potato Head model of evolution - our sensory organs are just plug-and-play peripheral devices.
“Whatever information the brain is fed, it will learn to adjust to it and extract what it can, as long as the data have a structure that reflects something important about the outside world.”
In other words, our senses are arbitrary. Over the course of evolution, random mutations led to the senses we have and the brain simply figured out how to use them.
Here's why this matters:
What if you could stream the sound environment around you and convert it into touch on your skin? Would you be able to hear through your touch?
Turns out, the answer is yes. I briefly touched on this in an earlier newsletter, but Eagleman's startup Neosensory created the Buzz, a wearable wristband that enables deaf people to hear the world via vibrations.
Here is one user's experience:
“Philip reports he can tell when his dogs are barking, or the faucet is running, or the doorbell rings, or his wife calls his name (something she never used to do, but does routinely now).”
Importantly, Philip isn’t just consciously interpreting the buzzing on his wrist. After some training, he has the sensation of distinct sounds in his head! In other words, the brain has learned how to translate the vibrations into perceived sound in real time.
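Eagleman doesn't detail the Buzz's actual algorithm, but the basic idea of a sound-to-touch mapping is easy to sketch. Here's a toy Python illustration (the function name `frame_to_motors` and all parameters are my own invention, not Neosensory's): it splits an audio frame's frequency spectrum into bands and drives one vibration motor per band, so different pitches produce different patterns on the skin.

```python
import numpy as np

def frame_to_motors(frame, sample_rate=16_000, n_motors=4):
    """Map one audio frame to vibration intensities for n_motors motors.

    Toy illustration: split the spectrum into log-spaced frequency
    bands, one band per motor, and use each band's average energy
    as that motor's intensity.
    """
    # Window the frame to reduce spectral leakage, then take the FFT
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1 / sample_rate)

    # Log-spaced band edges between 100 Hz and the Nyquist frequency
    edges = np.geomspace(100, sample_rate / 2, n_motors + 1)
    intensities = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        band = spectrum[(freqs >= lo) & (freqs < hi)]
        intensities.append(band.mean() if band.size else 0.0)

    intensities = np.array(intensities)
    peak = intensities.max()
    # Normalize to 0..1 so the strongest band buzzes at full strength
    return intensities / peak if peak > 0 else intensities

# Example: a 440 Hz tone should mostly drive a low-frequency motor
t = np.arange(512) / 16_000
tone = np.sin(2 * np.pi * 440 * t)
print(frame_to_motors(tone))
```

The brain's job, on Eagleman's account, is the hard part: given a stream of these patterns, it learns on its own which vibrations correspond to a doorbell, a barking dog, or a spoken name.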
If you're curious to read about the experiences of deaf people using the Buzz device, check out the startup's blog.
Beyond expanding senses to take in more of what they normally do, could we create entirely new senses?
“Given our current knowledge, there’s no end to imagining the sensory expansions we’ll build: vision in other parts of the electromagnetic spectrum, or ultrasonic hearing, or being plugged into the invisible states of your body’s physiology.”
Here are a few wild possibilities:
- a surgeon feeling patient data so that they don't have to look up at the monitor during an operation
- a trader perceiving real-time data from the stock market
- a long-distance couple feeling each other's biometric data
You might not want any of these applications in your own life, but they're only meant to help you imagine the landscape of possibilities.
What will these things feel like? It’s impossible to describe. This much is clear: it won’t just feel like a buzz on your wrist. Over time, your brain will internalize the new data stream as an internal sense, much like sight or hearing.
Friday Brainstorm Newsletter
If you're curious for more, you should definitely join the 250+ other people on my Friday Brainstorm newsletter. It’s two emails a month with everything interesting I’ve read or found related to neuroscience.