Can a Digital Reality Be Jacked Directly Into Your Brain? - WIRED
Nov 24, 2021
He's wearing a cap that looks like it's made of gauze bandages.

His fusiform gyri, meandering ridges that run along the bottom of the brain on each side, are studded with electrodes.

But the electrodes also offer a rare opportunity—not just to read signals from the brain but to write them to it.

Turning photons into sight, air-pressure fluctuations into sound, aerosolized molecules into smells—that takes however long your imperfect sensory organs need to receive the signals, transduce them into the language of the brain, and pass them on to the shrublike networks of nerve cells that compute the incoming data.

From this perspective, it's not actually hard to incept a sensory experience—a percept—into someone's head.

Neurosurgeons are pretty good at implanting electrodes.

If you're trying to trick a mind into perceiving a constructed input as reality, you have to understand what individual neurons do, what big gobbets of lots of neurons do, and how they all relate to each other.

Sixteen years ago, Christof Koch, chief scientist at the Allen Institute for Brain Science, helped run a now famous study showing that neurons in a part of the brain called the medial temporal lobe respond to what a wordsmith would identify as nouns—persons, places, or things.

“For a Matrix-like technology, you would have to understand the trigger feature of each individual neuron, and there are 50,000 to 100,000 neurons in a piece of brain the size of a grain of rice.” Without that catalog, you might be able to make someone “see flashes of light or motion,” he says, but they'll “never see Father Christmas.”

The researchers do it by stimulating an area called V1, which is part of the visual cortex, a patch of neurons at the back of every primate's head.

Put an array of electrodes into V1, Roelfsema says, and “you can work with it like a matrix board.

If you have 1,000 electrodes, you basically have 1,000 light bulbs that you can light up in digital space.” The team could stimulate the electrodes in the shape of an A or a B, and the monkeys could indicate they saw the difference.
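
To make the matrix-board analogy concrete, here is a minimal Python sketch of the bookkeeping involved, assuming a hypothetical 32-by-32 electrode grid and a stand-in stimulate() call; a real implant adds timing, amplitude, and safety constraints this toy ignores.

    import numpy as np

    GRID = 32  # hypothetical electrode grid: 32 x 32, roughly 1,000 "light bulbs"

    def stimulate(idx: int) -> None:
        # Stand-in for whatever driver call a real implant exposes; here it just logs.
        print(f"pulse electrode {idx}")

    def letter_to_electrodes(bitmap: np.ndarray) -> list[int]:
        # Map a GRID x GRID binary bitmap of a letter to flat electrode indices.
        rows, cols = np.nonzero(bitmap)
        return (rows * GRID + cols).tolist()

    def stimulate_pattern(bitmap: np.ndarray) -> None:
        # Pulse every electrode whose "pixel" is switched on in the bitmap.
        for idx in letter_to_electrodes(bitmap):
            stimulate(idx)

    # A crude capital "I": one vertical stroke down the middle column.
    letter_i = np.zeros((GRID, GRID), dtype=int)
    letter_i[4:28, GRID // 2] = 1
    stimulate_pattern(letter_i)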

It might look like bitmapped Minecraft going in, but brains are very good at adapting to new kinds of sensory data.

Still, to get enough points to make lines and shapes and other useful stimuli, you need lots and lots of electrodes, and the electrodes need to be very precisely targeted.

That's true for any electrode-based approach to sending apprehensible signals into the brain, not just glittery phosphene shapes.

Get that timing wrong and adjacent electrical pings don't look like shapes—they look like one big smear, or like nothing at all.

“There's a fundamental asymmetry between brain reading and brain writing,” says Jack Gallant, a neuroscientist at UC Berkeley.

The signals you see when a brain is doing brain-things aren't actually thought; they're the exhaust the brain emits while it's thinking.

True, Kanwisher's team lit up a large face-recognizing area of the brain and got someone to see a face, kind of.

The distant future might offer wirelessly networked microchips the size of a grain of sand, or sheets embedded with 100 million electrodes, each one connected to its own processor like the pixels in a television.

“You're probably lighting up more than one neuron.” Each electrode is like a lighthouse on a foggy night: It's illuminating the rocky shoals, sure, but the light also attenuates and diffracts through the fog.
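
The fog in that analogy can be put in rough numbers. The back-of-the-envelope Python below uses the common approximation that the threshold current for firing a neuron grows with the square of its distance from the electrode; the constant, the pulse amplitude, and the neuron density here are illustrative assumptions, not measurements from the studies described in this piece.

    import math

    K = 1000.0        # µA/mm^2: illustrative current-distance constant (assumption)
    AMPLITUDE = 50.0  # µA: illustrative pulse amplitude (assumption)
    DENSITY = 30_000  # neurons/mm^3: illustrative cortical density (assumption)

    # Everything closer than this radius is assumed to fire: I_threshold = K * d^2.
    radius_mm = math.sqrt(AMPLITUDE / K)
    volume_mm3 = (4.0 / 3.0) * math.pi * radius_mm ** 3
    recruited = DENSITY * volume_mm3

    print(f"activation radius ~{radius_mm:.2f} mm, roughly {recruited:.0f} neurons recruited")
    # With these made-up numbers: about 0.22 mm and on the order of a thousand
    # neurons per pulse, which is why one electrode never lights up a single cell.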

That's incredibly useful, because it's also how neurons work—conducting ions and the electrical charge they carry.

This bit of engineering gave neuroscientists the ability to control specific kinds of neurons with different-colored lasers—to turn them on and off with a careful pew-pew.

The technique is great for studying what different neurons do.

Researchers can genetically implant their ion gates into entire networks of neurons, including many of the brain's myriad cell types, in a somewhat less damaging, somewhat less physically invasive way than jamming a plug in there.

(Flip side is, it's hard to get the light to penetrate deeply unless you jam a fiber in there.) In some cases, using a different technique, the cells can also be made to fluoresce under a light source, allowing a researcher with a microscope to watch the brain at work.

When the scientists shine the right kinds of light on the olfactory bulb at the right times, the mouse smells (or acts like it's smelling) what they call a “synthetic odor.”

Maybe it's pleasant.

It's not a world.

In “What Is It Like to Be a Bat?,” an often-cited essay from 1974, the philosopher Thomas Nagel argued that every conscious creature's experiences are individuated, unique to the animal and its brain.

Even if we were actual cyborgs with plugs in the back of our heads and electrodes and optical fiber in our cortices, ready to receive digital red pills full of glowing green kanji, my brain would interpret all that input differently than your brain would.
