Proposed hippocampal prosthesis. The biomimetic model is a microchip that transforms hippocampal input into output signals that mimic what the damaged hippocampus would have produced.
Brain chips. Long a staple of science fiction, these wafers of silicon circuitry that interface with the nervous system are quickly becoming a reality, thanks to pioneering new work in neuroengineering.
You’re familiar with prosthetic hands, legs, and other devices used to replace missing limbs. Recent research brings us closer to creating “neural prostheses” that replace or bypass damaged or malfunctioning brain tissue with microchips designed to replicate the lost function.
Neurostimulate in style with the Foc.us tDCS headset.
What if you could juice your ability to focus, learn a foreign language, or solve math problems by wearing a comfortable headset? How about one that sends electric current through critical brain regions?
Though it sounds like science fiction, the technology is real. Transcranial direct current stimulation (tDCS) has shown promise in treating patients with a variety of brain disorders, including Parkinson’s disease, epilepsy, and chronic pain. However, it’s the possibility of widespread use as a cognitive enhancer in healthy individuals that has generated the most buzz (and concern).
Should do-it-yourself electrical brain stimulation by the general public be encouraged, or even allowed? Can it really be safe?
Illuminating brain function with optogenetic tools. Credit: Neurosurgery Journal
A great deal of neuroscience research involves measuring the brain activity that accompanies certain behaviors or experiences. For example, a scientist might record the activity of a few brain cells while the research subject (probably an animal, possibly a human) is shown pictures of various objects, with the goal of identifying the cells’ preferences and hopefully learning something new about how vision is accomplished in the brain.
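As a toy illustration of that kind of recording experiment (the object categories and firing rates here are invented for the sketch, not real data), one can simulate counting a cell’s spikes across repeated presentations of each object and identifying the one it responds to most:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical experiment: show each object many times, count the cell's
# spikes per trial, and ask which object it "prefers" (fires most for).
objects = ["face", "house", "hand", "tool"]
true_rates = {"face": 30, "house": 5, "hand": 8, "tool": 6}  # spikes/s, invented

# Average spike count over 20 simulated trials per object (Poisson noise).
mean_counts = {obj: rng.poisson(true_rates[obj], size=20).mean() for obj in objects}
preferred = max(mean_counts, key=mean_counts.get)
print(preferred)  # the cell's apparent preference
```

Real experiments of this kind average over many trials exactly because single spike counts are noisy; the preference only becomes clear statistically.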
This approach has generated tremendous insights into how the brain works. But to get a complete picture of brain function, we want not just to measure brain activity but also to manipulate it directly and see how this affects behavior. By turning certain brain cells or networks of cells on or off, we can establish precisely what role they play in the broader system and determine whether they have a role in causing specific behaviors.
A new method called “optogenetics” allows scientists to control the activity of groups of neurons with unprecedented precision simply by shining light on them. It’s also illuminating pathways to treatments for a host of brain disorders.
“The Avatar program will develop interfaces and algorithms to enable a soldier to effectively partner with a semi-autonomous bi-pedal machine and allow it to act as the soldier’s surrogate.” Sounds a bit like the movie Avatar, but with mind-controlled robots instead of genetically engineered alien bodies. Unlike the project depicted in the movie, though, this program is entirely real and presently funded by the US military.
In a recent article I discussed the history, rapid recent development, and future potential of brain-machine interfaces (BMIs). These devices record the ongoing activity in parts of the brain important for planning and executing bodily movement, decode the pattern of recorded activity to determine the brain’s intended movement, then use the decoded signals to control a machine, such as a robotic arm. This technology holds the potential to transform the lives of millions who suffer from lost limbs or movement difficulties resulting from spinal cord injury or diseases like ALS that prevent the brain from enacting its wishes on the body.
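To make the decode step concrete, here is a toy sketch of one classic approach, a population-vector decoder, in which simulated cosine-tuned neurons “vote” for their preferred movement directions. This is a simplified illustration with invented numbers, not the algorithm any particular lab uses:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy population: each neuron fires most for its own "preferred" direction.
n_neurons = 100
preferred = rng.uniform(0, 2 * np.pi, n_neurons)  # preferred angles (radians)
baseline, gain = 10.0, 8.0                        # invented tuning parameters

def firing_rates(intended_angle):
    """Cosine-tuned spike counts with Poisson noise (a common simplification)."""
    return rng.poisson(baseline + gain * np.cos(intended_angle - preferred))

def decode(rates):
    """Population-vector decoder: each neuron votes for its preferred
    direction, weighted by how far its rate rises above baseline."""
    votes = rates - baseline
    x = np.sum(votes * np.cos(preferred))
    y = np.sum(votes * np.sin(preferred))
    return np.arctan2(y, x)

intended = np.pi / 3                      # the brain's intended direction
decoded = decode(firing_rates(intended))  # the direction the machine would act on
```

In a real BMI the decoder is fit to recorded neural data (often with linear regression or a Kalman filter) and runs continuously, updating the prosthetic’s command many times per second.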
However, it also holds the potential to transform warfare in ways that seem ripped right from a science fiction movie.
The monkey, Idoya, seemed to just stand there, watching passively as the video feed displayed a large humanoid robot suspended over a treadmill taking deliberate, if ungainly, strides away from the camera. Though the monkey was physically still, she wasn’t just watching. Idoya was controlling the robot’s movements directly with her brain activity – a technological telekinesis born of cutting-edge neuroscience and engineering. To make the scenario even more incredible, the robot was taking those steps in a laboratory on another continent, halfway across the world.
This experiment, which took place in 2008 in the laboratory of Miguel Nicolelis at Duke University (with the robot in Kyoto, Japan), was one in a series of milestones on a road that will inevitably lead to humans relying on similar brain-machine interfaces (BMIs) to control a wide range of devices. Using only the responses of neurons – electrically active cells that comprise much of the organ of thought between our ears – we will control the behavior of prosthetic limbs, mechanized exoskeletons, and perhaps eventually alternative versions of ourselves, robotic or virtual.