Posted by fireburns at 203.atlanta-51-52rs.ga.dial-access.att.net on January 30, 2003 at 11:48:07:
In Reply to: horseshit... posted by Joey cheri popper on January 30, 2003 at 11:42:35:
http://atlanta.creativeloafing.com/2003-01-22/cover.html
FATHER OF CYBORGS: Dr. Phil Kennedy, neuroscientist and CEO of Neural Signals, Inc., has fused the human brain with a computer. (Jim Stawniak)
--------------------------------------------------------------------------------
The brain of the future
An Atlanta neuroscientist has wired the human brain to a computer. The results are fascinating and frightening.
BY MARA SHALHOUP
In a ground-floor room of Emory University's Wesley Woods Hospital, the man who used to be a cyborg shifts his eyes to the right. That means no, in answer to the question, "Would you like me to read to you?" The Georgia State University researcher gives up on the book she's brought. She tells the man, whom we'll call by his initials, T.T., that she couldn't find a copy of The Godfather. She's brought another movie, Rounders. The love interest is a blonde, she points out. "Would you like to watch it?"
T.T. flicks his gray eyes to the left. Yes. Even if he weren't breathing through a respirator, T.T. wouldn't have found the voice to answer.
He first realized something was wrong 15 years ago, when he started stumbling. He had been a champion swimmer at his St. Simons Island high school, had moved on to kayaking and mountain biking in his 20s. Then his muscles suddenly began failing him at 28. T.T. was diagnosed five years later with a rare genetic disease, mitochondrial myopathy, which likely killed his mother in her mid-30s. He lost the ability to walk four years ago. Now he can only move his eyes.
Dr. Philip Kennedy, a neurologist who invented the hardware implanted in T.T.'s brain, pats the man's atrophied left arm. "We'll leave you to your movie," Kennedy says in the Irish accent that has stayed with him since his move to America 20 years ago. He signs the log on the bedside table, sprays his hands with disinfectant foam and bows out.
In the hallway, Kennedy talks about what he didn't want T.T. to hear. Several months earlier, a CAT scan showed that T.T.'s brain had shrunk. A lot. There is nothing more Kennedy -- or the electrodes he placed four years ago -- can do.
But all is not lost for others who suffer similar disabling diseases. There's D.J., a patient implanted with Kennedy's electrodes three months ago. The incision in her head hasn't closed, so the electrodes implanted there will have to be extracted and re-implanted. Just a minor delay. Kennedy is hopeful -- both about D.J. and the future of his technology. He is also fearful.
On the one hand, there is a medical phenomenon growing out of Kennedy's work. His electrodes could enable patients frozen by disease or injury to circumvent the spinal cord, re-establishing a decent range of speech and motion. So far, three of Kennedy's six implantees have learned to move a cursor across a computer screen and spell words just by thinking about it.
If Kennedy's technology reaches its full potential, something more amazing will have occurred. The people implanted will no longer be human as we know humans. Their existence will depend on machines, and their brains will have adapted accordingly. They will think differently. They will use their minds to control computers, with more stunning results than any human to date.
That introduces the ethical dilemma of a more manipulative use of what's called brain-computer interfacing (BCI), a way of warping the technology to turn an average brain into a superpower.
If BCI can unlock those caged by their bodies, imagine what it could do for those in perfect health. Think of it this way: With the same patience practiced by those paralyzed by injury or disease, you could do new things with your brainwaves, too. With the ability to marry the brain to a computer, you would become capable of intellectual and, possibly, physical feats unknown to man. You could be a superhero. You could be a supervillain.
"The brain-machine interface has been done, and I'm glad to have done it," Kennedy says. "But a responsibility has evolved there. I don't know how to make it not get out of hand."
He says he got into this work with the sole intention to aid the disabled. Imagine being resigned to the role of observer, one who mostly watches a hospital room. Does the frustration of such isolation ever subside? Must life take on new meaning in the absence of speaking, eating, laughing and making love? Is it enough to live only to think?
Kennedy is changing the obvious answers to those questions. Since 1988 he has faced several obstacles -- an initial absence of funding and peer support, as well as the subsequent deaths of his subjects. Still, he has succeeded as the only neuroscientist to implant electrodes in the human brain. Only three of his six implantees have learned to spell, the fastest at a rate of three letters per minute.
That patient, Johnny Ray of Carrollton, died last summer, four years after he was implanted, making him the patient who held the electrode the longest. He was also the world's first cyborg. "He went out on a cloud," Kennedy says, although it's obvious the patient's death was difficult for the doctor himself.
Three other implantees have also died, and T.T., who was implanted in 1999, deteriorated so quickly that he could communicate for only nine months.
Kennedy, who has won $1.7 million in grants, mostly from the National Institutes of Health, intends to keep implanting. He's got government permission to implant five more patients, and plans to get permission for as many more as he can. He foresees a patient bearing 10 or more electrodes, strategically placed at points in the brain that dictate speech and movement, who will overcome the communication barrier and the mobility barrier as well.
"We've passed the threshold," he says.
BCI technology is only in its infancy -- so much so that most people are unaware of its potential or even existence. It has piqued the government's interest, however. The president's Council on Bioethics, which was formed in part to address human cloning and stem cell research, has indicated it may take up BCI technology as its next issue.
The council's chairman, Leon Kass, has spoken against cloning -- but his remarks regarding ethics are indistinguishable from concerns over BCI. "I remain enthusiastic about biomedical research and its promise to cure disease and relieve suffering," Kass told the U.S. House of Representatives' Subcommittee on Health in 2001. "Yet, as has been obvious for some time, new biotechnologies are also providing powers to intervene in human bodies and minds in ways that go beyond the traditional goals of healing the sick, to threaten fundamental changes in human nature and the meaning of our humanity."
BCI's ability to make a smarter brain is one "fundamental change" that may alarm Kass. The U.S. Department of Defense, which is now funding some BCI research, appears less concerned. In the same way that Einstein's breakthrough research advanced science that benefits mankind and contributed to the creation of nuclear weapons that threaten mankind, BCIs could give birth to an uneasy tension between technology for the sake of medicine and technology for domination.
"In 20 or 30 or 50 years down the road," Kennedy says, "you're going to give power to people who really shouldn't have it."
One thing's for sure. The brain of the future is on the way, and it could arrive more quickly than neuroscience imagined.
In the late 1980s, Kennedy, who was running a lab at Georgia Tech's Biomedical Engineering Center, was trying to solve what irked him and other neuroscientists studying electrode implants: The brain bobbed too much to keep them firmly in place.
Electrodes implanted in monkeys swayed with every step. And no matter how close to a brain signal the electrode was placed, it could never pick up the signal consistently. "I thought, 'The technology is just not right,'" Kennedy recalls. "'It's just not adequate enough for the job.'"
But what if he were to implant a far simpler design, a hollow cone of glass, a millimeter-and-a-half in length? Would a neuron (a brain cell that fires the signals dictating specific body functions) wrap its arms (called neurites) around the cone, cushioning it from the bobbing? And would the electric signal traveling along the neurites be easier to pick up if a conductive gold wire or two were laid along the cone's inner walls?
Kennedy applied for a patent, got permission to set up a small lab at Emory's Yerkes Primate Center and implanted the electrodes in monkeys. They worked, and his discovery set him apart from the half-dozen other groups working on similar research. He began planning a business, Neural Signals, of which he would be CEO.
Fellow neuroscientists were skeptical.
"People thought it was ridiculous," Kennedy says. "Two pieces of wire and broken glass? That will never work. So I got no funding."
He returned to Emory to do a residency in neurology, so that he could make a living practicing medicine and work on his less lucrative electrode on the side.
As the residency was winding down, Kennedy appealed to the U.S. Food and Drug Administration to allow him to try his electrodes on humans. In 1996, the FDA said yes. It was the simplicity of his design that won the agency over. No other design had captivated them, and none has since.
With the help of Emory neurosurgeon Dr. Roy Bakay, Kennedy implanted his first patient in December 1996. He identifies her only as M.H., a former special ed teacher who suffered from ALS, or Lou Gehrig's disease.
"Locked-in syndrome" is the medical term used to describe someone who, like M.H., has lost use of his or her body but whose brain remains intact for decades. It is the scourge of stroke and spinal cord injury, as well as Parkinson's disease and ALS, the syndrome afflicting the famed physicist Stephen Hawking.
As with about a quarter of locked-in patients, Hawking has retained a tiny bit of movement, the shifting of his eyes. Yet he manages to communicate by focusing on different points of a computer screen filled with letter combinations. He can form words and "speak" them through a synthesizer. Hawking has been able to continue his work explaining black holes, the origin of the universe and its future.
Most locked-in patients, however, cannot communicate with their eyes as Hawking can.
The electrode implanted under M.H.'s skull, atop the cortex, only picked up increased or decreased brain activity (Kennedy's later implants would pick up more detailed signals). It amplified the electric signals by a factor of about 1,000. The signals were transmitted to an FM radio receiver attached to the skull and then sent as radio waves to a computer across the room.
To test the electrode, Kennedy streamed M.H.'s brain activity through the computer, playing the noise through a speaker. He likens the sound to "running a stick down a railing."
"We said, 'Listen to the signals. We want you to quiet it down. Just stop everything.' And then we said, 'Just make it really active.' And she did," Kennedy recalls. "That's all it was. It was very crude. But it was very effective. It showed she could affect the signals."
She died 76 days after the experiment, from pneumonia and kidney failure. Softening and strengthening her brain signals were her last communications with the world.
Then came Johnny Ray, a drywall contractor who played jazz guitar and who'd been locked into his body at age 43 by a stroke.
At the time Ray received his implant in March 1998, he could still move his face a little, smile and shift his eyes. Kennedy implanted two electrodes next to the part of Ray's brain that controlled his left hand. He started asking Ray to try communicating with the computer by thinking about moving the mouse with his hand. The cursor moved.
Nine months into Ray's training, which was so exhausting it could not exceed 20 minutes a day, Kennedy asked him to spell his name by moving the cursor over a screen of letters. Ray spelled JOHN on two of his first four tries. He took a break, tried again and didn't do so well, presumably because of fatigue. He spelled JOHLQQQ.GYUVWABDN, then HIJJROHNLN, then JOIH.N. "And then he started to spell P," Kennedy says. "And I thought, 'Oh well, I'll just leave him alone.'
"And then he spelled my name." PHIL.
Ray's mastery of the computer improved over the next six months. On May 24, 1999, Kennedy visited Ray at his home in the V.A. Medical Center in Decatur. Kennedy asked what he felt as he moved the cursor. Ray spelled NOTHING. He had learned to move the cursor simply by thinking of moving the cursor. He no longer had to think about moving his hand. The part of his brain that was supposed to dictate movement evolved; it abandoned hope for the appendage. The brain opted instead to communicate directly with the computer.
"That led us to say he has now devoted that part of his brain to driving the cursor," Kennedy says. "See, the brain is very crafty. It's very adaptable. It can learn. And that's really the key thing."
Three months later, Kennedy moved the homeless Neural Signals Inc. into the Advanced Technology Development Center, which is affiliated with Georgia Tech. One of the air-conditioning vents drips when it rains ("What do you expect for cheap rent?" Kennedy says). The sign on the door says, "More Power From Your Brain."
Three years after that, Ray died from a brain aneurysm.
Ray was Kennedy's star student, leaving an indelible mark on neuroscience. He proved that man can train the neurons in his brain to do more than what they were created to do. Man can command his neurons, and that made the simplicity of Kennedy's design all the more remarkable. He now believes that to re-wire a paralysis victim, a handful of electrodes strategically placed near a few good neurons could do the job.
"A lot of people now agree that you don't need hundreds and hundreds of neurons, but just a few good ones," Kennedy says. "If we can hold neurons that long and then control their activity, [future patients] will learn to use those signals to control what they want."
Even muscle activity, Kennedy supposes. His next big goal is to apply the same technology to the parts of the brain that control motion. "What we're trying to do is to [implant] ALS patients early, before they lose the ability to speak," Kennedy says. That way they can train. "We had a few lined up, but we're still looking for patients."
Of course, the only way to gauge the true potential of BCI is to implant a person in perfect health.
"There really hasn't been a fair trial of just what the system can do, what the brain can do," Kennedy says. "An ideal trial? That would be a normal person. But I wouldn't do it."
Not everyone embraces his reservations.
While most scientists working in BCI technology share Kennedy's mission to restore normalcy to the disabled, some have other plans. Just think. BCI technology could redefine "normal" not just for the disabled but for the able-bodied, too. Although none besides Kennedy has worked on human subjects (with the exception of one scientist, who implanted a silicon chip in his own arm), some BCI work focuses on distinctly human -- or is it superhuman? -- enhancements.
Miguel Nicolelis, at Duke University, and John Chapin, at the State University of New York Health Science Center, have implanted monkeys with electrodes that allow them to move a remote robotic arm. The technology is similar to Kennedy's: when the monkey moves its own limb, the electrode captures the signal and sends it to the robotic arm, too. For good measure, Nicolelis and Chapin even sent the signal over the Internet -- so that when the monkey moved its arm, another robotic arm 600 miles away mimicked the motion.
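The Internet leg of that experiment is, in software terms, the simplest part: once the neural signals have been decoded into positions, they are just numbers that can be streamed to a distant machine. A minimal sketch under that assumption, using a hypothetical UDP endpoint standing in for the remote arm (this is not Nicolelis and Chapin's actual setup):

    import json
    import socket

    # Hypothetical address of the remote arm's controller; in the experiment that
    # machine was hundreds of miles away, but 127.0.0.1 works for a dry run.
    ARM_ADDRESS = ("127.0.0.1", 9000)

    def send_position(sock, x, y, z):
        # Send one decoded hand position to the remote arm as a small JSON datagram.
        sock.sendto(json.dumps({"x": x, "y": y, "z": z}).encode("utf-8"), ARM_ADDRESS)

    if __name__ == "__main__":
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        # In a real system these values would come from the neural decoder;
        # here they are made-up points tracing a short reach.
        for t in range(5):
            send_position(sock, 0.1 * t, 0.0, 0.05 * t)
        sock.close()

The hard science lies in the decoding, not the transmission; once the brain's intent is a stream of coordinates, distance stops mattering.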
"[I]t seems reasonable to expect that [BCI] could trigger a revolution in the way future generations interact with computers, virtual objects and remote environments," Nicolelis wrote two years ago in Nature, "allowing never-before-experienced augmentation."
Clearly, Nicolelis and Chapin's work builds a foundation for prosthetics of the future (think of Luke Skywalker's realer-than-real prosthetic fingers wriggling at the end of The Empire Strikes Back). But the fact that their work is funded by the U.S. Department of Defense -- and that they are working on another defense-funded project in which an implant has successfully programmed rats to, say, turn left on command -- raises questions as to the uses of BCI outside the realm of medicine.
"It's something that should make us all wonder," says Ellen McGee, associate of bioethics at the Long Island Center for Ethics. What might the military want with this information? she asks. On whom would these devices be used? "People in the military are not always considered free subjects when the military decides to use a technology," she says.
In addition to any nefarious military operations, McGee wonders if BCI might ultimately create a schism between those who could use it to make themselves smarter -- a new race of superhumans -- and those who couldn't -- the race of regular guys.
"I can only applaud all of Dr. Kennedy's work and hope for his continued success, because it has wonderful therapeutic implications," McGee says. "My concerns are more with the uses of the technology by people who may not have as many beneficent intentions."
There could come a day when man's idiosyncrasies and imperfections are all but obscured by the computer he controls -- and that in turn controls him. It's possible to turn the tables on BCI's design, according to Kennedy, allowing the brain to receive signals in addition to sending them. In fact, Kennedy believes downloading information into the brain may be easier to master.
"Can we use more of our brain, make ourselves more powerful? Oh, definitely," Kennedy says. "It bothers me to think about. The people who want to be really powerful are the ones who are going to get that technology. Maybe people like that shouldn't be able to give orders."
Theodore Berger, professor of biomedical engineering at the University of Southern California, has no qualms about his intent to make the brain smarter and faster. He's working on a microchip that could give the brain more memory, which could benefit anyone from Alzheimer's patients to college professors.
He also believes it's possible to upload memory -- an entire life's worth.
During an interview that aired on "48 Hours" last summer, Berger said it's possible to record man's every emotion on a computer chip.
"Everything?" the interviewer asked.
"I think everything," Berger answered.
"Fear, loathing, love, hate?"
"Absolutely."
"Sex appeal, sense of humor, everything?"
"Everything."
Kennedy's belief about what BCI can do stops short of Berger's. Thoughts are a different animal from brain signals that merely control body functions. Neuroscientists have not yet identified a specific point in the brain where thoughts originate -- perhaps they occur all over. Thinking is more abstract than moving your left hand.
"All we're doing is picking up a little tiny bit of neural activity and having the patient move the cursor around," Kennedy says. "It's different than picking up a thought. If thought is a whole big football field, we're just picking out two blades of grass."
Perhaps thoughts don't originate in the brain. The brain carries them, but might they come from somewhere else, something that makes a person who he or she is, a soul of sorts?
"It's a big philosophical question," Kennedy says. "People are working on that. Is there a separate mind from the brain? Some people say the brain is the mind."
Unraveling that mystery may help realize one of mankind's most elusive dreams: to live forever. If the brain and the mind are the same, then it might be possible to capture the mind on a computer chip.
"Some philosophers tend to think of ourselves, our identities, as being our memories and some combination of our physical comportment," McGee points out. With cloning all but accomplished, that leaves BCI to complete the eternity equation.
"If at some point in the future we're able to upload our memories onto a chip, and they say that that will be possible in about 30 years," McGee says, "then if that chip were implanted in my clone, I could achieve a kind of immortality."
But what kind of immortality would that be? With computers taking nature's place, the results might be as grotesque as only science fiction writers have imagined.
"We have created a man who is one single, large, complex computer terminal," Michael Crichton wrote in his 1972 novel, The Terminal Man. "The patient is a read-out device for the new computer, and he is as helpless to control the readout as a TV screen is helpless to control the information presented on it."
Kennedy, who recently traveled home to Ireland to visit 1,000-year-old ruins on the country's rugged outer islands, says he has a fascination with man's longevity. "For someone who thinks so much about the future, I do love the past."
But he'd rather not experience both.
"I wouldn't want to live forever," Kennedy says. "Would you?"
mara.shalhoup@creativeloafing.com