The Associated Press, via FOXNews (December 2, 2009)
"A group of European scientists say they have successfully connected a robotic hand to a man who had lost an arm, allowing him to feel sensations in the artificial hand and control it with his thoughts.
"The experiment lasted a month. Scientists say it was the first time an amputee has been able to make complex movements using his mind to control a biomechanic hand connected to his nervous system....."
That's the good news.
The bad news is that, after a month, the electrodes that researchers had implanted in the man's arm were removed. I suppose, if he's young enough, he may be able to get a next-generation left hand: once the things are out of the research & development stage.
In a Wired Gadget Lab article (I'm getting to it), the writer pointed out: "...The human hand is difficult to replicate. It's an instrument that can squeeze a lime as effectively as it can hold a delicate lightbulb. The hand is not just about mechanical movement: Its sense of touch offers important feedback to the brain about the texture and nature of the object...."
The European scientists' hand is a big step toward prosthetic arms and hands that aren't a cosmetically-enhanced equivalent of Captain Hook's ersatz hand.
Now, that Wired article:
"Gallery: Bionic Arms Gain Power, Dexterity, Sensitivity"
Gadget lab, Wired (December 1, 2009)
"Prosthetic legs have gotten an image boost recently, thanks to the high-performance carbon-fiber springs worn by the likes of Oscar Pistorius and Aimee Mullins. But prosthetic arms still call to mind stiff, heavy chunks of plastic — barely one step up from Captain Hook’s creepy iron claw.
" 'Prosthetic legs are in the 21st century,' Dean Kamen recently told the trade publication IEEE Spectrum. 'With prosthetic arms, we're in the Flintstones.' Kamen, who invented the Segway, has been working on creating an advanced artificial limb...."
"...An extraordinary project from the Defense Advanced Research Projects Agency promises to make it happen. Darpa's $100 million Revolutionizing Prosthetics 2009 Program aims to create a thought-controlled functional arm within this decade. The project is a collaborative effort with more than 30 organizations including labs, universities and private companies...."
Between the 'we've never done it that way' feeling that office managers faced when computers supplanted typewriters - and probably before that, when typewriters supplanted pens and pencils - and the knee-jerk reaction some folks have when outfits like DARPA are mentioned, I think there may be a fuss when these high-tech prosthetics hit the market.
But I think they're a good idea.
Looks like more change is on the way:
"Intel Thinks Brain Implants Could Control PCs by 2020"
From ComputerWorld.com, summarized by FOXNews (November 20, 2009)
"In its Pittsburgh research laboratory, Intel is developing chips that can harness human brain waves to operate computers, television sets and cell phones.
"The chips giant's scientists have found that blood flow changes in certain parts of the brain when people think of a specific word. By shrinking down the Functional Magnetic Resonance Imaging (FMRI) machines they use to monitor the flow of blood into a tiny chip, and implanting it in your head, you'll be able to type on a keyboard or dial your phone with the power of your thoughts...."
"Intel: Chips in brains will control computers by 2020"
"Brain waves will replace keyboard and mouse, dial phones and change TV channels"
" By the year 2020, you won't need a keyboard and mouse to control your computer, say Intel Corp. researchers. Instead, users will open documents and surf the Web using nothing more than their brain waves.
"Scientists at Intel's research lab in Pittsburgh are working to find ways to read and harness human brain waves so they can be used to operate computers, television sets and cell phones. The brain waves would be harnessed with Intel-developed sensors implanted in people's brains...."
"...'I think human beings are remarkable adaptive,' said Andrew Chien, vice president of research and director of future technologies research at Intel Labs. 'If you told people 20 years ago that they would be carrying computers all the time, they would have said, "I don't want that. I don't need that." Now you can't get them to stop [carrying devices]. There are a lot of things that have to be done first but I think [implanting chips into human brains] is well within the scope of possibility.'..."
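What the Intel researchers describe - distinct blood-flow patterns in the brain for distinct thoughts - is, in machine-learning terms, a pattern-classification problem. Here's a minimal sketch of one common approach, a nearest-centroid classifier. To be clear: the "voxel" numbers, the words, and the whole setup are invented for illustration; real fMRI data is vastly higher-dimensional and noisier, and this isn't Intel's actual method.

```python
import math

# Toy "voxel activation" vectors: each word, when thought about, is
# assumed to produce a characteristic pattern of blood flow across a
# few brain regions. These numbers are made up for illustration.
TRAINING = {
    "hammer": [[0.9, 0.1, 0.2], [0.8, 0.2, 0.1], [1.0, 0.0, 0.3]],
    "house":  [[0.1, 0.9, 0.8], [0.2, 1.0, 0.7], [0.0, 0.8, 0.9]],
}

def centroid(vectors):
    """Average the training patterns recorded for one word."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(pattern, centroids):
    """Return the word whose centroid is nearest (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda word: dist(pattern, centroids[word]))

centroids = {word: centroid(vs) for word, vs in TRAINING.items()}
print(classify([0.85, 0.15, 0.2], centroids))  # a hammer-like pattern -> "hammer"
```

The point of the sketch: once you can record a stable, repeatable pattern per thought, picking the nearest known pattern is straightforward. Getting stable, repeatable recordings from a living brain is the hard part.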
I'll admit that the idea of a sort of combination TV remote, PDA, cell phone and Internet connection in my head feels: odd. Remember the insomniac mutant kid in one of the X-Men movies (X2, I believe), and the television that changed channels each time he blinked?
But then, when I was growing up, in the late Industrial Age, I didn't realize that I'd be spending much of my time in front of a computer screen, working a (not-yet-invented) mouse and keyboard. Or that I'd be using the research techniques I learned with ink-on-paper information storage and retrieval technology - applied to an immensely faster technology, with access to an immensely larger pool of data.
As for having something 'unnatural' in me?
You may have a full set of your adult teeth, all in good working order. I don't. Quite a bit of my dentition is artificial: which is nothing unusual, where Western medical technology is available.
I'd have a hard time seeing the computer's monitor clearly without an artificial clip-on prosthesis - which most people accept as perfectly normal, since spectacles have been around for generations.
I had my original-equipment hip joints swapped out, back in 2006. As a result, I'm able to move around more freely than has been possible for years. Decades. Ever, in some ways.[1]
One of the obvious applications is what was covered in the first article: giving people with prosthetic limbs better control - and feedback - from their replacement parts.
It occurs to me that 'typing with your mind' would require a certain degree of mental discipline. And, possibly, be nearly impossible for some people. Not because they're 'weak willed' - but because their brains won't stay 'focused on one spot' long enough for the machines to react.
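One common way hands-free interfaces deal with wandering attention is a dwell-time rule: the control signal has to stay above a threshold for some minimum run of samples before the machine registers a 'keypress.' A single stray sample resets the count - which is exactly why an unfocused mind would never get a letter typed. A toy sketch (the threshold, dwell count, and signal values are all invented):

```python
def detect_selections(signal, threshold=0.8, dwell=3):
    """Register a 'keypress' only when the signal stays at or above
    `threshold` for `dwell` consecutive samples; any dip below the
    threshold resets the count."""
    selections = []
    run = 0
    for i, level in enumerate(signal):
        if level >= threshold:
            run += 1
            if run == dwell:
                selections.append(i)  # sample index where the selection fires
        else:
            run = 0  # attention drifted: start over
    return selections

# A focused user holds the level; a distracted one keeps resetting.
focused    = [0.9, 0.9, 0.85, 0.2, 0.9, 0.9, 0.9]
distracted = [0.9, 0.2, 0.9, 0.9, 0.2, 0.9, 0.2]
print(detect_selections(focused))     # -> [2, 6]
print(detect_selections(distracted))  # -> []
```

The distracted user's signal never clears the three-sample dwell, so nothing registers at all - a crude model of the 'won't stay focused on one spot' problem.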
There's a lot of research & development to do, of course. We're still working the bugs out of how to use some of the technologies involved - like fMRI. (September 18, 2009)
The ComputerWorld.com article cites the fear of "Big Brother" putting chips in our head. The researchers think that's not so likely: and that consumers will demand the better, faster, interfacing they'll get with the technology. I think they're probably right.
Not that this technology won't have its own problems - and be misused.
My guess is that, within a year of in-brain interfaces coming on the market, there'll be a whole new set of psychiatric conditions. Some people who tell the doctor they're hearing voices in their heads won't have a personality problem - their neural circuits will work just fine, but their channel-changer will be defective. And there'll be people who form (even more) un-natural attachments to their computer.
And, yes, I think the "Big Brother" issue will surface. And be dealt with: just as we have rules about peeping Toms and opening other people's mail: and are developing a stable set of rules about security cameras.
Luddites notwithstanding, I can't see technology as inherently evil. Or, good.
Printing presses, for example, can be used to print first-aid manuals: or books on why it's okay to kill other people for fun and profit.
It's the application of the technology that can be considered good or bad - and that's determined by a human being. Even in the case of robots: someone determined the design parameters and programming.
There's a notion that the more complex a technology is, the more sinister it must be. I don't think it's true: but I'm familiar with the notion.
Life support machinery in an operating room can be very complex - and keeping someone alive is, arguably, a good thing to do.
An ice pick can be used to chop ice for someone's drink on a hot summer day: not, I think, an intrinsically bad application. If it keeps you from getting heat stroke, I think one could say it's good to use an ice pick that way.
But a sort of glorified ice pick can be used to slice and dice someone's brain: as Dr. Walter "Icepick" Jackson Freeman II did, about 2,500 times in his medical career. I'm not at all sure that scrambling people's brains to make them into nicely manageable zombies is quite "good." But it looks like lobotomies are coming back into fashion. (June 17, 2008, in another blog)
Direct neural interfaces are a new technology, and there'll almost certainly be an awkward period while we learn how to use them, and set up rules so that everybody's more-or-less on the same page about how they should be used.
But, I'm looking forward to the things.
For me, I see only two real problems:
- Given the rate at which information technology changes, the implants will have to be profoundly reprogrammable
- If Intel is right, by the time neural interfaces are a consumer product I'll be about 70 - and may be considered too old for the procedure
- "Glow-in-the-Dark Flashing Tattoos? Prosthetics With Neural Interfaces? They're Coming"
(November 20, 2009)
- "The Future: Just Like Today, Only Different"
(October 24, 2009)
- "Dead Fish Thinks! - or - fMRI Analytic Methods Need Evaluation"
(September 18, 2009)
- "Good News, Neural Devices Connect Brain, Computers: Bad News, Same Thing"
(July 11, 2009)
- "'You Will Forget' - The Doctor Will Fix You Now?"
(April 13, 2009)
- excerpt on lobotomies, in
"Chantix and Veteran's Affairs: Who Needs Consent? We Got 'Em, Let's Dose 'Em"
Another War-on-Terror Blog (June 17, 2008)
[1] I was born with distinctly sub-standard hip joints, and was used as a sort of lab rat before my parents caught on that something was wrong. Another doctor did some remarkably good sculpting in my hip, considering what he had to work with. My hat's off to Dr. Swanson: thanks to his efforts, I was able to walk around for quite a long time. But I'm glad to have joints that more closely mimic what most people use.
- "Medical Ethics and Human Experimentation: Why I Take it Personally"
A Catholic Citizen in America (February 3, 2009)