This is NOT an animal. It is NOT alive. But is it like your toaster?

Recently on our internal mailing lists we have discussed hyperbole in cognitive science, and all the fantastic claims that numerous cognitive scientists make. Every would-be Dr. Frankenstein out there seems to claim to have grasped the fundamental theory of the mind: next year we will finally have the glorious semantic web, we will be translating War and Peace into Hindi in 34 milliseconds, we will be having love and sex with robots, and, of course, we will be able to download our minds into a 16GB iPhone and finally achieve humanity’s long-sought ideal of immortality.

Doug Hofstadter, of course, has long dismissed these scenarios as nothing short of fantastic.

I think it’s safe to say that, in these sacred halls of CRCC, we are Monkeyboy-Darwinist-Gradualists who really detest “excluded middle” theories: Either something understands language or it doesn’t. Either something has consciousness or it doesn’t. Either something is alive or it isn’t. Either something thinks or it doesn’t. Either something feels pain or it doesn’t.

I guess it’s safe to say that we believe in gradualism. The lack of gradualism and the jump from interesting ideas to “next year this will become a human being” goes deeply against my views. So my take on the whole issue of grand statements in cognitive science is that much more gradualism is needed. People seem to have enormously simplistic views of the human mind. As gradualists, we do, however, believe in the longer-term possibility of the theories being developed, cognitive mechanisms being advanced, and machines becoming more and more human-like.

In fact, Harry has even stopped (but note that “stopping” is temporary, and is different from “quitting”, or “forever leaving”) his work on Bongard problems. Harry feels that our work will lead to dreadful military stuff. In fact, it is already happening, as he points out, and here is an eerie example. (Look at how this thing escapes the near-certain fall on the ice.)

This “baby” is called BigDog, and, yes, it is funded by DARPA. So there we have it, Harry: already happening. The military will get their toys, with or without us. And this is gradualism at its best.

Remember: this thing is not an animal.

It is not alive.

But is it just as mechanical as a toaster?


3 Comments on “This is NOT an animal. It is NOT alive. But is it like your toaster?”

  1. Michael Says:

    Isn’t that the coolest thing you ever saw? Incidentally, the Internet was developed with military funding as well (didn’t call it the ARPAnet for nothing.) I figure if you can pry some dollars out of the Pentagon that aren’t used for killing people, that’s a win.

    But I can understand Harry’s unwillingness to have his work be the work that’s inevitably misused.

  2. Harry Says:

    Since the day Alex posted this interesting report, I have made an effort to figure out why I have this gut feeling that this gadget called “BigDog” is not the kind of thing that should concern me at all. Here’s what I thought — not necessarily all correct:

    BigDog doesn’t satisfy a single one of the 6½ cognitive principles that I described on this web page:

    OK, you’d think, so what? This is *your* list of principles, Harry; there might be other ones that you’ve missed. Fine, so help me, dear fellow fargonauts: what is it that I’ve missed which, like the 6½ principles, is uniquely human, but emerged out of lowly biological roots (and in human cognition was elevated to a higher level of significance), and which this BigDog thing possesses?

    Could it be its ability to *adapt* to unprecedented input? But this is not uniquely human; it’s general animal cognition. A mosquito can do that, too. In fact, a mosquito can not just avoid tumbling over after being hit; it can avoid being hit in the first place by reacting faster than our sluggish brains, and it can hide itself in the most unexpected places — in environments that weren’t available when it evolved, such as our clothes. Does BigDog possess more complex cognition than a mosquito? It doesn’t look like it. It can be more *useful* to us, sure, but in terms of cognition it probably has less than “0.00000001 hunekers’ worth of consciousness” — Doug’s facetious estimate of mosquito “I-ness” (in “I Am a Strange Loop”, p. 209).

    Speaking of usefulness, and comparing BigDog to my toaster, Alex: I know my toaster will toast the bread, day in, day out. I wonder, how long can this BigDog walk before its battery fails and it falls flat on its belly? 😉

    Anyway, to sum it up, I think BigDogs like this one are a dead end as far as human cognition is concerned. They produce a strong Eliza effect on us when we watch them, but, other than that, I fail to see how they’d revolutionize research in cognition — correct me if I am wrong.

  3. Michael Says:

    I think Alex’s point is, in fact, that BigDog will not revolutionize research in cognition, thereby illustrating the notion that revolution is not the point, and that evolution will prevail.

    I happen to agree with him on that.

    And I have to say that robust reaction to physical reality is a strong early step.

    Also: don’t mention mosquitoes. They’ve been horrible these past two weeks. Doesn’t help that our sliding glass door’s wheels broke and so I’ve had the door lying on the balcony for a while. Now there are twenty mosquitoes in the bathroom — no matter how many I kill.
