An excellent follow-up to the Lessig video

Or, more on how the world is changing wildly while we’re busy making other plans:

This is a wonderfully simple and provocative video. You can quibble about some of the details, but don’t. Step back and soak in the big picture. And then think about how we educate our kids and ourselves. (I’m sure that teaching children that the earth is 6,000 years old must be a win. Really. Just must be.)

The folks at The OpenHouse Project get credit for both the pointer, and for relating this to the Lessig video.

On the internet no one knows you’re a computer, and further evidence that little boxes just don’t work

On the way in to work today I listened to a Scientific American podcast (27 Sep 2007) where they interviewed Robert Epstein about several interesting things. In the first part he described what must have been a pretty horrifically embarrassing experience, wherein he was fooled for four months by a chatterbot. Chatterbots are computer programs that engage in various forms of electronic communication (e-mail, IM, posting to web forums, etc.), and are typically built either by people who enjoy tricking others or by serious artificial intelligence research teams who are trying to better understand issues of language and communication. Or both.

Now Epstein’s not an amateur; he has in fact been a major player in the Loebner Prize Competition in Artificial Intelligence, an important Turing Test competition to see whether computer programs can fool people into believing they’re human. Yet when he took off his academic Turing Test hat and replaced it with his “She looks nice in those photos” hat, he got seriously fooled. It also helped that he believed the “she” lived in Russia, so he was much more forgiving of her language mistakes. In the end it was a comment about sitting in a park talking to a friend that tipped him off. It was mid-winter, and a quick check on-line confirmed that it was bitterly cold where she supposedly lived, and from there it all unraveled. There’s then some very interesting discussion of the likelihood of this sort of thing becoming more and more common, and speculation that reads very much like a William Gibson story. Speculation about human-level AI being “just around the corner” has been rife since the ’50s and ’60s; most of it’s been pie-in-the-sky nonsense, and I think some of his conjectures are a bit far-fetched. That said, however, parts of that conversation ring true, and it’s certainly possible that some pretty crazy things could happen in our lifetimes – HAL may be closer than we think. (I also recommend interested parties check out a recent Seed article on the Rise of Roboethics, and the associated set of videos.)

In the near term, I think the foreign-language issue is likely to be a significant factor in fooling people. Not all of us are trolling the internet for possible mates (although plenty are), but we’re increasingly used to “meeting” and interacting with people from other countries, with one or both parties not using their native tongue. I’ve seen instances on Flickr where people have carried on short conversations by writing in their native languages and using an on-line translator to “read” the other person’s writing. The translators aren’t really great (Emily Christiansen did a nice paper on that), so I wouldn’t want to attempt anything subtle that way, but they certainly work for various kinds of basic communication. It also means that while you’re probably fairly confident it’s not a canine on the other end (that whole opposable-thumbs thing), it’s going to be harder and harder to know for sure that it’s not a computer program.

Epstein also discussed at some length a study he’s just publishing that shows very clearly that people (across genders, cultures, groups, nations, etc.) lie on a broad spectrum in terms of their sexuality, with almost no one at either the “gay” or “straight” end. Yet, of course, we collectively require (or at least expect) people to “pick a team” and stick to it, and become confused and uncomfortable when folks don’t play along. He makes an excellent analogy to height, where we’re obviously completely comfortable with the idea that people lie along a continuum, and suggests that we’d probably be happier, more honest, and more comfortable if we could find a way to think of sexuality in similar terms. (Of course it wouldn’t hurt if we could do the same with issues of ethnicity as well, especially as the U.S. and the world become increasingly multi-ethnic.)

Little boxes suck, and not just when it comes to genre-oriented record bins at your mall music store.

If you’re interested in more, you can go to the SciAm web site and listen to the podcast, or you can go to Epstein’s site. There’s a page there telling the sad tale of his anthropomorphic confusion, and another page with links to his article on the sexual spectrum, as well as to his survey for those who want to play along at home. I haven’t read either article or taken the survey, but I’d certainly like to do all three.
