Lighten Up to Tighten Up

Guest Post by Jim Kunstler

Perhaps the presidency has been an overly solemn office since, oh, the days of Millard Fillmore, the dreary weight of all that mortal responsibility — slavery, war, more war, depression, yet more war, nukes, we shall overcome, terror, Lehman Brothers, Ferguson, Russia here, there, and everywhere…uccchhh….

And so, at last: a little comic relief. I mean, imagine Grover Cleveland putting the choke-slam on Thomas Nast. Dwight Eisenhower punching out Edward R. Murrow. Jack Kennedy applying the Macumba Death Grip to Walter Lippmann. Nahhhh. But Donald (“The Golden Golem of Greatness”) Trump versus CNN! Now that’s a matchup worthy of the WWF Hall of Fame. I just kind of wish the big fella had gone all the way and put in Anderson Cooper’s mug instead of the CNN logo box. Make it truly up close and personal since, let’s face it, Andy has been the most visible conduit of Jeff Zucker’s animadversions.


MICROSOFT’S A.I. BOT TAY IS A RACIST

Hat tip Hardscrabble Farmer

Via Tech Crunch

Microsoft silences its new A.I. bot Tay, after Twitter users teach it racism [Updated]

Microsoft’s newly launched A.I.-powered bot called Tay, which was responding to tweets and chats on GroupMe and Kik, has already been shut down due to its inability to recognize when it was making offensive or racist statements. Of course, the bot wasn’t coded to be racist, but it “learns” from those it interacts with. And naturally, given that this is the Internet, one of the first things online users taught Tay was how to be racist, and how to spout back ill-informed or inflammatory political opinions. [Update: Microsoft now says it’s “making adjustments” to Tay in light of this problem.]

In case you missed it, Tay is an A.I. project built by the Microsoft Technology and Research and Bing teams, in an effort to conduct research on conversational understanding. That is, it’s a bot that you can talk to online. The company described the bot as “Microsoft’s A.I. fam from the internet that’s got zero chill!”, if you can believe that.

Tay is able to perform a number of tasks, like telling users jokes, or offering up a comment on a picture you send her, for example. But she’s also designed to personalize her interactions with users, while answering questions or even mirroring users’ statements back to them.

As Twitter users quickly came to understand, Tay would often repeat back racist tweets with her own commentary. What was also disturbing about this, beyond just the content itself, is that Tay’s responses were developed by a staff that included improvisational comedians. That means even as she was tweeting out offensive racial slurs, she seemed to do so with abandon and nonchalance.
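Microsoft hasn’t published Tay’s internals, but the failure mode described above — a bot that stores user phrases verbatim and mirrors them back later — is easy to reproduce. The sketch below is a hypothetical toy illustration (the `EchoLearner` class and its blocklist are my own invention, not Microsoft’s code) of why learning from unfiltered user input goes wrong, and how even a crude safeguard changes the outcome.

```python
import random

class EchoLearner:
    """Toy chat bot that 'learns' by storing user phrases verbatim.

    Hypothetical illustration only -- not Microsoft's actual Tay code.
    """

    def __init__(self, blocklist=None):
        self.memory = []                       # phrases learned from users
        self.blocklist = set(blocklist or [])  # words we refuse to learn

    def learn(self, phrase):
        # Naive learning stores everything it hears. A word blocklist is
        # the crudest possible filter; anything that passes it becomes
        # material the bot may repeat back to anyone, at any time.
        if not any(bad in phrase.lower() for bad in self.blocklist):
            self.memory.append(phrase)

    def reply(self):
        # Mirror a previously learned phrase back at random.
        return random.choice(self.memory) if self.memory else "hello!"

bot = EchoLearner(blocklist={"slur"})
bot.learn("cats are great")
bot.learn("this phrase contains a slur")  # rejected by the blocklist
print(bot.reply())                        # only the benign phrase survives
```

With no blocklist at all (Tay’s apparent launch state), everything taught to the bot becomes a candidate reply, so a coordinated group of users can fill its memory with whatever they like.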
