The Washington Times - Tuesday, February 20, 2018

NEW ORLEANS — One of the problems in demystifying artificial intelligence is that once an AI-based product reaches the public, “we stop calling it AI,” said Tara Chklovski, the CEO and founder of Iridescent, a nonprofit that aims to educate and empower children and their parents on engineering and technology matters.

Instead, she went on, “we call it GPS.”

This is true. And it’s a significant truth, because it means AI remains stuck in the average person’s mind as something out of science fiction — a Terminator programmed to kill, a Matrix hero designed to liberate, a Star Wars robot set to serve.

But AI is not one and the same as a robot. Simply put, AI is everywhere.

Everywhere.

And people ought to know. The average person ought to be aware.

It’s guiding GPS and Google Maps. It’s on Facebook. It’s in Google. It’s driving Spotify and Pandora. It’s running Amazon’s online shopping recommendations. It’s on smartphones and smart TVs.

In a sense, AI is watching and recording almost all modern-day human behavior.

Ever wonder where all those recommendations come from when you buy a book on Amazon or watch a movie online? That’s AI at work, taking note of your selections and processing the data to determine similarities that might interest you — selections within the same genre, selections commonly chosen by others. What’s more, this is AI with a thinking cap on, honing its recommendations based on your behavior. The more you engage and the more you choose, the smarter the AI grows and the more targeted its recommendations become.

Buy, say, a Charles Dickens book, a telescope and a new jacket on Amazon, and AI might come back the next time you log on to the site with recommendations for more Dickens titles, telescope tripods and similarly styled jackets. But buy six Dickens books over the course of three weeks, all on Amazon, and AI is going to home in on those titles and offer more from Dickens, more on Dickens himself and more from Dickens-era authors.

That’s AI at work. Specifically, that’s called machine learning — or, as Amazon puts it, “content personalization,” the AI science of “using predictive analytics models to recommend items or optimize website flow based on prior customer actions.”
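For readers curious what that looks like under the hood, here is a minimal sketch, in Python, of the “customers who bought this also bought” idea described above. It is an illustration only, not Amazon’s actual system: the purchase baskets, item names and the recommend function are invented for this example, and real content-personalization models are vastly larger. The core idea, though, is the same: count which items show up together, then rank by those counts.

```python
from collections import Counter
from itertools import combinations

# Toy purchase baskets, invented for illustration.
purchases = [
    {"Great Expectations", "telescope", "jacket"},
    {"Great Expectations", "A Tale of Two Cities"},
    {"A Tale of Two Cities", "Oliver Twist", "telescope tripod"},
    {"telescope", "telescope tripod"},
]

# Count how often each pair of items is bought together.
co_counts = Counter()
for basket in purchases:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1

def recommend(item, top_n=3):
    """Rank other items by how often they were bought alongside `item`."""
    scores = Counter()
    for (a, b), count in co_counts.items():
        if a == item:
            scores[b] += count
        elif b == item:
            scores[a] += count
    return [name for name, _ in scores.most_common(top_n)]

print(recommend("Great Expectations"))
# Prints the items most often bought alongside "Great Expectations".
```

Every new basket added to the list sharpens the rankings — which is the “the more you choose, the smarter AI grows” effect at work.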

Creepy?

In a sense.

But this is the world we live in nowadays. Like it or not, AI’s only going to grow more prevalent.

Enter Chklovski and Iridescent, trying to bridge the very wide gap between techno-geek science-speak and the average layperson who’s completely unaware of all of AI’s applications. This is a long-overdue endeavor; what’s been sadly lacking in the field of AI is outreach to the layperson.

In New Orleans for a conference on AI, Chklovski helped arrange a tutorial for elementary-level students at Kipp East Community Primary School and, just as importantly, their parents. This was a hands-on initiative to give students and parents a brief rundown of two AI concepts — the processing of Big Data and the self-driving car — and then to allow them to design and build their own models based on those ideas.

On the surface, it sounds brainiac.

But it wasn’t. It’s not. These were average families coming together to build a paper-and-ping-pong-ball model that mimics an AI computer’s parallel processing — the way it sorts Big Data — or to construct a self-driving car game using a circuit and an LED. It took place Feb. 3, and Kipp East was just one of 60 or so select schools across America kicking off the same 18-week event, called the AI Family Challenge.
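For the technically inclined, the ping pong ball exercise maps loosely onto a standard pattern in computing: split the data, let several workers handle their pieces at the same time, then merge the results. Here is a minimal sketch of that pattern in Python; the numbers, chunk count and sorting task are invented for illustration and are not the curriculum’s actual materials.

```python
from multiprocessing import Pool
import heapq
import random

def sort_chunk(chunk):
    # Each worker sorts only its own slice of the data,
    # like one family sorting one cup of ping pong balls.
    return sorted(chunk)

if __name__ == "__main__":
    data = [random.randint(0, 1_000_000) for _ in range(100_000)]

    # Split the data into four chunks, one per worker process.
    chunks = [data[i::4] for i in range(4)]

    # Sort all four chunks in parallel.
    with Pool(processes=4) as pool:
        sorted_chunks = pool.map(sort_chunk, chunks)

    # Merge the sorted chunks back into one ordered list.
    result = list(heapq.merge(*sorted_chunks))
    assert result == sorted(data)
```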

The rationale behind the challenge is exciting. As Chklovski explained, the idea is to show both kids and their parents how AI works in practice, how they can build their own AI applications and how they can put those designs to use in their own communities.

It’s a terrific idea. We need more like it.

All schools should adopt the challenge, in fact — or at least adapt it to fit their individual communities.

Here’s why: AI is a fast-moving, massive and seemingly perplexing field that screams for a deep injection of the layman’s touch — a cooler-headed, common-sense style of teaching that can spread the word, even to the most devoted science-fiction moviegoer, that not all machine learning is evil.

After all, the more education, the less fear.

Let’s demystify. Artificial intelligence should not be embedded in the layperson’s consciousness as the image of a killer ’bot bent on wiping out the human race.

• Cheryl K. Chumley can be reached at cchumley@washingtontimes.com or on Twitter, @ckchumley.
