The Washington Times - Tuesday, April 16, 2019

The customer is always right — except when artificial intelligence says the customer isn’t.

That’s sort of the message being sent by new technology aimed at catching crooks before they commit their crooked acts, anyway.

The Japanese company Vaak recently released software that detects when store shoppers are about to shoplift — or so goes the theory. It’s behavior-assessing A.I., pure and simple. The system takes security camera footage in real time and analyzes it for strange body language. Suspicious shoppers get red-flagged, and alerts are then sent to the store’s security personnel, who can review the feed and quickly determine whether additional scrutiny — or even intercession — is necessary.
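Vaak has not published how its software works, but the pipeline described above (frames in, a behavior score out, an alert when the score crosses a threshold) can be sketched in a few lines of Python. Everything below is a hypothetical stand-in; the scoring model, the threshold and the alert hook are placeholders, not Vaak's actual system.

    # A rough sketch of a behavior-flagging loop: score each camera frame for
    # "suspicious" behavior and alert store security when a threshold is crossed.
    # The model, threshold and alert mechanism below are all hypothetical.
    import random
    import time

    ALERT_THRESHOLD = 0.8  # assumed cutoff; a real system would tune this

    def score_behavior(frame) -> float:
        """Placeholder for a trained behavior-analysis model (returns 0.0 to 1.0)."""
        return random.random()

    def notify_security(frame_id: int, score: float) -> None:
        """Placeholder for the store's alert channel: an app, a pager, a dashboard."""
        print(f"ALERT: frame {frame_id} scored {score:.2f}; review the feed")

    def monitor(feed, max_frames: int = 10) -> None:
        """Watch a stream of frames and flag the ones that score too high."""
        for frame_id, frame in enumerate(feed):
            if frame_id >= max_frames:
                break
            score = score_behavior(frame)
            if score >= ALERT_THRESHOLD:
                notify_security(frame_id, score)
            time.sleep(0.1)  # crude stand-in for real-time pacing

    if __name__ == "__main__":
        fake_feed = iter(lambda: object(), None)  # stand-in for a live camera stream
        monitor(fake_feed)

The hard part, of course, is the scoring model itself, and that is exactly where the judgment calls about what counts as “strange” body language get baked in.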

At least this is only a preventive A.I. program.

The movie “Minority Report” showcased similar, albeit then-fictional, technology that led, in that storyline, to the arrest of people for crimes they had not yet committed.

Let’s hope this technology never takes that turn. And truly, there are benefits for all from better store surveillance.

On one hand, the software could save millions. Billions, even.

In 2016, retailers in America collectively lost an estimated $49 billion of inventory, according to the National Retail Security Survey. Roughly 37 percent of that loss was due to shoplifting by customers; another 30 percent came by way of stealing by employees; yet another 21 percent was attributed to administrative errors.

In 2017, retailers saw a slight decrease, losing just shy of $47 billion — about 36 percent of it due to shoplifting by customers, the NRSS reported.
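For a rough sense of scale, those survey percentages translate into dollar figures along these lines; this is only a back-of-the-envelope calculation from the numbers cited above, not additional data from the survey itself.

    # Back-of-the-envelope breakdown of the 2016 figures cited above.
    total_2016 = 49e9  # estimated retail inventory loss, in dollars

    shares = {
        "customer shoplifting": 0.37,
        "employee theft": 0.30,
        "administrative errors": 0.21,
    }

    for cause, share in shares.items():
        print(f"{cause}: roughly ${share * total_2016 / 1e9:.1f} billion")

    # Prints: customer shoplifting ~$18.1 billion, employee theft ~$14.7 billion,
    # administrative errors ~$10.3 billion; the remainder falls to other causes.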

Fewer thefts, of course, mean better profits for the companies — which translates into better prices for paying customers.

But on the other hand, there are serious privacy concerns to consider.

It’s not like shoppers aren’t already aware they’re being watched when they head into stores and malls. Cameras, especially these days, are everywhere.

There are distinct differences between cameras that capture thefts as they occur, though, and cameras that roll in the background, studying and analyzing and offering opinions about shoppers’ behaviors.

That’s entering queasy territory.

Who’s a computer to tell a harried mother that bending to adjust her stroller-bound baby’s bottle while holding a bundle of clothing in the other arm is suspicious behavior? Or, more to the point — what’s a computer to say that?

But the biggest red flag about this technology is perhaps the question of necessity. That is to say: Is it really necessary?

Couldn’t the same A.I. techies who created this behavior-assessing program have instead designed camera systems that captured better images of thieves, or surveillance video that wasn’t so grainy when police or courts called for its review?

Sometimes, with technology, improving what exists makes better sense than replacing it with something new. After all, if stopping store theft is really the issue, then it makes more sense to catch actual thieves on camera, then prosecute, than it does to treat innocent customers as suspects simply because a computer deems their behavior a bit odd and out of the ordinary.

• Cheryl Chumley can be reached at cchumley@washingtontimes.com or on Twitter, @ckchumley.

Copyright © 2024 The Washington Times, LLC.