The Washington Times - Tuesday, March 20, 2018

Artificial intelligence may provide a world of convenience when it comes to suggesting which purchases an Amazon user might want to make next, or which songs a Pandora listener might also enjoy.

But when it comes to AI in the health field, America should tread carefully.

The pitfalls, particularly in the area of personal privacies, could very well outweigh the benefits.

First, the benefits: Artificial intelligence that helps doctors diagnose patient problems is a clear thumbs-up, an inarguable boon. Just think, if one day physicians could simply hook up high-powered machines to patients with undetermined ailments and, in rapid time, diagnose not just what already exists, but also what’s likely to come — with pinpoint accuracy. It’d be a safe bet to say that the general health and welfare of a population would experience enviable improvements.

Think of the cost-cuts; think of the time savings; think of the health-related human heartaches that could be avoided.

OK. Well and good. Now think of the price that possessing that knowledge brings. In the world of medicine, the price tag for more prevention is called Big Data. So what, big deal? Well, to the consumer, the patient, the citizen, Big Data oftentimes means Privacy Loss.

Look at what’s already taking place in the medical field, all with an eye toward developing AI-fueled diagnostic tools.

In his 2015 State of the Union speech, then-President Barack Obama announced a Precision Medicine Initiative, a $215 million plan that, in part, created a national database of Americans’ medical records as a means of furthering research into treatments to fight disease. The program was voluntary; Americans weren’t unlawfully stripped of their medical privacy rights. But the end result is nevertheless this: a government-backed collection of private medical records of American citizens.

And let the collecting commence. Because it has.

From the governmental Genetics Home Reference site: “The short-term goals [of the Initiative] involve expanding precision medicine in the area of cancer research. … The long-term goals … focus on bringing precision medicine to all areas of health and healthcare on a large scale. To this end, the [National Institutes of Health] is planning to launch a study, known as the All of Us Research Program … [where] participants will provide genetic data, biological samples, and other information about their health.”

How?

Well, as the NIH announced this year, the recently created All of Us has partnered with a variety of sources to collect health data on willing Americans. The group’s tapping electronic health databases, paper patient records, and even, eventually, sources of “mobile health data.”

Mobile health data? What’s that? Sounds tame. But what it means is your personal stuff. Mobile health data includes records accessed on your cell phone, your laptop, your tablet — that bracelet around your wrist that records the calories you’ve burned.

“This is a grand experiment,” said “All of Us” program director Eric Dishman in a written statement. “We’re not doing anything with mobile health yet, but it’s a goal of the program to do that. We’ve announced a partnership with Fitbit to start the pilot and learn what it’s like to be able to use that data.”

Partnership? What kind of partnership? Once again, sounds tame. But what it means is the government — i.e., the taxpayers — will buy up to 10,000 of the Fitbit devices for eventual distribution to program participants. Nothing like a little government surveillance with your morning run, right? At your expense, of course.

The real privacy warnings may come from a look overseas, though. That’s where the commonly held understandings of voluntary participation were tossed out the window.

In 2017, the Information Commissioner’s Office in the United Kingdom ruled that Google’s artificial intelligence arm, DeepMind, illegally tapped into medical patient records during development of its app, Streams, which was aimed at giving doctors a heads-up about serious kidney injuries. DeepMind had forged a deal with the government-run National Health Service in 2015 to develop the app, using information from patients’ records. But patients by and large weren’t told their private records were being turned over to DeepMind; some of the medical information collected by the company included deeply personal records about abortions, HIV tests and drug overdoses.

DeepMind, in the end, admitted to making a number of mistakes. How nice. But the damage was done, the information leaked — the privacies breached. And the end result?

DeepMind and the NHS still inked another five-year deal to expand the app.

A cynic might wonder if the government-run health care system and the company that was trying to save it money via technological advances had a vested interest in the sharing of private medical data — no matter the costs to the patients. A real “do first, ask permission later” type of deal.

Regardless, the sure losers in this case were the patients. Their privacies, once believed secure, are now forever lost.

If America’s not careful with its own AI-fueled race to improve health care, a DeepMind-type scandal could just as easily occur here. Both business and government are partnered in this race; both business and government have interests in saving money, saving citizens’ lives, and seeing technology explode and spread.

“Healthcare,” wrote the TM Capital authors of “The Next Generation of Medicine: Artificial Intelligence and Machine Learning,” in 2017, “is one of the largest and most rapidly growing segments of AI, driven predominantly by innovation in clinical research, robotic personal assistants and big data analytics. … Improving the quality of care requires a broad base of data analysis and predictive analytics that can support clinical decision-making.”

Big Data, meet AI; AI, meet health care. That path’s been forged.

Now we just need to make sure we walk it in a responsible manner, with due care for citizen privacies. America, after all, is a nation where individual privacies, rather than the collective good, Big Business concerns and government pursuits, still matter most.

• Cheryl Chumley can be reached at cchumley@washingtontimes.com or on Twitter, @ckchumley.
